After giving Parler 24 hours to introduce new content moderation policies, Apple announced today that Parler has officially been removed from the App Store, meaning the app is no longer available for new downloads.
Apple bans Parler from the App Store
This comes after Apple threatened to remove Parler yesterday evening. At the time, Apple said that Parler had 24 hours to implement measures to address the objectionable content on its platform. Apple now says that Parler failed to adequately address these concerns.
We have always supported diverse points of view being represented on the App Store, but there is no place on our platform for threats of violence and illegal activity. Parler has not taken adequate measures to address the proliferation of these threats to people’s safety. We have suspended Parler from the App Store until they resolve these issues.
The removal of Parler from the App Store will stop new users from downloading it, but the app will likely continue working for those who have already installed it. However, this does mean that Parler will be unable to release updates to the app, and that future iOS updates could render it obsolete.
Below is the communication that Apple sent to the Parler developers today informing them of its intention to remove Parler from the App Store. Apple says that Parler’s proposed solutions to address the “dangerous and harmful content” on the platform are not sufficient.
Note that Apple’s communication to Parler says that the app is banned “until we receive an update that is compliant with the App Store Review Guidelines,” which could mean the app returns in the future. Nonetheless, Parler has publicly maintained that it will always take a hands-off approach to moderation.
Thank you for your response regarding dangerous and harmful content on Parler. We have determined that the measures you describe are inadequate to address the proliferation of dangerous and objectionable content on your app.
Parler has not upheld its commitment to moderate and remove harmful or dangerous content encouraging violence and illegal activity, and is not in compliance with the App Store Review Guidelines.
In your response, you referenced that Parler has been taking this content “very seriously for weeks.” However, the processes Parler has put in place to moderate or prevent the spread of dangerous and illegal content have proved insufficient. Specifically, we have continued to find direct threats of violence and calls to incite lawless action in violation of Guideline 1.1 – Safety – Objectionable Content.
Your response also references a moderation plan “for the time being,” which does not meet the ongoing requirements in Guideline 1.2 – Safety – User Generated content. While there is no perfect system to prevent all dangerous or hateful user content, apps are required to have robust content moderation plans in place to proactively and effectively address these issues. A temporary “task force” is not a sufficient response given the widespread proliferation of harmful content.
For these reasons, your app will be removed from the App Store until we receive an update that is compliant with the App Store Review Guidelines and you have demonstrated your ability to effectively moderate and filter the dangerous and harmful content on your service.
Parler was also removed from the Google Play Store on Android yesterday, as our colleagues over at 9to5Google reported at the time.
Apple kill-switching Parler no longer necessary for the app to stop working unless it finds a new web host. https://t.co/7RFfIP0r6e
— Mark Gurman (@markgurman) January 10, 2021