Apple has pulled the social media app Parler from the App Store, after the conservative-leaning service failed to present plans to moderate controversial content posted by users within Apple’s 24-hour deadline.
The takedown removes the app from view in the App Store, and it no longer appears in searches, following Apple's demand for change. New downloads of the app are not possible until it is reinstated, though existing installations will still be able to access the service as normal.
Google pulled the app from the Google Play Store within hours of Apple’s announcement, making the app unavailable to download to Android devices via that digital storefront.
On Friday, Apple contacted the developers behind Parler about complaints it had received regarding the app's content and its use, including how it was allegedly employed to "plan, coordinate, and facilitate the illegal activities in Washington D.C.," according to an email from the iPhone maker. Beyond its role in enabling users to storm the U.S. Capitol, which led to the "loss of life, numerous injuries, and the destruction of property," Apple believed the app was continuing to be used to plan "yet further illegal and dangerous activities."
Apple gave Parler 24 hours to make changes to the app to more effectively moderate content posted by users, or face ejection from the App Store until those changes were implemented.
Shortly before 8 P.M. Eastern Time, almost an hour after the deadline, the app was removed from the App Store.
In a statement, Apple said “We have always supported diverse points of view being represented on the App Store, but there is no place on our platform for threats of violence and illegal activity. Parler has not taken adequate measures to address the proliferation of these threats to people’s safety. We have suspended Parler from the App Store until they resolve these issues.”
Parler bills itself as a "non-biased, free speech social media focused on protecting user's rights," and has become the online home for conservatives and radicals who have been kicked off mainstream social networks like Facebook and Twitter. In recent months, the app had gained a reputation as a safe haven for conspiracy theorists and far-right extremists, including people who called for protests and violence after the latest U.S. presidential election.
While Parler believes it is a "neutral town square that just adheres to the law," as Parler CEO John Matze put it in a statement quoted by Apple in the email, Apple insists Parler is "in fact responsible for all the user generated content present on [the] service," and must ensure it meets the App Store requirements regarding user safety and protection. "We won't distribute apps that present dangerous or harmful content," Apple wrote to Parler.
Parler's CEO responded to the initial email by declaring that the standards applied to the app are not applied to other entities, including Apple itself. An earlier post from the CEO said "We will not cave to pressure from anti-competitive actors! We will and have enforced our rules against violence and illegal activity. But we won't cave to politically motivated companies and those authoritarians who hate free speech!"
In a second email explaining the removal of Parler, Apple's App Review Board said it had received a response from Parler's developers, but had determined the measures they described were "inadequate to address the proliferation of dangerous and objectionable content on your app."
The decision rested on two points, the primary problem being insufficient moderation to "prevent the spread of dangerous and illegal content," including "direct threats of violence and calls to incite lawless action."

Apple also objected to Parler describing its moderation plan as being "for the time being," which indicated any measures would be limited in duration rather than ongoing. Citing a need for "robust content moderation plans," Apple added, "A temporary 'task force' is not a sufficient response given the widespread proliferation of harmful content."
Apple's move came amid a wider effort by tech companies and social media services to cut access to accounts operated by activists, organizations, and political leaders linked to the Capitol Hill attack. This includes President Donald Trump, who was suspended from both Twitter and Facebook for his inflammatory messaging to followers.
The full letter from Apple to Parler follows:
Thank you for your response regarding dangerous and harmful content on Parler. We have determined that the measures you describe are inadequate to address the proliferation of dangerous and objectionable content on your app.
Parler has not upheld its commitment to moderate and remove harmful or dangerous content encouraging violence and illegal activity, and is not in compliance with the App Store Review Guidelines.
In your response, you referenced that Parler has been taking this content “very seriously for weeks.” However, the processes Parler has put in place to moderate or prevent the spread of dangerous and illegal content have proved insufficient. Specifically, we have continued to find direct threats of violence and calls to incite lawless action in violation of Guideline 1.1 – Safety – Objectionable Content.
Your response also references a moderation plan "for the time being," which does not meet the ongoing requirements in Guideline 1.2 – Safety – User Generated Content. While there is no perfect system to prevent all dangerous or hateful user content, apps are required to have robust content moderation plans in place to proactively and effectively address these issues. A temporary "task force" is not a sufficient response given the widespread proliferation of harmful content.
For these reasons, your app will be removed from the App Store until we receive an update that is compliant with the App Store Review Guidelines and you have demonstrated your ability to effectively moderate and filter the dangerous and harmful content on your service.