Regulators have significant tools at their disposal to enforce their will on Twitter and on Mr. Musk. Penalties for noncompliance with Europe’s Digital Services Act could total as much as 6 percent of the company’s annual revenue. In the United States, the F.T.C. has shown an increasing willingness to exact significant fines for noncompliance with its orders (like a blockbuster $5 billion fine imposed on Facebook in 2019). In other key markets for Twitter, such as India, in-country staff members work with the looming threat of personal intimidation and arrest if their employers fail to comply with local directives. Even a Musk-led Twitter will struggle to shrug off these constraints.
There is one more source of power on the web — one that most people don’t think much about but may be the most significant check on unrestrained speech on the mainstream internet: the app stores operated by Google and Apple.
While Twitter has been publicly tight-lipped about how many people use the company’s mobile apps (rather than visiting Twitter on a web browser), its 2021 annual report didn’t mince words: The company’s release of new products, it states, “is dependent upon and can be impacted by digital storefront operators” that set the guidelines and enforce them. “Such review processes can be difficult to predict, and certain decisions may harm our business.”
“May harm our business” is an understatement. Failure to adhere to Apple’s and Google’s guidelines would be catastrophic, risking Twitter’s expulsion from their app stores and making it more difficult for billions of potential users to get Twitter’s services. This gives Apple and Google enormous power to shape the decisions Twitter makes.
Apple’s guidelines for developers are reasonable and plainly stated: They emphasize creating “a safe experience for users” and stress the importance of protecting children. The guidelines quote Justice Potter Stewart’s “I know it when I see it” quip, saying the company will ban apps that are “over the line.”
In practice, the enforcement of these rules is fraught.
In my time at Twitter, representatives of the app stores regularly raised concerns about content available on our platform. On one occasion, a member of an app review team contacted Twitter, saying with consternation that he had searched for “#boobs” in the Twitter app and was presented with … exactly what you’d expect. Another time, on the eve of a major feature release, a reviewer sent screenshots of several days-old tweets containing an English-language racial slur, asking Twitter representatives whether they should be permitted to appear on the service.
Reviewers hint that app approval could be delayed or perhaps even withheld entirely if issues are not resolved to their satisfaction — although the standards for resolution are often implied rather than stated outright. Even though they appear to be driven largely by manual checks and anecdote, these review procedures have the power to derail company plans and trigger all-hands-on-deck crises for weeks or months at a time.