TikTok, the user-generated video-sharing app from Chinese publisher Bytedance that has been a runaway global success, has stumbled hard in one of the world’s biggest mobile markets, India, over illicit content on its platform.
Today, the country’s main digital communications regulator, the Ministry of Electronics and Information Technology, ordered both Apple and Google to remove the app from their app stores, per a request from the Madras High Court, which investigated and determined that the app (which has hundreds of millions of users, including minors) was encouraging pornography and other illicit content. (A source in India, however, tells us that the app is still available in both stores at the time of writing.)
This is the second time in two months that TikTok’s content has drawn regulatory action, after the app was fined $5.7 million by the FTC in the US for violating children’s privacy protections.
The order in India does not affect the 120 million users in the country who have already downloaded the app, or Android users who sideload it from outside Google’s official store. But it is a strong strike against TikTok that will impede its growth, harm its reputation, and potentially pave the way for further sanctions or fines against the app in India (and in other countries that follow India’s lead).
TikTok has issued no fewer than three different statements, each softer in tone than the last, as it scrambles to respond to the order.
“We welcome the decision of the Madras High Court to appoint Arvind Datar as Amicus Curiae (independent counsel) to the court,” the statement from TikTok reads. “We have faith in the Indian judicial system and we are optimistic about an outcome that would be well received by over 120 million monthly active users in India, who continue using TikTok to showcase their creativity and capture moments that matter in their everyday lives.”
(An earlier version of TikTok’s statement was less ‘welcoming’ of the decision, instead highlighting the company’s increased efforts to police its content without outside involvement. It noted that TikTok had removed more than 6 million videos that violated its terms of use and community guidelines, following a review of content generated by users in India. That figure alone speaks to the scale of the problem.)
On top of prohibiting downloads, the High Court also directed the regulator to bar media companies from broadcasting any videos, illicit or otherwise, made with or posted on TikTok. Bytedance has been working to appeal the order, but the Supreme Court, where the appeal was heard, upheld it.
This is not the first time TikTok has faced government backlash over the content it hosts on its platform. In the US two months ago, the Federal Trade Commission ruled that the app violated children’s privacy laws, fined it $5.7 million, and, through a forced app update, required all users to verify that they are over 13 or be redirected to a more restricted experience. Musical.ly, TikTok’s predecessor, had faced similar regulatory scrutiny.
More generally, the problems TikTok is facing right now are not unfamiliar ones. Social media apps, which rely on user-generated content as both the engine of their growth and the fuel for that engine, have long struggled with illicit content. The companies that build and run these apps have argued that they are not responsible for what people post, so long as it fits within their terms of use, but that stance has left a large gap where content is not policed as well as it should be. And because these platforms depend on growth and scale for their business models, some argue they are less inclined to proactively police their platforms and bar illicit content in the first place.
Additional reporting by Rita Liao
Source: TechCrunch http://j.mp/2ZcRTE6