By Chia-Shuo Tang and Sam Robbins
Taiwan’s Digital Intermediary Service Act, proposed by the National Communications Commission (NCC) in late June this year, was withdrawn at the Executive Yuan little more than a month later, before it could be formally reviewed by the parliament. The legislation aimed to hold social media platforms and websites more accountable for the content they host and to create mechanisms for greater transparency and responsibility in content moderation. Together with the launch of the Ministry of Digital Affairs in August and curriculum reforms to promote media literacy, the proposal showed a government taking concerted action against disinformation campaigns in a corner of the globe most at risk of such attacks.
Despite being listed as one of the countries most at risk of, yet least prepared for, foreign disinformation campaigns in a 2019 report by the Digital Society Project, Taiwan strongly resisted legislation that would place greater responsibility on international digital platforms. In less than two months, more than 30,000 people voted against the act on the government’s Online Participation Platform For Public Policy, while fewer than 150 expressed support. Internet users also took to social media to criticize the legislation, posting an image containing the sentence “The content has been removed due to violation of the Digital Intermediary Service Act” and questioning whether the bill amounted to censorship. In response to the backlash, the NCC indefinitely postponed the third public hearing on the bill, a move that typically signals a proposal has been abandoned.
At first glance, the proposed law seemed to be based on international best practices, especially the Digital Services Act (DSA) passed by the European Union in April this year. The NCC was keen to highlight the similarities between the two acts, likely banking on the existence of precedent in democratic states to help legitimize the bill. Like the DSA, the act required digital platforms to set up notice-and-takedown mechanisms to tackle illegal content effectively. It also demanded that platforms disclose their content moderation practices by publishing transparency reports and make explicit how their recommender systems work. In addition, the act borrowed the safe-harbor principle from the DSA, exempting platforms from liability for user-uploaded content while still encouraging self-regulation.
This said, there was a critical difference between Taiwan’s bill and the DSA. The former authorized government agencies to identify illegal content “in correspondence to their regulatory authorities” and left it to the legal system to determine whether that content should be taken down. For example, the Ministry of Health and Welfare would have had the power to define what constitutes medical disinformation, whereas the Ministry of Economic Affairs would have determined what counts as economic disinformation. Taiwan’s government bodies have historically not been involved in content moderation, and such a law would likely have thrown the administration into disarray.
But this level of state control wouldn’t have been unprecedented for Taiwan, which endured almost four decades of martial law and only began democratizing in the 1990s. Today, it’s still easy to find laws containing broad and vaguely defined terms that give government officials significant authority, jurisdiction, and, crucially, discretion in particular cases. For example, spreading rumors is illegal under the Social Order Maintenance Act, but the criteria for what counts as a rumor have never been defined. This conceptual vagueness has led to procedural problems since the end of martial law. National statistics show a drastic 28-fold increase in prosecutions for spreading rumors between 2016 and 2020, a period when combating misinformation became a well-known government agenda. Yet the courts ruled against the government in most of these cases, finding no evidence of rumor spreading. Without a clear definition of what a rumor is, the administration has taken an increasingly liberal approach to prosecuting falsehoods, and the courts, perhaps for the same reason, have largely counterbalanced these efforts.
While Taiwan’s post-authoritarian characteristics produced an overreaching Digital Intermediary Service Act, they have also created a civil society that dares to push back. Since the Sunflower Movement, the public has been alert to any abuse of government power or administrative overreach. Political parties were quick to oppose the legislation when it was proposed, with the main opposition Kuomintang suggesting that the goal of the act was to ensure Taiwan’s digital space is pro-DPP by design.
In Taiwan, regulating speech on social media is a touchy issue. Any attempt to do so risks being accused of censorship, a practice common during martial law. Hong Kong also serves as a cautionary tale of how easily freedom of speech can be lost under an authoritarian regime.
At the same time, Taiwan is one of the global hot spots for disinformation operations. Although many civil society groups have emerged and the government has emphasized the importance of media literacy in school curricula, robust and well-designed legislation against disinformation is still necessary. Left to their own devices, social media platforms tend to favor the content that garners the most attention, which is not always the most truthful.
Taiwan’s most recent draft of the Digital Intermediary Service Act failed to win the support of a public that cherishes freedom of speech as an abstract marker of freedom writ large and of a civil society wary of administrative overreach. That does not mean the end of platform regulation, however. Considering that the legislation is already five years in the making and that this draft was not the first, it likely won’t be the last. The NCC will probably produce another version of the bill that addresses public concerns. Two questions remain: 1) What type of system will be acceptable to the Taiwanese public and to Taiwanese legislators? 2) Will that system be able to tackle disinformation effectively while avoiding administrative overreach?
Ideally, new versions of the bill will move power away from existing government agencies to new independent review bodies subject to both judicial and public oversight, but it remains an open question whether any system of content moderation would be acceptable. Increasing platform transparency without wading into moderation itself is a likely move. Balancing freedom of speech and a diversity of opinions against a media ecosystem increasingly embattled by disinformation campaigns is a fraught task in Taiwan, as in all democracies, but it is also an increasingly pressing one. With national elections ending the current legislative session in two years, it remains to be seen what’s next for platform moderation in Taiwan and what balance of regulation and freedom will be accepted.
TNL Editor: Bryan Chou (@thenewslensintl)