
This article first appeared in Digital Edge, The Edge Malaysia Weekly on December 13, 2021 - December 19, 2021

Over the last decade, the DNA of social media platforms has mutated. Facebook, which started off as a platform to connect and share thoughts with close friends, is now considered a public digital space for sharing news and information. Instagram, which started off as a photo-sharing app, recently announced that it was shifting its focus to video and entertainment.

The power of social media was demonstrated over the last two years, as growing reliance on these platforms for news and content forced them to moderate more aggressively. From social activism to the filtering of fake news (which, in one prominent case, ultimately led to the suspension of former US president Donald Trump’s Twitter account), platforms such as Facebook, Instagram, Twitter and TikTok are taking moderation seriously, but there are still gaps in their community guidelines.

Moderation and censorship, however, are not black and white. Most platforms use artificial intelligence (AI) algorithms to track, identify and flag offensive posts, but there are cultural and geographical nuances that fall through the cracks. 

Twitter, however, seems to have found the right recipe for moderation on its platform. Its co-founder and former CEO Jack Dorsey was one of the first to recognise that something was wrong with the authenticity and reliability of news and facts on the platform. Twitter added a feature that flags tweets that are questionable or not fully accurate, cautiously positioning the service as a responsible public space while upholding freedom of speech.

This feature was also how Trump came to be permanently suspended from Twitter: after scrutinising his tweets, the company concluded that they contributed to the violent acts committed at the US Capitol on Jan 6 this year. “We have permanently suspended the account due to the risk of further incitement of violence,” Twitter’s statement read at the time.

Meanwhile, on TikTok, local and international creators have seen their accounts banned by the platform without explanation. They would simply receive a message informing them of the ban, with no indication of how to appeal. Local TikTok creator Ah Leong (@ahleonggggg on TikTok and @ah_leonggg on Instagram) saw her hard work disappear the day before her birthday, when TikTok banned her 1.2 million-follower account. She resorted to creating a second account.

"Do [platforms] really want to censor everyone? No, because censorship in itself is challenging the freedom of speech legislation in every country.” - Eckert

The most recent victims of social media censorship are Austrian art museums, which found that social media companies were banning images of artworks featuring nudes. To make a stand against these platforms, the museums partnered with OnlyFans, an app known mostly for its association with sex workers, to showcase these works of art.

Dr Pauline Leong, associate professor at the Department of Communication, School of Arts, at Sunway University, says the move by Viennese museums to use OnlyFans to display their works is a double-edged sword. While the controversy has generated much publicity, being on a platform associated with pornography and sex work may affect their branding and reputation as bastions of arts and culture.

Peter Eckert, co-founder and chief experience officer of projekt202, says when he first learnt about what happened with the museums, he instinctively felt it was fundamentally wrong not to allow art on social media platforms. It is not wrong to feel this way, he says, because if art gets censored, then what will be next?

“Do [platforms] really want to censor everyone? No, because censorship in itself is challenging the freedom of speech legislation in every country. But I think it’s the explosion of social media, coupled with the lack of thinking through this explosive growth and the lack of clear guidelines, that is the core issue here,” he tells Digital Edge.

“Of course, these platforms are not nuanced enough to understand the difference between art and offensive content. If you go to any social media feed where people show their bodies, it’s very questionable how much can be shown and for what purpose.

“In this case, it should be very obvious if it’s an art exhibition. Museums are known for this. If someone is managing social media posts, they should have been able to discern that it’s art. How is it possible that it fell through the cracks? This is done by either AI, which is not nuanced enough, or someone who just blindly goes by the rulebook and guidelines.”

Leong says that in the online world, social media can be seen as public squares where anyone in any part of the world can access material. Thus, these platforms are concerned about the impact of such content on their users, especially when their reach extends to worldwide communities with different cultures and varying standards of morality.

She believes these platforms are erring on the side of caution. “In the offline world, the public pays ticket fees to museums in Vienna so that they can view exhibitions, which may include nude paintings and/or sculptures. The segment of the public that goes there is likely to have the mindset that these are works of art and culture.

"If platforms take the censorship route, it will always be 15 to 20 years behind the curve. Self-regulation might be the better route to take, in this case, as it covers the grey areas not covered by censorship.” - Ong

“The test of obscenity is this [as defined by Sir Alexander James Edmund Cockburn, Lord Chief Justice of England in the late 19th century]: ‘Whether the tendency of the matter charged as obscenity is to deprave and corrupt those whose minds are open to such immoral influences and into whose hands a publication of this sort may fall.’

“So, for a discerning adult who is aware of culture and the arts, such nude paintings and sculptures are unlikely to deprave and corrupt, especially when they have paid fees to obtain access to such content. But it is unlikely that nude paintings and sculptures would be placed in a public square, especially if it is unacceptable to the cultural norms of the society that it is in.”

Kenny Ong, Astro Radio CEO and The Content Forum chairman, points out that, often, when censorship and self-regulation are brought up in the context of social media platforms, whatever is being done today is about a decade out of date.

By the time these platforms manage to control themselves, an alternative would already have popped up — such as Discord, Clubhouse and Roblox, which have seen an influx of new users over the last two years.

Ong says: “Censorship is a difficult term to get around because it came about back in the day, when broadcasters and print media were linear. But now, how do you censor Roblox?”

“Censorship will never be able to catch up because we are trying to catch our own tails. If platforms take the censorship route, it will always be 15 to 20 years behind the curve. Self-regulation might be the better route to take, in this case, as it covers the grey areas not covered by censorship.”

On the opposite end of the spectrum is TikTok, which has banned creators for merely defending themselves against trolls in the comments section. Some creators were banned for a few days or a week, but others saw their accounts permanently banned, despite having millions of followers.

But younger creators have found creative ways around visuals and words that may be flagged by AI algorithms as inappropriate, such as using a combination of letters and numbers to type out words. This is ubiquitous on TikTok, especially with stories and content surrounding sex.

“Nowadays, we can have all these censorship tools, but children are very smart and, if they want to, they’re going to circumvent it any way they can. If you tell them not to watch Squid Game, they’ll just go on Discord, where somebody will switch it on and share the screen,” says The Content Forum executive director Mediha Mahmood.

“For us, it’s not so much about the censorship tools that are out there, but the mentality that we have and the awareness of self-regulation.”

There are no clear guidelines for the public as to what censorship or moderation entails, says Eckert. These platforms need to generate revenue and, thus, have business models, too. Every time they moderate content that might cut into their revenue stream, it becomes a real issue for the platform.

“If creators want to use these channels to monetise, they will need to deal with the public opinion that comes with it. So, how does it work for things like an art exhibition? It doesn’t, because [the community guidelines are] too black and white right now,” he says.

“If I were a social media company, what I would do is actively enable a specific channel for artists, where they are treated differently. But, then, where do you start? You will need to do this for writing, filmmaking and all forms of expression. It’s a very complex issue.”

"Internet algorithms, especially on social media, have contributed towards increasing polarisation, extremism and radicalisation as users consume content that conforms to their ideology and beliefs.” - Leong

From good to bad: Loopholes in social media community guidelines

The advent of digital technology marked a new frontier for free speech. When the World Wide Web emerged in the early 1990s, its supporters declared it a new avenue for unfiltered free expression, in contrast to highly regulated traditional media.

Sunway University’s Leong says social media platforms have transformed the way people communicate by enabling them to exercise their right to free speech and facilitating exchange of information and ideas across social, political and geographic divides. 

Social media also became a vehicle for activism, most visibly during the Arab Spring, the series of anti-government protests, uprisings and armed rebellions that spread across the Arab world in the early 2010s. Ignited by the Tunisian Revolution and driven by anger over corruption and economic stagnation, the movement saw activists and protestors use social media to organise and publicise their protests.

Leong says these tech giants have seized the opportunity to brand themselves as symbols of democratisation of free expression and diversification of public discourse. Such unfettered freedom has resulted, however, in irresponsible use of the internet, with the emergence of illegal and harmful content such as hate speech, terrorism, cyber harassment and invasion of privacy, obscenity as well as misinformation, disinformation and fake news.

“Internet algorithms, especially on social media, have contributed towards increasing polarisation, extremism and radicalisation as users consume content that conforms to their ideology and beliefs,” she says.

“Therefore, social media platforms face pressure from governments and civil society to moderate and censor harmful content that creates fissures that cause disharmony and destabilise society. Despite the prevailing belief that social media are digital public squares, US courts view social media platforms as private entities, not public forums; they can moderate or censor any user-generated content as they see fit. Section 230 of the Communications Decency Act in the US allows social media platforms to restrict, censor and exercise editorial control over content if they deem it ‘obscene, lewd, lascivious, filthy, excessively violent, harassing or otherwise objectionable’.”

The Content Forum’s Mediha says that while there are tools on social media sites to block and report harmful content, such content is not always taken down. Eckert concurs, adding that a possible explanation is that whoever is vetting reported posts has been given specific instructions on what to look out for. These people may not make the effort to ask upper management whether there is a grey area.

He says: “We see this in all big companies, even those unrelated to social media. People in the trenches don’t have a mechanism to report up and ask questions. They are just being asked to weed out content that is deemed to go against the community guidelines. This is possibly how some content falls through the cracks in the absence of clear guidelines and regulation.”

Mediha adds, however, that if any digital content is not taken down after reporting, people can approach the Malaysian Communications and Multimedia Commission (MCMC) for further action. In addition, The Content Forum has a complaints bureau, which can mediate when there are grey areas involving digital content.

“Sections 211 and 233 of the Communications and Multimedia Act 1998 (CMA) deal with indecent and offensive content. So, if a case is serious enough, people can actually take this route to instruct the internet service provider to take down a site, or to order other action. They do have that legal power,” she says.

Still, the complaints bureau does not have much power over social media, as its authority depends on each platform’s community standards and moderation methods. Mediha says it is disappointing to hear stories about how users have reported content but it was not taken down.

This was seen in the comments sections of news articles about Ain Husniza Saiful Nizam, who blew the whistle on her male teacher on TikTok for allegedly making lewd jokes in class. There were also reports of Ain’s supporters receiving death threats.

Mediha says: “There probably needs to be some work done with us speaking to Facebook and other platforms to explain how these community guidelines are not homogeneous. The guidelines have to adapt to the culture and language of the country that you are in. We plan to work with MCMC in having these discussions with the relevant parties.

"Nowadays, we can have all these censorship tools, but children are very smart and, if they want to, they’re going to circumvent it any way they can. If you tell them not to watch Squid Game, they’ll just go on Discord, where somebody will switch it on and share the screen.” - Mediha

“Social media regulation is not really covered under our content code. What we do want to do is to emulate other countries that have attempted self-regulation by codes of conduct or other such voluntary instruments. We’ll need it to refer to content in Malaysia, for example, to create an understanding of how ‘mangkuk’ can actually be an offensive statement and not mean ‘bowl’ when translated.”

Ideally, social media platforms should have somebody who has a cultural understanding of a country to vet posts and think a little bit more contextually about the messaging in flagged content. Leong says it would be good to have localised moderation and censorship teams, but this will incur costs for social media platforms, as they would have to increase their operations worldwide. The question is whether the companies are willing to go the extra mile for this.

Eckert says the cultural implications of different issues, such as sexual education and offensive slang, should also be taken into consideration.

“We should be learning from industries that have actually mastered that, like the TV industry. Every country has some form of regulation and there are organisations here in Malaysia that look at every piece of content to eliminate what is inappropriate. 

“They have also found ways to still allow content that is artful. TV companies have figured this out, which means they have invested in it. And social media companies are so new that they are just catching on to this idea of moderation.”

The Content Forum’s Ong says we have to stop thinking there is going to be a utopian society online. He adds that the language barrier on social media was seen in the US as well, especially in regard to misinformation surrounding Covid-19. 

“One of the major reasons a chunk of the US population, especially the non-Caucasian community, is hesitant about vaccination is that there is a lot of misinformation about vaccines on these platforms,” he says.

“Facebook spends a lot of resources, manpower and technology to filter out misinformation, but it’s in the English language. The Hispanic community was hardest hit by misinformation because the content is in a language that Facebook can’t filter. The problem becomes bigger when you multiply it by the number of languages spoken in a country like Malaysia.”

Will social media be deemed irrelevant?

Like most businesses, Facebook started as a small idea that eventually captured people’s attention. The social media space then saw a flood of competing platforms, some of which eventually consolidated, as when Instagram and Facebook came under the same company.

Right after that, Eckert says, a sophisticated fragmentation typically takes place, introducing niche platforms that are more meaningful than the bigger platforms. With that in mind, he believes there will be an implosion in the social media landscape, with the emergence of smaller, dedicated platforms for different people.

Leong concurs, adding that the cyber world is very fluid and people will gravitate to sites that are trendy. At the end of the day, if a platform is dominated by unverified and false content, it might affect the credibility of the platform and result in average right-thinking users abandoning it for better sites.

“This leaves behind those who thrive on conspiracy theories and fringe ideas, thus creating an echo chamber. I know of people who moved away from social media because of privacy concerns or to protect their mental health by avoiding exposure to its toxic environment,” she says.

As for TikTok, Leong believes the platform is likely to be able to get the correct balance by having proper discussions with the right stakeholders, adjusting its policies and training its staff who will be executing those policies. 

Eckert says, however, that while it is unclear whether over-censorship of the platform will lead to its downfall, it is something of an open secret that China’s censorship stance applies to the platform as well.

“[TikTok] has been banning any evidence about the Tiananmen Incident that took place in the 1980s. China has been censoring this because they don’t want the new generation to know that this took place. So, China is definitely going in the wrong direction right now, as they’re censoring any opinion that doesn’t fit into their larger agenda,” he says.

“I am concerned for them because the larger agenda of China will eventually unravel. People want to express themselves, and this is where art is really important. We need these forms of expression because, if it is removed, what are we living for?

“If TikTok becomes cumbersome with self-expression, the next TikTok is just around the corner.”
