For the most powerful people in tech, congressional hearings are becoming routine. The latest round of questioning took place on March 25th, marking the first time the CEOs appeared before the House since the US Capitol riots of January 6th.

Members of Congress posed tough questions to Facebook CEO Mark Zuckerberg, Twitter CEO Jack Dorsey, and Google CEO Sundar Pichai. Issues covered during the hearing included misinformation on social media platforms, regulatory measures for content moderation, children’s screen time, and more. The hearing follows continued calls from lawmakers, experts, and public figures for stricter measures to curb online abuse and misinformation, and it took place against the backdrop of related legislation under consideration in both the House and Senate.

Misinformation is a growing problem on social media platforms. It takes many shapes and forms and has driven polarization among users and society at large. Social media’s role in the 2016 presidential election was a hot topic then and remains one now. The 2020 election and its aftermath showed more of the same, with misinformation campaigns about vote-rigging and election fraud spreading widely. Today, misinformation campaigns (such as those targeting COVID vaccinations) don’t just put the issue back in the spotlight – they pose a major public health risk.

Let’s look at why we should be talking about misinformation now and taking prompt action. We’ll cover what happened at the hearing and discuss what brands and individuals can do to help clean misinformation out of social media spaces.

 

Why Talk About Misinformation Now?

In recent years, misinformation, fake news, and hate speech have become ever-present topics in the news cycle. Misinformation, or “fake news,” is not a new idea; it has long been a tool of political fringe elements and extremist groups. Propaganda comes in many forms: in the past, it took the shape of pamphlets and newsletters. But even as the primary news media have evolved (largely in parallel with advances in technology), the content and its intent haven’t shifted substantially. The pamphlets and newsletters of the past have simply evolved into social media content.

Misinformation campaigns have wreaked havoc across many areas of society: politics, religion, entertainment, social security, immigration. The list of affected areas is seemingly endless.

Fake news and misinformation campaigns polarize the public in many ways. The rise of anti-maskers and anti-vaxxers amid the pandemic is a sobering example: both have a direct impact on public health outcomes. As we have seen first-hand over the past year, fighting a pandemic is not just a government-level job. It requires a sense of personal responsibility, and misinformation campaigns put up barriers to building awareness of that public responsibility.

Another reason why we need to talk about misinformation now? We’re seeing tomorrow’s generation grow up in the social media era.

Children as young as 14 or 15 are regularly exposed to content on various social media sites. At this impressionable age, fake news can easily shape opinions and minds, and it is a legitimate problem when young users are exposed to information that places them on the wrong side of critical issues.

Political and racial hate speech is another major area of concern. There has been an increase in racial violence across the country, fueled by hate speech and divisive agendas and enabled in part by access to fake news and “alternative facts” on social media platforms.

The US Capitol riots were a watershed moment, not just because of the scale of the acts themselves but also because of how they were organized. The riots weren’t the result of a single thing the former president may have said: there was a buildup on social media that, some would argue, indicated a clear and present danger. Trump and others leveraged and amplified the angst by sharing incendiary statements and fake reports. Fairly or not, social media platforms like Facebook bear some accountability here.

We shouldn’t be quick to forget how Facebook and other social media platforms allowed political content and ads to be targeted at various demographic groups, with minimal oversight to mitigate the spread of misinformation. It’s all but undeniable that this had a profound impact on incidents like Charlottesville, and it will continue to do so until a higher degree of oversight is implemented.

Whether we like it or not, misinformation and fake news are a “deal with it now” problem. We can’t afford to be lax on this issue; there are far-reaching implications across society if we fail to act. This is the backdrop against which the congressional hearings took place, in hopes of catalyzing change.

 

What Happened In The Hearings?

Mark Zuckerberg, Sundar Pichai, and Jack Dorsey seem to be getting used to facing tough questions from the House and Senate. This hearing was not their first, and it is not likely to be their last. The core of the conversation revolved around the steps social media companies need to take to curb the menace of misinformation, as well as the regulatory changes lawmakers need to consider.

Rep. Mike Doyle led the charge, noting in his opening remarks, “You can take this content down. You can reduce the vision. You can fix this. But you choose not to. You have the means. But time after time, you are picking engagement and profit over the health and safety of users.” This came after he noted that his staff could easily find anti-vaccine content on nearly every major platform, including Facebook, Instagram, Twitter, and YouTube.

There were also questions about the role the tech giants played in the buildup to the US Capitol riots. All three CEOs were circumspect about assuming blame, though Jack Dorsey conceded that Twitter did play a role, despite its best efforts to beat back the misinformation campaigns.

Section 230 of the Communications Decency Act (formally part of the Communications Act of 1934) was a major topic of discussion. The section gives platforms legal immunity for content their users publish. Recent events have only strengthened calls to update the law for the current era. In his written opening statement, Zuckerberg spoke in favor of narrowing the law, with platforms assuming conditional liability if they fail to follow moderation rules. Sundar Pichai and Jack Dorsey did not comment on the law.

The hearing also offered a glimpse of where consensus on content moderation might land. Pichai spoke in favor of stricter content moderation policies along with giving users a way to appeal decisions. Dorsey favored user-led moderation, with improved settings and tools that let users customize their experience.

Some would say the questioning was quite harsh at times, with both sides of the aisle mincing no words while grilling the tech CEOs. Republicans focused on the platforms’ impact on the mental health of children and teens, while Democrats took the CEOs to task over opaque algorithms that prioritize engagement and profit over risks to society.

 

What Can Brands And Social Media Platforms Do?

The onus of curbing the menace of misinformation, fake news, and hate speech falls primarily on the social media platforms, but brands, influencers, and everyday users have a big role to play as well.

Social media platforms and their policy decisions have so far been out of sync with political thinking on content moderation and accountability. New content policies and regulations should make it easy for users to report objectionable content, and for platforms to take it down and restrict its spread. Twitter’s move to label potentially false tweets with an accompanying notice is a step in the right direction, but more is needed. Platforms also need to identify the people behind large-scale campaigns and keep proven spreaders of misinformation off the platform, becoming proactive rather than reactive.

Responsibility lies with brands and influencers as well. They, too, should help create an environment of trust and reliable information. If you come across user content in your community or space that does not follow the content guidelines, it is up to you to report it and warn the users responsible. When we each make our own communities safer, social media becomes a safer place for everyone to interact.

 

Final Words

Fake news, misinformation campaigns, and hate speech are real issues that directly impact many people’s lives. Anti-vaccine and anti-mask content is already taking a toll on public health efforts. Racial violence, trolling, and online abuse are related issues that all contribute to making social media an increasingly toxic space. The impact these issues have on users’ mental health cannot be ignored, and neither can users’ own responsibility.

Lawmakers have a leading role to play in the months and years to come. The legislation and regulations that govern the social media space need updating, and holding both users and platforms more accountable is a priority. The buck doesn’t stop with lawmakers, though. Tech giants like Facebook, Google, and Twitter have to play a more proactive role in ensuring their platforms don’t fester with malicious content.

Brands, influencers, and users are crucial stakeholders too. Critically evaluating content before you engage with it goes a long way toward stopping the spread of misinformation. Digital media has never been this polarized, and all of us – users, brands, platform owners, and lawmakers – must work together to find creative solutions to this problem.

Social media platforms have a larger responsibility toward society, and they can’t be treated exactly like other businesses. Social media content needs to be held to a higher standard, and we owe it to ourselves, and to the generations who will live in a world more connected than ever before, to actively seek ways to fix and improve the system.
