The House of Representatives Subcommittee on Communications and Technology, part of the Committee on Energy and Commerce, continued hearing testimony this week from witnesses on "Big Tech," including social media platforms.
Frances Haugen, a Facebook whistleblower, testified during the hearings, titled "Holding Big Tech Accountable: Targeted Reforms for Tech's Legal Immunity." Kara Frederick, a research fellow in technology policy at The Heritage Foundation, also testified, laying out the issues she believes need to be resolved. Haugen spoke about social media's divisive algorithms, a concern Frederick echoed; Frederick also criticized the platforms' suppression of right-leaning views.
It is unclear how much liability the social media platforms should bear for content that is uploaded to their sites.
"These platforms don't want to be held accountable, and the users suffering harm deserve better from us – and we will act," said Rep. Mike Doyle (D-Pa.), chair of the subcommittee.
Section 230 of the Communications Decency Act states that social media platforms, such as Facebook or Twitter, are not considered publishers and therefore are not legally responsible for their users' content. Critics warn that this allows them to have it both ways – they claim not to be responsible for what is said, yet still delete posts that they argue violate their own community standards.
"Somehow they exist on a completely different plane and are allowed to have a completely different set of rules than everyone else," said Rashad Robinson, president of Color of Change and one of the witnesses at Wednesday's hearing. Freedom of speech, in other words, does not mean freedom from its consequences.
Lawmakers on both sides of the aisle agree there is a problem – yet they can't agree on what that problem actually is.
"Frances Haugen's whistleblowing is being drowned out because Congress has returned to its old dog whistles, with Republicans viewing the platforms as muzzling conservative voices and Democrats seeing them as exercising monopoly power and not doing enough about the misinformation epidemic," Bhaskar Chakravorti, dean of global business at the Fletcher School at Tufts University, explained via email.
"In parallel, the Biden administration has lined up a who's who of get-tough-on-tech stars – Messrs. Wu, Khan, Kanter and Chopra," Chakravorti said, adding that it all amounts to rhetoric over sound strategy. "While there is plenty of legitimate reason to create a framework for these platforms and visionary policy to govern them, it seems we are stuck with partisan leadership that doesn't have the guts to move forward. We are left whistling in the dark, thanks both to the whistleblowers as well as the dog-whistlers."
Is Anyone Being Held Accountable?
Most people in the tech industry see the problem but don't believe enough is being done about it.
"Whether enough is being done – the answer is no," said Tom Garrubba, vice president at Shared Assessments, a global membership organization dedicated to identifying effective practices, education, and tools for managing third-party risk. "Big tech – ranging from search engines to social platforms – has, for many people, a major trust issue. Many worry about their data being sold and the possibility of shadow banning, public opinion skewing, or other problems."
"As a government and a society, we are doing nothing to hold 'Big Tech' accountable," warned Jane Grafton, vice president at cybersecurity research firm Gurucul. And it isn't only social media: facial recognition software and other machine-learning technologies are being misused in practice, exhibiting bias against certain demographics and situations. Deepfakes will continue to influence national opinion, and many strategic decisions at Big Tech firms are driven by the profit motive.
In the case of social media, the problem has grown because the platforms are so widely used – yet remain largely unregulated.
"Social media platforms began benignly but have devolved to the point where they give extreme opinions and even lies an outsized audience," Grafton said via email. While taking social media off the political stage might be ideal, that is unlikely to occur. Instead, everyday users of social media platforms must become more conscious of how unsupported statements can supplant reality in their daily conversations. Perhaps then we can hold accountable those who use social media, or other technology, in ways that have the potential to cause harm.
They Are Publishers
According to conservatives, the social networks want Section 230's protections to continue even as they retain the ability to suppress or ban opinions they disfavor. Republican lawmakers believe social media platforms should instead be treated like publishers – and thereby lose those protections.
"Many big tech companies have emerged as today's town criers – roles that have evolved organically and that have also been self-architected," Garrubba said. One problem, he noted, is user PII (personally identifiable information) being sold to the highest bidders, such as marketing firms.
He added that another area of concern is what many consider a "revolving door" between federal government and Big Tech positions. If a time of reckoning comes in legislating big tech, having former colleagues and friends in positions to liaise between big tech entities and those working on public policy could have the effect of softening legislation. This, he said, is also why some of the biggest tech players are diversifying – to ward off potential antitrust legislation.
Unanswered questions remain, such as: Is free and open discussion being muzzled?
Garrubba noted that, by some accounts, even a single comment about an election or public issue could get someone banned from major tech platforms, which can have a chilling effect on public discourse. China is believed to use social scoring to determine people's privileges, travel capabilities, and even employment rights – something some worry could take hold in America and other countries, a prospect they find particularly concerning if the final arbiters are big tech companies.