Following the testimony of former Facebook Inc. employee and whistleblower Frances Haugen, most of the public said in a recent Morning Consult survey that the company is not doing enough to protect user safety, and they support congressional and corporate actions to remedy the situation.
Sixty percent of American adults said Facebook isn’t doing enough to protect users, compared to 19 percent who said the social media company is doing enough.
Public on board with stricter content controls
Among the problems Facebook has faced in recent weeks are reports that the company’s platforms, such as Instagram, have had a negative effect on the mental health of young users, as well as claims that the company has allowed misinformation on topics such as COVID-19 vaccines to proliferate on its sites. The poll suggests the public would support steps Facebook could take to address some of these concerns.
One of the most popular actions for Facebook would be to implement stricter rules and standards banning content that encourages disinformation, hate speech, illicit activity or acts of violence, which collectively received the support of 69 percent of respondents. Raising the age of eligibility for an account on the company’s platforms received an equal share of support.
Despite public support for raising the barrier to entry, some experts are not convinced. Sean Blair, assistant professor of marketing at Georgetown University’s McDonough School of Business, said such restrictions “just don’t work.”
“People will always find ways to get around them, so the problem won’t just go away,” Blair said. “Now that doesn’t mean we shouldn’t have any barriers or age requirements at all, but it does mean that we probably shouldn’t be relying on them to solve the problem. Ultimately, I think everyone – businesses, users, parents, kids, regulators – will have to play a role in the process.”
Most support expanding Facebook’s ability to censor and remove certain types of content, as well as a move to have the News Feed display content in chronological order rather than using algorithms to personalize what each user sees.
Majority support for congressional action
The public is also in favor of congressional intervention and increased regulation of the social media giant.
The most popular suggestion was to step up the protection of children on social media platforms, which received 77 percent support. This follows Facebook’s announcement that it would suspend its Instagram Kids initiative, a move supported by 52 percent of American adults in a Morning Consult poll conducted shortly after the announcement.
A plan for Congress to create an independent government body, made up of former tech industry workers, that would investigate Facebook’s use of algorithms and the risk they pose to the public has garnered strong support, as have regulations that demand greater transparency around algorithms and how social media companies use them.
And 64 percent said social media companies should be held at least somewhat responsible for the actions of their users, which a group of House Democrats is seeking to do with a new bill that would hold the platforms liable if the personalized recommendations made through their content algorithms promote harmful content that causes emotional or physical harm.
This would pose a major challenge to corporate liability protections under Section 230 of the Communications Decency Act, although some tech groups have said it is not the best way forward.
“Instead of blaming the algorithm, Congress should work with platforms to develop best practices for quickly identifying and removing harmful content and giving users the skills and tools they need to stay safe online,” Daniel Castro, vice president of the Information Technology and Innovation Foundation, said in a statement.
Adam Kovacevich, chief executive of the tech policy group Chamber of Progress, warned in a statement that the bill “exacerbates the problem” of harmful content. “By prohibiting companies from using personal data to recommend relevant content to users, platforms could be forced to rely more on measures such as viral engagement, which leads to the spread of bad content,” he said.
What else could be done
Others have suggested a different approach. A group of more than 40 human rights organizations called for a Federal Data Protection Act to make it illegal for social media companies to collect data and use it for their personalized recommendation algorithms. The groups said the law should be “strong enough to end Facebook’s current business model.”
A law that would require Facebook to publicly disclose its internal research also received strong support, with 68 percent saying they were in favor of such a move. Much of the recent controversy surrounding the company stems from the Wall Street Journal’s “Facebook Files” series, which published a wealth of internal documents showing how the company downplayed various negative aspects of its platforms, including Instagram’s impact on children’s mental health.
The public appears to be in favor of regulating social media companies in general, as 52 percent said they supported such a move by lawmakers. And 43 percent said Facebook was not regulated enough, compared with 19 percent who said it had the right amount of oversight and 17 percent who said it had too much.
A Facebook spokesperson declined to comment on the findings and referred to an opinion piece by Nick Clegg, the company’s vice president of global affairs, in which he called for new internet regulations, including section 230 reform.