LONDON – A United Kingdom parliamentary committee rebuked Facebook Inc. in a new report that calls for regulation and intensified scrutiny of social media companies.
The report urged a compulsory code of ethics for technology companies to deal with harmful or illegal content on their sites. It also called for the creation of an independent regulator with the power to launch legal action, potentially resulting in hefty fines, against companies that breach the code.
Large sections of the report were devoted to criticism of Facebook, which the committee said had intentionally and knowingly violated both privacy and competition laws in how it handled user data and tried to stifle competitors.
The report expands on earlier recommendations from the committee published in July. It follows a months-long inquiry into tech companies and issues of privacy, misinformation and the power of their platforms in the wake of the scandal involving the data-analytics firm Cambridge Analytica’s access to Facebook users’ information.
The committee also published another set of Facebook emails following an earlier trove released by Parliament in December that revealed the company’s tactics in dealing with competitors and monetizing its user data.
Facebook did not immediately respond to a request for comment.
The committee recommended that laws governing privacy, data protection, antitrust and competition be used to rein in the companies.
“The big tech companies must not be allowed to expand exponentially, without constraint or proper regulatory oversight,” it said.
“If companies become monopolies they can be broken up, in whatever sector,” the House of Commons Digital, Culture, Media and Sport Committee said in its final report on disinformation and “fake news.”
It is not clear how many of the panel’s recommendations will be taken up by the British government, but some are likely to be incorporated into UK policy proposals to be put forward in coming months. The UK is scheduled to leave the European Union on March 29, after which it will lose its voice in the more influential EU market of some 500 million people.
“The big tech companies are failing in the duty of care they owe to their users to act against harmful content, and to respect their data-privacy rights,” said Damian Collins, the committee’s chairman.
“Companies like Facebook exercise massive market power which enables them to make money by bullying the smaller technology companies and developers who rely on this platform to reach their customers,” he said.
The report expressed concern at the “porous nature of Facebook data security protocols” and accused the company of continuing to choose profit over data privacy. It also said the company took “highly aggressive action” against certain apps that were competitors.
The committee released on Monday more documents provided to UK lawmakers as part of a lawsuit against Facebook filed by Six4Three LLC, the developer of a now-defunct app. Six4Three sued Facebook in 2015, alleging the social network’s data policies were anticompetitive and favored certain companies over others.
Emails show Facebook as far back as 2011 wrestling with where to draw the line on protecting users. In an internal email that year, a Facebook executive told colleagues he feared the company was routinely erring too far on the side of outside developers who use Facebook, rather than users.
Another executive’s response: “One week everyone is yelling that we’re not protecting users enough. The next week everyone is swooping in and saying we’re being too aggressive.”
In other newly released emails, Facebook employees discussed how to decide whether to bar outside parties from building apps on Facebook’s platform. They created a list of risks to weigh each partner against; one was whether the outside party was a competitor.
Facebook previously addressed the release of its internal emails in December. Chief Executive Mark Zuckerberg at the time said: “Like any organization, we had a lot of internal discussion and people raised different ideas. Ultimately, we decided on a model where we continued to provide the developer platform for free and developers could choose to buy ads if they wanted. This model has worked well.”
The British report in one section focused on “inferred data,” characteristics based not on information shared by users but on an analysis of users’ data profiles.
It noted that under EU rules now enshrined in UK law, such inferred data was not protected in the same way as personal data, and it suggested this was a loophole regulators should address.
It said Zuckerberg had shown “contempt for the UK Parliament” for refusing to appear before the committee.
“The management structure of Facebook is opaque to those outside the business and this seemed to be designed to conceal knowledge of and responsibility for specific decisions,” it said.