Give UK regulator more teeth on online scams – PIMFA

The UK’s proposed Online Safety Bill, which is designed to remove fraudulent and harmful content from social media and the wider internet, has proved controversial.

A wealth management industry group wants the UK regulator – the Financial Conduct Authority – to be given the power to order the removal of fraudulent content from the internet under the new Online Safety Bill.

The Personal Investment Management & Financial Advice Association, or PIMFA, wants the FCA to be able to direct Ofcom to remove such material. It set out its position in a statement yesterday.

The bill, which aims to protect users of online services, has previously been criticized as paving the way for censorship. UK wealth manager Quilter said it should be used to remove financial scams from websites.

PIMFA made its recommendations on how the law should work in testimony before MPs considering the Online Safety Bill.

Tim Fassam, director of government relations and policy at PIMFA, is urging an amendment to the bill that would see partner regulators such as the FCA provide strategic support to Ofcom to prevent financial services-related harm to consumers.

While the bill deals very specifically with fraud and breaches of the Financial Services and Markets Act, PIMFA said it is unclear how Ofcom will ensure it has the necessary expertise to identify violations.

Fassam referred to the case of London Capital & Finance, where the regulated firm was able to introduce harm into the market by selling unregulated, speculative mini-bonds offering high yields in a low-interest-rate environment, aided in particular by advertising. If the FCA were able to act quickly through Ofcom to block adverts of this nature, it could significantly reduce the risk of potential harm to consumers, PIMFA said.

Fassam said PIMFA also supports an amendment to the bill to ensure search engines have the same duty of care as social media websites to weed out fraudulent ads on their platforms.

According to the government’s website, “The Bill introduces new rules for companies that host user-generated content, i.e. those that allow users to post their own content online or interact with each other, and for search engines, which will have tailored obligations focused on minimizing the presentation of harmful search results to users. Platforms that fail to protect people will have to answer to the regulator and could be fined up to 10% of their revenue or, in the most serious cases, be blocked.”

Civil liberties activists argue that the law, while designed to protect the public, creates a risk of censorship because the definition of “harm” is, in some cases, ambiguous.

As previously reported, Matthew Lesh, head of public policy at the Institute of Economic Affairs, a UK think tank, said the bill was dangerous.

“Companies will be required to remove anything that might be illegal, from ‘hate speech’ to emotionally distressing content – under threat of multi-billion pound fines. This will allow easily offended and malicious actors to lobby for the withdrawal of [free] speech,” he said.