The chairman of the Lords Democracy and Digital Technologies Committee has said the government’s online harms legislation could be delayed for years.
Lord Puttnam said the Online Harms Bill might not take effect until 2023 or 2024, after a government minister said she could not commit to bringing it before parliament next year.
“I think we laughed,” he said.
The government, however, said the legislation would be introduced “as soon as possible”.
The Online Harms Bill was proposed last year amid a wave of political pressure after the case of 14-year-old Molly Russell, who killed herself after viewing self-harm content online, came to light.
It is seen as a potential tool to hold websites accountable if they fail to deal with harmful online content – but it is still only at the proposal, or “White Paper”, stage.
The Department for Digital, Culture, Media and Sport (DCMS) said the legislation would be ready within this parliamentary session.
But the Lords committee’s report said that DCMS minister Caroline Dinenage would not commit to bringing the bill before parliament before the end of 2021, prompting fears of a long delay.
In her evidence to the committee in May, she had warned that the Covid-19 pandemic had caused delays.
But speaking to the BBC’s Today programme, Lord Puttnam said: “It is over.”
“Here is a piece of legislation that the government has said is very important – and it is – which they have somehow managed to lose.”
The government first raised the idea of online regulation in 2017, followed by the White Paper 18 months later, and a full response is not expected until the end of this year.
Lord Puttnam said a potential 2024 date for it to take effect would be “seven years from conception – two lifetimes in the world of technology”.
Lord Puttnam was speaking after the release of his committee’s latest report on the collapse of confidence in the digital age.
In a statement, the committee said that democracy itself was under threat from a “pandemic” of online misinformation, which could pose an “existential threat” to our way of life.
It said the threat of online misinformation had become even clearer in recent months during the coronavirus pandemic.
Among the report’s 45 recommendations was that the social media regulator – expected to be the current broadcasting regulator, Ofcom – should hold platforms responsible for the content they recommend to large numbers of people, once recommendations pass a certain threshold.
It also recommended that companies that repeatedly fail to comply should be blocked at the internet service provider level and fined up to 4% of their global turnover, and that political advertising should be held to stricter standards.
Ofcom’s new chief executive has warned that heavy fines would be part of its plans if it is appointed regulator.
The DCMS said: “Since the beginning of the pandemic, specialist government units have been working around the clock to identify and rebut false information about the coronavirus.
“We are also working closely with social media platforms to help them remove incorrect claims about the virus that could endanger people’s health.”