Instagram is releasing a feature that will let users easily reset their algorithms, as the government strengthens its regulation of online safety.
With the new reset feature, users can clear their recommended content from Explore, Reels and their feed, potentially reducing the amount of harmful content they are exposed to.
It's all part of Meta's push to make the app safer for young people, after the company announced more private Teen accounts in September.
The feature, which will soon be rolled out globally, was announced as the government outlined its priorities for online safety.
Peter Kyle, Labour's technology secretary, said Ofcom should ensure the concept of "safety by design" is being followed by tech companies from the outset.
That would ensure more harm is caught before it occurs.
He also pushed for more transparency from tech giants on what harms are occurring on their platforms.
"From baking safety into social media sites from the outset, to increasing platform transparency, these priorities will allow us to monitor progress, collate evidence, innovate, and act where laws are coming up short," Mr Kyle said.
While the announcement was welcomed by child protection groups, some cautioned that the government needed to go further.
Ian Russell, chair of trustees at Molly Rose Foundation, said: "This announcement outlines a much needed course correction, vital for improved online safety, and to prevent the new regulation falling badly short of expectations.
"However, while this lays down an important marker for Ofcom to be bolder, it is also abundantly clear that we need a new Online Safety Act to strengthen current structural deficiencies and focus minds on the importance of harm reduction.
Meanwhile, the NSPCC has urged social media platforms to be more transparent and proactive about child safety.
"They should be disrupting 'safe havens' for offenders by tackling the hidden abuse taking place through private messaging," said Maria Neophytou, director of strategy and knowledge at the NSPCC.
"It is right that the government is focusing on driving innovation and new technology that can identify and disrupt abuse and prevent harm from happening in the first place.
"The regulatory framework has the potential to change the online world for children."