
Tech companies must start putting in place measures to protect users in Britain from child sexual abuse images and other illegal content from Monday, as enforcement of the country's online safety regime ramps up.
Media regulator Ofcom said Meta's (META.O) Facebook, ByteDance's TikTok, Alphabet's (GOOGL.O) YouTube and other companies must now implement measures such as better moderation, easier reporting and built-in safety tests to tackle criminal activity and make their platforms safer.