This is a UK government White Paper setting out plans for a major reform of the obligations of various online services with regard to illegal content and user safety. It is open for consultation until 1 July 2019.
The core of the new proposals is a novel statutory duty of care, requiring companies to tackle illegal content in an adequate and efficient manner and to ensure the safety of their users. This duty is to be placed on a wide category of entities - “companies that allow users to share or discover user-generated content or interact with each other online”. The exact content of the duty is not yet specified; it is to be fleshed out through a series of corresponding codes of practice. For now, the possible obligations include: operating specific notice & takedown procedures, with corresponding appeal procedures; ensuring the visibility of certain terms & conditions; and producing annual transparency reports covering the amount and variety of illegal content passing through the service, as well as the steps taken to counter its presence. Unless a company can show that, for example, its own way of dealing with illegal content fulfils the new duty of care better than the relevant code(s) of practice, it has to follow those code(s).
The shape of the codes of practice, as well as their implementation, is to be developed and overseen by a new, independent regulatory body. It is not yet certain whether this regulator will be a wholly new body or an existing one adapted for the purpose. To ensure the platforms' compliance with the new duty of care, the regulator is likely to have at its disposal measures such as fines, senior management liability, and the blocking of non-compliant services. In terms of funding, the plan is for the regulator to be funded by the industry in the medium term, with a potential transition to a longer-term system of fees, charges, or a levy.