In 2023, the German Association for Voluntary Self-Regulation of Digital Media Service Providers (FSM e. V.) once again used its legal and media literacy expertise to promote effective online youth protection. Together with our members and partner organisations, we are continuously working to improve the Internet for the younger generation.

With over 30,000 reports received and processed, the FSM-Hotline reached a new record in 2023. While we are pleased with the increased public awareness of our Hotline, we are particularly concerned about the enormous increase in Child Sexual Abuse Material.

With its established media literacy projects (Elternguide.online, Media goes to School and weitklick), the FSM offered parents and educational professionals a wide range of support services in 2023.

As a reliable partner in the field of online youth protection, the FSM participated in more than 124 events, lectures, training courses and specialist exchanges. Our commitment at the international level was further strengthened: the FSM is now represented as an observer organisation in the Global Online Safety Regulators Network and as a member of the EU Commission’s Special Group on the EU Code of Conduct on Age-Appropriate Design.

In 2023, we also paid particular attention to two focal topics: the Digital Services Act and the nexus of disinformation and hate speech.

Read on for an overview of our 2023 highlights and further insights into our areas of work.

“Looking at future European regulatory projects, we see our many years of experience in the system of regulated self-regulation as an important starting point for actively promoting effective online youth safety.”

Martin Drechsler, FSM Managing Director


Focal Topics 2023

European online safety comes together with the DSA

Creating harmonised and comprehensive regulation of digital services throughout Europe – that is the declared aim of the Digital Services Act (DSA). A milestone for media regulation across national borders – also with a focus on the protection of children and teens.

Since summer 2023, very large online platforms (VLOPs) and very large online search engines (VLOSEs) have already been obliged to comply with the DSA. Since February 2024, the DSA has been fully effective across Europe. But there is still a lot to clarify between the EU Commission and service providers. Meanwhile, the implementation of the DSA in Germany through the “Digitale Dienste Gesetz” (DDG) is slowly taking shape.

As the German Protection of Young Persons Act (JuSchG) and the Interstate Treaty on the Protection of Minors in the Media (JMStV) already provide a high standard of protection for children and young people, many elements of the DSA are already integrated into the German youth media protection landscape. Moreover, the FSM, as a self-regulatory organisation, has been providing sustainable youth protection for many years, responding to new challenges and helping to empower young people to use media safely and confidently. The FSM advises its member companies from a legal and media literacy perspective, for example in the area of precautionary measures. In addition, FSM member companies that are not fully subject to German legislation undertake to comply with German youth media protection law in many areas by recognising the FSM codes of conduct.

In addition to harmonising youth media protection law, the DSA also caused the German Network Enforcement Act (NetzDG) to lose almost all of its effectiveness. As the mechanisms of regulated self-regulation within the NetzDG were unfortunately not carried over into the DSA, the FSM discontinued its work as a NetzDG self-regulator at the end of June 2023 (more on this below).

All in all, the Digital Services Act marks a significant step for the protection of minors from harmful online experiences across European borders. The implementation of the DSA and the cooperation between the EU Commission, national authorities, service providers and other stakeholders – such as the self-regulatory organisations – will be crucial to achieving these goals and further strengthening the protection of minors in Europe.

Focal Topics 2023

Fighting Hate Speech and Disinformation

Hate speech and disinformation are not new phenomena, but they are reaching new dimensions – especially in times of crisis and during election campaigns. The two can reinforce each other: the spread of disinformation can lead to more hate speech, while hate speech narratives in turn utilise and spread disinformation.

Young people in Germany are increasingly confronted with disinformation and hate speech, according to the latest edition of the JIM study by Medienpädagogischer Forschungsverbund Südwest. Exposure to disinformation and hate speech online not only restricts discussion spaces and can disorient young people in terms of social ethics; it can also make it more difficult to fulfil important developmental tasks on the way to becoming responsible and socially competent individuals in our democratic, pluralistic society.

Media literacy and education form the basis for identifying and dealing with disinformation and hate speech online. With the weitklick project, the FSM has been dedicated to the topic of online disinformation in media education for many years.

In addition to media education and training, an effective legal framework to remove hate speech from the Internet, a functioning supervisory and regulatory system to enforce provider obligations and consistent criminal prosecution are particularly important. In this respect, the FSM is looking forward to the implementation of the Digital Services Act and the discussions on a law against digital violence in Germany.

The FSM-Hotline makes an important contribution to the rapid removal of online Child Sexual Abuse Material.

FSM-Hotline

Reports of online content at an all-time high

Since 1997 the FSM-Hotline has been a reliable point of contact for Internet users. In its function as a hotline, the FSM is recognised by all relevant platforms as a trustworthy source of reports. The FSM’s legal team examines each report individually and ensures that illegal content is removed from the Internet as quickly as possible.

In 2023, the FSM-Hotline received a total of 30,573 reports about illegal or youth-endangering online content – more than double the number of the previous year (2022: 12,956 reports) and a new record high (see figure below). Following comprehensive case-by-case examination by our Hotline staff, the content in question was found to violate German youth media protection laws in 74 percent of cases (22,739 substantiated reports).

Child Sexual Abuse Material (CSAM) accounted for the largest proportion of substantiated reports at 57 percent (2022: 37 percent). In absolute numbers, this is a new high in reported CSAM. Pornographic content accounted for the second-largest share of substantiated reports at 39 percent. In addition, there were small proportions of reports concerning depictions of extreme violence (3 %), other content classified as harmful to minors (1 %) and hate crime (1 %).

Our ability to take swift and effective action against illegal online content is underpinned by numerous long-standing collaborations with other hotlines, youth media protection organisations and authorities in Germany and around the world. As part of the Digital Europe Programme, the FSM-Hotline is also co-financed by the European Union.

Development of the number of reports (chart: total reports and Child Sexual Abuse Material (CSAM) reports)

2023 Hotline Overview

  • 30,573 reports in total (2022: 12,956 reports)
  • 74 percent of reports flagged content violating German youth media protection laws (2022: 68 %)
  • 57 percent of substantiated reports identified CSAM (2022: 37 %)
  • Reported CSAM hosted in Germany was removed within 1.2 days on average (2022: 1.5 days)

Co-funded by the European Union

Within the framework of the Safer Internet DE network, the FSM-Hotline is co-financed by the European Union through the Digital Europe Programme.

NetzDG

Farewell to self-regulation according to NetzDG

Under the German Network Enforcement Act (NetzDG), platform operators have been obliged to swiftly process complaints about illegal content since 2017. With the implementation of the Digital Services Act, most of the NetzDG is now obsolete. The FSM therefore discontinued its work in this area at the end of June 2023.

Four years ago, the FSM set up self-regulation mechanisms for NetzDG cases with a short lead time. Here, it benefited from its many years of activity as an established and recognised self-regulatory body for youth media protection. In January 2020, the FSM was accordingly recognised by the Federal Office of Justice (BfJ) as the first and only institution of regulated self-regulation under the NetzDG.

In this role, the FSM was able to serve as an important interface between companies, authorities and politics. Platform providers used the FSM’s review for cases that were particularly difficult to assess. From March 2020 to the end of June 2023, the FSM review panel examined a total of 230 cases from social networks for illegality. Content that was assessed as illegal by the NetzDG review committees (86 cases in total) was immediately removed by the providers.

Quality features of regulated self-regulation under the NetzDG
  1. The involvement of independent experts is an important addition to the internal work of platform operators.
  2. Mechanisms of regulated self-regulation create a valuable orientation framework for cases that are difficult to assess legally.
  3. The timeliness of the reviewed cases and the transparent publication of the decisions provide insights into new phenomena of hate speech and their legal assessment.
  4. External review of content can help to avoid overblocking.
  5. Established review mechanisms enable quick remedies for those affected.
  6. Privilege effect gives platforms legal certainty when participating in regulated self-regulation.

With the Digital Services Act, legislators have so far refrained from implementing options for state-recognised regulated self-regulation. However, future legislation could apply the system of regulated self-regulation beyond the area of illegal hate speech, so that other content areas regulated by the DSA could also be taken into account.


“The work of the FSM and the concept of regulated self-regulation have proven their worth in the fight against online hate speech. The wealth of experience gained should lead the way for future mechanisms of content regulation.”

Martin Drechsler, FSM Managing Director


Media Literacy

Enabling competent media use from a young age

Media literacy is a key requirement for the protection of children and young people in the digital space. With its diverse media education offerings, the FSM supports parents, teachers and educational professionals in particular. The aim is to provide them with the necessary information and tools to support children and young people in dealing with online media. The most relevant media education topics are presented in an up-to-date and practical way for specific target groups.

The focus of the FSM’s media education work with projects such as Elternguide.online, Media goes to School and weitklick is to encourage young people to actively participate in the digital society and at the same time develop the skills to deal with potential risks. Families, schools and extracurricular educational programmes have a special responsibility to empower young people through media education and media literacy – our aim is to support them in this endeavour. In order to make information on media education available to a broader audience, the more than 500 articles on Elternguide.online have also been available in English since September 2023.

In 2023, the FSM’s media education team focused in particular on low-threshold approaches to vulnerable groups, including offering information in Leichte Sprache (easy read) and new formats for media education work with parents. Moreover, the team addressed themes such as cyberbullying and the interaction between disinformation and hate speech (see focal topics). The FSM is also active in a wide range of specialist media education committees, round tables, panel discussions and training courses.

“With the support of the FSM, companies can develop customised youth protection solutions that are tailored to the respective usage realities.”

Gabriele Schmeichel, FSM President


Online Youth Protection

Ensuring reliable measures of online youth protection

Who is responsible for ensuring that children and young people can discover online media without encountering risks at the content or interaction level? Parents have a great responsibility to help their children grow up in the digital world through media education. In addition, legal requirements and restrictions apply to providers of digital media content in order to ensure the protection of young people. This is where the FSM, as an important and reliable partner, supports service providers in their special responsibility.

The FSM has been an officially recognised voluntary self-regulation association since 2005, making it an integral part of the system of regulated self-regulation that operates in Germany. Through this system, the state provides a legal framework within which recognised self-regulatory bodies, such as the FSM, can act independently regarding content relevant to the protection of minors and can exercise a control function vis-à-vis their members.

The FSM advises providers on suitable protective measures and options. In the case of online media, there are various – particularly technical – options available for regulating access to certain content. The FSM works closely with its member companies, including by reviewing new functions and services, which ensures that the protection of minors is already taken into account during the design phase.

In order to ensure that the technical means used fulfil the legal requirements, the FSM offers its member companies the service of submitting their tools to a commission of experts. A specially assembled review committee, consisting of three experts from science and practice, evaluates the submitted programme in detail. In 2023, the FSM recognised several youth protection programmes in accordance with the JMStV. An overview of all recognised youth protection programmes and the seals of approval awarded can be found on the FSM website.

About FSM

The German Association for Voluntary Self-Regulation of Digital Media Service Providers (FSM e. V.) is a self-regulatory body recognised by the Commission for the Protection of Minors in the Media (KJM) for the telemedia sector. It supervises and advises a large number of companies in the telecommunications and online sector. Since 1997, the organisation has been working to ensure that children and young people can grow up with a safer and better Internet – in particular by combating illegal, youth-endangering and developmentally harmful content in online media.

To this end, the FSM operates a hotline that anyone can contact to report online content. The FSM-Hotline is co-financed by the European Union under the umbrella of Saferinternet.de. The FSM’s other tasks also include extensive educational work and the promotion of media literacy among children, young people and adults.

www.fsm.de/en

Interested in an FSM membership?

We are happy to offer companies a brief, non-binding advance consultation in order to discuss their needs and what form their membership might take.