Creators for Change all show how public and private stakeholders have come to collaborate at international level to foster constructive and productive conversations and partnerships around complex issues such as hate speech, tolerance and mutual respect. Meanwhile, these initiatives are being replicated across and beyond EU Member States. For instance, in Germany, projects such as LOVE-Storm or NichtEgal have taken very similar angles, aiming for informed, open-minded and tolerant citizenship, particularly among younger generations.
To make one of these examples more specific: for several years, the No Hate Speech Movement has been mobilising young people to combat hate speech and promote human rights online. Launched in 2013, it was rolled out at the national and local levels through national campaigns in 45 countries. The movement continues to be active through the work of its various independent national campaigns, online activists and partners. The initiative provides numerous examples of how public and private stakeholders can work together to prevent or counteract hate and discrimination, while creating and promoting alternative narratives. Despite its primary focus on hate speech, the movement also embraces a broader societal ambition: to preserve human rights and freedom of speech across Europe while nurturing an open and respectful democratic society.
Initiatives such as YouTube’s Creators for Change and its German spin-off NichtEgal show how industry can also play a pro-active role in education and awareness-raising efforts. These kinds of industry efforts often emerge alongside (or in response to) political priority setting. For instance, in 2016, the European Commission agreed with the IT companies Facebook, Microsoft, Twitter and YouTube on a Code of Conduct on countering illegal hate speech online, helping users to report illegal hate speech on social media platforms while improving support for civil society and coordination with national authorities. Meanwhile, in Germany, the government introduced the Network Enforcement Act, which entered into force on 1 January 2018. The so-called “NetzDG” obliges operators of profit-oriented social networks to delete “obviously criminal content” within 24 hours of receiving a complaint. Failure to comply with this requirement may result in fines of up to five million euros for companies.
As part of their corporate social responsibility strategies, companies not only engage in education and awareness-raising efforts, they typically do so in collaboration with other stakeholders. In the German example, YouTube’s initiative NichtEgal is supported by partners ranging from the German Association for Voluntary Self-Regulation of Digital Media Service Providers (FSM) to the national Safer Internet Centre’s awareness initiative klicksafe (which is operated by the Landeszentrale für Medien und Kommunikation (LMK) – the State Media Authority of Rhineland-Palatinate).
These public and private actions and initiatives show how hate speech has triggered complex yet productive debates about the role and responsibilities of social media networks. Yet, while multi-stakeholder approaches to education and awareness raising are undoubtedly important, the social responsibility of social media goes substantially further. As implied earlier, companies also need to invest in a variety of other measures, such as adequate community guidelines, effective complaint mechanisms, informative help centres and appropriate default settings.
More broadly, when designing and administering their platforms and services, platforms need to aim for an online environment in which users are empowered to react to hate speech, while having enjoyable, respectful, and creative experiences. This applies to online content and services targeting all kinds of age groups, but given their vulnerability and their right to be protected, this should be particularly true for children and young people.
A broader way to think about a social media environment which is less prone to hate is to approach it from a Positive Online Content perspective.
Online content and services targeting children and young people offer a variety of opportunities to support them in developing skills that will help them become competent and responsible digital citizens. Looking at the overall picture, competences and skills such as critical thinking, participation in society, social and emotional understanding, building and maintaining positive relationships, and respect for their own identity and that of others are not acquired at a single point in life. Rather, they need to be developed, trained and fostered over a longer period of time, preferably from a young age onwards. Growing up in online environments which support these competences will help today’s children and young people become the competent digital citizens of tomorrow.
In terms of the requirements and demands this places on social media providers, this very much resonates with the more general Positive Online Content perspective:
“Positive online content is digital content aimed at children, which enables them to learn, have fun, create, enjoy, develop a positive view of themselves and respect for their identity, enhance their participation in society and produce and distribute their own positive content.”
Ensuring that children have access to high-quality positive online experiences from an early age can assist and empower them to become active and participatory citizens.
Within this context, a number of Positive Online Content checklists have been developed, for instance as part of the EC-funded POSCON (Positive Online Content and Services for Children in Europe) thematic network as well as the ongoing work implementing the EC Better Internet for Kids strategy.
Content providers can use these checklists when developing new content and services to ensure that their products are fit for purpose, and to take measures ensuring that children and young people can go online free from risk of harm, whether in terms of content, contact, conduct, or commercial considerations. Parents, carers and educators can also benefit from such a tool by becoming better aware of the features they should look out for when choosing online experiences for younger children.
Important criteria for positive online content, services and apps include the following (a revised checklist with a special focus on the SELMA perspective):
The complete list and further information about Positive Online Content can be found at https://www.betterinternetforkids.eu/web/positiveonlinecontent/home.