From online hate speech to positive content and services

To conclude, we provide some further background regarding the responsibility social media providers have in the fight against online hate, and elaborate on why and how positive content and services might provide an appropriate answer.

If tailored to the needs of children and young people, positive online content and services can help them flourish. Such content enables them to learn, have fun, create, enjoy, and develop a positive view of themselves and others, while enhancing their participation in society – helping them to become active and responsible European citizens, both digitally and politically!

The social responsibility of social media

Online hate has become a widespread societal issue – any attempt to fight it has to be part of a broader, holistic approach. This is no longer a problem to be addressed by governments or civil society alone. Ongoing societal and political debates have made it clear that social media platforms have a crucial responsibility to ensure a safe and positive online environment for their users, above all for children and young people.

Initiatives such as the Council of Europe’s No Hate Speech Movement, the United Nations’ International Day for Tolerance, or YouTube’s Creators for Change all show how public and private stakeholders have come to collaborate at international level to foster constructive and productive conversations and partnerships around complex issues such as hate speech, tolerance and mutual respect. Meanwhile, these initiatives are being translated across and beyond EU Member States. For instance, in Germany, projects such as LOVE-Storm or NichtEgal have taken very similar angles, aiming for informed, open-minded and tolerant citizenship, particularly among younger generations.

To take one of these examples in more detail: for several years, the No Hate Speech Movement has been mobilising young people to combat hate speech and promote human rights online. Launched in 2013, it was rolled out at national and local levels through national campaigns in 45 countries. The movement continues to be active through the work of the various independent national campaigns, online activists and partners. The initiative provides numerous examples of how public and private stakeholders can work together to prevent or counteract hate and discrimination, while creating and promoting alternative narratives. Beyond its primary focus on hate speech, the movement also embraces a broader societal ambition: to preserve human rights and freedom of speech across Europe while nurturing an open and respectful democratic society.

Initiatives such as YouTube’s Creators for Change and its German spin-off NichtEgal show how industry can also play a proactive role in education and awareness-raising efforts. These kinds of industry efforts often emerge alongside (or in response to) political priority setting. For instance, in 2016, the European Commission agreed a Code of Conduct on countering illegal hate speech online with the IT companies Facebook, Microsoft, Twitter and YouTube, helping users to notify platforms of illegal hate speech while improving support for civil society and coordination with national authorities. Meanwhile, in Germany, the government introduced the Network Enforcement Act, which entered into force on 1 January 2018. The so-called “NetzDG” obliges operators of profit-oriented social networks to delete “obviously criminal content” within 24 hours of receiving a complaint. Failure to comply with this requirement can result in fines of up to five million euros for companies.

As part of their corporate social responsibility strategies, companies not only engage in education and awareness-raising efforts, they typically do so in collaboration with other stakeholders. In the German example, YouTube’s initiative NichtEgal is supported by partners ranging from the German Association for Voluntary Self-Regulation of Digital Media Service Providers (FSM) to the national Safer Internet Centre’s awareness initiative klicksafe (which is operated by the Landeszentrale für Medien und Kommunikation (LMK) – the State Media Authority of Rhineland-Palatinate).

These public and private actions and initiatives show how hate speech has triggered complex yet productive debates about the role and responsibilities of social media networks. Yet, while multi-stakeholder approaches to education and awareness raising are undoubtedly important, the social responsibility of social media goes substantially further. As implied earlier, companies also need to invest in a variety of other measures, such as adequate community guidelines, effective complaint mechanisms, informative help centres and appropriate default settings.

More broadly, when designing and administering their platforms and services, companies need to aim for an online environment in which users are empowered to react to hate speech while having enjoyable, respectful and creative experiences. This applies to online content and services targeting all age groups, but given their vulnerability and their right to be protected, it should be particularly true for children and young people.

Positive Online Content: an early start to digital citizenship

One way to look at a social media environment which is less prone to hate is to approach it from a broader Positive Online Content perspective.

Online content and services targeting children and young people offer a variety of opportunities to support them in developing skills that will help them become competent and responsible digital citizens. Looking at the overall picture, competences and skills such as critical thinking, participation in society, social and emotional understanding, building and maintaining positive relationships, and respect for their own identity and for others are not acquired at a single point in life. Rather, they need to be developed, trained and fostered over a longer period of time, preferably from a young age onwards. Growing up in online environments which support these competences will help today’s children and young people to become the competent digital citizens of tomorrow.

In terms of the requirements and demands this puts on social media providers, this resonates strongly with the established definition of positive online content:

“Positive online content is digital content aimed at children, which enables them to learn, have fun, create, enjoy, develop a positive view of themselves and respect for their identity, enhance their participation in society and produce and distribute their own positive content.”

Ensuring that children have access to high-quality positive online experiences from an early age can assist and empower them to become active and participatory citizens.

Within this context, a number of Positive Online Content checklists have been developed, for instance as part of the EC-funded POSCON (Positive Online Content and Services for Children in Europe) thematic network, as well as through the ongoing work of implementing the EC Better Internet for Kids strategy.

Content providers can use these checklists when developing new content and services to ensure that their products are fit for purpose, and to take measures ensuring that children and young people can go online free from risk of harm, whether in terms of content, contact, conduct or commercial considerations. Parents, carers and educators can also benefit from these checklists by becoming better aware of the features they should look out for when choosing online experiences for younger children.

Important criteria for positive online content, services and apps include the following (a revised checklist with a special focus on the SELMA perspective):

Consider the basics:

  • Decide on the target age range.
  • Define objectives.
  • Plan the benefits to the child.

Provide clear and transparent objectives:

  • Keep the intended age range foremost in mind for the content design and interface.
  • Refer to recognised child development abilities appropriate to that age (cognitive, linguistic, social and emotional norms), while keeping in mind that every child is an individual and that children can differ enormously.
  • Consider the socio-cultural context.

Develop stimulating digital experiences:

  • Plan and design creative, interactive, stimulating, innovative, entertaining and/or educational elements and features.

Develop content and services that are accessible and inclusive:

  • Ensure that the needs and requirements of people with disabilities (regarding vision, hearing, mobility or cognition) are considered by all participating parties when planning, developing and producing the content/service.
  • Consider the structure, language/text/speech, sound, images and colours of content and services. Ensure they are designed to support assistive technologies and to provide alternative texts/attributes (a minimal sketch follows this list).
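
To make the alternative-text point concrete for developers, the following is a minimal sketch in TypeScript. It is purely illustrative: the interface name and fields are assumptions made for this example, not part of the checklist or of any real API. The idea is simply that a content model can make text alternatives mandatory rather than optional, so that assistive technologies always have something to read.

    // Illustrative content model: the altText field is required,
    // so an image without a text alternative will not type-check.
    interface ImageAsset {
      src: string;      // where the image lives
      altText: string;  // description read out by screen readers
    }

    // Compiles: the text alternative is present.
    const logo: ImageAsset = {
      src: "/img/logo.png",
      altText: "SELMA project logo",
    };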

Ensure content is reliable:

  • Consider how the content/service/app will comply with the relevant legislation or regulations, for example regarding the protection of minors, data protection, commercial communication, copyright, and so on.
  • Provide information about the provider/creator.
  • Ensure that contact details are easy to find, and respond to enquiries within a reasonable amount of time.
  • Ensure that content is accurate and reliable, and that it is maintained and reviewed regularly.
  • Where relevant, ensure that the offered content is true, up to date and topical.
  • Offer clear information for parents and caretakers.

Ensure safety is a priority for the child:

  • Ensure that the content is not harmful to minors: it should not contain offensive material or other harmful elements (for example, pornographic, racist, violent or xenophobic content, pictures or videos).
  • Develop an effective monitoring and moderation methodology, and provide reporting mechanisms which are easy to find and use, allowing users to contact help/advice services or to report and block potentially harmful content or contact (see the sketch after this list).
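
As a purely illustrative aid for developers, the sketch below shows one possible shape of such a report-and-review flow, written in TypeScript. All names and types are assumptions made for the example; the checklist does not prescribe any particular implementation. The design choice shown is to hide reported content immediately and let a human moderator decide, erring on the side of caution.

    // Illustrative report-and-review flow: a report hides the content
    // immediately and queues it for review by a human moderator.
    type ReviewState = "visible" | "hiddenPendingReview" | "removed";

    interface ContentItem {
      id: string;
      state: ReviewState;
    }

    const moderationQueue: ContentItem[] = [];

    // Called when a child uses the (easy to find) report button.
    function reportContent(item: ContentItem): void {
      item.state = "hiddenPendingReview"; // err on the side of caution
      moderationQueue.push(item);         // moderators work through this queue
    }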

Keep in mind that the privacy of children and young people is paramount:

  • Ensure that privacy laws are respected.
  • Be careful not to gather more data than necessary.
  • Provide information about privacy measures, and ensure that policies are clearly visible and in a language or format suitable for the target group and their parents.
  • If user data is processed or authorisation is needed while installing or using the app/site/service, this should be made transparent.
  • Ensure that personal data is treated confidentially; make any exceptions (for example, for the purpose of delivering a prize) transparent, and state clearly that the data is deleted afterwards.
  • If children can share their personal data, ensure that they have to actively confirm parental consent (illustrated in the sketch after this list).
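
For developers, the following TypeScript sketch illustrates two of the points above: gathering no more data than necessary, and requiring an active confirmation of parental consent. Every name, field and rule in it is an assumption made for illustration; real consent requirements depend on the applicable law (for example, the GDPR, where the age of consent varies by Member State) and must be checked case by case.

    // Illustrative registration flow. Data minimisation is built into the
    // type: there are no fields for real name, address, phone or e-mail.
    interface RegistrationRequest {
      nickname: string;                  // a pseudonym, not a real name
      ageBracket: "under16" | "16plus";  // coarse bracket, not a birth date
      parentalConsentConfirmed: boolean; // actively ticked, never pre-checked
    }

    function register(req: RegistrationRequest): { ok: boolean; reason?: string } {
      // Younger users may only proceed once parental consent is actively confirmed.
      if (req.ageBracket === "under16" && !req.parentalConsentConfirmed) {
        return { ok: false, reason: "Parental consent must be actively confirmed." };
      }
      return { ok: true };
    }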

Develop social media elements and communication features sensitively (social networks, chat rooms, forums, guest books, video/picture-sharing platforms, messengers, and so on):

  • If such features are offered, provide specific rules and safety information on how to use them safely (for example, guidance on netiquette when communicating with others, protection of personal data, protection against cyberbullying, and so on).
  • Encourage children to ask for parental consent before they create a user account and consider how this can be realised during the registration process.
  • Ensure that reporting mechanisms which are easy to find and use are provided (for example, an alarm button for when children need help or advice, or need to report potentially harmful content or contact).
  • Ensure constant and active monitoring and moderation of user contributions, so that any content which may be harmful to children is deleted.

If relevant, develop the commercial elements in a responsible manner: advertising, sponsoring, online shopping, in-app purchases, and so on:

  • Ensure that regulations and laws regarding advertising and commercial elements addressing children are respected.
  • Ensure that commercial elements, advertising and online shopping facilities are clearly set apart from the content, easily recognisable, labelled as such, and age-appropriate for children (for example, no advertising or shopping facilities for alcohol, cigarettes, plastic surgery, diet products, and so on).
  • Ensure that the commercial proposition is openly communicated.
  • Consider ensuring that apps offer no possibility to spend money unless it is in a protected area and it is clear to the target group that they should ask an adult for permission. Do not pressure children to buy additional features.
  • Ensure that the commercial elements do not restrict the user's control over their actions.
  • Ensure that there is a financial limit to what children can spend on the website/app/service.
  • Ensure that the payment methods require parental control (both points are illustrated in the sketch after this list).
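
To show how the last two points might translate into practice, here is one more illustrative TypeScript sketch. The limit value, the names and the approval mechanism (for example, a parent's PIN) are all assumptions made for the purpose of the example, not requirements of the checklist.

    // Illustrative purchase gate: every transaction needs parental approval,
    // and spending is capped per month regardless of approval.
    const MONTHLY_SPENDING_LIMIT_EUR = 10; // assumed cap; set by service policy

    interface PurchaseAttempt {
      priceEur: number;
      spentThisMonthEur: number; // tracked per child account
      parentApproved: boolean;   // for example, confirmed via a parent's PIN
    }

    function mayPurchase(p: PurchaseAttempt): { allowed: boolean; reason?: string } {
      if (!p.parentApproved) {
        return { allowed: false, reason: "A parent must approve this purchase." };
      }
      if (p.spentThisMonthEur + p.priceEur > MONTHLY_SPENDING_LIMIT_EUR) {
        return { allowed: false, reason: "The monthly spending limit would be exceeded." };
      }
      return { allowed: true };
    }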

The complete list and further information about Positive Online Content can be found at https://www.betterinternetforkids.eu/web/positiveonlinecontent/home.