The SEL activities helped the learners self-identify the key qualities required for different roles to effectively campaign and engage with online stakeholders. They also considered the importance of empathy in these interactions, recognising that online services are not just run by technology but also by people, and that suitable engagement with those people is key to having your voice heard and taken seriously.
The Media analysis unit encouraged learners to consider different types of campaign message and the best platforms online for sharing them. Recognising the features of online platforms and their audiences is key to getting a message heard, be it by platform users, service providers or individuals in that online community with significant power (e.g. politicians, leaders and changemakers).
This Media production unit will enable learners to understand the community standards and guidelines that social networks and other online spaces put in place to help users know what is acceptable and unacceptable behaviour. Learners will also consider the purpose of reporting tools, how they could be improved to help combat hate speech, and how they can educate their community (online and offline) about the successful use of these tools.
These questions are provided as examples to initiate and guide discussions around the topic in this focus area.
- What action would you take if you saw hateful content online?
- When would you report it?
- Who would you report the content to?
- Does the platform/type of content affect how you report it?
- Do you think online reporting tools are effective?
- How do you think reporting mechanisms could be improved online?
- Do you think your peers/others in the community know how to report online?
- What could be done to help others use reporting tools successfully?
- Who should be responsible for deciding if hateful/harmful content gets removed?
The SELMA project short definition of hate speech is:
“Any online content targeting someone based on protected characteristics with the intent or likely effect of inciting, spreading or promoting hatred or other forms of discrimination.”
This activity will benefit from access to technology.
Please note: This activity is more suited to learners aged 15 years old and over owing to the language and description of acts in the Facebook community standards. Please review the standards and form your own professional judgement on whether or not these are suitable to share with your group.
The activity provides learners with the opportunity to understand the community standards that exist in online spaces and consider how reporting tools could be developed/improved.
Explain to the group that they have been tasked with creating a more successful reporting process for their online community.
Explain that, before they start creating their own process, it is useful to understand the existing processes used by successful social media services. Facebook can be used as a good example of a social network that has a well-developed set of reporting tools and guidelines.
Provide copies or display the Facebook reporting processes resource.
Allow time for the group to review the content, look at the community standards and, if time allows, read the blog post. You could also choose to split the group into smaller groups and have each group look at one aspect of the resource.
Discuss the challenges Facebook encounter. Ask:
- Do Facebook get it right all the time?
- Is it right that a private company should decide what is/is not hate speech?
- Should Facebook be doing more? If so, what should they do? (This article by Mark Zuckerberg goes into great detail about the steps Facebook have taken and will be taking in the future. Note: it is a long piece; this article summarises most of the key points.)
Explore the story from 2013 of how Facebook changed its community standards as a result of pressure from the Women, Action and the Media group and writer and activist Soraya Chemaly. The Open Letter from the group describes the mobilisation and action taken.
Share and discuss this news report about Facebook blocking comedian Marcia Belsky’s posts.
Explain that the decision to report can be a difficult one for some of the reasons that have been explored. The learners' job is to complete the blank decision tree for their online community. Imagine that a member of the community has seen something they did not like and has hit the report button. What are the steps you might lead them through before allowing them to make a report?
Display these key points to build into the decision tree:
- How can you provide a meta-moment for the person considering making a report?
- Is reporting always the right action to take?
- Should they report to the platform, the police, a trusted adult, or the online community?
- What actions can/should the community take?
- What happens if the platform/police/community do not do what you expected?
Allow time for the learners to begin designing their process. They should be encouraged to share their thinking with others. A template has been provided, but learners should be free to construct their decision tree in any way they wish; the Media hotlist section contains links to a number of free online tools that can be used to create a decision tree.
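For facilitators who prefer to prototype the branching logic digitally, a decision tree can also be sketched as a small program. The sketch below is illustrative only: the questions and outcomes are assumptions for demonstration, not a prescribed reporting process, and learners should replace them with their own steps.

```python
# A hypothetical reporting decision tree, modelled as nested dictionaries.
# Each node is either a question with "yes"/"no" branches, or a final outcome.
# All questions and outcomes here are placeholders for the learners' own design.
TREE = {
    "question": "Take a meta-moment: does the content still seem hateful or harmful?",
    "yes": {
        "question": "Does it target someone based on a protected characteristic?",
        "yes": {
            "question": "Does it threaten someone's immediate safety?",
            "yes": "Contact the police, then report to the platform.",
            "no": "Report to the platform and tell a trusted adult.",
        },
        "no": "Consider blocking or muting, or raise it with the community.",
    },
    "no": "No report needed; move on.",
}

def walk(node, answers):
    """Follow the tree using a sequence of 'yes'/'no' answers; return the outcome."""
    answers = iter(answers)
    while isinstance(node, dict):
        node = node[next(answers)]
    return node

# Example walk-through: hateful, targets a protected characteristic, no immediate danger.
print(walk(TREE, ["yes", "yes", "no"]))
```

Tools like this can help learners test whether their tree covers every path before they commit it to the template.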
Where time allows, ask individual learners to share their process with the group for feedback and discussion. You may wish to undertake this feedback cycle several times during the session.
Link this back to the peer-mentoring work in Theme 7 (if undertaken) and the importance of creating suitable controls for online groups.
These tools allow for the quick creation of decision trees (note: some require a free account):
Evaluate how online platforms handle reports of hateful content. Develop strategies to help online users report hate speech successfully.