The SEL activities helped the learners self-identify the key qualities required for different roles to campaign effectively and engage with online stakeholders. They also considered the importance of empathy in these interactions, recognising that online services are run not just by technology but also by people, and that suitable engagement with those people is key to having your voice heard and taken seriously.
The Media analysis unit encouraged learners to consider different types of campaign message and the best platforms online for sharing them. Recognising the features of online platforms and their audiences is key to getting a message heard, be it by platform users, service providers or individuals in that online community with significant power (e.g. politicians, leaders and changemakers).
The Media production unit enabled learners to understand the community standards and guidelines that social networks and other online spaces put in place to help users know what is acceptable and unacceptable behaviour. Learners also considered the purpose of reporting tools, how they could be improved to help combat hate speech, and how they can educate their community (online and offline) about the successful use of these tools.
The Citizenship unit investigated who to target and how to persuade them in order to effect change within a company or organisation. Learners considered viable strategies for persuading online service providers to make changes to their standards or the functionality of their platform.
The SELMA project's short definition of hate speech is:
“Any online content targeting someone based on protected characteristics with the intent or likely effect of inciting, spreading or promoting hatred or other forms of discrimination.”
Activity: Creating a safe space online
Note: This activity can be adapted to suit a wide range of group sizes. Learners should work in small groups of 2-4, depending on the overall group size.
What you will need:
- A printed copy of the “Creating a safe space online - Learner resource” for each group of participants.
- A printed copy of the “Creating a safe space online - Trainer resource” page per group of participants. You should cut out the twenty boxes and place them in an envelope. There should be one envelope for each group of participants.
- At least one pen per group of participants.
- One stick of glue per group of participants.
- A whiteboard/flipchart and marker for the trainer.
- Remind learners that the SELMA toolkit is a set of modules designed to address hate speech through a social and emotional learning approach. Learners will be participating in this activity designed to get them thinking about how they can support each other to test out their campaigns before they launch them more widely.
- It’s important that learners feel safe, comfortable and warmed up before participating in an activity. Check out our handy “How to” guide for general recommendations on how to introduce the peer-mentoring activities.
- Draw learners’ attention to the first page of the “Creating a safe space online - Learner resource”. Tell learners to take five minutes to think about all the things that come to mind when they hear the word “safe” and to write them down in the circle. Explain that you would like them to think about “safety” from both a physical and an emotional perspective; you might suggest that learners visualise a time when they felt really safe and use that as a guide. Explain that one learner from each group will present this briefly to the whole group at the end of the five minutes.
- At the end of the five minutes, ask one representative from each group to present what the group came up with, and take notes on the whiteboard/flipchart.
- When all groups have finished presenting, look at the notes taken on the whiteboard/flipchart and reflect on general themes. What does safety seem to mean to this group?
- Draw learners’ attention to the second page of the “Creating a safe space online - Learner resource”. Tell learners that, in the future, they will be developing campaigns and ideas for tackling online hate speech. You would like them to create a safe space online where those ideas and campaigns can be tested out before they are shared more widely and publicly.
- Ask learners to reflect on what a “safe space” would look like online, and to discuss this with their group. They should take five minutes to discuss this, and take notes in the top box on page 3 of the “Creating a safe space online - Learner resource”. Some prompts you might use to help learners are:
- Would a safe space mean nobody could criticise someone else’s ideas?
- Would a safe space be somewhere where only nice things were said?
- Do you think there is a difference between something being “unkind” and something being “unsafe”? And if so, what is the difference?
- When the five minutes are up, ask one representative from each group to call out some of the words that people used to describe an online safe space. Take notes on the whiteboard/flipchart and reflect on general themes with the group. Take the opportunity to talk about constructive forms of disagreement and criticism, explaining that the best ideas thrive in environments that are open, inclusive and welcoming.
- Now, explain that learners are faced with a scenario where they did try to set up this safe space, but hadn’t established any rules. They wake up in the morning to find lots of notifications: a range of comments beneath a campaign video that somebody had posted. The comments aren’t all great! They have five minutes, as a group, to open their envelopes and sort the comments into the three boxes on page 3 of the “Creating a safe space online - Learner resource”: safe, unkind but not unsafe, and unsafe. Explain that you’ll come together as a group again after the five minutes are up to talk about how each group sorted the comments.
- When the five minutes have passed, start a group discussion about how the comments were sorted. What were the differences between comments that were “just” unkind and those that were unsafe? Why were some comments that contained criticism still placed in the “safe” box? Use this as an opportunity to talk about the fact that “safety” doesn’t mean a lack of challenge or disagreement - it’s about the way that ideas are challenged and disagreement is communicated. Telling somebody that their video is great when you think it’s not very good isn’t helpful to them - but explaining constructively which bits of it you liked and suggesting some areas they could work on is both kind and helpful. On the other hand, completely shooting down an idea without any constructive suggestions is unkind and unhelpful. And when comments begin attacking the person rather than the idea, that can actually contribute to an environment where that person no longer feels safe. It’s also worth pointing out here that people might disagree about the point at which an “unkind” comment can be considered “unsafe” - and that, in fact, these kinds of debates are central to broader social debates about online safety.
- Once they’ve sorted the comments, it’s time they got to work thinking about how they’re going to manage this space in the future. Clearly, setting up an online space without thinking about the rules around it wasn’t a great idea! On page 4 of the “Creating a safe space online - Learner resource”, they have a series of questions to guide their decision-making process. Ask groups to take 5-10 minutes to work together to answer the six questions (the first five are multiple choice, the last one is a free-flowing answer where they’ll get creative). It’s a good idea to talk them through the questions first to make sure they’ve all understood.
- When the 5-10 minutes are up, bring the groups back to talk through their answers. What’s important at this stage is the discussion that these answers prompt - there isn’t a “right” or “wrong” answer - it’s about considering the pros and cons of each decision. Here are some prompts to help guide the discussion:
- Membership: Allowing anyone to join is more inclusive, but gives admins less of a clear idea of who is and is not part of the group, which could be problematic (e.g. a non-learner joins, and is really just there to troll).
- User anonymity: Allowing people to join anonymously may encourage people to share ideas more comfortably (if they’re shy or don’t want everyone to know they worked on an idea), but it also means people might feel more emboldened to be cruel, without fear that they’ll be identified and face consequences.
- Comment approval: Users might feel frustrated if they cannot immediately share their ideas, and requiring prior approval could create the sense that their speech is being limited; it also creates more work for admins. However, allowing anyone to comment without prior approval is risky, and could end up creating more problems and work for admins at a later stage, if comments become abusive.
- Unkind comments: Deciding which comments should be considered “unkind but not unsafe” is not always straightforward - and even where it is possible, allowing unkind comments can create an environment which encourages more abusive forms of communication. At the same time, limiting discussions too much might leave users feeling that they are not being allowed to express themselves freely.
- Content removal logic: Whichever approach is taken, a clear policy for what forms of speech are and are not allowed in the space will be key. Concentrating the decision-making in the hands of the admins can be good if it means that they are able to then apply that policy consistently - but allowing users to report means that there is wider input and buy-in to the moderation process.
- Proactive policies: When we think about content moderation, we often think about removing or reducing negativity. That’s very important - but it’s also important to think about proactive ways to encourage good behaviour in the first place. Some people argue that having clear community rules that everyone is made to see and sign can make a difference (there’s currently a study going on to see whether making community rules clearer will reduce hateful comments on Twitter, for example). In December 2018, the team that won the SELMA hacking hate hackathon came up with the idea of rewarding gamers who showed good behaviour with points and tokens, to incentivise a healthy gaming culture. What ideas do your learners come up with?
- People have different ideas about what “safety” means - although there are also common themes (repeat the themes that came up across the board). What’s key to emphasise is that safety does not mean the absence of criticism and disagreement - but it’s about how that criticism and disagreement is expressed and handled.
- Although we often think of online spaces as places where people can share ideas freely, there is always a lot of work going on in the background to manage that - and that is true for most, if not all, social media platforms today. Especially if you are committed to keeping a space safe for people, you really need to think through what your policy is, what kinds of speech you do and do not accept, and then what systems you’ll put in place to make your policy work in practice. This is why we’d strongly recommend that you have the help of a responsible adult - you shouldn’t have to carry the burden alone.
Reflect on the meaning of feeling “safe” online. Consider the tools that online stakeholders can use to make users feel safer online, and some of the advantages and disadvantages of each tool.