

Expert talk webinar recording: Confronting disinformation campaigns

On the occasion of the launch of SOGI Campaigns’ new free online course on how to fight disinformation, we organised a webinar with experts from different world regions to share insights into the challenges and responses involved in dealing with disinformation campaigns.

We are sharing the recording of this webinar here. It brought together:

Mariam Kvaratskhelia – Co-director of Tbilisi Pride, Georgia

Mariam Kvaratskhelia (she/her) is a passionate advocate for LGBTQI rights and equality and is recognized as a prominent leader in the community. As a co-founder and director of Tbilisi Pride, Mariam has been tirelessly campaigning and advocating for the rights of LGBTQI individuals in Georgia since 2015.

Umut Pajaro – Researcher and consultant on AI ethics, Colombia

Umut (they/them) is a Black Caribbean non-binary person from Cartagena, Colombia, working as a researcher and consultant on topics related to AI ethics and AI governance, focusing on finding solutions to biases towards gender expressions, race, and other forms of diversity that are usually excluded or marginalized. They are part of the Internet Society as Chair of the Gender Standing Group. They have been a speaker and moderator at the Internet Governance Forum, Mozilla Festival, RightsCon, and other tech and digital rights conventions, mainly focusing on sessions related to AI. They were also a Mozilla Festival Wrangler in 2022 and Ambassador in 2022–2023, and a Queer in AI core organizer from 2020 to 2021.

Robert Akoto Amoafo – Advocacy Manager, Pan Africa ILGA, Johannesburg, South Africa

Robert Akoto Amoafo (he/him) is a human rights advocate, organisational development coach and certified trainer. He was the Country Director of Amnesty International Ghana from 2018 to 2021, Communications Advisor to the Ministry of Gender, Children and Social Protection in Ghana, and Technical Advisor on the HIV Continuum of Care Project at FHI 360. Robert was a member of the International Advisory Committee of the Power of PRIDE Project run by COC Netherlands, Pan Africa ILGA (the Pan-African International Lesbian, Gay, Bisexual, Trans and Intersex Association) and ILGA Asia.

Damjan Denkovski – Deputy Executive Director of the Centre for Feminist Foreign Policy, Berlin

Damjan (he/him) is the Deputy Executive Director of the Centre for Feminist Foreign Policy in Berlin, leading the Centre’s work on human rights and international cooperation. He has been with the Centre since 2020 and is mostly curious about how we can strengthen cross-movement alliances and solidarity to counter exclusionary actors and narratives. He comes from a background in civil society capacity strengthening, peace building, and research. 

Moderator:

Alistair Alexander – Reclaimed.Systems, Berlin

Alistair Alexander (he/him) leads projects that explore technology and its impact on the information and physical environment. From 2016 to 2020, Alistair led The Glass Room project (www.theglassroom.org), with exhibitions worldwide reaching over 150,000 people.

At https://reclaimed.systems, recent projects have included: Resonant Signals sonification workshops with ZLB Libraries, the Facing Disinformation online training programme, the Green4Europe Hackathon for tech sustainability projects in Georgia and Ukraine, and digital sustainability work for the Gallery Climate Coalition.

IDENTIFYING HARMFUL PERVASIVE NARRATIVES

When we communicate, our stories may look perfect on the surface. But we are seldom aware of the underlying narratives that these stories also propagate, and which can be harmful.

One campaign video in the US featuring a Trans woman telling a perfectly uplifting story was not producing the desired effect on its audience. Further research found that featuring the woman alone reinforced the stereotype that Trans people are isolated and lonely. The set-up of the video contradicted the message, and visuals always win over words. The video was shot again showing the woman surrounded by friends, and at last the message hit the spot.

Identifying the underlying negative frames is not easy. The Radcomms network has issued an interesting brief on this. Here are some excerpts:

As storytellers, we may reinforce tropes that perpetuate harmful pervasive ideas even when we don’t intend to. As you craft your story, or work with someone else to share theirs, avoid contributing to the proliferation of harmful, damaging stereotypes and stories. Stories that oversimplify people’s lives are almost always harmful because they lean into these established narratives. They may include:

Deservingness: These are stories that describe an individual’s moral merit. They might focus on factors like hard work or military service to show that they “deserve” success; support; and forms of public assistance like tuition aid, housing, or food assistance. The individual may be presented as an outlier who may easily be described through harmful stereotypes, but is one of the “good ones.”

Hero stories: These stories are about a single individual who, through extraordinary commitment, generosity, and skill, is able to “save” or “fix” people who are suffering the consequences of poverty. Often, this person’s success is presented without acknowledgment of others who participated in collective action.

White saviorism: In such stories, white people provide the help that they believe BIPOC need. These kinds of stories are doubly harmful because they exacerbate privilege and deprive people of agency. They also reinforce narratives that people rather than systems require fixing, and deny the power and importance of collective action.

Fixed-pie or zero-sum: These stories are written from a perspective that there is a fixed pie of resources, and that one person or group’s gain is a loss for someone else. Language that reinforces this narrative might include phrases like “getting ahead” or “left behind”.

Success stories, including “against-all-odds” and bootstrap stories: Success stories are tempting to tell for a range of reasons. Organizations often use them to demonstrate their effectiveness (in which case, they become savior stories) or to gain support from donors. These stories can be harmful because they can create an impression that if anyone can succeed against impossible barriers, everyone should.

Photo credit: Giulia Forsythe – Creative Commons

BUILDING MEANINGFUL CALLS TO ACTION

The Radcomms network has issued a useful brief on how to build good calls to action.

Powerful stories move people to action. Here’s how to create calls to action that work.

Great calls to action are:

1 Specific: Are you asking people to take an action that is observable and that can support your metrics of evaluation? For example, instead of suggesting that people give to your cause, provide a link and suggested amounts, or a way to give a small amount every month. Instead of suggesting that people “educate” themselves, provide a reading list; links to discussion forums; and links to where they can purchase, borrow, or rent resources.

2 Meaningful toward the issue: Will the thing you are asking people to do actually move the needle on the problem you’ve set out to address? Will they feel that, too? When people feel their actions make a difference, they’re more likely to stay engaged and keep taking action. For example, a call to action like “Stand up to racism” might inspire someone, but it could also leave them feeling like they are acting alone, and their action might seem to them like a drop in the bucket in the face of a large and complex problem. It may be more meaningful for people to share their story of how a policy harms them with an elected official, either through a meeting or a letter.

3 Achievable: Is the thing you’re asking someone to do actually possible? Sometimes, we use the goals of our efforts in place of calls to action. This can leave people feeling overwhelmed or uncertain about what action they can take, and lead them to do nothing at all. Instead of offering a call to action like “End structural racism,” which might leave even the most committed and well-meaning activist at a loss, identify the specific conditions you’re trying to change. How might the community you’re inspiring to act create pressure on those who can change the conditions?

4 Easy: As much as possible, make it easy for people to act. Link to sign-ups for events or rallies, create donation pages that make it easy to give, and provide clear instructions about what you want them to do and why it’s important.

5 Participatory: Create space and opportunities for people to bring their own voices and personalities to accepting your call to action.

6 Something it feels like everyone is doing: Our behaviors as individuals are heavily influenced by our perceptions of what people we see as similar to us are doing. So, your call to action might include language like, “People who care about ending economic injustice are [taking XYZ action].”

7 Activate emotions that keep people engaged: Such emotions might include pride, hope, awe, parental love, and sometimes anger.

Linking and Learning Officer

Work opportunity

Develop and implement our learning sharing programme

Background:

SOGI Campaigns is a global hub for campaigners working on promoting LGBT rights and countering anti-gender discourse. SOGI Campaigns has created online and offline spaces allowing experts in campaigning to share and produce knowledge.

Objective:

The main objective of the work is to develop and animate those spaces, especially focusing on the global South and East.

Main Tasks: 

  • Document and analyse interesting/successful campaigns
  • Produce learning products such as web articles, newsletters, webinars, workshops, etc.
  • Develop and monitor content for SOGI Campaigns’ social media
  • Actively engage with community members and generate conversations
  • Promote our online courses to LGBTQI+ activists worldwide
  • Offer assistance and guidance to individuals who enrol in our online courses

Deliverables:

  • Every month: one in-depth documentation of a successful campaign
  • Every two weeks: identification and documentation of one campaign tactic and/or trend
  • Every month: one webinar with external contributors on issues related to campaigning
  • Twice a month: one newsletter with original content
  • Weekly social media posting and engagement
  • Monthly reporting on inputs and outcomes of promotion of the online courses

Timeline/fees:

We are seeking a regular collaboration over an initial period of one year. The time load will be defined in discussion with the successful candidate.
Fees will be discussed according to the time load. Candidates will be expected to develop a costed proposal after the time load has been jointly agreed.

Profile:

  • Passion for creative campaigning on LGBTQI+ issues
  • Social media fluency
  • Experience in developing online learning events such as webinars
  • Experience in writing articles
  • Fluent written and spoken English
  • Ability to work independently and to report on work

This consultancy does not involve travel, and all activities shall be performed from the consultant’s home using the consultant’s own equipment.
Applications, including a CV and a cover letter indicating ideas on how to meet the objectives, are to be sent to contact@sogicampaigns.org by September 30, 2022.

Confronting Disinformation Spreaders on Twitter Only Makes It Worse, MIT Scientists Say

This article appeared on vice.com

Of all the reply guy species, the most pernicious is the correction guy. You’ve seen him before; perhaps you’ve even been him. When someone (often a celebrity or politician) tweets bad science or a provable political lie, the correction guy is there to respond with the correct information. According to a new study conducted by researchers at MIT, being corrected online just makes the original posters more toxic and obnoxious.

Basically, the new thinking is that correcting fake news, disinformation, and horrible tweets at all is bad and makes everything worse. This is a “perverse downstream consequence of debunking,” taken from the title of MIT research published in the ‘2021 CHI Conference on Human Factors in Computing Systems.’ The core takeaway is that “being corrected by another user for posting false political news increases subsequent sharing of low quality, partisan, and toxic content.”

The MIT researchers’ work is actually a continuation of their study into the effects of social media. This recent experiment started because the team had previously discovered something interesting about how people behave online. “In a recent paper published in Nature, we found that a simple accuracy nudge—asking people to judge the accuracy of a random headline—improved the quality of the news they shared afterward (by shifting their attention towards the concept of accuracy),” David Rand, an MIT researcher and co-author of the paper, told Motherboard in an email.

“In the current study, we wanted to see whether a similar effect would happen if people who shared false news were directly corrected,” he said. “Direct correction could be an even more powerful accuracy prime—or, it could backfire by making people feel defensive or focusing their attention on social factors (eg embarrassment) rather than accuracy.”

According to the study, in which researchers went undercover as reply guys, the corrections backfired. The team started by picking lies they’d correct. It chose 11 political lies that had been fact checked and thoroughly debunked by Snopes. It included a mix of liberal and conservative claims being passed around online as if they were hard truths. These included simple lies about the level of donations the Clinton Foundation received from Ukraine, a story about Donald Trump evicting a disabled veteran with a therapy dog from a Trump property, and a fake picture of Ron Jeremy hanging out with Melania Trump.

Armed with the lies they’d seen spreading around online and the articles that would help set the record straight, the team looked for people on Twitter spreading the misinformation. “We selected 2,000 of these users to include in our study, attempting to recreate as much ideological balance as possible,” the study said.

Then the researchers created “human-looking bot accounts that appeared to be white men. We kept the race and gender constant across bots to reduce noise, and we used white men since a majority of our subjects were also white men.” The researchers waited three months to give the accounts time to mature and all had more than 1,000 followers by the time they started correcting people on Twitter.

The bots did this by sending out a public reply to a user’s tweet that contained a link to the false story. The reply would always contain a polite phrase like “I’m uncertain about this article—it might not be true. I found a link on Snopes that says this headline is false,” followed by a link to the Snopes article. In all, the bots sent 1,454 corrective messages.

After the reply guy bot butted in, the researchers watched the accounts to see what they’d tweet and retweet. “What we found was that getting corrected slightly decreased the quality of the news people retweeted afterward (and had no effect on primary tweets),” Rand said. “These results are a bit discouraging—it would have been great if direct corrections caused people to clean up their act and share higher quality news! But they emphasize the social element of social media. Getting publicly corrected for sharing falsehoods is a very social experience, and it’s maybe not so surprising that this experience could focus attention on social factors.”

Getting corrected by a reply guy didn’t change the way people tweeted, but it did make them retweet more false news, lean into their own partisan slant, and use more toxic language on Twitter. Rand and the rest of the team could only speculate as to why this occurred—the best guess is the social pressure that comes from being publicly corrected—but they are not done studying the topic.

“We want to figure out what exactly are the key differences between this paper and our prior work on accuracy nudge—that is, to figure out what kinds of interventions increase versus decrease the quality of news people share,” he said. “There is no question that social media has changed the way people interact. But understanding how exactly it’s changed things is really difficult. At the very least, it’s made it possible to have dialogue (be it constructive, or not so much) with people all over the world who otherwise you would never meet or interact with.”