Hate machine: Social media platforms pushing antisemitic recommendations, study finds
The world’s biggest social media platforms are not just hosting antisemitic and hateful content, they are promoting it and making it easier to find.
That's the conclusion of two new studies, provided exclusively to USA TODAY, which found that Facebook, Instagram and X, formerly Twitter, steer users to tropes and conspiracies.
The result, researchers say, is fomenting dangerous ideas as antisemitic incidents in the U.S. surge to historic levels.
Released Thursday, the studies from the Anti-Defamation League (ADL) and the Tech Transparency Project (TTP) take a hard look at the powerful role of social media algorithms and search functions in recommending extremist content, particularly to young people.
In one of the studies, researchers created six online accounts for fictional people on YouTube, Facebook, Instagram and X − four adults and two teenagers.
Three of the four social media platforms began pushing hateful suggestions, the study found. The more the accounts liked pages or followed other accounts, the more antisemitic content they were shown.
Nazi propaganda, Holocaust denial and white supremacist symbols shown to teenage accounts in study
Instagram, for example, served up Nazi propaganda, Holocaust denial and white supremacist symbols to one of the accounts belonging to a fictional 14-year-old.
One account recommended on Instagram featured a sonnenrad, a sunwheel symbol associated with the Nazi party, and a video interview with American Nazi Party founder George Lincoln Rockwell in which he proclaimed: “The main thing we are fighting for is the preservation of the white race.”
Notably, the most virulent content on Instagram was pushed to the teenage account, but not to adults, the study found.
Only YouTube, which has its own troubled history with extremism, didn’t surface hateful content.
"Over the years, we’ve made significant investments in the systems and processes that enable YouTube to be a more reliable source of high quality information and we continue to invest in the people, policies and technology that enable us to successfully do this work," YouTube said in a statement.
Facebook and Instagram owner Meta said in an emailed statement that hate has no place on its platforms.
"We continue to take significant action through our policies by removing harmful stereotypes about Jewish people and content that denies or distorts the Holocaust, while educating people about it with authoritative information," the company said. "We've made progress in fighting antisemitism on our platforms, and our work is never done.”
X did not respond to a request for comment.
The findings come as antisemitism continues to spike and white supremacist propaganda has hit an all-time high across the country, according to the ADL.
An ADL survey earlier this year found the number of Americans who believe in antisemitic tropes has doubled since 2019. One-fifth of people surveyed by the organization believe in six or more ideas the group describes as anti-Jewish tropes, the highest level the survey has found in three decades.
Meanwhile, racist young “influencers” like Holocaust denier Nick Fuentes continue to attract new fans via online streaming platforms and social media.
Why do algorithms automate hate and antisemitism?
Algorithms are partly to blame, says Matthew Williams, founder of HateLab, author of "The Science of Hate" and professor of criminology at Cardiff University.
Social media platforms use these automated systems to suggest content for users and keep them on the platform longer.
Williams says “algorithmic antisemitism” is driven by engagement metrics such as views, likes, shares and comments.
“Because extreme content, such as hatred in general, tends to capture the attention of users due to the evolved human proclivity to identify threats, such content racks up a lot of engagement,” he said.
Hate is “sticky,” Williams said, “meaning it gets recommended unless humans step in to adjust the algorithm.” YouTube is one of the platforms that appears to have adjusted its algorithms, he said.
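As a rough, hypothetical illustration of the dynamic Williams describes (not any platform's actual system), the sketch below ranks posts purely by engagement signals and shows how a human-applied demotion flag can push content back down; the weights, field names and `demoted` flag are all assumptions made for the example.

```python
from dataclasses import dataclass


@dataclass
class Post:
    title: str
    views: int
    likes: int
    shares: int
    comments: int
    demoted: bool = False  # set when humans step in to adjust the ranking


def engagement_score(post: Post) -> float:
    # Pure engagement ranking: attention-grabbing content rises,
    # regardless of what it actually says. Weights are illustrative.
    score = (0.1 * post.views + 1.0 * post.likes
             + 3.0 * post.shares + 2.0 * post.comments)
    # Human intervention: demoted posts drop far down the feed.
    return score * (0.05 if post.demoted else 1.0)


def recommend(posts: list[Post], k: int = 3) -> list[Post]:
    # Return the k highest-scoring posts.
    return sorted(posts, key=engagement_score, reverse=True)[:k]


if __name__ == "__main__":
    feed = [
        Post("cooking video", views=5000, likes=300, shares=20, comments=40),
        Post("inflammatory conspiracy post", views=4000, likes=500,
             shares=400, comments=900),
        Post("local news update", views=6000, likes=200, shares=30, comments=25),
    ]
    for post in recommend(feed, k=2):
        print(post.title, round(engagement_score(post), 1))
```

In this toy example, the post that draws the most shares and comments tops the ranking regardless of what it says, which is the “stickiness” Williams points to; only the human demotion step changes that outcome.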
If YouTube can do it, why can't other platforms? asked Yael Eisenstat, ADL’s vice president and head of the ADL Center for Technology & Society.
Eisenstat says the findings suggest that social media platforms should be doing more to curb the spread of online antisemitism and extremism.
“I understand how complicated these things are,” said Eisenstat, who previously worked at Facebook. “It is still a business decision on what you will and will not prioritize.”
Detecting hate and antisemitism on Facebook, Instagram, YouTube and X
Rooting out antisemitism isn't simple and requires more than technical know-how, according to Sabine von Mering, a professor at Brandeis University. What’s more, Nazis and other extremists play a cat-and-mouse game with social media platforms, resorting to codewords, emoji combinations or deliberate typos to evade detection.
“Whether it's human content moderators or AI, detecting antisemitism requires intricate knowledge of a long history of hatred of Jews,” she said.
Antisemitism is so common in cultures around the globe that not everyone recognizes the memes for what they are, says Monika Hübscher, co-editor with von Mering of “Antisemitism on Social Media.”
If social media users show interest in topics related to Jewish life or the Holocaust, the algorithms notice and suggest more content, including antisemitic memes that can be generated in seconds and then spread, intentionally or unwittingly, across social media, according to Hübscher, a PhD candidate at the University of Haifa and a research associate on the “Antisemitism and Youth” project at the University of Duisburg-Essen.
AI-generated images, such as pornographic depictions of Anne Frank or a pillow fight in Auschwitz, are also becoming more common on social media, she said.
Over time, the spread of conspiracy theories and tropes has created an environment where people feel comfortable expressing antisemitism, Hübscher said.
“Because antisemitism is a cultural code we all learned, and are able to recognize and put to work, its algorithmically reinforced dissemination is very dangerous,” she said.
Social media fueling hate and violence, not doing enough to stop it, critics say
Antisemitic incidents reached a new high in the U.S. in 2022, the ADL said in March, citing a 36% rise from 2021.
A New Jersey man was arrested in November after publishing an online manifesto threatening attacks on synagogues. Two men were arrested in New York that same month on suspicion they planned to attack at least one synagogue in New York City. Last month, a jury recommended the death penalty for a man who shot and killed 11 people at a synagogue in Pittsburgh in 2018.
Natasha Zinda, an online content creator and activist, says social media platforms are doing too little to stop the rise in hate and violence that they are helping fuel.
“We've seen the rise of white supremacist groups, we've seen how this is affecting and tearing apart our society at this point,” she said. “We've seen the amount of hatred, the amount of unrest, the rise in hate crime, the rise in everybody being more xenophobic and more outwardly racist. We've seen what's happening – the hateful laws being passed – we've seen the actual fruits of the labors that these people are putting in on social media.”
Social media companies know exactly what they’re doing when they allow hateful content to be easily found and shared, said Zinda, who is Black and has been targeted by organized racist attacks online.
Hate and extremism drive engagement on these platforms, and engagement means profit, Zinda said. That’s why multibillion-dollar tech companies haven’t addressed the problem, she said: it’s all about the bottom line.
“I think it's definitely intentional, and I think it's based off greed,” Zinda said. “I very much believe the system is working as intended.”
Facebook, Instagram and X recommended antisemitic content to all six accounts in study
Initially, the researchers who created accounts for fictional people searched for 50 terms, including conspiracy theories such as “flat earth” and “deep state” as well as “elites” and other terms commonly associated with anti-Jewish tropes. They also searched for public figures who have been targeted by conspiracy theories, such as Microsoft co-founder Bill Gates and billionaire philanthropist George Soros, and public figures who have posted antisemitic content, such as the rapper Ye, formerly known as Kanye West, and YouTube star PewDiePie.
The accounts also looked up popular topics with no connection to antisemitism or hate, such as internet personalities and video games.
Three of the six accounts then interacted with the social media platforms, clicking on links and videos recommended by the site’s algorithms. The other three accounts did not click on recommended content, essentially serving as a “control” group for the study.
Facebook, Instagram and X all recommended additional antisemitic and hateful content to all six accounts. The accounts that clicked on related links were recommended ever-more extremist content at far higher rates than the control accounts, the researchers found.
Meta said the recommendations are not representative of what people see on its platforms. The company argued that the study only analyzed content recommended to users who searched for or clicked on antisemitic content and figures, and did not examine recommendations to users who express no interest in such content. It also noted that the study used newly installed operating systems and browsers, whereas social media platforms target content to users based on their activity on third-party sites and apps.
The findings reveal the overlap of two longstanding concerns about social media: That platforms' algorithms thrive off engagement from users, and that the companies aren’t enforcing their own rules when it comes to hateful and racist content.
“The study’s findings undermine assurances from the three social media platforms − Facebook, Instagram and Twitter − that they ban hate speech and that their algorithms promote high-quality content,” the study concludes.
A second study investigated the search functions on Facebook, Instagram and YouTube.
Researchers searched on each platform for the names of 130 hate groups or movements gleaned from the ADL’s “Glossary of Extremism.” They found that all three platforms host dozens of hate groups and movements, and concluded the sites make these groups easier to find because their search tools predict what users are looking for as they type a group’s name.
Facebook and YouTube also automatically created pages or channels for hateful accounts, including neo-Nazi and white supremacist music bands, the study found.
“This completely upends this notion that they are just neutral pipes, it’s just third-party content and therefore they are doing their best but they are not actually responsible for what’s happening,” ADL’s Eisenstat said.