
In the chaos of conflict, truth is compromised

As the world observes the ongoing crisis in the Middle East, a new battleground has emerged: social media is accused of fanning the fires of misinformation.

By Kailey Herbrich



Nobody could have foreseen the magnitude of uproar that would follow from a single tweet about a church explosion. 


When fake news becomes a weapon, especially targeting a church, it's particularly dangerous. 


This danger becomes alarmingly clear in instances like the following post on X, formerly known as Twitter, just two days after the Israel-Hamas war broke out. 


"Israel just blew up the third oldest church in the world," the Oct. 9, 2023, post said.


The war began in a surprise assault Oct. 7 on Israel by Hamas. Israel declared war and imposed a siege on the Gaza Strip, an area of Palestinians Hamas had controlled since 2007. 


The initial attack resulted in a death toll around 1,200 and the capture of about 150 hostages and prompted airstrikes and troop deployment around Gaza and along the Lebanese border in retaliation.


Despite the church bombing claim on social media, there were no confirmed reports as of 9 p.m. Oct. 11 of the Saint Porphyrius Church being targeted, according to Agence France-Presse.


"Saint Porphyrios Church in Gaza is untouched and operating in service of the community and our congregation. The news about it being damaged is false," a Facebook post in Arabic stated on Oct. 9.


Ryan Maloney of the U.S. military said misinformation often leads to skewed perceptions.


“Hamas militants used hospitals as bases, launching attacks from there. When Israel retaliated, headlines were misleading, blaming Israel for bombings without providing the full context,” said Maloney.


Navigating falsehoods in conflict and media

Falsehoods in conflict are highly dangerous because they distort perceptions, influence decisions, and worsen societal divisions, ultimately risking destabilization and escalating conflicts.


The term "fake news" gained popularity in 2016 and peaked in 2018. A significant portion of American adults believe news on social media is biased: 66% say that 76% or more of it is.


The blurring of the line between truth and falsehood makes it challenging to verify information through traditional means.


 It's crucial to differentiate between "disinformation," which is deliberately false in order to mislead, and "misinformation," which is false but not maliciously intended.


Disinformation can be seen in a fabricated article during France's 2017 election that falsely claimed then-candidate Emmanuel Macron received funding from Saudi Arabia.


This differs from misinformation, an example of which is the erroneous, later-debunked report suggesting Pope Francis endorsed Donald Trump for president.


The democratization of information dissemination through the internet and social media platforms has only exacerbated the spread of such false information.


While these platforms have made it easier for people to find and share information, they have also facilitated the quick spread of misinformation.


Profit-driven fake news websites known as "troll farms" exploit societal tensions, amplify them, and spread false information for profit through the use of echo chambers, social bots, and legal tactics.


Technological advancements, such as artificial intelligence, combined with government intervention, make it increasingly challenging to identify false information and assess reliability.


Despite efforts by tech giants like Facebook to change their algorithms and collaborate with fact-checking organizations, the spread of false information has persisted. It has even influenced global events like Brexit, and it is only likely to get worse. 


Images an easy target for spreading false information

As the Israel-Hamas conflict continues, misinformation through photo and video editing has become a common tactic.


Both sides share content showing casualties, damaged infrastructure, and alleged acts of both brutality and bravery.


A video shared on TikTok, initially believed to depict attacks on Gaza or the Israeli skyline after a Hamas strike, was later found to be footage of Algerian football fans celebrating.


Another video, falsely claimed to depict Israelis creating fake videos of Hamas harming children, was actually behind-the-scenes footage from a short film directed by Muhammad Awwad, based on the life of a Palestinian boy.


These videos skew perceptions, making it more difficult to distinguish between propaganda and reality.


Lee, a communication professor, likens misinformation to a fog because it distorts reality.


"Misinformation is like gaslighting, creating confusion to the point where it's infeasible to discern truth,” Lee said. “It floods us with so many stories, true, false, or half-true, that verification becomes nearly impossible."

For example, a deepfake video that depicted model Bella Hadid, who is half-Palestinian, sympathizing with Israel and apologizing for past remarks shaped public perceptions and perpetuated false narratives.


Artificial intelligence was used to modify video from a 2016 event in which Hadid discussed her battle with Lyme disease rather than her views on Israel. 


Even though Hadid has consistently highlighted her support for Palestine, Israeli model Nataly Dadon shared the video online, claiming that Hadid had switched her allegiance from Palestine to Israel.


An even more disconcerting situation happened when an unverified report by Israeli journalist Nicole Zedeck claimed Hamas decapitated 40 babies. That claim was repeated by a spokesperson for Israel’s Prime Minister Benjamin Netanyahu, and a few hours later President Joe Biden was sharing the false information. 


Eventually the Israeli government admitted it had no evidence of such beheadings, though it did continue to imply it happened.


False information like this, spread during such a turbulent time, not only misinforms an outraged public but could unnecessarily escalate action by governments.


Social media prime place for misinformation 

Radical groups, state actors, and political supporters leverage platforms like Facebook, X, and TikTok to spread defamatory stories, justify violent acts, and disrupt peace efforts.


It’s an information battle where hashtags, memes, and viral videos serve as potent tools for reaching millions rapidly.


In fact, Lee noted that states aligned with both sides have flooded social media with misleading and false stories because it’s so easy to use as a weapon in the information war. 


The Anti-Defamation League found that 70% of survey respondents reported exposure to misinformation or hate on social media related to the Israel conflict.


Part of the problem, says Maloney, is the impact algorithms have on narratives.


"A lot of social media algorithms filter out stuff that doesn't fit a narrative, making for a less informed population," he said.

Since the Oct. 7 Hamas attack on Israel, researchers have uncovered numerous accounts on X orchestrating misinformation campaigns.


Elon Musk, who assumed control of X in October 2022, overhauled the social network in March 2023, significantly loosening regulation, making purchasable the "verified" checkmark that formerly indicated credibility, and altering how posts are prioritized.


Changing the system to paid verification benefited bad actors spreading misinformation, as became evident during the Israel-Hamas conflict.


During the first week of the conflict (Oct. 7-14, 2023), NewsGuard analyzed the 250 most-engaged posts promoting false narratives. Shockingly, 74% of these posts came from verified X accounts.


The posts promoted 10 false or unsubstantiated claims, which collectively received over 1.3 million engagements and were viewed over 100 million times globally in just one week.


Additionally, X Premium accounts shared a fake White House memo alleging $8 billion in aid to Israel, reaching tens of millions of people.


The Tech Transparency Project also identified X Premium accounts promoting Hamas propaganda videos, highlighting the potential for spreading terrorist content.


Efforts by Musk and X CEO Linda Yaccarino to combat misinformation on X have fallen short.


Community Notes, which they promoted as a way for users to annotate posts and correct misinformation, has faced doubts about its effectiveness due to manipulation risks and internal disputes.


Attraction of financial gain drives misinformation

“Marketers are essentially in the business of producing misinformation in ads," said Robert Pitts, a marketing and society professor at the College of Charleston.


Marketers often work together to spread misinformation, polarize public opinion, and undermine trust in government and media, aiming to prolong the conflict and prevent genuine resolutions by creating confusion and anger.


Misinformation isn't confined to war propaganda; it can also incite violence across various domains.


During the COVID-19 pandemic, posts on social media hinted that vaccines might cause infertility. This scared many people into avoiding vaccination, making it harder to fight the disease.


Similarly, influencers like Nicole Bendayan promoted untested "natural" forms of birth control, which may dissuade some people from using reputable techniques and increase the risk of unexpected pregnancies and untreated medical issues.


One of the most dangerous elements of social media is the added roadblocks to uncovering where content is coming from.


This is happening as social media platforms are making their Application Programming Interfaces more restrictive so researchers cannot detect, monitor, and publicize harmful or false content.


Disinformation tactics worsening the Israel-Hamas conflict

Outside entities, including state-backed troll operations, political players, and extremist groups, often exploit the Israel-Hamas conflict to further their own agendas or disrupt regional power dynamics.


China, Russia, and Iran have become significant allies of Hamas, amplifying the group's message and distributing propaganda through social media and official channels. 


Iran, for example, openly supports Hamas, portraying it as a legitimate resistance movement against Israeli aggression. 


Iranian state media criticize Israeli military actions and praise Hamas' attacks on Israel.


Aside from offering apparent support, foreign entities engage in deceptive campaigns utilizing various tactics, such as spreading misleading material and utilizing bots, proxies, and fake accounts, to manipulate public opinion and demonize both sides of the conflict.


Accused of disseminating false information about Israel's actions in Gaza, Russian state-run media exacerbates the cycle of violence and retaliation, fueling hostilities and widening the gap between the conflicting parties.


“Anything and everything is constantly being said, and they make more money as engagement increases, driven not by truth, but by anger,” said Lee.  “So, incentives actually work in the opposite direction.”


“Platforms like X thrive on controversy and sensationalism,” Lee continued. “They prioritize content that generates heated debates and emotional reactions because it keeps users glued to their screens, clicking, commenting, and sharing. It's a business model built on exploiting our basest instincts, and unfortunately, it's incredibly effective.”

The global reach of these disinformation tactics raises concerns about the possibility of further regional unrest. 


China's cultivated image as an impartial mediator reflects the Middle East's complex geopolitical dynamics, as does Russia's desire to use the crisis to weaken the power of the West.


Echo chambers, confirmation bias help spread misinformation

It is human nature to ignore opposing viewpoints and seek information that validates pre-existing beliefs. 


But that only exacerbates society's fast-developing echo chambers, in which people surround themselves only with like-minded views.


"Everyone misinterprets what's happening... fitting their already established opinions without considering all the facts,” said Maloney. "People label things they dislike as disinformation. They see a story and dismiss it as ridiculous, even if it's from a reliable source, simply because they disagree with it."


This tendency is made worse by echo chambers on platforms like X, in which users surround themselves with like-minded people who reinforce their preexisting beliefs.


Misinformation grows unchecked within these echo chambers as individuals discover and share incorrect information without properly examining it.


"They prefer to believe something from a source that echoes their beliefs because we always like to hear our own attitudes echoed," added Pitts.

 

In the current climate of deception, the need for media literacy cannot be stressed enough. 


Techniques like the SIFT method (Stop; Investigate the source; Find better coverage; Trace claims to their original context) empower users to distinguish between credible and unreliable online sources, playing a significant role in combating misinformation by promoting responsible social media engagement and fact-checking practices.


Individuals propagating unverified stories during the conflict serve as a grim reminder of the dangers of blind information consumption and dissemination.


Maloney believes part of the responsibility falls with news organizations to make sure they are telling the full story and not allowing their own biases or ignorance to portray a story incorrectly.


The Black Hawk pilot recalled a story early in the Israel-Hamas conflict that cast an Israeli attack on a Gaza hospital in a negative light. But the story didn't report that the Israeli military had intelligence that the hospital was a Hamas command center.


“When a company filters out stories that offer a broader perspective and tailors content to match existing beliefs, it results in a less informed population,” said Maloney.


Many struggle to differentiate between credible and unreliable sources and critically analyze information, making it easier for distributors of false information, whether deliberate or accidental, to deceive them.


"Who can you trust?... People have trouble distinguishing between trained journalists... and those who claim to be journalists but lack a method,” Lee said succinctly capturing this dilemma.

Thus, programs that promote media literacy and digital citizenship have become essential for combating disinformation and building a more informed and resilient society.


Moreover, the conflict between Israel and Hamas highlights the importance of improving media literacy and critical thinking abilities in order to effectively combat the flood of false information that circulates on social media platforms during times of crisis.


Lee advocates reading the news rather than watching it to navigate today's complex information landscape.


The communication professor has stopped watching the news himself because of its built-in bias toward the most outlandish or sensational images to get attention.


“I don't personally ever watch any news. I only read news,” Lee said, pointing out that it’s not journalism to focus on the “weird, confusing, muddled reality that we're faced with.” 

“I think reading the news is the only way forward,” he said.
