Secret humanitarian database systems, social media filtering and everyday citizen journalists: the invisible struggle keeping hope alive where none is in sight
By Gabriele Scalise
From the Arab Spring onwards, a new social media aesthetic blossomed in North African countries and later coloured the Middle East: a cornucopia of protests and marches. The internet first welcomed this exciting stream. Nowadays, instead, algorithms block us from viewing most of it – shadow-banning or taking down footage of human rights violations. Did all humanitarian content become less palatable or inappropriate at once? More than a change in users’ taste, the reasons are multiple, and companies are not entirely to blame. In June 2017, Google, Twitter, Facebook and Microsoft created the Global Internet Forum to Counter Terrorism (GIFCT) to remove harmful content before it could go viral. They developed heavy-handed machine-learning filters that operate without human supervision.
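To make concrete how this kind of automated removal can sweep up documentation together with propaganda, here is a minimal, hypothetical sketch of hash-matching moderation in the spirit of GIFCT’s shared hash database. The placeholder clip, function names and use of SHA-256 are my own simplifications for illustration, not the forum’s actual system (real deployments use perceptual hashes and classifiers far more elaborate than this).

```python
import hashlib

def fingerprint(video_bytes: bytes) -> str:
    # Real systems use perceptual hashes that survive re-encoding;
    # plain SHA-256 stands in here for simplicity.
    return hashlib.sha256(video_bytes).hexdigest()

# A shared blocklist of fingerprints, in the spirit of an industry
# hash-sharing database. The clip below is an invented placeholder.
propaganda_clip = b"<bytes of a clip first flagged as terrorist propaganda>"
SHARED_HASH_DB = {fingerprint(propaganda_clip)}

def moderate(upload: bytes) -> str:
    # No human in the loop: the decision ignores who uploads and why.
    return "removed" if fingerprint(upload) in SHARED_HASH_DB else "published"

# The same footage re-shared by an activist to document a violation
# is removed just like the original propaganda upload.
activist_upload = propaganda_clip
print(moderate(activist_upload))  # -> "removed"
```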
To be fair, most governments encouraged hasty censoring during ISIS’ rise. As a student of media and communication, I am worried about these mechanisms’ lack of transparency. See, machine learning always begins with human inputs – and biases. What is more, these companies are financially driven: the boundaries between counter-terrorism code and market-oriented code become blurred, if not non-existent. Algorithms are shaped by employees (flesh and blood) who are pitted against each other to promote engagement – of any kind – a boiling pot without an escape valve. Adrienne LaFrance, executive editor of The Atlantic, in her History Will Not Judge Us Kindly, reports how Facebook’s quest for “meaningful social interactions” made its News Feed teams compete with internal groups tasked with limiting harm, producing a questionable Frankenstein strategy.
Today, NGOs are the only reliable channel for evidence to reach international bodies. An example is the Syrian Archive (run by an NGO called Mnemonic). Their objective? “Archiving disappearing digital material” and hopefully working to “reinstate it” on social platforms. This, though, is only possible if they download the content before it is taken down – a matter of hours. It’s a race against time. On the ground, meanwhile, citizen journalists store the data in secret, volunteer-run archives. Source tracing in warzones is geopolitically relevant: the absence of Western journalists makes citizen journalists the world’s only source of information. Losing them means going dark.
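What that race against time might look like in practice can be sketched in a few lines. The snippet below is a hypothetical illustration, not Mnemonic’s actual pipeline: flagged URLs (which would come from monitors on the ground) are fetched immediately and stored as timestamped, hashed copies, so the evidence survives even if the original post is taken down.

```python
import hashlib, json, pathlib, urllib.request
from datetime import datetime, timezone

ARCHIVE = pathlib.Path("archive")
ARCHIVE.mkdir(exist_ok=True)

def preserve(url: str) -> dict:
    data = urllib.request.urlopen(url, timeout=30).read()
    digest = hashlib.sha256(data).hexdigest()      # integrity fingerprint
    (ARCHIVE / f"{digest}.bin").write_bytes(data)  # content-addressed copy
    record = {
        "source_url": url,
        "sha256": digest,
        "archived_at": datetime.now(timezone.utc).isoformat(),
    }
    # The metadata log is what later allows provenance to be argued before courts.
    with open(ARCHIVE / "log.jsonl", "a") as log:
        log.write(json.dumps(record) + "\n")
    return record

# Example (hypothetical URL): preserve("https://example.org/footage/clip.mp4")
```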
In Egypt, Tunisia, Sudan, Afghanistan and many other places under terror and authoritarian rule, one object fascinates me the most: database systems. Monolithic and crude, yet warm and energy-thirsty; immobile, yet enabling millions of data movements. Something like a horcrux out of Harry Potter, a Pandora’s box full of dark deeds, hidden away. Actually, very few people know what a secret humanitarian database system looks like. It could be any kind of computer, hard drive or router. Pristine, dusty, tangled in cables or a minimalist’s dream: these objects store evidence in the most disparate of file formats. Finding reliable descriptions is quite a feat. That might be because the topic is not sexy enough in the eyes of research institutes, but also because publishing rigorous literature on it would be unethical: activists would be exposed and thus put in harm’s way. Kari Andén-Papadopoulos is amongst the few experts to provide a recent account of what a database of this sort looks like – proof of their cryptic nature. She studied the Egyptian Mosireen Media Collective: a handful of volunteers collecting and sending files to NGOs. In their daily operations, they limit digital and physical interactions alike, careful not to reveal their movements. Above all, they lack the manpower and tools to distinguish genuine content from fake, which slows international legal procedures. She stresses that, even though YouTube is no longer reliable, the bulk of file exchanges between activists and international organisations still centres on it, for lack of easy access to more secure technologies.
In a discussion with a UN intern here at Uppsala University, I learnt that even humanitarian experts get their emergency briefings through a few local sources on Signal or even WhatsApp. I was surprised not to find long editorials – nor major reports – on the issue of humanitarian source tracing. It’s a grey area where shadows reign supreme. For example, what can be found in some old online articles – and is confirmed by Israeli researcher Niva Elkin-Koren – is how YouTube’s machine-learning algorithm deleted approximately 100,000 videos on Syrian chemical attacks: irreparable damage to international accountability. Take the infamous Khan Shaykhun chemical attack of 2017: while the West televised – and understood – the gravity of the situation, behind the scenes video evidence was deleted, never to see the light of day again. Robert Gorwa, a doctoral candidate at Oxford University, concludes in his article Algorithmic content moderation that algorithms exclude most international legal material from mainstream platforms and trends. Even if we live in a time when social media outrage can move entire nations, international courts remain the place where the humanitarian yarn is unravelled and geopolitical strands are pulled.
Staying with Syria: amongst media experts, the conflict is called “the first social media war”. I expected news outlets to weave their networks closer to local database systems. The main exceptions are The New York Times’ The Lede live blog, which incorporated citizens’ content, and Human Rights Watch, which worked with trusted locals. This matters because social media are used not only by pro-democracy activists but by regime-sponsored ones too. Researchers Gabriele Cosentino and Omar Al-Ghazzi explain how regimes constantly craft disinformation campaigns targeting humanitarian NGOs; via armies of trolls, they challenge opposition discourses. Often, dictatorships even intimidate the opposition with leaks of horrendous torture footage. But then again… conflicts are shady: no one knows whether these leaks are intentional or stem from security breaches.
On top of online psychological warfare, biased social media filtering takes a toll on genuine content. Studies such as Like trainer, like bot? by Reuben Binns show how perceptions of “toxicity” (how offensive a piece of content is judged to be) vary widely among the people whose labels train the models. The result is automatic censoring that is neither neutral nor consistent. And what say do we even have in deciding what “genuine content” is, to begin with? After all, objectivity is a slippery concept. Yet one thing is sure: whoever controls knowledge production owns the starting point of the discussion. Whoever defines objectivity can portray themselves as just. For Foucault – my beloved French philosopher – the “archive” is the law of what can be said, the power shared by a restricted number of people deciding what content can be produced, be it an algorithm, an oppressive dictatorship or technical limitations.
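A toy sketch can show the inheritance of bias that Binns and his co-authors describe, under invented data and assumptions of my own (the comments, annotator pools and majority-vote rule below are not from the study): the same post can be labelled toxic by one group of annotators and acceptable by another, so a classifier trained on either set of labels inherits that group’s norms.

```python
# Invented example: each comment is rated 1 (toxic) or 0 (fine) by two
# hypothetical pools of annotators whose judgments would train a filter.
comments = {
    "the regime are butchers": {"group_a": [1, 1, 1], "group_b": [0, 1, 0]},
    "share this footage widely": {"group_a": [0, 0, 0], "group_b": [0, 0, 0]},
}

def training_label(votes: list[int]) -> int:
    # Majority vote turns disagreeing human judgments into a single "truth".
    return int(sum(votes) > len(votes) / 2)

for text, votes in comments.items():
    label_a = training_label(votes["group_a"])
    label_b = training_label(votes["group_b"])
    print(f"{text!r}: trained on group A -> {label_a}, on group B -> {label_b}")

# The first comment would be censored by a model trained on group A's labels
# but kept online by one trained on group B's: neither neutral nor consistent.
```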
Applying Foucault’s notion to the Middle East, I see secret database systems as places of resistance. In a globalised media world, their local presence is vital; the secrecy of their content is paramount, as is that of their contributors. Visual evidence is a powerful currency, but every coin has two sides: local actors face an extreme risk of reprisal in the hope of international legal action. Since the end of the Second World War, Middle Eastern conflicts have been fitted into easy-to-consume, mass-produced formats. Tuned to Western tastes and televised, these formats shaped the “archives” the West formed about those countries. In the 80s, Hollywood vigorously flexed its muscles (remember Rambo III and America’s praise for the mujahideen against the Soviets?) to idealise the Muslim world as anti-communist. Later – embedded in frontline military operations – CNN revolutionised the portrayal of war. And yet… yet the information we consume on the Middle East is largely polished by news outlets. As Al-Jazeera cartoonist Khalid Albaih highlighted, Western media even preferred terrorists’ high-resolution footage to genuine activism captured with sketchy recording tools.
This distaste for rudimentary human rights content, even when verified, is an approach by Western outlets that we should be worried about. Western media – and parliaments – can do better through rigorous collaborations with local media collectives, without abandoning NGOs. People can do their part too, by demanding proper coverage from their countries’ representatives.