
Analysis

Ethiopia: Facebook and the Civil War


Note: This is a translation of an article originally published in German at Zeit Online, the website of the German weekly newspaper Die Zeit.


A professor is murdered in Ethiopia after malicious online posts. Mark Zuckerberg actually wanted to take action against lies and hate speech – and ensure that his platform no longer fueled political conflicts worldwide. Why didn’t it work? 

By Kerstin Kohlenberg

December 7, 2022 

Ethiopian Nigist Hailu mourns the death of her husband; the American Mark Zuckerberg sees the market of the future in Africa. © [M] ZEIT ONLINE; Photos: Jessica Chou/The New York Times/Redux/laif; Kerstin Kohlenberg (left)

Every murder has its own scent. The murder of Ethiopian professor Meareg Amare smells like mango and diesel. Meareg and his wife Nigist planted the mango tree behind the gate to their house. The heavy, pungent odor of diesel hangs over almost every city in Ethiopia, over Addis Ababa, the capital with its congested streets, and over Bahir Dar, where the murder happened on November 3, 2021. It’s the smell of a poor country’s rapid growth. The smell of freedom and destruction.

Nigist Hailu was in church when the call came. An unknown man was on the phone, she says, and he told her not to go home under any circumstances, but to go to the police. She was startled and set off at once. At the police station nobody would tell her anything; instead she was taken to her house. At the gate to the property, she says, a crowd had already gathered. “Then I saw Meareg. He was lying in front of the house under the mango tree.” Her husband’s mouth was open.

Nigist later learned that the three perpetrators had come in a car and on two motorcycles. All three wore the uniform of the local government’s armed special forces. They shot him, Professor Meareg Amare, in the legs and back, then disappeared.

Nigist says she collapsed, hugged her dead husband and tried to close his mouth. A single person from the crowd helped her. The others just stood there and watched.

The drive to Nigist – in Ethiopia, first names serve the role that surnames do elsewhere – leads through heavy traffic to the edge of Addis. Past goatherds driving their herds across the road while looking at their smartphones, past the gigantic shell of a building complex for a Chinese high-tech company, to sheds selling coffee. Nigist lives here, on the third floor of an apartment building, because she is no longer safe in her hometown. She had a metal grille put up in front of the door. You have to be careful here, too. The family belongs to the small ethnic group of the Tigrayans, against whom the Ethiopian government has been at war for two years. It is probably the deadliest conflict of our time. According to estimates by the University of Ghent, 500,000 people have fallen victim to it. For several weeks there has been a vague hope of peace. The parties have pledged to end the fighting. But what about the hate?

Nigist closes the door and stuffs a blanket into the gap beneath it. Nobody should hear what is said inside. She straightens the black tulle over her hair. She is 57 years old, and her hair has turned gray since her husband’s murder, she says. In the living room a corner sofa, a table, a picture of Meareg on the wall. Then nothing. A room like a waiting room. Her mother is there too; the two of them have prepared breakfast: lentils, flatbread, hot sauces.

Nigist reaches into her pocket; like every third Ethiopian, she has a smartphone. A Samsung. It was on this cell phone that the stranger called her after the murder, and on this cell phone she has saved what is left to her of her husband. She shows pictures of a tall, slim, thoughtful-looking man in a jacket and shirt, a college professor’s attire, his hands casually in his pockets. A video shows her laughing as she tries to get him to dance. In another, he softly sings to her. Tears roll down Nigist’s cheeks as she watches it. The cell phone is her lifeline. It connects her via Facebook with her four children, who have all since fled abroad, to Sweden, France and the USA. And it connects her to some of her old neighbors from her hometown. Ethiopians have never been so close to one another. Never before have they been such enemies.

Before Meareg died at the age of 60, he was hounded – on a Facebook account called BDU STAFF. BDU stands for Bahir Dar University; “staff” for its employees. The account looks like an official page the university set up for its staff, people like Meareg, who worked there for 16 years. First as a lecturer in chemistry, then as an assistant professor; shortly before his death he was given a full professorship. But BDU STAFF posts not only everyday information about campus life and academic or sporting successes; it also posts enthusiastic endorsements of the government’s war against the Tigrayans.

The account has many readers; 50,000 people follow it. On October 9, 2021, almost four weeks before the crime, a photo of Meareg was posted there with the caption: “His name is Professor Meareg Amare Abreha. He is a Tigrayan.” The post is long; the anonymous author claims that Meareg fought on the side of the Tigray People’s Liberation Front against government forces and then fled to the United States. A day later, BDU STAFF publishes another post about Meareg, again including a photo of him. This time it says that the professor embezzled university funds and used them to build his house and buy various cars.

In the following days, some of his students write in the comments under the post that Meareg is a good guy, a great teacher, a nice person. But most commenters demand revenge: “What are you waiting for? Are you sleeping? You are so embarrassed, why haven’t you drunk his blood yet?”

How should Facebook be regulated?

On no other continent is internet usage increasing as fast as in Africa. A country like Ethiopia, with its 120 million inhabitants, the vast majority of whom still have no access to the Internet, must look to a platform like Facebook like an ideal future market. In the rich countries of the West, Facebook has long since stopped growing: almost everyone has a smartphone, almost everyone has a Facebook account, and the young are even turning away from the platform. In Africa, on the other hand, Facebook’s parent company Meta is in the process of laying 45,000 kilometers of cable around the entire continent to provide 18 countries with high-speed Internet. How many lies, how much incitement will this cable bring to Africa?

Facebook has given itself a kind of house rules: rules of conduct, valid worldwide, for all three billion users. Hatred, death threats, glorification of violence, racism, sex, conspiracy theories and fake accounts are all prohibited. If a post violates the house rules, it is supposed to be flagged with a warning or deleted. Quite simple, actually. And yet quite complicated. At what point does legitimate criticism become hatred or racism? Where does freedom of expression end, and where do disinformation or the glorification of violence begin? Should what a politician writes be treated the same as what an ordinary citizen posts? And in a struggle between political camps, when is the moment just before it is too late, the moment at which you have to intervene?

Impossible to answer such questions in a set of rules aimed at almost 40 percent of humanity. Facebook constantly has to weigh the freedom of open discourse against protection from disinhibition. Of course, business interests also play a role. Facebook makes its money by showing advertising to people while they look at other people’s posts. If too many posts are deleted, revenue will eventually drop.

That’s why it came as a surprise when Mark Zuckerberg founded the Oversight Board – a kind of supreme court for Facebook. It began work in October 2020, a year before Meareg’s murder. It is meant to make final judgments about the correct application of the house rules. Facebook has promised to submit to these rulings: if its court determines that a post should be deleted, Facebook must do so. The board thus sets the limits of what can be said on the platform. Although it is financed by Facebook, through a $280 million fund, it is supposed to make its decisions independently of the company’s business interests. No one can fire its members, even if they criticize Mark Zuckerberg.

Some see the Oversight Board as proof that after all the criticism of the damage the platform has done to society, after all the debates about polarization, fake news and anonymous hate postings, Facebook has finally come to its senses. To them it looks as if Facebook now wants to take responsibility for the consequences of its actions. Former Danish Prime Minister Helle Thorning-Schmidt was recruited to the Oversight Board, as were Alan Rusbridger, former editor-in-chief of the British Guardian, and the Yemeni Nobel Peace Prize laureate Tawakkol Karman. A total of 23 lawyers, ex-politicians and journalists. They all have one thing in common: according to the New Yorker magazine, they receive a six-figure sum every year for 15 hours of work a month. And they have a reputation to lose.

The board also has critics. They see in it simply the attempt of a billion-dollar corporation to save its business model: before state authorities regulate us, we would rather regulate ourselves. A little bit.

The Oversight Board is based in London. There is no sign on the door; the exact location is meant to remain secret. Inside: high ceilings, an industrial loft look, coffee-shop atmosphere, the disguise of modernity. Normally the office would be full of people screening new cases or researching current ones. In addition to the 23 members, the board has 78 employees working here in London, in San Francisco and in Washington. But because everyone has been working from home since the coronavirus pandemic, only Thomas Hughes is there on this day in the summer of 2022. The executive director of the Oversight Board, not himself a member of the decision-making body, has just returned from California. Mark Zuckerberg had invited him and the members to a first face-to-face meeting. “Until now, everyone only knew each other from Zoom,” says Hughes.

Facebook’s Supreme Court has never granted journalists entry. The risk that some of the sometimes heated debates would leak out seemed too great. Only after much back and forth did an okay come for this visit.

Before he began working for the Board, Thomas Hughes was director of an NGO working to protect freedom of expression. All his life he has tried to turn the volume up so that people hear more than just one voice. He set up radio stations in Indonesia after the 2004 tsunami, helped found newspapers in Iraq after the war and worked on freedom of the press laws in Liberia. It was always about independent journalism. Now there is social media, says Hughes. There is a lot of yelling there, and the aim is to make the yelling bearable. “There will always be problems in a society. Also on Facebook. We just need a better system and better processes to recognize these problems.”

In just the first year after the Oversight Board was formed, users around the world appealed a decision made by Facebook more than a million times. Hughes and his staff look at the objections, then sift through them like gold miners in multiple iterations – looking for the perfect examples, precedents for dealing with hate, violence, disinformation. “I think we can help Facebook learn from these cases and get better,” says Hughes.

About once a quarter, the 23 judges select three cases they want to work on, each in small groups. Each case is then researched, experts are interviewed, and Facebook has to answer a questionnaire. A lengthy process; only 31 judgments have been handed down to date. They concerned, for example, the image of a female breast posted to raise awareness of breast cancer, the question of whether the prison conditions of the head of the Kurdistan Workers’ Party (PKK) may be discussed on Facebook, and whether it should be allowed to call someone a “coward”. In all three cases, Facebook had ordered a deletion: the breast was bare, Facebook classifies the PKK as a “dangerous organization”, and “coward” as a disparaging characterization.

But the biggest case was a different one: Donald Trump. On January 7, 2021, the day after the storming of the Capitol in Washington, Facebook permanently suspended Trump’s account. It was a highly political decision, in the midst of the culture war between Democrats and Republicans in the USA, in which Facebook has repeatedly come under fire from both sides. For many on the left, the rules on the platform are too lax; they blame Facebook for the triumph of Trumpism, for hate speech, violence and conspiracy theories from the right. Many on the right, on the other hand, suspect with every deletion: left-wing Silicon Valley wants to censor us! Now, of course, again, after the Trump affair.

Facebook itself wanted to know from its Supreme Court whether the ban was correct. The board had only existed for a few months at the time, and members wondered whether this case wasn’t a bit big to start with. “What if we make a mistake?” – this is how someone who was there at the time describes the concern. They finally accepted the case, took almost four months and came to the conclusion that Trump’s posts had contributed to the storming of the Capitol. He had told his followers the lie about the stolen election on Facebook, and they had believed it. However, the court of last instance ruled, a lifetime ban is arbitrary. The judges reduced it to half a year and demanded that Facebook establish universal rules for such decisions.

The Internet changed the way people communicate

Facebook amended its house rules and extended the ban to two years. It will expire soon, on January 7, 2023. By then at the latest, Mark Zuckerberg will have to decide whether to let Trump back in, just as Elon Musk has just done on Twitter.

When they announced their verdict on Trump, the judges made a fundamental point: they called on Facebook to keep a better eye on political conflicts like the one that had plunged the USA into such a deep crisis, on conflicts all over the world. And Facebook promised to do so. At least as important as the question of what Facebook does with Trump is the question of what Facebook has learned from Trump.

In principle, Facebook monitors its platform everywhere in the same way, whether in the USA, in Europe or in East Africa. An algorithm searches for objectionable content. If it finds any, it automatically deletes most of it. But sometimes it assigns a post to a person, who then has to decide on behalf of Facebook: get rid of it or not? In economically strong countries such as the USA or Germany, there are many such so-called content moderators. Hardly any are responsible for countries like Ethiopia, Yemen, Iraq or Myanmar. Facebook simply spends less money there. Yet these are precisely the regions that are often politically particularly unstable.

In May 2021, when the Oversight Board announced its landmark decision in the Trump case, tens of thousands of people had already died in Ethiopia. In the north of the country, where most Tigrayans live, government troops are cracking down on rebels, massacres and looting occur, and the government has cut off all humanitarian aid to the region. At Facebook, they must now quickly make good on their promise. The company has been sorting states into risk categories for some time; Ethiopia is in the highest one. Facebook can algorithmically slow the distribution of posts it deems potentially dangerous. It can feed the algorithm more keywords that lead to deletion. Or it can enable special filters to better monitor posts. What exactly Facebook is doing in Ethiopia, and whether anything has changed since Trump, remains unclear. All that is known is that Facebook is hiring new content moderators.

Everything may look good from the outside. But how does it feel when you’re right in the middle of it?

When the two anonymous Facebook posts about Meareg go online in October 2021, there is a grain of truth in them – as in any effective slander. Meareg really isn’t in his hometown. Contrary to what the BDU STAFF account claims, however, he did not flee to the USA because he had fought for the Tigray People’s Liberation Front. He went to the capital, Addis, to help relatives who had come down with the coronavirus.

In her apartment in Addis, his wife describes how worried she was after the posts. To accuse someone like that in such an explosive situation – she knew immediately that it would put him in danger. “After the war broke out, neighbors stopped greeting us. When we walked past them, they suddenly fell silent,” says Nigist. “They were friends of ours!” She asked her husband to stay in Addis for the time being. “But Meareg didn’t want to. He said he wasn’t a political person after all.”

The two met in 1982. Meareg was a young chemistry teacher in the mountainous, green north of Ethiopia, on the outskirts of the medieval city of Gondar. The country was then ruled by communists, who had Meareg arrested in 1983. He was accused of being an opponent of the regime, even back then. They tortured him, hitting his feet with nails, Nigist and her mother say. After a year he was released. They show the discharge document. Afterward, they took care of him. Nigist and Meareg married. Later they built their house in Bahir Dar and started a family.

In 1991, the Tigray People’s Liberation Front and its allies defeated the communists. A good time began for Nigist, Meareg and the four children. It didn’t matter what ethnic group you belonged to back then, Nigist says; they had good friends who weren’t Tigrayans. They converted their garage and rented it out. The mango tree in front of their house bore more fruit every year.

The new millennium came. All over the world, the Internet was changing the way people thought, communicated and acted, but in Ethiopia, for the time being, almost everything stayed the same. Hardly anyone was online, the government monitored access to the network, many sites were completely blocked, and journalists and bloggers were regularly arrested. Mass protests erupted in 2018. The new prime minister was Abiy Ahmed, who based his power in the multi-ethnic state of Ethiopia not on the old cadres from Tigray but on members of another ethnic group, the Amhara. A bad time began for the family of Nigist and Meareg and for other Tigrayans.

Abiy Ahmed lifted the censorship of the Internet, and Facebook was soon accessible without restrictions. There, Amhara users began inciting against the Tigrayans, and Amhara politicians called for “taking back” the Tigray region. Ethnic tensions increased. Just two and a half years after the new prime minister took office, the war between the government and the Tigray People’s Liberation Front broke out.

A 31-year-old man sits in the lounge of a student residence in Paris. He has witnessed radical voices on Facebook growing louder among Ethiopians. It is Abrham, the second-eldest son of Nigist and Meareg. He was actually doing his doctorate in peace and conflict studies in Ethiopia, but after the murder of his father he received a student visa for France. Abrham doesn’t speak a word of French. Sometimes it might be better not to understand anything of what is going on around you.

Abrham moved to Paris after the murder. © Kerstin Kohlenberg

He and his friends initially welcomed the election of the new government, says Abrham. “Like all young people, we were looking forward to more freedom.”

At the time, he mainly followed his cousins on Facebook and did not post anything political. And yet, a few months before the murder of his father, the Facebook algorithm placed the following post by a well-known pro-government nationalist with 250,000 followers in front of him, a Tigrayan: “The political demands of the Amhara must never again be compromised by the Tigrayans. Our fight will make them a loser. It’s over!”

As if a door had opened and cold air were blowing in, Abrham received more and more such posts after that. For example this one: “If the Amhara don’t like the color of your eyes, they will find ways and means. Let me tell you that!” The author is an Ethiopian living in the US, fueling the conflict from there. Abrham doesn’t know the man – and still got a direct message from him: “What are you doing in Bahir Dar, you dirty Tigrayan?”

The platform has known for a long time how good such content is for Facebook, from a purely economic point of view. In 2007, Facebook founded a “growth team”. Users became something like patients who were fed ever new experimental drugs in the form of constantly changing algorithms. The team observed how each algorithmic cocktail affected a user’s behavior. Did they stay on Facebook longer, so that more ads could be shown to them? It soon became clear which drug worked best: hate. People react most strongly to what upsets them most. They share their indignation; they want more of it. In this way, Facebook becomes not a place for friendly get-togethers but a football stadium, where people bond by being against the fans of the other team.

Abrham reported nearly 20 posts to Facebook. It’s very easy: you move the mouse to the three dots on the right edge of a post, click, a window opens, you press “Report post”, click again, choose between categories such as “harassment”, “terrorism” or “hate speech”. Click. Done.

Each time, Facebook refused to delete them, on the grounds that the content did not violate the house rules. If Abrham no longer wanted to see such messages in his news feed, he could block the person, Facebook wrote to him.

Some on the Oversight Board know exactly what it’s like to be the target of a hate campaign on social media. Maina Kiai is one of four board members from Africa. On a day in the summer of 2022, he is sitting in a restaurant in Washington, ordering tapas. Kiai is a former UN special rapporteur and former director of Amnesty International’s Africa program and currently works for Human Rights Watch. After the 2007 presidential election in his native Kenya, many people died after the losing candidate claimed the election had been rigged. Kiai later lobbied for an investigation into the riots, after which he was attacked online. At some point, the attackers were standing outside his door. To this day he does not have a Facebook account. “That’s too intrusive for me.”

After an acquaintance wrote to him that Facebook wanted to talk to him about a position on the Oversight Board, he thought about it for a long time, says Maina Kiai. He asked around to find out who else was involved – all great people. And he was interested in finding out how Facebook actually works.

His conclusion so far in this job? In 70 percent of the cases, he and his colleagues have reversed a decision made by Facebook, says Kiai. The platform has complied every time. Based on the lessons from these cases, they have also made 86 general recommendations to Facebook. Only 28 of them have been fully implemented so far. At least a start, thinks Kiai. Facebook has translated the house rules into numerous additional languages and now answers its judges’ questions about individual cases more often.

He sounds like a politician describing the agonizingly slow transformation of a powerful global institution. Meanwhile, however, this institution is changing the world at the speed of light – and is not always able to keep up.

The people who are supposed to deal with the political crisis in Ethiopia for Facebook work in Nairobi, the capital of Kenya. They are the content moderators who review potentially dangerous posts. One of them will be called Senait here; she does not want her real name made public for fear of losing her job. Senait studied linguistics and works for a subcontractor commissioned by Facebook. When she talks on the phone about her everyday life in the fight against hate, one thinks the same thing as when speaking to Maina Kiai: yes, Facebook is doing something. But not enough.

There are now 32 people on the team, more than before. Some newcomers have just been hired who speak not only Amharic but also Tigrinya, so posts from both sides of the ethnic conflict can now be reviewed. But the conflict arrived here long ago too, in the middle of the capital of a neighboring country, says Senait. One member of the team, an Amhara, almost never deleted hate speech by Amhara nationalists, for example. At some point it was noticed, and the woman had to go. The problem of too little time remained.

Senait has 50 seconds to decide whether a post needs to be deleted. Each week her average processing time is evaluated; the company calls it action handling time. “That’s a lot of pressure,” says Senait. Her team recently got a raise: instead of $600 they now get $800 a month. Before that, five people had resigned. The high turnover makes it difficult to gain experience – to learn to separate the harmful from the harmless.

Even though Facebook knew, accounts weren’t deleted

Do you know the BDU STAFF account?

Senait pulls it up on Facebook. No, she doesn’t know it, she says. She scrolls and clicks through the page, “hm, hm”, click, click, “lots of slurs against the Tigrayans”, click, click. Senait is surprised at how often university employees are accused, without evidence, of having stolen funds from the university or of being rich for other dubious reasons. Click, click, “oh – now that’s really problematic”. One of the comments reads: “If you were a real man, you would kill them all.” She looks at the Facebook pages of some of the commenters. “They belong to the Fano militia,” she says. Fano militia. That no longer sounds like everyday life on campus. It is an armed group of the Amhara that fought on the side of the government. Independent observers accuse it of taking part in war crimes against the Tigrayans, of mass rape and ethnic cleansing.

Now one might assume that Facebook is simply drowning in the mass of posts. But a document from former Facebook employee and whistleblower Frances Haugen shows that the platform knows exactly who is fueling the conflict in Ethiopia. The document, which is available to ZEIT, was intended only for internal use at Facebook and was never meant to be published. It describes a network of Facebook accounts that have one thing in common: they are linked to the Fano militia. This network, it says, “spreads incitement to violence and hate speech in Ethiopia.”

So Facebook knew, yet it saw no reason to delete the accounts.

Why that is so becomes clear from a second document from the whistleblower, which is also available to ZEIT. Mark Zuckerberg is quoted in it as supporting additional algorithms that delete, slow down and block dangerous content. However, he makes one caveat: such algorithms must not have any impact on the growth of the platform. It’s as if the railway control center radioed the driver’s cab: by all means drive more slowly, but only as long as you don’t arrive at your destination any later.

And then there’s the matter of the celebrities. Facebook has a kind of special program for politicians, journalists, stars and other people with many followers. It’s called cross-check. Anyone Facebook includes in this program does not have to adhere quite so strictly to the house rules. While the members of the Oversight Board were deliberating on Donald Trump’s ban, Facebook told them that cross-check concerned only a few public figures.

Frances Haugen, the whistleblower, later revealed how many users Facebook apparently sorted into exactly this program: 5.8 million.

On Tuesday of this week, Facebook’s Supreme Court addressed the public, urging the company to massively overhaul its cross-check program. Anyone who puts commercial interests in the foreground as much as Facebook does with its special treatment of celebrities, it says, is not living up to its human rights responsibilities. You can tell from the statement that the people on the Oversight Board are angry.

“It opened our eyes,” says board member Suzanne Nossel about the fact that Facebook lied to her and her colleagues. Nossel is a lawyer and head of the writers’ organization PEN America. She is aware, she says, that the Oversight Board will be judged on its ability to get a handle on the negative impact of Facebook. “That is, the destruction of public discourse.” Nossel pauses for a moment. “We’re not in the best position to do that at the moment.” If they just sit on the sidelines and look at one case after another without changing the structure of Facebook, then that is wasted time.

Social media and hate. You could compare it to another existential crisis in which time is running out. Perhaps hate is to platforms like Facebook what gas and coal are to traditional industrial societies: the fuel that drives growth. And maybe a civil war, just like global warming, is the consequence one accepts in the bargain. They promise to fight it, but in reality do little.

It could have turned out differently. Facebook’s growth was politically driven from the start. More than a quarter of a century ago, American politicians decided to exempt social media from legal responsibility for the content posted on their websites. It was a gift from Washington to Silicon Valley. A jump-start that seemed harmless at the time. Algorithms that steer people’s attention toward hate did not yet exist. It was unthinkable that a billion-dollar Californian corporation would one day have a say in the course of political conflicts around the world.

In August 2021, Suzanne Nossel, Maina Kiai and the other board members took on a case from Ethiopia. A Facebook user had published a post claiming that Tigrayan civilians were helping fighters of the Tigray People’s Liberation Front commit atrocities: they were leading the militias from door to door and killing Amhara women and children. He had gotten the information from people in the affected regions. The post ended with the words: “We will win our freedom through our struggle.”

The deletion of the Facebook post came too late

Facebook’s algorithms flagged the post. The Nairobi team reviewed it and decided it violated the house rules. The post was deleted. Its author appealed the decision on Facebook. The content team reviewed it a second time and again ruled: the post must be deleted.

But the author did not give up. He now turned to the Oversight Board, which took up the case. A heated debate ensued. Some of the members, those involved say today, were in favor of deletion – and pointed to another civil war country, Myanmar. There, rumors had been spread on Facebook for a while to defame members of the country’s Muslim minority. The hatred culminated in a devastating outbreak of violence: thousands of Muslims were murdered and almost a million fled to neighboring Bangladesh. Facebook long downplayed its role in this genocide. Only after an official United Nations investigation did the company admit it hadn’t done enough to prevent the violence.

The members who were against deletion invoked the right to information. People in Ethiopia could not inform themselves through the official media; there was hardly any reporting on the conflict. In such a situation, a post on a platform like Facebook could give the population important information.

Using the Ethiopia case, the board wanted to answer the question of how to deal with rumors of atrocities in a country where people can often only warn each other of danger via social media. A matter of weighing, once again, the violence a rumor may incite against the protection it can offer if it is true.

Board member Nossel says the Ethiopia case has been one of her most difficult to date. How should they decide?

On October 14, 2021, Professor Meareg’s son Abrham reported the BDU STAFF account’s inflammatory posts about his father to Facebook.

On October 30, the Facebook algorithm surfaced a post on Abrham’s Facebook page: “46,000 Tigrayans live with their families in Bahir Dar. Whether we like it or not, we have to defend ourselves against the terrorists. The stupid and deaf will disappear.”

This time Abrham knows the author. It’s his best friend. The two were neighbors, they went to school together, they played soccer together, they used to tell each other when they fell in love with a girl.

On October 31, 2021, the friend wrote: “We must closely monitor the Tigrayans living in Gondar or Bahir Dar and take the necessary steps. If we are as cruel as they are, then we can prevail.”

Three days later, on the morning of November 3, 2021, Meareg drives to his university one last time. Shortly before, his employer had suggested that he resign. Nobody there talks to him anymore. He picks up a few last things and drives back home. There he meets his killers.

On November 4th, Facebook deploys an extra emergency team to Ethiopia.

On November 11, the company sends Abrham a message: the post about his father violated community standards and has been removed. His father has been dead for a week.

On December 14, 2021, the Oversight Board delivers its verdict in the case from Ethiopia: the post with the rumor should be deleted. In addition, it recommends that Facebook commission an independent review: what role does the platform play in this conflict? Facebook replies that something like that is very “time-consuming”. To date, nothing has happened.

Meareg has been dead for a year now. His wife, Nigist, does not know whether and where he was buried. The house they shared was taken from her and now serves as quarters for the armed special forces.

The BDU STAFF account has since agitated against several other Bahir Dar University professors and staff members alleged to be Tigrayans. Asked about it, the university confirms that BDU STAFF is an anonymous account that it does not operate and that causes it big problems. The account is still online. Facebook’s parent company Meta did not respond to repeated requests for comment.

In the meantime, the US Supreme Court has also become involved. Not the Oversight Board, but the real one, the Supreme Court in Washington. For the first time, it has accepted a case in which it must decide whether digital platforms like Facebook are responsible for their content – and therefore also for the consequences that content has in the real world. For example, for the death of a 60-year-old chemistry professor in a town in northwestern Ethiopia.


1 Comment

  1. Abrham

    December 9, 2022 at 4:25 am

    Dear @Tghat, I truly appreciate the dedication in translating and republishing the lifetime achievements and silenced tragedy of our father caused by hate speech and disinformation.
