Meta’s independent appeals system for challenging its decisions to delete content on Facebook or Instagram attracted about 1.1 million cases during its first year.

Most of the disputed posts came from Europe, the US, or Canada, and had largely been removed on grounds of violence, hate speech, or bullying.

The Oversight Board ruled against Meta in 14 of the 20 cases on which it published decisions.

One case involved the removal of images of breasts from a post about breast cancer.

Other cases included an image of a child who had died, posted alongside text questioning whether China’s treatment of Uighur Muslims was justifiable, and the decision to ban Donald Trump after the Capitol Hill riots.

The board overturned Meta’s decision to remove the first two examples but supported its decision to ban Donald Trump, even though it criticised the “indefinite” period of the ban.

Meta admitted that it had made the wrong decision in 51 cases.

Thomas Hughes, the board’s director, said it was looking to pursue “emblematic” cases with “problematic components”.

He said that hate speech, violence, and bullying were issues that are “difficult to judge”, especially for automated systems.

“Also, in many of these cases context is extremely important,” he added.

The board has just published its first annual report, covering the period from October 2020 to December 2021.

Meta itself, as well as anyone who disagrees with a decision to remove content, can appeal to the board. Only 47 of the 1.1 million cases received over the 14-month period came from the firm.

On average, 2,600 cases were submitted per day.

With Facebook’s more than two billion users worldwide, this represents a tiny fraction of its vast volume of content. It was also notable how few complaints came from other regions.


The Oversight Board, sometimes described as a “supreme court”, was created by Meta boss Mark Zuckerberg. Although it operates as an independent entity, its wages and other expenses are covered by Meta. Its members include journalists, lawyers, academics, and human rights activists.

Mr Hughes described the relationship between Meta and the board as “constructive, but critical”.

The board also made 86 recommendations to the tech company, including translating its policies into more languages and being more specific in explaining why content was removed for hate speech.