Dive into the virtual whirlpool of Meta, previously known as Facebook, as we explore the company's decision to draw the blinds on two posts depicting the raw human suffering sparked by the Israel-Hamas war. That veil was soon lifted - albeit with warning screens added due to the violent content. Yet the spectacle isn't over: thanks to its magic show-like autonomy, the company is not compelled to heed critiques of its decisions. A genuinely riveting storyline, wouldn't you agree?
Let's turn the page to the next chapter - the Oversight Board, the entity created by Meta to be its moral compass, expressed discontent. Its bone of contention lies in Meta's knack for making the reinstated posts 'disappear' from Facebook and Instagram's recommendations. Its argument? Such posts, intended to shed light on the reality of war, were now out of sight, out of mind. Moreover, it pointed out Meta's sleight of hand with automated tools, which swiftly swept away 'potentially harmful' content - and, possibly, invaluable posts that spoke truth to power and could bear evidence of human rights violations. The Board urged Meta to preserve such content, a plea that still hangs in the ether.
Our adventure into Meta continues with a sharp turn into two urgent case files, metaphorically ripped from the Oversight Board's vault - its first-ever expedited rulings, decided within 12 days instead of the usual 90.
File one: The removal and subsequent reinstatement of a heartbreaking Instagram video documenting the grotesque aftermath of a strike near Al-Shifa Hospital in Gaza City. It was censored initially for dancing too close to Meta's no-tolerance waltz with violent, graphic content. Even upon its return to the platform, it was treated as an elusive phantom - blanketed by a warning screen and demoted to reduced visibility. A decision the Board decidedly disapproves of.
File two: A chilling Facebook video revealing an Israeli woman's fervent plea to her captors to spare her life during the terrifying Hamas raids. The Board observed a steep rise of approximately 200% in the daily average of user appeals connected to the Middle East and North Africa region following these events.
The plot thickens as Meta welcomed the Oversight Board's ruling, which it said balanced both expression and safety. Asserting that no further action was needed following its initial backpedaling on the removals, it declared the cases closed, neatly sidestepping any further updates or recommendations.
An intriguing climax indeed. But behold, even at the end there's a twist! Meta had temporarily relaxed its guard, lowering the thresholds for automated tools that ferret out violative content - a move the Board noted increased the likelihood of innocent content being erroneously erased. As of Dec. 11, Meta confirmed those thresholds remained below their pre-Oct. 7 levels.
Meta's creation of the Oversight Board was a response to a clamor for quicker action against misinformation, hate speech, and propaganda campaigns littering its platforms. Now comprising 22 members - a global mix of legal scholars, human rights experts, and journalists - the board issues rulings that are binding in individual cases such as these. However, its wider policy recommendations are only advisory, keeping Meta's sovereignty intact.
And so, our journey through Meta's labyrinth ends, leaving behind a conundrum of questions - a virtual Pandora's box waiting to be explored.