{"id":1190,"date":"2021-11-15T00:19:00","date_gmt":"2021-11-15T00:19:00","guid":{"rendered":"https:\/\/share.242studio.com\/zaboravljena-ljudska-karika-kritika-aktuelne-rasprave-o-algoritamskoj-moderaciji-sadrzaja-iz-perspektive-rada\/"},"modified":"2025-02-02T13:54:32","modified_gmt":"2025-02-02T13:54:32","slug":"forgotten-humans-in-the-loop-labour-oriented-critique-of-the-current-discussion-of-algorithmic-content-moderation","status":"publish","type":"post","link":"https:\/\/sharefoundation.info\/en\/forgotten-humans-in-the-loop-labour-oriented-critique-of-the-current-discussion-of-algorithmic-content-moderation\/","title":{"rendered":"Forgotten humans-in-the-loop. Labour-oriented critique of the current discussion of algorithmic content moderation"},"content":{"rendered":"\n<p>In the past decade, thriving online harassment and hate speech, armies of bots spreading disinformation as a mean of interference in the elections, far-right propaganda, and waves of obscurantism disseminating through the COVID-19 related fake news repeatedly made online platforms\u2019 content moderation the topic on everyone\u2019s lips and newsfeeds. A recent round of discussion was triggered by the suspension of former US president Donald Trump\u2019s Twitter account in January 2021 in the aftermath of Capitol storming. Twitter\u2019s controversial decision to limit interactions, and later hide Trump\u2019s posts, followed by the ultimate suspension of his account, was&nbsp;<a href=\"https:\/\/blog.twitter.com\/en_us\/topics\/company\/2020\/suspension.html\" target=\"_blank\" rel=\"noreferrer noopener\">taken<\/a>&nbsp;at the board of directors level. However, not every single user on the platform is paid equal attention when it comes to content moderation. More often, due to the large scope of data, machine learning systems are tasked with moderation. 
Human moderators are only responsible for reviewing machine decisions when users appeal those decisions or when the machine learning algorithms flag a given case as contentious.<\/p>\n\n\n\n<p>Concerned by the implications algorithmic moderation can have for freedom of expression, David Kaye, a former UN special rapporteur on the promotion and protection of the right to freedom of opinion and expression,&nbsp;<a href=\"https:\/\/www.ohchr.org\/EN\/Issues\/FreedomOpinion\/Pages\/ReportOnlineHateSpeech.aspx\" target=\"_blank\" rel=\"noreferrer noopener\">called on<\/a>&nbsp;online platforms to \u201c[ensure] that any use of automation or artificial intelligence tools [in this particular case, for the enforcement of hate speech rules] involve human-in-the-loop\u201d. Although the watchdog expressed valid concerns about the implications of algorithmic moderation for freedom of expression, there is a strong case to argue that humans have never been out of the content moderation loop.&nbsp;<\/p>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>Human-in-the-loop refers to the need for human interaction with machine learning systems in order to improve their performance. Indeed, algorithmic moderation systems cannot function without humans serving them. The machine has to be designed, created, maintained and constantly provided with new training data, which requires a complex human labour supply chain.<\/p>\n<\/blockquote>\n\n\n\n<p>Given the increasing relevance of content moderation in public discourse, it is important to adopt a labour-oriented perspective to understand how algorithmic moderation functions. Contrary to the&nbsp;<a href=\"https:\/\/viafoura.com\/blog\/human-vs-machine-moderation-wars\/\" target=\"_blank\" rel=\"noreferrer noopener\">popular<\/a>&nbsp;fallacy that contraposes machine moderation to human moderation, current moderation is in fact a mixture of humans and machines. 
In other words, humans are pretty much \u201cin-the-loop\u201d.<\/p>\n\n\n\n<p>The aggregated supply chain of this human involvement is presented visually below:<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/sharefoundation.info\/wp-content\/uploads\/lobanov1.png\" alt=\"\" class=\"wp-image-4386\"\/><\/figure>\n\n\n\n<p><em>Human labour supply chain involved in algorithmic content moderation<\/em><\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Boards of Platforms<\/h3>\n\n\n\n<p>Algorithmic moderation starts with platforms\u2019 board members and the consultants they employ, who decide on the general rules for moderation. These rules are primarily a product of the need to manage external risks, among them pressures from civil society, regulators, and internet gatekeeping companies.<\/p>\n\n\n\n<p>For instance, Facebook\u2019s initial policy was to remove breastfeeding photos. The platform defended itself by referring to its general no-nudity policy, which photos of fully exposed breasts violated. It was only in 2014, after&nbsp;<a href=\"https:\/\/www.huffpost.com\/entry\/freethenipple-facebook-changes_b_5473467\" target=\"_blank\" rel=\"noreferrer noopener\">years of pressure<\/a>&nbsp;from Free the Nipple activists, that Facebook allowed pictures of women \u201cactively engaged in breastfeeding\u201d. 
Another achievement of civil society pressure was the tightening of most platforms\u2019 policies on hate speech such as misogyny, racism, and explicit threats of rape and violence.<\/p>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>A number of more established civil society groups also propose&nbsp;<a href=\"https:\/\/www.article19.org\/resources\/why-decentralisation-of-content-moderation-might-be-the-best-way-to-protect-freedom-of-expression-online\/\" target=\"_blank\" rel=\"noreferrer noopener\">solutions<\/a>&nbsp;on how to moderate, based either on their own ethical frameworks or on existing norms of national and international law. Intergovernmental organizations like the&nbsp;<a href=\"https:\/\/www.osce.org\/files\/f\/documents\/9\/f\/456319_0.pdf\" target=\"_blank\" rel=\"noreferrer noopener\">OSCE<\/a>, the&nbsp;<a href=\"https:\/\/www.coe.int\/en\/web\/freedom-expression\/internet-freedom1\" target=\"_blank\" rel=\"noreferrer noopener\">Council of Europe<\/a>, and&nbsp;<a href=\"https:\/\/www.ohchr.org\/EN\/Issues\/FreedomOpinion\/Pages\/ContentRegulation.aspx\" target=\"_blank\" rel=\"noreferrer noopener\">the United Nations<\/a>&nbsp;have their own projects dedicated to ensuring freedom of expression while respecting the existing norms of international law.<\/p>\n<\/blockquote>\n\n\n\n<p>Pressure from regulators has manifested itself most visibly in the adoption of rules and practices aimed at curtailing the spread of terrorist content and fake news. 
The latter problem, which became widely discussed after the alleged interference of Russia\u2019s government in the US elections, received additional attention due to the&nbsp;<a href=\"https:\/\/www.bbc.com\/news\/world-us-canada-57870778\" target=\"_blank\" rel=\"noreferrer noopener\">biopolitical concerns<\/a>&nbsp;raised by the Covid-19 pandemic.<\/p>\n\n\n\n<p>When it comes to pressure from gatekeeping companies, the example of the Parler social network is the most graphic. This social network&nbsp;<a href=\"https:\/\/www.nytimes.com\/2021\/01\/09\/technology\/apple-google-parler.html\" target=\"_blank\" rel=\"noreferrer noopener\">ceased to exist<\/a>&nbsp;after Amazon stopped providing the platform with cloud computing under the pretext of insufficient moderation, and both major distribution platforms, the App Store and Google Play, suspended Parler\u2019s apps. In a similar fashion, the nudity ban on Tumblr, which led to a mass exodus of users from that platform, came after Apple banned Tumblr\u2019s app from the iOS App Store due to reported child pornography. Likewise, Telegram\u2019s CEO Pavel Durov reported that his decision to remove channels disclosing the personal data of law enforcement officers responsible for the brutal dispersal of rally participants in Russia was forced by a gatekeeping company:&nbsp;<a href=\"https:\/\/t.me\/durov_russia\/30\" target=\"_blank\" rel=\"noreferrer noopener\">he claimed<\/a>&nbsp;that Apple did not allow the update for the iOS app to be released until these channels were removed. 
During the 2021 Russian parliamentary elections, Apple and Google,&nbsp;<a href=\"https:\/\/www.aljazeera.com\/news\/2021\/9\/2\/russia-warns-google-apple-navalny-app\" target=\"_blank\" rel=\"noreferrer noopener\">being squeezed<\/a>&nbsp;by the Russian government, in their turn&nbsp;<a href=\"https:\/\/telegra.ph\/Why-Telegram-had-to-follow-Apple-and-Google-when-they-suspended-a-voting-app-09-25\" target=\"_blank\" rel=\"noreferrer noopener\">demanded<\/a>&nbsp;that Telegram suspend a chatbot associated with the Smart Voting project run by the allies of jailed politician Alexey Navalny. The chatbot&nbsp;<a href=\"https:\/\/en.wikipedia.org\/wiki\/Smart_Voting\" target=\"_blank\" rel=\"noreferrer noopener\">provided<\/a>&nbsp;recommendations for Russian voters on which candidates to support in order to prevent representatives of the ruling party from winning mandates.<\/p>\n\n\n\n<p>Not every platform\u2019s CEOs employ, or at least report using, algorithmic moderation systems as a solution. Clubhouse\u2019s moderation, for example,&nbsp;<a href=\"https:\/\/cyber.fsi.stanford.edu\/io\/news\/clubhouse-china\" target=\"_blank\" rel=\"noreferrer noopener\">works<\/a>&nbsp;without employing any machine-learning algorithm. The conversations are recorded and stored by Agora, the Shanghai-based company providing the back-end for Clubhouse. In case of a complaint, either by users or by a government, the platform can study the recording and pass a verdict.<\/p>\n\n\n\n<p>The decision on whether to manage the aforementioned risks with the help of algorithmic moderation always lies with the board members of the platform. 
The boards create the rules, the engineers find ways to enforce them \u2013 although the border is not well demarcated, given that CEOs often hold degrees in engineering themselves and might also be directly involved in designing the systems\u2019 architecture.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Engineers<\/h3>\n\n\n\n<p>It is engineers who decide which algorithm to introduce for content moderation, design and maintain that algorithm, and seek ways to modernize or replace it.<\/p>\n\n\n\n<p>Engineers choose between&nbsp;<a href=\"https:\/\/journals.sagepub.com\/doi\/full\/10.1177\/2053951719897945\" target=\"_blank\" rel=\"noreferrer noopener\">two main categories<\/a>&nbsp;of algorithms, both of which are commonly applied to content moderation.<\/p>\n\n\n\n<p>One category of algorithms searches for partial similarity (perceptual hashing) between newly uploaded content and an existing database of inappropriate content. For example, perceptual hashing is effective in preventing the circulation of inappropriate content such as viral videos of mass shootings, extremist texts or copyrighted films and songs. The most well-known example of a perceptual hashing-based system is the Shared Industry Hash Database (SIHD), used by companies like Google, Facebook, Microsoft, LinkedIn, and Reddit. The database, created in 2017, contains terrorism-related content and has been criticized for its lack of transparency.<\/p>\n\n\n\n<p>The second category encompasses algorithms that predict (machine learning) whether content is inappropriate. Machine learning technologies like speech recognition and computer vision are effective in classifying user-generated content that infringes on the platforms\u2019 terms of service (ToS). 
This technology has, however, drawn criticism for discriminating against certain groups, as in the case of overmoderation of&nbsp;<a href=\"https:\/\/www.aclweb.org\/anthology\/W19-3504\/\" target=\"_blank\" rel=\"noreferrer noopener\">tweets written in Black English<\/a>. These biases are not generated by the algorithm itself, but stem from inappropriately compiled datasets used to train that algorithm.<\/p>\n\n\n\n<p>Driven by orders from the platforms\u2019 boards, engineers constantly seek new datasets to improve the work of their algorithms. These datasets are manually labelled by human moderators, outsourced data flaggers, and regular users.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Human moderators<\/h3>\n\n\n\n<p>The main role of human moderators is to review users\u2019 appeals against particular machine-made decisions and to decide those cases where the machine learning algorithm\u2019s confidence is low. Moderators often work for outsourced companies based in the Global South, and the conditions of their labour are a&nbsp;<a href=\"https:\/\/restofworld.org\/2020\/facebook-international-content-moderators\/\" target=\"_blank\" rel=\"noreferrer noopener\">matter of concern for human rights activists<\/a>. Besides usually being in economically precarious situations, the moderators suffer&nbsp;<a href=\"https:\/\/mashable.com\/article\/the-cleaners-content-moderators-facebook-twitter-google\/\" target=\"_blank\" rel=\"noreferrer noopener\">huge psychological pressure<\/a>&nbsp;from dealing with very sensitive content, like videos of live-streamed suicides, on a daily basis.<\/p>\n\n\n\n<blockquote class=\"wp-block-quote is-layout-flow wp-block-quote-is-layout-flow\">\n<p>Moderators\u2019 role is not limited to resolving disputes between the user and the platform. 
If human moderators confirm that the uploaded content violates the ToS of the platform, this content, now verified by an expert, can further augment the dataset used for algorithmic training.<\/p>\n<\/blockquote>\n\n\n\n<p>The mainstream practice of human moderation presupposes the anonymity of the moderators. Facebook, however, has taken a pioneering approach. In an attempt to meet demands for increased transparency and improve the company\u2019s legitimacy, Facebook\u2019s board has introduced a system strongly reminiscent of the constitutional separation of powers employed by nation states. Indeed, the idea of creating a quasi-legal judicial body within Facebook dedicated to content moderation matters&nbsp;<a href=\"https:\/\/www.newyorker.com\/tech\/annals-of-technology\/inside-the-making-of-facebooks-supreme-court\" target=\"_blank\" rel=\"noreferrer noopener\">came from<\/a>&nbsp;Noah Feldman, a professor at Harvard Law School who took part in drafting the interim Iraqi constitution.<\/p>\n\n\n\n<p>In 2020, the platform established the so-called Oversight Board (OB), referred to by commentators as&nbsp;<a href=\"https:\/\/www.bloomberg.com\/news\/articles\/2021-04-15\/facebook-supreme-court-weighs-trump-s-social-media-fate\" target=\"_blank\" rel=\"noreferrer noopener\">Facebook\u2019s Supreme Court<\/a>. The OB comprises twenty members&nbsp;<a href=\"https:\/\/www.newyorker.com\/tech\/annals-of-technology\/inside-the-making-of-facebooks-supreme-court\" target=\"_blank\" rel=\"noreferrer noopener\">\u201cpaid six-figure salaries for putting in about fifteen hours a week\u201d<\/a>, among whom are acknowledged human rights activists, journalists, academics, lawyers, as well as former judges and politicians. By October 2021, the OB had adopted 18 decisions, some of which overturned the initial decision passed by anonymous human moderators or the board itself. 
Other decisions, the most significant of which is Trump\u2019s account suspension, have been upheld by the OB. In passing its decisions, the OB refers both to the platform\u2019s community guidelines and to international human rights standards, namely the provisions of the International Covenant on Civil and Political Rights. Clearly, twenty OB members are unable to review all the cases eligible for appeal, so their goal is limited to reviewing the most representative ones, chosen by Facebook, in order to produce advisory policy recommendations and, supposedly, create precedents that human moderators can refer to in their practice. The company&nbsp;<a href=\"https:\/\/about.fb.com\/news\/2021\/01\/responding-to-the-oversight-boards-first-decisions\/\" target=\"_blank\" rel=\"noreferrer noopener\">states<\/a>&nbsp;that their \u201cteams are also reviewing the board\u2019s decisions to determine where else they should apply to content that\u2019s identical or similar\u201d.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Crowdsourced flaggers<\/h3>\n\n\n\n<p>Algorithm training sets are usually compiled by crowdsourced flaggers who classify content for a small financial reward, working through platforms such as Amazon\u2019s \u2018Mechanical Turk\u2019 or Yandex\u2019s \u2018Toloka\u2019. On Yandex Toloka, for example, flaggers are tasked with classifying images into the following six categories: \u201cpornography\u201d, \u201cviolence\u201d, \u201cperversion\u201d, \u201chinting\u201d, \u201cdoesn\u2019t contain porn\u201d, \u201cdidn\u2019t open\u201d. As shown below, the left image, taken from the tutorial, is classified as \u201cdoesn\u2019t contain porn\u201d, while the other two images are classified as \u201chinting\u201d. The explanatory signs indicate that the middle image displays \u201can obvious focus on the genital area\u201d while the right image shows an anatomical depiction of genitals. 
These classified datasets are most probably used by Yandex to moderate its social media platforms like Messenger and Zen, the latter of which enjoys relative popularity in the Russian segment of the Web. At the same time, the explicitly norm-prescribing manner in which these datasets are compiled serves to illustrate the&nbsp;<a href=\"https:\/\/nooscope.ai\/\" target=\"_blank\" rel=\"noreferrer noopener\">observation<\/a>&nbsp;that \u201cthe training dataset is a cultural construct, not just a technical one\u201d.<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img decoding=\"async\" src=\"https:\/\/sharefoundation.info\/wp-content\/uploads\/lobanov2.png\" alt=\"\" class=\"wp-image-4391\"\/><\/figure>\n\n\n\n<h3 class=\"wp-block-heading\">Users<\/h3>\n\n\n\n<p>Regular users of online platforms also contribute to training algorithms and to updating the databases used by similarity-searching algorithms by reporting content they deem inappropriate. While for users themselves reporting is a way to make their voices heard by the platform, for the platform the feedback is doubly valuable: as user feedback in its own right, and as unpaid labour mapping out the training datasets for predictive algorithms.&nbsp;<\/p>\n\n\n\n<p>Once a sufficient number of users report that a piece of content doesn\u2019t meet the requirements of the platform, the content is sent to the human moderators for further review. If the moderator confirms that the content violates the ToS of the platform, those users have demonstrably contributed to improving the algorithms.<\/p>\n\n\n\n<h3 class=\"wp-block-heading\">Conclusion<\/h3>\n\n\n\n<p>While current mainstream approaches to the analysis of automated moderation systems focus strictly on the technical details of how the algorithms work, the people involved go unseen. 
This paper pays tribute to the humans whose labour makes automated moderation possible but who remain lost in the false human-machine dichotomy, when in fact the current practice of content moderation is an assemblage of intertwined humans and machines.<\/p>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<p><em>Ilya Lobanov is an independent researcher from Saint-Petersburg, currently based in Vienna. His interests lie in the areas of political economy of digital capitalism, urban politics, and history of mind.<\/em><\/p>\n","protected":false},"excerpt":{"rendered":"<p>In the past decade, thriving online harassment and hate speech, armies of bots spreading disinformation as a mean of interference in the elections, far-right propaganda, and waves of obscurantism disseminating through the COVID-19 related fake news repeatedly made online platforms\u2019 content moderation the topic on everyone\u2019s lips and newsfeeds. A recent round of discussion was [&hellip;]<\/p>\n","protected":false},"author":2,"featured_media":1072,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[26],"tags":[35,37],"class_list":["post-1190","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-uncategorized","tag-algorithms-and-ai","tag-freedom-of-expression"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.3 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Forgotten humans-in-the-loop. 
Labour-oriented critique of the current discussion of algorithmic content moderation - SHARE Fondacija<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/sharefoundation.info\/en\/forgotten-humans-in-the-loop-labour-oriented-critique-of-the-current-discussion-of-algorithmic-content-moderation\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Forgotten humans-in-the-loop. Labour-oriented critique of the current discussion of algorithmic content moderation - SHARE Fondacija\" \/>\n<meta property=\"og:description\" content=\"In the past decade, thriving online harassment and hate speech, armies of bots spreading disinformation as a mean of interference in the elections, far-right propaganda, and waves of obscurantism disseminating through the COVID-19 related fake news repeatedly made online platforms\u2019 content moderation the topic on everyone\u2019s lips and newsfeeds. 
A recent round of discussion was [&hellip;]\" \/>\n<meta property=\"og:url\" content=\"https:\/\/sharefoundation.info\/en\/forgotten-humans-in-the-loop-labour-oriented-critique-of-the-current-discussion-of-algorithmic-content-moderation\/\" \/>\n<meta property=\"og:site_name\" content=\"SHARE Fondacija\" \/>\n<meta property=\"article:published_time\" content=\"2021-11-15T00:19:00+00:00\" \/>\n<meta property=\"article:modified_time\" content=\"2025-02-02T13:54:32+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/sharefoundation.info\/wp-content\/uploads\/2021\/11\/Op-ed-o-moderaciji_SHARE-OPINION-1200x550-Web-cover-small.png\" \/>\n\t<meta property=\"og:image:width\" content=\"1200\" \/>\n\t<meta property=\"og:image:height\" content=\"550\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/png\" \/>\n<meta name=\"author\" content=\"Aleksa Boric\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:label1\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data1\" content=\"Aleksa Boric\" \/>\n\t<meta name=\"twitter:label2\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data2\" content=\"11 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\\\/\\\/schema.org\",\"@graph\":[{\"@type\":\"Article\",\"@id\":\"https:\\\/\\\/sharefoundation.info\\\/en\\\/forgotten-humans-in-the-loop-labour-oriented-critique-of-the-current-discussion-of-algorithmic-content-moderation\\\/#article\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/sharefoundation.info\\\/en\\\/forgotten-humans-in-the-loop-labour-oriented-critique-of-the-current-discussion-of-algorithmic-content-moderation\\\/\"},\"author\":{\"name\":\"Aleksa Boric\",\"@id\":\"https:\\\/\\\/sharefoundation.info\\\/#\\\/schema\\\/person\\\/855bab9a30b13d42de38ab5f43b68bfd\"},\"headline\":\"Forgotten humans-in-the-loop. 
Labour-oriented critique of the current discussion of algorithmic content moderation\",\"datePublished\":\"2021-11-15T00:19:00+00:00\",\"dateModified\":\"2025-02-02T13:54:32+00:00\",\"mainEntityOfPage\":{\"@id\":\"https:\\\/\\\/sharefoundation.info\\\/en\\\/forgotten-humans-in-the-loop-labour-oriented-critique-of-the-current-discussion-of-algorithmic-content-moderation\\\/\"},\"wordCount\":2120,\"publisher\":{\"@id\":\"https:\\\/\\\/sharefoundation.info\\\/#organization\"},\"image\":{\"@id\":\"https:\\\/\\\/sharefoundation.info\\\/en\\\/forgotten-humans-in-the-loop-labour-oriented-critique-of-the-current-discussion-of-algorithmic-content-moderation\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/sharefoundation.info\\\/wp-content\\\/uploads\\\/2021\\\/11\\\/Op-ed-o-moderaciji_SHARE-OPINION-1200x550-Web-cover-small.png\",\"keywords\":[\"algorithms and AI\",\"freedom of expression\"],\"inLanguage\":\"en-US\"},{\"@type\":\"WebPage\",\"@id\":\"https:\\\/\\\/sharefoundation.info\\\/en\\\/forgotten-humans-in-the-loop-labour-oriented-critique-of-the-current-discussion-of-algorithmic-content-moderation\\\/\",\"url\":\"https:\\\/\\\/sharefoundation.info\\\/en\\\/forgotten-humans-in-the-loop-labour-oriented-critique-of-the-current-discussion-of-algorithmic-content-moderation\\\/\",\"name\":\"Forgotten humans-in-the-loop. 
Labour-oriented critique of the current discussion of algorithmic content moderation - SHARE Fondacija\",\"isPartOf\":{\"@id\":\"https:\\\/\\\/sharefoundation.info\\\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\\\/\\\/sharefoundation.info\\\/en\\\/forgotten-humans-in-the-loop-labour-oriented-critique-of-the-current-discussion-of-algorithmic-content-moderation\\\/#primaryimage\"},\"image\":{\"@id\":\"https:\\\/\\\/sharefoundation.info\\\/en\\\/forgotten-humans-in-the-loop-labour-oriented-critique-of-the-current-discussion-of-algorithmic-content-moderation\\\/#primaryimage\"},\"thumbnailUrl\":\"https:\\\/\\\/sharefoundation.info\\\/wp-content\\\/uploads\\\/2021\\\/11\\\/Op-ed-o-moderaciji_SHARE-OPINION-1200x550-Web-cover-small.png\",\"datePublished\":\"2021-11-15T00:19:00+00:00\",\"dateModified\":\"2025-02-02T13:54:32+00:00\",\"breadcrumb\":{\"@id\":\"https:\\\/\\\/sharefoundation.info\\\/en\\\/forgotten-humans-in-the-loop-labour-oriented-critique-of-the-current-discussion-of-algorithmic-content-moderation\\\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\\\/\\\/sharefoundation.info\\\/en\\\/forgotten-humans-in-the-loop-labour-oriented-critique-of-the-current-discussion-of-algorithmic-content-moderation\\\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/sharefoundation.info\\\/en\\\/forgotten-humans-in-the-loop-labour-oriented-critique-of-the-current-discussion-of-algorithmic-content-moderation\\\/#primaryimage\",\"url\":\"https:\\\/\\\/sharefoundation.info\\\/wp-content\\\/uploads\\\/2021\\\/11\\\/Op-ed-o-moderaciji_SHARE-OPINION-1200x550-Web-cover-small.png\",\"contentUrl\":\"https:\\\/\\\/sharefoundation.info\\\/wp-content\\\/uploads\\\/2021\\\/11\\\/Op-ed-o-moderaciji_SHARE-OPINION-1200x550-Web-cover-small.png\",\"width\":1200,\"height\":550},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\\\/\\\/sharefoundation.info\\\/en\\\/forgotten-humans-in-the-loop-labour-
oriented-critique-of-the-current-discussion-of-algorithmic-content-moderation\\\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\\\/\\\/sharefoundation.info\\\/en\\\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Forgotten humans-in-the-loop. Labour-oriented critique of the current discussion of algorithmic content moderation\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\\\/\\\/sharefoundation.info\\\/#website\",\"url\":\"https:\\\/\\\/sharefoundation.info\\\/\",\"name\":\"Share Foundation\",\"description\":\"Dru\u0161tvo, tehnologija, sajberspejs\",\"publisher\":{\"@id\":\"https:\\\/\\\/sharefoundation.info\\\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\\\/\\\/sharefoundation.info\\\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\\\/\\\/sharefoundation.info\\\/#organization\",\"name\":\"Share Foundation\",\"url\":\"https:\\\/\\\/sharefoundation.info\\\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\\\/\\\/sharefoundation.info\\\/#\\\/schema\\\/logo\\\/image\\\/\",\"url\":\"https:\\\/\\\/sharefoundation.info\\\/wp-content\\\/uploads\\\/2025\\\/02\\\/share-favicon.png\",\"contentUrl\":\"https:\\\/\\\/sharefoundation.info\\\/wp-content\\\/uploads\\\/2025\\\/02\\\/share-favicon.png\",\"width\":512,\"height\":512,\"caption\":\"Share Foundation\"},\"image\":{\"@id\":\"https:\\\/\\\/sharefoundation.info\\\/#\\\/schema\\\/logo\\\/image\\\/\"}},{\"@type\":\"Person\",\"@id\":\"https:\\\/\\\/sharefoundation.info\\\/#\\\/schema\\\/person\\\/855bab9a30b13d42de38ab5f43b68bfd\",\"name\":\"Aleksa 
Written by Aleksa Boric. Published 15 November 2021; last modified 2 February 2025.