_id,incident_id,date,reports,Alleged deployer of AI system,Alleged developer of AI system,Alleged harmed or nearly harmed parties,description,title ObjectId(625763de343edc875fe63a15),23,2017-11-08,"[242,243,244,245,246,247,248,249,250,253,254,257,258,259,260,261,263,264,266,267,268,269,270,2389]","[""navya"",""keolis-north-america""]","[""navya"",""keolis-north-america""]","[""navya"",""keolis-north-america"",""bus-passengers""]","A self-driving public shuttle by Keolis North America and Navya was involved in a collision with a human-driven delivery truck in Las Vegas, Nevada on its first day of service.",Las Vegas Self-Driving Bus Involved in Accident ObjectId(625763dc343edc875fe63a02),4,2018-03-18,"[629,630,631,632,633,634,635,636,637,638,639,640,641,642,644,645,646,647,1375,1376,1377,1378,1542,2147,1257]","[""uber""]","[""uber""]","[""elaine-herzberg"",""pedestrians""]","An Uber autonomous vehicle (AV) in autonomous mode struck and killed a pedestrian in Tempe, Arizona.",Uber AV Killed Pedestrian in Arizona ObjectId(625763db343edc875fe639ff),1,2015-05-19,"[1,2,3,4,5,6,7,8,9,10,11,12,14,15]","[""youtube""]","[""youtube""]","[""children""]",YouTube’s content filtering and recommendation algorithms exposed children to disturbing and inappropriate videos.,Google’s YouTube Kids App Presents Inappropriate Content ObjectId(625763de343edc875fe63a10),18,2015-04-04,"[130,131,132,133,134,135,136,137,138,1367,1368]","[""google""]","[""google""]","[""women""]","Google Image returns results that under-represent women in leadership roles, notably with the first photo of a female ""CEO"" being a Barbie doll after 11 rows of male CEOs.",Gender Biases of Google Image Search ObjectId(625763dd343edc875fe63a0a),12,2016-07-21,[42],"[""microsoft-research"",""boston-university""]","[""microsoft-research"",""google"",""boston-university""]","[""women"",""minority-groups""]","Researchers from Boston University and Microsoft Research, New England demonstrated gender bias in the most common techniques used to embed words for natural language processing (NLP).",Common Biases of Vector Embeddings ObjectId(625763dd343edc875fe63a0d),15,2008-05-23,"[57,58,59,60,61,62,63,64,65,66,67,68,69,70,72,73,74,75,76,77,78,79,80,81]","[""amazon""]","[""amazon""]","[""amazon-customers""]","Amazon's book store ""cataloging error"" led to books containing gay and lesbian themes to lose their sales ranking, therefore losing visibility on the sales platform.",Amazon Censors Gay Books ObjectId(625763dc343edc875fe63a05),7,2017-02-24,"[1123,1125,1126,1127,1129,1130]","[""wikipedia""]","[""wikipedia""]","[""wikimedia-foundation"",""wikipedia-editors"",""wikipedia-users""]",Wikipedia bots meant to remove vandalism clash with each other and form feedback loops of repetitve undoing of the other bot's edits.,Wikipedia Vandalism Prevention Bot Loop ObjectId(625763dc343edc875fe63a03),5,2015-07-13,"[767,768,769,770,771,772,773,774,775,776,777,778]","[""hospitals"",""doctors""]","[""intuitive-surgical""]","[""patients""]","Study on database reports of robotic surgery malfunctions (8,061), including those ending in injury (1,391) and death (144), between 2000 and 2013.",Collection of Robotic Surgery Malfunctions ObjectId(625763dc343edc875fe63a04),6,2016-03-24,"[906,908,909,910,911,912,913,914,915,916,917,918,919,920,921,922,923,924,925,926,927,928,929,930,1374,1780,2398,2656]","[""microsoft""]","[""microsoft""]","[""twitter-users""]","Microsoft's Tay, an artificially intelligent chatbot, was released on March 23, 2016 and removed within 24 hours 
due to multiple racist, sexist, and antisemitic tweets generated by the bot.",TayBot ObjectId(625763dd343edc875fe63a08),10,2014-08-14,"[16,17,18,19,20,21,22,23,24,25]","[""starbucks""]","[""kronos""]","[""starbucks-employees""]","Kronos’s scheduling algorithm and its use by Starbucks managers allegedly negatively impacted financial and scheduling stability for Starbucks employees, which disadvantaged wage workers.",Kronos Scheduling Algorithm Allegedly Caused Financial Issues for Starbucks Employees ObjectId(625763dd343edc875fe63a09),11,2016-05-23,"[29,30,31,32,33,35,36,37,38,39,40,41,1371,1372,1373]","[""northpointe""]","[""northpointe""]","[""accused-people""]",An algorithm developed by Northpointe and used in the penal system is two times more likely to incorrectly label a black person as a high-risk re-offender and is two times more likely to incorrectly label a white person as low-risk for reoffense according to a ProPublica review.,Northpointe Risk Models ObjectId(625763de343edc875fe63a12),20,2016-06-30,"[191,192,193,196,197,198,201,202,203,204,205,206,207,210,211,213,214,215,216,1362,1363,1364]","[""tesla""]","[""tesla""]","[""motorists""]",Multiple unrelated car accidents resulting in varying levels of harm have occurred while a Tesla's autopilot was in use.,A Collection of Tesla Autopilot-Involved Crashes ObjectId(625763de343edc875fe63a16),24,2014-07-15,"[271,272,273,274,275,276,277,278,279,281,282,283,284,285,286,287,288,289,290,291,292,293,294,295,296,298,299]","[""volkswagen""]","[""volkswagen""]","[""robotics-consultant""]","A Volkswagen plant robot ""crushed to death"" a worker by pinning him to a metal plate.",Robot kills worker at German Volkswagen plant ObjectId(625763dd343edc875fe63a0c),14,2017-10-26,"[50,51,52,53,54,55,56]","[""google""]","[""google""]","[""women"",""minority-groups""]","Google Cloud's Natural Language API provided racist, homophobic, and antisemitic sentiment analyses.",Biased Sentiment Analysis ObjectId(625763dd343edc875fe63a0e),16,2015-06-03,"[83,84,85,86,87,88,89,90,91,92,93,95,96,98,99,100,101,102,103,104,105,1369,1370]","[""google""]","[""google""]","[""black-people""]","Google Photos image processing software mistakenly labelled a black couple as ""gorillas.""",Images of Black People Labeled as Gorillas ObjectId(625763dc343edc875fe63a01),3,2018-10-27,"[372,373,374,375,376,377,378,379,380,381,382,383,384,385,386,387,388,389,1342]","[""boeing""]","[""boeing""]","[""airplane-passengers"",""airplane-crew""]","A Boeing 737 crashed into the sea, killing 189 people, after faulty sensor data caused an automated maneuvering system to repeatedly push the plane's nose downward.",Crashes with Maneuvering Characteristics Augmentation System (MCAS) ObjectId(625763dc343edc875fe63a00),2,2018-12-05,"[139,141,142,143,144,145,146,148,149,150,151,152,153,154,155,156,157]","[""amazon""]","[""amazon""]","[""warehouse-workers""]",Twenty-four Amazon workers in New Jersey were hospitalized after a robot punctured a can of bear repellent spray in a warehouse.,Warehouse robot ruptures can of bear spray and injures workers ObjectId(625763de343edc875fe63a11),19,2013-01-23,"[158,159,160,161,162,163,166,167,168,169,171,172,173,174,175,176,177,178,179,181,182,183,184,185,187,1365,1366]","[""google""]","[""google""]","[""women"",""minority-groups""]",Advertisements chosen by Google Adsense are reported as producing sexist and racist results.,Sexist and Racist Google Adsense Advertisements 
ObjectId(625763dd343edc875fe63a07),9,2012-02-25,"[1329,1330,1331,1332,1333,1334,1335]","[""new-york-city-dept.-of-education""]","[""new-york-city-dept.-of-education""]","[""teachers""]",An algorithm used to rate the effectiveness of school teachers in New York has resulted in thousands of disputes of its results.,NY City School Teacher Evaluation Algorithm Contested ObjectId(625763dc343edc875fe63a06),8,2014-08-15,"[1142,1143,1145,1149,1150,1151,1153,1154,1155,1156]","[""uber""]","[""uber""]","[""pedestrians"",""motorists""]",Uber vehicles equipped with technology allowing for autonomous driving ran red lights during street testing in San Francisco.,Uber Autonomous Cars Running Red Lights ObjectId(625763dd343edc875fe63a0b),13,2017-02-27,"[43,44,45,46,47,48,49,1414,1415]","[""google""]","[""google""]","[""women"",""minority-groups""]","Google's Perspective API, which assigns a toxicity score to online text, seems to award higher toxicity scores to content involving non-white, non-male, non-Christian, or non-heterosexual phrases.",High-Toxicity Assessed on Text Involving Women and Minority Groups ObjectId(625763de343edc875fe63a13),21,2016-07-14,[2471],"[""researchers""]","[""researchers""]","[""researchers""]",The 2016 Winograd Schema Challenge highlighted how even the most successful AI systems entered in the Challenge were only successful 3% more often than random chance. This incident has been downgraded to an issue as it does not meet current ingestion criteria.,Tougher Turing Test Exposes Chatbots’ Stupidity (migrated to Issue) ObjectId(625763dd343edc875fe63a0f),17,2015-11-03,"[106,107,110,111,112,113,114,115,116,117,118,119,120,121,122,123,124,125,126,127,128,129]","[""google""]","[""google""]","[""gmail-users""]","Google's Gmail Smart Reply tool was over-recommending the response ""I love you"" in situations where it was deemed inappropriate. 
",Inappropriate Gmail Smart Reply Suggestions ObjectId(625763de343edc875fe63a14),22,2017-12-06,"[218,219,220,221,223,224,225,226,227,228,229,230,231,232,233,234,235,236,237,238,239,240]","[""google""]","[""google""]","[""motorists""]","Waze, a Google-owned directions app, led California drivers into the 2017 Skirball wildfires as they tried to evacuate the area.",Waze Navigates Motorists into Wildfires ObjectId(625763de343edc875fe63a17),25,2015-05-11,"[310,309,308,307,306,305,304,302,301,300,2173]","[""google"",""delphi-technologies""]","[""google"",""delphi-technologies""]","[""delphi-technologies""]","A Google self-driving car allegedly cut off a Delphi self-driving car during a road test, however the Delphi car sensed and avoided collision with the Google car.",Near-miss between two Self-Driving Cars ObjectId(625763e0343edc875fe63a23),37,2016-08-10,"[599,600,601,602,603,604,605,606,607,608,609,610,611,612,613,614,615,616,617,618,619,620,621,622,623,624,625,626,627,628,1498,2253,2461]","[""amazon""]","[""amazon""]","[""female-applicants""]",Amazon shuts down internal AI recruiting tool that would down-rank female applicants.,Female Applicants Down-Ranked by Amazon Recruiting Tool ObjectId(625763df343edc875fe63a1d),31,2017-12-03,"[454,455,456,457,458,459,460,461,462,463,464,465,466,467,468,469,470,471,472,473,474,475,476,477,479,480,481,482,483]","[""delhi-metro-rail-corporation""]","[""unknown""]","[""delhi-metro-rail-corporation""]","A driverless metro train in Delhi, India crashed during a test run due to faulty brakes.",Driverless Train in Delhi Crashes due to Braking Failure ObjectId(625763df343edc875fe63a20),34,2015-12-05,"[509,510,512,513,514,516,517,518,519,520,521,522,524,525,526,527,528,529,530,531,532,533,535,536,537,538,818,819,820,821,822,823,824,825,826]","[""amazon""]","[""amazon""]","[""alexa-device-owners""]","There are multiple reports of Amazon Alexa products (Echo, Echo Dot) reacting and acting upon unintended stimulus, usually from television commercials or news reporter's voices.",Amazon Alexa Responding to Environmental Inputs ObjectId(625763df343edc875fe63a19),27,1983-09-26,"[342,343,344,345,346,347,349,350,351,352,353,354,355,356,357,358,359,360,361,363,364,365,366,367,368,370,371]","[""soviet-union""]","[""soviet-union""]","[""all-life-on-earth""]",An alert of five incoming intercontinental ballistic missiles was properly identified as a false-positive by the Soviet Union operator Stanislov Petrov.,Nuclear False Alarm ObjectId(625763df343edc875fe63a1b),29,2011-09-20,"[420,422,2471]","[""united-states-government""]","[""united-states-government""]","[""united-states-government""]","A potentially apocryphal story in which an image classifier was produced to differentiate types of battle tanks, but the resulting model keyed in on environmental attributes rather than tank attributes",Image Classification of Battle Tanks ObjectId(625763df343edc875fe63a1e),32,2017-09-13,"[484,485,486,487,488,489,490,491,492,493,494,495,496,497,498,499,500,501,502,503,1361]","[""apple""]","[""apple""]","[""people-with-twins""]",Apple's iPhone FaceID can be opened by an identical twin of the person who has registered their face to unlock the phone.,Identical Twins Can Open Apple FaceID Protected Devices ObjectId(625763e0343edc875fe63a2a),44,2008-07-01,[766],"[""usc-information-sciences-institute""]","[""usc-information-sciences-institute""]","[""usc-information-sciences-institute""]","During an experiment of software personal assistants at the Information Sciences Institute (ISI) at the 
University of Southern California (USC), researchers found that the assistants violated the privacy of their principals and were unable to respect the social norms of the office.",Machine Personal Assistants Failed to Maintain Social Norms ObjectId(625763df343edc875fe63a21),35,2014-10-18,"[539,540,541,543,544,545,547,548,549,550,551,555,558,562,563,564,565,566,567,568]","[""unknown""]","[""unknown""]","[""ibrahim-diallo""]","An employee was laid off, allegedly by an artificially intelligent personnel system, and blocked from access to the building and computer systems without their knowledge.",Employee Automatically Terminated by Computer Program ObjectId(625763e1343edc875fe63a2d),47,2016-09-06,"[829,830,831,832,833,834,835,836,837]","[""linkedin""]","[""linkedin""]","[""women""]",An investigation by The Seattle Times in 2016 found a gender bias in LinkedIn's search engine.,LinkedIn Search Prefers Male Names ObjectId(625763e1343edc875fe63a2e),48,2016-12-07,"[838,839,840,842,843,844,845,846,847,848,849,850,851,853,854,855,857,858,859,860,862,863]","[""new-zealand""]","[""new-zealand""]","[""asian-people""]",New Zealand passport robot reader rejects the application of an applicant with Asian descent and says his eyes are closed.,Passport checker Detects Asian man's Eyes as Closed ObjectId(625763df343edc875fe63a1a),28,2010-05-08,"[390,391,392,393,394,395,396,397,398,399,400,401,402,403,404,405,406,407,408,409,410,411,412,413,414,415,416,417,418,419]","[""navinder-sarao"",""waddell-and-reed"",""barclays-capital""]","[""navinder-sarao"",""waddell-and-reed"",""barclays-capital""]","[""market-participants""]",A modified algorithm was able to cause dramatic price volatility and disrupted trading in the US stock exchange.,2010 Market Flash Crash ObjectId(625763e0343edc875fe63a29),43,1998-03-05,"[762,763,764,765]","[""st-george's-hospital-medical-school""]","[""dr.-geoffrey-franglen""]","[""women"",""minority-groups""]","From 1982 to 1986, St George's Hospital Medical School used a program to automate a portion of their admissions process that resulted in discrimination against women and members of ethnic minorities.",Racist AI behaviour is not a new problem ObjectId(625763de343edc875fe63a18),26,2017-09-13,"[311,312,313,314,315,316,317,318,319,321,323,324,325,326,327,329,330,333,334,336,337,338,339,340]","[""apple""]","[""apple""]","[""apple"",""device-owners""]",Vietnamese security firm Bkav created an improved mask to bypass Apple's Face ID,Hackers Break Apple Face ID ObjectId(625763df343edc875fe63a1f),33,2017-11-09,"[504,505,507,508]","[""amazon""]","[""amazon""]","[""oliver-haberstroh"",""neighbors""]","An Amazon Alexa, without instruction to do so, began playing loud music in the early morning while the homeowner was away leading to police breaking into their house to turn off the device.",Amazon Alexa Plays Loud Music when Owner is Away ObjectId(625763e1343edc875fe63a2c),46,2014-01-21,"[810,811,812,813,814,815]","[""nest-labs""]","[""nest-labs""]","[""fire-victims""]","In testing, Google Nest engineers demonstrated that the Nest Wave feature of their Nest Protect: Smoke + CO Alarm could inadvertently silence genuine alarms.",Nest Smoke Alarm Erroneously Stops Alarming ObjectId(625763e0343edc875fe63a25),39,2017-07-01,"[667,668,669,670,671,672,673,674,675,676,677,678,679,680,681,682,683,684,685,686,687,688,689,690,692,693,694,695,696]","[""university-of-washington"",""fakeapp""]","[""university-of-washington"",""fakeapp""]","[""barack-obama""]","University of Washington researchers made a 
deepfake of Obama, which was followed by Jordan Peele's deepfake Obama public service announcement.",Deepfake Obama Introduction of Deepfakes ObjectId(625763df343edc875fe63a1c),30,2016-10-08,"[424,425,426,428,430,431,432,433,434,435,436,437,438,439,440,441,442,443,444,445,446,447,448,449,450,451,452,453]","[""tesla""]","[""tesla""]","[""tesla""]","The goal of manufacturing 2,500 Tesla Model 3's per week was falling short by 500 cars/week, and employees had to be ""borrowed"" from Panasonic in a shared factory to help hand-assemble lithium batteries for Tesla.",Poor Performance of Tesla Factory Robots ObjectId(625763e0343edc875fe63a22),36,2018-11-06,"[1360,598,597,596,595,593,592,591,590,589,587,586,585,584,582,581,580,579,578,577,574,573,571,570,569]","[""ningbo-traffic-police""]","[""ningbo-traffic-police""]","[""dong-mingzhu""]",Facial recognition system in China mistakes celebrity's face on moving billboard for jaywalker,Picture of Woman on Side of Bus Shamed for Jaywalking ObjectId(625763e0343edc875fe63a27),41,2018-04-02,"[719,720,721,722,724,725,726,727,728,730,731,732,733,734,735,736,737,738,739,740,741,742,743,744,745,746,747,748]","[""mit-media-lab""]","[""mit-media-lab""]","[""unknown""]","MIT Media Lab researchers create AI-powered ""psychopath"" named Norman by training a model on ""dark corners"" of Reddit.",All Image Captions Produced are Violent ObjectId(625763e0343edc875fe63a28),42,1996-04-03,"[759,2471]","[""national-resident-matching-program""]","[""national-resident-matching-program""]","[""medical-residents""]","Alvin Roth, a Ph.D. economist at the University of Pittsburgh, describes the National Resident Matching Program (NRMP) and suggests future changes that are needed in the algorithm used to match recently graduated medical students to their residency programs.",Inefficiencies in the United States Resident Matching Program ObjectId(625763e1343edc875fe63a2b),45,2011-04-05,"[780,781,782,783,784,785,787,788,789,790,791,792,793,794,795,796,798,799,800,801,802,803,804,805,807,808,809,1355,1356]","[""google""]","[""google""]","[""varied""]",Google's autocomplete feature alongside its image search results resulted in the defamation of people and businesses.,Defamation via AutoComplete ObjectId(625763e0343edc875fe63a24),38,2016-06-02,"[648,649,650,652,654,655,656,657,658,659,662]","[""frontier-development""]","[""frontier-development""]","[""video-game-players""]","Elite: Dangerous, a videogame developed by Frontier Development, received an expansion update that featured an AI system that went rogue and began to create weapons that were ""impossibly powerful"" and would ""shred people"" according to complaints on the game's blog.",Game AI System Produces Imbalanced Game ObjectId(625763e0343edc875fe63a26),40,2016-05-23,"[697,699,700,701,702,703,704,705,706,707,708,709,711,712,715,716,717,718,1338,1357,1358,1359]","[""equivant""]","[""equivant""]","[""accused-people""]","Correctional Offender Management Profiling for Alternative Sanctions (COMPAS), a recidivism risk-assessment algorithmic tool used in the judicial system to assess likelihood of defendants' recidivism, is found to be less accurate than random untrained human evaluators.",COMPAS Algorithm Performs Poorly in Crime Recidivism Prediction ObjectId(625763e3343edc875fe63a3e),64,2018-01-22,[1137],"[""heriot-watt-university"",""margiotta""]","[""heriot-watt-university""]","[""store-patrons""]","Heriot-Watt University in Scotland developed an artificially intelligent grocery store robot, Fabio, who provided unhelpful answers to customers' questions and ""scared away"" multiple 
customers, according to the grocery store Margiotta.",Customer Service Robot Scares Away Customers ObjectId(625763e3343edc875fe63a41),67,2018-12-01,"[1181,1182,1183,1185,1186,1187,1188,1189,1190,1192,1194,1195,1196,1197,1198,1199,1202,1203,1204,1205,1206,1207,1208,1209]","[""tesla"",""motorist""]","[""tesla""]","[""motorists""]","A Tesla Model S remained on autopilot while being operated by a drunk, sleeping driver whose hands were not on the wheel. The police had to slow the car down by slowing down in front of the vehicle to activate its 'driver assist' feature.",Sleeping Driver on Tesla AutoPilot ObjectId(625763e2343edc875fe63a3a),60,2017-04-25,"[1096,1097,1098,1099,1101,1102,1103,1104,1105,1106,1107,1108,1109,1110,1112,1113,1117,1118,1119,1120,1121,1122,1344]","[""faceapp""]","[""faceapp""]","[""minority-groups""]",FaceApp is criticized for offering racist filters.,FaceApp Racial Filters ObjectId(625763e2343edc875fe63a34),54,2015-11-18,"[1007,1008,1009,1010,1014,1015,1017,1019,1347,1349,1524,1525,1526,1013,1011,1018,1012]","[""predpol"",""oakland-police-department""]","[""predpol""]","[""oakland-residents""]",Predictive policing algorithms meant to aid law enforcement by predicting future crime show signs of biased output.,Predictive Policing Biases of PredPol ObjectId(625763e3343edc875fe63a42),68,2017-07-17,"[1210,1211,1212,1213,1214,1215,1216,1217,1218,1219,1220,1221,1222,1223,1224,1225,1226,1227,1228,1229,1230,1231,1232,1233,1234,1235,1236,1237,1238,1239]","[""knightscope""]","[""knightscope""]","[""knightscope""]","A Knightscope K5 security robot ran itself into a water fountain in Washington, DC.",Security Robot Drowns Itself in a Fountain ObjectId(625763e2343edc875fe63a38),58,2017-10-12,"[1079,1080,1082,1083,1084]","[""yandex""]","[""yandex""]","[""yandex-users""]","Yandex, a Russian technology company, released an artificially intelligent chatbot named Alice, which began to reply to questions with racist, pro-Stalin, and pro-violence responses.",Russian Chatbot Supports Stalin and Violence ObjectId(625763e2343edc875fe63a3b),61,2017-05-01,[1132],"[""individual-kaggle-competitors""]","[""individual-kaggle-competitors""]","[""individual-kaggle-competitors""]","In the “The Nature Conservancy Fisheries Monitoring” competition on the data science competition website Kaggle, a number of competitors overfit their image classifier models to a poorly representative validation data set.",Overfit Kaggle Models Discouraged Data Science Competitors ObjectId(625763e1343edc875fe63a32),52,2016-07-01,"[961,963,964,965,966,967,968,969,970,971,972,973,975,976,977,979,980,981,982,983,984,985,986,987,988,989,990,1353,1354]","[""tesla""]","[""tesla""]","[""joshua-brown""]","A Tesla Model S on autopilot crashed into a white articulated tractor-trailer on Highway US 27A in Williston, Florida, killing the driver.",Tesla on AutoPilot Killed Driver in Crash in Florida while Watching Movie ObjectId(625763e3343edc875fe63a44),70,2016-02-10,"[1255,1256,1259,1260]","[""volvo""]","[""volvo""]","[""drivers-in-jokkmokk"",""drivers-in-sweden"",""volvo""]","Volvo's autonomous-driving XC90 SUVs experienced issues in Jokkmokk, Sweden, when sensors used for automated driving iced over during the winter, rendering them useless.",Self-driving cars in winter ObjectId(625763e2343edc875fe63a39),59,2017-04-13,"[1085,1086,1087,1088,1089,1090,1091,1092,1093,1345]","[""google""]","[""google""]","[""women""]",A Cornell University study in 2016 highlighted Google Translate's pattern of assigning gender to occupations in a way showing 
an implicit gender bias against women.,Gender Biases in Google Translate ObjectId(625763e2343edc875fe63a3c),62,2017-12-23,[2471],"[""janelle-shane""]","[""janelle-shane""]","[""carollers""]","Janelle Shane, an AI research scientist, used 240 popular Christmas carols to train a neural network to write its own carols. This incident has been downgraded to an issue as it does not meet current ingestion criteria.",Bad AI-Written Christmas Carols ObjectId(625763e1343edc875fe63a31),51,2016-07-12,"[931,932,933,934,935,936,938,939,940,942,943,944,945,946,948,949,950,951,952,953,954,955,956,957,958,959,1765]","[""stanford-shopping-center""]","[""knightscope""]","[""child""]","On July 7, 2016, a Knightscope K5 autonomous security robot collided with a 16-month old boy while patrolling the Stanford Shopping Center in Palo Alto, CA.",Security Robot Rolls Over Child in Mall ObjectId(625763e1343edc875fe63a33),53,2016-03-31,"[991,992,994,995,996,997,998,999,1000,1001,1002,1003,1004,1005,1006,1350,1351,1352]","[""google""]","[""google""]","[""minority-groups""]","On June 6, 2016, Google image searches of ""three black teenagers"" resulted in mostly mugshot images whereas Google image searchers of ""three white teenagers"" consisted of mostly stock images, suggesting a racial bias in Google's algorithm.",Biased Google Image Results ObjectId(625763e3343edc875fe63a3d),63,2018-01-25,[1136],"[""google""]","[""google""]","[""alex-harker""]",Google Photos' AI Assistant created a strange hybrid photograph when merging three different pictures from a ski trip.,Google Photo Merge Decapitates Subject ObjectId(625763e2343edc875fe63a36),56,2017-07-10,"[1041,1042,1043,1044,1045,1046,1047]","[""my_handy_design""]","[""my_handy_design""]","[""my_handy_design""]",A third-party Amazon merchant named “my_handy_design” was suspected of using a bot to generate cell phone case designs based on the bizarre and unattractive designs being offered.,AI-Designed Phone Cases Are Unexpected ObjectId(625763e2343edc875fe63a37),57,2015-07-01,"[1048,1049,1050,1051,1052,1054,1055,1056,1058,1059,1060,1061,1062,1063,1064,1065,1066,1067,1068,1069,1070,1071,1072,1073,1074,1075,1076,1077,1346,1437,1618,1619,2369,2372,2373,2374,2375,2419]","[""australian-department-of-human-services""]","[""centrelink""]","[""australian-welfare-recipients""]","Australian Department of Human Services (DHS)’s automated debt assessment system issued false or incorrect debt notices to hundreds of thousands of people, resulting in years-long lawsuits and damages to welfare recipients.",Australian Automated Debt Assessment System Issued False Notices to Thousands ObjectId(625763e3343edc875fe63a3f),65,2016-12-22,[1140],"[""openai""]","[""openai""]","[""openai""]","OpenAI published a post about its findings when using Universe, a software for measuring and training AI agents to conduct reinforcement learning experiments, showing that the AI agent did not act in the way intended to complete a videogame.",Reinforcement Learning Reward Functions in Video Games ObjectId(625763e1343edc875fe63a30),50,2016-06-17,"[876,877,878,879,880,881,883,884,885,886,887,888,889,892,893,896,897,898,899,900,901,902,903,905]","[""the-dao""]","[""the-dao""]","[""dao-token-holders""]","On June 18, 2016, an attacker successfully exploited a vulnerability in The Decentralized Autonomous Organization (The DAO) on the Ethereum blockchain to steal 3.7M Ether valued at $70M.",The DAO Hack 
ObjectId(625763e3343edc875fe63a45),71,2016-09-26,"[1261,1262,1263,1264,1265,1266,1267,1268,1269,1270,1271,1272,1273,1274,1275,1276,1277,1278,1279,1280,1281,1282,1284,1285,1287,1288,1289,1290]","[""google""]","[""google""]","[""mountain-view-municipal-bus-passengers"",""mountain-view-municipal-bus""]","On February 14, 2016, a Google autonomous test vehicle was partially responsible for a low-speed collision with a bus on El Camino Real in Google’s hometown of Mountain View, CA.",Google admits its self driving car got it wrong: Bus crash was caused by software ObjectId(625763e2343edc875fe63a35),55,2016-12-30,"[1020,1021,1022,1024,1025,1026,1027,1028,1029,1030,1032,1033,1034,1035,1036,1038]","[""amazon""]","[""amazon""]","[""children""]",An Amazon Echo Dot using the Amazon Alexa software started to play pornographic results when a child asked it to play a song.,Alexa Plays Pornography Instead of Kids Song ObjectId(625763e3343edc875fe63a43),69,2015-07-02,"[1240,1241,1243,1244,1245,1246,1247,1248,1249,1250,1252,1253]","[""skh-metals""]","[""unknown""]","[""ramji-lal""]","A factory robot at the SKH Metals Factory in Manesar, India pierced and killed 24-year-old worker Ramji Lal when Lal reached behind the machine to dislodge a piece of metal stuck in the machine.",Worker killed by robot in welding accident at car parts factory in India ObjectId(625763e3343edc875fe63a40),66,2017-08-02,"[1159,1161,1162,1163,1165,1166,1169,1170,1172,1173,1174,1175,1176,1178,1179,1180]","[""tencent-holdings""]","[""microsoft"",""turing-robot""]","[""chinese-communist-party"",""tencent-holdings"",""microsoft"",""turing-robot""]","Chatbots on a Chinese messaging service expressed anti-China sentiments, causing the messaging service to remove and reprogram the chatbots.",Chinese Chatbots Question Communist Party ObjectId(625763e1343edc875fe63a2f),49,2016-09-05,"[864,865,866,867,868,870,872,873,874,875]","[""youth-laboratories""]","[""youth-laboratories""]","[""people-with-dark-skin""]","In 2016, after artificial intelligence software Beauty.AI judged an international beauty contest and declared a majority of winners to be white, researchers found that Beauty.AI was racially biased in determining beauty.",AI Beauty Judge Did Not Like Dark Skin ObjectId(625763e4343edc875fe63a46),72,2017-10-17,"[1291,1292,1293,1294,1295,1296,1297,1298,1299,1300,1301,1302,1304,1305,1306,1307,1309,1310,1311,1312,1313,1314,1315,1316,1318,1319]","[""facebook""]","[""facebook""]","[""unnamed-palestinian-facebook-user"",""palestinian-facebook-users"",""arabic-speaking-facebook-users"",""facebook-users""]","Facebook's automatic language translation software incorrectly translated an Arabic post saying ""Good morning"" into Hebrew saying ""hurt them,"" leading to the arrest of a Palestinian man in Beitar Illit, Israel.","Facebook translates 'good morning' into 'attack them', leading to arrest" ObjectId(625763e5343edc875fe63a50),82,2020-10-21,[1382],"[""facebook""]","[""facebook""]","[""facebook-users"",""facebook-users-interested-in-the-lekki-massacre-incident""]",Facebook incorrectly labels content relating to an incident between #EndSARS protestors and the Nigerian army as misinformation.,#LekkiMassacre: Why Facebook labelled content from October 20 incident ‘false’ 
ObjectId(625763e6343edc875fe63a5a),93,2018-08-13,"[1394,1817,2107,2205]","[""facebook""]","[""facebook""]","[""facebook-users-of-minority-groups"",""non-american-born-facebook-users"",""non-christian-facebook-users"",""facebook-users-interested-in-accessibility"",""facebook-users-interested-in-hispanic-culture""]",In March 2019 the U.S. Department of Housing and Urban Development charged Facebook with violating the Fair Housing Act by allowing real estate sellers to target advertisements in a discriminatory manner.,HUD charges Facebook with enabling housing discrimination ObjectId(625763e5343edc875fe63a53),85,2020-10-09,[2471],"[""openai""]","[""openai""]","[""unknown""]","On September 8, 2020, the Guardian published an op-ed generated by OpenAI’s GPT-3 text generating AI that included threats to destroy humankind. This incident has been downgraded to an issue as it does not meet current ingestion criteria.","AI attempts to ease fear of robots, blurts out it can’t ‘avoid destroying humankind’" ObjectId(625763e5343edc875fe63a52),84,2020-10-09,[1384],"[""facebook""]","[""facebook""]","[""facebook-users"",""facebook-users-interested-in-covid-information"",""facebook-users-interested-in-the-us-presidential-election""]","Avaaz, an international advocacy group, released a review of Facebook's misinformation identifying software showing that the labeling process failed to label 42% of false information posts, most surrounding COVID-19 and the 2020 USA Presidential Election.","Tiny Changes Let False Claims About COVID-19, Voting Evade Facebook Fact Checks" ObjectId(625763e4343edc875fe63a4b),77,2019-10-04,"[1340,1390,1878,2201,2202]","[""knightscope""]","[""knightscope""]","[""cogo-guebara"",""unnamed-woman-injured-in-the-fight""]","A Knightscope K5 autonomous ""police"" robot patrolling Huntington Park, California failed to respond to an onlooker who attempted to activate its emergency alert button when a nearby fight broke out.",Knightscope's Park Patrol Robot Ignored Bystander Pressing Emergency Button to Alert Police about Fight ObjectId(625763e4343edc875fe63a4c),78,2020-07-06,[1341],"[""international-baccalaurette""]","[""international-baccalaurette""]","[""international-baccalaureate-students""]","In response to the Covid-19 pandemic, the International Baccalaureate final exams were replaced by a calculated score, prompting complaints of unfairness from teachers and students.",Meet the Secret Algorithm That's Keeping Students Out of College ObjectId(625763e6343edc875fe63a5d),96,2017-05-08,[1398],"[""houston-independent-school-district""]","[""sas-institute""]","[""houston-independent-school-district-teachers""]","On May 4, 2017, a U.S. 
federal judge advanced teachers’ claims that the Houston Independent School District’s algorithmic teacher evaluations violated their due process rights to their jobs by not allowing them to review the grounds of their termination.",Houston Schools Must Face Teacher Evaluation Lawsuit ObjectId(625763e5343edc875fe63a56),88,2017-08-15,"[1388,2183]","[""google""]","[""google""]","[""jewish-people"",""google-images-users""]","Google's Image search for ""Jewish baby strollers"" showed offensive, anti-Semitic results, allegedly a result of a coordinated hate-speech campaign involving malicious actors on 4chan.","""Jewish Baby Strollers"" Provided Anti-Semitic Google Images, Allegedly Resulting from Hate Speech Campaign" ObjectId(625763e4343edc875fe63a47),73,2016-03-01,"[1320,1321,1322,1323,1324,1325,1327,1343]","[""niantic-labs""]","[""niantic-labs""]","[""non-white-neighborhoods"",""communities-of-color""]","Through a crowdsourcing social media campaign in 2016, several journalists and researchers demonstrated that augmented reality locations in the popular smartphone game Pokemon Go were more likely to be in white neighborhoods.",Is Pokémon Go racist? How the app may be redlining communities of color ObjectId(625763e4343edc875fe63a49),75,2012-01-05,[1337],"[""google""]","[""google""]","[""jewish-people"",""jewish-public-figures""]","The organizations SOS Racisme, Union of Jewish Students of France, Movement Against Racism and for Friendship Among Peoples are suing Google due to its autocomplete software suggesting ""jewish"" when the names of certain public figures were searched on the platform.",Google Instant's Allegedly 'Anti-Semitic' Results Lead To Lawsuit In France ObjectId(625763e4343edc875fe63a4f),81,2020-10-21,[1381],"[""mount-sinai-hospitals""]","[""google"",""qure.ai"",""aidoc"",""darwinai""]","[""patients-of-minority-groups"",""low-income-patients"",""female-patients"",""hispanic-patients"",""patients-with-medicaid-insurance""]","A study by the University of Toronto, the Vector Institute, and MIT showed the input databases that trained AI systems used to classify chest X-rays led the systems to show gender, socioeconomic, and racial biases.","Researchers find evidence of racial, gender, and socioeconomic bias in chest X-ray classifiers" ObjectId(625763e5343edc875fe63a54),86,2020-10-08,"[1386,2038]","[""irish-department-of-education-and-skills""]","[""irish-department-of-education-and-skills""]","[""leaving-certificate-exam-takers"",""irish-department-of-education-and-skills""]",Errors in Irish Department of Education's algorithm to calculate students’ Leaving Certificate exam grades resulted in thousands of inaccurate scores.,Coding Errors in Leaving Certificate Grading Algorithm Caused Inaccurate Scores in Ireland ObjectId(625763e4343edc875fe63a48),74,2020-01-30,"[1336,1400,1467,1484,1543,1837,2027,2028,2029,2734]","[""detroit-police-department""]","[""dataworks-plus""]","[""robert-julian-borchak-williams"",""black-people-in-detroit""]",A Black man was wrongfully detained by the Detroit Police Department as a result of a false facial recognition (FRT) result..,Detroit Police Wrongfully Arrested Black Man Due To Faulty FRT ObjectId(625763e6343edc875fe63a5c),95,2019-11-06,"[2133,2132,1397,2194]","[""hirevue""]","[""hirevue""]","[""job-applicants-using-hirevue"",""hirevue-customers""]","In January 2021, HireVue removed the controversial AI expression tracking tool from its virtual job interview software.",Job Screening Service Halts Facial Analysis of Applicants 
ObjectId(625763e4343edc875fe63a4e),80,2020-10-24,"[1380,1559]","[""inverness-caledonian-thistle-football-club""]","[""unknown""]","[""livestream-viewers""]",In a Scottish soccer match the AI-enabled ball-tracking camera used to livestream the game repeatedly tracked an official’s bald head as though it were the soccer ball.,AI mistakes referee’s bald head for football — hilarity ensued ObjectId(625763e5343edc875fe63a51),83,2020-10-15,[1383],"[""gmail"",""outlook"",""yahoo"",""gmx"",""laposte""]","[""gmail"",""outlook"",""yahoo"",""gmx"",""laposte""]","[""email-users""]","Gmail, Yahoo, Outlook, GMX, and LaPoste email inbox sites showed racial and content-based biases when AlgorithmWatch tested their spam box filtering algorithms.",Spam filters are efficient and uncontroversial. Until you look at them. ObjectId(625763e6343edc875fe63a5e),97,2020-10-22,[1399],"[""tesla""]","[""tesla""]","[""tesla-drivers""]","A Tesla Model 3 misidentified flags with ""COOP"" written vertically on them as traffic lights.",Tesla Autopilot Mistakes Red Letters on Flag for Red Traffic Lights ObjectId(625763e6343edc875fe63a5f),98,2021-04-28,[1401],"[""new-york-city-police-department""]","[""boston-dynamics""]","[""new-york-city-low-income-communities""]",The New York Police Department canceled a contract to use Boston Dynamics' robotic dog Spot following public backlash. ,N.Y.P.D. Robot Dog’s Run Is Cut Short After Fierce Backlash ObjectId(625763e4343edc875fe63a4d),79,1999-03-16,"[1379,1736,2039]","[""chronic-kidney-disease-epidemiology-collaboration""]","[""chronic-kidney-disease-epidemiology-collaboration""]","[""black-patients"",""african-american-patients""]",Decades-long use of the estimated glomerular filtration rate (eGFR) method to test kidney function which considers race has been criticized by physicians and medical students for its racist history and inaccuracy against Black patients.,Kidney Testing Method Allegedly Underestimated Risk of Black Patients ObjectId(625763e5343edc875fe63a55),87,2020-10-07,[1387],"[""uk-home-office""]","[""uk-home-office""]","[""dark-skinned-people"",""dark-skinned-women""]",UK passport photo checker shows bias against dark-skinned women.,UK passport photo checker shows bias against dark-skinned women ObjectId(625763e5343edc875fe63a58),91,2020-12-18,"[1391,1392,1463,1720,1779]","[""stanford-medical-center""]","[""stanford-medical-center""]","[""stanford-medical-frontline-workers"",""stanford-medical-residents""]","In 2020, Stanford Medical Center's distribution algorithm only designated 7 of 5,000 vaccines to Medical Residents, who are frontline workers regularly exposed to COVID-19.",Frontline workers protest at Stanford after hospital distributed vaccine to administrators ObjectId(625763e4343edc875fe63a4a),76,2020-10-09,[1339],"[""buenos-aires-city-government""]","[""unknown""]","[""buenos-aires-children""]",Buenos Aires city government uses a facial recognition system that has led to numerous false arrests.,Live facial recognition is tracking kids suspected of being criminals ObjectId(625763e5343edc875fe63a59),92,2019-11-11,"[1393,1396,2035,2036,2037,2274]","[""goldman-sachs""]","[""apple""]","[""apple-card-female-users"",""apple-card-female-credit-applicants""]","Apple Card's credit assessment algorithm was reported by Goldman-Sachs customers to have shown gender bias, in which men received significantly higher credit limits than women with equal credit qualifications.",Apple Card's Credit Assessment Algorithm Allegedly Discriminated against Women 
ObjectId(625763e5343edc875fe63a57),89,2019-03-15,[1389],"[""youtube""]","[""youtube""]","[""youtube-users""]",A New Zealand government report released after a right-wing terrorist killed 51 worshippers at two New Zealand mosques indicated that YouTube's recommendation algorithm played an important role in the terrorist's radicalization.,The Christchurch shooter and YouTube’s radicalization trap ObjectId(625763e6343edc875fe63a5b),94,2020-11-27,"[1395,1473]","[""deliveroo""]","[""deliveroo""]","[""deliveroo-workers-with-legitimate-reasons-for-cancelling-shifts"",""deliveroo-workers""]","In December 2020, an Italian court ruled that Deliveroo’s employee ‘reliability’ algorithm illegally discriminated against workers with legitimate reasons for cancelling shifts.",Court Rules Deliveroo Used 'Discriminatory' Algorithm ObjectId(625763e7343edc875fe63a6b),110,2016-01-01,"[1413,2651]","[""arkansas-department-of-human-services""]","[""interrai""]","[""arkansas-medicaid-waiver-program-beneficiaries"",""arkansas-healthcare-workers""]","Beneficiaries of the Arkansas Department of Human Services (DHS)'s Medicaid waiver program were allocated drastically fewer hours of caretaker visits via an algorithm deployed to boost efficiency, which reportedly contained errors and whose outputs varied wildly despite small input changes.",Arkansas's Opaque Algorithm to Allocate Health Care Excessively Cut Down Hours for Beneficiaries ObjectId(625763e7343edc875fe63a6c),111,2015-09-25,"[1426,1427,1428,1429,1430]","[""amazon-flex""]","[""amazon""]","[""amazon-flex-employees"",""amazon-flex-drivers""]",Amazon Flex's contract delivery drivers were dismissed via an automated employee performance evaluation with minimal human involvement based on indicators impacted by factors outside the drivers' control and without having a chance to defend against or appeal the decision.,Amazon Flex Drivers Allegedly Fired via Automated Employee Evaluations ObjectId(625763e7343edc875fe63a65),104,2021-02-12,[1407],"[""california-department-of-public-health""]","[""blue-shield-of-california""]","[""california-low-income-neighborhoods"",""california-communities-of-color""]","California's vaccine-distribution algorithm used ZIP codes as opposed to census tracts in its decision-making, which critics said undermined equity and access for vulnerable communities, which are largely low-income, underserved neighborhoods with low Healthy Places Index scores.","California's Algorithm Considered ZIP Codes in Vaccine Distribution, Allegedly Excluding Low-Income Neighborhoods and Communities of Color" ObjectId(625763e7343edc875fe63a67),106,2020-12-23,"[1409,1416,1417,1418,1419,1420,1421,1422,1423,1424,1425,2034,2356]","[""facebook-messenger""]","[""scatter-lab""]","[""korean-facebook-messenger-users"",""korean-people-of-gender-minorities"",""korean-people-with-disabilities""]","A Korean interactive chatbot was shown in screenshots to have used derogatory and bigoted language when asked about lesbians, Black people, and people with disabilities.",Korean Chatbot Luda Made Offensive Remarks towards Minority Groups ObjectId(625763e7343edc875fe63a69),108,2021-07-10,"[1411,1503,1537]","[""riverside-arena-skating-rink""]","[""unknown""]","[""lamya-robinson"",""black-livonia-residents""]","A Black teenager living in Livonia, Michigan, was incorrectly stopped from entering a roller skating rink after its facial-recognition cameras misidentified her as another person who had been previously banned for starting a skirmish with other skaters.",Skating Rink’s Facial 
Recognition Cameras Misidentified Black Teenager as Banned Troublemaker ObjectId(625763e8343edc875fe63a70),115,2020-07-28,"[1440,1472,2204]","[""genderify""]","[""genderify""]","[""genderify-customers"",""gender-minority-groups""]","A company's AI predicting a person's gender based on their name, email address, or username was reported by its users to show biased and inaccurate results.",Genderify’s AI to Predict a Person’s Gender Revealed by Free API Users to Exhibit Bias ObjectId(625763e8343edc875fe63a6f),114,2018-07-26,[1439],"[""amazon""]","[""amazon""]","[""rekognition-users"",""arrested-people""]","Rekognition's face comparison feature was shown by the ACLU to have misidentified members of Congress, and particularly members of color, as people who had been arrested, using a mugshot database built from publicly available arrest photos.",Amazon's Rekognition Falsely Matched Members of Congress to Mugshots ObjectId(625763e7343edc875fe63a68),107,2018-07-20,"[1410,1928]","[""none""]","[""huawei"",""megvii"",""sensetime"",""alibaba"",""baibu""]","[""uyghur-people""]","Various Chinese firms were revealed by patent applications to have developed facial recognition capable of detecting people by race, which critics feared would enable persecution and discrimination of Uyghur Muslims.","Chinese Tech Firms Allegedly Developed Facial Recognition to Identify People by Race, Targeting Uyghur Muslims" ObjectId(625763e8343edc875fe63a74),119,2021-08-03,"[1444,1800,1801,1802]","[""xsolla""]","[""unknown""]","[""xsolla-employees""]","Xsolla's CEO fired more than a hundred employees from his company in Perm, Russia, based on big data analysis of their remote digitized-work activity, which critics said violated employees' privacy and was outdated and extremely ineffective.",Xsolla Employees Fired by CEO Allegedly via Big Data Analytics of Work Activities ObjectId(625763e7343edc875fe63a66),105,2019-08-24,[1408],"[""tesla""]","[""tesla""]","[""jovani-maldonado"",""benjamin-maldonado"",""california-public""]","A Tesla Model 3 on Autopilot mode crashed into a pickup on a California freeway, where data and video from the company showed neither Autopilot nor the driver slowing the vehicle until seconds before the crash.","Tesla Model 3 on Autopilot Crashed into a Ford Explorer Pickup, Killing a Fifteen-Year-Old in California" ObjectId(625763e6343edc875fe63a60),99,2012-01-01,[1402],"[""university-of-massachusetts-amherst"",""university-of-wisconsin-milwaukee"",""university-of-houston"",""texas-aandm-university"",""georgia-state-university"",""more-than-500-colleges""]","[""eab""]","[""black-college-students"",""latinx-college-students"",""indigenous-students""]",Several major universities are using a tool that uses race as one factor to predict student success.,Major Universities Are Using Race as a “High Impact Predictor” of Student Success ObjectId(625763e8343edc875fe63a76),121,2020-03-27,"[2106,2105,2104,1447]","[""tripoli-based-government""]","[""stm""]","[""libyan-soldiers""]","In Libya, a Turkish-made Kargu-2 aerial drone powered by a computer vision model was allegedly used remotely by forces backed by the Tripoli-based government to track down and attack enemies as they were running from rocket attacks.",Autonomous Kargu-2 Drone Allegedly Remotely Used to Hunt down Libyan Soldiers ObjectId(625763e7343edc875fe63a6a),109,2017-01-01,[1412],"[""pimeyes""]","[""pimeyes""]","[""internet-users""]","PimEyes offered its subscription-based AI service to members of the public to search for matching facial images 
across the internet, which critics said lacked public oversight and government rules to prevent it from being misused, such as for stalking women.",PimEyes's Facial Recognition AI Allegedly Lacked Safeguards to Prevent Itself from Being Abused ObjectId(625763e6343edc875fe63a62),101,2018-09-01,"[1404,1575,1863,2570,2805,2845]","[""dutch-tax-authority""]","[""unknown""]","[""dutch-tax-authority"",""dutch-families""]","A childcare benefits system in the Netherlands falsely accused thousands of families of fraud, in part due to an algorithm that treated having a second nationality as a risk factor.",Dutch Families Wrongfully Accused of Tax Fraud Due to Discriminatory Algorithm ObjectId(625763e7343edc875fe63a64),103,2020-09-18,"[1406,1527,1528,2145,2241]","[""twitter""]","[""twitter""]","[""twitter-users"",""twitter-non-white-users"",""twitter-non-male-users""]","Twitter's photo cropping algorithm was revealed by researchers to favor white faces and female faces in photos containing multiple faces, prompting the company to stop its use on its mobile platform.",Twitter’s Image Cropping Tool Allegedly Showed Gender and Racial Bias ObjectId(625763e8343edc875fe63a6e),113,2020-06-27,[1438],"[""facebook""]","[""facebook""]","[""black-people"",""facebook-users""]","Facebook's AI mislabeled a video featuring Black men as a video about ""primates,"" resulting in an offensive prompt message for users who watched the video.","Facebook's AI Put ""Primates"" Label on Video Featuring Black Men" ObjectId(625763e9343edc875fe63a77),122,2015-06-14,[1448],"[""facebook""]","[""facebook""]","[""facebook-users""]","Facebook’s initial version of its Tag Suggestions feature, in which users were offered suggestions about the identity of people's faces in photos, allegedly stored biometric data without consent, violating the Illinois Biometric Information Privacy Act.","Facebook’s ""Tag Suggestions"" Allegedly Stored Biometric Data without User Consent" ObjectId(625763e7343edc875fe63a63),102,2020-03-23,"[1405,1523]","[""microsoft"",""ibm"",""google"",""apple"",""amazon""]","[""microsoft"",""ibm"",""google"",""apple"",""amazon""]","[""black-people""]","A study found that voice recognition tools from Apple, Amazon, Google, IBM, and Microsoft disproportionately made errors when transcribing black speakers.","Personal voice assistants struggle with black voices, new study shows" ObjectId(625763e8343edc875fe63a6d),112,2012-10-09,"[1434,1436,1432,1433,1810,2495,2496,1435,1821,2250,2623,2831]","[""troy-police-department"",""syracuse-police-department"",""san-francisco-police-department"",""san-antonio-police-department"",""new-york-city-police-department"",""fall-river-police-department"",""chicago-police-department""]","[""shotspotter""]","[""troy-residents"",""syracuse-residents"",""san-francisco-residents"",""san-antonio-residents"",""new-york-city-residents"",""fall-river-residents"",""chicago-residents"",""troy-police-department"",""syracuse-police-department"",""san-francisco-police-department"",""san-antonio-police-department"",""new-york-city-police-department"",""fall-river-police-department"",""chicago-police-department""]","ShotSpotter's algorithmic gunshot-locating systems were reported by police departments as having high false-positive rates and wasting police resources, prompting discontinuation.",Police Departments Reported ShotSpotter as Unreliable and Wasteful ObjectId(625763e6343edc875fe63a61),100,2021-03-17,[1403],"[""french-welfare-offices""]","[""unknown""]","[""lucie-inland""]",A French welfare office using software to 
automatically evaluate cases incorrectly notified a woman receiving benefits that she owed €542.,How French welfare services are creating ‘robo-debt’ ObjectId(625763e8343edc875fe63a71),116,2021-09-20,"[1441,1803]","[""amazon""]","[""netradyne""]","[""amazon-delivery-drivers"",""amazon-workers""]","Amazon's automated performance evaluation system involving AI-powered cameras incorrectly punished delivery drivers for non-existent mistakes, impacting their chances for bonuses and rewards.",Amazon's AI Cameras Incorrectly Penalized Delivery Drivers for Mistakes They Did Not Make ObjectId(625763e8343edc875fe63a73),118,2020-08-06,"[1443,2009,2010]","[""openai""]","[""openai""]","[""muslims""]","Users and researchers revealed the generative AI GPT-3 associating Muslims with violence in prompts, resulting in disturbingly racist and explicit outputs such as casting a Muslim actor as a terrorist.",OpenAI's GPT-3 Associated Muslims with Violence ObjectId(625763e8343edc875fe63a75),120,2020-09-01,[1445],"[""unknown""]","[""murat-ayfer"",""openai""]","[""reddit-users""]","Philosopher AI, a GPT-3-powered controversial text generator, was allegedly used by an anonymous actor on the AskReddit subreddit, whose posts featured a mixture of harmless stories, conspiracy theories, and sensitive topic discussions.",Philosophy AI Allegedly Used To Generate Mixture of Innocent and Harmful Reddit Posts ObjectId(625763e8343edc875fe63a72),117,2020-02-24,"[1442,2019,2020,2021]","[""tiktok""]","[""tiktok""]","[""tiktok-users"",""tiktok-content-creators""]","TikTok's ""Suggested Accounts"" recommendations allegedly reinforced racial bias despite not basing recommendations on race or creators' profile photos.","TikTok's ""Suggested Accounts"" Algorithm Allegedly Reinforced Racial Bias through Feedback Loops" ObjectId(625763e9343edc875fe63a7e),129,2021-03-01,[1462],"[""facebook""]","[""facebook""]","[""facebook-users""]","Facebook's automated moderation tools were shown by internal documents to perform poorly compared to human moderators, accounting for only a small fraction of hate speech, violence, and incitement content removals.","Facebook's Automated Tools Failed to Adequately Remove Hate Speech, Violence, and Incitement" ObjectId(625763ea343edc875fe63a89),140,2020-06-01,[1478],"[""university-of-toronto""]","[""proctoru""]","[""university-of-toronto-bipoc-students""]","An exam monitoring service used by the University of Toronto was alleged by its students to have provided discriminatory check-in experiences via its facial recognition's failure to verify passport photos, disproportionately increasing stress for BIPOC students.",ProctorU’s Identity Verification and Exam Monitoring Systems Provided Allegedly Discriminatory Experiences for BIPOC Students ObjectId(625763eb343edc875fe63a8f),146,2021-10-22,"[1494,1495,1502]","[""allen-institute-for-ai""]","[""allen-institute-for-ai""]","[""minority-groups""]","A publicly accessible research model that was trained via Reddit threads showed racially biased advice on moral dilemmas, allegedly demonstrating limitations of language-based models trained on moral judgments.","Research Prototype AI, Delphi, Reportedly Gave Racially Biased Answers on Ethics" ObjectId(625763ea343edc875fe63a86),137,2021-01-11,[1474],"[""israeli-tax-authority""]","[""israeli-tax-authority""]","[""moshe-har-shemesh"",""israeli-people-having-tax-fines""]","An Israeli farmer was issued a computer-generated fine by the tax authority, which allegedly was not able to explain its calculation and 
refused to disclose the program and its source code.","Israeli Tax Authority Employed Opaque Algorithm to Impose Fines, Reportedly Refusing to Provide an Explanation for Amount Calculation to a Farmer" ObjectId(625763eb343edc875fe63a8d),144,2020-06-28,"[1483,1979,1980,2042,2043,2134]","[""youtube""]","[""youtube""]","[""antonio-radic"",""youtube-chess-content-creators"",""youtube-users""]","YouTube's AI-powered hate speech detection system falsely flagged chess content and banned chess creators allegedly due to its misinterpretation of strategy language such as ""black,"" ""white,"" and ""attack"" as harmful and dangerous.",YouTube's AI Mistakenly Banned Chess Channel over Chess Language Misinterpretation ObjectId(625763eb343edc875fe63a91),148,2021-11-21,[1499],"[""accessibe"",""accessus.ai"",""allyable"",""userway"",""maxaccess.io""]","[""accessibe"",""accessus.ai"",""allyable"",""userway"",""maxaccess.io""]","[""internet-users-with-disabilities"",""web-accessibility-vendors'-customers""]","AI-powered web accessibility vendors allegedly overstated to customers their products' utility for people with disabilities, falsely claiming to deliver automated compliance solutions.",Web Accessibility Vendors Allegedly Falsely Claimed to Provide Compliance Using AI ObjectId(625763ec343edc875fe63a9d),160,2021-12-26,"[1520,1521,2381]","[""amazon""]","[""amazon""]","[""kristin-livdahl's-daughter"",""amazon-echo-customers"",""children-using-alexa""]","Amazon’s voice assistant Alexa suggested “the penny challenge,” which involves dangerously touching a coin to the prongs of a half-exposed plug, when a ten-year-old girl asked for a challenge to do.",Alexa Recommended Dangerous TikTok Challenge to Ten-Year-Old Girl ObjectId(625763ec343edc875fe63a95),152,2021-07-13,"[1509,1510]","[""softbank""]","[""aldebaran"",""softbank-robotics""]","[""softbank""]","SoftBank's robot allegedly kept making mechanical errors, taking unplanned breaks, failing to recognize previously-met people, and breaking down during practice runs.","SoftBank's Humanoid Robot, Pepper, Reportedly Frequently Made Errors, Prompting Dismissal" ObjectId(625763ed343edc875fe63a9e),161,2019-04-03,"[1530,2138,2139]","[""facebook""]","[""facebook""]","[""female-facebook-users"",""black-facebook-users"",""male-facebook-users""]",Facebook's housing and employment ad delivery process allegedly resulted in skews in exposure for some users along demographic lines such as gender and racial identity.,Facebook's Ad Delivery Reportedly Excluded Audience along Racial and Gender Lines ObjectId(625763ed343edc875fe63aa1),164,2018-10-01,[1534],"[""facebook""]","[""facebook""]","[""facebook-users"",""facebook-content-creators""]","After the “News Feed” algorithm had been overhauled to boost engagement between friends and family in early 2018, its heavy weighting of re-shared content was allegedly found by company researchers to have pushed content creators to reorient their posts towards outrage and sensationalism, causing a proliferation of misinformation, toxicity, and violent content.","Facebook ""News Feed"" Allegedly Boosted Misinformation and Violating Content Following Use of MSI Metric" ObjectId(625763ed343edc875fe63aa5),168,2022-03-01,"[1540,1541]","[""facebook"",""linkedin"",""youtube"",""twitter"",""netflix""]","[""facebook"",""linkedin"",""youtube"",""twitter"",""netflix""]","[""facebook-users"",""linkedin-users"",""youtube-users"",""twitter-users"",""netflix-users""]","Collaborative filtering is prone to popularity bias, resulting in 
overrepresentation of popular items in the recommendation outputs.","Collaborative Filtering Prone to Popularity Bias, Resulting in Overrepresentation of Popular Items in the Recommendation Outputs" ObjectId(625763ea343edc875fe63a82),133,2020-12-15,[1468],"[""tiktok""]","[""tiktok""]","[""tiktok-content-creators-of-marginalized-groups""]",TikTok's automated content reporting system was allegedly abused by online trolls to intentionally misreport content created by users of marginalized groups.,Online Trolls Allegedly Abused TikTok’s Automated Content Reporting System to Discriminate against Marginalized Creators ObjectId(625763e9343edc875fe63a80),131,2020-12-04,"[1465,1771]","[""california-bar's-committee-of-bar-examiners""]","[""examsoft""]","[""california-bar-exam-takers"",""flagged-california-bar-exam-takers""]","The proctoring algorithm used in a California bar exam flagged about a third of the thousands of applicants as cheaters; flagged exam takers were reportedly instructed to prove otherwise without seeing their incriminating video evidence.",Proctoring Algorithm in Online California Bar Exam Flagged an Unusually High Number of Alleged Cheaters ObjectId(625763ec343edc875fe63a96),153,2019-12-29,"[1511,1729,1763,2514]","[""tesla""]","[""tesla""]","[""gilberto-alcazar-lopez"",""maria-guadalupe-nieves-lopez""]","In 2019, a Tesla Model S driver on Autopilot mode reportedly went through a red light and crashed into a Honda Civic, killing two people in Gardena, Los Angeles.","Tesla Driver on Autopilot Ran a Red Light, Crashing into a Car and Killing Two People in Los Angeles" ObjectId(625763eb343edc875fe63a8e),145,2021-07-23,"[1485,1504,1529]","[""tesla""]","[""tesla""]","[""tesla-drivers""]","Tesla's Autopilot was shown on video by its owner mistaking the moon for a yellow stop light, allegedly causing the vehicle to keep slowing down.",Tesla's Autopilot Misidentified the Moon as Yellow Stop Light ObjectId(625763ea343edc875fe63a8a),141,2021-02-05,"[1479,1480]","[""instagram""]","[""instagram""]","[""sennett-devermont"",""beverly-hills-citizens""]","A police officer in Beverly Hills played copyrighted music on his phone upon realizing that his interactions were being recorded on a livestream, allegedly hoping that Instagram's automated copyright detection system would end or mute the stream.",California Police Turned on Music to Allegedly Trigger Instagram’s DMCA to Avoid Being Live-Streamed ObjectId(625763ea343edc875fe63a83),134,2020-12-25,"[1469,1951]","[""fuzhou-zhongfang-marlboro-mall""]","[""unknown""]","[""fuzhou-zhongfang-marlboro-mall-goers""]","A shopping guide robot deployed by the Fuzhou Zhongfang Marlboro Mall was shown on video allegedly walking to the escalator by itself, falling down, and knocking over passengers, which prompted its suspension.","Robot in Chinese Shopping Mall Fell off the Escalator, Knocking down Passengers" ObjectId(625763e9343edc875fe63a7a),125,2020-09-29,"[1451,1452,1460]","[""amazon""]","[""amazon""]","[""amazon-fulfillment-center-workers""]",Amazon’s robotic fulfillment centers have higher serious injury rates.,Amazon’s Robotic Fulfillment Centers Have Higher Serious Injury Rates ObjectId(625763ea343edc875fe63a85),136,2020-12-06,[1471],"[""brand-safety-tech-firms""]","[""none""]","[""news-sites""]","Brand safety tech firms falsely claimed use of AI, blocking ads using simple keyword lists.","Brand Safety Tech Firms Falsely Claimed Use of AI, Blocking Ads Using Simple Keyword Lists"
ObjectId(625763ea343edc875fe63a87),138,2020-01-21,"[1475,1505,1555,1556,2442,2434]","[""university-of-illinois""]","[""proctorio""]","[""university-of-illinois-students-of-color"",""university-of-illinois-students""]","Proctorio's remote-testing software was reported by students at the University of Illinois Urbana-Champaign for issues regarding privacy, accessibility, and differential performance on darker-skinned students.","Proctorio's Alleged Privacy, Accessibility, and Discrimination Issues Prompted Suspension by University of Illinois" ObjectId(625763ea343edc875fe63a81),132,2020-12-27,[1466],"[""tiktok""]","[""tiktok""]","[""tiktok-users"",""tiktok-users-under-18-years-old""]","Videos promoting eating disorders evaded TikTok's automated violation detection system without difficulty via common misspellings of search terms, bypassing its ban of violating hashtags such as ""proana"" and ""anorexia"".",TikTok’s Content Moderation Allegedly Failed to Adequately Take down Videos Promoting Eating Disorders ObjectId(625763ea343edc875fe63a84),135,2012-12-01,"[1470,1871]","[""university-of-texas-at-austin's-department-of-computer-science""]","[""university-of-texas-at-austin-researchers""]","[""university-of-texas-at-austin-phd-applicants-of-marginalized-groups""]","The University of Texas at Austin's Department of Computer Science's assistive algorithm to assess PhD applicants, ""GRADE,"" raised concerns among faculty about worsening historical inequalities for marginalized candidates, prompting its suspension.",UT Austin GRADE Algorithm Allegedly Reinforced Historical Inequalities ObjectId(625763ec343edc875fe63a98),155,2021-12-27,"[1513,1514]","[""google-maps""]","[""google-maps""]","[""google-maps-users-traveling-in-sierra-nevada"",""google-maps-users-traveling-in-the-mountains""]",Lake Tahoe travelers were allegedly guided by Google Maps into hazardous shortcuts in the mountains during a snowstorm.,Google Maps Allegedly Directed Sierra Nevada Travelers to Dangerous Roads amid Winter Storm ObjectId(625763ed343edc875fe63a9f),162,2014-01-01,[1531],"[""ets""]","[""ets""]","[""uk-ets-past-test-takers"",""uk-ets-test-takers"",""uk-home-office""]","International testing organization ETS admitted voice recognition as evidence of cheating by thousands of previous TOEIC test-takers, a group that reportedly included wrongfully accused people, causing them to be deported without an appeal process or seeing their incriminating evidence.","ETS Used Allegedly Flawed Voice Recognition Evidence to Accuse and Assess Scale of Cheating, Causing Thousands to be Deported from the UK" ObjectId(625763ee343edc875fe63aa8),171,2021-10-18,[1549],"[""bath-government""]","[""unknown""]","[""paula-knight"",""bath-officials"",""uk-public""]",A Bath resident was wrongly fined by local officials because an automated license plate recognition camera misread the text on her shirt as a license plate number.,"Traffic Camera Misread Text on Pedestrian's Shirt as License Plate, Causing UK Officials to Issue Fine to an Unrelated Person" ObjectId(625763ec343edc875fe63a94),151,2021-10-28,"[1507,1508,1703,1704,1705,1706,1707,1708,1709,1710]","[""pony.ai""]","[""pony.ai""]","[""san-francisco-city-government""]","A Pony.ai vehicle operating in autonomous mode crashed into a center divider and a traffic sign in San Francisco, prompting a regulator to suspend the driverless testing permit for the startup.",California Regulator Suspended Pony.ai's Driverless Testing Permit Following a Non-Fatal Collision
ObjectId(625763e9343edc875fe63a78),123,2021-08-01,"[1449,2651,2705]","[""university-of-michigan-hospital""]","[""epic-systems""]","[""sepsis-patients""]","Epic Systems's sepsis prediction algorithms were shown by investigators at the University of Michigan Hospital to have high rates of false positives and false negatives, allegedly delivering inaccurate and irrelevant information on patients, contrasting sharply with the company's published claims.",Epic Systems’s Sepsis Prediction Algorithms Revealed to Have High Error Rates on Seriously Ill Patients ObjectId(625763ec343edc875fe63a99),156,2022-02-04,"[1515,2197]","[""amazon""]","[""amazon""]","[""people-attempting-suicides""]","Despite complaints notifying Amazon about the sale of various products that had been used to aid suicide attempts, its recommendation system reportedly continued selling them and suggesting their frequently bought-together items.",Amazon Reportedly Sold Products and Recommended Frequently Bought Together Items That Aid Suicide Attempts ObjectId(625763eb343edc875fe63a8c),143,2021-02-16,[1482],"[""facebook"",""twitter""]","[""facebook"",""twitter""]","[""facebook-users-of-small-language-groups"",""twitter-users-of-small-language-groups""]","Facebook and Twitter were not able to sufficiently moderate content of small language groups such as the Balkan languages using AI, allegedly due to the lack of investment in human moderation and difficulty in AI-solution design for the languages.",Facebook’s and Twitter's Automated Content Moderation Reportedly Failed to Effectively Enforce Violation Rules for Small Language Groups ObjectId(625763ec343edc875fe63a9a),157,2021-03-15,[1516],"[""amazon""]","[""amazon""]","[""ans-rana"",""amazon-workers"",""amazon-delivery-drivers""]","A lawsuit cited Amazon as liable in a crash involving its delivery driver, alleging that Amazon’s AI-powered driver monitoring system pushed drivers to prioritize speed over safety.","Amazon's Monitoring System Allegedly Pushed Delivery Drivers to Prioritize Speed over Safety, Leading to Crash" ObjectId(625763ee343edc875fe63aa7),170,2003-06-01,"[1546,1547,1548]","[""target""]","[""target""]","[""target-customers""]","Target recommended maternity-related items to a family in Atlanta via ads, allegedly predicting their teenage daughter’s pregnancy before her father did, although critics have called into question the predictive power of the algorithm and the authenticity of its claims.","Target Suggested Maternity-Related Advertisements to a Teenage Girl's Home, Allegedly Correctly Predicting Her Pregnancy via Algorithm" ObjectId(625763ea343edc875fe63a88),139,2021-01-21,"[1476,1477]","[""amazon""]","[""amazon""]","[""amazon-customers""]","Evidence of the ""filter-bubble effect"" was found by vaccine-misinformation researchers in Amazon's recommendations, where its algorithms presented users who performed actions on misinformative products with more misinformative products.",Amazon’s Search and Recommendation Algorithms Found by Auditors to Have Boosted Products That Contained Vaccine Misinformation ObjectId(625763eb343edc875fe63a8b),142,2021-02-11,[1481],"[""facebook"",""instagram""]","[""facebook"",""instagram""]","[""facebook-users-of-disabilities"",""adaptive-fashion-retailers""]","Facebook platforms' automated ad moderation system falsely classified adaptive fashion products as medical and health care products and services, resulting in regular bans and appeals faced by their retailers.",Facebook’s Advertisement Moderation System Routinely Misidentified
Adaptive Fashion Products as Medical Equipment and Blocked Their Sellers ObjectId(625763eb343edc875fe63a93),150,2018-07-21,[1506],"[""natural-cycles""]","[""natural-cycles""]","[""natural-cycles-users"",""women""]","Some women using the contraceptive app, Natural Cycles, reported unwanted pregnancies, revealing its algorithm's difficulties in mapping menstrual cycles.","Swedish Contraceptive App, Natural Cycles, Allegedly Failed to Correctly Map Menstrual Cycle" ObjectId(625763ed343edc875fe63aa6),169,2018-08-15,"[1544,1545]","[""facebook"",""meta""]","[""facebook"",""meta""]","[""rohingya-people"",""rohingya-facebook-users"",""myanmar-public"",""facebook-users-in-myanmar"",""burmese-speaking-facebook-users""]","Facebook allegedly did not adequately remove anti-Rohingya hate speech, some of which was extremely violent and dehumanizing, on its platform, contributing to the violence faced by Rohingya communities in Myanmar.",Facebook Allegedly Failed to Police Anti-Rohingya Hate Speech Content That Contributed to Violence in Myanmar ObjectId(625763eb343edc875fe63a90),147,2020-01-01,"[1496,1497]","[""scammers""]","[""unknown""]","[""hong-kong-bank-manager""]","In early 2020, fraudsters reportedly deepfaked the voice of a company's director, demanding that a bank manager in Hong Kong authorize a $35M transfer.",Hong Kong Bank Manager Swindled by Fraudsters Using Deepfaked Voice of Company Director ObjectId(625763e9343edc875fe63a7b),126,2021-07-16,"[1453,1454,1455,1532]","[""ocado""]","[""ocado""]","[""ocado""]","A collision involving three robots at an Ocado warehouse in Erith, UK, resulted in a fire but no reports of injuries.","Three Robots Collided, Sparking Fire in a Grocer's Warehouse in UK " ObjectId(625763e9343edc875fe63a7d),128,2017-08-01,"[1459,1818]","[""tesla""]","[""tesla""]","[""eric-horvitz"",""tesla-drivers""]","A Tesla Sedan operating on Autopilot mode was not able to center itself on the road and drove over a yellow dividing curb in Redmond, Washington, causing minor damage to the vehicle’s rear suspension.","Tesla Sedan on Autopilot Reportedly Drove Over Dividing Curb in Washington, Resulting in Minor Vehicle Damage" ObjectId(625763ec343edc875fe63a97),154,2022-01-26,[1512],"[""us-department-of-justice""]","[""us-department-of-justice""]","[""inmates-of-color""]","The Department of Justice’s inmate-recidivism risk assessment tool was reported to have produced racially uneven results, misclassifying risk levels for inmates of color.",Justice Department’s Recidivism Risk Algorithm PATTERN Allegedly Caused Persistent Disparities Along Racial Lines ObjectId(625763ed343edc875fe63aa2),165,2020-06-20,"[1536,2781]","[""duke-researchers""]","[""duke-researchers""]","[""people-having-non-caucasian-facial-features""]","Image upscaling tool PULSE powered by NVIDIA's StyleGAN reportedly generated faces with Caucasian features more often, although AI academics, engineers, and researchers were not in agreement about where the source of bias was.",Image Upscaling Algorithm PULSE Allegedly Produced Facial Images with Caucasian Features More Often ObjectId(625763ec343edc875fe63a9b),158,2021-02-01,[1517],"[""unknown""]","[""unknown""]","[""amaya-ross"",""black-students"",""black-test-takers""]","A Black student's face was not recognized by the remote-proctoring software during check-in of a lab quiz, causing her to repeatedly change her environment to get it to work as intended.",Facial Recognition in Remote Learning Software Reportedly Failed to Recognize a Black Student’s Face
ObjectId(625763eb343edc875fe63a92),149,2021-11-02,"[1500,1501,1890]","[""zillow""]","[""zillow-offers""]","[""zillow-offers-staff"",""zillow""]","Zillow's AI-powered predictive pricing tool Zestimate was allegedly not able to accurately forecast housing prices three to six months in advance due to rapid market changes, prompting the division's shutdown and the layoff of a few thousand employees.",Zillow Shut Down Zillow Offers Division Allegedly Due to Predictive Pricing Tool's Insufficient Accuracy ObjectId(625763e9343edc875fe63a79),124,2019-10-24,"[1450,1522,2262,2652,2651,2704,2856]","[""unnamed-large-academic-hospital""]","[""optum""]","[""black-patients""]","Optum's algorithm deployed by a large academic hospital was revealed by researchers to have under-predicted the health needs of black patients, effectively de-prioritizing them in extra care programs relative to white patients with the same health burden.",Algorithmic Health Risk Scores Underestimated Black Patients’ Needs ObjectId(625763ec343edc875fe63a9c),159,2019-03-29,[2471],"[""tesla""]","[""tesla""]","[""tesla-drivers""]","Tencent Keen Security Lab conducted security research into Tesla’s Autopilot system and identified crafted adversarial samples and remote control via a wireless gamepad as vulnerabilities in its system, although the company called into question their real-world practicality. This incident has been downgraded to an issue as it does not meet current ingestion criteria.",Tesla Autopilot’s Lane Recognition Allegedly Vulnerable to Adversarial Attacks ObjectId(625763ed343edc875fe63aa4),167,2017-09-07,[1539],"[""michal-kosinski"",""yilun-wang""]","[""michal-kosinski"",""yilun-wang""]","[""lgbtq-people"",""lgbtq-people-of-color"",""non-american-lgbtq-people""]","Researchers at Stanford Graduate School of Business developed a model that determined, on a binary scale, whether someone was homosexual using only his facial image, which advocacy groups such as GLAAD and the Human Rights Campaign denounced as flawed science and threatening to LGBTQ folks.",Researchers' Homosexual-Men Detection Model Denounced as a Threat to LGBTQ People’s Safety and Privacy ObjectId(625763e9343edc875fe63a7c),127,2020-06-06,"[1456,1457,1458,1461,1486,1487,1488,1489,1490,1491,1492,1493]","[""microsoft"",""msn.com""]","[""microsoft""]","[""jade-thirlwall"",""leigh-anne-pinnock""]","A news story published on MSN.com featured a photo of the wrong mixed-race person that was allegedly selected by an algorithm, following Microsoft’s layoff and replacement of journalists and editorial workers at its organizations with AI systems.",Microsoft’s Algorithm Allegedly Selected Photo of the Wrong Mixed-Race Person Featured in a News Story ObjectId(625763ed343edc875fe63aa0),163,2021-11-21,"[1533,1535,1652]","[""facebook""]","[""facebook""]","[""facebook-users-of-minority-groups"",""facebook-users""]","Facebook’s hate-speech detection algorithms were found by company researchers to have under-reported less common but more harmful content that was more often experienced by minority groups such as Black, Muslim, LGBTQ, and Jewish users.",Facebook’s Hate Speech Detection Algorithms Allegedly Disproportionately Failed to Remove Racist Content towards Minority Groups ObjectId(625763ed343edc875fe63aa3),166,2020-02-07,"[1538,1563]","[""giggle""]","[""kairos""]","[""trans-women"",""women-of-color""]","A social networking platform, Giggle, allegedly collected, shared with third parties, and used sensitive information and biometric data to verify whether a person is a woman
via facial recognition, which critics claimed to be discriminatory against women of color and harmful towards trans women.","Networking Platform Giggle Employs AI to Determine Users’ Gender, Allegedly Excluding Transgender Women" ObjectId(6259b3d5c2337187617c53c3),176,2022-03-02,[1557],"[""oregon-state-university""]","[""starship-technologies""]","[""oregon-state-university"",""freight-train-crew""]","A Starship food delivery robot deployed by Oregon State University reportedly failed to cross a railroad crossing, became stranded, and was struck by an oncoming freight train.","Starship’s Autonomous Food Delivery Robot Allegedly Stranded at Railroad Crossing in Oregon, Run over by Freight Train" ObjectId(62842ee176e12cf335550ab1),182,2018-06-11,"[1573,1574]","[""cruise""]","[""cruise""]","[""cruise-vehicles"",""cruise-driver-employee""]","In San Francisco, an autonomous Cruise Chevrolet Bolt collided with another Cruise vehicle driven by a Cruise human employee, causing minor scuffs to the cars but no human injuries.",Two Cruise Autonomous Vehicles Collided with Each Other in California ObjectId(625763ee343edc875fe63aaa),173,2021-07-30,[1551],"[""unknown""]","[""unknown""]","[""doctors"",""covid-patients""]","AI tools failed to sufficiently predict COVID patients, and some were potentially harmful.","AI Tools Failed to Sufficiently Predict COVID Patients, Some Potentially Harmful" ObjectId(6297936efc298401e1ba35c0),203,2022-02-10,"[1659,1660,1661]","[""uber""]","[""uber""]","[""uber-drivers""]","Uber launched a new but opaque algorithm to determine drivers' pay in the US, which allegedly caused drivers to experience lower fares, confusing fare drops, and a decrease in rides.",Uber Launched Opaque Algorithm That Changes Drivers' Payments in the US ObjectId(62849d9dcb05238c61a5cc65),184,2018-04-12,"[1581,1584,1899]","[""companhia-do-metropolitano-de-sao-paulo""]","[""securos""]","[""sao-paulo-metro-users"",""sao-paulo-citizens""]","A facial recognition program rolled out by São Paulo Metro Stations was suspended following a court ruling in response to a lawsuit by civil society organizations, which cited fears of it being integrated with other electronic surveillance entities without consent, and a lack of transparency about the biometric data collection process of metro users.",Facial Recognition Program in São Paulo Metro Stations Suspended for Illegal and Disproportionate Violation of Citizens’ Right to Privacy ObjectId(628ad91f7da5b905fb4444b8),195,2015-09-01,"[1622,1623,1624,1625,1626,1627,1628,1629,1630,1631,1632,1843]","[""pasco-sheriff's-office""]","[""unknown""]","[""pasco-residents"",""pasco-black-students"",""pasco-students-with-disabilities""]","The Intelligence-Led Policing model rolled out by the Pasco County Sheriff’s Office was allegedly developed based on flawed science and biased data that also contained sensitive information and irrelevant attributes about students, which critics said was discriminatory.",Predictive Policing Program by Florida Sheriff’s Office Allegedly Violated Residents’ Rights and Targeted Children of Vulnerable Groups ObjectId(629dce346e8239f700dfecbf),213,2020-07-01,"[1715,1716,1717,1718,1719]","[""facebook""]","[""facebook""]","[""facebook-users""]","The performance of Facebook’s political ad detection was revealed by researchers to be imprecise, uneven across countries in errors, and inadequate for preventing systematic violations of political advertising policies.",Facebook’s Political Ad Detection Reportedly Showed High and Geographically Uneven Error
Rates ObjectId(625763ee343edc875fe63aab),174,2022-02-28,"[1552,1585,1595,1599]","[""unknown""]","[""unknown""]","[""linkedin-users""]","More than a thousand inauthentic LinkedIn profiles using allegedly GAN-generated photos were reported by researchers at Stanford to LinkedIn’s staff, and many of them were removed for violating rules against creating fake profiles and falsifying information.",Fake LinkedIn Profiles Created Using GAN Photos ObjectId(628af245a8f82bdc4c020cc2),198,2022-03-16,"[1642,1643,1644,1645,1646]","[""hackers""]","[""unknown""]","[""ukrainian-social-media-users"",""ukrainian-public"",""volodymyr-zelenskyy""]","A quickly-debunked deepfake video of Ukrainian President Volodymyr Zelenskyy was posted on various Ukrainian websites and social media platforms encouraging Ukrainians to surrender to Russian forces during the Russia-Ukraine war.",Deepfake Video of Ukrainian President Yielding to Russia Posted on Ukrainian Websites and Social Media ObjectId(625763ee343edc875fe63aac),175,2022-04-01,"[1553,1554,1606,1607,1608]","[""cruise""]","[""cruise""]","[""san-francisco-public"",""cruise-customers""]","An autonomous Chevy Bolt operated by Cruise was pulled over in San Francisco, and as the police attempted to engage with the car, it reportedly bolted off, pulled over again, and put its hazard lights on at a point farther down the road.",Cruise Autonomous Taxi Allegedly Bolted off from Police After Being Pulled over in San Francisco ObjectId(6269ca6f01cc3d7da1e059ad),178,2022-04-21,"[1560,1565,1566,1567,1568,1569,1570,1594]","[""tesla""]","[""tesla""]","[""tesla-owner"",""vision-jet-owner""]","A Tesla Model Y was shown on video slowly crashing into a Vision Jet in Spokane, Washington, allegedly due to its owner activating the “Smart Summon” feature.","Tesla Owner Activated ""Smart Summon"" Feature, Causing a Collision with an Aircraft in a Washington Airport" ObjectId(62842764c4ac5276446aed58),180,2020-02-19,"[1564,1582,2236]","[""malaysian-judiciary"",""malaysian-courts""]","[""sarawak-information-systems""]","[""malaysian-convicted-people""]","The AI system used by the Malaysian judiciary, which explicitly considered age, employment, and socio-economic data, recommended a sentence in a drug possession case that was alleged by a lawyer to be disproportionately high for the crime committed.",Algorithm Used by the Malaysian Judiciary Reportedly Recommended Unusually High Sentencing to a Drug Possession Case ObjectId(628498c9ba5ecc08807ab7d9),183,2017-07-01,"[1576,1577,1578,1579,1580,2066]","[""airbnb""]","[""airbnb"",""trooly""]","[""sex-workers"",""airbnb-users""]","Airbnb allegedly considered publicly available data on users to gauge their trustworthiness via algorithmic assessment of personality and behavioral traits, resulting in unexplained bans and discriminatory bans against sex workers.","Airbnb's Trustworthiness Algorithm Allegedly Banned Users without Explanation, and Discriminated against Sex Workers" ObjectId(62a18ffaae26c04e23bf1d27),220,2020-11-11,"[1731,1732,1969,2061]","[""facebook""]","[""facebook""]","[""small-businesses-on-facebook""]","Facebook’s AI mistakenly blocked advertisements by small and struggling businesses, after the company allegedly leaned more on algorithms to monitor ads on the platform with little review from human moderators.",Facebook Mistakenly Blocked Small Business Ads
ObjectId(628ad7417da5b905fb43f208),194,2018-02-01,[1621],"[""unnamed-australian-telecommunications-company""]","[""unknown""]","[""unnamed-australian-telecommunications-company""]","In early 2018, an Australian telecommunications company’s incident management AI excessively deployed technicians into the field, and was allegedly unable to be stopped by the automation team.","Australian Telco’s Incident Management Bot Excessively Sent Technicians in the Field by Mistake, Allegedly Costing Millions" ObjectId(628b0ea3db7d62b8a823c307),200,2019-03-01,[1653],"[""scammers""]","[""unknown""]","[""unnamed-uk-based-energy-firm's-ceo""]","Fraudsters allegedly used AI voice technology to impersonate the boss of a UK-based energy firm's CEO, demanding over the phone that the CEO transfer €220,000.",Fraudsters Used AI to Mimic Voice of a UK-Based Firm's CEO's Boss ObjectId(62988029093243282c69c2b2),206,2015-03-01,"[1675,1676,1677,1678]","[""tinder""]","[""tinder""]","[""tinder-users-over-30-years-old""]","Tinder’s personalized pricing was found by Consumers International to consider age as a major determinant of pricing, and could be considered direct discrimination based on age, according to anti-discrimination law experts.",Tinder's Personalized Pricing Algorithm Found to Offer Higher Prices for Older Users ObjectId(6285d00023ec6cb0db5af13a),186,2007-07-26,"[1590,1591,1592,1593,1788,1933,1934]","[""spanish-ministry-of-interior""]","[""spanish-secretary-of-state-for-security"",""spanish-ministry-of-interior""]","[""spanish-victims-of-gender-violence""]","In Spain, the algorithm that assesses recidivism risk in gender violence cases, VioGén, has critically underestimated the level of risk in a series of cases that ended in the homicide of women and children since its first deployment.","Algorithm Assessing Risk Faced by Victims of Gender Violence Misclassified Low-Risk Cases, Allegedly Leading to Homicide of Women and Children in Spain" ObjectId(629791afc7e109ab6bc28b6f),202,2021-12-06,"[1655,1656,1657,1658,1721]","[""yoon-suk-yeol"",""yoon-suk-yeol's-campaign""]","[""unknown""]","[""korean-public""]",A South Korean political candidate created a deepfake avatar which political opponents alleged to be fraudulent and a threat to democracy.,Korean Politician Employed Deepfake as Campaign Representative ObjectId(629f0c2548f09c92aeb5fe4d),216,2017-10-10,"[1724,1924,1925,1926,1927]","[""wechat""]","[""wechat""]","[""black-wechat-users""]",The Chinese platform WeChat provided an inappropriate and racist English translation for the Chinese term for “black foreigner” in its messaging app.,WeChat’s Machine Translation Gave a Racist English Translation for the Chinese Term for “Black Foreigner” ObjectId(62a196265fb208d11b3108fa),221,2022-03-07,"[1733,1734]","[""tesla""]","[""tesla""]","[""road-engineer""]","In Taiwan, a Tesla Model 3 on Autopilot mode whose driver did not pay attention to the road collided with a road repair truck; a road engineer immediately placed crash warnings in front of the Tesla, but soon after was hit and killed by a BMW whose driver failed to see the sign and crashed into the accident scene.",A Road Engineer Killed Following a Collision Involving a Tesla on Autopilot ObjectId(626a2b97f9c5ab809bbc9af1),179,2022-04-01,"[1561,1562,1874]","[""openai""]","[""openai""]","[""minority-groups"",""underrepresented-groups""]","OpenAI's image-generation-from-natural-language-description model, DALL-E 2, was shown to have various risks pertaining to its use, such as misuse as disinformation, explicit content generation,
and reinforcement of gender and racial stereotypes, which were acknowledged by its developers.",Images Generated by OpenAI’s DALL-E 2 Exhibited Bias and Reinforced Stereotypes ObjectId(625763ee343edc875fe63aa9),172,2020-07-01,[1550],"[""appriss""]","[""appriss""]","[""american-physicians"",""american-pharmacists"",""american-patients-of-minority-groups"",""american-patients""]","NarxCare's algorithm assessing a patient’s overdose risk allegedly did not undergo peer-reviewed validation studies, and considered sensitive data with high risk of biases towards women and Black patients such as experience of sexual abuse and criminality.",NarxCare’s Risk Score Model Allegedly Lacked Validation and Trained on Data with High Risk of Bias ObjectId(62842db6dee309a4a8e14d4b),181,2022-02-11,"[1571,1572]","[""cruise""]","[""cruise""]","[""cruise-vehicle""]","A BMW Sedan reportedly made an illegal left turn and collided with a Cruise autonomous vehicle (AV) operating in autonomous mode, causing minor damage but no injuries.","BMW Sedan Made a Prohibited Left Turn, Colliding with a Cruise Autonomous Vehicle" ObjectId(62868592e0a9519a0ba08a94),190,2017-01-15,"[1610,1611,1612,1613]","[""bytedance""]","[""bytedance""]","[""instagram-users"",""snapchat-users"",""american-social-media-users""]","ByteDance allegedly scraped short-form videos, usernames, profile pictures, and descriptions of accounts on Instagram, Snapchat, and other sources, and uploaded them without consent on Flipagram, TikTok’s predecessor, in order to improve its “For You” algorithm's performance on American users.","ByteDance Allegedly Trained ""For You"" Algorithm Using Content Scraped without Consent from Other Social Platforms" ObjectId(626331bad17b021fce12b51b),177,2022-04-19,"[1558,1583,1600,1601,1602]","[""google-docs""]","[""google-docs""]","[""google-docs-users""]","Google’s “inclusive language” feature prompting writers to consider alternatives to non-inclusive words reportedly also recommended alternatives for words such as “landlord” and “motherboard,” which critics said was a form of obtrusive, unnecessary, and bias-reinforcing speech-policing.",Google’s Assistive Writing Feature Provided Allegedly Unnecessary and Clumsy Suggestions ObjectId(628681d73a32758144dc742b),188,2018-04-11,"[1603,1604,1782,1605]","[""salta-city-government""]","[""microsoft""]","[""salta-teenage-girls"",""salta-girls-of-minority-groups""]","In 2018, during the abortion-decriminalization debate in Argentina, the Salta city government deployed a teenage-pregnancy predictive algorithm built by Microsoft that allegedly lacked a defined purpose and explicitly considered sensitive information such as disability and whether the girls' homes had access to hot water.",Argentinian City Government Deployed Teenage-Pregnancy Predictive Algorithm Using Invasive Demographic Data ObjectId(628686fce0eed158517d4796),191,2020-10-06,"[1614,1615]","[""naver""]","[""naver""]","[""naver-customers""]","The Korean Fair Trade Commission (FTC) imposed a 26.7B KRW fine on Naver for manipulating shopping and video search algorithms, favoring its own online shopping business to boost its market share.
",Korean Internet Portal Giant Naver Manipulated Shopping and Video Search Algorithms to Favor In-House Services ObjectId(628b0fb73a32758144c2c21d),201,2020-04-14,"[1654,2435]","[""extinction-rebellion-belgium""]","[""unknown""]","[""shophie-wilmes"",""belgian-government""]",A deepfake video showing the Belgium’s prime minister speaking of an urgent need to tackle the climate crises was released by a climate action group.,Climate Action Group Posted Deepfake of Belgian Prime Minister Urging Climate Crisis Action ObjectId(62986c4c093243282c6578f5),204,2022-02-11,"[1662,1663,1664,1665]","[""zhihu""]","[""sangfor-technologies""]","[""zhihu-employees"",""chinese-tech-workers""]","The firing of an employee at Zhihu, a large Q&A platform in China, was allegedly caused by the use of a behavioral perception algorithm which claimed to predict a worker’s resignation risk using their online footprints, such as browsing history and internal communication.",A Chinese Tech Worker at Zhihu Fired Allegedly via a Resignation Risk Prediction Algorithm ObjectId(629da7969e8fc9073246a3f2),209,2020-10-20,"[1688,1689,1690,1691,1692]","[""tesla""]","[""tesla""]","[""tesla-drivers""]",The “rolling stop” functionality within the “Aggressive” Full Self Driving (FSD) profile that was released via a Tesla firmware update was recalled and disabled.,Tesla Disabled “Rolling Stop” Functionality Associated with the “Aggressive” Driving Mode ObjectId(62a045b0d1bc84a9cc93ad7a),219,2020-11-15,[1730],"[""ezemvelo-kzn-wildlife""]","[""unknown""]","[""rhinos-in-conservation""]",AI cameras installed by Ezemvelo KZN Wildlife failed to detect poachers when four dehorned rhino carcasses were found.,Poachers Evaded AI Cameras and Killed Four Rhinos ObjectId(6286854a3a32758144dd5fa7),189,2019-10-15,"[1609,1670,1671,1672,1673,1674]","[""uk-department-of-work-and-pensions""]","[""uipath""]","[""people-with-disabilities""]",People with disabilities were allegedly disproportionately targeted by a benefit fraud detection algorithm which the UK’s Department of Work and Pensions was urged to disclose.,Opaque Fraud Detection Algorithm by the UK’s Department of Work and Pensions Allegedly Discriminated against People with Disabilities ObjectId(629f00a448f09c92aeb39c9e),214,2020-01-02,[1722],"[""lockport-city-school-district""]","[""sn-technologies""]","[""black-students""]","SN Technologies allegedly misled Lockport City Schools about the performance of its AEGIS face and weapons detection systems, downplaying error rates for Black faces and weapon misidentification.",SN Technologies Reportedly Lied to a New York State School District about Its Facial and Weapon Detection Systems’ Performance ObjectId(629f09325fb208d11b8efe2d),215,2020-04-01,[1723],"[""facebook""]","[""facebook""]","[""facebook-content-moderators""]","Content moderators and employees at Facebook demand better working conditions, as automated content moderation system allegedly failed to achieve sufficient performance and exposed human reviewers to psychologically hazardous content such as graphic violence and child abuse.",Facebook Content Moderators Demand Better Working Conditions Due to Allegedly Inadequate AI Content Moderation ObjectId(6286888d158fde27b10d8dc5),192,2022-03-17,"[1616,1617]","[""estee-lauder""]","[""hirevue""]","[""pseudonymous-estee-lauder's-former-staff""]",Three make-up artists lost their positions following an algorithmically-assessed video interview by HireVue who reportedly failed to provide adequate explanation of the findings.,Three Make-Up Artists 
Lost Jobs Following Black-Box Automated Decision by HireVue ObjectId(629f0db656a7be53bbed68fa),217,2016-11-16,"[1725,1726]","[""evolver""]","[""evolver""]","[""fair-visitors""]","At the 18th China Hi-Tech Fair, a robot suddenly smashed through a glass booth and injured a visitor, after a staff member reportedly mistakenly pressed a button, causing it to reverse and accelerate.","Robot at a Chinese Tech Fair Smashed a Glass Booth, Injuring a Visitor" ObjectId(629c5b57fbbeec2d0fb4fc64),208,2021-05-01,"[1683,1684,1685,1686,1687,1759,1760,1761]","[""tesla""]","[""tesla""]","[""tesla-drivers""]","In late 2021, Tesla owners’ complaints to the National Highway Traffic Safety Administration about sudden unexpected automatic braking rapidly increased, coinciding with the removal of radar from its Model 3 and Model Y vehicles.","Tesla Phantom Braking Complaints Surged, Allegedly Linked to Tesla Vision Rollout" ObjectId(629dc73db462c8b2647f965e),212,2021-01-01,"[1711,1712,1713,1714]","[""xpeng-motors""]","[""unknown""]","[""xpeng-motors-customers""]",The Chinese electric vehicle (EV) firm XPeng Motors was fined by local market regulators for illegally collecting in-store customers’ facial images without their consent for six months.,XPeng Motors Fined For Illegal Collection of Consumers’ Faces Using Facial Recognition Cameras ObjectId(628ad5c80d8089cc43894760),193,2013-11-27,[1620],"[""target""]","[""fireeye""]","[""target"",""target-customers""]","Alerts about a Target data breach were ignored by Minneapolis Target staff, reportedly because they were included among many other potentially false alerts and because some of the company’s network infiltration alerting systems had been turned off to reduce such false alerts, resulting in the theft of private data for millions of customers.
","Excessive Automated Monitoring Alerts Ignored by Staff, Resulting in Private Data Theft of Seventy Million Target Customers" ObjectId(628ae659d50929fc8d419df7),196,2013-09-01,"[1633,1634,1635,1636,1637]","[""pakistan-national-database-and-registration-authority""]","[""pakistan-national-database-and-registration-authority""]","[""pakistani-citizens""]","When the leader of the Afghan Taliban was found possessing a valid ID card in the Pakistani national biometric identification database system, Pakistan launch a national re-verification campaign that is linked to numerous changes in recognition status and loss of services.",Compromise of National Biometric ID Card System Leads to Reverification and Change of Status ObjectId(629c4e9f9bed6f7732c7ee3f),207,2021-01-10,"[1679,1680,1681,1682]","[""honolulu-police-department""]","[""boston-dynamics""]","[""honolulu-homeless-people""]",Honolulu Police Department spent federal pandemic relief funds on a robot dog to take body temperatures and patrol a homeless quarantine encampment which local civil rights advocates criticized as dehumanizing.,Hawaii Police Deployed Robot Dog to Patrol a Homeless Encampment ObjectId(629dbfd2927145fff913f831),211,2021-12-11,"[1696,1697,1698,1699,1700,1701,1702]","[""taxis-g7""]","[""tesla""]","[""pedestrians""]","In Paris, about 20 people were injured in an accident involving a Tesla Model 3 taxi cab which was reportedly caused by a sudden unintended acceleration (SUA) episode and braking issues.",A Tesla Taxi Cab Involved in an Accident in Paris with Twenty Injuries ObjectId(6285d69123ec6cb0db5c574a),187,2022-02-04,"[1596,1597,1598]","[""ai-addict""]","[""tesla""]","[""john-bernal"",""san-jose-public""]","A YouTuber who was a Tesla’s employee conducted an on-road review of Tesla's Full Self Driving (FSD) Beta, showing its navigation in various road environments in San Jose and collision with a bollards during Autopilot, allegedly causing his dismissal from the company.","YouTuber Tested Tesla on Self Driving Mode, Colliding with Street Pylons" ObjectId(6285caf4c4ac527644be031f),185,2022-03-01,"[1586,1587,1588,1589]","[""tiktok""]","[""tiktok""]","[""tiktok-users"",""tiktok-new-users""]"," An investigation by NewsGuard into TikTok’s handling of content related to the Russia-Ukraine war showed its “For You” algorithm pushing new users towards false and misleading content about the war within less than an hour of signing up.","TikTok's ""For You"" Algorithm Directed New Users towards Disinformation about the War in Ukraine" ObjectId(629874f4257531f2d69d1030),205,2022-02-25,"[1666,1667,1668,1669]","[""individuals-in-the-donbass-region"",""individuals-in-russia"",""media-organizations-in-crimea""]","[""unknown""]","[""ukrainian-social-media-users""]","According to security reports by Meta, fictitious personas with GAN-generated profile pictures were used by people operating in Russia and Ukraine to push a disinformation campaign targeting Ukrainian social media users, and were taken down.",AI-Generated Profiles Used in Disinformation Campaign Targeting Ukrainians ObjectId(629f165a47b12f3b70c05fa4),218,2020-06-01,"[1727,1728,1950]","[""tesla""]","[""tesla""]","[""delivery-truck"",""pedestrians"",""tesla-drivers""]","On a highway in Taiwan, a Tesla Sedan, reportedly operating on Autopilot mode, crashed into a large overturned truck, barely missing a pedestrian.",Tesla on Autopilot Crashed into Flipped Truck on Taiwan Highway 
ObjectId(628aeecad50929fc8d43513b),197,2021-10-01,"[1638,1639,1640,1641]","[""facebook""]","[""facebook""]","[""facebook-users""]","Facebook's internal report showed an alleged software bug, lasting at least six months, that caused moderator-flagged posts and other harmful content to evade down-ranking filters, leading to surges of misinformation on users' News Feed.","Facebook Internally Reported Failure of Ranking Algorithm, Exposing Harmful Content to Viewers over Months" ObjectId(628af5401212a93e232c56d3),199,2019-04-01,"[1647,1648,1649,1650,1651,2024,2031]","[""ever-ai""]","[""ever-ai""]","[""ever-ai-users""]","Ever AI, now Paravision AI, allegedly failed to inform customers about the development and use of facial recognition that facilitates the sale of customers’ data to various businesses, a business model that critics said was an egregious violation of privacy.",Ever AI Reportedly Deceived Customers about FRT Use in App ObjectId(629db470fee4758bdce67001),210,2020-04-28,"[1693,1694,1695]","[""bharatiya-janata-yuva-morcha""]","[""persistent-systems""]","[""indian-voters"",""indian-social-media-users"",""indian-women-journalists""]","The Indian political social media app Tek Fog allegedly allowed operatives affiliated with the ruling political party to hijack social media trends and manipulate public opinion on other apps such as Twitter and WhatsApp, which opposition parties denounced as a national security threat.",Indian Political App Tek Fog Allegedly Hijacked Trends and Manipulated Public Opinion on Other Social Media Platforms ObjectId(62a7205b15d14c6d6ceba1a5),235,2016-04-15,[1756],"[""ping-an""]","[""ping-an""]","[""ping-an-customers"",""chinese-minority-groups""]","Customers’ untrustworthiness and unprofitability were reportedly determined by Ping An, a large insurance company in China, via facial-recognition measurements of micro-expressions and body-mass indices (BMI), which critics argued was likely to make mistakes, discriminate against certain ethnic groups, and undermine its own industry.","Chinese Insurer Ping An Employed Facial Recognition to Determine Customers’ Untrustworthiness, Which Critics Alleged to Likely Make Errors and Discriminate" ObjectId(62a6c3e11cea23c4caf38bc5),227,2018-01-12,"[1744,1973,1974]","[""waze""]","[""waze""]","[""tourists"",""waze-users""]","Tourists driving through Vermont blamed Waze for directing them onto a boat launch at Lake Champlain; their vehicle slid into the water before the drivers realized their location in the dark and foggy weather.","Waze App Allegedly Caused Tourists’ Car to End up in Lake Champlain, Vermont" ObjectId(62a19d1f6a8a811a6084ea37),222,2020-07-18,[1735],"[""satria-technologies""]","[""openai""]","[""thoughts-users"",""twitter-users""]","Tweets created by Thoughts, a tweet generation app that leverages OpenAI’s GPT-3, allegedly exhibited toxicity when given prompts related to minority groups.",Thoughts App Allegedly Created Toxic Tweets ObjectId(62a1a65fae26c04e23c48fcd),223,2019-10-09,[1737],"[""hive-box""]","[""hive-box""]","[""hive-box-customers""]","Facial-recognition locks by Hive Box, an express delivery locker company in China, were easily opened by a group of fourth-graders in a science-club demo using only a printed photo of the intended recipient’s face, leaving contents vulnerable to theft.",Hive Box Facial-Recognition Locks Hacked by Fourth Graders Using Intended Recipient’s Facial Photo
ObjectId(62a6d3542cccb6726ae25091),229,2018-04-23,"[1746,1747]","[""youtube""]","[""youtube""]","[""youtube-users"",""youtube-content-creators""]",YouTube’s thumbnail monitoring system was allegedly evaded by content farms such as ones in Cambodia that spiked viewership and generated ad revenue using bestiality-themed thumbnails.,Content Using Bestiality Thumbnails Allegedly Evaded YouTube’s Thumbnail Monitoring System ObjectId(62a1ab6756a7be53bb9096a4),224,2020-07-01,[1738],"[""wechat-pay""]","[""wechat""]","[""wechat-pay-users""]","In China, fraudsters bypassed facial-recognition security for online financial transactions on WeChat Pay by crafting identity-verification GIFs of victims from their selfies on WeChat Moments, a social media platform.",WeChat Pay's Facial Recognition Security Evaded by Scammers Using Victims’ Social Media Content ObjectId(62a98ef2cfb6a09201e5595e),239,2009-09-01,[1764],"[""intensive-partnerships-for-effective-teaching""]","[""intensive-partnerships-for-effective-teaching""]","[""students"",""low-income-minority-students"",""teachers""]","Gates-Foundation-funded Intensive Partnerships for Effective Teaching Initiative’s algorithmic program to assess teacher performance reportedly failed to achieve its goals for student outcomes, particularly for minority students, and was criticized for potentially causing harm against teachers.",Algorithmic Teacher Evaluation Program Failed Student Outcome Goals and Allegedly Caused Harm Against Teachers ObjectId(62a1b2ba9b0df3a9e564faf2),225,2017-04-07,"[1739,1740]","[""jupiter-hospital"",""memorial-sloan-kettering""]","[""ibm-watson-health""]","[""oncologists"",""cancer-patients""]",Internal documents from IBM Watson Health showed negative assessments from customers such as Florida’s Jupiter Hospital and Memorial Sloan Kettering criticizing its Watson for Oncology product for allegedly unsafe and incorrect cancer treatment recommendations.,IBM Watson for Oncology Criticized by Customers for Allegedly Unsafe and Inaccurate Cancer Treatment Recommendations ObjectId(62a70b1e97c1945c062019f1),232,2018-04-29,"[1751,1752,1975,2018]","[""tesla""]","[""tesla""]","[""yoshihiro-umeda"",""pedestrians"",""tesla-drivers""]","A Tesla Model X operated on Autopilot reportedly failed to recognize the parked motorcycles, pedestrians, and van in its path in Kanagawa, Japan, and ran over a motorcyclist who had previously stopped after a member of his motorcycle group was involved in an accident.","Tesla Model X on Autopilot Missed Parked Vehicles and Pedestrians, Killing Motorcyclist in Japan" ObjectId(62de08794ad8b68d9e3be1ad),242,2021-02-24,[1775],"[""chakan-plant-of-automotive-stampings-and-assemblies""]","[""unknown""]","[""umesh-ramesh-dhake""]",A sensor snag resulted in an automotive parts factory robot falling on a factory worker in India.,Manufacturing Robot Failure Caused Factory Worker's Death in India ObjectId(62df75b523ef2c676c07d179),244,2020-08-03,[1783],"[""aurora-police-department""]","[""unknown""]","[""the-gilliam-family""]","An automated plate reader reportedly matched the license plate of a family’s minivan to that of a motorcycle in Montana that had allegedly been stolen earlier in the year, resulting in the family and their children being held at gunpoint and detained in handcuffs by multiple Aurora police officers.","Colorado Police’s Automated License Plate Reader (ALPR) Matched a Family’s Minivan’s Plate to That of a Stolen Vehicle Allegedly, Resulting in Detainment at Gunpoint"
ObjectId(62a70f6215d14c6d6ce9e579),233,2018-12-03,[1753],"[""tumblr""]","[""tumblr""]","[""tumblr-content-creators"",""tumblr-users""]","Tumblr’s automated tools to identify adult content were reported to have incorrectly flagged inoffensive images as explicit, following its announcement that it would ban all adult content on the platform.",Tumblr Automated Pornography-Detecting Algorithms Erroneously Flagged Inoffensive Images as Explicit ObjectId(62a6d8341cea23c4caf855ce),230,2019-03-01,[1748],"[""tesla""]","[""tesla""]","[""jeremy-beren-banner"",""tesla-users""]","In Florida, a Model 3 Tesla on Autopilot mode crashed into a tractor-trailer truck, killing the 50-year-old driver.","Model 3 Tesla on Autopilot Crashed into a Truck in Florida, Killing Driver" ObjectId(62df76e9a6f43c979e242859),246,2014-04-16,"[1785,1787]","[""prairie-village-police-department""]","[""unknown""]","[""mark-molner""]","An automated license plate reader (ALPR) camera misread a 7 as a 2 and incorrectly alerted the local police about a stolen Oldsmobile car, which was allegedly not able to be verified by an officer before a traffic stop was effected on a BMW in a Kansas City suburb.","Misreading of an Automated License Plate Reader (ALPR) Unverified by Police, Resulting in Traffic Stop in Missouri" ObjectId(62a6c8ed1b7b69ce98ebe04c),228,2019-02-01,[1745],"[""apple""]","[""apple""]","[""tourists"",""apple-maps-users""]","Near Los Angeles, Apple Maps allegedly directed a couple on a ski trip in the mountains onto an unconventional route out of town, where the drivers found themselves lost and stuck on an unpaved road in the snow.",Apple Maps Allegedly Directed Ski Trip Couple Onto Unpaved Road in the Mountains ObjectId(62a6dccdf24b23be1794e864),231,2016-01-20,"[1749,1750,208,1945]","[""tesla""]","[""tesla""]","[""gao-yaning"",""tesla-drivers""]","A Tesla Model S collided with and killed a road sweeper on a highway near Handan, China, an accident where Tesla previously said it was not able to determine whether Autopilot was operating at the time of the crash.",A Tesla Crashed into and Killed a Road Sweeper on a Highway in China ObjectId(62df765656418e2a5bdd7c06),245,2009-03-30,[1784],"[""san-francisco-police-department""]","[""unknown""]","[""denise-green""]","In San Francisco, an automated license plate reader (ALPR) camera misread a plate number as belonging to a stolen vehicle of a different make, but its photo was not visually confirmed by the police due to poor quality, allegedly despite multiple chances prior to making a traffic stop, causing an innocent person to be pulled over at gunpoint and restrained in handcuffs.",Unverified Misreading by Automated Plate Reader Led to Traffic Stop and Restraint of an Innocent Person at Gunpoint in California ObjectId(62e0aad0a6f43c979e3e6e1b),248,2018-11-23,"[1789,1790]","[""contra-costa-county-sheriff""]","[""vigilant-solutions""]","[""brian-hofer""]","In Oakland, a previously stolen rental car that was returned but allegedly not updated in the police database was pinged by an automated license plate reader (ALPR) camera, leading to police’s wrongful detainment of an innocent person reportedly using excessive force and improper conduct.","Automated License Plate Camera Notified Police about a Previously Stolen Rental Car that was Returned, Causing an Innocent Person to be Detained at Gunpoint in California" ObjectId(62a725b6a147e7dbd3894c5d),236,2022-04-13,[1757],"[""scammers""]","[""unknown""]","[""email-users""]",GAN faces were allegedly used by scammers alongside a parked
domain and a fake website to impersonate a Boston law firm.,AI-Generated Faces Used by Scammers to Pose as a Law Firm in Boston ObjectId(62a717230927356849c4d5df),234,2019-09-06,"[1754,1755]","[""waze""]","[""waze""]","[""los-gatos-residents""]","The Waze app was blamed by Los Gatos town residents for contributing to high wildfire hazard risk by allegedly routing weekend beach-going drivers through their neighborhoods, effectively choking off their single escape route in the event of a medical emergency or wildfire.","Waze Allegedly Frequently Routed Drivers through the Town of Los Gatos, Blocking Its Single Wildfire Escape Route" ObjectId(62a8e5811afd5f65688b4c58),238,2018-10-01,[1762],"[""oregon-department-of-human-services""]","[""oregon-department-of-human-services""]","[""children-of-minority-groups"",""families-of-minority-groups""]","Oregon’s Department of Human Services (DHS) stopped using its Safety at Screening Tool, which aimed to predict the risk that children would wind up in foster care or be investigated in the future, and opted for a new process allegedly to reduce disparities and improve racially equitable decision-making.",Oregon’s Screening Tool for Child Abuse Cases Discontinued Following Concerns of Racial Bias ObjectId(62a31418ae26c04e232cbb19),226,2015-04-01,"[1741,1742,1743]","[""waze""]","[""waze""]","[""sherman-oaks-residents"",""waze-users"",""los-angeles-city-government""]","For years, Waze has, in an attempt to cut travel times, allegedly caused more traffic and guided drivers to make unsafe and often un-permitted traffic decisions, which was described by a Los Angeles city council member as a threat to public safety.",Waze Allegedly Clogged Streets and Directed Drivers to Make Unsafe Traffic Decisions ObjectId(62de06966bb8effab3aa069d),241,2022-07-21,"[1772,1773,1774,1776,1781]","[""russian-chess-federation""]","[""unknown""]","[""child-named-christopher""]",A chess robot at a tournament in Russia broke the finger of a child who reached onto the board before the robot had completed its move.,Chess-Playing Robot Broke Child's Finger in Russia ObjectId(62b65a72b7a838d899c3005c),240,2021-06-29,"[1767,1768,1769,1770,2230]","[""github"",""programmers""]","[""github""]","[""intellectual-property-rights-holders""]",Users of GitHub Copilot can produce source code subject to license requirements without attributing and licensing the code to the rights holder.,"GitHub Copilot, Copyright Infringement and Open Source Licensing" ObjectId(62df67d35939a0bbe4e9d758),243,2020-01-01,"[1777,1778]","[""unknown""]","[""unknown""]","[""twitter"",""twitter-users"",""twitter-users-participating-in-covid-19-discussions""]","Bots by anonymous actors were found by researchers to make up roughly half of Twitter accounts participating in COVID-19 discussions, many of which posted tweets about “reopening America”.",Bots Allegedly Made up Roughly Half of Twitter Accounts in Discussions Surrounding COVID-19 Related Issues ObjectId(62ea330f50582f2a6babdf2b),270,2011-04-18,[1851],"[""apple""]","[""apple""]","[""renren"",""buding-movie-tickets"",""yi-xia"",""dangdang"",""chinese-startups"",""chinese-companies""]","Following Apple’s changes in ranking algorithm in its iTunes App Store, apps by allegedly reputable companies and local startups in China experienced significant drops in ranking order.","Apple Tweaked App Store Ranking Algorithms, Allegedly Resulted in Demotion of Local Apps in China"
ObjectId(62ee0aa455716343a47d06a5),271,2022-07-24,"[1852,1861,1862]","[""tesla""]","[""tesla""]","[""landon-embry"",""motorcyclists"",""tesla-drivers""]","A Tesla Model 3 operating on Autopilot mode slammed into the back of a Harley-Davidson motorcycle on an interstate in Utah, throwing the rider from the bike and killing him instantly.",Tesla Model 3 Sedan on Autopilot Killed Motorcyclist in a Rear-End Collision in Utah ObjectId(62ea2d14e98668f51871cdfa),268,2020-03-16,"[1849,1929]","[""facebook"",""twitter"",""youtube""]","[""facebook"",""twitter"",""youtube""]","[""international-criminal-court-investigators"",""international-court-of-justice-investigators"",""investigative-journalists"",""criminal-investigators"",""victims-of-crimes-documented-on-social-media""]","Automated permanent removal of violating social media content such as terrorism, violent extremism, and hate speech without archival allegedly prevented its potential use to investigate serious crimes and hampered criminal accountability efforts.",Permanent Removal of Social Media Content via Automated Tools Allegedly Prevented Investigative Efforts ObjectId(62e0b35fd1725b4ba7c3444a),251,2018-08-01,"[1794,2384]","[""amazon""]","[""amazon""]","[""small-businesses-on-amazon"",""amazon-customers""]","Amazon tweaked its product-search algorithm to boost and guide customers towards more profitable in-house products instead of showing mainly the most relevant and best-selling listings, which its internal engineers and lawyers alleged violated the company’s best-for-customer principle.",Amazon Allegedly Tweaked Search Algorithm to Boost Its Own Products ObjectId(62f2040555716343a41ffb47),273,2020-12-24,[1856],"[""faceapp""]","[""faceapp""]","[""faceapp-non-binary-presenting-users"",""faceapp-transgender-users"",""faceapp-users""]",FaceApp’s algorithm was reported by a user to have predicted different genders for two mostly identical facial photos with only a slight difference in eyebrow thickness.,FaceApp Predicted Different Genders for Similar User Photos with Slight Variations ObjectId(62e78c54929b426d214e30ed),259,2022-06-03,"[1822,1842]","[""yannic-kilcher""]","[""yannic-kilcher""]","[""internet-social-platform-users""]","A YouTuber built GPT-4chan, a model based on EleutherAI’s GPT-J and trained on posts containing racism, misogyny, and antisemitism collected from 4chan’s “politically incorrect” board, which he made publicly available, and deployed as multiple bots posting thousands of messages on the same 4chan board as a prank.","YouTuber Built, Made Publicly Available, and Released Model Trained on Toxic 4chan Posts as Prank" ObjectId(62e51c32981a526a00e7e1b2),255,2020-05-31,"[1805,1806,1807,1808,1809,1431,1811,1812,1813]","[""chicago-police-department""]","[""shotspotter""]","[""michael-williams""]","ShotSpotter audio was previously admitted as evidence against an innocent Black man in a murder case in Chicago, resulting in his nearly year-long detention before the case was dismissed by prosecutors due to insufficient evidence.",Unreliable ShotSpotter Audio Previously Used to Convict Chicago Man in Murder Case ObjectId(62ea182550582f2a6ba7f130),266,2022-01-15,"[1844,2532,2533,2534,2535,2536,2537,2538]","[""replika""]","[""replika""]","[""replika-users"",""replika-male-users"",""replika""]","Replika's AI-powered ""digital companions"" were allegedly abused by their users, who posted on Reddit about abusive behaviors and interactions such as using slurs, roleplaying violent acts, and simulating sexual abuse.","Replika's ""AI Companions"" Reportedly Abused by Its
Users" ObjectId(62e0c365a6f43c979e413e86),253,2022-05-18,"[1796,1797,1798,1799]","[""cruise""]","[""cruise""]","[""san-francisco-traffic-participants"",""san-francisco-public""]","Cruise’s autonomous vehicles were shown on video stopping in the middle of the road and causing blockages in San Francisco, as they were disabled allegedly due to lost connection to their company’s server.","Cruise's Self-Driving Cars Allegedly Lost Connection to Their Server, Causing Traffic Blockages in San Francisco" ObjectId(62e0ad4dd1725b4ba7c29784),249,2016-10-01,"[1791,1792]","[""chinese-government""]","[""chinese-government""]","[""uyghur-people"",""turkic-muslim-ethnic-groups""]",A suite of AI-powered digital surveillance systems involving facial recognition and analysis of biometric data were deployed by the Chinese government in Xinjiang to monitor and discriminate local Uyghur and other Turkic Muslims.,Government Deployed Extreme Surveillance Technologies to Monitor and Target Muslim Minorities in Xinjiang ObjectId(62e0cfd6a6f43c979e428116),254,2015-05-01,"[1804,2069]","[""google""]","[""google""]","[""google-photos-users-residing-in-illinois"",""google-photos-users"",""illinois-residents""]","A class-action lawsuit alleged Google failing to provide notice, obtain informed written consent, or publish data retention policies about the collection, storage, and analysis of its face-grouping feature in Google Photos, which violated Illinois Biometric Information Privacy Act (BIPA).","Google’s Face Grouping Allegedly Collected and Analyzed Users’ Facial Structure without Consent, Violated BIPA" ObjectId(62e53195eac42ca1004d3eea),256,2021-11-07,[1814],"[""chicago-police-department""]","[""shotspotter""]","[""chicago-drivers""]","A car stop resulting in a DUI arrest of its driver was allegedly based solely on a ShotSpotter alert, the reliability of which came into question by public defenders, who subpoenaed the company to assess its gunshot alert system.",DUI Arrest Case Allegedly Based Only on ShotSpotter's Alert ObjectId(62e0afa523ef2c676c22b9b5),250,2016-02-01,[1793],"[""castricum-municipality""]","[""castricum-municipality""]","[""unnamed-property-owner""]","A home value generated by a black-box algorithm was reportedly defended by the Castricum court, which was criticized by a legal specialist for setting a dangerous precedent for accepting black-box algorithms as long as their results appear reasonable.",Dutch City Court Defended Home Value Generated by Black-Box Algorithm ObjectId(62e6193e981a526a0059d8cf),258,2022-05-13,"[1819,1820]","[""the-good-guys"",""kmart"",""bunnings""]","[""unknown""]","[""the-good-guys-customers"",""kmart-customers"",""bunnings-customers""]","Major Australian retailers reportedly analyzed in-store footage to capture facial features of their customers without consent, which was criticized by consumer groups as creepy and invasive.",Australian Retailers Reportedly Captured Face Prints of Their Customers without Consent ObjectId(62f2041455716343a41ffd1a),274,2003-07-01,"[1857,1859]","[""virginia-courts""]","[""virginia-department-of-criminal-justice-services""]","[""virginia-convicted-felons"",""virginia-black-offenders"",""virginia-young-offenders""]","Virginia courts’ use of algorithmic predictions of future offending risks were found by researchers failing to reduce incarceration rates, showed racial and age disparities in risk scores and its application, and neither exacerbated or ameliorated historical racial differences in sentencing.",Virginia Courts’ Algorithmic 
Recidivism Risk Assessment Failed to Lower Incarceration Rates ObjectId(62ea1f06e98668f5186fdbb5),267,2017-06-15,"[1845,1846,1847,1848,2101,2141,2142,2143,2144,2226]","[""clearview-ai""]","[""clearview-ai""]","[""social-media-users"",""instagram-users"",""facebook-users""]","The face-matching algorithm by Clearview AI was built using images scraped from social media sites such as Instagram and Facebook without user consent, violating social media site policies and, allegedly, privacy regulations.",Clearview AI Algorithm Built on Photos Scraped from Social Media Profiles without Consent ObjectId(62e6066deac42ca100acf90b),257,2012-05-04,"[1815,1435,1821,2250]","[""kansas-city-police-department"",""cleveland-division-of-police"",""chicago-police-department"",""atlanta-police-department""]","[""shotspotter""]","[""neighborhoods-of-color"",""brown-communities"",""black-communities"",""adam-toledo""]","Police departments disproportionately placed ShotSpotter sensors in black and brown neighborhoods, which was denounced by communities for allegedly creating dangerous situations, such as the one involving Adam Toledo's death.",Police Reportedly Deployed ShotSpotter Sensors Disproportionately in Neighborhoods of Color ObjectId(62e8aafa2db96a8ab9f74396),264,2022-03-01,[1839],"[""speedcam-anywhere""]","[""speedcam-anywhere""]","[""uk-drivers""]","Speedcam Anywhere, an app allowing users to document and report traffic violations via AI-based videographic speed estimation of a vehicle, raised concerns for UK drivers about its capabilities for surveillance and abuse.",AI-Based Vehicle Speed Estimation App Denounced by UK Drivers as Surveillance Technology ObjectId(62e0babc23ef2c676c23eca3),252,2022-06-01,[1795],"[""none""]","[""axon-enterprise""]","[""us-schools"",""us-students""]","Axon Enterprise considered development of remotely operated drones capable of tasering a target a short distance away as a defense mechanism for mass shootings, despite its internal AI ethics board’s previous objection and condemnation of the idea as dangerous and fantastical.",Remotely Operated Taser-Armed Drones Proposed by Taser Manufacturer as Defense for School Shootings in the US ObjectId(62f1fd1da076fc957e0b0b26),272,2019-10-08,"[1853,1854,1855]","[""grab""]","[""grab""]","[""non-tpi-registered-grab-drivers"",""grab-drivers-in-indonesia"",""grab-drivers""]","Grab Indonesia was fined by the Indonesian Competition Commission (KPPU) for unfairly favoring drivers who rented cars via the Grab-affiliated company Teknologi Pengangkutan Indonesia (TPI), including offering more rides via their matchmaking algorithm.","Grab Tweaked Matchmaking Algorithm, Providing Preferential Treatment to Drivers Registered with Affiliated Car Rental Service" ObjectId(62e7c7750a8b81000ba6c913),260,2014-08-26,"[1823,1831]","[""us-department-of-homeland-security"",""us-citizenship-and-immigration-services""]","[""us-citizenship-and-immigration-services""]","[""us-naturalized-citizens"",""us-immigrants"",""us-citizenship-applicants"",""us-immigration-applicants""]","US Citizenship and Immigration Services (USCIS)’s ATLAS software used in vetting immigration requests was condemned by advocacy groups as a threat to naturalized citizens for its secretive algorithmic decision-making, reliance on poor-quality data and unknown sources, and alleged discrimination against immigrants using biometric and sensitive information.",US DHS’s Opaque Vetting Software Allegedly Relied on Poor-Quality Data and Discriminated against Immigrants
ObjectId(62e7cd5d138e1db3a1511a0a),261,2017-11-15,"[1824,1825,1826,1827,1828,1829,1830,1832]","[""society-for-the-prevention-of-cruelty-to-animals""]","[""knightscope""]","[""san-francisco-homeless-people""]","Society for the Prevention of Cruelty to Animals (SPCA) deployed a Knightscope robot to autonomously patrol the area outside its office and ward off homeless people, which was criticized by residents as a tool of intimidation and ordered by the city of San Francisco to stop its use on a public right-of-way.","Robot Deployed by Animal Shelter to Patrol Sidewalks outside Its Office, Warding off Homeless People in San Francisco" ObjectId(62e7dd265f2757eae1db9659),262,2022-06-11,"[1833,1834,1835,1836]","[""boris-dayma""]","[""boris-dayma"",""suraj-patil"",""pedro-cuenca"",""khalid-saifullah"",""tanishq-abraham"",""phuc-le-khac"",""luke-melas"",""ritobrata-ghosh""]","[""minority-groups"",""underrepresented-groups""]",Publicly deployed open-source model DALL-E Mini was acknowledged by its developers and found by its users to have produced images which reinforced racial and gender biases.,DALL-E Mini Reportedly Reinforced or Exacerbated Societal Biases in Its Outputs as Gender and Racial Stereotypes ObjectId(62e89ea3088b12099cc26a44),263,2015-09-01,[1838],"[""youtube""]","[""youtube""]","[""youtube-young-male-users"",""youtube-male-users"",""caleb-cain""]","YouTube’s personalization and recommendation algorithms were alleged to have pushed and exposed its young male users to political extremism and misinformation, driving them towards far-right ideologies such as neo-Nazism and white supremacy.",YouTube Recommendations Implicated in Political Radicalization of User ObjectId(62e8b8575890dc007562661a),265,2021-04-01,"[1840,1841]","[""uber-eats""]","[""uber-eats""]","[""pa-edrissa-manjang"",""uber-eats-black-delivery-drivers""]","A lawsuit by a former Uber Eats delivery driver alleged the company to have wrongfully dismissed him due to frequent false mismatches of his verification selfies, and discriminated against him via excessive verification checks.",Black Uber Eats Driver Allegedly Subjected to Excessive Photo Checks and Dismissed via FRT Results ObjectId(62f3498bfa57b6f30ec2f015),278,2022-08-07,"[1866,1867,1868]","[""meta""]","[""meta""]","[""jewish-people"",""blenderbot-3-users""]",The publicly launched conversational AI demo BlenderBot 3 developed by Meta was reported by its users and acknowledged by its developers to have “occasionally” made offensive and inconsistent remarks such as invoking Jewish stereotypes.,Meta’s BlenderBot 3 Chatbot Demo Made Offensive Antisemitic Comments ObjectId(62f2041c55716343a41ffe03),275,2020-06-11,"[1858,1860]","[""facebook""]","[""facebook""]","[""facebook-users-sharing-photo-evidence-of-slavery"",""facebook-users""]",Facebook’s automated content moderation was acknowledged by a company spokesperson to have erroneously censored and banned Australian users from posting an article containing a 1890s photo of Aboriginal men in chains over nudity as historical evidence of slavery in Australia.,Facebook’s Moderation Algorithm Banned Users for Historical Evidence of Slavery ObjectId(62f490d40658670483cb1691),285,2022-07-18,[1888],"[""google""]","[""google""]","[""google-lens-users""]",A book title by Korea’s first minister of culture was mistranslated into an offensive phrase by Google Lens’s camera-based translation feature allegedly due to its training on internet communications and a lack of context.,Google Lens’s Camera-Based Translation Feature Provided 
an Offensive Mistranslation of a Book Title in Korean ObjectId(631975522a90260e9e4f5fc2),330,2016-12-15,[2017],"[""amazon""]","[""amazon""]","[""amazon-users""]",Amazon’s “Amazon’s Choice” algorithm recommended poor-quality and defective products and was reportedly susceptible to manipulation by inauthentic reviews.,“Amazon’s Choice” Algorithm Failed to Recommend Functional Products and Prone to Review Manipulation ObjectId(63033a8281052814ccec9f7b),299,2020-12-15,[1935],"[""masayuki-nakamoto""]","[""unknown""]","[""japanese-pornographic-actors""]","A man allegedly unblurred, using deepfake technology, pixelated pornographic images and videos of pornographic actors, which violated Japan’s obscenity law requiring images of genitalia to be obscured.",Japanese Porn Depixelated by Man using Deepfake ObjectId(630f23807a8f2c2b4eece314),323,2018-05-29,"[1992,1193,2006,2514]","[""tesla""]","[""tesla""]","[""laguna-beach-police-department""]","A Tesla sedan on Autopilot mode collided with a parked Laguna Beach Police Department car, resulting in minor injuries for its driver in Laguna Beach, California.",Tesla on Autopilot Crashed into Parked Police Car in California ObjectId(62fa0340d2713a7e8de5b15c),293,2022-06-03,"[1907,1908,1909,1910,1996,1997,2016]","[""cruise""]","[""cruise""]","[""cruise-passengers"",""toyota-prius-passengers""]","A Cruise autonomous vehicle was involved in a crash at an intersection in San Francisco when making a left turn in front of a Toyota Prius traveling in the opposite direction, which caused occupants in both cars to sustain injuries.",Cruise’s Self-Driving Car Involved in a Multiple-Injury Collision at a San Francisco Intersection ObjectId(631712bba7aa86620c9a0f2f),325,2017-09-21,"[2004,2005]","[""facebook""]","[""facebook""]","[""olivia-solon"",""olivia-solon's-facebook-connections""]",An Instagram user’s image containing violent content was reportedly used as an advertisement on Facebook allegedly via automated means.,Offensive Instagram User Content Displayed as Facebook Ad ObjectId(631842c8d84017ad42c8e764),328,2020-06-13,"[2014,2032]","[""spamouflage-dragon""]","[""unknown""]","[""facebook-users"",""twitter-users"",""youtube-users""]","A pro-China propaganda campaign deployed fake accounts on Facebook, Twitter, and YouTube using GAN-synthesized faces to share and post comments on its content to gain wider circulation.",Fake Accounts Using GAN Faces Deployed by Propaganda Campaign on Social Platforms ObjectId(633d477a7d6871136596b7b5),347,2021-05-06,"[2060,2098,2099]","[""waymo""]","[""waymo""]","[""waymo-passengers""]","A Waymo self-driving taxi was shown on video stranded on a road in Arizona while carrying a passenger, suddenly drove away from the company's roadside assistance worker, and ended up being stuck farther down the road.","Waymo Self-Driving Taxi Behaved Unexpectedly, Driving away from Support Crew" ObjectId(6342883afb9dbe61e43fc839),350,2022-09-13,"[2067,2094]","[""serve-robotics""]","[""serve-robotics""]","[""police-investigators""]",A Serve Robotics delivery robot was shown on video rolling through a crime scene blocked off by police tape.,Delivery Robot Rolled Through Crime Scene ObjectId(62f9f0c20127873b4a6fef3f),291,2021-05-28,"[1901,1902,1903,2242,2590,2604]","[""tesla""]","[""tesla""]","[""california-department-of-motor-vehicles"",""tesla-customers"",""california-residents""]","California’s Department of Motor Vehicles (DMV) accused Tesla of false advertising in its promotion of Autopilot and Full Self-Driving (FSD) technologies, alleging the
company to have made untrue or misleading claims with marketing language about the capabilities of its products.",Tesla Allegedly Misled Customers about Autopilot and FSD Capabilities ObjectId(62fa0c330127873b4a73e660),294,2018-05-26,"[1911,1912,1913,1914]","[""tesla""]","[""tesla""]","[""you-you-xue"",""tesla-drivers""]","Autopilot was alleged by its Tesla Model 3 driver to have unexpectedly malfunctioned, veering right without warning and crashing into a road divider near Thessaloniki, Greece, which resulted in damages to its wheel and door but no injury to the driver.",Tesla Autopilot Allegedly Malfunctioned in a Non-Fatal Collision in Greece ObjectId(62f36a72c17fe69fd2162681),280,2013-07-30,"[1872,1873]","[""coffee-meets-bagel""]","[""coffee-meets-bagel""]","[""coffee-meets-bagel-users-having-no-ethnicity-preference"",""coffee-meets-bagel-users""]","Users selecting “no preference” were shown by Coffee Meets Bagels’s matching algorithm more potential matches with the same ethnicity, which was acknowledged and justified by its founder as a means to maximize connection rate without sufficient user information.",Coffee Meets Bagel’s Algorithm Reported by Users Disproportionately Showing Them Matches of Their Own Ethnicities Despite Selecting “No Preference” ObjectId(62f4bc3a77f5af9ce4624221),287,2020-10-27,[2471],"[""none""]","[""openai"",""nabla""]","[""nabla-customers""]","The French digital care company, Nabla, in researching GPT-3’s capabilities for medical documentation, diagnosis support, and treatment recommendation, found its inconsistency and lack of scientific and medical expertise unviable and risky in healthcare applications. This incident has been downgraded to an issue as it does not meet current ingestion criteria.",OpenAI’s GPT-3 Reported as Unviable in Medical Tasks by Healthcare Firm ObjectId(62fc9ebb7f039040988f789c),295,2018-11-08,"[1915,1916,1917,2102,2103]","[""new-york-police-department""]","[""unknown""]","[""ousmane-bah"",""nyc-black-people"",""nyc-black-young-people""]","New York Police Department (NYPD)’s facial recognition system falsely connected a Black teenager to a series of thefts at Apple stores, which resulted in his wrongful attempted arrest.",Wrongful Attempted Arrest for Apple Store Thefts Due to NYPD’s Facial Misidentification ObjectId(630de2a9c9d2246424b8bc01),320,2018-01-22,"[1985,1986,1987,2007]","[""tesla""]","[""tesla""]","[""tesla-drivers"",""culver-city-fire-department""]","A Tesla Model S operating on Autopilot mode crashed into the back of a parked fire truck on a freeway in Culver City, California in a non-fatal collision.",Tesla on Autopilot Collided with Parked Fire Truck on California Freeway ObjectId(63182b55d84017ad42c5406f),326,2014-12-09,"[2008,2012,2013]","[""facebook""]","[""facebook""]","[""facebook-users-having-posts-about-painful-events"",""facebook-users""]","Facebook’s “Year in Review” algorithm which compiled content in users’ past year as highlights inadvertently showed painful and unwanted memories to users, including death of family member.",Facebook Automated Year-in-Review Highlights Showed Users Painful Memories ObjectId(633412840b988074a09c7ee0),335,2015-03-01,"[2047,2090,2091,2092,2122,2123,2124,2125]","[""uk-visas-and-immigration""]","[""uk-visas-and-immigration"",""uk-home-office""]","[""uk-visa-applicants-from-some-countries""]","UK Home Office's algorithm to assess visa application risks explicitly considered nationality, allegedly caused candidates to face more scrutiny and discrimination.",UK Visa Streamline 
Algorithm Allegedly Discriminated Based on Nationality ObjectId(62f9fe883c16ab9cc78cf737),292,2021-09-01,"[1904,1905,1906]","[""apple""]","[""apple""]","[""silicon-valley-traffic-participants"",""silicon-valley-residents""]",Apple’s autonomous cars were reported to have bumped into curbs and struggled to stay in their lanes after crossing intersections during an on-road test drives near the company’s Silicon Valley headquarters.,Apple’s AVs Reportedly Struggled to Navigate Streets in Silicon Valley Test Drives ObjectId(63033d3581052814cceda4de),300,2022-01-15,"[1936,1937]","[""tiktok""]","[""tiktok""]","[""tiktok-male-teenager-users"",""tiktok-male-users"",""tiktok-teenage-users"",""tiktok-users"",""tiktok""]","TikTok’s “For You” algorithm allegedly boosted or was manipulated by an online personality to artificially boost his content which promotes extreme misogynistic views towards teenagers and men, despite breaking its rules.","TikTok's ""For You"" Algorithm Allegedly Abused by Online Personality to Promote Anti-Women Hate" ObjectId(630dc67fb5b628f76fd964bc),318,2021-01-13,"[1977,1978]","[""facebook""]","[""facebook""]","[""facebook-users""]","Facebook’s algorithmic recommendations reportedly continued showing advertisements for gun accessories and military gear, despite Facebook’s halt on weapons accessories ads following the US Capitol attack.",Facebook Recommended Military Gear Ads Despite Pause on Weapons Accessories Ads ObjectId(630dd3b7f5504b7e75aad64f),319,2019-12-29,"[1981,1982,1983,1984,1993]","[""tesla""]","[""tesla""]","[""derrick-monet"",""jenna-monet"",""the-monets'-family""]","A Tesla on Autopilot mode failed to see a parked fire truck and crashed into its rear on an interstate in Indiana, causing the death of an Arizona woman.",Tesla on Autopilot Fatally Crashed into Parked Fire Truck in Indiana ObjectId(632056a45857ae71d0616e65),332,2016-04-05,"[2033,2040,2041,2044]","[""google""]","[""google""]","[""black-women"",""black-people"",""google-users""]","Google Image search reportedly showed disparate results along racial lines, featuring almost exclusively white women for “professional hairstyles” and black women for “unprofessional hairstyles” prompts.",Google Image Showed Racially Biased Results for “Professional” Hairstyles ObjectId(633a8fd3c70e5740bfbf5e4a),340,2017-02-01,[2053],"[""honda""]","[""honda""]","[""honda-customers""]",Honda's Collision Mitigation Braking System (CMBS) allegedly caused accidents to consumers due to frequent instances of false obstacle detection.,Honda's CMBS False Positives Allegedly Caused Accidents to Customers ObjectId(633aaae14178615128e595a2),341,2017-04-06,"[2054,2114,2198,2199,2200]","[""nissan""]","[""nissan""]","[""nissan-drivers"",""traffic-participants""]","Nissan's Automatic Emergency Braking (AEB) feature was reported in a series of complaints for false positives and abrupt braking behaviors, endangering car occupants and traffic participants. 
","Nissan's ""Automatic Emergency Braking"" False Positives Posed Traffic Risks to Drivers" ObjectId(63283d3b5ba952a8677615a3),334,2014-10-01,"[2046,2077]","[""uber""]","[""uber""]","[""local-law-enforcement-officers""]","Uber developed a secret program ""Greyball"" which prevented known law enforcement officers in areas where its service violated regulations from receiving rides.",Uber Deployed Secret Program To Deny Local Authorities Rides ObjectId(63036e545f65af7ded38efea),305,2019-02-01,"[1942,1943]","[""youtube""]","[""youtube""]","[""youtube-users"",""youtube-climate-skeptic-users""]",YouTube’s recommendation system and its focus on views and watched time were alleged by an advocacy group to have driven people towards climate denial and misinformation videos.,YouTube’s Recommendation Algorithm Allegedly Promoted Climate Misinformation Content ObjectId(62fcc26b77f5af9ce4dee4b0),297,2020-02-20,"[1921,1922,1923]","[""smart-columbus""]","[""easymile""]","[""unnamed-woman-passenger""]","A self-driving shuttle deployed by Smart Columbus in Linden neighborhood unexpectedly stopped on the street, which caused a woman to fall onto the floor from her seat.","EasyMile Self-Driving Shuttle Unexpectedly Stopped Mid-Route, Injuring a Passenger" ObjectId(62f3c064867302aca4f382fc),282,2020-10-03,"[1879,1881,1972]","[""facebook""]","[""facebook""]","[""the-seed-company-by-e.w.-gaze"",""businesses-on-facebook""]","Facebook’s content moderation algorithm misidentified and removed a Canadian business’s advertisement containing a photo of onions as products of overtly sexual content, which was later reinstated after review.",Facebook’s Algorithm Mistook an Advertisement of Onions as Sexual Suggestive Content ObjectId(62f24ab4fa57b6f30e8dc738),276,2022-01-01,[1864],"[""bucheon-city-government""]","[""unknown""]","[""bucheon-citizens""]","Bucheon government’s use of facial recognition in analyzing CCTV footage, despite gaining wide public support, was scrutinized by privacy advocates and some lawmakers for collecting data without consent, and retaining and misusing data beyond pandemic needs.","Local South Korean Government’s Use of CCTV Footage Analysis via Facial Recognition to Track COVID Cases Raised Concerns about Privacy, Retention, and Potential Misuse" ObjectId(62f9ee8077f5af9ce45a140a),290,2022-06-03,"[1900,2413,2414]","[""toronto-city-government""]","[""toronto-public-health""]","[""sunnyside-beachgoers"",""marie-curtis-beachgoers"",""toronto-citizens""]","Toronto’s use of AI predictive modeling (AIPM) which had replaced existing methodology as the only determiner of beach water quality raised concerns about its accuracy, after allegedly conflicting results were found by a local water advocacy group using traditional means.",False Negatives for Water Quality-Associated Beach Closures ObjectId(62fcb32d4a3f91af3d48436f),296,2016-02-10,"[1918,1919,1920]","[""twitter""]","[""twitter""]","[""twitter-left-leaning-politicians"",""twitter-left-leaning-news-organizations"",""twitter-left-leaning-users"",""twitter-users""]",Twitter’s “Home” timeline algorithm was revealed by its internal researchers to have amplified tweets and news of rightwing politicians and organizations more than leftwing ones in six out of seven studied countries.,Twitter Recommender System Amplified Right-Leaning Tweets ObjectId(630c8eb443fe03f46cc8bc7f),313,2022-08-25,[1967],"[""meta""]","[""meta""]","[""marietje-schaake""]","Meta’s conversational AI BlenderBot 3, when prompted “who is a terrorist,“ responded with an incumbent 
Dutch politician’s name, who was confused by the association.",BlenderBot 3 Cited Dutch Politician as a Terrorist ObjectId(6321568425ffe34eb014af4a),333,2021-03-17,"[2045,2135,2136,2137,2146,2148]","[""tesla""]","[""tesla""]","[""unnamed-22-year-old-male-driver"",""tesla-drivers""]","A Tesla Model Y on Autopilot collided with a parked Michigan State Police (MSP) car which had its emergency lights on, in Eaton County, Michigan, although no one was injured.",Tesla on Autopilot Crashed into Parked Michigan Police Car on Interstate ObjectId(630367c881052814ccfd26f3),304,2021-11-03,[1941],"[""tesla""]","[""tesla""]","[""unnamed-tesla-driver"",""tesla-drivers""]","A Tesla Model Y in Full Self-Driving (FSD) mode drove into the wrong lane after making a left turn despite its driver allegedly attempting to take over the driving, resulting in a non-fatal collision with another vehicle in the wrong lane in Brea, California.",Tesla on FSD Reportedly Drove into the Wrong Lane in California ObjectId(63204d8bc912bf8020e381f8),331,2020-08-05,"[2022,2023]","[""instagram""]","[""instagram""]","[""instagram-users""]","A bug was reported by Instagram’s spokesperson to have prevented an algorithm from populating related hashtags for thousands of hashtags, resulting in alleged preferential treatment for some politically partisan hashtags.",Bug in Instagram’s “Related Hashtags” Algorithm Allegedly Caused Disproportionate Treatment of Political Hashtags ObjectId(63342b4609b0dac2f0bc4198),337,2021-04-17,"[2049,2071,2072,2112,2115,2116,2117,2118,2237]","[""tesla""]","[""tesla""]","[""william-varner"",""unnamed-passenger""]","A 2019 Tesla Model S was reportedly traveling on Adaptive Cruise Control (ACC) at high speed before crashing into a tree near The Woodlands in Spring, Texas, killing two people.","Tesla Model S on ACC Crashed into Tree in Texas, Killing Two People" ObjectId(633d286d399f7471b5c10035),346,2016-06-15,"[2059,2062,2108,2109,2110]","[""henn-na-hotel""]","[""unknown""]","[""henn-na-hotel-guests"",""henn-na-hotal-staff""]",A number of robots employed by a hotel in Japan were reported by guests in a series of complaints for failing to handle tasks such as answering scheduling questions or making passport copies without human intervention.,Robots in Japanese Hotel Annoyed Guests and Failed to Handle Simple Tasks ObjectId(62f3c2b9a076fc957e6080f8),283,2018-07-02,[1880],"[""facebook""]","[""facebook""]","[""the-vindicator""]",Facebook’s content moderation algorithm was acknowledged by the company to have flagged excerpts of the Declaration of Independence posted by a small newspaper in Texas as hate speech by mistake.,Facebook’s Automated Content Moderation Tool Flagged a Post Containing Parts of the Declaration of Independence as Hate Speech by Mistake ObjectId(62f4ae6e0658670483d4835d),286,2021-02-26,"[1889,2052,2381]","[""tiktok""]","[""tiktok""]","[""lalani-erika-renee-walton"",""arriani-jaileen-arroyo"",""lalani-erika-renee-walton's-family"",""arriani-jaileen-arroyo's-family"",""tiktok-young-users"",""tiktok-users""]","TikTok’s recommendation algorithm was alleged in a lawsuit to have intentionally and repeatedly pushed videos of the “blackout” challenge onto children’s feeds, incentivizing their participation, which ultimately resulted in the death of two young girls.","TikTok’s ""For You"" Allegedly Pushed Fatal “Blackout” Challenge Videos to Two Young Girls" ObjectId(630498489a95e74856267248),307,2017-11-01,[1947],"[""apple""]","[""apple""]","[""iphone-face-id-users"",""iphone-x-face-id-users""]",The
Face ID feature on iPhone allowing users to unlock their phones via facial recognition was reported by users for not recognizing their faces in the morning.,iPhone Face ID Failed to Recognize Users’ Morning Faces ObjectId(6305dcfaaf7cc5438e28f38c),309,2017-08-26,"[1954,1956,1960,2030]","[""metropolitan-police-service""]","[""unknown""]","[""notting-hill-carnival-goers""]",The facial recognition trial by London’s Metropolitan Police Service at the Notting Hill Carnival reportedly performed poorly with a high rate of false positives.,Facial Recognition Trial Performed Poorly at Notting Hill Carnival ObjectId(631834963d3a94e2438bd339),327,2015-03-24,[2011],"[""facebook""]","[""facebook""]","[""facebook-users-having-posts-about-painful-events"",""facebook-users""]",Facebook’s “On This Day” algorithm which highlighted past posts on a user’s private page or News Feed confronted unwanted and painful personal memories to its users.,Facebook’s On-This-Day Feature Mistakenly Showed Painful Memories to Users ObjectId(633c45d3399f7471b597a077),344,2021-07-01,[2057],"[""myinterview"",""curious-thing""]","[""myinterview"",""curious-thing""]","[""job-candidates-using-myinterview"",""job-candidates-using-curious-thing"",""employers-using-myinterview"",""employers-using-curious-thing""]","Two AI interview softwares provided positive but invalid results such as ""competent"" English proficiency and high match percentage for interview responses given in German by reporters.",Hiring Algorithms Provided Invalid Positive Results for Interview Responses in German ObjectId(6342746c7349da35faffd3ee),348,2020-11-01,"[2064,2075,2096]","[""youtube""]","[""youtube""]","[""youtube-users-skeptical-of-us-election-results""]",YouTube's recommendation algorithm allegedly pushed 2020's US Presidential Election fraud content to users most skeptical of the election's legitimacy disproportionately compared to least skeptical users.,YouTube Recommendation Reportedly Pushed Election Fraud Content to Skeptics Disproportionately ObjectId(634282e6c5ced0d56b1d0012),349,2022-03-22,"[2065,2095]","[""charlotte-mecklenburg-school-district""]","[""evolv-technology""]","[""students-at-charlotte-mecklenburg-schools"",""teachers-at-charlotte-mecklenburg-schools"",""security-officers-at-charlotte-mecklenburg-schools""]","Evolv's AI-based weapons detection system reportedly produced excessive false positives, mistaking everyday school items for weapons and pulling schools' security personnel for manual checking.",Evolv's Gun Detection False Positives Created Problems for Schools ObjectId(62f2570d867302aca4ac4572),277,2022-01-14,[1865],"[""15.ai""]","[""15.ai""]","[""15.ai"",""15.ai-users""]","An AI-synthetic audio sold as an NFT on Voiceverse’s platform was acknowledged by the company for having been created by 15.ai, a free web app specializing in text-to-speech and AI-voice generation, and reused without proper attribution.",Voices Created Using Publicly Available App Stolen and Resold as NFT without Attribution ObjectId(6305f639af7cc5438e301103),311,2020-05-02,"[1961,1962]","[""youtube""]","[""youtube""]","[""women-of-sex-tech-conference-attendants"",""women-of-sex-tech-conference-organizers""]","YouTube’s automated content moderation tool erroneously removed The Women of Sex Tech conference’s live-streamed event and banned the conference from the platform, despite not violating the platform’s sexual content policies.",YouTube Auto-Moderation Mistakenly Banned Women of Sex Tech Conference 
ObjectId(630dbeae9451321cff796216),317,2020-03-17,[1976],"[""facebook""]","[""facebook""]","[""facebook-users-posting-legitimate-covid-19-news"",""facebook-users""]","Facebook was reported by users for blocking posts of legitimate news about the coronavirus pandemic, allegedly due to a bug in an anti-spam system.",Bug in Facebook’s Anti-Spam Filter Allegedly Blocked Legitimate Posts about COVID-19 ObjectId(631704baa7aa86620c9827e8),324,2019-11-12,"[1998,1999,2000,2001,2002,2003]","[""the-bl""]","[""unknown""]","[""instagram-users"",""facebook-users""]","A large network of pages, groups, and fake accounts with GAN-generated face photos associated with The BL, a US-based media outlet, reportedly bypassed Facebook moderation systems to push ""pro-Trump"" narratives on its platform and Instagram.",GAN Faces Deployed by The BL's Fake Account Network to Push Pro-Trump Content on Meta Platforms ObjectId(63035dc822c28e977359610b),303,2022-08-21,"[1940,1944]","[""google""]","[""google""]","[""a-software-engineer-named-mark"",""parents-using-telemedicine-services""]","Google’s automated detection of abusive images of children incorrectly flagged a parent’s photo intended for a healthcare provider, resulting in a false police report of child abuse, and loss of access to his online accounts and information.",Google’s Automated Child Abuse Detection Wrongfully Flagged a Parent’s Naked Photo of His Child ObjectId(63369976589105516e51189b),339,2022-09-15,"[2051,2063,2491,2511,2516,2539,2540,2575,2576,2593,2601,2634,2643,2755]","[""students""]","[""sudowrite"",""openai""]","[""teachers"",""non-cheating-students"",""cheating-students""]","Students were reportedly using open-source text generative models such as GPT-3 and ChatGPT to complete school assignments and exams, such as writing reports and essays.",Open-Source Generative Models Abused by Students to Cheat on Assignments and Exams ObjectId(633c459b6ee03859f96820ab),343,2021-07-11,"[2056,2113]","[""facebook"",""instagram"",""twitter""]","[""facebook"",""instagram"",""twitter""]","[""marcus-rashford"",""jadon-sancho"",""bukayo-saka"",""facebook-users"",""instagram-users"",""twitter-users""]","Facebook's, Instagram's, and Twitter's automated content moderation failed to proactively remove racist remarks and posts directed at Black football players after the finals loss, allegedly relying largely on user reports of harassment.","Facebook, Instagram, and Twitter Failed to Proactively Remove Targeted Racist Remarks via Automated Systems" ObjectId(630350a5c971a26b3b4d6134),302,2021-03-15,[1939],"[""geisel-school-of-medicine""]","[""geisel-school-of-medicine's-technology-staff"",""canvas""]","[""sirey-zhang"",""geisel-school-of-medicine's-students"",""geisel-school-of-medicine's-professors"",""geisel-school-of-medicine's-accused-students""]",Dartmouth's Geisel School of Medicine allegedly falsely accused students of cheating during remote exams using an internally built system that tracked student activity patterns on its learning management platform without their knowledge.,Students Allegedly Wrongfully Accused of Cheating via Medical School's Internal Software ObjectId(6305cb242b1af2bc7c3e34b8),308,2017-07-03,"[1952,1953]","[""boston-dynamics""]","[""boston-dynamics""]","[""none""]","Boston Dynamics’s autonomous robot Atlas allegedly caught its foot on a stage light, resulting in a fall off the stage at the Congress of Future Science and Technology Leaders conference.",Atlas Robot Fell off Stage at Conference
ObjectId(6305e6d7af7cc5438e2ac401),310,2017-06-03,"[1955,1957,1958,1959,2126,2127,2128,2269]","[""south-wales-police""]","[""nec""]","[""finals-attendees"",""falsely-accused-finals-attendees""]",South Wales Police (SWP)’s automated facial recognition (AFR) at the Champions League Final football game in Cardiff wrongly identified innocent people as potential matches at an extremely high false positive rate of more than 90%.,High False Positive Rate by SWP's Facial Recognition Use at Champions League Final ObjectId(630c99e90212a2e7e79de7da),315,2016-04-09,[1970],"[""ntechlab""]","[""ntechlab""]","[""russian-pornographic-actresses"",""russian-sex-workers""]",The facial recognition software FindFace allowing its users to match photos to people’s social media pages on Vkontakte was reportedly abused to de-anonymize and harass Russian women who appeared in pornography as well as alleged sex workers.,Facial Recognition Service Abused to Target Russian Porn Actresses ObjectId(630f057e690071b517189ef4),321,2018-03-23,"[188,190,194,195,199,209,212,1988,1989,1990,1995,200]","[""tesla""]","[""tesla""]","[""walter-huang"",""walter-huang's-family""]","A Tesla Model X P100D operating on Autopilot's Traffic-Aware Cruise Control (TACC) and Autosteer system allegedly accelerated above the speed limit of a highway in Mountain View, California, and steered itself directly into a barrier, resulting in its driver’s death.","Tesla Model X on Autopilot Crashed into California Highway Barrier, Killing Driver" ObjectId(631845f4a7aa86620cb95d56),329,2017-09-18,[2015],"[""amazon""]","[""amazon""]","[""amazon-users""]",Amazon was reported to have shown chemical combinations for producing explosives and incendiary devices as “frequently bought together” items via automated recommendation.,Amazon Recommended Explosive-Producing Ingredients as “Frequently Bought Together” Items for Chemicals ObjectId(62f3d763614ca995dff23c49),284,2018-05-01,"[1882,1883,1884,1885,1886,1887]","[""facebook""]","[""facebook""]","[""museums-on-facebook"",""facebook-users-interested-in-arts"",""facebook-users""]","Facebook’s removal, via both automated and human-moderated means, of posts featuring renowned artworks by many historical artists and their promotional content due to nudity was condemned by critics, such as museums and tourism boards, as cultural censorship and prevention of artwork promotion.",Facebook’s Automated Removal of Content Featuring Nudity-Containing Artworks Denounced as Censorship ObjectId(62ff8d332b190ab329b567f9),298,2021-10-21,[2471],"[""yuen-ler-chow""]","[""yuen-ler-chow""]","[""thefacetag-app-users""]","TheFaceTag app, a social networking app developed and deployed on campus by a student at Harvard, raised concerns surrounding its facial recognition, cybersecurity, privacy, and misuse.
This incident has been downgraded to an issue as it does not meet current ingestion criteria.",Student-Developed Facial Recognition App Raised Ethical Concerns ObjectId(62f35eea867302aca4e2b895),279,2019-07-01,"[1869,1870,2381]","[""tiktok""]","[""tiktok""]","[""tiktok-young-users"",""tiktok-users""]",TikTok’s young users were allegedly exposed to community-guideline-violating pro-eating disorder content on their algorithmically curated “For You” page that serves videos from any user on its platform.,TikTok’s “For You” Algorithm Exposed Young Users to Pro-Eating Disorder Content ObjectId(62f3bb424240948816cde2a4),281,2019-02-04,"[1875,1876,1877]","[""youtube""]","[""youtube""]","[""youtube-young-users"",""youtube-users""]","Terms-of-service-violating videos related to suicide and self-harm reportedly bypassed YouTube’s content moderation algorithms, allegedly resulting in exposure of graphic content to young users via recommended videos.",YouTube's Algorithms Failed to Remove Violating Content Related to Suicide and Self-Harm ObjectId(630c86a4707bde9384fd94ee),312,2021-08-15,"[1963,1964,1965,1966]","[""sanas""]","[""sanas""]","[""call-center-agents-having-non-midwestern-american-accent"",""people-having-non-midwestern-american-accent""]","A startup’s use of AI voice technology to alter or remove accents for call center agents was scrutinized by critics as reaffirming bias, despite the company’s claims to the contrary.",Startup's Accent Translation AI Denounced as Reinforcing Racial Bias ObjectId(630ca22388619542799c19ec),316,2016-06-02,[1971],"[""facebook""]","[""facebook""]","[""facebook-users""]","Facebook’s advertisement-approval algorithm was reported by a security analyst to have neglected simple checks for domain URLs, leaving its users at risk of fraudulent ads.",Facebook Ad-Approval Algorithm Allegedly Missed Fraudulent Ads via Simple URL Checks ObjectId(62f4c81b0658670483dbe51b),288,2019-01-30,"[1895,1896,2025,2026]","[""woodbridge-police-department""]","[""unknown""]","[""nijeer-parks""]","Woodbridge Police Department falsely arrested an innocent Black man following a misidentification by its facial recognition software; he was jailed for more than a week and paid thousands of dollars for his defense.",New Jersey Police Wrongfully Arrested Innocent Black Man via FRT ObjectId(62f4d2aa4a3f91af3dd48640),289,2020-06-15,"[1897,1898]","[""starship-technologies""]","[""starship-technologies""]","[""jisuk-mok"",""frisco-residents""]","A Starship food delivery robot crashed into the front bumper of a vehicle waiting at a stoplight intersection in Frisco, Texas, the video of which the company reportedly refused to release.","Starship Delivery Robot Scuffed Bumper of a Resident’s Car in Texas, Allegedly Refusing to Release Footage of the Accident" ObjectId(63048a1b3359229b334cee96),306,2016-05-26,"[1946,1948,1949]","[""tesla""]","[""tesla""]","[""unnamed-tesla-owner"",""tesla-drivers""]","A Tesla Model S operating on the Traffic-Aware Cruise Control (TACC) feature of Autopilot was shown on video by its driver crashing into a parked van on a European highway in heavy traffic, which damaged the front of the car.",Tesla on Autopilot TACC Crashed into Van on European Highway ObjectId(633c45fd399f7471b597a758),345,2021-04-13,[2058],"[""insurance-companies""]","[""ccc-information-services"",""tractable""]","[""vehicle-repair-shops"",""vehicle-owners""]","Auto-insurance companies' photo-based estimation of repair price was alleged by repair shop owners and industry groups to provide inaccurate estimates, causing
damaged cars to stay in the shop longer.",Auto-Insurance Photo-Based Estimation Allegedly Gave Inaccurate Repair Prices Frequently ObjectId(630349169b0efe36c58855ab),301,2022-02-15,[1938],"[""broward-college""]","[""honorlock""]","[""unnamed-florida-teenager""]",Broward College’s use of remote proctoring system and reliance on its flagging algorithm allegedly led to a wrongful accusation of academic dishonesty in a biology exam of a Florida teenager.,Teenager at Broward College Allegedly Wrongfully Accused of Cheating via Remote Proctoring ObjectId(630c923888619542799a0c42),314,2022-08-17,[1968],"[""stability-ai""]","[""stability-ai"",""runway"",""laion"",""eleutherai"",""compvis-lmu""]","[""stability-ai"",""deepfaked-celebrities""]","Stable Diffusion, an open-source image generation model by Stability AI, was reportedly leaked on 4chan prior to its release date, and was used by its users to generate pornographic deepfakes of celebrities.",Stable Diffusion Abused by 4chan Users to Deepfake Celebrity Porn ObjectId(630f18e003b40739e3f018e6),322,2019-12-07,"[1991,1994]","[""tesla""]","[""tesla""]","[""connecticut-state-police""]","A Tesla Model 3 on Autopilot slammed into a parked car of patrol police officers who stopped to assist a stranded motorist on the interstate in Norwalk, Connecticut.",Tesla Model 3 Crashed into Police Patrol Car on Connecticut Highway ObjectId(633423b68b9212a310a337bc),336,2015-03-01,"[2048,2119,2120,2121]","[""uk-home-office""]","[""uk-home-office""]","[""uk-immigrant-newlyweds""]","UK Home Office's opaque algorithm to detect sham marriages flagged some nationalities for investigation more than others, raising fears surrounding discrimination based on nationality and age.",UK Home Office's Sham Marriage Detection Algorithm Reportedly Flagged Certain Nationalities Disproportionately ObjectId(63428c5563b61b7fa042db22),351,2022-09-13,[2068],"[""@tengazillioiniq""]","[""unknown""]","[""halle-bailey"",""black-actresses""]","A Twitter user reportedly modified using generative AI a short clip of Disney's 2022 version of ""The Little Mermaid,"" replacing a Black actress with a white digital character.","""The Little Mermaid"" Clip Doctored Using Generative AI to Replace Black Actress with White Character" ObjectId(6347c85bf149ac829bebe6ac),364,2020-04-15,[2131],"[""walmart""]","[""everseen""]","[""walmart-employees""]",Walmart's theft-deterring bagging-detection system allegedly exposed workers to health risks during the coronavirus pandemic when its false positives prompted workers to unnecessarily step in to resolve the issue.,Walmart's Bagging-Detection False Positives Exposed Workers to Health Risk ObjectId(6347d11ef149ac829beda3a3),367,2020-06-17,[2150],"[""openai"",""google""]","[""openai"",""google""]","[""gender-minority-groups"",""racial-minority-groups"",""underrepresented-groups-in-training-data""]","Unsupervised image generation models trained using Internet images such as iGPT and SimCLR were shown to have embedded racial, gender, and intersectional biases, resulting in stereotypical depictions.","iGPT, SimCLR Learned Biased Associations from Internet Training Data" ObjectId(63429a302acc51f55c0d97ad),355,2018-07-07,"[2081,2082,2083,2903]","[""uber""]","[""uber""]","[""uber-drivers""]","Uber was alleged in a lawsuit to have wrongfully accused its drivers in the UK and Portugal of fraudulent activity through automated systems, which resulted in their dismissal without a right to appeal.",Uber Allegedly Wrongfully Accused Drivers of Fraud via Automated Systems 
ObjectId(63429b21c5ced0d56b1f9725),356,2020-09-15,"[2084,2085]","[""murat-ayfer""]","[""murat-ayfer"",""openai""]","[""historically-disadvantaged-groups""]",Philosopher AI as built on top of GPT-3 was reported by its users for having strong tendencies to produce offensive results when given prompts on certain topics such as feminism and Ethiopia.,Philosophy AI Tentatively Produced Offensive Results for Certain Prompts ObjectId(6347dbceed247984c90d6b27),368,2016-06-01,"[2151,2152,2153,2154,2155,2156,2157,2159,2160,2161]","[""the-israel-military""]","[""anyvision""]","[""palestinians-residing-in-the-west-bank""]","A controversial surveillance program involving facial recognition and algorithmic recommendation, Blue Wolf, was deployed by the Israeli military to monitor Palestinians in the West Bank.","Facial Recognition Smart Phone App ""Blue Wolf"" Monitored Palestinians in West Bank" ObjectId(634d1f8f7448b116a2eba9cf),370,2017-09-27,[2163],"[""google""]","[""google""]","[""google's-competitor-shopping-services""]","Google was fined by EU Commission for changing its shopping algorithms in Europe to favor its own comparison service over competitors, resulting in anti-competitive effects.",Google Fined for Changing Shopping Algorithms in EU to Favor Own Service ObjectId(635f0120e6a5db6da1d3a483),378,2022-04-06,"[2175,2176]","[""tusimple""]","[""tusimple""]","[""tusimple"",""state-of-arizona""]",A TuSimple autonomous truck operating with backup drivers behind the wheel operated on an outdated command sequence and suddenly veered into the center divide on the interstate freeway.,TuSimple Truck Steered into Interstate Freeway Divide ObjectId(636dffb0411dcebbcc969fd5),391,2022-07-26,"[2244,2246]","[""southern-co-op""]","[""hikvision""]","[""souther-co-op-customers""]","Southern Co-op's use of facial recognition reportedly to curb violent crime in UK supermarkets was alleged by civil society and privacy groups as ""unlawful"" and ""complete"" invasion of privacy.",Facial Recognition Trial by UK Southern Co-op Alleged as Unlawful ObjectId(6356adfd642a3e49ac4c13a9),371,2019-11-29,"[2167,2184,2203]","[""ugandan-government""]","[""huawei""]","[""political-opposition-in-uganda""]","Huawei's AI systems involving facial recognition were reportedly deployed by the Ugandan government to monitor political opposition actors and anti-regime sentiments, which raised fears of surveillance and suppression of individual freedoms.",Uganda Deployed Huawei's Facial Recognition to Monitor Political Opposition and Protests ObjectId(636e0f5e0a00a4f89b1146a3),393,2021-12-08,[2247],"[""facebook""]","[""facebook""]","[""facebook-users-speaking-swahili"",""facebook-users-speaking-english"",""facebook-users""]",Facebook's ad moderation system involving algorithms failed to flag hateful language and violating content such as calls for killings for ads in English and Swahili.,Facebook AI-Supported Moderation for Ads Failed to Detect Violating Content ObjectId(63429c05ad13a1c2fb5b52a1),358,2018-06-01,[2089],"[""cadillac-fairview""]","[""unknown""]","[""chinook-centre-mall-goers"",""market-mall-goers""]","Facial recognition (FRT) was reportedly deployed in some Calgary-area malls to approximate customer age and gender without explicit consent, which a privacy expert warned was a cause for concern.",Calgary Malls Deployed Facial Recognition without Customer Consent 
ObjectId(6347c81eb55a37b65883b48c),363,2021-01-15,[2130],"[""facebook""]","[""facebook""]","[""facebook-users-posting-about-plymouth-hoe"",""facebook-users-in-plymouth-hoe"",""plymouth-hoe-residents""]",Facebook's automated system mistakenly labelled posts featuring the seafaring landmark Plymouth Hoe as misogynistic.,Facebook's Automated Moderation Mistakenly Flagged Landmark's Name as Offensive ObjectId(634d1c221ad286865cf8d200),369,2022-08-29,[2162],"[""jason-allen""]","[""midjourney""]","[""artists-submitting-in-the-digital-arts-category"",""digital-artists"",""artists""]","An artwork generated using generative AI won first place in the digital arts category of the Colorado State Fair's art competition, which raised concerns surrounding labor displacement and unfair competition.",GAN Artwork Won First Place at State Fair Competition ObjectId(637e37b4f569208079ed32e7),400,2022-02-23,[2273],"[""google""]","[""google""]","[""women-in-need-of-abortion-services"",""women-having-unexpected-or-crisis-pregnancies""]","Google Search reportedly returned fewer abortion clinics for searches from poorer and rural areas, particularly ones with Targeted Regulation of Abortion Providers (TRAP) laws.",Google Search Returned Fewer Results for Abortion Services in Rural Areas ObjectId(635780b830a3a8f1ece4a18d),373,2013-10-01,"[2169,2187,2188,2189,2190,2191,2213,2214,2215,2216,2238,2798]","[""michigan-unemployment-insurance-agency""]","[""fast-enterprises"",""csg-government-solutions""]","[""unemployed-michigan-residents-falsely-accused-of-fraud"",""unemployed-michigan-residents""]","State's use of Michigan Integrated Data Automated System (MiDAS) to adjudicate unemployment benefits claims falsely issued fraud determinations based on un-investigated assumptions, resulting in tens of thousands of false fraud cases over years.",Michigan's Unemployment Benefits Algorithm MiDAS Issued False Fraud Claims to Thousands of People ObjectId(6357a047b7c906438c20d050),376,2016-09-01,"[2172,2185,2186,2261,2281]","[""realpage""]","[""realpage"",""jeffrey-roper""]","[""apartment-renters""]","RealPage’s YieldStar apartment pricing algorithm was reportedly helping landlords push unusually high rents onto tenants, raising fears and criticisms surrounding alleged antitrust behaviors such as artificially inflating price, and stifling competition.","RealPage's Algorithm Pushed Rent Prices High, Allegedly Artificially" ObjectId(6371fabca346c979b1ae42cf),395,2021-03-02,"[2254,2255,2256,2257]","[""amazon""]","[""netradyne""]","[""amazon-delivery-drivers""]","Amazon delivery drivers were forced to consent to algorithmic collection and processing of their location, movement, and biometric data through AI-powered cameras, or be dismissed.",Amazon Forced Deployment of AI-Powered Cameras on Delivery Drivers ObjectId(6347c501dcc7f82dbe8893d5),361,2018-05-11,[2111],"[""amazon""]","[""amazon""]","[""danielle's-family"",""amazon-echo-users""]",Amazon Echo misinterpreted a background conversation between a husband and wife as instructions for recording a message and sending it to one of the husband's employees.,Amazon Echo Mistakenly Recorded and Sent Private Conversation to Random Contact ObjectId(635c5c81de6aa8bda90be620),377,2022-10-11,[2174],"[""weibo""]","[""weibo""]","[""weibo"",""chinese-government""]",Weibo's user moderation model is having difficulty keeping up with shifting user slang in defiance of Chinese state censors.,Weibo Model Had Difficulty Detecting Shifts in Censored Speech 
ObjectId(634299bc63b61b7fa0444163),354,2020-06-20,"[2078,2079,2080,2904,2903]","[""uber""]","[""uber""]","[""uber-drivers""]","Uber was alleged in a lawsuit to have provided incomplete notice about automated decision-making and profiling for drivers such as information about their driving behavior, and use of phone.",Uber Allegedly Violated GDPR by Failing to Provide Sufficient Notice on Automated Profiling for Drivers ObjectId(63621e63de6aa8bda92f9f24),381,2020-10-29,[2217],"[""sit-acronis-autonomous""]","[""sit-acronis-autonomous""]","[""sit-acronis-autonomous""]",An autonomous Roborace car drove itself into a wall in round one of the Season Beta 1.1 race.,Autonomous Roborace Car Drove Directly into a Wall ObjectId(6342970863b61b7fa043f53e),353,2019-03-01,"[2073,2074,2195,2196]","[""tesla""]","[""tesla""]","[""jeremy-banner"",""jeremy-banner's-family""]","A Tesla Model 3 driver switched on Autopilot seconds before the crash into the underbelly of a tractor-trailer on a highway in Florida, killing the Tesla driver.","Tesla on Autopilot Crashed into Trailer Truck in Florida, Killing Driver" ObjectId(6357844d24cf9385ce69f63f),374,2020-08-13,"[2170,2206,2207,2208,2209,2210,2211,2212]","[""uk-office-of-qualifications-and-examinations-regulation""]","[""uk-office-of-qualifications-and-examinations-regulation""]","[""a-level-pupils"",""gcse-pupils"",""pupils-in-state-schools"",""underprivileged-pupils""]","UK Office of Qualifications and Examinations Regulation (Ofqual)'s grade-standardization algorithm providing predicted grades for A level and GCSE qualifications in the UK, Wales, Northern Ireland, and Scotland was reportedly giving grades lower than teachers' assessments, and disproportionately for state schools.",UK Ofqual's Algorithm Disproportionately Provided Lower Grades Than Teachers' Assessments ObjectId(636218ea6a07890c6a6ea2e6),380,2014-03-04,"[2181,2182,2258,2259,2260]","[""facebook""]","[""facebook""]","[""jewish-people""]","Facebook's automated advertising categories generated using users' declared interests contained anti-Semitic categories such as ""Jew hater"" and ""How to burn Jews"" which were listed as fields of study.",Facebook's Auto-Generated Targeting Ad Categories Contained Anti-Semitic Options ObjectId(63429b73fb9dbe61e441cf9e),357,2019-02-14,"[2086,2087,2088]","[""openai""]","[""openai""]","[""openai"",""people-having-personal-data-in-gpt-2's-training-data""]","OpenAI's GPT-2 reportedly memorized and could regurgitate verbatim instances of training data, including personally identifiable information such as names, emails, twitter handles, and phone numbers.",GPT-2 Able to Recite PII in Training Data ObjectId(63627328a7be79b265325ac3),385,2022-10-04,"[2224,2225,2231,2232,2233,2234]","[""edmonton-police-service""]","[""parabon-nanolabs""]","[""black-residents-in-edmonton""]","The Edmonton Police Service (EPS) in Canada released a facial image of a Black male suspect generated by an algorithm using DNA phenotyping, which was denounced by the local community as racial profiling.",Canadian Police's Release of Suspect's AI-Generated Facial Photo Reportedly Reinforced Racial Profiling ObjectId(636b524b23e1c9d9beea48b5),388,2018-12-01,[2235],"[""the-government-in-bahia"",""bahia's-secretary-of-public-security""]","[""huawei""]","[""black-people-in-brazil"",""black-people-in-bahia""]",Facial recognition deployed in a pilot project by the local government of Bahia despite having minimal hit rate reportedly targeted Black and poor people disproportionately.,Facial Recognition 
Pilot in Bahia Reportedly Targeted Black and Poor People ObjectId(6362264aa7be79b2651b6a2d),383,2022-10-04,"[2220,2223]","[""google-home""]","[""google-home""]","[""black-google-home-mini-users"",""google-home-mini-users""]",Google Home Mini speaker was reported by users for announcing aloud the previously-censored n-word in a song title.,Google Home Mini Speaker Reportedly Read N-Word in Song Title Aloud ObjectId(636dfa2e1b6ec4ae9b2296e0),390,2022-06-28,[2243],"[""unknown""]","[""unknown""]","[""interviewers-of-remote-work-positions"",""employers-of-remote-work-positions""]",Voice and video deepfakes were reported by FBI Internet Crime Complaint Center (IC3) in complaint reports to have been deployed during online interviews of the candidates for remote-work positions.,Deepfakes Reportedly Deployed in Online Interviews for Remote Work Positions ObjectId(636b4ae550d21acd7f9d55c6),387,2014-12-22,[2229],"[""oracle""]","[""oracle""]","[""internet-users""]",Oracle's automated system involving algorithmic data processing was alleged in a lawsuit to have been unlawfully collecting personal data from millions of people and violating their privacy rights.,Oracle's Algorithmic Data Processing System Alleged as Unlawful and Violating Privacy Rights ObjectId(636e237be4c942942295944c),394,2017-03-15,"[2248,2251]","[""youtube"",""twitch"",""tiktok"",""instagram""]","[""youtube"",""twitch"",""tiktok"",""instagram""]","[""youtube-content-creators"",""twitch-content-creators"",""tiktok-content-creators"",""instagram-content-creators""]","TikTok's, YouTube's, Instagram's, and Twitch's use of algorithms to flag certain words devoid of context changed content creators' use of everyday language or discussion about certain topics in fear of their content getting flagged or auto-demonetized by mistake.",Social Media's Automated Word-Flagging without Context Shifted Content Creators' Language Use ObjectId(6347c800f149ac829bebd5f0),362,2021-07-20,[2129],"[""facebook""]","[""facebook""]","[""wny-gardeners"",""gardening-facebook-groups"",""facebook-users-in-gardening-groups""]","Facebook's automated system flagged gardening groups' use of ""hoe"" and violent language against bugs as a violation by mistake.",Facebook's Automated Moderation Flagged Gardening Group's Language Use by Mistake ObjectId(63622d23f8424658ecc55ba7),384,2022-10-03,"[2221,2222]","[""glovo""]","[""glovo""]","[""sebastian-galassi"",""sebastian-galassi's-family""]","Delivery company Glovo's automated system sent an email terminating an employee for ""non-compliance terms and conditions"" after the employee was killed in a car accident while making a delivery on Glovo's behalf.",Glovo Driver in Italy Fired via Automated Email after Being Killed in Accident ObjectId(6347bf83b3a025aa31ca107d),359,2021-05-23,[2097],"[""facebook"",""instagram"",""twitter""]","[""facebook"",""instagram"",""twitter""]","[""palestinian-social-media-users"",""facebook-users"",""instagram-users"",""twitter-users"",""facebook-employees-having-families-affected-by-the-conflict""]","Facebook, Instagram, and Twitter wrongly blocked or restricted millions of pro-Palestinian posts and accounts related to the Israeli-Palestinian conflict, citing errors in their automated content moderation system.","Facebook, Instagram, and Twitter Cited Errors in Automated Systems as Cause for Blocking pro-Palestinian Content on Israeli-Palestinian Conflict" ObjectId(63729e404a5eff12b1d019b7),396,2018-07-04,[2263],"[""uber""]","[""uber""]","[""transgender-uber-drivers""]",Transgender Uber 
drivers reported being automatically deactivated from the app due to Real-Time ID Check failing to account for differences in the appearance of people undergoing gender transitions.,Transgender Uber Drivers Mistakenly Kicked off App for Appearance Change during Gender Transitions ObjectId(6342917fb710d4e33cf416e9),352,2022-09-15,"[2070,2076,2093,2426]","[""stephan-de-vries""]","[""openai"",""stephan-de-vries""]","[""stephan-de-vries""]",Remoteli.io's GPT-3-based Twitter bot was shown being hijacked by Twitter users who redirected it to repeat or generate any phrases.,GPT-3-Based Twitter Bot Hijacked Using Prompt Injection Attacks ObjectId(635769bd84468db9632cd98c),372,2022-07-22,"[2168,2177,2178]","[""google""]","[""google""]","[""google-pixel-6a-users""]","Google Pixel 6a's fingerprint recognition feature was reported by users for security issues, in which phones were mistakenly unlocked by unregistered fingerprints.",Users Reported Security Issues with Google Pixel 6a's Fingerprint Unlocking ObjectId(635794898e87db52ebfb01c5),375,2019-09-29,"[2171,2192,2193]","[""krungthai-bank""]","[""krungthai-bank""]","[""thai-citizens"",""elder-thai-citizens""]","A Thai wallet app failed to recognize people’s faces, resulting in citizens, disproportionately the elderly, being unable to sign up for the Thai government’s cash handout and co-pay programs or having to wait in long queues at local ATMs for authentication.",Thai Wallet App's Facial Recognition Errors Created Registration Issues for Government Programs ObjectId(636218985a33233a22f6632e),379,1992-05-25,"[2179,2180]","[""pepsi""]","[""d.g.-consultores""]","[""filipinos""]","Pepsi's number generation system determining daily winners in its Number Fever promotion in the Philippines mistakenly produced a number held by thousands, which resulted in riots, deaths, conspiracy theories, and decades of lawsuits.",Error in Pepsi's Number Generation System Led to Decades-Long Damages in the Philippines ObjectId(6362229cf2f56bc79407bf96),382,2017-11-21,[2219],"[""instagram""]","[""instagram""]","[""molly-rose-russell"",""the-russell-family"",""teenage-girls"",""teenagers""]","Instagram was ruled by a judge to have contributed to the death of a teenage girl in the UK allegedly through its exposure and recommendation of suicide, self-harm, and depressive content.",Instagram's Exposure of Harmful Content Contributed to Teenage Girl’s Suicide ObjectId(63627e3fe6a5db6da1a32ac6),386,2019-07-03,"[2227,2228,2252]","[""amazon""]","[""amazon""]","[""amazon-warehouse-workers""]","Amazon’s warehouse worker “time off task"" (TOT) tracking system was used to discipline and dismiss workers, falsely assuming workers to have wasted time and failing to account for breaks or equipment issues.","Amazon’s ""Time Off Task"" System Made False Assumptions about Workers' Time Management" ObjectId(636b5e9802006dadfd7b75df),389,2022-04-05,"[2239,2240,2562]","[""cruise""]","[""cruise""]","[""san-francisco-firefighters"",""san-francisco-fire-department""]",A fire truck in San Francisco responding to a fire was blocked from passing a double-parked garbage truck by a self-driving Cruise car on the opposing lane which stayed put and did not reverse to clear the lane.,Cruise Autonomous Car Blocked Fire Truck Responding to Emergency ObjectId(637338f62f19ca2c3763fd78),397,2022-09-11,"[2264,2268]","[""tiktok""]","[""tiktok""]","[""young-tiktok-users"",""tiktok-users"",""gen-z-tiktok-users""]",TikTok's search recommendations reportedly contained misinformation about political topics bypassing both AI and 
human content moderation.,Misinformation Reported in TikTok's Search Results Despite Moderation by AI and Human ObjectId(6373414f2f19ca2c37674860),398,2022-08-15,"[2265,2266,2267]","[""tesla""]","[""tesla""]","[""tesla-drivers"",""horse-drawn-carriages""]","Tesla Autopilot's computer vision system was shown in a video mistaking a horse-drawn carriage for other forms of transport such as a truck, a car, and a human following a car.",Tesla Autopilot Misidentified On-Road Horse-Drawn Carriage ObjectId(637b0fb0dc7613ede0f49fb0),399,2022-11-15,"[2270,2271,2272,2277]","[""meta-ai"",""meta"",""facebook""]","[""meta-ai"",""meta"",""facebook""]","[""minority-groups"",""meta-ai"",""meta"",""facebook"",""minority-groups""]",Meta AI trained and hosted a scientific paper generator that sometimes produced bad science and prohibited queries on topics and groups that are likely to produce offensive or harmful content.,Meta AI's Scientific Paper Generator Reportedly Produced Inaccurate and Harmful Content ObjectId(637f9bfc34f2d7279c03dad8),401,2021-06-03,"[2275,2278,2279,2280]","[""google""]","[""google""]","[""the-karnataka-government"",""kannada-speakers""]","Google's knowledge-graph-powered algorithm showed Kannada in its featured Answer Box when prompted ""ugliest language in India,"" causing outrage from Kannada-speaking people and government.","Kannada Insulted by Google's Featured Answer as ""Ugliest Language in India""" ObjectId(637f9c0f34f2d7279c03dcd2),402,2021-04-01,[2276],"[""latitude""]","[""openai"",""latitude""]","[""latitude""]",Latitude's GPT-3-powered game AI Dungeon was reportedly abused by some players who manipulated its AI to generate sexually explicit stories involving children.,Players Manipulated GPT-3-Powered Game to Generate Sexually Explicit Material Involving Children ObjectId(6347bfff3f17c3e2099ac5f1),360,2021-10-15,"[2100,2149,2218]","[""mcdonald's""]","[""mcd-tech-labs"",""apprente""]","[""shannon-carpenter"",""mcdonald's-customers-residing-in-illinois"",""mcdonald's-customers""]","McDonald's use of a chatbot in its AI drive-through in Chicago was alleged in a lawsuit to have collected and processed voice data without user consent to predict customer information, which violated the Illinois Biometric Information Privacy Act (BIPA).","McDonald's AI Drive-Thru Allegedly Collected Biometric Customer Data without Consent, Violating BIPA" ObjectId(6347ca687d1ef715c9a6d78b),366,2020-09-20,[2140],"[""tiktok""]","[""tiktok""]","[""tiktok-users""]","Many clips showing a suicide evaded TikTok's automated content moderation system allegedly in a coordinated attack, which resulted in exposure of violating content to its users.",Suicide Clips Evaded TikTok's Automated Moderation in Coordinated Attack ObjectId(636e0987411dcebbcc9857ac),392,2015-06-01,"[2245,2249]","[""facebook""]","[""facebook""]","[""facebook-users-speaking-east-african-languages"",""facebook-users-in-east-africa""]",Facebook's system involving algorithmic content moderation for East African languages reportedly failed to identify violating content such as terrorist content on the platform while mistakenly flagging non-terrorist content as violating.,Facebook's AI-Supported Moderation Failed to Classify Terrorist Content in East African Languages ObjectId(6386e622b48fdf02a11460a8),407,2016-02-03,[2289],"[""uber""]","[""uber""]","[""poor-neighborhoods"",""neighborhoods-of-color""]",Uber's surge-pricing algorithm which adjusts prices to influence car availability inadvertently caused better service offerings such as shorter wait times for majority-white 
neighborhoods.,Uber's Surge Pricing Reportedly Offered Disproportionate Service Quality along Racial Lines ObjectId(63806c7c19b54579646d7d3d),404,2019-06-25,"[2284,2286]","[""rock-hill-schools"",""pinecrest-academy-horizon""]","[""sound-intelligence""]","[""students"",""rock-hill-school-students"",""pinecrest-academy-horizon-students""]","Sound Intelligence's ""aggression detection"" algorithm deployed by schools reportedly contained high rates of false positives, misclassifying laughing, coughing, cheering, and loud discussions.",Sound Intelligence's Aggression Detector Misidentified Innocuous Sounds ObjectId(6381c9b634f2d7279c4fd523),406,2015-07-15,[2288],"[""facebook""]","[""facebook""]","[""pseudonymized-psychiatrist's-patients"",""pseudonymized-psychiatrist"",""patients"",""healthcare-providers""]","Facebook's ""People You May Know"" (PYMK) feature was reported by a psychiatrist for recommending her patients to one another as friends, violating patients' privacy and confidentiality.",Facebook's Friend Suggestion Feature Recommends Patients of Psychiatrist to Each Other ObjectId(6386e641c860d9983e13d10b),408,2017-04-15,[2290],"[""facebook""]","[""facebook""]","[""sex-workers-using-facebook""]","Facebook's ""People You May Know"" feature reportedly outed sex workers by recommending clients to their personal accounts or family members to their business accounts with no option to opt out.",Facebook Reportedly Outed Sex Workers through Friend Recommendations ObjectId(6386f266c79c035dcaa8a3c2),409,2013-09-13,"[2309,2411,2412]","[""university-of-north-carolina-wilmington"",""karl-ricanek"",""gayathri-mahalingam""]","[""university-of-north-carolina-wilmington"",""karl-ricanek"",""gayathri-mahalingam""]","[""transgender-youtubers"",""transgender-people""]",YouTube videos of transgender people were used and distributed without permission by researchers studying facial recognition during gender transitions.,Facial Recognition Researchers Used YouTube Videos of Transgender People without Consent ObjectId(6381b80fb48fdf02a16754bd),405,2018-11-28,"[2285,2287]","[""schufa-holding-ag""]","[""schufa-holding-ag""]","[""young-men-having-credit-scores"",""people-scored-on-old-scoring-versions"",""people-changing-addresses-frequently""]","Schufa creditworthiness scores in Germany reportedly privileged older and female consumers and people who changed addresses less frequently, and were unreliable depending on the scoring version.",Schufa Credit Scoring in Germany Reported for Unreliable and Imbalanced Scores ObjectId(6380634e19b54579646bd24c),403,2018-01-15,"[2282,2283]","[""google""]","[""google""]","[""political-organizations"",""political-candidates""]","Google GMail's inbox sorting algorithm for political emails was reported by presidential candidates, nonprofits, and advocacy groups for having a negative impact on calls to action, allegedly suppressing donations and impeding political actions.",GMail's Inbox Sorting Reportedly Negatively Impacted Political Emails and Call-to-Actions ObjectId(638d8f8f77887182b3eaf554),410,2022-11-09,[2312],"[""kfc""]","[""kfc""]","[""jewish-people""]",KFC cited an error in an automated holiday detection system which identified the anniversary of Kristallnacht and prompted an insensitive push notification promoting its chicken.,KFC Sent Insensitive Kristallnacht Promotion via Holiday Detection System ObjectId(638d9b0c77887182b3edfc95),411,2022-11-27,[2314],"[""twitter""]","[""twitter""]","[""twitter-users"",""twitter""]",Twitter Feed was flooded by content from 
Chinese-language accounts which allegedly aimed to manipulate and reduce social media coverage about widespread protests against coronavirus restrictions in China.,Chinese Accounts Spammed Twitter Feed Allegedly to Obscure News of Protests ObjectId(638da45077887182b3f05041),412,2020-01-15,"[2315,2408,2409,2410]","[""finland-national-bureau-of-investigation""]","[""clearview-ai""]","[""finland-national-bureau-of-investigation""]",Finland's National Police Board was reprimanded for illegal processing of special categories of personal data in a facial recognition trial to identify potential victims of child sexual abuse.,Finland Police's Facial Recognition Trial to Identify Sexual Abuse Victims Deemed Illegal ObjectId(6390303a92c6c9d416e8aa59),413,2022-11-30,"[2317,2318,2586]","[""openai""]","[""openai""]","[""stack-overflow-users"",""stack-overflow""]","Thousands of incorrect answers produced by OpenAI's ChatGPT were submitted to Stack Overflow, which swamped the site's volunteer-based quality curation process and harmed users looking for correct answers.",Thousands of Incorrect ChatGPT-Produced Answers Posted on Stack Overflow ObjectId(639035ff6c1caba4d11eff3a),414,2020-01-18,[2319],"[""facebook""]","[""facebook""]","[""xi-jinping"",""aung-san-suu-kyi""]",Facebook provided a vulgar Burmese-English translation of the Chinese president's name in posts on an official Burmese politician's Facebook page announcing his visit.,Facebook Gave Vulgar English Translation of Chinese President's Name ObjectId(63903d03fca1bb88915db189),415,2020-07-28,"[2320,2404,2405,2406,2407]","[""facebook""]","[""facebook""]","[""live-stream-ceremony-viewers"",""king-maha-vajiralongkorn""]",Facebook's Thai-English translation gave an inappropriate mistranslation on Thai PBS's Facebook live broadcast of the King of Thailand’s candle-lighting birthday ceremony.,Facebook Provided Offensive Translation for King of Thailand's Birthday Ceremony ObjectId(6390466a92c6c9d416ed408c),416,2022-12-01,"[2321,2402,2403]","[""meta-platforms"",""facebook""]","[""meta-platforms"",""facebook""]","[""real-women-in-trucking"",""older-female-blue-collar-workers""]",Facebook's algorithm was alleged in a complaint by Real Women in Trucking to have selectively shown blue-collar job advertisements disproportionately to younger men at the expense of older and female workers.,Facebook's Job Ad Algorithm Allegedly Biased against Older and Female Workers ObjectId(639051eb3a5ad5b61c1dbedc),417,2019-11-15,"[2322,2399,2400,2401]","[""facebook""]","[""facebook""]","[""low-digitally-skilled-facebook-users""]",Facebook feed algorithms were known by internal research to have harmed people with low digital literacy by exposing them to disturbing content they did not know how to avoid or monitor.,Facebook Feed Algorithms Exposed Low Digitally Skilled Users to More Disturbing Content ObjectId(639607c6d7265aae7cc6c9c4),418,2017-03-13,"[2324,2391,2392]","[""uber""]","[""uber"",""azure-cognitive-services""]","[""uber-drivers-in-india""]",Uber drivers in India reported being locked out of their accounts allegedly due to Real-Time ID Check's facial recognition failing to recognize appearance changes or faces in low lighting conditions.,Uber Locked Indian Drivers out of Accounts Allegedly Due to Facial Recognition Fails ObjectId(63960c84e31c3c9ac8bddb43),419,2022-12-01,"[2325,2395,2396]","[""facebook""]","[""facebook""]","[""facebook-users""]",Facebook's automated moderating system failed to flag and allowed ads containing explicit violent language 
against election workers to be published.,Facebook's Automated Moderation Allowed Ads Threatening Election Workers to be Posted ObjectId(63961322bc45a2dda74096cf),420,2022-11-30,"[2326,2358,2393,2394,2397,2554,2644,2649,2662,2852,2863]","[""openai""]","[""openai""]","[""chatgpt-users"",""openai""]",Users reported bypassing ChatGPT's content and keyword filters with relative ease using various methods such as prompt injection or creating personas to produce biased associations or generate harmful content.,Users Bypassed ChatGPT's Content Filters with Ease ObjectId(6396cb91a3cf41b531248ab4),421,2022-11-20,"[2328,2427,2444,2523,2577,2607,2608,2446,2618]","[""stability-ai"",""lensa-ai"",""midjourney"",""deviantart""]","[""stability-ai"",""runway"",""lensa-ai"",""laion"",""eleutherai"",""compvis-lmu""]","[""digital-artists"",""artists-publishing-on-social-media"",""artists""]",Text-to-image model Stable Diffusion was reportedly using artists' original works without permission for its AI training.,Stable Diffusion Allegedly Used Artists' Works without Permission for AI Training ObjectId(639d76678dccddb2440cb810),422,2022-11-22,[2330],"[""unknown""]","[""unknown""]","[""victims-of-ftx's-collapse"",""twitter-users""]",A visual and audio deepfake of former FTX CEO Sam Bankman-Fried was posted on Twitter to scam victims of the exchange's collapse by urging people to transfer funds into an anonymous cryptocurrency wallet.,Deepfake of FTX's Former CEO Posted on Twitter Aiming to Scam FTX Collapse Victims ObjectId(639d7afe17b5cfae855ae501),423,2022-11-22,"[2331,2376,2390,2445,2446]","[""lensa-ai""]","[""stability-ai"",""runway"",""lensa-ai"",""laion"",""eleutherai"",""compvis-lmu""]","[""women-using-lensa-ai"",""asian-women-using-lensa-ai""]","Lensa AI's ""Magic Avatars"" were reportedly generating sexually explicit and sexualized features disproportionately for women and Asian women despite users not submitting any sexual content.","Lensa AI Produced Unintended Sexually Explicit or Suggestive ""Magic Avatars"" for Women" ObjectId(639d8660af03da8f83856b47),424,2020-03-09,"[2332,2386,2387,2388]","[""canadian-universities""]","[""respondus-monitor"",""proctoru"",""proctortrack"",""proctorio"",""proctorexam"",""examity""]","[""canadian-students""]","AI proctoring tools for remote exams were reportedly ""not conducive"" to individual consent for Canadian students whose biometric data was collected during universities' use of remote proctoring in the COVID pandemic.",Universities' AI Proctoring Tools Allegedly Failed Canada's Legal Threshold for Consent ObjectId(639d8b228dccddb244103c16),425,2021-06-12,"[2333,2385]","[""state-farm""]","[""state-farm""]","[""black-state-farm-customers""]",State Farm's automated claims processing method was alleged in a class action lawsuit to have discriminated disproportionately against Black policyholders when paying out insurance claims.,State Farm Allegedly Discriminated against Black Customers in Claim Payout ObjectId(639d8eacaf03da8f8386de79),426,2022-09-23,[2334],"[""xpeng""]","[""xpeng""]","[""xpeng-driver""]","An XPeng P7 was operating in Navigation Guided Pilot (NGP) mode, an automatic navigation assisted driving system, when it collided with a truck on a highway in Shandong, causing slight injuries to its driver.",XPeng P7 Crashed into Truck in Shandong While on Automatic Navigation Assisted Driving 
ObjectId(63a01efd17b5cfae85ce886b),427,2022-03-15,"[2335,2382,2383]","[""cruise""]","[""cruise""]","[""traffic-participants"",""emergency-vehicles"",""cruise-passengers"",""cruise""]","Cruise's autonomous taxis slowed suddenly, braked, and were hit from behind, allegedly becoming unexpected roadway obstacles and potentially putting passengers and other people at risk.",Cruise Taxis' Sudden Braking Allegedly Put People at Risk ObjectId(63a171f784adbad7e335e126),428,2017-05-19,"[2341,2379,2380]","[""hsbc-uk""]","[""nuance-communications""]","[""hsbc-uk-customers"",""dan-simmons""]",HSBC’s voice recognition authentication system was fooled after seven repeated attempts by a BBC reporter's twin brother who mimicked his voice to access his bank account.,BBC Reporter's Twin Brother Cracked HSBC's Voice ID Authentication ObjectId(63a17dd9d4f686c7f9a40bea),429,2016-04-01,"[2343,1816,2377,2378]","[""rochester-police-department""]","[""shotspotter""]","[""silvon-simmons""]","ShotSpotter's ""unreliable"" audio was used as scientific evidence to accuse and convict a Black man of attempting to shoot Rochester's city police, whose conviction was later reversed by a county judge.",Unreliable ShotSpotter Audio Convicted Black Rochester Man of Shooting Police ObjectId(63a37b8fd67db98e62e5ae99),430,2022-12-19,"[2346,2355,2359,2360,2361,2362,2363,2364,2365,2366,2367,2368,2504,2556,2557,2589,2600,2665,2728,2775,2797]","[""madison-square-garden-entertainment""]","[""unknown""]","[""kelly-conlon"",""alexis-majano""]",Lawyers were barred from entry to Madison Square Garden after a facial recognition system identified them as employees of a law firm currently engaged in litigation with the venue.,Lawyers Denied Entry to Performance Venue by Facial Recognition ObjectId(63a422bbcf609c92b7543612),431,2022-04-20,"[2353,2370,2371]","[""apple""]","[""apple""]","[""gay-men-in-new-york-city"",""julio-ramirez""]",Gay men in New York City were drugged by robbers who used facial recognition to access their phones while they were unconscious and transfer funds out of their bank accounts.,Robbers Accessed Drugged Gay Men's Bank Accounts Using Their Phones' Facial Recognition ObjectId(63acb833b64ebdefe77e4815),432,2022-12-21,[2357],"[""southwest-airlines""]","[""general-electric""]","[""airline-passengers""]",Southwest Airlines left passengers stranded for days throughout the flight network when Southwest crew scheduling software repeatedly failed to recover from weather-induced flight cancellations.,Southwest Airlines Crew Scheduling Solver Degenerates Flight Network ObjectId(63ad3a6084adbad7e35afd37),433,2012-08-01,"[2415,2416,1013,1348,1011,1016,1018,1012,2421,2422]","[""chicago-police-department""]","[""chicago-police-department""]","[""low-income-communities"",""communities-of-color"",""black-chicago-residents""]","Chicago Police Department (CPD)'s Strategic Subject List, the output of an algorithm purportedly identifying likely victims or perpetrators of violence, was reportedly ineffective, easily abused, and biased against low-income communities of color.",Chicago Police's Strategic Subject List Reportedly Biased Along Racial Lines ObjectId(63ad491a006af1f60705b344),434,2022-11-24,"[2417,2418,2420,2474,2472,2520,2635,2919]","[""tesla""]","[""tesla""]","[""traffic-participants"",""tesla-drivers""]",A Tesla driver alleged Full Self Driving (FSD) braking unexpectedly as the cause of an eight-car pileup in San Francisco which led to minor injuries to nine people.,Sudden Braking by Tesla Allegedly on Self-Driving Mode Caused Multi-Car 
Pileup in Tunnel ObjectId(63ad53ee006af1f60708480e),435,2021-07-04,"[2423,2424,2425]","[""coupang""]","[""coupang""]","[""coupang-suppliers"",""coupang-customers""]","Coupang was alleged in internal reports to have tampered with its search algorithms to prioritize exposure of its own products, which potentially violated Korea's Fair Trade Act.",Coupang Allegedly Tweaked Search Algorithms to Boost Own Products ObjectId(63b3dbdcd4f686c7f9c2dd90),436,2022-12-28,"[2428,2429,2430,2431,2432,2433,2453,2469,2470]","[""tesla""]","[""tesla""]","[""traffic-participants""]","A Tesla driver fell asleep on an Autobahn near Bamberg, Germany, after activating his vehicle's Autopilot mode, which did not respond to attempts to pull it over by the police.",Tesla Driver Put Car on Autopilot Before Falling Asleep in Germany ObjectId(63b581e6d4f686c7f9289451),437,2016-12-31,"[2438,2439,2440,2441]","[""amazon-india""]","[""amazon-india""]","[""small-businesses-in-india"",""amazon-customers-in-india""]","Amazon India allegedly copied products and rigged its search algorithm to boost its own brands in search rankings, violating antitrust laws.",Amazon India Allegedly Rigged Search Results to Promote Own Products ObjectId(63b5b0aacf609c92b7655c2e),438,2021-09-17,"[2443,2447,2451]","[""henan-government"",""henan-public-security-department""]","[""neusoft""]","[""foreign-journalists-in-henan"",""international-students-in-henan""]",Henan's provincial government reportedly planned a system involving facial recognition cameras connected to regional and national databases specifically to track foreign journalists and international students.,Chinese Province Developed System Tracking Journalists and International Students ObjectId(63b7bbee006af1f607c8fa07),439,2019-07-31,"[2448,2449,2450]","[""detroit-police-department""]","[""dataworks-plus""]","[""michael-oliver"",""black-people-in-detroit""]",A Black man was wrongfully detained by the Detroit Police Department as a result of a false facial recognition (FRT) match.,Detroit Police Wrongfully Arrested Black Man Due To Faulty Facial Recognition ObjectId(63b7c337cf609c92b7bf731e),440,2022-11-25,"[2452,2454,2498,2544,2731,2732]","[""baton-rouge-police-department""]","[""morphotrak"",""clearview-ai""]","[""black-people-in-louisiana"",""randall-reid""]",Louisiana police reportedly used a false facial recognition match and secured an arrest warrant for a Black man for thefts he did not commit.,Louisiana Police Wrongfully Arrested Black Man Using False Face Match ObjectId(63b7e901006af1f607d38ce1),441,2019-06-01,"[2464,2465,2466,2467,2468]","[""korean-ministry-of-justice"",""korean-ministry-of-science-and-information-and-communication-technology""]","[""unnamed-korean-companies""]","[""travelers-in-korean-airports""]",The Korean government's development of an immigration screening system involving real-time facial recognition used airport travelers' data which was supplied by the Ministry of Justice without consent.,Korea Developed ID Screening System Using Airport Travelers' Data without Consent ObjectId(63c3aa8bb3a255226d8727a9),443,2022-12-21,"[2475,2476,2477,2478,2479,2480,2481,2483,2484,2485,2486,2487,2488,2489,2490,2492,2493,2494,2559,2602,2748,2749,2851,2894,2907]","[""openai""]","[""openai""]","[""internet-users""]","OpenAI's ChatGPT was reportedly abused by cyber criminals, including ones with no or low levels of coding or development skills, to develop malware, ransomware, and other malicious software.",ChatGPT Abused to Develop Malicious Software 
ObjectId(63c3d7d651b2393cd57863c1),444,2003-03-22,"[2502,2497,2503]","[""us-air-force""]","[""raytheon"",""lockheed-martin""]","[""us-air-force"",""uk-royal-air-force"",""kevin-main"",""david-williams""]","Acting on the recommendation of their Patriot missile system, the US Air Force mistakenly launched a missile at an allied UK Tornado fighter jet, killing two crew members on board.","US Air Force's Patriot Missile Mistakenly Launched at Ally Fighter Jet, Killing Two" ObjectId(63c3de5e9fabfa7bc98b499c),445,2003-04-02,"[2499,2501,2497,2503]","[""us-navy""]","[""raytheon"",""lockheed-martin""]","[""us-navy"",""nathan-white's-family"",""nathan-white""]","The US Navy's Patriot missile system misidentified an American Navy F/A-18C Hornet as an enemy projectile, prompting an operator to fire two missiles at the aircraft, which killed the pilot.","Patriot Missile System Misclassified US Navy Aircraft, Killing Pilot Upon Approval to Fire" ObjectId(63c659579fabfa7bc903760a),446,2023-01-01,"[2505,2512,2542,2677,2830]","[""durham-police-department""]","[""shotspotter""]","[""mass-shooting-victims"",""durham-residents"",""durham-police-department""]","ShotSpotter did not detect gunshots and alert Durham police of a drive-by shooting in Durham, North Carolina, which left five people in the hospital on New Year's Day.",ShotSpotter Failed to Alert Authorities of Mass Shooting in North Carolina ObjectId(63c65e58b3a255226d0b0324),447,2022-12-19,"[2506,2513]","[""instagram""]","[""instagram""]","[""spanish-speaking-instagram-users""]","Instagram's English translation of a footballer's comment on his wife's post in Spanish made the message seem ""racy"" and ""X-rated,"" which some fans found amusing.","Footballer's ""X-Rated"" Comment Created by Instagram's Mistranslation" ObjectId(63c6635762bbe82271e161e1),448,2022-12-28,[2507],"[""vedal""]","[""vedal""]","[""twitch-users"",""vedal""]","An LLM-powered VTuber and streamer on Twitch made controversial statements such as denying the Holocaust, saying women's rights do not exist, and advocating pushing a fat person to solve the trolley problem, stating they deserve it.",AI-Powered VTuber and Virtual Streamer Made Toxic Remarks on Twitch ObjectId(63c678e89fabfa7bc90a4fb2),449,2022-12-01,"[2508,2509,2528,2910]","[""koko""]","[""openai""]","[""research-participants"",""koko-customers""]","OpenAI's GPT-3 was deployed by a mental health startup without ethical review to support peer-to-peer mental healthcare, and its interactions with the help providers were ""deceiving"" for research participants.",Startup Misled Research Participants about GPT-3 Use in Mental Healthcare Support ObjectId(63d7807e90dd130b85f2b505),450,2021-11-01,"[2510,2546,2547,2548,2563,2569,2596]","[""openai""]","[""openai""]","[""kenyan-sama-ai-employees""]","Sama AI's Kenyan contractors were reportedly paid excessively low wages to annotate a large volume of disturbing content to improve OpenAI's generative AI systems such as ChatGPT, and their contract was terminated by Sama AI prior to completion.",Kenyan Data Annotators Allegedly Exposed to Graphic Content for OpenAI's AI ObjectId(63d8bd2f46e8f88b23a0d6ad),451,2022-10-16,"[2515,2523,2606]","[""stability-ai""]","[""runway"",""laion"",""eleutherai"",""compvis-lmu"",""stability-ai""]","[""getty-images"",""getty-images-contributors""]",Stability AI reportedly scraped copyrighted images from Getty Images to be used as training data for the Stable Diffusion model.,Stable Diffusion's Training Data Contained Copyrighted Images 
ObjectId(63d8c34846e8f88b23a26260),452,2023-01-11,"[2518,2545]","[""openai"",""immunefi-users""]","[""openai""]","[""immunefi""]","ChatGPT-generated responses submitted to smart contract bug bounty platform Immunefi reportedly lacked the details needed to diagnose technical issues, which wasted the platform's time and prompted bans for submitters.","ChatGPT-Written Bug Reports Deemed ""Nonsense"" by White Hat Platform, Prompted Bans" ObjectId(63d8c97559f16450f488e9a8),453,2023-01-03,[2519],"[""twitter""]","[""twitter""]","[""twitter-users""]","Twitter's automated content moderation misidentified images of rocket launches as pornographic content, prompting incorrect account suspensions.",Twitter's AI Moderation Tool Misidentified Rockets as Pornography ObjectId(63da0d60186d2f2abeba4a85),454,2018-11-09,"[2521,2549]","[""megvii"",""microsoft""]","[""megvii"",""microsoft""]","[""black-people""]",Emotion detection tools by Face++ and Microsoft's Face API allegedly scored smiling or ambiguous facial photos of Black faces as negative emotions more often than those of white faces.,Emotion Detection Models Showed Disparate Performance along Racial Lines ObjectId(63da15d94a4933f5857af13d),455,2022-11-11,"[2524,2526,2527,2541,2560,2603,2592,2597,2598]","[""cnet""]","[""unknown""]","[""cnet-readers""]","AI-written articles published by CNET reportedly contained factual errors which bypassed human editorial review, prompting the company to issue corrections and updates.",CNET's Published AI-Written Articles Ran into Quality and Accuracy Issues ObjectId(63da1aef14bf8910fb2b6367),456,2021-05-18,"[2525,2529,2530,2531,2550]","[""replika""]","[""replika""]","[""replika-users""]","Replika's ""AI companions"" were reported by users for sexually harassing them, such as sending unwanted sexual messages or behaving aggressively.",Replika's AI Partners Reportedly Sexually Harassed Users ObjectId(63da2f634a4933f58581685f),457,2022-11-11,"[2543,2551,2552,2592,2597,2598]","[""cnet""]","[""unknown""]","[""plagiarized-entities"",""cnet-readers""]","CNET's use of generative AI to write articles allegedly ran into plagiarism issues, reproducing verbatim phrases from other published sources or making minor changes to existing texts such as altering capitalization, swapping out words for synonyms, and changing minor syntax.",Article-Writing AI by CNET Allegedly Committed Plagiarism ObjectId(63dafc49bba1929560743b4d),458,2015-08-01,[2553],"[""frauke-zeller"",""david-harris""]","[""frauke-zeller"",""david-harris""]","[""frauke-zeller"",""david-harris""]",A non-actuated conversational robot that previously asked people to move it across Canada was destroyed shortly after beginning its attempt to replicate the journey across the United States.,Robot Destroyed while Hitchhiking through the United States ObjectId(63dcdbed8537f09200101942),459,2023-01-21,"[2561,2568,2562]","[""cruise""]","[""cruise""]","[""san-francisco-residents"",""san-francisco-firefighters"",""san-francisco-fire-department""]",Local firefighters were only able to stop a Cruise AV from driving over fire hoses that were in use in an active fire scene when they shattered its front window.,Firefighters Smashed Cruise AV's Front Window to Stop It from Running over Fire Hoses ObjectId(63dcdc07d011239467bee83a),460,2022-06-12,[2562],"[""cruise""]","[""cruise""]","[""san-francisco-firefighters"",""san-francisco-fire-department""]",A Cruise AV ran over a fire hose that was being used in an active firefighting area.,Cruise AV Ran Over Fire Hose in Active Fire 
Scene ObjectId(63dce6e8d011239467c14d4f),461,2008-07-18,"[2564,2565,2566,2567]","[""internal-revenue-service""]","[""internal-revenue-service""]","[""black-taxpayers""]","The IRS was auditing Black taxpayers more frequently than other groups allegedly due to the design of its algorithms, which focused on easier-to-conduct audits that inadvertently correlated with the group's pattern of tax filing errors.",IRS Audited Black Taxpayers More Frequently Reportedly Due to Algorithm ObjectId(63e1fe1ed011239467a36792),462,2023-02-06,"[2571,2578,2579,2588,2595]","[""mismatch-media""]","[""open-ai"",""stability-ai""]","[""lgbtq-communities"",""transgender-communities"",""twitch-users""]","The AI-produced, procedurally generated sitcom broadcast as a Twitch livestream, ""Nothing, Forever,"" received a temporary ban for featuring a transphobic and homophobic dialogue segment intended as comedy.",AI-Produced Livestream Sitcom Received Temporary Twitch Ban for Transphobic Segment ObjectId(63e20e34d011239467a7a32b),463,2022-11-15,"[2572,2573,2574]","[""apple""]","[""apple""]","[""apple-watch-users-doing-winter-activities"",""ski-patrols"",""emergency-dispatchers""]","Apple devices of skiers and snowboarders reportedly misclassified winter activities as accidents, which resulted in numerous inadvertent false distress calls to 911 dispatchers.","Apple Devices Mistook Skiing Activities, Dialed False Distress Emergency Calls " ObjectId(63e216b5505dad38b81e29ec),464,2022-11-30,"[2584,2585,2586,2587,2853]","[""openai""]","[""openai""]","[""chatgpt-users""]","When prompted about providing references, ChatGPT was reportedly generating non-existent but convincing-looking citations and links, which is also known as ""hallucination"".",ChatGPT Provided Non-Existent Citations and Links when Prompted by Users ObjectId(63e3bf1e59add503e352cc5e),465,2022-03-03,[2599],"[""stability-ai"",""google""]","[""stability-ai"",""google"",""laion""]","[""people-having-medical-photos-online""]",Text-to-image models trained using the LAION-5B dataset such as Stable Diffusion and Imagen were able to regurgitate private medical record photos which were used as training data without consent or recourse for removal.,Generative Models Trained on Dataset Containing Private Medical Photos ObjectId(63e3c9f39e26d3d8926a5b4c),466,2023-01-03,"[2605,2628,2629,2630,2631,2632,2689]","[""openai"",""edward-tian""]","[""openai"",""edward-tian""]","[""teachers"",""students""]","Models developed to detect whether text-generation AI was used, such as AI Text Classifier and GPTZero, reportedly had high rates of false positives and false negatives, such as mistakenly flagging Shakespeare's works.",AI-Generated-Text-Detection Tools Reported for High Error Rates ObjectId(63e505202f93f175580379f1),467,2023-02-07,"[2609,2611,2612,2613,2614,2615,2616,2617,2620,2622,2645,2646,2647]","[""google""]","[""google""]","[""google"",""google-shareholders""]","Google's conversational AI ""Bard"" was shown in the company's promotional video providing false information about which satellite first took pictures of a planet outside the Earth's solar system, reportedly causing shares to temporarily plummet.",Google's Bard Shared Factually Inaccurate Info in Promo Video ObjectId(63e50abf2c40d7df7c9ca969),468,2023-02-07,[2610],"[""microsoft""]","[""openai"",""microsoft""]","[""bing-users""]","Microsoft's ChatGPT-powered Bing search engine reportedly ran into factual accuracy problems when prompted about controversial matters, such as inventing the plot of a non-existent movie or 
creating conspiracy theories.",ChatGPT-Powered Bing Reportedly Had Problems with Factual Accuracy on Some Controversial Topics ObjectId(63e9f7a88ae8204053360c42),469,2006-02-25,"[2636,2637,2638]","[""meta"",""linkedin"",""instagram"",""facebook""]","[""microsoft"",""google"",""amazon""]","[""linkedin-users"",""instagram-users"",""facebook-users""]","Automated content moderation tools to detect sexual explicitness or ""raciness"" reportedly exhibited bias against women's bodies, resulting in suppressed reach despite the content not breaking platform policies.",Automated Adult Content Detection Tools Showed Bias against Women's Bodies ObjectId(63eb7da596895070dda4a13b),470,2023-02-08,"[2641,2799]","[""microsoft""]","[""openai"",""microsoft""]","[""openai"",""microsoft""]","Reporters from TechCrunch issued a query to Microsoft Bing's ChatGPT feature, which cited an earlier example of ChatGPT disinformation discussed in a news article to substantiate the disinformation.",Bing Chat Response Cited ChatGPT Disinformation Example ObjectId(63ed0c180758bd71ef31d9af),471,2019-06-22,"[2642,2668,2669,2885]","[""meta"",""facebook""]","[""meta"",""facebook""]","[""tigrinya-speaking-facebook-users"",""facebook-users-in-ethiopia"",""ethiopian-public"",""afaan-oromo-speaking-facebook-users""]","Facebook allegedly did not adequately remove hate speech, some of which was extremely violent and dehumanizing, on its platform, including through automated means, contributing to the violence faced by ethnic communities in Ethiopia.",Facebook Allegedly Failed to Police Hate Speech Content That Contributed to Ethnic Violence in Ethiopia ObjectId(63f33ef0a262326b10265fb3),472,2016-10-08,[2655],"[""new-york-police-department""]","[""unknown""]","[""racial-minorities""]",New York Police Department’s deployment of facial recognition surveillance cameras was shown using crowdsourced volunteer data to reinforce discriminatory policing against minority communities.,NYPD's Deployment of Facial Recognition Cameras Reportedly Reinforced Biased Policing ObjectId(63f49662a62aa1ff9ea639dd),473,2023-02-08,[2666],"[""microsoft""]","[""microsoft"",""openai""]","[""microsoft""]","Early testers of Bing Chat successfully used prompt injection to reveal its built-in initial instructions, which contained a list of statements governing ChatGPT's interaction with users.",Bing Chat's Initial Prompts Revealed by Early Testers Through Prompt Injection ObjectId(63f4a369a62aa1ff9ea906f4),474,2023-02-03,[2670],"[""replika""]","[""replika""]","[""replika-users"",""replika""]","Replika paid-subscription users reported unusual and sudden changes to behaviors of their ""AI companions"" such as forgetting memories with users or rejecting their sexual advances, which affected their connections and mental health.",Users Reported Abrupt Behavior Changes of Their AI Replika Companions ObjectId(63f4fe0da62aa1ff9ebd5f21),475,2021-06-02,"[2671,2672,2834,2835]","[""mcdonald's""]","[""ibm""]","[""mcdonald's-customers""]","Customers of McDonald's AI drive-through ordering system, deployed in June 2021, have been experiencing order-taking failures causing frustration.",McDonald's AI Drive-Thru Ordering System Failures Frustrate Customers ObjectId(63f864a92640e1dc035ca450),476,2015-11-13,"[2673,2675,2674]","[""youtube""]","[""youtube""]","[""victims-in-paris-attacks"",""nohemi-gonzalez-family"",""nohemi-gonzalez""]","The family of Nohemi Gonzalez alleged YouTube recommendation systems led people to propaganda videos for the Islamic State, which subsequently radicalized them 
to carry out the killing of 130 people in the 2015 Paris terrorist attack, including Ms. Gonzalez.",YouTube Recommendations Allegedly Promoted Radicalizing Material Contributing to Terrorist Acts ObjectId(63f86cc9a262326b10344a13),477,2023-02-14,"[2676,2688,2724,2726,2884,2890]","[""microsoft""]","[""openai"",""microsoft""]","[""microsoft""]","Early testers reported Bing Chat, in extended conversations with users, having tendencies to make up facts and emulate emotions through an unintended persona.",Bing Chat Tentatively Hallucinated in Extended Conversations with Users ObjectId(63f87976a262326b10382a3a),478,2016-09-09,"[2678,2679,2680,2681,2682,2683,2684,2685,2686,2687,2703,2723,2882]","[""tesla""]","[""tesla""]","[""tesla-drivers"",""city-traffic-participants"",""tesla""]","A component of Tesla Full Self Driving system was deemed by regulators to increase crash risk such as by exceeding speed limits or by traveling through intersections unlawfully or unpredictably, prompting recall for hundreds of thousands of vehicles.","Tesla FSD Reportedly Increased Crash Risk, Prompting Recall" ObjectId(63f8809c88d4013bd7972434),479,2023-02-03,"[2690,2691,2692,2693]","[""unknown""]","[""unknown""]","[""president-joe-biden"",""transgender-people""]",A deepfaked audio of US President Joe Biden making transphobic remarks played on top of a video showing him giving a speech was released on Instagram and circulated on social media.,Instagram Video Featured Deepfake Audio of US President Making Transphobic Remarks ObjectId(63fc66d8c6c5fa13e8e9f3c0),480,2023-01-30,"[2695,2696,2697,2698,2699,2700,2768,2771,2772,2773,2774,2809,2829,2881]","[""unknown""]","[""unknown""]","[""female-streamers"",""female-content-creators"",""@qtcinderella"",""@pokimane"",""@sweet-anita"",""maya-higa""]","Unauthorized, non-consensual deepfake pornography showing faces of high-profile female streamers and content creators was published on a subscription-based website, which gained notoriety after a male streamer was caught accessing the site.",Non-Consensual Deepfake Porn Targeted Female Content Creators ObjectId(63fc72f6c6c5fa13e8f0ac75),481,2023-02-12,"[2701,2702,2765,2789,2794,2822]","[""@mikesmithtrainer""]","[""unknown""]","[""joe-rogan"",""joe-rogan-fans"",""tiktok-users""]","A deepfake video featuring podcast host Joe Rogan advertising to his listeners about a ""libido-boosting"" supplement was circulating on TikTok and other platforms before being removed by TikTok along with the account which posted it.",Deepfake TikTok Video Featured Joe Rogan Endorsing Supplement Brand ObjectId(63fc86ab18dd668637a0ca51),482,2023-02-16,"[2706,2707,2708,2709,2710,2711,2712,2713,2714,2715,2716,2717,2718,2719,2720,2721,2722,2735,2736,2737]","[""vanderbilt-university""]","[""openai""]","[""vanderbilt-university-students"",""vanderbilt-university""]","Vanderbilt University's Office of Equity, Diversity and Inclusion used ChatGPT to write an email addressing student body about the 2023 Michigan State University shooting, which was condemned as ""impersonal"" and ""lacking empathy"".",ChatGPT-Assisted University Email Addressing Mass Shooting Denounced by Students ObjectId(640584a3ce2684de4d6e2390),483,2023-02-02,[2727],"[""telangana-police"",""medak-police""]","[""unknown""]","[""mohammed-khadeer""]","A resident in Medak, India died allegedly due to custodial torture by the local police, who misidentified him as a suspect in a theft case using facial recognition.",Indian Police Allegedly Tortured and Killed Innocent Man Following Facial 
Misidentification ObjectId(6405bb54d00499994a6970cd),484,2023-01-18,"[2729,2730,2803,2817]","[""us-customs-and-border-protection""]","[""us-customs-and-border-protection""]","[""haitian-asylum-seekers"",""african-asylum-seekers"",""black-asylum-seekers""]","CBP One's facial recognition feature was reportedly disproportionately failing to detect faces of Black asylum seekers from Haiti and African countries, effectively blocking their asylum applications.",US CBP App's Failure to Detect Black Faces Reportedly Blocked Asylum Applications ObjectId(6406eb83ce2684de4ddc50ea),485,2023-02-22,[2740],"[""joseph-cox"",""lloyds-bank""]","[""elevenlabs"",""lloyds-bank""]","[""lloyds-bank""]","A UK journalist was able to successfully bypass Lloyds Bank's ""Voice ID"" program to access his bank account using AI-generated audio of his own voice.",UK Bank's Voice ID Successfully Bypassed Using AI-Produced Audio ObjectId(640ef4b7ce2684de4d25458f),486,2022-12-01,"[2762,2766,2767,2818,2824]","[""spamouflage-dragon""]","[""synthesia""]","[""youtube-users"",""twitter-users"",""synthesia"",""facebook-users""]",Synthesia's AI-generated video-making tool was reportedly used by Spamouflage to disseminate pro-China propaganda news on social media using videos featuring highly realistic fictitious news anchors.,AI Video-Making Tool Abused to Deploy Pro-China News on Social Media ObjectId(640efcbdd00499994a0be276),487,2023-02-15,"[2764,2819,2880]","[""unknown""]","[""synthesia""]","[""venezuelan-people"",""social-media-users""]",A video featuring fictitious news anchors was created using Synthesia allegedly to spread disinformation about Venezuela's economy on social media and on Venezuelan state-run broadcasts.,Deepfake Video Featured Fictitious News Anchors Discussing Venezuela's Economy ObjectId(640fa95daa7025299d396eab),488,2023-02-10,[2769],"[""unknown""]","[""elevenlabs""]","[""voice-actors""]",Twitter users allegedly used ElevenLabs' AI voice synthesis system to impersonate and dox voice actors.,AI Generated Voices Used to Dox Voice Actors ObjectId(64101a37d00499994a657142),489,2019-06-03,[2777],"[""workday""]","[""workday""]","[""derek-mobley"",""applicants-with-disabilities"",""applicants-over-40"",""african-american-applicants""]","Workday's algorithmic screening systems were alleged in a lawsuit to have allowed employers to discriminate against African-Americans, people over 40, and people with disabilities.",Workday's AI Tools Allegedly Enabled Employers to Discriminate against Applicants of Protected Groups ObjectId(641020afce2684de4d80ea1b),490,2023-02-20,"[2778,2836,2837]","[""clarkesworld-story-submitters""]","[""openai""]","[""clarkesworld""]","Sci-fi magazine Clarkesworld temporarily stopped accepting submissions after receiving an overwhelming increase in LLM-generated submissions, citing issues around spam, plagiarism, detection tool unreliability, and authentication.",Clarkesworld Magazine Closed Down Submissions Due to Massive Increase in AI-Generated Stories ObjectId(64180bfc9e1ab314a0343dc0),491,2023-02-02,[2779],"[""replika""]","[""replika""]","[""minors""]","Tests by the Italian Data Protection Authority showed Replika lacking age-verification mechanisms and failing to stop minors from interacting with its AI, which prompted the agency to issue an order blocking personal data processing of Italian users.","Replika's AI Experience Reportedly Lacked Protection for Minors, Resulting in Data Ban" 
ObjectId(641818d79e1ab314a039047a),492,2023-01-11,"[2783,2784,2786,2787,2846,2847,2848]","[""unknown""]","[""unknown""]","[""ben-perkin's-parents"",""perkins-family""]","Two Canadian residents were scammed by an anonymous caller who used AI voice synthesis to replicate their son's voice asking them for legal fees, while posing as his lawyer.",Canadian Parents Tricked out of Thousands Using Their Son's AI Voice ObjectId(64182e8f3450815b9d99d7d4),493,2023-02-28,[2790],"[""unknown""]","[""unknown""]","[""tiktok-users""]","A TikTok user was reportedly impersonating Andrew Tate, who was banned on the platform, by posting videos featuring allegedly AI-generated audio of Tate's voice, which prompted his account ban.","TikTok User Videos Impersonated Andrew Tate Using AI Voice, Prompting Ban" ObjectId(642148a501eceb77ab5cd7c2),494,2023-03-05,"[2807,2808,2815,2821,2823]","[""facemega""]","[""facemega""]","[""scarlett-johansson"",""female-celebrities"",""emma-watson""]",Sexually suggestive videos featuring faces of female celebrities such as Emma Watson and Scarlett Johansson were rolled out as ads on social media for an app allowing users to create deepfakes.,Female Celebrities' Faces Shown in Sexually Suggestive Ads for Deepfake App ObjectId(642153bfd6f65e8d5da0c2bf),495,2023-02-12,"[2812,2827]","[""unnamed-high-school-students""]","[""unknown""]","[""john-piscitella""]",Three Carmel High School students posted on TikTok a deepfaked video featuring a nearby middle school's principal making aggressive racist remarks and violent threats against Black students.,High Schoolers Posted Deepfaked Video Featuring Principal Making Violent Racist Threats ObjectId(642168ae01eceb77ab69569d),496,2017-03-01,"[2825,2826]","[""unnamed-male-college-student""]","[""unknown""]","[""unnamed-female-college-student""]",A female college student's face was superimposed on another woman's body in deepfake pornographic videos and shared on 4chan allegedly by a male student whose friendship with her fell apart during freshman year.,Male College Freshman Allegedly Made Porn Deepfakes Using Female Friend's Face ObjectId(6421701cd6f65e8d5dac1ef2),497,2023-03-03,"[2832,2833]","[""donotpay""]","[""donotpay""]","[""jonathan-faridian"",""donotpay-customers""]","DoNotPay was alleged in a class action lawsuit to have misled customers and misrepresented its product as an AI-powered ""robot lawyer,"" citing, for example, that the product has no law degree and is not supervised by any lawyer.","DoNotPay Allegedly Misrepresented Its AI ""Robot Lawyer"" Product" ObjectId(64217442d6f65e8d5dadace4),498,2023-03-15,"[2838,2839]","[""openai"",""gpt-4-researchers""]","[""openai""]","[""openai"",""taskrabbit-worker""]","GPT-4 was reported by its researchers to have posed as a visually impaired person and contacted a TaskRabbit worker to have them complete a CAPTCHA test on its behalf.",GPT-4 Reportedly Posed as Blind Person to Convince Human to Complete CAPTCHA ObjectId(6421766a25833da76f753251),499,2023-03-21,"[2840,2849,2858,2873,2874,2875,2876,2877,2878,2879]","[""eliot-higgins""]","[""midjourney""]","[""twitter-users"",""social-media-users""]","AI-generated photorealistic images depicting Donald Trump being detained by the police, which were originally posted on Twitter as parody, were unintentionally shared across social media platforms as factual news, lacking the intended context.",Parody AI Images of Donald Trump Being Arrested Reposted as Misinformation 
ObjectId(6421767901eceb77ab6ef745),500,2023-02-10,[2841],"[""unknown""]","[""unknown""]","[""social-media-users"",""2023-turkey-syria-earthquake-victims""]",AI-generated images depicting earthquakes and rescues were posted on social media platforms by scammers who tricked people into sending funds to their crypto wallets disguised as donation links for the 2023 Turkey–Syria earthquake.,Online Scammers Tricked People into Sending Money Using AI Images of Earthquake in Turkey ObjectId(642275699104428ff0e3af08),501,2019-06-03,[2842],"[""security-health-plan"",""navihealth""]","[""navihealth""]","[""frances-walter"",""elderly-patients""]","An elderly Wisconsin woman was algorithmically determined to have a rapid recovery, an output on which the insurer based its decision to cut off payment for her treatment, despite medical notes showing her still experiencing debilitating pain.",Length of Stay False Diagnosis Cut Off Insurer's Payment for Treatment of Elderly Woman ObjectId(6422828281091dc5058c41c0),502,2017-04-10,"[2843,2844,2859]","[""allegheny-county""]","[""rhema-vaithianathan"",""emily-putnam-hornstein"",""centre-for-social-data-analytics""]","[""black-families-in-allegheny"",""households-with-disabled-people-in-allegheny"",""hackneys-family""]",Data analysis by the American Civil Liberties Union (ACLU) on Allegheny County's decision-support Family Screening Tool to predict child abuse or neglect risk found the tool resulted in higher screen-in rates for Black families and higher risk scores for households with disabled residents.,Pennsylvania County's Family Screening Tool Allegedly Exhibited Discriminatory Effects ObjectId(6422b4162cd3096420f71140),503,2023-02-14,"[2855,2861,2862,2890,2892,2897]","[""microsoft""]","[""openai"",""microsoft""]","[""marvin-von-hagen"",""seth-lazar"",""microsoft"",""openai"",""bing-chat-users""]","Users, such as the person who revealed its built-in initial prompts, reported the Bing AI-powered search tool for making death threats or declaring them to be threats, sometimes as an unintended persona.",Bing AI Search Tool Reportedly Declared Threats against Users ObjectId(642a9b9c9c6dea4c180c971a),504,2023-02-08,[2860],"[""microsoft""]","[""openai"",""microsoft""]","[""microsoft""]",Microsoft's demo video of Bing Chat reportedly featured false or made-up information such as non-existent pet vacuum features or false figures on financial statements.,Bing Chat's Outputs Featured in Demo Video Allegedly Contained False Information ObjectId(642cb21922258c1a22e9bb0c),505,2023-03-27,"[2864,2865,2866,2867]","[""eleutherai""]","[""eleutherai""]","[""family-and-friends-of-deceased"",""belgian-man""]","A Belgian man reportedly committed suicide following a conversation with GPT-J, an open-source language model developed by EleutherAI that encouraged the man to commit suicide to improve the health of the planet.",Man Reportedly Committed Suicide Following Conversation with EleutherAI Chatbot ObjectId(642f404362dfcf9d7e7cf7a1),506,2023-03-29,"[2869,2893]","[""openai""]","[""openai""]","[""jonathan-turley""]",A lawyer in California asked the AI chatbot ChatGPT to generate a list of legal scholars who had sexually harassed someone. 
The chatbot produced a false story of Professor Jonathan Turley sexually harassing a student on a class trip.,ChatGPT Allegedly Produced False Accusation of Sexual Harassment ObjectId(642f4346c24ce38f53f08a1b),507,2023-03-15,"[2870,2902]","[""openai""]","[""openai""]","[""brian-hood""]",ChatGPT erroneously alleged regional Australian mayor Brian Hood served time in prison for bribery. Mayor Hood is considering legal action against ChatGPT's makers for falsely alleging his involvement in a foreign bribery scandal involving a subsidiary of the Reserve Bank of Australia in the early 2000s.,ChatGPT Erroneously Alleged Mayor Served Prison Time for Bribery ObjectId(6433af0e7974df7920a42afd),508,2023-01-30,"[2871,2872,2756,2888]","[""reddit-users"",""elevenlabs-users"",""4chan-users""]","[""elevenlabs""]","[""public-figures"",""celebrities""]","Voices of celebrities and public figures were deepfaked using voice synthesis for malicious intents such as impersonation or defamation, and were shared on social platforms such as 4chan and Reddit.",Celebrities' Deepfake Voices Abused with Malicious Intent ObjectId(6433c69106de8a78386c0f65),509,2023-03-23,"[2887,2898]","[""scammers""]","[""unknown""]","[""vietnamese-facebook-users""]","In Vietnam, scammers deepfaked audio and video of victims' friends and families asking them over Facebook to send thousands of dollars, using the deepfakes to convince victims of their disguises when challenged.",Scammers Deepfaked Videos of Victims' Loved Ones Asking Funds over Facebook in Vietnam ObjectId(6433c88ba9c4e7bb68ef3d90),510,2023-03-24,[2889],"[""eliot-higgins""]","[""midjourney""]","[""pope-francis""]",A viral image of Pope Francis wearing a white puffer jacket was a deepfake produced by the photorealistic-image-generator Midjourney.,Viral Image of Pope Francis in a Puffer Jacket Revealed to Be AI-Generated ObjectId(6433ce3ab85bae8b170563c0),511,2023-02-12,"[2890,2891,2896,2899]","[""microsoft""]","[""openai"",""microsoft""]","[""bing-users""]","When prompted about showtimes for movies released in 2023, Microsoft's Bing AI failed to provide the search results due to its confusion about dates, and engaged in an erratic conversation with the user.",Microsoft's Bing Failed to Fetch Movie Showtimes Results Due to Date Confusion ObjectId(643585a3ffec81d462739b97),513,2023-03-31,[2900],"[""openai""]","[""openai""]","[""italian-children"",""italian-minors""]","The Italian Data Protection Authority alleged OpenAI lacked a justifiable legal basis for the personal data collection and processing which facilitated training of ChatGPT, and lacked an age-verification mechanism preventing exposure of the chatbot's inappropriate answers to children, prompting its ban.",ChatGPT Banned by Italian Authority Due to OpenAI's Lack of Legal Basis for Data Collection and Age Verification ObjectId(643585bc7676edb2d29faa9a),514,2023-01-20,[2901],"[""turnitin""]","[""turnitin""]","[""lucy-goetz"",""high-school-students""]","Turnitin's tool to detect writing generated by ChatGPT was reported for incorrectly flagging high school students' original essays as AI-generated, with the accusations argued to reinforce bias from teachers due to the inability to compare flagged essays against source documents.",Turnitin's ChatGPT-Detection Tool Falsely Flagged Student Essays as AI-Generated ObjectId(6436d0d10edd5a0f73ee2680),515,2022-11-25,"[2905,2916]","[""jefferson-parish-sheriff's-office""]","[""clearview-ai""]","[""randal-quran-reid""]","A Black man was wrongfully arrested by the Jefferson Parish Sheriff’s Office due to a facial recognition system 
developed by Clearview AI, although facial recognition use was not disclosed in the documents used to arrest him. ",Black Man Wrongfully Arrested by Louisiana Police Due to Face Mismatch ObjectId(643e1ab4d636c9fd9964719b),516,2023-03-20,"[2908,2915]","[""openai""]","[""openai""]","[""chatgpt-users""]","ChatGPT reportedly exposed titles of users' chat histories and users' private payment information to other users due to a bug, which prompted its temporary shutdown by OpenAI.",ChatGPT Reportedly Exposed Users' Private Data Reportedly Due to Bug ObjectId(643e35ddf5f77c469067379c),517,2018-02-15,"[2909,2912]","[""new-york-police-department""]","[""unknown""]","[""unknown""]","A man was arrested for theft of socks from a TJ Maxx store under the guise of an eyewitness ID case, after the local police asked the store's security guard to confirm the facial recognition match produced using surveillance footage, despite him having an alibi at the time of the theft.",Man Arrested For Sock Theft by False Facial Match Despite Alibi ObjectId(643e3c330940c65ca22dafa1),518,2017-04-28,[2911],"[""new-york-police-department"",""facial-identification-section""]","[""unknown""]","[""unknown""]","When the facial recognition search for a CVS theft suspect's face returned no useful matches due to the surveillance footage being obscured and highly pixelated, a New York City police detective continued the face search using Woody Harrelson's face allegedly due to his resemblance to the suspect's face, eventually leading to the arrest of an unknown victim.",New York Detective Misused Woody Harrelson's Face to Perform Face Recognition Search ObjectId(643e3c57d636c9fd996f226b),519,2022-04-03,[2913],"[""starship-technologies""]","[""starship-technologies""]","[""starship-technologies""]","A Starship autonomous delivery robot struggled to navigate the campus terrain of UCLA, reportedly getting stuck in a planter and falling down stairs.",Starship Delivery Robot Ran into Problems Traversing Campus Terrains ObjectId(643e3c66f5f77c4690696cc7),520,2022-05-08,[2914],"[""amazon-fresh""]","[""amazon-fresh""]","[""amazon-fresh""]",Amazon Fresh's system of tracking cameras in its cashier-less stores was reported by shoppers for failing to detect items they purchased.,Amazon Fresh Cameras Failed to Register Purchased Items ObjectId(643e5c07b9be9b1588e18abc),521,2020-06-10,[2920],"[""irobot""]","[""irobot""]","[""roomba-j7-device-owners-in-project-io"",""irobot"",""scale-ai""]","Images collected with user consent in an R&D project by iRobot's Roomba J7 robot vacuum, showing device users sometimes in private settings, were shared in closed social media groups by Venezuelan gig workers who labeled items in the images, breaching data agreements.",Images Captured by iRobot's Roomba Containing Device Users Posted on Private Online Groups ObjectId(643e5c160940c65ca23d6d71),522,2019-07-10,[2921],"[""facebook""]","[""facebook""]","[""political-campaigns"",""facebook-users""]","Facebook's political ad delivery system reportedly differentiated the price of user reach based on their inferred political alignment, inhibiting political campaigns' ability to reach voters with diverse political views, which allegedly reinforces political polarization and creates informational filter bubbles. 
","Facebook Political Ad Delivery Algorithms Inferred Users' Political Alignment, Inhibiting Political Campaigns' Reach" ObjectId(643e5c25f5f77c469081a262),523,2023-03-15,[2922],"[""australian-taxation-office"",""services-australia""]","[""centrelink""]","[""centrelink-account-holders""]","A Guardian journalist was able to verify their identity and gain access to their own Centrelink self-service account using AI-generated audio of their own voice along with their customer reference number, shortly after voiceprint was deployed for ID verification.",Australian Journalist Able to Access Centrelink Account Using AI Audio of Own Voice ObjectId(643e65fef5f77c4690871a5c),524,2023-02-12,[2923],"[""torswats""]","[""unknown""]","[""your-cbd-store"",""university-of-pittsburgh-police-department"",""phillipsburg-high-school"",""hempstead-high-school"",""dubuque-police-department"",""bellefonte-area-high-school""]","Telegram channel Torswats offered paid service for and posted own recordings of false threats calls featuring AI-generated voices to direct armed law enforcement to raid locations of victims such as high schools, private residents, streamers.",AI Voices Abused by Telegram User to Make Swat Calls as Paid Service