
Revision as of 15:58, 13 November 2024

Big picture

STATUS: Unfinished


Full timeline

Inclusion criteria

The timeline documents incidents of human rights violations in which AI was involved. Rows were included according to the following criteria:

  • AI involvement: The incident must involve the significant use of AI technologies.
  • Human rights impact: The incident must have violated human rights as defined by international law and standards, such as the Universal Declaration of Human Rights (UDHR) and subsequent treaties. Examples of abuses include unlawful killings or injuries; arbitrary detention or torture; discrimination based on race, ethnicity, religion, gender, or other protected characteristics; restrictions on freedom of expression or assembly; and privacy violations.
  • State or corporate responsibility: The incident must involve a state or corporate entity that used AI technology to abuse human rights.
  • Verifiable evidence: Only incidents with credible and verifiable evidence are included, drawn from sources such as news articles, human rights reports, official documents, and academic research.
  • The geographical range is global.
  • Relevance or significance: Incidents with significant human rights violations are prioritized.


Timeline of AI ethics violations

Onset: 2016 | Region: Xinjiang, China | Perpetrators: Chinese Government | Name: Mass Surveillance in China of Ethnic Minorities
Chinese police and other officials use the AI-powered application Integrated Joint Operations Platform (IJOP) for mass surveillance of the predominantly Turkic Muslim and Uyghur population of Xinjiang.[1] The IJOP collects personal information, location, identities, electricity and gas usage, personal relationships, and DNA samples (which can be used to infer ethnicity), then flags suspicious individuals, activities, or circumstances.[1] The IJOP defines foreign contacts, donations to mosques, lack of socialization with neighbors, and frequent use of the front door as suspicious.[1] Individuals deemed suspicious are investigated and can be sent to mass political education camps and facilities where millions of Turkic Muslims and Uyghurs are subjected to movement restrictions, political indoctrination, and religious repression.[1] Techno-authoritarian surveillance occurs throughout China, contrary to the internationally guaranteed right to privacy. The Central Bank of China has adopted a digital currency that allows Beijing to exclude blocklisted individuals from social services and to control financial transactions.[2]

Onset: 2020 | Region: United States | Perpetrators: U.S. Immigration and Customs Enforcement | Name: ICE uses Clearview
The American Civil Liberties Union (ACLU) files a Freedom of Information Act (FOIA) request after U.S. Immigration and Customs Enforcement (ICE) purchases Clearview AI technology.[3] Clearview AI is a facial recognition software company.[4] Its technology, employed by law enforcement agencies and private companies, scoured the internet for over 3 billion images, including those from social media sites, often in violation of platform rules.[5] Using the controversial data-scraping tool, ICE can now deploy mass surveillance to identify and detain immigrants.[3] United States government agencies have a history of mass surveillance. In 2017, the DHS, ICE, and the Department of Health and Human Services used Palantir technology to tag, track, locate, and arrest 400 people in an operation that targeted family members and caregivers of unaccompanied migrant children.[6] The FBI and ICE have searched state and federal driver’s license databases to find undocumented immigrants using facial recognition.[7][8][9] Facial recognition technology has been shown to be less accurate at identifying women and individuals with darker skin,[8] and therefore discriminates against women and minorities.

Onset: 2022 | Region: Ukraine | Perpetrators: Russia | Name: Russia’s Use of AI in the Ukraine Invasion
The February 2022 Russian invasion of Ukraine brings a new age of AI in wartime. While cyber-attacks against Ukraine predated the invasion, Russia deploys AI-driven cyber-attacks on Ukrainian infrastructure, communications, and allies at an increased rate.[10] After the 2019 Sovereign Internet Law isolated the Russian internet from the world, Russia expanded its AI tools for domestic repression and surveillance, its content-blocking mechanisms, and its capacity to sift through dissent.[8] This isolation gives Russia enhanced censorship and monitoring of the Russian public and information landscape with regard to the invasion. There are reports of the Russian Ministry of Defense using AI for data analysis and decision-making in the battlespace and prioritizing autonomous-weapons research.[11] Russia is suspected of using unmanned aerial vehicles (UAVs) equipped with AI-powered cameras and sensors for reconnaissance missions, and of using neural networks to identify strike targets.[12][13] OpenAI would report in May 2024 on two covert influence operations from Russia using AI to spread information on social media defending the invasion.[14]

Onset: October 2023 | Region: Gaza Strip | Perpetrators: Israeli Defense Forces | Name: AI-assisted targeting in the Gaza Strip
Israel implements AI-assisted targeting in the Gaza Strip during the Israeli bombing of the Gaza Strip.[15] The IDF itself has acknowledged using AI to accelerate targeting, increasing the tempo of operations and the pool of targets for assassination.[16] The Israeli military uses the AI tool Pegasus to locate and collect data on individuals. It feeds this data through automated targeting platforms such as Where’s Daddy, Gospel, and Lavender, which use facial recognition, geolocation, and cloud computing to generate targets, including journalists, human rights defenders, academics, diplomats, union leaders, politicians, and heads of state.[17] Lavender relies on a surveillance network and assigns each inputted Gazan a score from 1 to 100 estimating how likely they are to be a Hamas militant.[18] The tool has generated a kill list of as many as 37,000 people, and Israeli intelligence officials report that it has a 10% error rate (the true rate could be higher, depending on the IDF’s classification of Hamas militants).[19] The Lavender score is fed into “Where’s Daddy”, which uses AI to determine when the individual has returned home, marking them for assassination.[20] As of April 2024, the Israeli military hopes to sell its targeting tools to foreign entities.[19]

Notable Software

Pegasus: Used to spy on dissidents and journalists[21]

Perpetrating State | Use
Saudi Arabia | The assassination of Jamal Khashoggi
United Arab Emirates | To monitor and detain Ahmed Mansoor
Mexico | To monitor journalists investigating corruption
Morocco | To surveil and capture journalist Omar Radi after he criticized the government
Spain | To spy on Catalan separatists
Israel | In the AI-assisted targeting in the Gaza Strip
Germany | Purchased the spyware for use by the Federal Criminal Police Office (Germany)
Hungary | Surveilling journalists
Belgium | Surveilling journalists
Poland | Surveilling journalists

Other notable software includes Lavender, Where’s Daddy, Palantir, Clearview AI, and Oculus (used in Russia to monitor LGBTQ people).


See also

References

  1. 1.0 1.1 1.2 1.3 "China's Algorithms of Repression: Reverse Engineering a Xinjiang Police Mass Surveillance App". hrw.org. 1 May 2019. Retrieved 23 October 2024.
  2. Wang, Maya (8 April 2021). "China's Techno-Authoritarianism Has Gone Global". hrw.org. Retrieved 23 October 2024.
  3. 3.0 3.1 "Freedom of Information Act request regarding use of Clearview AI Facial Recognition Software" (PDF). immigrantdefenseproject.org. 19 October 2020. Retrieved 8 November 2024.
  4. Scott, Jeramie (17 March 2022). "Is ICE Using Facial Recognition to Track People Who Allegedly Threaten Their Agents?". epic.org. Retrieved 8 November 2024.
  5. Lyons, Kim (14 August 2020). "ICE just signed a contract with facial recognition company Clearview AI". theverge.com. Retrieved 9 November 2024.
  6. Del Villar, Ashley; Hayes, Myaisha (22 July 2021). "How Face Recognition Fuels Racist Systems of Policing and Immigration — And Why Congress Must Act Now". aclu.org. Retrieved 8 November 2024.
  7. Scott, Jeramie (17 March 2022). "Is ICE Using Facial Recognition to Track People Who Allegedly Threaten Their Agents?". epic.org. Retrieved 8 November 2024.
  8. 8.0 8.1 8.2 "Artificial intelligence (AI) and human rights: Using AI as a weapon of repression and its impact on human rights" (PDF). europarl.europa.eu. May 2024. Retrieved 6 November 2024.
  9. Lyons, Kim (14 August 2020). "ICE just signed a contract with facial recognition company Clearview AI". theverge.com. Retrieved 9 November 2024.
  10. Ashby, Heather (6 March 2024). "From Gaza to Ukraine, AI is Transforming War". inkstickmedia.com. Retrieved 13 November 2024.
  11. Bendett, Sam (20 July 2023). "Roles and Implications of AI in the Russian-Ukrainian Conflict". russiamatters.org. Retrieved 13 November 2024.
  12. Bendett, Sam (20 July 2023). "Roles and Implications of AI in the Russian-Ukrainian Conflict". russiamatters.org. Retrieved 13 November 2024.
  13. Ashby, Heather (6 March 2024). "From Gaza to Ukraine, AI is Transforming War". inkstickmedia.com. Retrieved 13 November 2024.
  14. "Russia using generative AI to ramp up disinformation, says Ukraine minister". reuters.com. 16 October 2024. Retrieved 13 November 2024.
  15. Katibah, Leila (October 2024). "The Genocide Will Be Automated—Israel, AI and the Future of War". merip.org. Retrieved 18 October 2024.
  16. Echols, Connor (3 April 2024). "Israel using secret AI tech to target Palestinians". responsiblestatecraft.org. Retrieved 13 November 2024.
  17. Katibah, Leila (October 2024). "The Genocide Will Be Automated—Israel, AI and the Future of War". merip.org. Retrieved 18 October 2024.
  18. Echols, Connor (3 April 2024). "Israel using secret AI tech to target Palestinians". responsiblestatecraft.org. Retrieved 13 November 2024.
  19. 19.0 19.1 "'AI-assisted genocide': Israel reportedly used database for Gaza kill lists". aljazeera.com. 4 April 2024. Retrieved 6 November 2024.
  20. Echols, Connor (3 April 2024). "Israel using secret AI tech to target Palestinians". responsiblestatecraft.org. Retrieved 13 November 2024.
  21. "Artificial intelligence (AI) and human rights: Using AI as a weapon of repression and its impact on human rights" (PDF). europarl.europa.eu. May 2024. Retrieved 6 November 2024.