Montana Attorney General Austin Knudsen has issued a scathing letter to Google, warning the company that its Gemini artificial intelligence (AI) chatbot may have violated state law by failing to provide "high-quality and accurate information."
The letter, obtained by Fox News Digital, was sent to Google CEO Sundar Pichai and Kent Walker, chief legal officer of Alphabet Inc., Google's parent company. It claims Gemini may be in breach of Montana's Unfair Trade Practices and Consumer Protection Act and the state's Human Rights Act.
"Google has offered to Gemini to consumers in Montana and has represented that its goal is to create an AI system that provides "high-quality and accurate" information. But behind the scenes, Google appears to have deliberately intended to provide inaccurate information, when those inaccuracies fit with Google's political preferences," Knudsen wrote.
"This fact was not disclosed to consumers. These representations and omissions may implicate the UTPCPA. Furthermore, if Google directed employees to build an AI system that discriminates based on race or other protected characteristics, that could implicate civil rights laws, including constituting a hostile work environment," he added.
Knudsen listed numerous responses given by Gemini, including historically inaccurate depictions of women as Founding Fathers and minority and female popes. He also noted that the AI refused to create pictures of White families, refused to provide information on the Tiananmen Square Massacre and stated the lab leak theory for COVID-19 "lacks substantial evidence."
The Montana attorney general said these recent revelations suggest that Google prioritized outputs in line with the company's "political bias."
"Google's disclaimers for Gemini are insufficient to put consumers on notice that Gemini will choose its own left-wing political priorities over accurate information. The disclaimers simply state that Gemini may give inaccurate information," Knudsen added. "None of the disclaimers remotely suggest that Google has designed Gemini to give inaccurate information in order to carry out political goals."
Knudsen said that if Google forced employees to design Gemini to "discriminate based on race," the company may have created a "hostile work environment" under state or federal civil rights laws.
The letter set a deadline of March 29, 2024, for the company to respond to 15 questions presented by the attorney general.
The list of questions asks Google, in part, to explain why the AI gave each response cited as an example in the letter, describe changes the company will make to the system, identify the employees who made each design choice, and provide internal communications about Gemini, including communications with the Biden administration and the Chinese government.
Gemini previously went viral after it created historically inaccurate images featuring people of various ethnicities, often downplaying or even ignoring White people. Google acknowledged the issue and paused the generation of images of people last Thursday.
Google has rolled out restrictions on certain political topics.
"As we shared last December, in preparation for the many elections happening around the world in 2024 and out of an abundance of caution, we're restricting the types of election-related queries for which Gemini will return responses," a Google spokesperson told Fox News Digital on Tuesday.
Google did not return Fox News Digital's request for comment on the letter.