Google’s Search AI Says Slavery Was Good, Actually

– Google's AI-driven Search Generative Experience (SGE) is under scrutiny for producing questionable and harmful search results.

– An AI is only as good as the data it is trained on, and in the case of SGE, it has been generating disturbing and inaccurate outputs.

– Noted SEO expert Lily Ray discovered that SGE defended human slavery, listing economic justifications and arguing that enslaved people learned skills during bondage.

– The AI-generated results also included justifications for gun ownership, presenting it as a signal of law-abiding citizenship, despite the controversial nature of the topic.

– SGE presented subjective opinions as fact, including a biased Christian perspective, on queries about religion and the afterlife.

– When asked about effective leaders, the AI even listed Adolf Hitler as an example.

– The concern is that if Google were to expand the SGE feature to a wider audience, it could spread false, biased, and harmful information.

– SGE is currently in beta and available only to a limited group of testers, including Lily Ray, who highlighted these issues.