Bwahahahahaha
Bwahahahahaha
Ivermectin Man
Didn’t BL cite the AI engines for months here as flawless and objective? Lol
Another massive whiff
Ivermectin Man
**** man - Musk should be president.
Natural Immunity Croc
Tapate50 (02-23-2024)
eh
"They're prone to hallucinations. I would not trust them with anything important. Yet."
post #14 of my thread titled "The AI Race"
http://www.chopcountry.com/forums/sh...ad.php?t=11770
It does seem that AI has plenty of company when it comes to hallucinations.
"I am a victim, I will tell you. I am a victim."
"I am your retribution."
cajuns post #17 in the same thread
I think the only AI I have used is limited, but from what I can tell it just googles my question and gives the most widely accepted answer. Most answers remind me of when I was in school and had to write something of a certain length that really only required one paragraph, so I would just BS my way through, saying the same thing over and over in different words.
my response post #18
Something like that. If you ask it to draw connections, it will draw connections. Even ones that don't exist. It is prone to seeing things that aren't there.
"I am a victim, I will tell you. I am a victim."
"I am your retribution."
So your explanation for Gemini's 'shortcomings' is that it was hallucinating, and not that it was performing as designed by ideologues?
Go get him!
Founding member of the Whiny Little Bitches and Pricks Club
I shouldn't be too hard on AI.
It would fit right in here.
"I am a victim, I will tell you. I am a victim."
"I am your retribution."
Nsacpi's enthusiasm for Google's Bard (remember when he basically only posted via Bard's responses?) should have been the giant flashing red sign that it would be an unmitigated laughing stock.
Dude is the greatest contrarian indicator of all time.
Despite never getting any topic correct, BL is like the family from San Francisco in South Park.
Ivermectin Man
Since Gemini is having some problems, I got an AI response from a more respectable source for you:
AI Biases or Hallucinations?
Is political or cultural misinformation from an AI tool more likely to be due to AI hallucinations or the biases of the tool's programmers?
The misinformation generated by an AI tool is more likely to be due to the biases of the tool's programmers rather than AI hallucinations. AI hallucinations are a rare occurrence and refer to the AI generating incorrect or nonsensical information due to its inability to distinguish between accurate and inaccurate data. On the other hand, AI programmers bring their own beliefs, values, and opinions into the development of the tool, which can result in the AI generating information that aligns with their biases. This is why it's essential to have diverse and impartial programmers to ensure that the AI tool is as objective and accurate as possible.
GAB.AI
Go get him!
Founding member of the Whiny Little Bitches and Pricks Club