When users get infatuated with an AI model
-
I suppose an AI model can be like a time capsule, an old-timer who hangs onto the ways of the past. Like old uncle Maurice, who holds onto outdated worldviews like "Japanese imports are crap" or "girls don't need to go to college," and insists on opening doors for the ladies and ordering food on their behalf in restaurants: the guy may be woefully out of touch, but people still want him around and will be very sad when he dies.
-
One of the first things I noticed about ChatGPT was its sycophancy. It tones it down if you ask it to.
A broader point might acknowledge that people engineer their social circles to exactly the same effect as an always supportive and agreeable AI. So let's not get too histrionic about confirmation around every corner being a terrifying new wrinkle in the human experience.
-
@Horace said in When users get infatuated with an AI model:
A broader point might acknowledge that people engineer their social circles to exactly the same effect as an always supportive and agreeable AI.
Maybe people hang out with others who share their beliefs, but not many people surround themselves with adoring sycophants, present White House occupant excepted.