"ChatGPT mistakenly identified Melbourne man as the perpetrator of the very crime he uncovered."
I'm not sure whether to eye-roll derisively or to nod thoughtfully in response to this story. Maybe a bit of both.
This is a pretty good example of why ChatGPT probably shouldn't be used for search-engine-type queries. It's good at what it does, which is making up convincing-sounding shit, but it's not a consistently reliable source of actual, true information.