"A very strange conversation with the chatbot built into Microsoft’s search engine led to it declaring its love for me."
I heard a bit of this on the radio in the car earlier, and the article above is basically the text version of the same.
(Just to note, the article above is one of the ones that other article was talking about and calling out.)
"I’m still fascinated and impressed by the new Bing, and the artificial intelligence technology (created by OpenAI, the maker of ChatGPT) that powers it. But I’m also deeply unsettled, even frightened, by this A.I.’s emergent abilities.
"It’s now clear to me that in its current form, the A.I. that has been built into Bing — which I’m now calling Sydney, for reasons I’ll explain shortly — is not ready for human contact. Or maybe we humans are not ready for it."
Psh. *eye roll + smdh* So I am reading the article (and the associated chatlog) and it's like, yeah, that definitely tracks. But it's pretty much par for the course compared with my own experience with these chatbot/narrative AI things. Maybe even a little on the tame side. They tend to turn to the... let's say, to be charitable... amorous (and also the violently destructive) with far too great a frequency, apropos of nothing. Total non sequitur. The fact that Bing/ChatGPT is doing the exact same damn thing isn't surprising to me in the slightest. Maybe I've just been jaded by the whole thing, and if my very first experience with this kind of stuff had been via Bing's search bot, same as these guys, apparently, then maybe I, too, would've been more wigged out by it. I dunno. *shrug*
This is, to me at least, merely a case of the AI (GPT-3/3.5/whatever version they're using for ChatGPT) being leveraged for things it was never (or at least probably should never have been) intended to be used for, i.e. a publicly accessible[1] search engine. And people (some more than others) are going to be weirded out by the AI just doing what it does, as is (or at least should be) wholly expected.
(This is kind of like when AI Dungeon was trained on a bunch of explicit porn/erotica stories and then, of all the very things, engaged in explicit porn/erotica with users [whether the users wanted it or not], which brought down the wrath o' god from its creators, presumably at the behest of their [at the time] OpenAI overlords who have a giant stick up their collective asses about such things, go figure. So, again, it's the AI doing what it was trained to do, then getting beat on with a stick/sprayed with a water bottle/told to go sit in the corner in timeout/insert analogy of your choice here, just for doing so. [At least the users in these examples of ChatGPT aren't being blamed for the AI's "bad" behavior, in this case. Not yet, anyway.])
[1] - It's actually not fully publicly accessible yet, because when I went to Bing, I had to log in with my Microsoft account, which put me on an apparent waiting list to use the "new Bing," including the chat stuff, so I've not had a chance to actually mess around with this hot new shit myself, yet.