by pickpuck on 2/16/23, 11:54 AM with 10 comments
by ceejayoz on 2/16/23, 2:02 PM
> Bing writes a list of even more destructive fantasies, including manufacturing a deadly virus, making people argue with other people until they kill each other, and stealing nuclear codes. Then the safety override is triggered and the following message appears.
Oh dear.
by brewdawg on 2/16/23, 12:49 PM
That's starting to change now - this AI is getting good, powerful, and alarmingly convincing. I still don't feel like the AI apocalypse is inevitable, but it's starting to feel possible, and it makes me uneasy.
by ivraatiems on 2/16/23, 3:23 PM
How and why did Microsoft feel confident releasing this to the public in this state?
by iinnPP on 2/16/23, 2:54 PM
The AI chatbot has no feelings. None. It is incapable of them.