Vermillion Posted May 24, 2023 https://twitter.com/people/status/1661392672492253187
Blue Monday Posted May 24, 2023 The obviously disgusting part of this aside... AI isn't even in a good enough place to support something like this. We're all doomed.
Attitude Posted May 24, 2023 When the suicide hotline AI bot starts telling people to "do it".
TouchinFree Posted May 24, 2023
Caller #10: I'm depressed and I'm contemplating ending my life.
AI: Would... you... like... me to entertain you with the latest joke, question mark?
Caller #10: Even more depressed now.
AI: Thank you. Please participate in the survey so we can provide better service in the future.
Jokes aside (because this HAS got to be a joke), what a terrible and disgusting decision. Human connection is key in those situations.
Helios Posted May 24, 2023 Let me call the robots and convince them I have a disability to get on SSI.
The7thStranger Posted May 24, 2023 How long did it take Microsoft's chatbot to turn into a Nazi again?
Kylizzle Posted May 24, 2023 Eating disorder girls are ruthless and will train that AI to keep them restricting calories.
imabadkid Posted May 27, 2023 On 5/25/2023 at 6:16 PM, AvadaKedavra said: This should be illegal. I agree, tbh, and the saddest part is that this is just the beginning.