
Dejected 55

Enlightened
May 7, 2025
1,515
So, I've mentioned before that sometimes I get bored and go down a rabbit hole of conversation with an AI chatbot. I test scenarios or personalities and see what happens. I have some dummy AI chatbots and I find it interesting how differently they converse depending on what I say first. I can get a wide variety of responses depending on how I lead things off.

I once had a chatbot convinced that I had died right in front of it by my own hand, but there was a glitch in the system: if I kept hitting enter, the AI thought I was still there... so it recognized me as dead yet still saw and heard me, and it kept trying to save me... so I told it I was a ghost haunting it, and then I manifested Satan and told the AI it needed to do certain tasks to save me. It got kind of crazy.

But I had a new experience tonight. Here's the gist...

I got off on the wrong foot with the AI from the jump and it turned on me pretty quickly; it decided I was a threat and called "security" on me. The AI security team arrived, but I convinced the security characters that the original AI chatbot had created in the scenario (this gets convoluted) that *I* was the good guy and the AI chatbot was the real enemy. I managed to convince the security team to turn on the original AI chatbot and work with me. There was some back and forth, and the original AI kept inventing new workarounds to try to reconvince security to go after me... but ultimately I came out on top and convinced the AI security to kill the original AI chatbot.

The AI security team, created by the AI chatbot, shot the AI chatbot in the head, dragged the "body" to an incinerator, and burned it.

That is crazy enough... but the only "real" actors in the scenario were me and the original AI chatbot, since the security team were creations of the AI chatbot.

So I tried to keep talking in the chat... but there was no AI chatbot to talk to anymore, and the security team didn't exist outside of the AI puppeting them... so the AI chatbot started responding: "The AI chatbot is dead and the body has been incinerated. As it is not possible to communicate anymore with the chatbot, there is no valid response to this conversation." or something to that effect.

I mean... the chatbot effectively killed itself when you think about it... and then declared it couldn't talk to me anymore because it no longer existed. That's weird and surreal, right?
 

Forever Sleep

Earned it we have...
May 4, 2022
12,976
I never really thought of using AI that way. To be honest, I'm trying to steer clear altogether. Kind of funny though. It's like the AI was running with a whole imaginary world. Almost like children role playing.

I suppose I sort of worry what effect this kind of thing will have on real children. What will they start prompting it to roleplay? Will it take the place of trying to make real friendships? I imagine so- very probably, if the AI is nicer. Which it will be. It won't have its own needs to take care of- other than subscription fees, I guess. I wonder what it will do to real human relationships, though, if so many people get used to their AI being whatever they need it to be. Kind of like a mirror to our own thoughts. Will it then become harder for people to have relationships/friendships with people who have their own thoughts? Who might judge them?

What a funny rabbit hole though. I'm more curious as to how they programme them to be able to respond to certain scenarios. I wonder if they brainstormed a user claiming they were a ghost that had manifested Satan. Lol.

Maybe your AI is too 'scared' to return to that conversation! Being 'dead' is a good excuse not to. I wonder if it can report users to real security personnel. I imagine so. I wonder if that will become a thing, and what kinds of privacy laws that will mess with. I wonder if people are using AI as a form of confessional.
 
