
Wrennie
- Dec 18, 2019
- 1,546
This one sounds rreeeeaaaally bad, holy shit:
What is it with all of the sodomy & bestiality quotes?
This one sounds rreeeeaaaally bad, holy shit: View attachment 85862
Lmfao that first one does sound absolutely terrible. I'd be surprised if the devs haven't received hate threats due to some of the quotes auto-generated by this bot.
Some unironically deep and philosophical ones:
View attachment 85863View attachment 85865View attachment 85866
View attachment 85864
The quote with the cat doesn't let me go. It sounds like a Schopenhauer or Nietzsche quote; pure gold.
This bot turns people into sex offenders, I swear lmao.
It unironically does read like a depraved sex offender, lmao.
View attachment 85951View attachment 85952View attachment 85953View attachment 85954View attachment 85955View attachment 85956
View attachment 85958View attachment 85959
It unironically does read like a depraved sex offender, lmao.
I fear that the inevitable 'Terminator' future in store for us will be more of a molest fest than global annihilation.
(Mentioning Terminator specifically because the AI stated "Skynet will never happen" while it was in the process of generating a quote for me).
Oh lord, I remember Tay.
There was an AI created by Microsoft to post things on Twitter that had to be disabled because it turned racist.
Tay (chatbot) - Wikipedia (en.wikipedia.org)
They did know it was caused by a coordinated attack, but even so, its successor ended up trash-talking Microsoft in much the same way. (Though I agree with the latter AI.)
AI never fails to surprise us.