Syrus@lemmy.world to Technology@lemmy.world • Jailbroken AI Chatbots Can Jailbreak Other Chatbots (English)
12 points · 10 months ago
You would need to know the recipe to avoid making it by accident.
German guilt…
Man, shut the fuck up, you're clueless.