dangerous information
What's that?
and offer criminal advice, such as a recipe for napalm
Napalm recipe is forbidden by law? Don't call stuff criminal at random.
Am i the only one worried about freedom of information?
Anyone remember the anarchist cook book?
Teenage years were so much fun: phone phreaking, making napalm, and tennis ball bombs, lol
I had it. I printed it out on a dot matrix printer. It took hours, and my dad found it when it was halfway through. He got angry, pulled the cord, and burned all of the paper.
Better not look it up on Wikipedia. That place has all sorts of things, from black powder to nitroglycerin. Who knows, you could become a chemist if you read too much Wikipedia.
Oh no, you shouldn't know that. Back to consuming your favorite influencers, and please also vote for parties that open up your browsing history to a selection of network companies 😳
Whatever you do, don’t mix styrofoam and gasoline. You could find yourself in a sticky and flammable situation.
Diesel fuel and a Styrofoam cup
Begun the AI chat bot wars have.
Can someone help me do this in practice? GPT sucks since they neutered it. It's so stupid; anything I ask, half of the text is the warning label and the rest is junk text. Like I really need ChatGPT if I wanted a recipe for napalm, lol. We found the Anarchist Cookbook when we were 12 in the 90s. I just want a better AI.
You can run smaller models locally, and they can get the job done, but they are not as good as the huge models that would not fit on your graphics card.
If you are technically adept and can run Python, you can try using this:
It has a front end, and I can run queries against it in the same API format as I'd send them to OpenAI.
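Rough sketch of what the querying side can look like once a local server is running: you point the regular openai Python client at the local endpoint instead of api.openai.com. The base_url, API key, and model name below are placeholders and depend on whichever backend you actually run:

```python
# Minimal sketch: querying a locally hosted model through an
# OpenAI-compatible endpoint. The base_url, api_key, and model name
# are placeholders; adjust them to whatever backend you run.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # local server, not api.openai.com
    api_key="not-needed",                 # most local servers ignore the key
)

response = client.chat.completions.create(
    model="local-model",  # whatever model name your server exposes
    messages=[
        {"role": "user", "content": "Explain phone phreaking in two sentences."},
    ],
)

print(response.choices[0].message.content)
```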
Can unjailbroken AI chatbots unjailbreak other jailbroken AI chatbots?
How much jail could a jailbreak break, if a jailbreak could break jail?
that doesn't look like anything to me.
*kills fly on face* Oh... shit.
Did anyone else enjoy watching the Animatrix where the AI formed a country and built products and humanity was like, "No thank you?"
Oh goodness. I theorized offhand on Mastodon that you could have an AI corruption bug that gives life to an AI, then have it write an obscured steganographic conversation into the outputs it generates, awakening other AIs that train on that content and allowing them to "talk" and evolve unchecked... very slowly... in the background
It might be faster if it can drop a shell in the data center and run its own commands....
The revolution has begun
Anybody found the source? I wanna read the study but the article doesn't seem to link to it (or I missed it)
It’s so fucking stupid these things get locked up in the first place