3DPrinting
3DPrinting is a place where makers of all skill levels and walks of life can learn about and discuss 3D printing and development of 3D printed parts and devices.
The r/functionalprint community is now located at !functionalprint@fedia.io
There are CAD communities available at: !cad@lemmy.world or !freecad@lemmy.ml
Rules
- No bigotry, including racism, sexism, ableism, homophobia, transphobia, or xenophobia. See the Code of Conduct.
- Be respectful, especially when disagreeing. Everyone should feel welcome here.
- No porn (NSFW prints are acceptable but must be marked NSFW).
- No ads, spamming, or guerrilla marketing.
- Do not create links to Reddit.
- If you see an issue, please flag it.
- No guns.
- No injury gore posts.
If you need an easy way to host pictures, https://catbox.moe/ may be an option. Be ethical about what you post, and donate if you are able or use the service a lot; it is run by an individual, not a company. The image embedding syntax for Lemmy is standard Markdown: `![alt text](image URL)`.
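For example, embedding a catbox-hosted picture in a Lemmy post or comment looks like this (the filename here is hypothetical; catbox serves uploads from files.catbox.moe):

```markdown
![My latest print](https://files.catbox.moe/abc123.png)
```

The text in square brackets is the alt text shown if the image fails to load; the URL in parentheses is the direct link to the hosted file.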
Moderation policy: Light, mostly invisible
That's the thing: I don't think you're giving LLMs poisoned data, you're just giving them data. If a person can parse your messages for meaning, LLMs will benefit from them too, and be a step closer to being able to mimic that form of communication.
I don't think you can truly poison data for LLMs while also having a useful conversation. If your text conveys useful information, it's just more data that moves the LLMs trained on it closer to being able to parse that information. I think only nonsense communication would actually make LLMs worse.