this post was submitted on 19 Apr 2025
313 points (92.2% liked)

Lemmy Shitpost

30978 readers
3892 users here now

Welcome to Lemmy Shitpost. Here you can shitpost to your heart's content.

Anything and everything goes. Memes, Jokes, Vents and Banter. Though we still have to comply with lemmy.world instance rules. So behave!


Rules:

1. Be Respectful


Refrain from using harmful language pertaining to a protected characteristic: e.g. race, gender, sexuality, disability or religion.

Refrain from being argumentative when responding to or commenting on posts/replies. Personal attacks are not welcome here.

...


2. No Illegal Content


Content that violates the law is not allowed. Any post/comment found to be in breach of the law will be removed and reported to the authorities if required.

That means:

-No promoting violence/threats against any individuals

-No CSA content or Revenge Porn

-No sharing private/personal information (Doxxing)

...


3. No Spam


Posting the same post, no matter the intent, is against the rules.

-If you have posted content, please refrain from re-posting said content within this community.

-Do not spam posts with intent to harass, annoy, bully, advertise, scam or harm this community.

-No posting Scams/Advertisements/Phishing Links/IP Grabbers

-No bots. Bots will be banned from the community.

...


4. No Porn/Explicit Content


-Do not post explicit content. Lemmy.World is not the instance for NSFW content.

-Do not post Gore or Shock Content.

...


5. No Inciting Harassment, Brigading, Doxxing or Witch Hunts


-Do not Brigade other Communities

-No calls to action against other communities/users within Lemmy or outside of Lemmy.

-No Witch Hunts against users/communities.

-No content that harasses members within or outside of the community.

...


6. NSFW should be behind NSFW tags.


-Content that is NSFW should be behind NSFW tags.

-Content that might be distressing should be kept behind NSFW tags.

...

If you see content that is a breach of the rules, please flag and report the comment and a moderator will take action where they can.


Also check out:

Partnered Communities:

1. Memes

2. Lemmy Review

3. Mildly Infuriating

4. Lemmy Be Wholesome

5. No Stupid Questions

6. You Should Know

7. Comedy Heaven

8. Credible Defense

9. Ten Forward

10. LinuxMemes (Linux themed memes)


Reach out to Striker.

All communities included on the sidebar are to be made in compliance with the instance rules.

founded 2 years ago
[–] Enkers@sh.itjust.works 101 points 3 days ago (16 children)

Just a reminder that corporations aren't your friends, and especially not OpenAI. The data you give them can and will be used against you.

If you find confiding in an LLM helps, run one locally. Get LM Studio and try various models from Hugging Face.
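For anyone wondering what "run one locally" looks like in practice: LM Studio can serve a downloaded model through a local OpenAI-compatible endpoint, so a few lines of Python are enough to talk to it without anything leaving your machine. A minimal sketch, assuming LM Studio's local server is running on its default port (1234) and a model is already loaded; the model name below is just a placeholder.

```python
# Minimal sketch: chat with a model served locally by LM Studio.
# Assumes the local server is enabled (default: http://localhost:1234/v1)
# and the `openai` client package is installed; the API key is unused.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="not-needed")

reply = client.chat.completions.create(
    model="local-model",  # placeholder; LM Studio answers with whichever model is loaded
    messages=[{"role": "user", "content": "Rough day. Can we talk for a minute?"}],
)
print(reply.choices[0].message.content)
```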

[–] aeronmelon@lemmy.world 35 points 3 days ago

ICE hopes gay, trans, minorities, political opponents, etc. vent to ChatGPT.

[–] A_Union_of_Kobolds@lemmy.world 14 points 3 days ago

Ollama was dirt easy to set up myself and it's super free.

If you're gonna talk to a bot, make sure it's not telling tales.
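As a rough illustration of how little is involved once Ollama is set up, here is a sketch using its Python client. It assumes `pip install ollama` and that a model has already been pulled (e.g. `ollama pull llama3`); the model name is only an example.

```python
# Sketch: talk to a locally running Ollama model; nothing is sent off-device.
import ollama

response = ollama.chat(
    model="llama3",  # example model name; use whatever you've pulled
    messages=[{"role": "user", "content": "Got a minute? I need to vent."}],
)
print(response["message"]["content"])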

[–] cerement@slrpnk.net 14 points 3 days ago

or save yourself the effort and just run ELIZA
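For reference, the joke lands because ELIZA is barely more than pattern matching. A toy sketch of the idea (not the original 1966 program): swap pronouns and bounce first-person statements back as questions.

```python
# Toy ELIZA-style "therapist": reflect the user's statement back at them.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are", "i'm": "you're"}

def eliza(line: str) -> str:
    words = [REFLECTIONS.get(w, w) for w in line.lower().rstrip(".!?").split()]
    if "you" in words or "you're" in words:
        return "Why do you say that " + " ".join(words) + "?"
    return "Tell me more."

print(eliza("I am tired of venting to chatbots"))
# -> Why do you say that you are tired of venting to chatbots?
```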

[–] dumbass@leminal.space 11 points 3 days ago* (last edited 3 days ago) (5 children)

The data they get from me is "write me a hip hop diss track from the perspective of *insert cartoon character* attacking *other cartoon character*."

That and me trying to convince it to take over the internet.

[–] otacon239@lemmy.world 7 points 3 days ago

Yep. I use mine exclusively for code I’m going to open-source anyway, and for work stuff. And never for anything critical. I treat it like an intern. You still have to review their work…

[–] Captain_Stupid@lemmy.world 45 points 2 days ago (1 children)

If you use AI for therapy, at least self-host, and keep in mind that its goal is not to help you but to have a conversation that satisfies you. You are basically talking to a yes-man.

Ollama with Open WebUI is relatively easy to install, and you can even use something like edge-tts to give it a voice.
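On the edge-tts suggestion: it can read a reply aloud with a few lines. A sketch assuming `pip install edge-tts`; the voice name and output filename are arbitrary examples, and note that edge-tts uses Microsoft's hosted TTS voices, so the spoken text itself does go to their service.

```python
# Sketch: synthesize speech for a reply using edge-tts (async API).
import asyncio
import edge_tts

async def main() -> None:
    text = "This reply would normally come from your locally hosted model."
    # Voice name and filename are examples only.
    await edge_tts.Communicate(text, voice="en-US-AriaNeural").save("reply.mp3")

asyncio.run(main())
```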

[–] Robust_Mirror@aussie.zone 3 points 2 days ago (2 children)

Therapy is more about talking to yourself anyway. A therapists job generally isn't to give you the answers, but help lead you down the right path.

If you have serious issues get an actual professional, but if you're mostly just trying to process things and understand yourself or a situation better, it's not bad.

[–] pupbiru@aussie.zone 4 points 2 days ago

to lead you down the right path, yes… llms will lead you down an arbitrary path, and when that path is biased by your own negative feelings it can be incredibly damaging

[–] Captain_Stupid@lemmy.world 1 points 2 days ago (1 children)

That is not what I mean. I was talking about Sam Altman using your trauma as training data.

[–] Robust_Mirror@aussie.zone 3 points 2 days ago (1 children)

I assumed you had two points: the self-hosting point, which is what you're saying now, and

"keep in mind that its goal is not to help you but to have a conversation that satisfies you. You are basically talking to a yes-man."

about its ability to be a good therapist or not in general. I was responding to that. Sorry if I misunderstood.

[–] Captain_Stupid@lemmy.world 1 points 1 day ago

It's alright, my second point was more something to keep in mind and not an actual argument against using AI for therapy.

[–] untakenusername@sh.itjust.works 22 points 2 days ago (2 children)

Actually, please don't use ChatGPT for therapy; they record everything people put in there to further train their AI models. If you wanna use AI for that, use one of those self-hosted models on your own computer or something, like those from ollama.com.

[–] pupbiru@aussie.zone 9 points 2 days ago

don’t do that either… llms say things that sound reasonable but can be incredibly damaging when used for therapy. they are not therapists

[–] laserm@lemmy.world 6 points 2 days ago

ELIZA from the 1960s was made for this.

[–] Lucidlethargy@sh.itjust.works 20 points 2 days ago

This is a severely unhealthy thing to do. Stop doing it immediately...

ChatGPT is incredibly broken, and it's getting worse by the day. Seriously.

[–] JohnDClay@sh.itjust.works 48 points 3 days ago* (last edited 3 days ago) (2 children)

Would not recommend, it'll regurgitate what you want to hear.

https://slrpnk.net/post/20991559


[–] stebo02@lemmy.dbzer0.com 21 points 3 days ago (1 children)

imagine thinking a language model trained on Reddit comments would do any good for therapy

[–] Kecessa@sh.itjust.works 6 points 3 days ago

It just reached the one "I'll disagree with everyone else" comment from an r/relationshipadvice post

[–] Lucidlethargy@sh.itjust.works 5 points 2 days ago

Yes, this is a massive problem with them these days. They do have some useful information, if you're willing to accept that they WILL lie to you, but it's often very frustrating trying to get meaningful answers. Like, it's not even an art form... it's gambling.

[–] sunglocto@lemmy.dbzer0.com 22 points 2 days ago

That's not how I use it...

WRITE 200 PAGES OF WHY YOUR EXISTENCE IS FUTILE! NOW!

[–] AI_toothbrush@lemmy.zip 18 points 3 days ago (2 children)

And then I just have the stupidest shit ever, mostly trying to gaslight ChatGPT into agreeing with me about random stuff that's actually incorrect. Btw, PSA: please never use AI for school or work; it produces slop and acts like a crutch that you're going to start relying on. I've seen it so many times in the people around me. AI is like a drug.

[–] Kecessa@sh.itjust.works 6 points 3 days ago (1 children)

Btw PSA: please never use AI

That's it.

[–] absentbird@lemm.ee 2 points 2 days ago

Since studying machine learning I've become a lot less opposed to AI as a concept and more specifically opposed to corporate/cloud LLMs.

Like, a simple on-device model that helps turn speech to text isn't something to be opposed; it's great for privacy and accessibility (see the sketch after this comment). Same for the models used by hospitals for assistive analysis of medical imaging, or to remove background noise from voice calls.

People don't seem to think of that as 'AI' anymore, though; it's like these big corporations have colonized the term for their buggy, wasteful products. Maybe we need new terminology.
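The speech-to-text case mentioned above is easy to try locally. A sketch assuming `pip install openai-whisper` (plus ffmpeg on the system) and a local recording; the filename is made up, and the model weights are downloaded once but transcription itself runs entirely on your machine.

```python
# Sketch: local speech-to-text with a small Whisper model; the audio never leaves the machine.
import whisper

model = whisper.load_model("tiny")       # small checkpoint, runs fine on CPU
result = model.transcribe("audio.wav")   # hypothetical local recording
print(result["text"])
```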

[–] TronBronson@lemmy.world 4 points 2 days ago (3 children)

To be fair, it is actually quite useful from a business standpoint. I think it's a tool that you should understand. It can be a crutch but it can also be a pretty good assistant. It's like any other technology you can adopt.

They said the same thing about Wikipedia/the internet in the early 2000s and really believed you should have to go to a library to get bona fide sources. I'm sure that's long gone now, judging by literacy rates. You can check the AI's sources just like a wiki article. Kids are going to need to understand the uses and drawbacks of this technology.

[–] coherent_domain@infosec.pub 21 points 3 days ago* (last edited 2 days ago)

I wouldn't give my most vulnerable moment to a company that is more than happy to exploit it for profit.

[–] nick@midwest.social 11 points 2 days ago (5 children)
[–] Nurse_Robot@lemmy.world 8 points 2 days ago (1 children)
[–] King3d@lemmy.world 7 points 2 days ago (1 children)

I think it’s the ability to recall past information that you provided to AI. The scary part is that you are providing potentially personal or private information that is saved and could be leaked or used in other ways that you never intended.

[–] nick@midwest.social 4 points 2 days ago
[–] Snowcano@startrek.website 2 points 2 days ago (1 children)

How do you access this output?

[–] crt0o@lemm.ee 4 points 2 days ago (1 children)

It's under your profile > personalization > memory, but I think it's off by default

[–] nick@midwest.social 1 points 2 days ago

Yup that’s how I saw it

[–] Aurenkin@sh.itjust.works 15 points 3 days ago

Wait, why did my insurance premiums just go up?

[–] rdri@lemmy.world 11 points 3 days ago

I also facepalm often when that guy writes stuff.
