this post was submitted on 18 Aug 2024

Cybersecurity


Copilot Autofix, a new addition to the GitHub Advanced Security service, analyzes vulnerabilities in code and offers code suggestions to help developers fix them.
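To illustrate the kind of suggestion such a tool produces (a hypothetical sketch of a common autofix pattern, not actual Copilot Autofix output), here is a classic SQL injection flaw and the parameterized-query fix a scanner would typically propose:

```python
import sqlite3

def find_user_unsafe(conn, username):
    # Vulnerable: user input is interpolated directly into the SQL string,
    # so a payload like "x' OR '1'='1" changes the query's meaning.
    return conn.execute(
        f"SELECT id FROM users WHERE name = '{username}'"
    ).fetchall()

def find_user_fixed(conn, username):
    # The kind of fix an autofix tool suggests: a parameterized query,
    # so the driver treats the input strictly as data, never as SQL.
    return conn.execute(
        "SELECT id FROM users WHERE name = ?", (username,)
    ).fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)", [(1, "alice"), (2, "bob")])

payload = "x' OR '1'='1"
print(len(find_user_unsafe(conn, payload)))  # injection matches every row: 2
print(len(find_user_fixed(conn, payload)))   # fixed query matches nothing: 0
```

The fix preserves the function's behavior for legitimate input while closing the injection path, which is why this class of rewrite is a good fit for automated suggestion.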

ShinkanTrain@lemmy.ml 22 points 3 months ago

prex@aussie.zone 7 points 3 months ago

Autofix has now corrected your sentence to:

"We're all going to die."

This is now a perfectly correct sentence in every way.

Thank you for using Autofix.

Gladaed@feddit.org 4 points 3 months ago

True, but unrelated. LLMs aren't sentient. They're just a useful tool at times.

hoch@lemmy.world 0 points 3 months ago

Please point to where the language model hurt you.