[–] 4am@lemm.ee 307 points 2 days ago (3 children)

Imagine how much power is wasted on this unfortunate necessity.

Now imagine how much power will be wasted circumventing it.

Fucking clown world we live in

[–] zovits@lemmy.world 1 points 22 hours ago

From the article it seems like they don't generate a new labyrinth every single time: "Rather than creating this content on-demand (which could impact performance), we implemented a pre-generation pipeline that sanitizes the content to prevent any XSS vulnerabilities, and stores it in R2 for faster retrieval."
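Not Cloudflare's actual code, just a rough sketch of the pattern they describe: pages are generated and sanitized ahead of time, parked in R2, and a Worker only has to fetch one per request. The binding name, key scheme, and page count are all made up for illustration.

```typescript
// Sketch of "pre-generate, store in R2, serve on demand".
// MAZE_BUCKET and the maze/<n>.html keys are hypothetical names.

export interface Env {
  MAZE_BUCKET: R2Bucket; // R2 bucket holding pre-generated, sanitized pages
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    // Pick one of the pre-generated pages; nothing is generated per request.
    const pageId = `maze/${Math.floor(Math.random() * 1000)}.html`;
    const object = await env.MAZE_BUCKET.get(pageId);

    if (object === null) {
      return new Response("Not found", { status: 404 });
    }

    return new Response(object.body, {
      headers: { "content-type": "text/html; charset=utf-8" },
    });
  },
};
```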

[–] Demdaru@lemmy.world 55 points 2 days ago (2 children)

On one hand, yes. On the other... imagine the frustration of the management at companies making and selling AI services. This is such a sweet thing to imagine.

[–] halfapage@lemmy.world 86 points 2 days ago (2 children)

My dude, they'll literally sell services to both sides of the market.

[–] Melvin_Ferd@lemmy.world -1 points 2 days ago (2 children)

I just want to keep using uncensored AI that answers my questions. Why is this a good thing?

[–] explodicle@sh.itjust.works 10 points 2 days ago (1 children)

Because it only harms bots that ignore the "no crawl" directive, so your AI remains uncensored.
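(The "no crawl" directive here is just robots.txt. A minimal example that tells AI crawlers to stay out; GPTBot and CCBot are real crawler user-agent strings, but which bots a site blocks is up to the operator:)

```
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: *
Allow: /
```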

[–] CileTheSane@lemmy.ca 5 points 2 days ago (2 children)

Because it's not AI, it's LLMs, and all LLMs do is guess what word most likely comes next in a sentence. That's why they are terrible at answering questions and do things like suggest adding glue to the cheese on your pizza because somewhere in the training data some idiot said that.

The training data for LLMs come from the internet, and the internet is full of idiots.
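To make the "guess what word most likely comes next" point concrete, here's a toy sketch that picks the next word from bigram counts. Real LLMs do this over subword tokens with a trained neural network, but the core loop is the same idea; the tiny corpus and names here are made up for illustration.

```typescript
// Toy next-word predictor: count which word follows which in a tiny corpus,
// then repeatedly emit the most frequent follower. Illustrative only.

const corpus = "the cat sat on the mat and the cat slept".split(" ");

// Build bigram counts: follows.get(w).get(next) = how often `next` followed `w`.
const follows = new Map<string, Map<string, number>>();
for (let i = 0; i < corpus.length - 1; i++) {
  const [w, next] = [corpus[i], corpus[i + 1]];
  const m = follows.get(w) ?? new Map<string, number>();
  m.set(next, (m.get(next) ?? 0) + 1);
  follows.set(w, m);
}

// Greedily pick the most likely next word at each step.
function generate(start: string, length: number): string[] {
  const out = [start];
  let current = start;
  for (let i = 0; i < length; i++) {
    const candidates = follows.get(current);
    if (!candidates) break;
    current = [...candidates.entries()].sort((a, b) => b[1] - a[1])[0][0];
    out.push(current);
  }
  return out;
}

console.log(generate("the", 5).join(" ")); // "the cat sat on the cat"
```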

[–] Lifter@discuss.tchncs.de 4 points 2 days ago

LLM is a subset of AI

[–] Melvin_Ferd@lemmy.world -1 points 2 days ago (1 children)

That's what I do too, just with less accuracy and knowledge. I don't get why I have to hate this. Feels like a bunch of cavemen telling me to hate fire because it might burn the food.

[–] CileTheSane@lemmy.ca 3 points 1 day ago (1 children)

Because we have better methods that are easier, cheaper, and less damaging to the environment. They are solving nothing and wasting a fuckton of resources to do so.

It's like telling cavemen they don't need fire because you can mount an expedition to the nearest volcano to cook the food without the need for fuel, then bring it back to them.

The best-case scenario is that the LLM tells you information that is already available on the internet, but 50% of the time it just makes shit up.

[–] Melvin_Ferd@lemmy.world -2 points 1 day ago (1 children)

Wasteful?

Energy production is an issue. Using that energy isn't. LLMs are a better use of energy than most of the useless shit we produce every day.

[–] CileTheSane@lemmy.ca 1 points 19 hours ago

Did the LLMs tell you that? It's not hard to look up on your own:

Data centers, in particular, are responsible for an estimated 2% of electricity use in the U.S., consuming up to 50 times more energy than an average commercial building, and that number is only trending up as increasingly popular large language models (LLMs) become connected to data centers and eat up huge amounts of data. Based on current data center investment trends, LLMs could emit the equivalent of five billion U.S. cross-country flights in one year.

https://cse.engin.umich.edu/stories/power-hungry-ai-researchers-evaluate-energy-consumption-across-models

Far more than straightforward search engines that have the exact same information and don't make shit up half the time.