Aux

joined 7 months ago
[–] Aux@feddit.uk -1 points 1 month ago (1 children)

Fury? I mean, the only slop here is the lemmings.

[–] Aux@feddit.uk 2 points 2 months ago

It does respect robots.txt, but that doesn't mean content blocked in robots.txt stays out of the index. robots.txt controls crawling, not indexing, and it only governs URLs on its own host. Here's an example.

Site X links to sitemap.html on its front page, and sitemap.html is disallowed in X's robots.txt. When the Google crawler visits site X, it loads robots.txt first, follows its instructions, and skips sitemap.html.

Now there's site Y, and it also links to sitemap.html on X. Y's robots.txt doesn't block anything on X (and it cannot, since robots.txt is scoped per host), so Google still discovers that URL through Y. It won't fetch the page content, but it can index the bare URL, with anchor text taken from Y, and show it in search results.

This behaviour is intentional.
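The per-host scoping is easy to see with Python's stdlib robots.txt parser; the hostnames and rules below are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

# robots.txt served by site X (hypothetical host x.example)
x_rules = RobotFileParser()
x_rules.parse(["User-agent: *", "Disallow: /sitemap.html"])

# robots.txt served by site Y (y.example): it allows everything, and it
# could never block a URL on X anyway, because robots.txt only applies
# to the host that serves it
y_rules = RobotFileParser()
y_rules.parse(["User-agent: *", "Disallow:"])

# X's own rules are what decide whether /sitemap.html may be crawled
print(x_rules.can_fetch("Googlebot", "https://x.example/sitemap.html"))  # False
print(y_rules.can_fetch("Googlebot", "https://x.example/sitemap.html"))  # True
```

Whether a given crawler then honours X's rules for links it discovered elsewhere is up to the crawler.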

[–] Aux@feddit.uk 2 points 2 months ago (2 children)

What kind of code are you writing that your CPU goes to sleep? If you follow good practices like TDD, atomic commits, etc., and your code base is larger than hello world, your PC will be running at its peak quite a lot.

Example: linting on every commit plus TDD. You'll be making loads of commits every day; linting a decent code base will push your CPU to 100% for a few seconds, and running the tests, even with caches, will push it to 100% for a few minutes. On top of that comes compilation to actually run the app; some apps take hours to compile.

In general, text editing is a small part of the developer workflow. Only junior devs spend a lot of time typing stuff.
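That commit-time workflow is typically wired up as a Git pre-commit hook. A minimal sketch, assuming a Python project; the tool names (ruff, pytest) and their flags are illustrative, not anything from the comment:

```python
"""Hypothetical .git/hooks/pre-commit script (tool names are illustrative)."""
import subprocess

STEPS = [
    ["ruff", "check", "."],    # lint pass: pegs the CPU for seconds on a big repo
    ["pytest", "-q", "--lf"],  # test pass: even the "last failed" subset runs hot
]

def run_steps(steps, runner=subprocess.run):
    """Run each command in order; any non-zero exit code aborts the commit."""
    for cmd in steps:
        if runner(cmd).returncode != 0:
            return 1
    return 0

# In the real hook you would end with: sys.exit(run_steps(STEPS))
```

Because the hook runs on every atomic commit, those CPU spikes repeat dozens of times a day.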

[–] Aux@feddit.uk -5 points 2 months ago (3 children)

Training AI on AI output is how most modern models are built; distillation and synthetic data pipelines are standard practice. Man, people here are ridiculously ignorant...
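One standard form of this is knowledge distillation: a student model is trained to match a teacher model's temperature-softened output distribution. A minimal sketch of the loss in plain Python (the logits are made-up numbers):

```python
import math

def softmax(logits, T=1.0):
    """Temperature-softened softmax; higher T flattens the distribution."""
    zs = [z / T for z in logits]
    m = max(zs)                       # subtract max for numerical stability
    exps = [math.exp(z - m) for z in zs]
    s = sum(exps)
    return [e / s for e in exps]

def distillation_loss(teacher_logits, student_logits, T=2.0):
    """KL(teacher || student) between temperature-softened distributions."""
    p = softmax(teacher_logits, T)    # teacher's soft labels
    q = softmax(student_logits, T)    # student's current predictions
    return sum(pi * (math.log(pi) - math.log(qi)) for pi, qi in zip(p, q))

# A student that matches the teacher exactly incurs zero loss.
print(distillation_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))  # 0.0
```

Minimising this loss over a dataset of teacher outputs is literally "AI trained by AI".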

[–] Aux@feddit.uk -2 points 2 months ago

AI won't see the Markov chains - a trap site like that gets dropped at the crawling and data-filtering stage, long before training.
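Crawl pipelines drop pages like that with cheap quality heuristics before any training happens. A hypothetical sketch of one such filter, flagging text dominated by repeated trigrams (the threshold is made up):

```python
def looks_like_spam(text, max_dup_ratio=0.3):
    """Hypothetical quality filter: flag pages whose text is dominated by
    repeated 3-word phrases, a telltale sign of low-order generated filler."""
    words = text.lower().split()
    if len(words) < 6:
        return False                  # too short to judge
    trigrams = [tuple(words[i:i + 3]) for i in range(len(words) - 2)]
    dup_ratio = 1 - len(set(trigrams)) / len(trigrams)
    return dup_ratio > max_dup_ratio

print(looks_like_spam("buy now " * 50))                                # True
print(looks_like_spam("The quick brown fox jumps over the lazy dog"))  # False
```

Real pipelines combine many such signals (perplexity scores, dedup, domain blocklists), but the effect is the same: trap output never reaches the training set.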

[–] Aux@feddit.uk -4 points 2 months ago (1 children)

Making bad financial decisions is not poverty.
