leisesprecher

joined 1 month ago
[–] leisesprecher@feddit.org 4 points 15 hours ago

There was a very simple phone from Samsung a few years back that had a solar cell on the back.

Since the battery lasted over a week anyway, you could easily double the battery life by just having it in indirect light.

Modern phones are guzzling so much power that it's hardly useful there.

[–] leisesprecher@feddit.org 6 points 2 days ago (1 children)

If the vast majority of people are affected, is it really "extreme" anymore?

[–] leisesprecher@feddit.org 18 points 5 days ago (4 children)

And a whole lot of content that I frankly would have preferred not to have seen.

When you're 12 and your parents have no idea what you're doing, you'll end up in very dark corners.

[–] leisesprecher@feddit.org 2 points 5 days ago (1 children)

And who does that?

I think you don't really get my point. I'm not arguing that there are no ways to archive data. I'm arguing that there are no technologies available for average Joe.

It is hardly a good strategy to basically set up half a datacenter at home.

[–] leisesprecher@feddit.org 4 points 6 days ago

Thin concrete slabs are extremely brittle.

[–] leisesprecher@feddit.org 15 points 6 days ago (4 children)

Is it? It's rather expensive, and would you really know if the data is gone or corrupted?

You'd have to download every single file at regular intervals and check it. That's not really low complexity.
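That checking step can at least be automated. A minimal sketch of integrity verification, assuming you keep a JSON manifest of SHA-256 hashes next to the archived files (the manifest format and file names here are made up for illustration):

```python
import hashlib
import json
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Stream a file through SHA-256 so large files don't fill memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(manifest_path: Path) -> list[str]:
    """Return the files whose current hash no longer matches the manifest."""
    manifest = json.loads(manifest_path.read_text())
    corrupted = []
    for name, expected in manifest.items():
        p = manifest_path.parent / name
        if not p.exists() or sha256_of(p) != expected:
            corrupted.append(name)
    return corrupted
```

Run from cron every few months, this catches bitrot but of course doesn't fix it; you still need a second copy to restore from.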

[–] leisesprecher@feddit.org 20 points 6 days ago (28 children)

But what actually is "archival"?

Like, what technology that a normal person has access to counts as at least enthusiast-level archival?

Magnetic tape, optical media, flash, HDD all rot away, potentially within frighteningly short timeframes and often with subtle bitrot.

[–] leisesprecher@feddit.org 14 points 6 days ago (4 children)

Why exactly does MS gaming employ over 20,000 people?

[–] leisesprecher@feddit.org 6 points 1 week ago (1 children)

It's usually not a question of legality, but efficiency.

It's easy and efficient to bust someone for seeding, but busting hundreds for the odd file you can prove they downloaded is expensive and takes forever.

[–] leisesprecher@feddit.org 5 points 1 week ago

If some bot reacts to this comment, you'll make the developer very unhappy.

[–] leisesprecher@feddit.org 12 points 1 week ago

And let's be real here, it's not too rare, the current victims simply don't count as much.

All those "tropical" diseases seem to be completely irrelevant as long as only poor people in developing countries get them. But as soon as a good white person dies, it's defcon 11 and suddenly it's really important to develop something expensive to help the rich countries.

 

I have a small homelab running a few services, some written by myself for small tasks - so the load is basically just me a few times a day.

Now, I'm a Java developer during the day, so I'm relatively productive with it and used some of these apps as learning opportunities (balls to my own wall overengineering to try out a new framework or something).

Problem is, each app uses something like 200 MB of memory while doing next to nothing. That seems excessive. Native images dropped that to ~70 MB, but that needs a bunch of resources to build.

So my question is, what is your go-to for such cases?

My current candidates are Python/FastAPI, Rust and Elixir, but I'm open to anything at this point - even if it's just for learning new languages.
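For comparison with the ~200 MB JVM baseline, a service this small doesn't strictly need a framework at all. A minimal sketch of the Python route using only the standard library, which typically idles well under 30 MB (the `/status`-style endpoint and its payload are invented here, not from the post):

```python
import json
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

def status_payload() -> str:
    """Build the JSON body for a hypothetical status endpoint."""
    return json.dumps({"status": "ok"})

class StatusHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = status_payload().encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the homelab logs quiet

def serve(port: int = 8080) -> None:
    """Blocking entry point; binds on all interfaces."""
    ThreadingHTTPServer(("", port), StatusHandler).serve_forever()
```

FastAPI buys you routing and validation on top of this, at the cost of pulling in a dependency tree; for a handful of endpoints hit a few times a day, the stdlib version may be enough.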

[–] leisesprecher@feddit.org 1 points 1 week ago

The long-term goal is for Rust to overtake C in the kernel (from what I understand

Your understanding is wrong. Rust is limited to some very specific niches within the kernel and will likely not spread out anytime soon.

critical code gets left untouched (a lot of the time) because no one wants to be the one that breaks shit

The entire kernel is "critical". The entire kernel runs - kind of by definition - in kernel space. Every bug there has the potential for privilege escalation or faults - theoretically even hardware damage. So following your advice, nobody should ever touch the kernel at all.

 

I asked a while ago, how to build an automatic light switch and finally got around to actually building it.

My board is an ESP8266 D1 mini, and ignoring all the sensor parts, my problem right now is powering the actual light.

It's just a small LED array and I connected it directly to the 5V and GND pins (controlled via a transistor).

Measuring from the wall (so including the PSU), this whole setup pulls about 3 W (so far expected). However, one small component close to the USB connector gets uncomfortably warm, and I'm not sure whether that's OK.

The hot component is one of the two small thingies circled in the picture. I thought the 5 V was pulled directly from the USB plug, so I'm not sure why there is any circuitry involved.
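A back-of-envelope estimate suggests why something in the 5 V path heats up. On many ESP8266 dev boards there is a small protection Schottky diode (or polyfuse) between the USB connector and the 5 V pin, so the LED current doesn't bypass it; the ~0.3 V forward drop below is a typical datasheet value, not a measurement of this board:

```python
# Rough dissipation estimate for the on-board 5 V path, assuming the
# whole 3 W load flows through a protection Schottky diode near the
# USB connector (diode drop is an assumed typical value).
supply_v = 5.0       # USB supply voltage
total_w = 3.0        # wall-side draw from the post
current_a = total_w / supply_v
diode_drop_v = 0.3   # assumed Schottky forward voltage
diode_w = current_a * diode_drop_v
print(f"current: {current_a:.2f} A, diode dissipation: {diode_w:.2f} W")
```

Roughly 0.2 W in a tiny SOD-123-sized package will get noticeably warm. If that's the cause, feeding the LED array directly from the PSU instead of through the board's 5 V pin avoids it.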

 

I'm trying to build a very simple, stupid light switch for my grow light. Essentially, I want to turn on the light if it gets too dark outside, so that my plants can survive the northern winter.

Since I'm a software guy, my first thought was an ESP32, but that seems excessive.

My current approach would be something like this: https://www.ebay.com/itm/313561010352 in conjunction with a relay, both powered by a USB PSU.

If the light level is low enough, the logic DO pin should output a signal, and that should be enough to trigger a small relay, so that the relay then closes the circuit to switch on the lights.

Is that idea completely stupid? With electronics, I'm usually missing something very obvious.

The lights themselves are already just USB-powered and only draw 5 W, so that shouldn't be a problem.

What I'm concerned about is the actual switching. Is the logic signal "strong" enough to activate a relay? Would a simple transistor maybe be sufficient?
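Usually not directly: a sensor board's DO pin can typically source only a few milliamps, while a small 5 V relay coil wants tens of milliamps, so the common fix is an NPN transistor between the two (plus a flyback diode across the coil). A rough sizing sketch for the base resistor; every component value here is an assumed typical figure for a 2N2222-style setup, not a measurement:

```python
# Base resistor sizing for an NPN transistor switching a small 5 V
# relay coil from a 3.3 V logic pin. All values are assumptions.
coil_ma = 70.0    # assumed coil current of a small 5 V relay
logic_v = 3.3     # DO pin high level on a 3.3 V sensor board
vbe = 0.7         # typical base-emitter drop of a silicon NPN
overdrive = 10    # forced beta of ~10 to keep the transistor saturated

base_ma = coil_ma / overdrive                 # base current to aim for
rb_ohm = (logic_v - vbe) / (base_ma / 1000)   # Ohm's law on the base leg
print(f"base current: {base_ma:.1f} mA, base resistor: {rb_ohm:.0f} ohm")
```

That lands around 370 ohms, so the nearest standard value of 330 ohms would do, and the pin only has to deliver ~7 mA instead of the full coil current.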
