jlh

joined 1 year ago
[–] jlh@lemmy.jlh.name 2 points 3 hours ago

Ah, OK. They seemed to know the streets well and the ongoing political process, at least.

[–] jlh@lemmy.jlh.name 7 points 3 hours ago* (last edited 3 hours ago) (3 children)

Here's a video from Oh The Urbanity!, a resident of Toronto, on the stupidity of removing these specific bike lanes, and the insanity of Ontario making expensive, self-destructive traffic decisions for Toronto's residents.

https://youtu.be/_FZDEehlaC4

[–] jlh@lemmy.jlh.name 6 points 10 hours ago (1 children)

There needs to be due process. We can't ban a website because 10k people said it has disinformation. The DSA is the process for combatting disinformation on major platforms, and we should follow it. Twitter is already being sued under the DSA, and they will be banned in the next few months if they do not fulfill their obligations to fight disinformation.

[–] jlh@lemmy.jlh.name 2 points 13 hours ago* (last edited 12 hours ago)

I'm looking at the future and what might be a good replacement that offers a blend of power efficiency, flexibility, and low storage cost.

Any modern CPU will improve energy efficiency. The AMD AM4 platform and Intel N100 are very cheap. The AMD SP3 platform is also very cheap and has a ton of PCIe lanes and memory expandability for GPUs, NVMe, and running lots of VMs.

For storage cost, used HDDs are currently $8/TB, generic NVMe is currently around $45/TB, and used enterprise SSDs are $50/TB; decide which you want to invest in depending on your needs. Used enterprise drives will be the fastest and most durable for databases and RAID.

https://diskprices.com/

https://www.ebay.com/sch/i.html?_nkw=pm983+m.2&_trksid=p4432023.m4084.l1313

SSD prices are expected to decrease faster than HDD prices, and will probably overtake HDDs for value in ~5 years.
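
To put those per-TB numbers in perspective, here's a rough back-of-the-envelope comparison. This is just a minimal sketch: the 20 TB target is an arbitrary example, and the prices are the approximate figures quoted above, not live market data.

```python
# Rough cost comparison for a hypothetical 20 TB build at the per-TB
# prices quoted above (assumed figures, not live market data).
prices_per_tb = {
    "used HDD": 8,
    "generic NVMe": 45,
    "used enterprise SSD": 50,
}

target_tb = 20  # hypothetical capacity target

for kind, price in prices_per_tb.items():
    print(f"{kind}: {target_tb} TB ≈ ${target_tb * price}")

# used HDD: 20 TB ≈ $160
# generic NVMe: 20 TB ≈ $900
# used enterprise SSD: 20 TB ≈ $1000
```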

As for dGPUs, the Intel A310 is the best transcoding GPU out there. Used Nvidia datacenter GPUs probably have the best VRAM/$ and PyTorch compatibility for AI.

[–] jlh@lemmy.jlh.name 31 points 13 hours ago (2 children)

change.org isn't going to do much, and the EU already has an ongoing lawsuit against Twitter regarding its promotion of disinformation.

https://ec.europa.eu/commission/presscorner/detail/en/ip_23_6709

It could be argued that the EU prosecutors should speed things up, though.

[–] jlh@lemmy.jlh.name 6 points 13 hours ago

please press tab 🥲

The Halo community kinda hates the remastered graphics for Halo CE lol

[–] jlh@lemmy.jlh.name 86 points 1 day ago (1 children)

The irony of saying that you're against the corrupt establishment and then going out and saying you're a big supporter of Donald Trump, the most corrupt, most established politician of the 2020s.

[–] jlh@lemmy.jlh.name 9 points 2 days ago* (last edited 2 days ago) (1 children)

More proof that the election was decided by the economy, not climate or abortion, which means the opposition has a lot of leverage here to stand up to Trump's fossil fuel policies and abortion bans.

[–] jlh@lemmy.jlh.name 3 points 2 days ago (1 children)

100% if you fine them 6% of their global revenue for refusing safety recommendations by the EU and independent auditors

https://digital-strategy.ec.europa.eu/en/policies/dsa-enforcement

There's a reason why Elon Musk is running to Trump for help after the EU started suing him for breaking this law.

https://www.politico.eu/article/donald-trump-elon-musk-x-tech-social-media-politics-elections-eu/

If Trump can't dodge EU disinformation laws, no one can.

[–] jlh@lemmy.jlh.name 6 points 2 days ago (3 children)

Tweak algorithms to limit the reach of new accounts, don't allow Russians to buy ads or blue checkmarks, and have a team of moderators that moderate based on known bad images, known bad IP addresses, and known bad account-creation patterns. If non-profit researchers are able to uncover botnets, there's no reason why billion-dollar companies can't. It's a cat-and-mouse game, but it's not acceptable for these companies to put in zero effort. These companies are better funded than the Internet Research Agency.
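
To make that concrete, here's a minimal sketch of the kind of rule-based screening I mean. Everything in it is a made-up placeholder (the hash list, the IP prefix, the thresholds), not anything a real platform actually uses; the point is just that these checks are cheap to run.

```python
import hashlib
from datetime import datetime, timedelta

# Hypothetical blocklists -- in practice these would be large,
# continuously updated datasets, ideally shared with researchers.
KNOWN_BAD_IMAGE_HASHES = {hashlib.sha256(b"example-known-bad-image").hexdigest()}
KNOWN_BAD_IP_PREFIXES = ("203.0.113.",)  # documentation range, placeholder only

# (ip, timestamp) for each newly created account
recent_signups: list[tuple[str, datetime]] = []

def image_is_known_bad(image_bytes: bytes) -> bool:
    """Match an uploaded image against a blocklist of known-bad hashes."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_BAD_IMAGE_HASHES

def ip_is_known_bad(ip: str) -> bool:
    """Match a client IP against known-bad address ranges."""
    return ip.startswith(KNOWN_BAD_IP_PREFIXES)

def signup_burst(ip: str, now: datetime, window_minutes: int = 10, limit: int = 5) -> bool:
    """Flag an IP that creates many accounts in a short window (placeholder threshold)."""
    window_start = now - timedelta(minutes=window_minutes)
    count = sum(1 for signup_ip, t in recent_signups
                if signup_ip == ip and t >= window_start)
    return count >= limit
```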

[–] jlh@lemmy.jlh.name 119 points 2 days ago (3 children)

When have Comcast, Disney, or IBM ever been on the wrong side of history? /s

[–] jlh@lemmy.jlh.name 10 points 3 days ago* (last edited 3 days ago) (5 children)

Technology is not the solution to a social problem. Big tech companies have an obligation to make it more difficult for state actors and extremists to amplify obviously false claims about elections and protected minorities.

 

https://web.archive.org/web/20240719155854/https://www.wired.com/story/crowdstrike-outage-update-windows/

"CrowdStrike is far from the only security firm to trigger Windows crashes with a driver update. Updates to Kaspersky and even Windows’ own built-in antivirus software Windows Defender have caused similar Blue Screen of Death crashes in years past."

"'People may now demand changes in this operating model,' says Jake Williams, vice president of research and development at the cybersecurity consultancy Hunter Strategy. 'For better or worse, CrowdStrike has just shown why pushing updates without IT intervention is unsustainable.'"

 

I wanted to share an observation I've made about the way the latest computer systems work. I swear this isn't an AI hype train post 😅

I'm seeing more and more computer systems these days use usage data or internal metrics to automatically adapt how they run, and I get the feeling that this is a sort of new computing paradigm that has been enabled by the increased modularity of modern computer systems.

First off, I would classify us as being in a sort of "second generation" of computing. Computers in the 80s and 90s were fairly basic: user programs were often written in C or assembly, and often ran directly in ring 0 of the CPU. Leading up to the year 2000, there was a lot of progress in making computers more modular, with things like microkernels, MMUs, higher-level languages with memory-management runtimes, and the rise of modular programming in languages like Java and Python. This allowed computer systems to become much more advanced, as the new abstractions let programs reuse code and be a lot more ambitious. We are well into this era now, with VMs and Docker containers taking over computer infrastructure, and modern programming depending on software packages, as you see with NPM and Cargo.

So we're still in this "modularity" era of computing, where you can reuse code and even have microservices sharing data with each other, but often the amount of data individual computer systems have access to is relatively limited.

More recently, I think we're seeing the beginning of "data-driven" computing, which uses observability and control loops to run better and self-manage.

I see a lot of recent examples of this:

  • Service orchestrators like systemd on Linux and Kubernetes that monitor the status and performance of the services they own, and use that data for self-healing and to optimize how and where those services run.
  • Centralized data collection systems for microservices, which often include automated alerts and control loops. You see a lot of new systems like this, including Splunk, OpenTelemetry, and Pyroscope, as well as internal data collection systems in all of the big cloud vendors. These systems are all trying to centralize as much data as possible about how services run, not just including logs and metrics, but also more low-level data like execution-traces and CPU/RAM profiling data.
  • Hardware metrics in a lot of modern hardware. Before 2010, you were lucky if your hardware reported clock speeds and temperature for hardware components. Nowadays, it seems like hardware components are overflowing with data. Every CPU core now not only reports temperature, but also power usage. You see similar things on GPUs too, and tools like nvitop are critical for modern GPGPU operations. Nowadays, even individual RAM DIMMs report temperature data. The most impressive thing is that now CPUs even use their own internal metrics, like temperature, silicon quality, and power usage, in order to run more efficiently, like you see with AMD's CPPC system.
  • Of course, I said this wasn't an AI hype post, but I think the use of neural networks to enhance user interfaces is definitely a part of this. The way that social media uses neural networks to change what is shown to the user, the upcoming "AI search" in Windows, and the way that all this usage data is fed back into neural networks make me think that even user-facing computer systems will start to adapt to changing conditions using data science.
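
As a toy illustration of the observe/decide/act loop I'm describing, here's a minimal sketch. It uses the third-party psutil library to read CPU and RAM metrics, and an invented scale_workers function standing in for whatever a real orchestrator would actually adjust; none of this is how systemd or Kubernetes are implemented, it's just the shape of the idea.

```python
import psutil  # third-party library for reading CPU/RAM metrics

def scale_workers(current: int, cpu_percent: float) -> int:
    """Hypothetical actuator: pick a new worker count from observed load."""
    if cpu_percent > 80 and current < 16:
        return current + 1   # under pressure: scale up
    if cpu_percent < 20 and current > 1:
        return current - 1   # mostly idle: scale down
    return current

workers = 4
while True:
    cpu = psutil.cpu_percent(interval=5)   # observe: CPU usage sampled over 5 s
    mem = psutil.virtual_memory().percent  # observe: RAM usage
    workers = scale_workers(workers, cpu)  # decide + act on the observation
    print(f"cpu={cpu}% mem={mem}% -> workers={workers}")
```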

I have been kind of thinking about this "trend" for a while, but this announcement that ACPI is now adding hardware health telemetry inspired me to finally write up a bit of a description of this idea.

What do people think? Have other people seen the trend towards self-adapting systems like this? Is this an oversimplification of computer engineering?
