[–] henfredemars@infosec.pub 91 points 1 month ago (3 children)

Ensuring that the system complies with industry standards and integrating security measures for cross-technology communication are also necessary steps, Gao adds.

This is absolutely a huge factor that could make or break the technology if they don't get it right. It could be the single most important part of the tech.

2.4 GHz is super saturated. The last thing we need is long-range (i.e., large-footprint) signals in already saturated spectrum. This technology should be deployed either not at all or very carefully, to prevent widespread interference with existing WiFi devices. This spectrum is already on the verge of being complete trash. Please, please do not deploy more stuff on 2.4 GHz spanning an entire "smart city."

[–] shortwavesurfer@lemmy.zip 41 points 1 month ago (3 children)

I actually ditched 2.4GHz Wi-Fi on my home network entirely for this exact reason. If a device is not compatible with 5GHz Wi-Fi, it doesn't get purchased.

[–] henfredemars@infosec.pub 31 points 1 month ago (2 children)

It doesn't just benefit you. You're benefiting the current users of that spectrum who for one reason or another might not be able to switch.

I suspect most users, though, couldn't tell you what frequency their network uses, let alone the devices on it.

[–] cmnybo@discuss.tchncs.de 12 points 1 month ago (1 children)

Anyone with a NAS will immediately notice that they are on 2.4GHz because it will take several times longer to transfer files.
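For a rough sense of scale, here's a back-of-the-envelope comparison (the throughput figures are illustrative assumptions, not measurements of any particular gear):

```python
# Back-of-the-envelope transfer times. The throughput figures are
# illustrative assumptions, not measurements of any specific hardware.
file_gb = 10                                   # size of the transfer
throughput_mbps = {"2.4GHz": 60, "5GHz": 400}  # plausible real-world rates

for band, mbps in throughput_mbps.items():
    seconds = file_gb * 8_000 / mbps           # GB -> megabits, then / rate
    print(f"{band}: ~{seconds / 60:.0f} min for {file_gb} GB")
```

With those numbers, the same 10 GB copy takes roughly 22 minutes on 2.4GHz versus about 3 minutes on 5GHz.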

[–] henfredemars@infosec.pub 21 points 1 month ago (1 children)

I think users who know what a NAS is probably know that information already. But true, yes!

[–] jagged_circle@feddit.nl 5 points 1 month ago (1 children)

Some of us know what a NAS is, but aren't fortunate enough to afford one

[–] DogWater@lemmy.world 3 points 1 month ago

Indeed. Hello poorish brother

[–] sugar_in_your_tea@sh.itjust.works 6 points 1 month ago (1 children)

Yup, I have one device that's stuck on 2.4GHz: my Brother laser printer. It works fantastically otherwise, and it has an Ethernet port, but I haven't bothered to run cable to it yet. I suspect a lot of people have that one device they'd rather not replace that's still on an old WiFi standard.

So I just make sure to have a simultaneous dual-band setup. Everything else uses 5GHz, and the 2.4GHz band exists for that one device, or for when I'm on the opposite side of the house or something. I use fancy networking stuff though (Ubiquiti APs); your average person would just be confused about why the internet is sometimes slow (i.e., when the printer wakes up).

[–] AA5B@lemmy.world 2 points 1 month ago

While my printer only supports 2.4GHz, it's always been on Ethernet.

But I still have too many smart home devices and media streamers on 2.4GHz, even after making an effort to stick with local IoT meshes.

[–] circuscritic@lemmy.ca 19 points 1 month ago (3 children)

Do you live in a high density urban environment?

Because if so, that totally makes sense, and the other benefit, that 5GHz/6GHz doesn't travel far beyond your apartment or condo walls, is pretty nifty as well.

But if you live in a house in the suburbs, man, that is commitment well beyond necessity or convenience. Not saying it's a bad choice per se; it just seems unnecessarily burdensome IMO.

[–] shortwavesurfer@lemmy.zip 9 points 1 month ago

I live in a single family house, but the area has quite a few single family houses packed pretty close together. So there's still a lot of traffic on 2.4 GHz.

[–] MossyFeathers@pawb.social 5 points 1 month ago

In my experience, having a VR setup with Vive body trackers eats up the 2.4GHz band really fast, so there are still reasons to swap in the suburbs; they're just more niche.

Source: my PC is too far from the router for a wired connection, so it uses WiFi. I had to switch to 5GHz because my internet would drop out on 2.4GHz whenever I played VRChat.


[–] AA5B@lemmy.world 2 points 1 month ago

I wish I could, but too many devices still require it.

[–] rottingleaf@lemmy.world 5 points 1 month ago (1 children)

This spectrum is already on the verge of being complete trash.

Radio shouldn't be used when it's avoidable. It's for emergencies, aviation, hiking, and maybe short-range communication for convenience. Phones, yes.

But providing internet connectivity via radio when you can lay cable is just stupid.

[–] henfredemars@infosec.pub 11 points 1 month ago* (last edited 1 month ago) (1 children)

I mostly agree with you. I find it really weird that I live in a world where all my Internet runs through 5G cellular for political and social reasons rather than technical ones. Due to the monopoly on the cables, it's actually much cheaper here to buy 5G home internet. It seems unnecessarily complicated, choosing to use a shared medium for no technical reason. It's just politics.

In case you’re not from the States, we have a monopoly pretty much everywhere for Internet services.

With my 5G I have unlimited data, and it’s 300 down 44 up on a good day. It’s perfectly serviceable if you can live with increased latency.

we have a monopoly pretty much everywhere for Internet services

Fortunately, that's not true everywhere, and municipal fiber is becoming more and more common.

5G home internet

The problem here is latency. It's entirely sufficient for most web browsing and video streaming use-cases, but it sucks for multiplayer gaming and other interactive use-cases (e.g. video calls). So while it's probably a solution for a lot of people, it's not really a replacement for physical cables.

[–] Windex007@lemmy.world 0 points 1 month ago (1 children)

Sounds like they basically crafted special messages such that they're nonsense at 2.4GHz but smooth out to a LoRa message on a much, much lower frequency band (<1 GHz).

[–] towerful@programming.dev 4 points 1 month ago (1 children)

It's LoRa on 2.4GHz.
It's just that chirp signals are easy to decode out of a lot of noise (see the sketch below).
And they don't really affect most other modulation techniques. I think you can even have multiple CSS-coded signals on the same frequency, as long as they are configured slightly differently.

LoRa is incredibly resilient.
It's just really, really slow.
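For the curious, here's a minimal numpy sketch of why dechirping digs a symbol out of noise. It uses a textbook discrete model of a LoRa CSS symbol (2^SF chips, with the symbol value setting the chirp's start frequency); it's an illustration of the technique, not anything from the paper:

```python
import numpy as np

SF = 7
N = 2 ** SF          # 2^SF chips per symbol

def lora_symbol(k, N):
    """Baseband CSS symbol: an upchirp whose start frequency encodes k."""
    n = np.arange(N)
    f = ((k + n) % N) / N - 0.5        # normalized frequency, wraps at band edge
    return np.exp(2j * np.pi * np.cumsum(f))   # integrate frequency into phase

def decode(rx, N):
    """Dechirp (multiply by the conjugate base chirp), then FFT.
    The chirp collapses to a single tone; its bin index is the symbol."""
    dechirped = rx * np.conj(lora_symbol(0, N))
    return int(np.argmax(np.abs(np.fft.fft(dechirped))))

rng = np.random.default_rng(0)
k_sent = 42
tx = lora_symbol(k_sent, N)

# Bury the symbol ~10 dB below the noise floor. The FFT concentrates all
# the chirp's energy into one bin (~21 dB of processing gain at SF7), so
# it still decodes correctly with high probability.
noise = np.sqrt(10 / 2) * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
print(decode(tx + noise, N))   # -> 42 (usually, despite -10 dB SNR)
```

Signals with different spreading factors mostly look like extra noise to each other's dechirp step, which is the intuition behind differently configured CSS signals sharing a channel.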

[–] Windex007@lemmy.world 2 points 1 month ago* (last edited 1 month ago) (1 children)

I don't think it's "just" LoRa on 2.4GHz, because if it were, existing LoRa devices wouldn't be able to decode the signals off the shelf, which the article claims they can. From the perspective of the receiver, the messages must "appear" to be in a LoRa band, right?

How do you make a device whose hardware operates in one frequency band emulate messages in a different band? I think that's the nature of this research.

And like, we already know how to do that in the general sense. For all intents and purposes, that's what AM radio does. Hacking a specific piece of consumer hardware to do it entirely in software is what makes it a research paper.
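To illustrate that general principle (a toy example with an arbitrary tone and carrier, nothing to do with the paper's actual method): shifting a baseband message up to a carrier is a couple of lines in software.

```python
import numpy as np

fs = 48_000                               # sample rate, Hz (arbitrary choice)
t = np.arange(fs) / fs                    # one second of samples
message = np.sin(2 * np.pi * 440 * t)     # baseband "program": a 440 Hz tone
carrier = np.cos(2 * np.pi * 10_000 * t)  # 10 kHz carrier, well above the message

# Classic AM: the message becomes the envelope of the carrier, moving its
# energy from baseband up to sidebands around the carrier frequency.
am_signal = (1 + 0.5 * message) * carrier
```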

[–] towerful@programming.dev 2 points 1 month ago (1 children)

WiFi uses BPSK/QPSK/OFDM/OFDMA modulation.
LoRa uses CSS modulation.

This is about hacking WiFi hardware to make a WiFi-modulated signal intelligible to a receiver expecting CSS modulation, and to have the WiFi hardware demodulate a CSS signal.
Thus making WiFi chips work with LoRa chips.

LoRa doesn't care about the carrier frequency.
So the fact that it's LoRa at 2.4GHz doesn't matter. It's still LoRa.
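A toy way to see that (same textbook chirp model as the sketch upthread, illustrative only): mix one baseband symbol up to two different carriers, mix back down at the receiver, and it decodes identically either way.

```python
import numpy as np

def lora_symbol(k, N):
    n = np.arange(N)
    return np.exp(2j * np.pi * np.cumsum(((k + n) % N) / N - 0.5))

def decode(rx, N):
    return int(np.argmax(np.abs(np.fft.fft(rx * np.conj(lora_symbol(0, N))))))

N, k = 128, 42
n = np.arange(N)
tx = lora_symbol(k, N)                     # baseband CSS symbol
for fc in (0.05, 0.30):                    # two arbitrary normalized carriers
    passband = tx * np.exp(2j * np.pi * fc * n)         # transmitter mixes up to fc
    baseband = passband * np.exp(-2j * np.pi * fc * n)  # receiver mixes back down
    print(decode(baseband, N))             # -> 42 both times; the carrier drops out
```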

I'm sure there will be a use for this at some point.
Certainly useful for directly interfacing with LoRa devices from a laptop.
I feel that anyone actually deploying LoRa IoT would be working at a lower level than the "throw a laptop at it" kind of thing.

[–] Windex007@lemmy.world 2 points 1 month ago (1 children)

I didn't realize that LoRa didn't care about carrier frequency; that's for sure the root of my faulty assumption! Thanks for taking the time to explain.

[–] towerful@programming.dev 1 points 1 month ago

It's pretty serendipitous, actually.
Over the past month I've done a somewhat deep dive into LoRa for a project.
I ultimately dismissed it due to the data rates, but for simple remote controls or sensors (things that report a couple of bytes) it seems awesome.
I'm sure you can squeeze higher data rates out of it, but when I evaluated it I decided to go with a hardwired network link. (I had to have stability; dropped info wasn't an option. But the client had a strong preference for wireless.)