I agree with them; that game is a masterpiece. Didn't you love it?
It doesn't top out below 144Hz. There are benefits with diminishing returns up to at least 1000Hz especially for sample-and-hold displays (like all modern LCD/OLED monitors). 240Hz looks noticeably smoother than 144Hz, and 360Hz looks noticeably smoother than 240Hz. Past that it's probably pretty hard to tell unless you know what to look for, but there are a few specific effects that continue to be reduced. https://blurbusters.com/blur-busters-law-amazing-journey-to-future-1000hz-displays-with-blurfree-sample-and-hold/
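To put rough numbers on it: the Blur Busters rule of thumb is that perceived motion blur on a sample-and-hold display is roughly tracking speed divided by refresh rate, so the blur trail keeps shrinking as Hz climbs, just with diminishing returns. A quick sketch (the pan speed is just an illustrative number, not from the article):

```python
# Sample-and-hold motion blur ~ tracking speed / refresh rate
# (Blur Busters rule of thumb; speed here is illustrative).
speed = 1920  # pixels per second, e.g. a one-second full-screen pan at 1080p

for hz in (60, 144, 240, 360, 1000):
    blur_px = speed / hz  # approximate blur trail width in pixels
    print(f"{hz:4d} Hz -> ~{blur_px:.1f} px of blur")
```

Going 144 → 240 Hz cuts the trail from ~13 px to 8 px, which is why it's still visible; 360 → 1000 Hz only goes from ~5 px to ~2 px, which is the "hard to tell unless you know what to look for" zone.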
Yippee I missed these
It's a number, and complexity refers to functions. The natural inclusion of numbers into functions maps pi to the constant function x -> pi, which is O(1).
If you want the time complexity of an algorithm that produces the nth digit of pi, the best known ones are something like O(n log n) with O(1) being impossible.
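For the curious, "the nth digit without computing the earlier ones" is a real thing, at least in base 16: the Bailey–Borwein–Plouffe formula. A minimal Python sketch (function name mine); it spends most of its time on ~n modular exponentiations, which is where the roughly n log n flavor comes from:

```python
def pi_hex_digit(n):
    """Hex digit of pi at position n after the point (n=0 -> 2, since pi = 3.243F... in hex)."""
    # BBP: pi = sum_k 16^-k * (4/(8k+1) - 2/(8k+4) - 1/(8k+5) - 1/(8k+6))
    def series(j):
        # fractional part of sum_k 16^(n-k) / (8k + j)
        s = 0.0
        for k in range(n + 1):
            # three-argument pow does fast modular exponentiation
            s = (s + pow(16, n - k, 8 * k + j) / (8 * k + j)) % 1.0
        for k in range(n + 1, n + 15):  # a few tail terms where 16^(n-k) < 1
            s += 16.0 ** (n - k) / (8 * k + j)
        return s % 1.0

    frac = (4 * series(1) - 2 * series(4) - series(5) - series(6)) % 1.0
    return int(frac * 16)

print([pi_hex_digit(i) for i in range(8)])  # pi = 3.243F6A88... in hex
```

No base-10 analogue of the formula is known, so for decimal digits you're back to computing everything up to position n.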
The direct connection is cool, I just wonder if a P2P connection is actually any better than going through a data center. There are gonna be intermediate servers, right?
Do you need to have Tailscale set up on any network you want to use this on? Because I'm a fan of being able to just throw my domain or IP into any TV and log in
I just use nginx on a tiny Hetzner vps acting as a reverse proxy for my home server. I dunno what the point of Tailscale is here; maybe better latency and fewer network hops in some cases if a p2p connection is possible? But I've never had any bandwidth or latency issues doing this.
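For anyone wanting to copy this, the whole setup is only a few lines of nginx on the VPS. A minimal sketch, assuming a hypothetical domain media.example.com and a home server reachable from the VPS at 10.0.0.2:8096 (e.g. over a WireGuard tunnel); swap in your own names, ports, and cert paths:

```nginx
server {
    listen 443 ssl;
    server_name media.example.com;          # your domain

    ssl_certificate     /etc/letsencrypt/live/media.example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/media.example.com/privkey.pem;

    location / {
        proxy_pass http://10.0.0.2:8096;    # home server behind the tunnel
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        # WebSocket support, which some media servers need
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```

The nice part is the home network never needs an open port: the home server dials out to the VPS, and the VPS does all the public-facing TLS.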
It gets around port forwarding/firewall issues that most people don't know how to deal with. But putting it behind a paywall kinda kills any chance of it being a benevolent feature.
Possible reasons include:
- fun
- inflicting needless suffering on fish [applies if you hate fish]
It's got a very high barrier to entry. You kinda have to suffer through it for a while before you get it. And then you unlock a totally different kind of suffering.
The last time I had fun with LLMs was back when GPT2 was cutting-edge: I fine-tuned GPT2-Medium on Twitch chat logs, and it alternates between emote spam, complete incoherence, blatantly unhinged comments, and suspiciously normal ones. The bot is still in use as a toy, specifically because it's deranged and unpredictable. It's like a kaleidoscope for the slice of internet subculture it was trained on, much more fun than a plain flawless mirror.
What specifically constitutes a hole is somewhat ambiguous, but if you pull on the thread a bit, you'll probably agree that it's a topological quality and that homotopy groups and homology are good candidates. The most grounded way to approach the topic is with simplicial homology.
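To make that concrete, here's the standard first example (textbook material, nothing specific to the thread): triangulate the circle as the boundary of a triangle and compute its first simplicial homology; the single free generator is exactly the hole.

```latex
% Circle as the boundary of a triangle: vertices v_0, v_1, v_2,
% edges e_0 = [v_0, v_1], e_1 = [v_1, v_2], e_2 = [v_0, v_2], no 2-simplices.
% Chain complex 0 -> C_1 -> C_0 -> 0, with boundary map
\partial_1 e_0 = v_1 - v_0, \qquad
\partial_1 e_1 = v_2 - v_1, \qquad
\partial_1 e_2 = v_2 - v_0.
% The cycle e_0 + e_1 - e_2 maps to 0, generating ker(\partial_1);
% with no 2-simplices, im(\partial_2) = 0, so
H_1(S^1) \;=\; \ker\partial_1 / \operatorname{im}\partial_2 \;\cong\; \mathbb{Z}.
% One free generator per independent loop, i.e. per one-dimensional hole.
```

Run the same computation on a disk (fill the triangle in with a 2-simplex) and the loop becomes a boundary, so H_1 vanishes: no hole.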
Sounds like a skill issue. If that ruined the game for you, I dunno what to say. Might be a replicant?