I'm not here to defend the meme link I provided, but in its defense, it explicitly refers to verifiable facts.
Like, you know, the price of games adjusted for inflation.
No, you and I are saying the same thing. The article (not just the headline) is a screed about how people buying on sales is a result of games becoming more expensive and game publishers becoming greedy. Except it's not: people have always dug for sales, because games used to be way more expensive than they are now.
I may misremember many things about last century, but I don't misremember the way I spent 90% of my spare time or how I acquired the games I played during that time.
Just a friendly reminder from an old fart who used to save their allowance to buy cartridges:
Gaming is insanely cheap and accessible right now. You kids have no idea how good you have it, we used to go get our games twenty miles away, uphill both ways.
But no, seriously, that piece is a terrible take from somebody who either doesn't understand how games are made and sold or is too young to understand why they're so fundamentally wrong. Probably both. This comes to mind: https://indieweb.social/@emilygorcenski/111533761630028005
No worries. Paradoxically I feel like a pedant now for using the big word.
Anyway, that question is weirdly different from the "no HUD" one, I agree. Some of the games that make me look more at the world instead of at the pointers and indicators are full of HUD stuff. Somebody mentioned Zelda, which is fine. PUBG is a weird example, because yeah, it looks like a (messy, cheap, poorly designed) HUD, but the whole proximity audio and high stakes gameplay makes you stare at things like a hawk. We take it for granted because Battle Royale games became such a huge deal, but that was a neat trick.
Immersion is a bit overused and misunderstood.
It maybe works better as "suspension of disbelief", like in other fiction. You sustain it and you can go very abstract. You break it and things get weird.
Dead Space, which has come up a lot, does have a HUD; it's just all diegetic. Whether that fits or not is up for debate.
For true zero-HUD stuff, the first one I think of is Inside. If you're going for immersion, that counts, but of course it's a very light, focused game. Journey and Flower are in that space, too. So is Mirror's Edge, technically, but it feels more intricate due to being first person, for some reason.
There's a bunch of minimal-HUD games from that period, too. There's a thing here and there, but not a full HUD. There's the Portal games, which technically show which portals are up on the reticle, but nothing else. There's the Metro series, which will pop up some HUD but mostly relies on other visual cues. There's The Order: 1886, which was one of the standard-bearers for minimal HUDs at the time, though I think now it reads as only slightly lighter than average; that game is super underrated in how ahead of its time it was at setting triple-A standards.
Does The Witness count as diegetic HUD or just no HUD? It's borderline. I think The Talos Principle has some light HUD elements, but they may be optional.
And hey, let me call out the times when a super dense HUD is actually immersion-creating, especially when it comes to representing tech or machinery. There's Metroid Prime, making the HUD part of the suit and placing you inside it. There's Armored Core, where the mech stuff is such a part of the fiction. There's the new Robocop, which I don't like but does a lot with its HUD. HUDs can be cool and immersive.
But I stumbled upon those, I didn't plan on acquiring them.
That's why college kids don't plan on what to make of their lives after college. They're kids! If they knew, they wouldn't need to be there. It took me a degree and a half, a number of failed creative projects and taking a job out of necessity to end up back in a completely different, adjacent career, eventually in multiple different countries. I could have predicted none of that when I started my first degree. For one, I didn't know what I didn't know, that was the entire point of university. For another, I didn't know half of the options I ended up taking even existed or were available to me. Many weren't, in fact, until a particular set of circumstances lined up.
But I'm sure glad that in the meantime I learned crucial things that made me more capable of taking advantage of those circumstances when they came by.
There's this girl I remember from that time. I was a bit older than my classmates, owing to that whole changing-tracks thing, so a few gave me more credit than I deserved in some areas. This girl once walks up to me and asks me if I'll read some stuff she wrote. I didn't know how to say no, so I said yes. And it was terrible. No style, no flow, no command of language. It was a high school essay at best, corny and florid in all the wrong ways. I weaseled my way out of giving her feedback and mentally discounted her as a writer.
She's now a professional journalist involved in many high profile activist movements. I've read her stuff. It's great. Turns out the reason she was bad at it back then is she was twenty and had many years of getting good at that crap ahead of her. That's fine. It's fine to figure yourself out and learn to do things as an adult. That's supposed to be the point of higher education when it's universally accessible.
Anyway, I don't think you're wrong, for the record. I think you're right in your context. If public university wasn't basically free around here that would have been a very expensive approach to learning creative writing and figuring yourself out. At most all I'm contributing is I'm glad we do it that way over here. I spent ten years, give or take, doing that stuff and I spent between sixty and six hundred bucks a year doing it. And that's because I didn't qualify for any grants or government student aid. For some of my classmates it was free, or they even got some help for books and housing. I go to vote every time (and pay taxes) thinking that contributing to keeping that up is the most important thing I do in life.
This take is super depressing, but like I said elsewhere, maybe it makes sense in the US. And that sucks, to be clear.
For what it's worth, I spent maybe a decade in university, and bounced around a couple of things before I got my actual degree. I did not do a STEM degree, but I still got a lot out of it in both soft and hard skills. Also in relationships, experience and general ability to approach situations and extract information from the world. Frankly, if your time in higher education has to be driven by securing a specific job or goal, then you're in a broken higher education system. If it leaves you in crippling debt, you're also in a broken system, but I'm pretty sure you guys know that already.
I could risk a suggestion, but you seem like you are in the US and frankly what you describe does not line up with my experience of higher education pretty much at all, so I'm pretty sure it wouldn't be useful.
I'll just say that that seems like it sucks, it's not what higher education should be about and you guys should probably get around to fixing it at one point or another.
Well, the idea of the original post is that ALL algorithms used for any reason are bad, and the retort is to explain that a chronological feed is still a (simple) algorithm and use that to "well actually" a distinction with proprietary algorithms.
Which is fine, but nitpicky. I'd think most Masto users get that, or at least take no issue with the obvious explanation. From what I saw, BlueSky's idea of an algorithm marketplace, where you pick and tune how your feeds are sorted, was relatively well received.
But as always around here I don't doubt that with a different set of follows and even usage times the pushback on principle may be more frequent or obvious. It just hasn't been my experience and I think the "what algorithm actually means" bit is a bit deceptive.
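To make the "a chronological feed is still an algorithm" point concrete, here's a minimal sketch in Python with made-up post data (the post fields and timestamps are purely illustrative, not any real API):

```python
# A "chronological feed" is itself an algorithm: sort posts by
# timestamp, newest first. The posts below are hypothetical examples.
from datetime import datetime

posts = [
    {"author": "a", "text": "older post", "ts": datetime(2023, 11, 1, 9, 0)},
    {"author": "b", "text": "newest post", "ts": datetime(2023, 11, 2, 12, 0)},
    {"author": "c", "text": "middle post", "ts": datetime(2023, 11, 1, 15, 30)},
]

def chronological_feed(posts):
    """Return posts sorted newest-first: a one-line 'algorithm'."""
    return sorted(posts, key=lambda p: p["ts"], reverse=True)

feed = chronological_feed(posts)
print([p["text"] for p in feed])  # newest post comes out first
```

The distinction people actually care about isn't "algorithm vs. no algorithm", it's this transparent one-liner vs. opaque, engagement-tuned ranking.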
A few of these are interesting and accurate (email comparisons), a few are pretty obvious and widely distributed already (privacy challenges), a few are a bit of a straw man argument (not sure "algorithms are bad" is a thing) and a few I'd caveat a little bit (quote tweets).
Going through all that would mean a whole response piece, though, so I'm more than happy to vaguely nod and move on.
I think the more interesting question is why you wouldn't want it, and why it's a standard. Which I think has to do with flexibility, honestly. I often use my keyboard on a different surface than my mouse, and you need your mouse wire to have a surprising amount of slack and a consistent direction toward the back of your desk; otherwise you get weird pulls on it and it's more annoying to use.