Well, they implemented some graphical improvements and options, as well as Workshop mod support, so now would be a good time for a replay.
Technically, it can and has been done already. The problem is that AI is very bad at creating new ideas and even worse at understanding what it has created (as is required for plots or jokes). As a result, any writing created with heavy AI influence tends to sound like a child's stream of thought with an adult's vocabulary, and any jokes rely purely on randomness or on repeating an existing well-known joke. Similarly with art and animation: because the AI doesn't understand what it is creating, it struggles to keep animated elements consistent and often can't figure out how elements should be included in the scene. Voices are probably the strongest part, but even then, they can be buggy and won't change correctly to match the context of what is being said.
None of this is to say AI is useless. It's very good at creating a "good enough" quick fix, or at filling in unimportant or trivial work. If used to help clean up scripts or fill in backgrounds, it can speed up the process greatly at minimal cost. It's a tool to be used by someone who knows the field, not to replace them.
Who even thinks this is a big deal? This screams boomers being upset about violent video games all over again.
While it's not a big deal, at least for Gmod specifically, they really could use some further moderation. I've seen some pretty reprehensible stuff there. The most recent example that comes to mind is "Burning Ukrainian Soldiers" on the front page (complete with combat footage to compare against and a racist description of Ukrainians), but that sort of thing is not rare at all.
In general, I agree, but I think you underestimate the benefits it provides. While ray tracing doesn't add much to more static or simple scenes, it can make a huge difference in more complex or dynamic ones. Half-Life 2 is honestly probably the ideal game to demonstrate this, due to its heavy reliance on physics. Current lighting and reflection systems, for all their advancements and advantages, struggle to convincingly handle objects moving through the scene and interacting with each other. Add in a flickering torch or similar and things tend to go even further off the rails. This is why, in a lot of games, interactive objects end up standing out in an otherwise well-rendered environment. Good ray tracing fixes this and can go a really long way toward creating a unified but dynamic look for an environment. All that is just on the player's side, too; there are even more boons for developers.
That said, I still don't plan on playing many RTX or ray-traced games any time soon. As you said, it's still a nightmare performance-wise, and I personally start getting motion sick at the framerates it runs at. Once hardware catches up more seriously, I think it will be a really useful tool.
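If it helps, here's a toy sketch of what I mean by baked lighting going stale. Everything in it is made up for illustration (circles on a 2D plane, a boolean "lit" test); it's not any real engine's API, just the difference between computing visibility once versus re-checking it every frame:

```python
# Toy 2D sketch: baked lighting vs. a per-frame ray test.
# All names and numbers are illustrative, not from any real engine.

from dataclasses import dataclass
import math

@dataclass
class Circle:
    x: float
    y: float
    r: float

def segment_hits_circle(ax, ay, bx, by, c: Circle) -> bool:
    """True if the segment from (ax, ay) to (bx, by) passes through circle c."""
    dx, dy = bx - ax, by - ay
    fx, fy = ax - c.x, ay - c.y
    a = dx * dx + dy * dy
    b = 2 * (fx * dx + fy * dy)
    cc = fx * fx + fy * fy - c.r * c.r
    disc = b * b - 4 * a * cc
    if disc < 0:
        return False
    disc = math.sqrt(disc)
    t1, t2 = (-b - disc) / (2 * a), (-b + disc) / (2 * a)
    return (0 <= t1 <= 1) or (0 <= t2 <= 1)

light = (0.0, 10.0)
point = (0.0, 0.0)            # surface point being shaded
prop = Circle(5.0, 5.0, 1.0)  # a movable physics prop, initially off to the side

# "Bake": visibility computed once, up front, and stored.
baked_lit = not segment_hits_circle(*point, *light, prop)

# The prop then tumbles into the light path at runtime.
prop.x = 0.0

# The baked result is stale; a per-frame ray test sees the new occluder.
traced_lit = not segment_hits_circle(*point, *light, prop)
print(f"baked says lit={baked_lit}, ray test says lit={traced_lit}")
```

The baked answer was correct when it was computed, but nothing ever re-checks it, which is exactly why a prop you just knocked over keeps looking like it's still lit from wherever it used to be.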
A few major factors:
Users who expect low prices - This is partly because of the history of mobile games being smaller and/or ad-funded, but also because the vast majority of people playing games on their phone are looking for a low-barrier-to-entry time-waster, not specifically a game.
Lack of regulation or enforcement - Other gambling-heavy fields tend to be at least somewhat regulated, but mobile games are very lightly regulated, and even more lightly enforced. This allows them to falsely advertise their games and how they function (both in terms of misleading ads, and lying about chance-based events and purchases in-game).
Monopolistic middlemen - On other platforms, there's more direct competition (e.g., Sony and Microsoft) or companies that prioritize long-term growth and stability (e.g., Steam or Itch.io). Apple and Google, on the other hand, largely compete on brand perception and hardware specs, which means their app stores, where they make most of their money, have effectively zero competitors. Since they have no reason to make the stores better, they can instead promote whatever makes them the most money; that being exactly these manipulative, sketchy virtual slot machines.
I think it is technically possible - with the Valve Index you can read the camera input like a webcam, and I'm sure there's some way to do it with the Quests (although probably not easily); see the sketch below. That said, as others have noted, between the bulkiness of the headset, the lower quality of the cameras, the risk of losing tracking, and the natural shakiness of people's heads, it likely wouldn't be an improvement. Try watching VR footage from someone who doesn't stream or record regularly and you can get an idea of how hard the footage can be to follow, even before the lower camera quality.
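For what it's worth, "read it like a webcam" really is about that simple on the software side. Minimal sketch below, assuming the headset's camera shows up as a standard video device (the Index's do; device index 0 is a guess and will vary per system), using OpenCV:

```python
# Minimal sketch: grab and display frames from a headset camera that the OS
# exposes as an ordinary video device. Requires opencv-python.
import cv2

cap = cv2.VideoCapture(0)  # device index is a guess; pick whichever the headset exposes
if not cap.isOpened():
    raise RuntimeError("Couldn't open the camera device")

while True:
    ok, frame = cap.read()  # grab one frame from the headset camera
    if not ok:
        break
    cv2.imshow("Headset camera", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to stop
        break

cap.release()
cv2.destroyAllWindows()
```

Getting usable footage out of that (stabilization, resolution, not losing tracking while you record) is the hard part, not the capture itself.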
It would be anywhere between 2 and 5.
Either physical or digital. And I have access to a PC (with a bunch of emulators) or mobile phones.
Exactly what everyone else said, regardless of how much troubleshooting they did. Lol
Literally just a 3.5mm headset plugged directly into the mobo. As simple as can be.
I hope I can switch someday, but I doubt a lot of my more specialized software will work any time soon, even if the audio issues have been fixed (or my computer replaced), so unfortunately I don't expect to make the jump for a while.
I think it was about a year ago. I've given up since then; I have yet to find anyone capable of getting my audio to work properly, despite many hours of trying and help from multiple people on the Linux support subreddit. It might have been fixed since then, but I don't have the disk space or the time to attempt the switch again, and if I can't even get audio to work, then I can't use the OS.
Edit: I believe this is the mobo, if you want to look into things yourself. The best I ever got was audio delayed by about a second. https://www.asrock.com/MB/AMD/AB350M-HDV/index.asp
Episodes 1 and 2 are now also bundled in.