GolfNovemberUniform

joined 7 months ago

Modding it is hard. You need to unpack the game files, install mods in a specific order (which you almost never know), repack everything, and hope it works. Alternatively, you can install a merge (a modpack made by other people) if you find one you like. Btw every update deletes most mods afaik.
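
If it helps to picture that load-order step, here's a rough hypothetical sketch: the folder names, mod names, and the idea of later mods simply overwriting earlier files are my assumptions, not this game's actual tooling, and the real unpack/repack tools are game-specific and not shown.

```python
# Hypothetical sketch of the "install mods in a specific order" step:
# copy each mod's files onto the unpacked game files, letting later
# entries in the load order overwrite earlier ones. All paths and mod
# names below are made up for illustration.
import shutil
from pathlib import Path

UNPACKED = Path("unpacked_game_files")   # result of the unpack step (assumed layout)
LOAD_ORDER = [                           # order matters; these names are placeholders
    Path("mods/graphics_overhaul"),
    Path("mods/alt_gun_sounds"),
    Path("mods/2013_vehicle_handling"),
]

for mod in LOAD_ORDER:
    for src in mod.rglob("*"):
        if src.is_file():
            dst = UNPACKED / src.relative_to(mod)
            dst.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst)       # later mods overwrite earlier ones

# After this you'd repack UNPACKED with the game's own packer and hope it works.
```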

But besides that, there are a lot of mods, and even without them I found the game pretty enjoyable. Just don't screw up the tech tree and you should be fine.

If you'd like some mod recommendations, I'd say 2013 vehicle handling, alternate gun sounds, and a graphics mod if your machine can handle it (some require at least a 3060 for smooth gameplay). There's also Living City, which adds a ton of customizable random free-roam events and features, but that one is down to personal preference. I personally prefer a more vanilla experience.

Well, I'm pretty sure it'll still be better than my 3050, and I mostly focused on the kinds of games I play when doing my research. The only game I play that might be considered CPU-intensive is BeamNG.Drive, but it has settings that can bring literally any GPU to a crawl if needed.

[–] GolfNovemberUniform@infosec.pub 1 points 3 days ago (2 children)

Meh, I basically only play AAA games on this system. I have a separate machine for work and that kind of thing.

Ok, I'm pretty sure supporting such an argument will get me banned for a few days, but whatever. If you're so sure there are better deals in my country, suggest some options and I'll check them again.

[–] GolfNovemberUniform@infosec.pub 1 points 3 days ago* (last edited 3 days ago) (2 children)

Lemmy people seem to have a habit of completely forgetting the original topic. I noticed this earlier when I asked about mouse compatibility on Linux and people started recommending mice for almost the price of a new PS5. Here I was asking about the potential bottleneck. I already did my research beforehand and confirmed that the 3080 is indeed good value and suitable for my needs.

[–] GolfNovemberUniform@infosec.pub 1 points 3 days ago (4 children)

Meanwhile there are newer GPUs that will also render 1080p games with the same performance for less money

Like a 5060 Ti for 1.5x the price and 20% less performance? Nah, I don't think so. I think I know the prices in my own country better ngl.
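
Taking those two figures at face value (1.5x the price, 20% less performance; quoted from the comment above, not verified), the value gap is easy to put into a quick calculation:

```python
# Rough performance-per-price comparison, using the figures quoted above
# (1.5x the price, ~20% less performance) as given, not as verified data.

price_3080 = 1.0          # normalized 3080 price
perf_3080 = 1.0           # normalized 3080 performance

price_5060ti = 1.5 * price_3080
perf_5060ti = 0.8 * perf_3080

value_3080 = perf_3080 / price_3080        # 1.00
value_5060ti = perf_5060ti / price_5060ti  # ~0.53

print(f"3080 value: {value_3080:.2f}, 5060 Ti value: {value_5060ti:.2f}")
```

By that crude measure, the 5060 Ti would deliver roughly half the performance per unit of money.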

[–] GolfNovemberUniform@infosec.pub 1 points 3 days ago* (last edited 3 days ago) (6 children)

From the fairly extensive research I did, I can confidently say all of your points are wrong.

The 3080 is overkill for 1080p, but games get more demanding over time, so it only becomes a more viable option. Limited VRAM works against it, but for now it's still fine in most titles unless you go 4K.

The 3080 made total sense. It had enough VRAM at the time, and some people now even consider it one of the best NVidia GPUs ever made. NVidia could have made it more future-proof, but wouldn't that have attracted miners?

Prices depend on the area, but I haven't seen one where anything beats the 3080 for the price. The 6900 XT can beat it in some titles, but it's actually slightly slower on average (if I'm not mistaken). And obviously the AMD card has far inferior productivity features.

[–] GolfNovemberUniform@infosec.pub 6 points 3 days ago* (last edited 3 days ago) (1 children)

Watch Dogs mentioned!

Btw the sequel is very enjoyable too, though a bit heavy to run.

Well, the 3080 is probably around 70% faster (don't wanna check now), so it's a whole different story. Though my current CPU handles games at 120 FPS (or even 240 in competitive titles), so it shouldn't be much worse if I keep the same framerate and crank up the settings (specifically path tracing lol). And in older titles I can underclock the card, I guess?

Unfortunately I can't upgrade the CPU for some very important reasons. And I don't need to either, as I'm planning to get a new machine next year.

[–] GolfNovemberUniform@infosec.pub 1 points 4 days ago (1 children)

Because it's a developing-country market. Also idc about the "features".

[–] GolfNovemberUniform@infosec.pub 0 points 4 days ago (1 children)

I mean, what's the point of getting a worse GPU (for example a 5060 Ti or 3070 Ti) for the same price? The 3080 seems to be good value here and not as much of an unnecessary waste as a 3080 Ti.

 

Recently I asked about a new PC build. I got helpful responses on the topic, but also a suggestion to just upgrade my GPU to a 3080 (from a 3050). I looked deeper into it, and it looks like I can do that easily even right now. Then I saw a 3080 FE for sale, and I've always been a fan of how they look, so now I want one.

However, I also discovered that my CPU (i5-11400F) will be a severe bottleneck in that configuration. I don't really mind decreased GPU utilization, and I'm pretty sure my CPU cooler will keep up just fine (tested in benchmarks and UE5), but will it give me any serious issues such as freezes or full-on crashes? My resolution is 1080p btw (the monitor itself is actually 768p, but I increase the render resolution in games beyond that for better quality), though I might as well upgrade to a 1440p one soon if necessary and reuse it for the new build when I inevitably waste my money on it.
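
For what it's worth, a crude way to think about a CPU bottleneck is that the effective framerate is capped by whichever side is slower. A minimal sketch with made-up numbers, purely for illustration (not benchmarks of these specific parts):

```python
# Crude bottleneck model: effective FPS is roughly capped by whichever
# side (CPU or GPU) is slower. All numbers below are made up for illustration.

def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

cpu_cap = 120                 # hypothetical i5-11400F limit in a heavy title
gpu_3050 = 60                 # hypothetical RTX 3050 at 1080p ultra
gpu_3080 = gpu_3050 * 1.7     # assuming the "~70% faster" figure from the thread

print(effective_fps(cpu_cap, gpu_3050))   # 60    -> GPU-bound
print(effective_fps(cpu_cap, gpu_3080))   # 102.0 -> approaching the CPU cap
```

In that simple model, a bottleneck just means a lower framerate ceiling and lower GPU utilization; raising settings or the render resolution shifts the load back toward the GPU.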

 

I'm a decently happy owner of a system with an i5-11400F and an RTX 3050. It just barely works for my needs (1080p, 60-90 FPS, ultra), but next year there will be some new games I'd like to play, such as GTA 6 and FH6. With the current trends it's obvious my system won't handle those at settings higher than low. So I'm thinking of getting a new PC.

For now I'm thinking something like a Ryzen 7 and an RTX 5070 Ti should work. That would be around 2.5-3k USD in my area depending on the components. Also, I'd like a larger monitor, so I'll have to upgrade to a 1440p one, which will raise the hardware requirements too.

AMD cards are apparently quite expensive here (the 9070 XT is significantly more expensive than the 5070 lol), and I'm a massive fan of RT, so those are not an option (I can drop the RT requirement if there's no way to use it at decent settings and at least 60 FPS). Also, the NVidia 40 series is not good value here, like, at all.

Another interesting option is the RTX 5080. It's still within my budget of around 3k, but I'm very afraid of the connector-melting issues. After all, I can't build a PC myself (not an option at all), so a well-known company will handle it instead, and nobody knows what connectors they use (I can ask, as they're pretty open about this stuff, but still). I've already worked with this company btw, and it's not shady, so that should be fine as long as I don't forget to swap a decent PSU into the specs instead of their usual firework-grade ones.

However, with my limited knowledge, I can't predict how far technology will go in the short term. We've already seen the latest generation show pretty much no improvement over the previous one. So is it even worth waiting for next year's tech, or will it just be the same but with more AI frame-gen slop and more zeros strapped to the price tag? And will games get so much more demanding in just a year or two that targeting ultra is already a bad idea?

What makes me even more worried is the slightly unstable financial situation in my country. It's possible that tech will get significantly more expensive here soon.

Yea, this post is very long, so I guess say "gingerbread" if you read it till the end lol.

 

I suggest just reading the full article and drawing your own conclusions, but I personally deleted the game for now. I need to see how far these measures will actually go and whether they'll want to, like, take my DNS history or something.

 

I love ray tracing and path tracing when they're done right. Ik fully ray-traced scenes are hardly playable even on high-end cards without upscaling, but if one has a powerful enough card, why not use its potential? Yet most people don't seem to care about RT.

When it comes to upscaling though, I hate it, and I'm not even talking about frame gen. It makes things look blurry and causes annoying artifacts. I think playing on the lowest settings with clear textures is more enjoyable long-term than maxed out in 4K with a consistently blurry image. Also, this new technology makes devs care less about optimization (which will backfire btw, as we're approaching the physical limit of transistor size).
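
For context on the blurriness complaint: upscalers render internally at a lower resolution per axis and reconstruct up to the output size. A quick calculation, assuming the commonly cited ~2/3 "Quality" scale factor (the exact factor is an assumption and varies by upscaler and mode):

```python
# Internal render resolution for a typical "Quality" upscaling mode,
# which renders at roughly 2/3 of the output resolution per axis.
# The exact scale factor is an assumption; it varies by upscaler and mode.

def internal_resolution(out_w: int, out_h: int, scale: float = 2 / 3):
    return round(out_w * scale), round(out_h * scale)

print(internal_resolution(1920, 1080))  # ~(1280, 720)
print(internal_resolution(2560, 1440))  # ~(1707, 960)
```

So at 1080p output, a "Quality" preset is drawing roughly a 720p image internally, which is where a lot of the blur comes from.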
