[–] oo1@kbin.social 5 points 5 months ago

In case it's not clear from the comments (and sorry for repeating if it is), this >> thing is a really useful terminal thing to know in many cases.
>>
redirects a command's standard output, appending it to a file instead of printing it to the terminal.

So consider any old command and its output:
echo abc

This invokes the echo command, and echo outputs "abc" to the terminal.

If we add >> we can catch and redirect the output:

 echo abc >> blah.txt

will capture the output "abc" into the file blah.txt instead of printing it.

Note this is an APPEND operation: run it twice with the same output file and each run adds more output on new lines at the end of the same file. (A single > would overwrite the file instead.)
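To see append vs overwrite side by side, here's a quick sketch using a throwaway temp file (the file name and contents are arbitrary examples):

```shell
#!/bin/sh
# Demonstrate > (overwrite) vs >> (append) on a throwaway file.
out=$(mktemp)

echo first  > "$out"   # > creates the file (or truncates it to empty first)
echo second >> "$out"  # >> appends on a new line at the end
echo third  >> "$out"  # appending again keeps growing the file

cat "$out"             # shows three lines: first, second, third

echo fresh > "$out"    # a single > wipes the file and starts over
cat "$out"             # shows just: fresh

rm "$out"
```

One caveat: >> only captures standard output - error messages still hit the terminal unless you also redirect stderr, e.g. with 2>>.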

[–] oo1@kbin.social 3 points 5 months ago

+1 to this.

You can reduce the likelihood of any known risk with a preventative measure - in this case the permissions and ownership structure. That is good.

Backup doesn't reduce the likelihood of any risk.

It does something more wide-reaching: it mitigates the bad outcome - loss - from most causes. So it defends against many unknown risks as well as known ones, and against unexpected failure of preventative measures. It sort of protects you from your own ignorance and complacency.

Shit - I'm off to do some more work on backup.sh.
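For what it's worth, the core of a backup.sh along those lines can be tiny. A minimal sketch - the function name and the timestamped-directory layout are my own illustrative assumptions, not the actual script:

```shell
#!/bin/sh
# Minimal backup sketch: copy a source directory into a timestamped
# directory under a backup root, so earlier copies are never overwritten.
# (backup_dir and the layout are illustrative assumptions.)
backup_dir() {
    src="$1"
    dest_root="$2"
    stamp=$(date +%Y-%m-%d_%H%M%S)
    dest="$dest_root/$stamp"
    mkdir -p "$dest" || return 1
    cp -a "$src/." "$dest/" || return 1
    echo "$dest"   # print where this backup landed
}
```

Usage would be something like backup_dir "$HOME/documents" /mnt/backup (example paths). A real script would add pruning of old snapshots, and ideally push copies to a separate disk or machine - a backup on the same drive doesn't mitigate much.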

[–] oo1@kbin.social 5 points 5 months ago (2 children)

Yeah, I paid a lot for an Apple laptop in 2008 (more than the hardware was worth - but the form factor was good).
It was okay, and OS X was OK for most stuff for a few years.

But they cut support for updates well within 10 years, and the version I was stuck on eventually just got too far behind on security updates - it couldn't even get Firefox updates and stuff.
So they forced me back to linux full time - thankfully dual booting macOS+linux was really easy on the old x86 ones.

It seems you have to keep shipping them big buckets of dollars every 5 years or so - fuck that.
I'd much rather just give the odd bit of pay-what-you-can / tip-jar money to a few linux projects than chuck out perfectly good hardware every few years.

[–] oo1@kbin.social 2 points 5 months ago (1 children)

There's always Tiny Core Linux for hardcore minimalists.
I can't say much about package support either - I've not used it enough - but there's a "dCore" extension that lets you access Debian repos.

I've installed it on a potato easily enough - and I did find it astonishing for how small it is.

But I don't use it day to day, or much at all, so I'm not going to endorse it.
It's not necessarily the most user friendly, and some people might call the GUI slightly dated - personally I liked that.

So this is just to make you aware of one of the lightest distros I know of (that is sort of usable out of the box).
Recommended spec is 128 MB RAM and a Pentium 2; minimum spec is 46 MB RAM (maybe that's without the GUI desktop environment).

It's possibly a bit lighter than antiX - for some reason I never quite got on with either antiX or MX, not sure why.

[–] oo1@kbin.social 7 points 5 months ago

Steam Deck? I wonder how many full-time staff Valve devotes to testing and pushing regular updates.

I think a lot of Arch people want the bleeding-edge updates, so it seems like a lot of them go with btrfs and set up snapshots or something if they want a safety net.

[–] oo1@kbin.social 19 points 5 months ago

you mean like human babies?

[–] oo1@kbin.social 8 points 5 months ago

TL;DR:
Doom was massively popular in its day because it was, and still is, an awesome game played on IBM PC compatibles.
Its popularity had basically nothing to do with ports to other OSes or hardware.
Doom is an "MS-DOS game", not a "windows game".


It had a brilliant shareware (free) version containing 1/3rd of the game - that spread like wildfire.
It had great multiplayer network deathmatch and coop modes.
It maybe gained a bit of notoriety from some morons (who probably didn't know what a BBS or shareware was) calling for it to be banned as a "video game nasty" - but it'd have been insanely popular without that, because of how many light years ahead of the previous gen - say Wolfenstein or Catacomb Abyss - it was in basically every way.

It also grew a network of BBS communities who shared user-created WADs with levels and mods and stuff, extending the game's content and longevity - and creating a subculture of doom-obsessed tech geeks. Competitive "speedrunning" and stuff became possible at home, as you could basically "record" a level run, share it on a BBS, and people could effectively validate each key-press to check for cheating.

It's true that it was ported to mac and linux and a few other OSes fairly soon after release, but the vast majority of home gamers would have been on MS-DOS. Probably there were a bunch of workplace deathmatches on networks of Solaris terminals or something like that - but if you had a PC at home, you were playing DOOM on MS-DOS.

Back in 1993/1994, and for years after, linux was just nowhere near MS-DOS in popularity, stability, usability, compatibility etc. Debian was literally only just born that same year - and if you think Arch or Gentoo is hard to get up and running . . . that's peanuts to what a 1993-era linux user would be doing. In fact "linux programmer" is likely what you were - I don't believe there was such a thing as a "linux user" until years later - and it was still very painful and unstable.

Back then MS-DOS with its CLI was stable, simple and fairly efficient - massively more so than the "windows GUIs" that would follow.
DOS was fairly cheap - and there were "other" ways to get it anyway. I don't think MS cared about home-user piracy much; they just wanted B2B deals (and pre-installs with PC sellers).

"Windows" was just not relevant for gaming in 1993 - even in win'95 and win'98 days windows was not really an "operating system".
windows 3.x/95/98 was just a program that you could choose to run after booting into MS-DOS - and you'd only start up that mess if you wanted the GUI or some wizzywig programs like desktop publishers or something - of course Mackintosh was still the no1 choice for most pro GUI stuff.

Even when Windows 95/98 and so on came out, for most gaming I'd have been booting into DOS anyway. Everyone had a few DOS 6.2 boot disks lying around. Going into the naked DOS CLI meant you could access the large contiguous chunks of extended memory that games typically needed - starting windows always RAMmed you somewhere uncomfortable.

It wasn't really until 3D graphics drivers became packaged into DirectX that Windows became a real thing for gaming.
From memory, something like Grand Theft Auto (1) in about 1997 would have been the first game I actually started windows for.

Doom was basically 4 years old and pretty ancient by then. But it was still the number 1 multiplayer game in my house - since by that time we had a couple of PCs capable of Doom, plus maybe a laptop or one brought over from a friend's . . . and a bloody unreliable BNC coaxial bus network. We couldn't get enough PCs that could run Quake well enough for a fair fight.

However, I could imagine a lot of people wanting to get up to four networked devices going for deathmatch at home. So that may well have been a driver for porting.

I didn't install it on weird devices like the Sony Ericsson P800 or my iPod until much later - for example, not until those devices were invented and cheap enough.
And all that was just a gimmick - or geeks fucking around "because they can". The P800's touchscreen control interface was just nowhere near the proper keyboard experience: if you can't simultaneously sidestep + sprint + turn and run backwards, you can't play doom.
DOOM on an iPod click wheel - just fucking stupid - surprisingly slightly better than the P800 though.

[–] oo1@kbin.social 5 points 5 months ago

Samuel L Jackson was actually played by Alec Guinness in blackface(+CGI).
And as for Carl Fisher . . .

[–] oo1@kbin.social 2 points 5 months ago

Schroedinger's polecat

[–] oo1@kbin.social 18 points 5 months ago (1 children)

A line plot is a much better choice here, as the time intervals between the data points seem important.

It's suspicious that an entire dimension is not depicted visually.
The reader is forced to calculate the time spans and the implied rates of change over the two intervals.

The chart doesn't actually use the space on the page to show anything more than plain text would; it's a waste of space.
They could easily have presented the data in a line graph to show more and make the reader do less.

[–] oo1@kbin.social 3 points 5 months ago

Yep, line plot for sure.
The years aren't equally spaced in time, and the future forecast should be clearly differentiated - maybe with a dotted line, or a high-to-low spread.

The graph hides how ambitious it is to more than double the rollout rate.
We should expect to see cost per year (or workforce, or some measure of resources) also more than doubling.

Presumably this is a funded plan, not strictly a forecast, so it's not unreasonable to have accelerating growth if more resources are going in.
If resources are constant, then yes, I'd think diminishing returns would shape the forecast.
