waspentalive

joined 1 year ago
[–] waspentalive@beehaw.org 13 points 9 months ago* (last edited 9 months ago) (2 children)

The mad rush to sell the sizzle, not the steak.

Wouldn't it be nice if one company made a simple printer that just prints? It does not have a local web page. It does not monitor your ink supplies. It does not phone home. It uses ink from bottles sold inexpensively.

[–] waspentalive@beehaw.org 2 points 11 months ago (1 children)

If they overlap, aren't you in danger of having your company try to take over your passion project?

[–] waspentalive@beehaw.org 3 points 11 months ago

I write programs for myself. I have learned enough C, Pascal, Fortran, and BASIC to write small things, and even larger ones like a visual file manager for MS-DOS or my own version of the venerable STAR TREK game. I even know of big-O notation (but I don't know how to calculate it for a given algorithm).
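For what it's worth, the usual trick is just to count how many times the innermost work runs as the input grows. A tiny illustrative C example, not taken from any of my actual programs:

    #include <stdio.h>

    /* Counts how many times the inner loop body runs for a given n.
       The doubly nested loop runs n * n times, so it is O(n^2). */
    long count_steps(int n) {
        long steps = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                steps++;          /* the "work" being measured */
        return steps;
    }

    int main(void) {
        for (int n = 10; n <= 1000; n *= 10)
            printf("n = %4d  steps = %ld\n", n, count_steps(n));
        return 0;
    }

Watching how the step count grows as n grows (here, a hundredfold for every tenfold n) is essentially the big-O.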

But I never wanted to be a programmer, working on other people's programs 8 hours a day. That would ruin programming as a hobby. When I am self-directed, it is fun.

I was a Data Center tech instead. Minding 3 football fields of other people's computers.

[–] waspentalive@beehaw.org 1 points 1 year ago

In before the XKCD reference!

[–] waspentalive@beehaw.org 3 points 1 year ago

I also had to (under KDE):

  • Edit the settings for each of the folders in Dolphin (the file manager).

  • Edit the location of the desktop folder: right-click the desktop, open "Configure Desktop and Wallpaper", and look under Location.

  • Change the "Show" item there by choosing Custom Location and entering the XDG directory for the desktop. This setting may not stick.

[–] waspentalive@beehaw.org 15 points 1 year ago* (last edited 1 year ago) (3 children)

Why aren't all of these just normal directories under either .local (for data files) or .config (for configuration)???

Actually, I think the XDG directories should all live under a single XDG directory, dotted or not (a better name would be OK with me): ~/xdg/Documents, ~/xdg/Music, ~/xdg/Pictures, etc.
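At least the definitions already live in one file, ~/.config/user-dirs.dirs, so redirecting them is a single edit, assuming the application actually reads them through xdg-user-dirs. Roughly:

    XDG_DESKTOP_DIR="$HOME/xdg/Desktop"
    XDG_DOCUMENTS_DIR="$HOME/xdg/Documents"
    XDG_MUSIC_DIR="$HOME/xdg/Music"
    XDG_PICTURES_DIR="$HOME/xdg/Pictures"

Anything that looks these up properly should follow along; applications that hard-code the old paths still need individual poking.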

[–] waspentalive@beehaw.org 2 points 1 year ago

This is why I never became a programmer. I am retired now, but for my last 10 years of work I was a data center support agent. Programming is fun; I would hate to ruin that fun by having to work to someone else's rules.

[–] waspentalive@beehaw.org 1 points 1 year ago

I am interested in all things random or mathematical. I have written programs to simulate the decay of radioactive 'stuff', and a program that simulates the CA Lottery by flipping a coin (someone said your chances of winning are about the same as getting a run of 25 heads or 25 tails in a row).
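The coin-flip version is tiny. A rough sketch of the idea in C, not my original code:

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    /* Flip a coin until we see 25 identical results in a row,
       and report how many flips that took -- a stand-in for the
       long odds of winning the lottery. */
    int main(void) {
        srand((unsigned)time(NULL));

        int run = 0, last = -1;
        long flips = 0;

        while (run < 25) {
            int coin = rand() % 2;    /* 0 = tails, 1 = heads */
            flips++;
            if (coin == last)
                run++;
            else {
                last = coin;
                run = 1;
            }
        }
        printf("Took %ld flips to get 25 in a row.\n", flips);
        return 0;
    }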

On the mathematical side, I have written a program to run the 3n+1 (Collatz) sequence and record features of the process, like the count of evens and odds, the number of steps, and the maximum value reached. Perhaps the average of the values in the sequence would be interesting to calculate...
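The bookkeeping is just a loop; something along these lines (a sketch, not the actual program):

    #include <stdio.h>

    /* Walk the 3n+1 (Collatz) sequence from n and record a few
       features: steps taken, how many even and odd values appeared,
       and the largest value reached along the way. */
    void collatz_stats(unsigned long n) {
        unsigned long max = n, steps = 0, evens = 0, odds = 0;

        while (n != 1) {
            if (n % 2 == 0) {
                evens++;
                n /= 2;
            } else {
                odds++;
                n = 3 * n + 1;
            }
            steps++;
            if (n > max)
                max = n;
        }
        printf("steps=%lu evens=%lu odds=%lu max=%lu\n",
               steps, evens, odds, max);
    }

    int main(void) {
        collatz_stats(27);   /* classic long example: 111 steps, peak 9232 */
        return 0;
    }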

Combining mathematics with randomness, I have worked on the 100 prisoners problem: how many loops are created in a given run, and how long is the longest one? If any loop contains more than 50 members, the prisoners lose and don't get to go home.
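A single run boils down to shuffling 100 boxes and walking the cycles of the resulting permutation; roughly (again, just a sketch):

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    #define N 100

    /* One run of the 100 prisoners setup: shuffle the boxes, then
       walk the cycle structure of the permutation.  The prisoners
       win only if no cycle is longer than 50. */
    int main(void) {
        int box[N], seen[N] = {0};
        srand((unsigned)time(NULL));

        /* Fisher-Yates shuffle: box[i] holds the number hidden in box i. */
        for (int i = 0; i < N; i++)
            box[i] = i;
        for (int i = N - 1; i > 0; i--) {
            int j = rand() % (i + 1);
            int tmp = box[i]; box[i] = box[j]; box[j] = tmp;
        }

        int loops = 0, longest = 0;
        for (int start = 0; start < N; start++) {
            if (seen[start])
                continue;
            int len = 0, cur = start;
            while (!seen[cur]) {      /* follow the cycle until it closes */
                seen[cur] = 1;
                cur = box[cur];
                len++;
            }
            loops++;
            if (len > longest)
                longest = len;
        }

        printf("loops=%d longest=%d -> prisoners %s\n",
               loops, longest, longest <= 50 ? "win" : "lose");
        return 0;
    }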

I have ideas for a traditional BASIC interpreter, except that lines are labeled rather than numbered.

I also have a traditional Star Trek program that I have rewritten many times, improving it slightly each time.

[–] waspentalive@beehaw.org 11 points 1 year ago* (last edited 1 year ago) (2 children)

But that was just Q's ad hominem response to an even better exchange. Q: "What must I do to convince you people?" (that he is mortal and without powers) Worf: "Die."

Sorry, I know you said "not epic", but most things Worf says are epic.

[–] waspentalive@beehaw.org 4 points 1 year ago

Of course you do. Nvidia wants you to buy the expensive card instead. Since in some instances they are almost the same card, the only difference is knowing which values to change in certain registers to make the cheap card act like the expensive card. I personally use Intel graphics and won't have Nvidia.

[–] waspentalive@beehaw.org 18 points 1 year ago (3 children)

Nvidia does not 'hate' Linux; Nvidia simply never thinks about Linux. They need to keep secrets so that people can't buy the cheap card and, with a little programming, turn it into the expensive card.
