A lot of people completely overrate the amount of math required. It's probably been a week since I last used an arithmetic operator.
Sometimes when people see me struggle with a bit of mental maths or use a calculator for something that is usually easy to do mentally, they remark "aren't you a programmer?"
I always respond with "I tell computers how to do maths, I don't do the maths"
Which leads to the other old saying, "computers do what you tell them to do, not what you want them to do".
As long as you don't let that turn around into the computer dictating how you think.
I think it was Dijkstra who complained in one of his essays about naming uni departments "Computer Science" rather than "Comput*ing* Science". He said it's a symptom of a dangerous slope where we build our work as programmers around specific computer features or even specific computers, instead of using them as tools that enable our minds to ask and verify more and more interesting questions.
The scholastic discipline deserves that kind of nuance and Dijkstra was one of the greatest.
The practical discipline requires you to build your work around specific computers. Much of the hard-earned domain knowledge I've acquired as a staff software engineer would be useless if I changed the specific computer it's built around: Android OS. An Android phone has very specific APIs, code patterns, and requirements. Being ARM, even its underlying architecture is fundamentally different from the majority of computers (for now; we'll see how much the M1-style ARM architecture becomes the standard for anyone other than Mac).
If you took a web dev with 10 YOE and dropped them into my Android code base and said "ok, write", they should get the structure and the basics, but I would expect them to make mistakes common to a beginner in Android, just as if I were stuck in a web dev environment and told to write, I would make mistakes common to a junior web dev.
It's all very well and good to learn the core of CS: the structures used and why they work, classic algorithms and when they're appropriate, Big O and algorithmic complexity (a quick sketch of that last point follows below).
But work in the practical field will always require domain knowledge around specific computer features or even specific computers.
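To make the Big O point concrete, here's a minimal Kotlin sketch of the same lookup done two ways; the function names and data are invented for illustration:

```kotlin
// Hypothetical sketch: the same "find an item" task at two different complexities.

fun linearSearch(sorted: List<Int>, target: Int): Int =
    sorted.indexOf(target)                  // O(n): scans every element in the worst case

fun binarySearch(sorted: List<Int>, target: Int): Int {
    var lo = 0
    var hi = sorted.size - 1
    while (lo <= hi) {
        val mid = (lo + hi) / 2             // O(log n): halve the search space each step
        when {
            sorted[mid] == target -> return mid
            sorted[mid] < target  -> lo = mid + 1
            else                  -> hi = mid - 1
        }
    }
    return -1                               // not found
}

fun main() {
    val data = (0 until 1_000_000).toList() // already sorted
    println(linearSearch(data, 999_999))    // walks ~1,000,000 elements
    println(binarySearch(data, 999_999))    // takes ~20 comparisons
}
```

Both return the same answer; the difference only shows up in how the cost grows with the input, which is exactly the kind of reasoning a CS curriculum is meant to train.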
I think Dijkstra's point was specifically about uni programs. A CS curriculum is supposed to train your mind for the theory of computation, not for using specific computers (or specific programming languages).
Later in your career you will of course inevitably get bogged down in specific platforms, as you've rightly noted. And that's normal, because CS needs practical applications; we can't all do research and "pure" science.
But I think it's still important to keep in mind even when you're 10 or 20 or 30 years into your career and deeply entrenched in this or that technology. You have to keep asking "what am I doing this for" and "where is this piece of tech going", because IT keeps changing, entire sections of it get discarded periodically, and if you don't ask those questions you risk getting caught in a dead end.
At the same time, I find it amazing how many programmers never make the cognitive jump from the "playing with legos" mental model to "software is math".
They're both useful, but to never understand the latter is a bit worrying. It's not about using math, it's about thinking about code and data in terms of mapping arbitrary data domains. It's a much more powerful abstraction than the legos and enables you to do a lot more with it.
For anybody who finds themselves in this situation I recommend an absolute classic: Defmacro's "The nature of Lisp". You don't have to make it through the whole thing and you don't have to know Lisp, hopefully it will click before the end.
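To illustrate the "mapping arbitrary data domains" idea, here's a small Kotlin sketch; the types and the parsing example are made up, the point is that each step is just a function from one domain to another:

```kotlin
// Hypothetical sketch of the "software is math" view: programs as functions that
// map one data domain onto another. Types and names are invented for illustration.

data class RawEvent(val payload: String)            // domain A: untyped input
data class Temperature(val celsius: Double)         // domain B: a meaningful value
sealed interface Reading {                          // domain C: what the rest of the app consumes
    data class Valid(val t: Temperature) : Reading
    object Invalid : Reading
}

// Each step is a mapping between domains; composing the mappings is the program.
fun parse(e: RawEvent): Reading =
    e.payload.toDoubleOrNull()
        ?.let { Reading.Valid(Temperature(it)) }
        ?: Reading.Invalid

fun main() {
    val events = listOf(RawEvent("21.5"), RawEvent("oops"), RawEvent("-3"))
    val readings = events.map(::parse)              // List<RawEvent> -> List<Reading>
    println(readings)
}
```

Once you see the program as the composition of those mappings, the "bricks" become objects you can reason about, not just parts you snap together.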
??
Functions/classes/variables are bricks; you stack those bricks together and you are a programmer.
I just hired a team to work on a bunch of Power Platform stuff, and this "low/no-code" SaaS platform paradigm has made that mentality almost literal.
I think I misunderstood lemmyvore a bit, reading some criticism into the Lego metaphor that might not be there.
To me, "playing with bricks" is exactly how I want a lot of my coding to look. It means you can design and implement the bricks, connectors and overall architecture, and end up with something that makes sense. If running with the metaphor, that ain't bad, in a world full of random bullshit cobbled together with broken bricks, chewing gum and exposed electrical wire.
If the whole set is wonky, or people start eating the bricks instead, I suppose there's bigger worries.
(Definitely agree on "low code" being one of those worries, though - turns into "please, Jesus Christ, just let me write the actual code instead" remarkably often. I'm a BizTalk survivor and I'm not even sure that was the worst.
My take was that they're talking more about a script kiddy mindset?
I love designing good software architecture, and like you said, my object diagrams should be simple and clear to implement, and work as long as they're implemented correctly.
But you still need knowledge of what's going on inside those objects to design the architecture in the first place. Each of those bricks is custom made by us to suit the needs of the current project, and the way they come together needs to make sense mathematically to avoid performance pitfalls.
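As a hedged example of the "needs to make sense mathematically" part, here's a Kotlin sketch where both versions are correct and built from the same bricks, but one composition hides a quadratic cost; the function names are invented:

```kotlin
// Hypothetical sketch of a "bricks fit, but the math doesn't" pitfall:
// both functions find duplicate IDs, only their cost differs.

fun duplicatesQuadratic(ids: List<String>): List<String> =
    ids.filter { id -> ids.count { it == id } > 1 }      // O(n^2): re-scans the list per element

fun duplicatesLinear(ids: List<String>): Set<String> {
    val seen = HashSet<String>()
    val dups = HashSet<String>()
    for (id in ids) {
        if (!seen.add(id)) dups.add(id)                  // O(1) average per lookup -> O(n) total
    }
    return dups
}
```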
On the other hand, in certain applications you can replace a significant amount of programming ability with a good understanding of vector maths.
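For instance (a minimal sketch, with an invented Vec2 type and a made-up "is the target in front of me" check), one dot product can replace a pile of angle-and-quadrant branching:

```kotlin
import kotlin.math.sqrt

// Illustrative 2D vector type; a real project would use its engine's vector class.
data class Vec2(val x: Double, val y: Double) {
    operator fun minus(o: Vec2) = Vec2(x - o.x, y - o.y)
    fun dot(o: Vec2) = x * o.x + y * o.y
    fun length() = sqrt(x * x + y * y)
    fun normalized() = length().let { Vec2(x / it, y / it) }   // assumes non-zero length
}

// "Is the target roughly in front of the player?" -- positive dot product means
// the facing direction and the direction to the target point the same way.
fun isInFront(playerPos: Vec2, facing: Vec2, target: Vec2): Boolean =
    facing.normalized().dot((target - playerPos).normalized()) > 0.0
```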
We must do different sorts of programming...
There's a wide variety of types of programming. It's nice that the core concepts can carry across between the disparate branches.
If I'm doing a particular custom view I'll end up using sin/cos/tan for some basic trig, but that's about as complex as any mobile CRUD app gets. I'm sure there are some math-heavy mobile apps, but they're the exception that proves the rule.
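For example, roughly the kind of trig a custom dial view ends up needing, sketched in plain Kotlin (the dial parameters are invented; in a real view this would run inside onDraw):

```kotlin
import kotlin.math.PI
import kotlin.math.cos
import kotlin.math.sin

// Compute the positions of tick marks around a circular dial.
fun tickPositions(centerX: Float, centerY: Float, radius: Float, ticks: Int): List<Pair<Float, Float>> =
    (0 until ticks).map { i ->
        val angle = 2.0 * PI * i / ticks                 // angle of the i-th tick, in radians
        val x = centerX + radius * cos(angle).toFloat()
        val y = centerY + radius * sin(angle).toFloat()
        x to y                                           // in onDraw you'd draw a line or dot here
    }

fun main() {
    println(tickPositions(centerX = 100f, centerY = 100f, radius = 80f, ticks = 12))
}
```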
Negl, I absolutely did this when I was first getting into it, especially with langs where you actually have to import something to access "higher-level" math functions. All of my review materials have me making arithmetic programs, but none of it goes over, like, 9th grade math, tops. (Unless you're fucking with satellites or lab data, but... I don't do that.)
Tbf, that's probably because most CS majors at T20 schools get a math minor as well because of the obscene amount of math they have to take.