That or build something that can stand up to being hit. Tall order, but the inner armchair engineer in me thinks it's like, totally possible.
ApatheticCactus
No. Absolutely not. Lots of future tech comes from sci-fi, which sometimes becomes real. Fiction about 'what if' scenarios gives insight into how things could happen given certain events, which helps with decision making in the present. Relationship books? I mean, those can be great examples of how healthy or unhealthy relationships work, and can help one identify the status of their own relationships. Fantasy books are sometimes a combination of the above, and all of it is useful.
Nonfiction helps one understand what has happened. It gives context to the world we live in now, and what came before. Both are valuable, just in different ways. Reading anything helps your ability to empathize and think of alternative perspectives and is always useful.
Would creating the Cybertruck be considered a suicide attempt?
Generally speaking, you learn more about how something works when the core functionality is exposed to the user and it's just janky enough to require fiddling with and fixing things.
This is true of lots of things like cars, drones, 3D printers, and computers. If you get a really nice one, it just works and you don't have to figure anything out. A cheap one, or something you have to build yourself, makes you have to learn how it actually works to get it to run right.
Now that things are so commodified and simplified, they just work and really discourage tinkering, so people learn less about core functionality and how things actually work. Not always true, but a trend I've experienced.
I still get hit hard from just the trailer.
I'd be watching a car accident compilation and a Buick would start telling me to ask my doctor about Cymbalta. You know... I might actually watch that.
Pluto, obviously.
"Raises just aren't in the budget". Yeah, because the guys at the top took it all.
We put the charging port underneath the car!
Well, sort of. Thing is, time flows at different rates for different things. There are a lot of relativity shenanigans that kinda break the idea of a universal clock.
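To put a rough number on those shenanigans, here's a back-of-envelope sketch using the standard special- and general-relativity rate formulas for a GPS satellite (the orbital radius and Earth constants below are approximate values I'm plugging in, not from the comment):

```python
import math

c = 299_792_458.0  # speed of light, m/s

def velocity_rate_offset(v):
    """Special relativity: a moving clock ticks slower.
    Returns the fractional rate offset sqrt(1 - v^2/c^2) - 1 (negative)."""
    return math.sqrt(1 - (v / c) ** 2) - 1

def gravity_rate_offset(delta_phi):
    """General relativity (weak-field approximation): a clock higher in a
    gravity well ticks faster by roughly delta_phi / c^2, where delta_phi
    is the gravitational potential difference in J/kg (positive if higher)."""
    return delta_phi / c ** 2

# Rough GPS numbers (assumed, approximate):
GM = 3.986004418e14            # Earth's gravitational parameter, m^3/s^2
r_earth = 6.371e6              # Earth's surface radius, m
r_sat = 2.6571e7               # GPS orbital radius, m
v_sat = math.sqrt(GM / r_sat)  # circular orbital speed, ~3.9 km/s

motion = velocity_rate_offset(v_sat)                     # satellite runs slow
gravity = gravity_rate_offset(GM / r_earth - GM / r_sat) # satellite runs fast

day = 86_400  # seconds per day
print(f"motion effect:  {motion * day * 1e6:+.1f} microseconds/day")
print(f"gravity effect: {gravity * day * 1e6:+.1f} microseconds/day")
print(f"net drift:      {(motion + gravity) * day * 1e6:+.1f} microseconds/day")
```

The two effects don't even pull the same direction: motion slows the satellite's clock by a few microseconds a day while altitude speeds it up by tens, for a net drift of roughly +38 microseconds per day, which GPS has to correct for or positions would wander by kilometers.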
Could we have a future where we have an ARM main CPU, a gaming GPU, and also an x86 card?
I have to do similar things when it comes to 'raytracing'. It meant one thing, then a company comes along and calls something sorta similar the same thing, and then everyone has these ideas of what it should be vs. what it's actually doing. Later, a better version comes out that nearly matches the original term, but there's already negative hype because it launched half-baked and misnamed. Now they have to give the original thing a new name to market it, because they destroyed the original name with a bad label and a half-baked product.