The document-centric model of desktop applications largely originates from the early Mac. How do you open a document in a desktop OS? You double-click the document, and the OS finds the correct application to open it with. That was a Mac thing. On most other systems of the mid-1980s, you ran your application program (from the command line) and then told the program to load a file.
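Funny enough, that double-click model is still visible in the modern macOS API: you hand the OS a document URL and it finds the registered application for you. A minimal Swift sketch (the file path is just a made-up example):

```swift
import AppKit

// Give the OS a document and let it pick the application
// registered for that file type: the double-click model.
// The path below is hypothetical.
let doc = URL(fileURLWithPath: "/Users/me/Documents/notes.txt")
let opened = NSWorkspace.shared.open(doc)
print(opened ? "Handed off to the default app" : "No handler found")
```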
Applications as "bundles" of code and data was a Mac thing too, starting with the resource/code division in the classic Mac System. Rather than an application coming with a mess of directories of libraries and data files, it's all bundled up into a single application file that can contain structured data ("resources") for the GUI elements. On a classic Mac, you could load an application program up in ResEdit and modify the menus, add keyboard shortcuts, and so on, without recompiling anything.
The Apple Newton had data persistence of the sort we now expect from cloud applications like Google Docs. Rather than "saving" and "loading" files, every change was automatically committed to storage. If you turned the device off (or it ran out of battery), you didn't lose your work.
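The pattern itself is simple to sketch: commit on every mutation instead of waiting for an explicit save. A toy illustration in Swift (the type and storage location are invented for illustration, not how the Newton actually did it):

```swift
import Foundation

// Toy sketch of Newton-style persistence: every edit is written
// to storage immediately, so there is no explicit "save" step.
final class AutoSavingNote {
    private let storeURL: URL
    private(set) var text: String

    init(storeURL: URL) {
        self.storeURL = storeURL
        self.text = (try? String(contentsOf: storeURL, encoding: .utf8)) ?? ""
    }

    func append(_ fragment: String) {
        text += fragment
        // Commit on every change; a power loss costs at most the latest edit.
        try? text.write(to: storeURL, atomically: true, encoding: .utf8)
    }
}
```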
Other systems did have double-click, and app bundles (which I still think are just fantastic) were a NeXT thing (which of course later became part of Apple, but was a separate company at the time). But yeah, Apple greatly refined those ideas and brought them to a mass market.
App bundles were just a better implementation of resource forks, which were invented by Apple and pre-dated NeXT.
NeXT was founded by people who worked at Apple (not just Steve) and they were largely put in charge when they came back to Apple. I wouldn't call them separate companies. Just a weird moment in the history of the company. A lot like what just happened at OpenAI.
App bundles have virtually no relationship to resource forks. I guess you could say that app bundles COULD include SOME of the metadata you could have included in forks, like whether something is an application or not. But that's about it.
On the NeXT always being Apple thing - I mean, some of it maybe was spiritually Apple, and eventually it was 100% Apple. But we're splitting hairs.
Eh, the difference between app bundles and resource forks isn't the functionality itself, but rather how the filesystem interface cuts through the functionality.
An OS X bundle is a Unix directory, whereas a classic Mac application is a file in a filesystem that supports multiple forks within a single file. Either way, you have typed objects (files or resources) that get carried around with a master object (the application).
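You can see the difference from code: on modern macOS, resource lookup inside a bundle is just a directory-path convention. A small sketch (the resource name is hypothetical):

```swift
import Foundation

// A bundle is just a directory, so looking up a resource is a
// path convention rather than a fork read. Name is hypothetical.
if let iconURL = Bundle.main.url(forResource: "AppIcon", withExtension: "png") {
    // Typically resolves to .../MyApp.app/Contents/Resources/AppIcon.png
    print(iconURL.path)
}
```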
The first Mac came out in 1984; NeXT didn't have a product until 1988.
NeXT was later bought by Apple and their tech became the foundation of Mac OS X in 2001.
But I was referring to the original '80s Macintosh System, not OS X. :)
Kinda funny that iPad/iOS has sort of gone in reverse on this, by virtue of not really having an open file system. You now open the app, then open the document within it.
There's also the Files app, which Apple added later, that gives you a filesystem view where you can tap files to open them in their associated application.
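Apps can hook into that same view with the system document picker. A minimal UIKit sketch (the content types are just examples):

```swift
import UIKit
import UniformTypeIdentifiers

// Present the system document picker so the user can choose a
// file from the Files-style browser; content types are examples.
final class PickerHost: UIViewController, UIDocumentPickerDelegate {
    func pickDocument() {
        let picker = UIDocumentPickerViewController(forOpeningContentTypes: [.plainText, .pdf])
        picker.delegate = self
        present(picker, animated: true)
    }

    func documentPicker(_ controller: UIDocumentPickerViewController,
                        didPickDocumentsAt urls: [URL]) {
        print("User picked:", urls)
    }
}
```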
It originates from Xerox PARC. I see you discuss this below; it was the Xerox board that couldn't see beyond its own nose and sold it to Apple. From Jobs' own description of being blown away by Xerox, it sounds like he would never have thought of it himself.
Didn't they steal most of that from X? As in Xerox's graphical desktop environment? It was around long before Apple grabbed it.
Xerox's prototype desktop computer was called Alto, not X, and had some of these features in a very early form. It was never made into a product for the open market; it was used internally at Xerox and at some research universities.
Apple didn't "steal" from the Alto; Xerox invested in Apple and allowed Steve Jobs and Apple engineers to tour their facilities for product ideas.
You might also be thinking of the X Window System for Unix, whose modern descendant most Linux systems are still using. It's pretty different from the Mac approach.
https://en.wikipedia.org/wiki/Xerox_Alto
No, I was thinking of Xerox's initial investigation into rectangular-window-based user environments, which literally every single GUI desktop system inherits from. Its name wasn't especially relevant, given it was the only thing of its kind at the time.
Early on, most people first saw it from Apple. I'm most certainly not referring to the far more modern (if simplified) X Window System, which I happen to have in a BSD VM.
My point, which you seem to agree with, is that Xerox did it first; Apple just brought it to market. They didn't invent it, and they didn't ultimately innovate on it any more than Microsoft, Sun, KDE, GNOME, or anyone else did; they just served as the earliest exposure most people got to the concept.