How Does One Explain The Difference In The Size & RAM Usage Of Modern Operating Systems?

Paralel

Tinkerer
Dec 14, 2022
115
47
28
My installation of System 7.1.2 on my Blackbird takes up only ~10 MB of hard disk space and uses ~3300 KB of memory, and can do most of what I can do with a modern system. Same trackpad, same keyboard, same LCD screen. Same ability to connect to external input devices and video output. Browse the web, on a wired or wireless connection. Access a CD/DVD drive. Use any given piece of productivity software. Etc. If the hardware were present, I'd likely be able to access USB devices without much more overhead in hard drive and memory usage, based on the mass storage and USB extensions Apple used in its later operating systems.

My Windows 11 installation takes up ~29 GB and uses 5.3 GB of memory. How, in ~30 years' time, has the size of operating systems expanded ~2,900x and memory use increased ~1,600x, when I honestly can't do that much more with this OS than I could with that one?
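
For what it's worth, here's the arithmetic behind those rough factors (decimal units, using the round numbers above; a sketch, not exact measurements):

```c
#include <stdio.h>

int main(void) {
    /* Approximate figures quoted above, in decimal MB and KB */
    double sys7_disk_mb = 10.0,   win11_disk_mb = 29000.0;   /* ~29 GB  */
    double sys7_ram_kb  = 3300.0, win11_ram_kb  = 5300000.0; /* ~5.3 GB */

    printf("disk growth: ~%.0fx\n", win11_disk_mb / sys7_disk_mb); /* ~2900x */
    printf("RAM growth:  ~%.0fx\n", win11_ram_kb  / sys7_ram_kb);  /* ~1606x */
    return 0;
}
```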

Has the coding ability of the people writing this stuff really gotten that much worse? Or am I missing something?
 
Nov 4, 2021
126
98
28
Tucson, AZ
Some legitimate causes of bloat:
  • more pixels and colors: icons and graphics are a lot bigger and there are more of them. There are often multiple resolutions now, and they're all true color, often with alpha channels. For example, the video RAM used for the whole screen on a Mac Classic is under 22 KiB. A normal 1080p modern display uses almost 8 MiB, which is often duplicated to hide drawing glitches, and then duplicated again for every window that gets composited together for the final display. All of that is spent on making updates faster and smoother and on avoiding re-running comparatively slow rendering. (A rough calculation of those buffer sizes appears after this list.)
  • Internationalization: these days we graciously allow non-English speakers to use computers, but to do that we have to keep complete copies of every piece of text for each language.
  • amd64 instructions are inherently larger, with 64-bit addresses and operands; arm64 is probably similar, though maybe not as bad.
    • Being 64-bit native, compilers & programmers use 64-bit integers for everything by default, because using smaller sizes requires extra instructions to pack and unpack them. In olden times, making every byte count was worth a little processing overhead and extra programmer effort. (A small struct-size sketch appears after this list.)
  • Caching & virtual memory: in the days of yore, nothing unnecessary was kept in RAM and loading things from storage had to be deliberate. Now, if something might be used again, it's held onto in RAM, with the OS expected to swap it to disk if physical RAM needs to be freed up. In a similar vein, much (most?) data and code gets memory-mapped these days, which is to say lazily loaded into RAM by the OS when it's actually accessed, but counting against memory usage immediately. (A minimal mmap example appears after this list.)
  • Indexing: "all" user data is scanned and stored in databases that are inefficient for storage but efficient for searching, to enable instant search.
  • real-time assistance: spell check and its associated dictionary. Predictive text lookups. Other keyword lookups, like @-tagging people on this forum.
  • backwards compatibility: Windows 11 probably comes with a dozen different equivalents of the Mac Toolbox and the rest of the OS libraries, to maintain backwards compatibility with 30 years of revisions.
  • web-everything: lots of apps use Electron or other embedded web views for their UI. Web-style interfaces are easier to develop and maintain, at the cost of huge runtime bloat.
    • an addendum to this one: JavaScript/npm, with libraries built on libraries built on libraries built on libraries.
  • users multitasking and offloading their brains to the machines: I have over 50 tabs open in Chrome right now, most of them "remembering" harebrained ideas, projects in progress, and things I want to look into later. Plus Discord with god knows how many servers and channels, Signal with several chats, all with full histories and interspersed with memes, and Notepad++ with 9 random text files.
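
To put rough numbers on the graphics point, here's a back-of-the-envelope calculation in C. The display geometries are the ones mentioned above; the "10 composited surfaces" figure is an invented illustration, not a measurement of any real compositor:

```c
#include <stdio.h>

int main(void) {
    /* Mac Classic built-in display: 512 x 342 pixels at 1 bit per pixel */
    unsigned long classic_bytes = 512UL * 342UL / 8UL;

    /* A 1080p display at 32 bits per pixel (true color plus alpha) */
    unsigned long hd_bytes = 1920UL * 1080UL * 4UL;

    printf("Mac Classic screen buffer: %lu bytes (~%.1f KiB)\n",
           classic_bytes, classic_bytes / 1024.0);
    printf("1080p true-color buffer:   %lu bytes (~%.1f MiB)\n",
           hd_bytes, hd_bytes / (1024.0 * 1024.0));

    /* Double buffering plus per-window backing stores multiply that again;
       ten full-screen-sized surfaces is a made-up but plausible count. */
    printf("10 composited surfaces:    ~%.0f MiB\n",
           10.0 * hd_bytes / (1024.0 * 1024.0));
    return 0;
}
```

That's roughly 21 KiB versus 8 MiB for a single screen, and tens of MiB once compositing gets involved.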
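
On the 64-bit point, a quick sizeof comparison shows how "just make everything 64-bit" pads out data structures. Both structs are invented for illustration, and the exact sizes depend on the compiler's alignment rules (typically 8 vs. 24 bytes on common 64-bit targets):

```c
#include <stdio.h>
#include <stdint.h>

/* Old-school record: field sizes hand-picked to fit the data */
struct tight {
    uint8_t  flags;
    uint16_t count;
    uint32_t offset;
};

/* Typical modern record: every field a 64-bit integer */
struct roomy {
    uint64_t flags;
    uint64_t count;
    uint64_t offset;
};

int main(void) {
    printf("tight: %zu bytes, roomy: %zu bytes\n",
           sizeof(struct tight), sizeof(struct roomy));
    return 0;
}
```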
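
And on memory mapping, a minimal POSIX sketch of the behaviour described above: the whole file counts as mapped memory the moment mmap returns, but pages are only actually read from disk when they're touched. The file name here is hypothetical:

```c
#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/mman.h>
#include <sys/stat.h>

int main(void) {
    /* Open some (hypothetical) large data file read-only */
    int fd = open("bigdata.bin", O_RDONLY);
    if (fd < 0) { perror("open"); return 1; }

    struct stat st;
    if (fstat(fd, &st) < 0) { perror("fstat"); return 1; }

    /* The whole file now shows up in the process's virtual size... */
    const char *data = mmap(NULL, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
    if (data == MAP_FAILED) { perror("mmap"); return 1; }

    /* ...but touching one byte faults in only that one page (~4 KiB),
       even though task managers often report the full mapping. */
    printf("first byte: %d, mapped size: %lld bytes\n",
           data[0], (long long)st.st_size);

    munmap((void *)data, st.st_size);
    close(fd);
    return 0;
}
```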

I think there is a strong streak of laziness, too, and of using resources just because they're there.
 

Trash80toG4

Active Tinkerer
Apr 1, 2022
910
260
63
Bermuda Triangle, NC USA
Great explanation. To push it even farther back in history:

In ancient times, when my dad was a systems engineer at IBM, resources were incredibly tight. As he explained it, there was no operating system at all; in the early Sixties each program ran standalone, at machine level, on what really was the bare metal. Higher-level languages, and the operating systems that code would run on, developed in tandem as more cycles and resources became available through the Sixties.

IBM pulled him into the Glendale Labs (IBM's equivalent of Xerox PARC) during that later Sixties development phase. Later on in the Seventies, he was in and then ran the senior engineering group during the rise, prominence and fall of the MiniComputer. Heady times I think, much like the early days of PC and Mac development.
 

Paralel

Tinkerer
Dec 14, 2022
115
47
28
Trash80toG4 said:
Later on in the Seventies, he was in and then ran the senior engineering group during the rise, prominence and fall of the MiniComputer. Heady times I think, much like the early days of PC and Mac development.

It was the MiniComputer that led to the development of the OS as we know it now, or am I wrong about that?
 

Trash80toG4

Active Tinkerer
Apr 1, 2022
910
260
63
Bermuda Triangle, NC USA
Dunno, it seems a blur to me. "As we know it" is slippery terminology, covering a whole lot of the ground in question. The Big Iron->Mini conversion started in the very early Sixties. I'd say some version of UNIX might be considered the starting point, becoming the Linux and Mac OS X of today at some point?

 

speakers

Tinkerer
Nov 5, 2021
98
76
18
San Jose, CA
peak-weber.net
Paralel said:
My installation of System 7.1.2 on my Blackbird takes up only ~10 MB of hard disk space and uses ~3300 KB of memory, and can do most of what I can do with a modern system.

No -- you can only do a small part of what a modern system can do. You can't access the modern web, and certainly can't watch 4K YouTube videos. Nor edit multi-channel audio and video. Nor interoperate with most other modern machines. Nor use machine learning. I could go on and on.

And most of these modern capabilities are built right into the operating system itself. And others are available as modular additions.

All this requires many orders of magnitude more code than System 7 and everything that could be installed on it. Modern 64-bit systems are less memory efficient than the 68k machines of yore ... but they don't need to be. Transistors are 3 orders of magnitude more plentiful, more efficient, and faster. And storage capacities are 4 orders of magnitude larger and 3 orders faster. And for less money!

I love my retro machines but I can't carry them around in my pocket and do the stuff that my iPhone does.