The engineers who designed the #Voyager probes half a century ago even thought of the possibility that a wrong sequence of commands might point the antenna dish away from Earth (as someone did a couple of days ago).

So they implemented a self-adjusting mechanism that, a few times a year, scans the positions of a few known stars to infer the position of Earth and points the antenna back in the right direction.

50 years later, these wonderful machines are still working, tens of billions of km away from Earth, with only 69 KB of RAM, and even a wrong sequence of commands won't put them out of use. Meanwhile, 4 GB of RAM isn't even enough to start VS Code or IntelliJ nowadays.

The more I understand how they were designed, the more I feel like an early medieval engineer looking at the Pantheon or other marvels of Roman architecture. Some amazing skills, knowledge and attention to detail have been lost from that generation to ours.

in reply to Fabio Manganiello

Only 69 KB of RAM? So I'm guessing you never ran VisiCalc on a 48 KB Apple II…
in reply to Fabio Manganiello

The machine on Voyager was purpose-designed for one job only. Its hardware and software were designed together by experts for a single purpose, and it works wonderfully. A PC is a general-purpose machine designed to do thousands of different tasks well. A PC can run a simulator of Voyager or another space probe, but Voyager can't play Doom. @june
in reply to Nora Rose

@Nora Rose @:gay: zetta :transknife: @Fabio Manganiello there's a lot more to it than that, and purpose-built hardware doesn't affect this very much (purpose-built software, however, is a major factor).

But also, there are years of cruft from programmers no longer being under pressure for resources. Very often today a program's memory usage comes down to the fact that it would take a lot of work to optimize it, and systems have more than enough resources to run it unoptimized without much issue...

in reply to Nora Rose

@bumpus @june I've dug into this aspect (abstractions vs. specialization in engineering) in this message: https://social.platypush.tech/@blacklight/110804405853563768.

You are right: when you optimize, you usually specialize and lose abstractions, which means losing the general-purpose aspect.

But the opposite is also true - create too many abstractions, and you start introducing more cognitive burden, besides the performance overhead.

I feel like there's a "sweet spot" for general-purpose software.


in reply to Fabio Manganiello

@Fabio Manganiello @Nora Rose @:gay: zetta :transknife: Absolutely agree!

That's why I tried to avoid value-judgement terms in my commentary on it... I mean, I'm a Python programmer (hobbyist); that right away puts me right in there with using more resources than I technically need for a project!

in reply to Fabio Manganiello

"If the young but knew..." My first computer had a whopping 4K (yes, 4 Kilobytes) of main memory. Today I'm forced (occasionally) to deal with a circle (jerk) of Agile coders who are powerless when the IDE on their MacBook Pros won't launch, with absolutely no idea where to start in fixing the problem; the file servers, the datastores, their Macbook, the network, or a hundred other things they know nothing about. Oh, and trying to teach them about IPv6 is basically pointless.
in reply to Lorq Von Ray

@lorq trying to teach anybody about IPv6 is pointless, I say as somebody who’s been doing this stuff for ~30 years now.
in reply to BJ Swope :verified:➖

@cybeej @lorq Do you (both) think it has to do with the commercialisation of education itself?

I keep seeing all those certifications and courses for and by companies and their products, as well as universities and schools being filled with tech from either Google, Apple or Microsoft, and I wonder whether studying informatics is nowadays the only way to actually get taught the basic skills you need to understand the technology you're working with.

in reply to Natasha Nox 🇺🇦🇵🇸

I think it is due to the abstraction of technology. As products and services lower the barrier to using tech, they usually remove the need to learn the underlying fundamentals. So more people use the tech but fewer understand how it actually works. It's both good and bad.
in reply to BJ Swope :verified:➖

good point there.

I've looked at some of the assembly code that was written at NASA in the 1960s and 1970s (the code for the Apollo program was also open-sourced a while ago).

I was impressed by how "embedded" it was, with literally zero room for abstractions. The code was tightly coupled to the hardware. Every single bit was crafted and optimized for its specific purpose on that specific hardware.

You can do amazing things once you remove all the abstractions. You can get very creative on how to optimize resources if you know that your code will only run on one specific piece of hardware, for a very specific purpose.

Let's not forget that until at least the early 1980s most computers didn't even agree on the word size, or even on the byte as the minimum unit of addressable information.

As we started seeking compatibility, more general-purpose hardware and software, lower entry barriers for coding, etc., we started introducing abstractions.

Those abstractions came with many advantages - namely lower barriers for coders and greater portability. Also, decoupling software from hardware meant that you no longer had to be an electronics engineer (or an engineer who deeply understands the architecture the software runs on) in order to code. But I also feel like there's an inflection point, and once we pass it those advantages start to fade.

It's obvious why a compiler that can digest the same piece of C/C++ code and produce an executable for different architectures is a useful abstraction. The same goes for a virtual machine that can run the code on any architecture without recompiling it from scratch.
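To make the VM half of that point concrete, here's a minimal Python sketch (the file names are made up for illustration): CPython compiles source to bytecode that is tied to the interpreter version, not to the CPU architecture, which is exactly the kind of portability that abstraction buys.

```python
# A minimal sketch, assuming hypothetical file names: CPython compiles
# source into bytecode that depends on the interpreter version, not on
# the CPU architecture, so the same .pyc can be shipped to any platform
# that has a matching interpreter.
import py_compile

with open("hello_portable.py", "w") as f:
    f.write("print(sum(i * i for i in range(5)))\n")

pyc_path = py_compile.compile("hello_portable.py", cfile="hello_portable.pyc")
print("compiled to:", pyc_path)  # this .pyc runs unchanged on x86-64, ARM, ...
```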

But when building a basic website ends up creating a node_modules folder of 200 MB, requires the developer to be familiar with npm/yarn/gulp pipelines, transpiling etc., and maybe also with deploying their code to a Kubernetes cluster running on somebody's private cloud, I wonder if we've already passed that inflection point.

in reply to Fabio Manganiello

Maybe that's because the #Voyager2 probe had more $$$$$$$$$ given to its #SoftwareDevelopment than everyone who has worked on #VScode and #IntelliJ combined, to this day...?
in reply to Kevin Karhan :verified:

@Kevin Karhan :verified: @Fabio Manganiello you cite two (relatively) high-level programming environments for which no amount of money would achieve that efficiency.

You know what does it easily? Writing your code in assembly like they did.

Every level your programming language sits above assembly results in an exponential loss in efficiency: you trade resources for programmer workload and vice versa.

I can't find it anymore, but I remember seeing a programmer's website where they had a whole bunch of Windows gadget programs, full-fledged GUI applications and non-trivial ones, all under 5 KB in size because they were written in assembly (they made a game/hobby, I believe, of rewriting programs in assembly to see if they could get them under 5 KB).

in reply to LisPi

@LisPi @Fabio Manganiello that's certainly one way to put it lol

I write Python because when I look at trying to write the same code in C or assembly... well, I definitely don't give enough of a shit for that lol

in reply to Shiri Bailem

That's not really the only option though: #CommonLisp performs within an order of magnitude of C/C++ with barely any effort (maybe just a few type annotations in some loops).

There are other structural problems with CPython (not sure whether they're inherent to the language or just a CPython problem), because it performs significantly worse than many other bytecode-VM languages (like #Racket).

(I'm very willing to believe it's just cpython being a bad compiler/implementation)

in reply to LisPi

@LisPi @Fabio Manganiello it's definitely wildly different, and part of the reason I threw assembly in there as a comment too.

You could write a whole paper on the inherent performance problems in Python. It's a language whose first and foremost goal is readability and ease, and it makes A LOT of compromises for that.

For one, CPython is interpreted, not compiled. That causes a drastic reduction in performance (PyPy uses a JIT and, for heavier applications, ends up drastically faster than CPython).
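As a rough illustration of that point (a sketch, not a rigorous benchmark): the same tight pure-Python loop, unchanged, behaves very differently under CPython's bytecode interpreter and PyPy's JIT.

```python
# A minimal sketch, not a benchmark of record: a tight pure-Python loop
# of the kind where CPython's interpreter overhead dominates and PyPy's
# JIT typically gives a large speedup. Absolute numbers vary by machine.
import time

def sum_of_squares(n):
    total = 0
    for i in range(n):
        total += i * i
    return total

start = time.perf_counter()
result = sum_of_squares(10_000_000)
print(f"result={result}, elapsed={time.perf_counter() - start:.2f}s")

# Run this same file with `python3` and with `pypy3` to see the gap;
# the source code itself doesn't change at all.
```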

The fact that it uses heap memory and dynamic types causes some performance hits as well.

And it's abstracted to hell and back, which makes it super easy... but super inefficient.

At your mention of it, I took one look at CommonLISP and definitely said I don't give enough of a shit to learn that lol.

in reply to Shiri Bailem

@shiri @lispi314
The following opinion is based on 16 years of Python experience (including some where I had to squeeze as much performance out of it as possible without dropping to C) and 14 years of Common Lisp experience: CL has every advantage Python has as a high-level dynamic language, with support for abstraction way above what Python can offer, and it performs 2 orders of magnitude better in some cases. Also, CL takes way more than one look to learn.
in reply to 🏴 Lispegistus 🧠➕🖥️

@🏴 Lispegistus 🧠➕🖥️ @LisPi @Fabio Manganiello definitely not every advantage... one glance is enough to get the basics with Python, which is kind of the point of the language. From what little I know, readability is cited as the biggest reason Common Lisp isn't more popular.

I'm not going to say that Lisp is bad in any way, just that it's designed differently from Python and scratches a different itch.

Also, "support for abstraction" is a meaningless phrase, if we're going to talk about the amount of abstraction a language supports then assembly would rule them all. The difference is the base levels of abstraction, a more highly abstracted language is going to have less options for adding additional additional abstraction and you have to go significantly out of your way to reduce abstraction (Python's print command is abstracted all the way to hell and back for instance, which isn't the greatest for performance)

in reply to Shiri Bailem

@shiri @lispi314 Yes, python is slightly more readable to someone who is not used to reading lisp. To someone who is, they're basically equally readable.

The problem is that in Python you don't have a choice: everything is either a very poorly thought-out VM bytecode interpreter or a C extension. So dicts are very fast, because they're written in C, but if you want to add your own data structures, they're slow unless you write them in C. In Lisp you can do it all just fine.
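A quick way to feel that gap, purely as a sketch (the NaiveMap class is made up for illustration, and timings depend entirely on your machine): compare the C-implemented built-in dict with a deliberately simple pure-Python lookup structure.

```python
# A minimal sketch of the point above: the built-in dict is implemented
# in C, while a hand-rolled pure-Python structure pays interpreter
# overhead on every operation. Timings are illustrative only.
import timeit

class NaiveMap:
    """A deliberately simple pure-Python hash map (separate chaining)."""
    def __init__(self, nbuckets=1024):
        self.buckets = [[] for _ in range(nbuckets)]

    def put(self, key, value):
        bucket = self.buckets[hash(key) % len(self.buckets)]
        for i, (k, _) in enumerate(bucket):
            if k == key:
                bucket[i] = (key, value)
                return
        bucket.append((key, value))

    def get(self, key):
        bucket = self.buckets[hash(key) % len(self.buckets)]
        for k, v in bucket:
            if k == key:
                return v
        raise KeyError(key)

builtin = {i: i for i in range(10_000)}
naive = NaiveMap()
for i in range(10_000):
    naive.put(i, i)

print("dict lookups:    ", timeit.timeit(lambda: builtin[5000], number=100_000))
print("NaiveMap lookups:", timeit.timeit(lambda: naive.get(5000), number=100_000))
```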

in reply to 🏴 Lispegistus 🧠➕🖥️

@shiri @lispi314 Python was designed as a teaching language where nothing mattered except how easy it was for a beginner to make something happen. That's a poor base to build serious development on, but people have tried and mostly managed to get some use out of Python in some contexts. I happen to work in many of those contexts, and Python's limitations and poor design have definitely wasted a lot of my employers' money, paying me to do stuff that's harder than it would be in a better lang.
in reply to 🏴 Lispegistus 🧠➕🖥️

@lispegistus @shiri I can remember several instances of breaking changes in minor version changes (let's completely remove/rename stdlib modules, what's the worst that could happen?) and them promptly breaking my code.

Amusingly, as far as teaching languages go, I think Racket might do a better job both of teaching and of being used effectively in prod anyway.

in reply to LisPi

@lispi314 @shiri Oh, Racket is unmatched as a teaching language; that's what nearly 30 years of iteration in actual classrooms gives you.
in reply to 🏴 Lispegistus 🧠➕🖥️

@🏴 Lispegistus 🧠➕🖥️ @LisPi @Fabio Manganiello while it's popular for teaching, I don't think Python was ever intended as a teaching language specifically.

As I understand it, Python was inspired by ABC, which was a teaching-specific language; that could definitely be an influencing factor on a lot of things.

But the biggest thing in Python is "pretty code" (so much of PEP8 is style formatting).

in reply to Shiri Bailem

@LisPi @Fabio Manganiello @🏴 Lispegistus 🧠➕🖥️ also I see here a lot of the same logic I see elsewhere, where a language being considered easy to read once you've learned it is judged against Python's design intent of being easy to read even if you don't understand programming at all, as if they were the same thing.

It's much like the classic argument of whether someone should learn C before Python or Python before C. One side argues that the learning value should be judged based on the completeness of their knowledge at the other end, while the other argues that the learning value is in how easy it is to reach the end (with the completeness being a later step). (I'm in the Python-before-C camp; I believe a language like Python is better for learning the high-level concepts, and learning Python before C would make learning both faster and easier than just learning C in the first place.)

Also, I think this is the biggest reason Python is so popular: it's so easy for something to accidentally become production. So many Python projects started as a demo gadget that just became production software without any real hitches along the way (the first example that comes to mind is Ansible).

in reply to Shiri Bailem

@shiri @lispi314
You're only going to be a complete beginner for maybe a few weeks to a full semester, depending on the person. I'd rather a language used in production be optimized for the (hopefully) many, many years after.

Also, yes, that's true. I actually had to maintain some Python that accidentally became production, and fuck that.

in reply to Shiri Bailem

@shiri @lispi314 Possibly. It did very quickly shift to a scripting language and was used mostly by sysadmins and as an extension language until the mid-2000s, when it gained a foothold in web development; that's when I picked it up.

Pretty code compared to Perl, which was the big competitor at the time, and also one of the lowest bars in existence (when you unintentionally create an esolang because you're too smart for your own good... yes, Perl is on the esolang list: https://esolangs.org/wiki/Language_list :)

in reply to LisPi

either way, all other implementations are either slow as shit or hardly compatible, so it's not like anyone gave a shit about it besides the folks working on improving performance right in CPython (altho much of that might only come in CPy4 because it'll break a bunch of stuff)
in reply to Fabio Manganiello

agree, but have you ever coded a website for a 56k connection? The same principles apply today as they did then.
in reply to Fabio Manganiello

I bet there are some Russians performing wonders with washing-machine processors.

@inthehands

in reply to Fabio Manganiello

Reminds me of this old article about the software team working on the Space Shuttle.

"The most important things the shuttle group does — carefully planning the software in advance, writing no code until the design is complete[.]”

https://www.fastcompany.com/28121/they-write-right-stuff

in reply to Fabio Manganiello

Time, cost, quality — pick any two.

It's no different now than it was 50 years ago. It's just a question of which two NASA picked, and which two most regular commercial software projects pick.

in reply to Fabio Manganiello

That’s a big ole false equivalency. There is plenty of that sort of low-level bit crunching being done, with much more safety built-in, and much more capability; but if every program had to be made with that level of effort then computers would be as expensive and useful as they were, well, in the 70s. IntelliJ and VScode are massively powerful non-mission-critical tools that are free (or practically free) that can be run on most anything by most anyone with a passing interest.
in reply to Fabio Manganiello

This is the world in which game developers no longer make patch files for their games but instead just make people download the entire 40 gigabyte game with the current build again.
in reply to Fabio Manganiello

Some amazing skills, knowledge and attention to detail have been lost from that generation to ours.


Also the money and time that things were given.

in reply to Fabio Manganiello

Considering the majority of a modern program's size is the graphics, not the code behind them, it is not surprising that services that only use code need so little processing capability. Also, the processor on it is massively overclocked due to operating in a near-absolute-zero environment.
in reply to Arco Fox

@Arco Fox @Fabio Manganiello actually space is really rough on cooling since you can't use convection cooling. On top of that they would be more likely to underclock than overclock as they'd emphasize reliability over performance, especially when you've got your CPU bombarded with cosmic radiation.
in reply to Fabio Manganiello

The evils of the multiplication of technology when it begins to exceed the needs and understanding of it.
in reply to Fabio Manganiello

people of the #demoscene still obsess about getting the most out of limited hardware / within size limits, so not all is lost. :-) https://www.pouet.net/
in reply to Fabio Manganiello

it happens in every generation. Read Richard Preston's engaging _First Light_, a history of the Palomar Big Eye - the mirror support structure was a sort of mechanical computer developed in the 1930s whose history was lost, and when parts needed replacement in the 1980s, they had to CAT scan existing bits + think a great deal to recover that knowledge.