I am having some problems understanding what you mean by "algorithm" here. There is no such thing as a program without an algorithm; even if you try to sort a list by picking elements at random, that is an algorithm, even if it is a bad one. Clearly you must use the word as a proxy for a certain subset of algorithms.

That certainly applies to the internet as well; it is full of algorithms. That TCP guarantees delivery is governed by an algorithm, the routers that make the internet work have lots of algorithms for managing their routing tables, and there are routing protocols that implement algorithms between the routers which are crucial to keep the information flowing (see for instance OSPF https://en.wikipedia.org/wiki/Open_Shortest_Path_First).

Hey Christian,

I'm saddened by this question. It means that my essay completely failed in its purpose. Please forgive me. I'll try to explain a little more clearly here.

You're absolutely right. There are algorithms everywhere. But past a certain scale, algorithms don't work. Imagine an algorithm that tried to route every IP packet. Somewhere, a database would include every edge in the network and every node's IP address. It would find a shortest path to route your packet. You might be able to write such an algorithm, but it would be incredibly difficult, requiring more and more work for diminishing returns. Besides the code scaling issues, there would also be hard physical scaling issues in routing every packet centrally, let alone managing the data for a constantly changing network topology.
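To make that concrete, here is a minimal sketch of that hypothetical centralized approach, in Python with a made-up three-node network: one global map of every link, and a fresh shortest-path search per packet. It works on a toy graph; the point is that it stops working as the graph grows and churns.

```python
import heapq

# Hypothetical centralized router: one global map of every link
# (the "database of every edge"), and a shortest-path search run
# for each packet. Fine on a toy graph, hopeless at internet scale.
def shortest_path(graph, src, dst):
    # graph: {node: {neighbor: cost}}
    dist = {src: 0}
    prev = {}
    heap = [(0, src)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == dst:
            break
        if d > dist.get(node, float("inf")):
            continue
        for nbr, cost in graph[node].items():
            nd = d + cost
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(heap, (nd, nbr))
    # Walk back from dst to recover the full route
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return list(reversed(path))

graph = {"A": {"B": 1, "C": 4}, "B": {"A": 1, "C": 1}, "C": {"A": 4, "B": 1}}
print(shortest_path(graph, "A", "C"))  # ['A', 'B', 'C']
```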

IP works by having each node maintain its own state, which it can either be configured with or discover locally. The protocol then includes rules for forwarding packets. The whole thing was engineered to adapt to network outages and malicious nodes. But it does so in an emergent way (engineered, but emergent), arising from the interactions of many small algorithms, each operating on local state and communicating only with its direct neighbors.
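By contrast, here is a sketch of the hop-by-hop style (again hypothetical, not real IP): each node holds only its own small next-hop table, and the route emerges from purely local decisions.

```python
# Each node knows only its own next-hop table (local state) and
# hands the packet to a neighbor; no node holds the whole map.
# Node names and tables are made up for illustration.
tables = {
    "A": {"C": "B"},   # at A, packets for C go via B
    "B": {"C": "C"},   # B delivers directly to C
    "C": {},
}

def forward(packet_dst, node):
    hops = [node]
    while node != packet_dst:
        node = tables[node][packet_dst]  # purely local decision
        hops.append(node)
    return hops

print(forward("C", "A"))  # ['A', 'B', 'C']
```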

This kind of thinking that was successful in networking is what Alan Kay was trying to bring into the realm of programming. Algorithms don't scale. So can we build small algorithms that, as a group, work together to form a system that does scale? What are the language features we need (he identified message passing, local state, and local processing)? And what are the principles and patterns (the paradigm?) that enable us to program that way?
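To illustrate those features, here is a loose analogy in Python rather than Smalltalk (my own toy example, not Kay's): the object keeps its state local, and the only way in is to send it a message.

```python
# Toy illustration of message passing over local state.
class Counter:
    def __init__(self):
        self._count = 0  # local state, invisible from outside

    def send(self, message, *args):
        # message passing: dispatch by name, no direct state access
        return getattr(self, "_" + message)(*args)

    def _increment(self):
        self._count += 1

    def _report(self):
        return self._count

c = Counter()
c.send("increment")
c.send("increment")
print(c.send("report"))  # 2
```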

The paradigm is what really failed to transfer out of the lab. I basically have a degree in OO Java, and I wasn't taught anything about that. I might talk more about that in the next issue.

Thanks and my apologies again!

Rock on!

Eric

You obviously can define "algorithm" any way you like, but I do not think it helps the point you are trying to make to conflate algorithms (as in a prescribed method to achieve a goal) with domains of control (which is my best guess as to what is going on).

So how do you define "algorithm"?

As a quick aside, though not really important to the discussion, I think your view of how IP works is woefully simplistic. It may have been somewhat accurate back in the 70s, when ARPANET was both new and small, but that is far from the Internet we know today.

It obviously would not work to have a single central node be responsible for all routing, but routers will generally have a pretty good view of the network topology (using, for instance, the OSPF I mentioned to exchange such knowledge) and will use that when deciding how to route packets. Routing tables can become huge.
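To make the link-state idea concrete, here is a toy sketch (in Python, with made-up router names; real OSPF is of course far more involved): every router floods an advertisement of its neighbors, assembles its own copy of the map, and computes next hops locally.

```python
from collections import deque

# Toy link-state routing: each router floods "here are my neighbors",
# every router assembles the same map, then computes next hops on its
# own. Unit link costs; names and topology are made up.
advertisements = {
    "R1": ["R2"],
    "R2": ["R1", "R3"],
    "R3": ["R2"],
}

def next_hop(router, dst):
    # BFS over this router's own assembled copy of the map
    parent = {router: None}
    queue = deque([router])
    while queue:
        node = queue.popleft()
        for nbr in advertisements[node]:
            if nbr not in parent:
                parent[nbr] = node
                queue.append(nbr)
    # walk back from dst to find the first hop out of `router`
    node = dst
    while parent[node] != router:
        node = parent[node]
    return node

# each router builds its own table -- no central authority
print({dst: next_hop("R1", dst) for dst in ["R2", "R3"]})  # {'R2': 'R2', 'R3': 'R2'}
```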

Hi Christian,

Thanks for the discussion. Your questions get right to the heart of the matter. Let me clarify my intentions, because I think they're key to the mental shift it may take to appreciate the essay.

So the point isn't to define "algorithm". The idea is to explain Alan Kay's claim that he was trying to "avoid data structures and algorithms."

It was confusing to me because, like you say, even IP is an algorithm. And how do you avoid algorithms when you're programming? Seems impossible. He's a smart dude, so I knew he was probably trying to say something I didn't understand. And in a fast-changing field like software, I guessed that perhaps the words had changed their meaning, as they often do.

So the trick is to try to piece together the meaning Kay had in mind at the time he said it. I think I did that pretty well.

As to my understanding of IP, I do think it is woefully inadequate for many purposes, but for this purpose it is just fine. First of all, the context we're interested in is the 60s and 70s, when Alan Kay was designing Smalltalk. So an understanding of networking as it stood at that point is what we're looking for.

Second, you say "It obviously would not work to have a single central node be responsible for all routing", but that was not obvious at the time. Further, how to find something that would scale was even farther from obvious. IP took a leap of ingenuity, and Kay wanted to bring an analogous leap to programming. That leap is what this essay is about.

Third, your example of how modern routers work is a spot-on example of the kind of engineering that becomes possible with the scalability Alan Kay was looking for. Each router maintains its own routing table. It is not centralized. And it is discovered dynamically. This is exactly what he was talking about when he encapsulated state into an object.

But the fourth and final point is one Kay has made elsewhere that I don't want to find right now. I'll paraphrase: the internet has been running continuously since 1983 (Flag Day, when the ARPANET switched to IP), despite no machine from the original network still being on it, and without ever going down. Even though the routers do much more sophisticated work than they did originally, they're still based on the same basic protocols. That's the kind of scaling Kay was talking about.

To elaborate on some of the possibilities Kay was looking for, here are some provocative questions:

* Why does software ever have to go down, even for an upgrade?

* How can software improve as it gets larger?

* How can software improve as we learn how to do things better?

* How can software scale up without slowing down development?

So please don't get hung up on the words like I did for so long. My current understanding is that it's about a subtle shift from "step-by-step procedure" to "a complex system with a critical attractor". (Sorry to bring up more technical terms right at the end!)

Thanks again for the discussion!

Eric
