Transcript

Transcript prepared by Bob Therriault, Adám Brudzewsky, and Igor Kim.
[ ] reference numbers refer to Show Notes

00:00:00 [Conor Hoekstra]

This kind of conversation half the time, I just, there's birds flying around in my head being like, what is the truth of it all?

00:00:08 [Marshall Lochbaum]

It's a common thing for a bird to say.

00:00:11 [CH]

My birds talk Marshall. They all correspond to the combinator birds. There's a black bird, there's a blue bird.

00:00:17 [ML]

That's why. Uh, huh?

00:00:18 [CH]

They're all just chirping with each other.

00:00:21 [Stephen Taylor]

I can't wait to see what Bob comes up with as a title.

00:00:24 [Adám Brudzewsky]

But we need to say happy array programming, we didn't round it off. You said it's a wrap.

00:00:28 [Bob Therriault]

We're still recording.

00:00:28 [CH]

This is all part of the episode.

00:00:30 [ML]

That's the cold open.

00:00:32 [Music Intro]

00:00:41 [CH]

Welcome to another episode of ArrayCast. I'm your host Conor and today with me, we've got our four panelists. We'll go around and do short introductions. We'll start with Bob, then go to Stephen, then Marshall, and then Adám.

00:00:51 [BT]

My name is Bob Therriault, and I am a J enthusiast. I like J a lot.

00:00:55 [ST]

I'm Stephen Taylor. I'm an APL and q programmer.

00:00:59 [ML]

I'm Marshall Lochbaum. I started as a J programmer. I worked at Dyalog for a while, and now I develop BQN.

00:01:06 [AB]

I'm Adám Brudzewsky. I do APL, have done so all my life.

00:01:11 [CH]

And as mentioned before, my name is Conor. I refer to myself typically as a polyglot programmer, C++ by day, array languages by night, and sometimes during the day as well, and super excited to be chatting about our topic today, which is kind of maybe a follow-up to last episode, which we had with Henry Rich. But before we do that, I think we've got two and a half or three announcements from Adám, and then we'll go from there.

00:01:35 [AB]

Yeah, nice. There was this meeting called APL Seeds, which was on the 22nd of March, 2023, aimed at new APLers, and there were some presentations there. The videos for all those presentations are now up. Then there is a meetup happening on the 24th and 25th of April in Bingen in Germany; APL Germany is arranging that, and you can sign up for it there. And then there is my other podcast, the APL Show, with Richard Park. We did a reaction video to a presentation called Change the Way You Write, Change the Way You Think, and both parts of that reaction video are now up as well. [01]

00:02:21 [CH]

Awesome. So links to all of that, of course, will be in the show notes, or you can find them at arraycast.com. And I think with that, we're going to throw it over to Bob to start the discussion for whatever topics we end up talking about today by reading some feedback we got; or not feedback, I guess it was a follow-up email from Henry about the conversation that we had with him. So over to you, Bob.

00:02:42 [BT]

Yes, Henry sent us this email, and he said: "Guys, I much enjoyed our discussion on Tuesday and found it enlightening, something we always hope for. Two remarks especially caught my attention. Marshall pointed out that APL, J, BQN, and K, hereafter called the TPLs (he refers to them as true programming languages), have a different idea of simplicity than the other languages do. The others create primitives for dealing with the smallest items and then provide simple means for the programmer to join those primitives. Randy McDonald used to call ordinary computer languages pinhead languages, because they operated on blocks that could fit on the head of a pin. The simplicity of the operations guarantees that many lines of code are needed to create even a small program. TPLs create the powerful primitives we know, which, as a special case, can deal with atoms but usually do much more. Are there analogies in other fields? What about architecture? The architect imagines a building as being made up of floors, windows, and walls, not bricks, ducts, and pipes. By avoiding detail, they are free to design grandly. Or chemistry. The chemist knows that chemical bonds stem from quantum effects, but to get the day's work done, they work at a level that largely ignores these details. I was proposing that the TPLs are the true programming languages, as opposed to mere computer languages. Conor said, while making a different point, that everyone recognizes C as a programming language. I want to suggest that this is a case where what everyone knows is wrong. It's hardly the first time in our business. How often have you read that 64-bit computers are better/faster than 32-bit computers because they work on data in blocks of 64 bits rather than 32? That hits the nail on the thumb. 64-bit machines have a 64-bit address space, and to a programmer that's the difference between being chained in a prison cell and being free.
Similarly, in a podcast that focuses on array languages, you will do your listeners a service if you point out that C should be thought of as a language for guiding a computer rather than a language for describing computation. To the extent that programming a computer means telling it exactly what to do, I suppose you have to call C a programming language. It is as different from a TPL as bricklaying is from architecture. Array languages don't have to be TPLs. You can add a set of matrix primitives to C and you still have a computer language. I hope your listeners will learn one of the TPLs." And that's Henry. So that was Henry's feedback to us about his discussion with us. I'll leave it open for comments.

00:05:32 [ML]

Yeah, well, to start with, I think I'd like to qualify what I said, because this is something that I didn't have a lot of time to think about; it was during the conversation. And I'm still kind of struggling with, you know, what is really the difference? I mean, I think Henry's accurately described the difference between APL and C, but what is the difference between APL and a lot of other languages that offer abstractions? So object-oriented languages, functional languages, particularly things like Haskell that have type classes. And I don't think it's right to say the difference is just that APL is the only language that thinks bigger than a single element. But at the same time, it does seem that there's something different about having this one particular data type, the array, that you then build up into something that could be made into a whole programming language, as opposed to saying to the user, "Well, we provide all this scalar stuff, but on top of that, you can build other abstractions." So, like, in Haskell you have a monad, which has a few operations, but, and I may be wrong about this, I think the programmer generally thinks, well, this is a way of kind of summarizing or unifying a lot of individual structures that are the real underlying thing that's happening. So the programmer thinks about this as an abstraction, instead of maybe thinking, actually, when I'm programming, I'm just manipulating monads. And array programmers definitely think in terms of, well, my program works at the level of the array. So that's kind of the way I think of the difference. I don't have that much experience in these other languages, but that's how it seems to me.

00:07:21 [CH]

Yeah, I'm very uncomfortable with the TPL being true because I think that it's...

I agree in the essence of kind of what Rich is saying, but it sounds very...

like "true" is like the one language to rule them all, which I definitely disagree with, if that's how it's going to be interpreted. I don't think that's what Rich means, but if I just hear "TPL", that it stands for "true programming language", and then there's a list of, you know, five languages after it, I wouldn't be surprised if a few people heard that without the context and assumed that they were saying this is the one language you could use for everything, which is definitely not, I think, what we're trying to say. And also I'd be very cautious to ever use that title, because I prefer "high-level language", which completely gets away from the trueness, sort of the royalty of that word "true"; there's something a little bit more royal about, you know, TPL. And I also think, like what Marshall just said, what qualifies as programming, you know, computation, or whatever the exact words Rich used, versus sort of the stuff that is not describing the algorithms or building the program? I'm sure a lot of people would say that automatic memory management is detail that you want to suppress, but if you're working in an application where you can't have little hiccups of whatever microseconds or milliseconds because the GC's got to go and do something, the garbage collector's got to go and do something, I would argue that for that application that actually is part of the problem. Part of the problem is making sure that you don't have these, you know, frame skips. If you're doing, like, video game programming, you can't have some GC in the background that every once in a while has to do some mark-and-sweep or whatever method they're using. And for a lot of applications you don't need to care about that, and so it's not part of the problem you were solving, but for many 
applications I think avoiding GC and doing all the memory management manually is something that you need. Anyway, I just think that there are some different categories of programs that need different languages, and for a large set of them, where you don't need to care about the things that array languages suppress, that detail, they're the perfect candidate. But you're never going to go and program, you know, a AAA video game in Dyalog APL or something like that, because it's just not going to be a good fit, right? At least that's my hot take in response. I do agree that everyone should go learn an array language though, or a TPL in Rich's words.

00:10:05 [ML]

So there's a nice quote that I think summarizes that, or that gets to what you're saying about different languages for different use cases. And I don't remember who said this; actually, it may have been Alan Perlis. The quote is: "a programming language is low level when it requires attention to the irrelevant". [02] So that's a nice, succinct description. I think it actually captures the difference a lot better than most descriptions of this. But the thing is, what's irrelevant is context-dependent. So it may be that performance is relevant. It may be that where in memory you place your values is relevant. It may be that the layout they're stored in is relevant, because you're going to interface with some other system that's going to just read them off the disk, and so on. So there are languages that are high level or low level in different ways. And depending on what's relevant in your particular use case, something may be too high level or it may be too low level.

00:11:16 [BT]

I tend to think of them as paradigms, and that sort of separates them out. So you've got things like Smalltalk, which is object-oriented, and that's a different paradigm than the array programming languages, or LISP, which is list processing. [03] And it's different again in the paradigm it's using. And then the procedural languages like C, those are a different paradigm. And I think, as you guys have both pointed out, different paradigms fit different uses really well. But you use the right paradigm for what you're trying to do. And as Conor says, don't ignore an array language, but don't think that that's the only paradigm that would ever apply. I think quite often people who are array programmers are criticized for trying to wedge too much into an array, and you can do remarkable things with them. Similarly, you can do remarkable things with procedural languages. But sometimes if you get the right paradigm, the tool fits much better. And when you stay in your lane, you're more likely to get a lot further down the road.

00:12:15 [ST]

I doubt that the distinction we're looking for between one kind of language and another can be drawn in black and white, as hard and fast as we'd like it to be. But I'm drawn very much to what Henry was saying, this metaphor of bricks and architecture, and the idea that in the array languages the primitives focus on the large structures and treat the atoms, the scalars, the bricks, as a special case. So you can get lots of stuff done in an array language and you can think at the high level. And occasionally you're going to need to deal with atoms; you're going to have to put a particular brick in a particular place, but you think of those as special cases. That seems to me to chime with the difference from working in a scalar language, where I'm starting with bricks all the time.

00:13:13 [AB]

I think it's kind of funny to notice; I just looked at the language bar for APL, and there's exactly one primitive that is entirely a scalar function, in the sense that it can only take a scalar as argument and nothing else, which is deal, the question mark. And that's only for APL; for J that's not true.

00:13:35 [ST]

A very common newbie error with new q programmers is to start writing explicit iterations, even using the iteration operators, when actually the primitives will just do it.

00:13:47 [ML]

Well, yeah, times-each is a common example, which is of course equivalent to plain times almost always.

00:13:56 [AB]

Another one I see is an open brace, alpha, then some function, omega, close brace, and then each. The way of thinking there is that this each loop has to have some kind of scope structure; as if you can't apply each directly to a primitive, you have to wrap the primitive in a container that you can then use for looping.
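
A quick illustration for readers: the anti-pattern Adám describes, and the times-each Marshall mentioned earlier, look like this in APL (an editorial sketch, not from the episode; scalar functions such as × are pervasive, so the explicit each is redundant):

```apl
⍝ Redundant: wrapping a scalar primitive in a dfn just to loop with each
2 3 4 {⍺×⍵}¨ 5 6 7    ⍝ 10 18 28
⍝ ×¨ (times-each) is almost always just plain ×
2 3 4 ×¨ 5 6 7        ⍝ 10 18 28
⍝ Idiomatic: × already maps over both arrays
2 3 4 × 5 6 7         ⍝ 10 18 28
```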

00:14:21 [ML]

Well, and that is another thing that APL gets you: sometimes your primitives are not arrays but functions, and you work with the functions as the basic building blocks and say, well, I'm just going to take these functions and put together my program. In a way this is not as solid as the way that an array programmer works with arrays, but you're not thinking so much about the arguments, just thinking, well, I'm going to put this function together with that function.

00:14:49 [BT]

To bring it back to the architect analogy again, because I have friends who are architects: they absolutely are concerned with ducts and structure and the way a building's going to go together, but as opposed to someone who's just doing construction, who's only concerned with that, the architect is also thinking at a higher level, about the way the whole design looks. So I guess if you were to extend that analogy, a C programmer is working with that construction, those bits and pieces moving into the right spot. And an array programming language person is very much concerned with that, although they might not get into those details, because somebody else is taking care of that for them. They're thinking at a higher level, but I think there is a benefit to knowing the procedural part too. As we were talking about performance, the thing that struck me was there's so much I don't know about what's going on under the covers that, if I did know, I would be better at this. And that's like an architect who's got grand designs but doesn't know what ducts or materials can go in and what those materials can do. The best people are aware of all those things, and then they master their subject.

00:15:57 [ML]

So something I'm wondering about that is: what does the architect specify? Much like an array programmer, they can't actually say what brick goes where, right? I mean, they're making an architectural design that just doesn't deal at that level. So they might think about it, but not...

00:16:15 [BT]

They're bound by the laws of physics, though. So they may not think...

00:16:18 [AB]

No, no, no, no. I want to protest that. Okay. No. I mean, architects are different, and that's kind of the question; I don't know whether what I said is right or not. I think there are different expectations in different places in the world and different levels of architects. But there definitely are architects that are not bound by the laws of physics, and you need engineers to go and modify the design for it to actually be possible to build. Because they might make some amazing design where the entire building rests on the tip of a pin, and then the engineers come and say: can't build that, or that's not safe in case of earthquake, or some other things. You have it with these concept vehicles that car manufacturers make all the time, that often cannot drive and could not drive, because physics doesn't allow that kind of thing or we don't have the technology for it. But I think it's important there to notice what the thought processes are. They are thinking at a super high level: I want the fenders to look like this, I want the car shape to look like this, I want the building to give this kind of impression. These are the things I want to achieve. And the performance might be hurt by it, or not. It may or may not be possible, but that depends on the people who are dealing with the lower levels.

00:17:34 [BT]

I take your point, but I would put those people in a different category. They may be architects, but if that's all they do, they're prototypers. So they're not expected to build something that necessarily can be actualized, you know, where it exists and people could live in it. But what they are expected to do is push the boundaries, so people can respond to that and find other things that can be done with it. But I think a working architect isn't going to be working long if all they do is design buildings that engineers have to come in and fix, and the design changes.

00:18:09 [AB]

Well, I mean, I've heard APL described as a prototyping language. And it gives you these concepts in the form of symbols, and you string those together to express what it is you have in mind. And then it happens to be executable in modern-day APL.

00:18:32 [ML]

Yeah, so I mean, the nature of APL is that if you can write it in APL, it must be able to run. Maybe it will go into an infinite loop. But even in that case, that's probably because you used an imperative construct in the wrong way.

00:18:45 [ST]

Well, I'm going to get on some thin ice here, start a rabbit running, and mix some metaphors. A K programmer I know was complaining to me recently about how lazy APL programmers are. What he meant by this, to the extent that I understood him, was that whereas regular industrial programmers, when they need a new technique, a new technology, will dive right into it and figure out how to do it, APL programmers wait for it to appear as a quad function, and have it all done for them and served up to them in APL. I suppose a case in point, some years ago, would be when XML was popular. I'd hear a lot at APL conferences about stuff being done with XML. Everybody was waiting for the quad XML primitive, which would mean that they didn't have to learn all this. And I thought about my friend's comment and I thought: damn right I'm lazy.

00:21:23 [ML]

At the same time, I feel like one of the really great things about the J community in particular is that there's a real emphasis on being able to do stuff yourself, in whatever way you come up with. On the forums, [04] you always see somebody ask, how do I do this? And somebody else will reply, well, here's the way I just dreamed up. And maybe it's not great code or whatever, but J programmers always seem willing to dive into that. So maybe it's possible that you can have a language that does a lot for you.

00:21:58 [ST]

When you say dive into it, do you mean to solve something from scratch in J? Or do you mean to plug some C# into it or something?

00:22:05 [ML]

No, no, do it in J. I mean, unless the question is how to connect J to C# also. I don't know if you can do that. That sounds tough.

00:22:17 [ST]

Now my friend was complaining about the people who won't go and do it in C or C# or whatever is needed.

00:22:24 [ML]

Oh yeah? Well, the J mentality is that only J is needed, and they're right.

00:22:31 [BT]

The J delusion. I share it, I guess sometimes.

00:22:36 [CH]

Well, so this brings up a thought I've had in the back of my head while this conversation's been going on. I can't remember if it was Bob or Stephen talking about, you know, the high-levelness or the trueness or whatever, but I had the thought that at the end of the day one of these low-level languages is always there. And this isn't even really my belief; I just work with enough people, and technically, you know, do C++, to know that there are people that would make this argument: that everything at the end of the day involves one of those languages. There's this talk I've seen once that talks about Smalltalk and Java and all these VM languages, and then shows a chart of what language the VM that each language runs on is written in, and all of them are written in C++. And a further point to extend that: what's Dyalog APL written in? C. What's J written in? C. I think CBQN is written in C, but the VM is written in BQN.

00:23:42 [ML]

That's what the C is for. Oh, yes, also: the compiler and the self-hosted runtime, which is slow, so we're gradually replacing it, are written in BQN. And then for SIMD stuff, and we're starting to use it for some generic things as well, we use a language called Singeli, which is written in BQN and compiles to C.

00:24:06 [CH]

Interesting. Okay, so BQN gets the closest. But the point being here is not that it's good or it's bad, but just that, if I really want to play devil's advocate on behalf of the low-level systems programmers: if these array languages are the true programming languages, like the high-level architects who know the nitty-gritty but also know the high-level detail, you could flip that and say, well, there's no APL or these array languages without C or C++; one of these low-level languages almost always exists at the end of the day. Yes, there are some self-hosted languages and stuff like that, but in a lot of these popular libraries, like in Python, at the end of the day you end up executing some C library or C++ library. The point being that these languages are ubiquitous in how they're used; they're usually part of the foundation that these things are built on top of. And you can't say that about APL. Maybe for J, because I don't know as much about it; they seem to think that all you need is J, but even J is written in C, right? So that kind of pokes a hole in that argument. It's like, "Well, if you only need J, why isn't J written in J?"

00:25:13 [ML]

You need one C program and that's the J interpreter.

00:25:19 [CH]

But I have to note that every language family has a way to write stuff not in C. So in J, there are a few functions implemented in J, and in Dyalog, there are quite a number that are implemented in APL; they're using magic functions as, yeah, magic functions is the name for that. BQN we've been over, and for k, well, I know ngn/k uses what are called k-strings to implement things in k. Q is pretty much entirely a layer over k, implemented in k. So there's been quite a bit of work on getting away from C and being able to do things in array languages, because it's faster. You're not fully getting away from C; the argument here is that at the end of the day there's some kernel of truth, which some people could argue makes it the quote-unquote true programming language. Stephen, you want to hop in?

00:26:21 [ST]

Yeah, this is time for one of my favorite quotes. This is Arthur Whitney addressing the 40th anniversary celebrations of the British APL Association at the Royal Society in London in 2004. He'd just been talking about how k has a code volume about two orders of magnitude smaller than equivalent C programs. And he went on to say: it is theoretically impossible for a k program to outperform a C program, because for every k program there is an equivalent C program that runs exactly as fast. Nonetheless, k programs routinely outperform hand-coded C. And the reason they do that is that it's a lot easier to find your error in four lines of k than in 400 lines of C.

00:27:14 [CH]

Yeah, I definitely think there's some truth to that.

00:27:17 [BT]

I think I'm going to start at the other end of the whole spectrum and create a machine language podcast, because everything exists on that. And you think, well, that's silly, but C does exist on machine language, on machine code, and it's bits and bytes when you really get down to it, ones and zeros; but you couldn't ever program in that and keep things straight.

00:27:50 [CH]

I agree that in general that's true, but there are stories from back in the day before these high-level languages. There's some famous story of Paul Allen on a plane to some demo when Microsoft was just getting off the ground and Bill Gates had to stay behind; I don't know, I think they were in the MIT area or Harvard area. Something was wrong, and he hand-coded this bootstrap launch program to get the demo working. He hand-coded ones and zeros. Don't get me wrong, this guy's clearly gifted; I'm not saying otherwise, but just to say that it's not possible? At a certain point in time, there were people that knew exactly what the opcodes were for each of the assembly instructions, and if it was a short enough program, or even if it wasn't, if they knew the structure of the equivalent of for loops and ifs and whatnot, they could slowly go through and basically use their brain as a compiler and write out ones and zeros. When I read that, my reaction was, what the F, that is absolutely nuts, 'cause I can't even write hand assembly, let alone the opcodes that correspond to it, until you just get one long binary string. But I think if that was the world we lived in, and we didn't have a decimal numeric system but a binary system, and that's just how we did numbers, then we could be living in a parallel world where that is actually how we wrote code. Is that the world I want to live in? No, but it's possible.
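
As a concrete aside for readers (an editorial sketch, not from the episode): an opcode really is just a number, and the hand-assembler's job was knowing those numbers. For example, x86's one-byte ret instruction is hex C3, decimal 195, and APL's encode and decode primitives can spell out the bits that would have been keyed in:

```apl
(8⍴2)⊤195          ⍝ 1 1 0 0 0 0 1 1, the bit pattern of x86 ret (hex C3)
2⊥1 1 0 0 0 0 1 1  ⍝ 195, decoding the bits back to the opcode
```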

00:29:20 [BT]

Not entire programs that way. I can see going in and doing surgery on it, fixing a specific spot because you know the whole system that well. But I can't see anybody actually constructing a full application with ones and zeros. And if you're out there, my hat's off to you.

00:29:37 [CH]

But you could say the same thing about, like, could you imagine if someone built a whole program in C, writing every single for loop themselves and every single branch themselves? That same argument, from a high-level language like APL down to C, I think is the same argument as C to assembly, and the same argument as assembly to ones and zeros. We're just so far removed as an intellectual community: the lowest people typically go is C with some inline assembly, and the highest we go is APL. But if it was back in the '70s or '60s, before, you know, LISP and these things took off, I think our takes in this conversation would be completely different, if people were like, oh well, you know, Joe the other day did write a four-page ones-and-zeros program. And yeah, we didn't recommend it 'cause he had a couple mistakes, but I think it would be a tool in our tool belt that we would have to have. We're just half a century removed from that now, so it seems like, wow, that's terrible. Sorry, Adám, you've been trying to hop in.

00:30:40 [AB]

No, I'm not hopping in, just adding to this. My father worked at IBM from, I think it was '68 to '72. And he told me about how they could often work like that. The computer they were running grinds to a halt because there was some kind of issue. So they walk over to it and they look at the little indicator lamps that are showing what the current bits in its working memory are set to. And they notice that some of the bits are not the values they should have. So they flick the little switches to change them from one to zero, zero to one, and then they press go and it continues. And then at home, he didn't have the fancy hole puncher where you would type like a typewriter and it would punch all the seven holes for that byte, that character; you punch one hole at a time. So everybody knew the binary alphabet by heart, and you make the holes so that you spell out what you want to spell out. You write your things like that, and you bring them to the office the next day, and you run them. So they would definitely do that. And APL\360, which, sure, doesn't have the scope of a modern-day TPL implementation, was written entirely in System/360 assembler, which is probably even a primitive assembler by modern standards.

00:31:56 [ML]

Yeah, I mean, writing large programs in assembly has been common for a long time, up until the '90s probably. The difference between assembly and machine code is there's just no point going to machine code, because it says the exact same things, but it's using these numbers that you have to memorize instead of the names that you're used to, and it's all packed together, is the other thing. But if you think about it, all the letters that we're used to, they're all just these lines and circles and stuff, and you've spent your whole life learning to turn those into meaning. But if you were encountering an alphabet you weren't familiar with, that takes a long time to learn. So a lot of it is that we've specialized towards these particular lines and squiggles. But if we focused on dots instead, maybe it's fundamentally harder, but we would still be pretty good at reading those.

00:32:52 [AB]

Hold on. People who read braille are essentially reading binary code, right? Yeah. And they don't seem to be suffering a whole lot from that; it's not holding them back. We should do this, like Conor suggested. This decimal system, it's so primitive. We can count to 10 on our two hands, most people at least. But if we counted in binary, you could count to 1023 instead on two hands.
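
Adám's arithmetic checks out: ten fingers treated as ten bits give 2^10 - 1 = 1023 as the largest count. A quick check in APL (an editorial sketch, not from the episode):

```apl
¯1+2*10    ⍝ 1023, the largest value on ten binary fingers
2⊥10⍴1     ⍝ 1023, all ten fingers raised, read as bits
```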

00:33:18 [CH]

I think we're on the edge of the philosophical discussion that I constantly have by myself in my own head: what is readability? And I think array language programmers, you know, have probably all at some point dealt with this question, because it's one of the number one things when you're dealing with other languages: oh my goodness, it's unreadable. And, I mean, probably on this podcast I've mentioned the Russian poetry quote: is Russian poetry unreadable just because you don't speak Russian? Same thing with Chinese. There's a lot of things that we take for granted. It's very hard to recognize bias until you are exposed to a different point of view, and even if you're exposed to that point of view, the best way to really understand it is if you see the utility instantly. I think for a lot of people, if you try to explain to them, oh, APL, it's this high-level thing, it's beautiful, whatever, it's not immediate to them how, oh wow, I can see the mnemonics and the expressiveness of this language and this vocabulary. It's very hard to communicate that instantly. Usually there's a little bit of a journey: they have to go watch a couple talks, they have to play around with it for themselves. So the point being, it's very hard to convince people, and it's the same thing with why we live in a decimal system. I've heard anecdotally that it's because we have 10 fingers, and at some point that's why we went with 10. But is it really the best numeric system? There's a whole community, I can't remember who was talking about it on this podcast, the base-six community, that thinks everything should be base six instead of base 10. And also, if computers are programmed in binary, why don't we just count in binary? It's like, we've got two arms. 
How come we went to the digits instead of just the arms, and we didn't have a binary system? Anyways, there are so many things that I think we take for granted, and it's just the way society is, but the way the world evolved could have been incredibly different. And yeah, like I said, this is a very philosophical, hand-wavy, not very technical topic; typically that's why I only talk about it with myself in my head when I'm running. But at some point I want there to be a talk: what is readability? Just a philosophical talk, trying to convince people to keep their mind open the next time they see the J incunabulum, [05] which even I, I think, have said was impenetrable the first time I saw it. And, as Bob pointed out, this is coming from people on an array language podcast; we should take a step back before we start calling things unreadable. It's a great point, right? Even we are subject to calling things, ah, how could anyone ever read that? And of all people, we should be the ones never doing that. Anyways, monologue over.

00:36:08 [BT]

There was an article I read yesterday. My wife actually sent it to me and I think it was Scientific American about a new number system that was developed in Alaska by the Inuit. And essentially it's like a visual representation of what we think of as our 10 fingers so that not only does...

00:36:26 [AB]

Hold on. But it's base 20.

00:36:28 [BT]

Yeah, it's base 20, but also it's based on your whole body, and then on top of that they've done the visual representation. So there are actually certain operations that are much easier to do, because the visual representation changes: some subtractions and some multiplications can be done consistently, because the visual representation actually guides you towards the answer. Which, I mean, I looked at it. I certainly haven't got my mind around it, but I looked at it and thought, wow, this is something that's really amazing, and it's completely different from the system that I've grown up on, which is decimal and a little bit of binary, because, you know, I'm into computers. But this is different again. And I often look at something like that and go: I've got to come back and take another look at this, this is fascinating. But I think a lot of people who are more practical than I am just look at it and go: sure, fine, and move on with counting by 10 and, you know, exchanging dollars.

00:37:28 [ML]

I kind of figure, for the decimal system to have been adopted, there must be some benefit in having all the digits look so wildly different from each other.

00:37:37 [AB]

Well, hold on. Hold on. The decimal system has nothing to do with the digits you use to write them.

00:37:40 [ML]

Well, the whole... OK, the decimal and Arabic numeral writing system as we have it now.

00:37:44 [ST]

Eclipse, shall we say.

00:37:46 [ML]

Maybe it was just that those were the shapes of the digits, and it became popular because that was the only decimal system there was. But I mean, it kind of seems like there must have been...

00:37:59 [AB]

That's not the case, that's for sure. Not the case.

00:38:01 [ML]

OK, so my point stands, I think: there must be some benefit to having these crazy digits. And I mean, I do think it would be easier if there were some pattern to them. Like, why is 3 round and 4 spiky? I don't think there's any good reason. You might think 3 is prime and 4 is not, but then 7 is kind of spiky and 8 is round again. So it seems like they're just arranged nonsensically, but I think there must be some benefit to how it's done. It's not obvious.

00:38:38 [CH]

I think there's a book, actually. Ben Dean is a colleague and friend in the C++ community, and it's like the history of notation, and I think it covers this. I have not read this book, which is unfortunate, because if someone in this group had read it, I feel like this would be the exact moment for "I've actually read this book and there's a history behind it." But I have not read the book. I don't think Ben listens to this, but if he is listening, he might be thinking in his head: there is a history behind this, and it's documented; we just don't know what it is.

00:39:08 [ST]

Or, whatever happened to variable-base arithmetic? Like the Romans used. [06]

00:39:12 [AB]

Variable base arithmetic?

00:39:15 [ST]

Yeah, you got your ones, you got your fives. You got your 10s, you got your 50s, you got your hundreds, you got thousands.

00:39:21 [AB]

But I don't think I would call that variable-base. The Roman numeral system is definitely base 10, no question about it. It's just a shorthand that you have a symbol for the fives and the 50s and so on.

00:39:30 [ML]

2 and 5.

00:39:35 [AB]

So you don't have to write out IIIIIII. And the same thing goes for that system Bob was mentioning: it also starts off by counting up to five by just adding vertical lines, and then once you get past five it starts going horizontally instead. I think there have been some studies that find that people can count to 8 or so at a glance. Beyond that, we start losing track of stuff, so it makes sense to stop well before that; especially if you're doing base 10, then sure, stop halfway.

00:40:11 [BT]

And Conor just put up on the screen: A History of Mathematical Notations, two volumes bound as one, by Florian Cajori.

00:40:20 [CH]

This is the book that I know Ben has on his bookshelf, and we won't spend too much time on this. But it does show, like, 1, 2, 3, 4, 5, 6, 7 (although 7 is questionable; 7 is at the top), 8, 9, and it's like the evolution of these things. And note that the one horizontal line, two horizontal lines, and three horizontal lines correspond to the Mandarin spellings; that's what they are in Chinese. Anyways, I'll go read this and then we'll report back in a couple of months. Because it is a good point that Marshall made: there's probably more reason to it than just the 10 fingers.

00:41:04 [ML]

Well, and even looking at this cover, you can see a lot of them look like they start out more regular and then become less regular and closer to the digits we know over time. So that even suggests there's some sort of gradual pressure that pushes them towards this irregularity.

00:41:21 [CH]

Yeah, you can kind of see going from 3 horizontal lines representing 3, to the 3 that is now rounded. And in the intermediate step, it just looks like they're actually connecting those 3 lines, so you have 3 horizontal lines with two lines connecting them.

00:41:37 [AB]

No, this is normal for handwriting purposes and you have scripts where the printed form is separate lines, but the handwritten form connects them. In letters as well.

00:41:48 [CH]

And actually, look at this. So, get this: in this middle one, 1 uses one line, 2 uses two lines, 3 uses three lines, 4 uses four lines, 5 uses five lines. So maybe that was a huge motivating factor behind how these look, because it's the number of lines you need to draw it, if they're all sort of the same length. So you end up with an 8 that, in this system, looks kind of weird, and then 9 is just 8 with an extra line on it, which at some point evolved to look like 9. But anyways, even looking at the cover, and we haven't read the book, we haven't done our research, you can see that there's more motivation than one would naively expect.

00:42:32 [AB]

It should be that, at least, the lower digits started off as counting-stick type things: one line for 1, two lines for 2. But that's pretty common.

00:42:43 [BT]

Welcome to the glyph cast.

00:42:44 [AB]

Yeah, this is entirely different from base 10.

00:42:48 [ST]

If I'm following this discussion, base 20 fell out of popularity with sandals.

00:42:54 [AB]

Hey, in Danish we still use base 20 when we speak and even to the point of being so extreme. [07]

00:43:02 [ML]

Yeah, French as well, right?

00:43:03 [AB]

Yeah, but the French system is one step less insane than the Danish system. In French they'll at least go in whole 20s, and so you might have 3 × 20 + 17. But in Danish they will insist on using halves of 20. So for 70 we would say "halfway between 3 and 4, times 20."

00:43:41 [ST]

For those of you who don't remember, or never knew, how to write a cheque: you'd write the number in Arabic numerals and you'd also write out the number in words. And in Denmark, back in the 70s, we used to write the numbers out in Swedish, because the Danish system is so insane.

00:44:08 [AB]

Also, bank notes used to have the text... now that I've moved back to Denmark, I notice they've switched it to abbreviated Danish, but they used to have the text in Swedish as well. Well, pseudo-Swedish: using the sane Swedish system in a Danified way. But anyway, we should get back to what we were actually talking about.

00:44:29 [ML]

Yeah, I can do that. So one thing about APL, actually, is the symbols. [08] There are some patterns to them; the main pattern is all these overstrikes. But APL symbols are pretty irregular in some ways, and one possible issue with BQN is that I've gone and made the symbols too regular. Which is something that I thought was a potential concern when I was designing it, but I just really didn't have anything better to do. You know, there's a limited set of symbols and all. So like the square brackets with the underlines: I'd have preferred not to use those, but I couldn't find a symbol that made a better fit, and the square bracket and the underlined square bracket are usually related, so I figured I'd use that relation. So it's interesting to ask: where do you want regularity versus irregularity? Definitely some of the regularities APL has are helpful, but I think the irregularities are probably pretty useful too when you're reading stuff.

00:45:33 [BT]

And for J, I mean, as much as it's criticized for its dots and colons. That is a place where you start to build a, you know... It doesn't have a glyph, but instead, quite often appending something with a dot will have a relationship back to the function that doesn't have a dot. There's actually a relationship between those, which makes it a bit easier to remember them.

00:45:57 [AB]

Is it, or is it the other way around? I have the issue of not remembering which one is which. They're not distinct enough. I know that one of them is with the dot and the other one is with a colon, for some particular ASCII character, but which one is which? There's nothing in the dot or the colon that makes them distinct in that way.

00:46:14 [BT]

Yeah, I think you're right: if it's going to have a dot or a colon, which one would be which? But I think it's pretty common that the ones that don't have a dot or a colon are most often the very primitive ones that we learned first, and then the dot and the colon change them. But you're right, how it changes them sometimes isn't quite as consistent. And then you get the things where you've got a dot or a colon following a semicolon, and, well, what does a semicolon mean? There's a whole bunch of...

00:46:44 [ML]

Too many dots at that point.

00:46:45 [BT]

Yeah, you're down the rabbit hole then. But at least on some things like you know addition or subtraction, the dot or the colon does kind of make sense. Of course there's exceptions, because you know there's always exceptions.

00:46:57 [ML]

So yeah, there's definitely that question with closely related things: do you want them to have similar symbols to show that relation, or do you want them to have different symbols so that you can't mistake them for each other? Because if you think about words, opposites very rarely have similar names, words like up and down; no one could mistake those two words. But then in APL we use the up arrow for take and the down arrow for drop, and those are confusable. I mean, learning which is which is possibly a problem. I would say most people, when they learn APL, have problems swapping two related symbols sometimes; not always the same set of symbols, it's just whatever your particular associations are.

00:47:45 [AB]

Hold on, isn't it very obvious that the down arrow is dropping? It seems like you would not confuse the up arrow for dropping.

00:47:54 [CH]

I agree with Marshall: take and drop I don't usually have an issue with, but mix and split... Even though I think in your mnemonics, you pointed out, or someone pointed out, that they use the fact that the rank reduces. But if you watch my videos when I'm programming, I always end up just trying both, and if I got it right the first time, it's just because I guessed correctly. There are certain symbols that, when I type them... monadic mix and split are probably the worst offenders for me, because split is a very common operation when you want it. [09] But I can never remember which one it is.
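For readers following along without an interpreter, a rough Python model of the two confusable pairs discussed here. Lists stand in for arrays and tuples stand in for enclosed rows; the function names follow the APL vocabulary, not any particular implementation.

```python
# Take and drop: the up arrow takes the first n items, the down arrow drops them.
def take(n, xs):
    return xs[:n]

def drop(n, xs):
    return xs[n:]

# Split lowers a matrix (rank 2) to a vector of enclosed rows (rank 1);
# mix raises it back up. Tuples play the role of enclosed arrays here.
def split(matrix):
    return [tuple(row) for row in matrix]

def mix(cells):
    return [list(c) for c in cells]

xs = [1, 2, 3, 4, 5]
print(take(2, xs))    # [1, 2]
print(drop(2, xs))    # [3, 4, 5]

m = [[1, 2], [3, 4]]
print(split(m))       # [(1, 2), (3, 4)]
print(mix(split(m)))  # [[1, 2], [3, 4]]
```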

00:48:27 [AB]

Those two particular functions are controversial as well, and the pairing of them is not the same across APL implementations either. So you identified those correctly as being problematic. But if I look at J, take something like the less-than and greater-than signs: they mean less and greater dyadically, and then less-than dot is the minimum, and less-than colon is less-or-equal.

00:48:56 [BT]

Lesser-of and greater-of, yeah, yeah.

00:48:58 [AB]

Yeah, sure. I understand the association between them; that makes perfect sense to me. But normally I write less-than-or-equal as a less-than with an underscore. So I might think that a single dot stands in for the underscore; but no, the colon is not exactly that, either.

00:49:19 [ML]

But I do think less-than and greater-than for max and min is pretty good, because you're reusing an existing distinction that the user has had to learn already.

00:49:27 [AB]

Oh yeah, the pairing makes perfect sense, but which one exactly? Is it the dot or the colon or?

00:49:34 [CH]

I've always remembered it, and hopefully I have this right, that the one with the colon is the "or equal to" one, because the colon's two dots are like the start of an equals sign.

00:49:43 [AB]

Yeah, but I don't use a programming language where you write less-than-or-equal as less-than, equals. I use a programming language where it's the less-than with the underscore, as in mathematics.

00:49:53 [BT]

And to bring in Marshall's point: some languages, you know, always distinguish up from down; they look very different. But on my screen right now I'm seeing a button that says mute, and if I hit that button I'm going to see unmute. So there is sort of a balance between those things. We have that; the "un" is a dot.

00:50:15 [AB]

And I have another application where, if the button says mute, that means you're currently muted; if you click it, it will go back to showing your status as unmuted. So that's another issue of symbolism: is it a symbol of the state you have, that you want to do something about? Is it symbolism for the process of changing things? Or is it symbolism for the state you want to get to? Marshall was talking about this as well, with the left argument to dyadic transpose; we had a whole discussion about this. [10]

00:50:49 [ML]

Oh yeah, that's tricky.

00:50:52 [AB]

That's right. So it's reordering the axes, but is it telling you where you want the axes to go to, or where you want the axes to come from?

00:51:04 [BT]

Well, I think right now we're a whole bunch of architects talking about what brick we like to use and why we use a certain brick, and the thing is that we do think about these things at this higher level. One of the points that Stephen made, which I think is really powerful, is that when you're at a higher level, like an array language, and you don't need to know the lower level, it allows you to know the lower level of your domain of expertise. So in essence, if there's a fit at all between the two levels, they can actually correspond to each other. You don't need to know all the details of the computer side of things, but the valuable thing that you do know is the details of your expertise. You know how DNA molecules are put together, what things can happen, and as a result you can mirror that back up with a computer that can do the same thing. But if you didn't know the bricks and mortar of what you were trying to do, it would be a disaster. So you do have to know the bricks and mortar of whatever area you're working in. But all of the higher-level languages, which I think is probably a better term, or all the different paradigms: if you fit them to the right substructure, you're going to get a lot of power that way.

00:52:20 [ML]

Well, I think this is a way in which APL is lower level. Because if you're a domain expert, you're thinking about the bricks and mortar of your domain, but when you program in APL, you are writing with arrays, which are the bricks and mortar of array programming. In this way, you know, APL is very consistent about it: it picks a level, the array level, and it sticks with it. Which is useful in a lot of ways. And this can be a good or a bad thing, but it doesn't push you towards making abstractions that fit your particular domain, at least not in a way that would hide the array level. I mean, you can definitely make functions that work with your things, and arrange things so that they make sense from this other perspective of, you know, chemistry or whatever. But you have to think at the level of the array. That's what APL says.

00:53:26 [CH]

Which is very useful. That's why I think everyone should go and learn it, because one, it's just great, but two, you do get to the edges of what you can do with arrays, and then you end up learning about the group primitives, or key in J, and stuff like that, and how every array language has a slightly different solution for that; some are more similar than others, you know, between APL and J. And I think there was that blog post, was it Chris Pearson, that talked about the fact that group was being removed from K9... [11] The point being, I think there's a lot of agreement on what plus does, you know, between a scalar and a rank-one array, or a matrix.

00:54:13 [CH]

But when you get to the edges of what array languages aren't amazing at yet, then you see a bunch of interesting language design choices: how does key work in J, how does key work in APL, and, I think, how does group work in BQN. And I think that's the really interesting stuff, because there are different solutions in other languages. It's like, oh well, I just have some hash-map kind of function in Smalltalk or Python; I'll just use the Counter collection in Python and you're good to go. But how you solve that in array land is completely different, and it's interesting to see the decisions that the language designers made. Whether they were the best ones or not is sort of not the point; it's just, oh, how do you solve this problem in this domain? Because I would never have had to think about it that way if I had stayed in Python.
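As a point of comparison, here is the Python route Conor mentions next to a sketch of what an array-style key/group gives you: the indices at which each distinct value occurs. `group_indices` is an illustrative helper, not any language's actual primitive.

```python
from collections import Counter

data = ["a", "b", "a", "c", "b", "a"]

# The Python route: a ready-made hash-map of counts.
print(Counter(data))  # Counter({'a': 3, 'b': 2, 'c': 1})

# An array-style key/group: for each distinct value, the indices where it occurs.
def group_indices(xs):
    groups = {}
    for i, x in enumerate(xs):
        groups.setdefault(x, []).append(i)
    return groups

groups = group_indices(data)
print(groups)  # {'a': [0, 2, 5], 'b': [1, 4], 'c': [3]}

# Counts fall out as the lengths of the groups.
print({k: len(v) for k, v in groups.items()})  # {'a': 3, 'b': 2, 'c': 1}
```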

00:54:57 [BT]

I think one of the advantages that the array paradigm has is that it opens up easily to a lot of mathematical ideas. And so you gain the power of a lot of things in mathematics, because an array is a regular thing that can express and change things in a regular, mathematical way. You're piggybacking on the things that have been learned in mathematics, and on the consistency. Whereas, say, in an object-oriented language like Smalltalk, you can do mathematics, absolutely, but I don't think there's quite as strong a bond between an object and mathematics. I suppose, and I'm thinking a lot as I'm saying this, in some areas, like category theory, maybe there is a stronger bond back to object-oriented programming. But for most of the standard things that you end up doing in day-to-day life, arrays really fit the mathematics of what you're trying to do quite well.

00:56:01 [CH]

And that's a wrap.

00:56:03 [ML]

I didn't hear you once say past the hour mark. Have we ever recorded an under an hour episode?

00:56:09 [CH]

This is true. I'm looking at my recording. It says 58:30 I'm like, what's going on?

00:56:14 [BT]

I don't think we've ever recorded an under-an-hour episode. I definitely have produced them.

00:56:21 [ST]

We must be getting good at it.

00:56:23 [AB]

And this one will be even shorter, I suppose, because we'll cut out some of our long breaks where we don't know what to say.

00:56:31 [CH]

It's not that I don't know what to say. It's that, with this kind of conversation, half the time there are just birds flying around in my head going, what is the truth of it all?

00:56:42 [ML]

It's a common thing for a bird to say.

00:56:45 [CH]

My birds talk, Marshall. They all correspond to the combinator birds. You know, there's a black bird, there's a blue bird and they're all just chirping with each other.

00:56:54 [ST]

I can't wait to see what Bob comes up with as the title.

00:56:58 [AB]

But we need to say happy array programming; we didn't round it off.

00:57:01 [BT]

You said it's a wrap. We're still recording.

00:57:02 [CH]

This is all part of the episode.

00:57:03 [ML]

That's the cold open.

00:57:05 [BT]

I guess at this point I get to say: if you'd like to talk to us, or respond to what we've done, you can. We started this all off with the response we got from Henry, and thank you so much, Henry, for that response, because it filled up an episode with some really interesting ideas. You can contact us at contact@arraycast.com [12] and we welcome all your thoughts about this. And maybe you never want us to do an episode like this again, but everything's fair.

00:57:37 [CH]

Well, I'm interested to hear if listeners have thoughts on the comparison between these, quote unquote, TPLs, which makes me uncomfortable even saying the acronym, versus other paradigms and languages at different levels.

00:57:54 [AB]

Maybe we can call them concept programming languages. That's how it feels to me, at least: that I have symbols.

00:57:59 [ML]

I was just going to go with SCTPL - So-Called True Programming Languages.

00:58:07 [AB]

Yeah. The way I feel when I use these kinds of languages is that I have a symbol that is mapped in my mind to a concept. It doesn't have a name; many of the names for the primitives I find either silly or obscure anyway, but I don't think about it like that. I don't mouth out my programming when I write APL. I just think in terms of the symbol and the concept in my mind, and I'm putting these concepts together to represent the idea that I have, and then I can run it.

00:58:42 [ML]

Well, the name for one of those things that fits best for me is actually primitive, although I'm not sure we can say we're proud users of Primitive Programming Languages.

00:58:56 [AB]

Primitive Programming Languages are the most advanced programming languages.

00:59:00 [CH]

So wait, we've got TPL, true programming languages; SCTPL, so-called true programming languages; Adám thinks of them as concept programming languages; Marshall, not advocating for it, but just threw out there, primitive programming languages, which has the nice acronym PPL. In my head I actually think about them as algorithm-and-combinator programming languages, although typically I refer to them as array-combinator programming languages. But I think the primitives break down into algorithmic ones and combinator ones. Sure, the algorithmic...

00:59:38 [AB]

But you can remove the combinator ones, and you'll still have one of these languages, right?

00:59:44 [ML]

Well, so to me you've got three classes: there are the number primitives, the array primitives, and the function primitives, though K doesn't really have function primitives. So it depends on how wide you want to draw your circle.

00:59:58 [CH]

Say that again. So we've totally messed up the end of this episode, we apologize to the listener.

01:00:01 [AB]

The number primitives are the mathematical ones, right? That's what you mean, Marshall.

01:00:04 [ML]

Well, depending on what you think math is, but all the arithmetic: the scalar functions, or, I call them arithmetic functions in BQN.

01:00:07 [AB]

Like plus and minus and those kind of things, yeah?

01:00:12 [ML]

So, things that work at the level of a number, like modulus and addition. And then you have array primitives, like reverse and catenate and replicate, things like that, reshape, yeah. And then you have the function primitives; I mean, these are all operators: they work on functions and give you new functions, so swap and compose and so on.
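A loose Python sketch of Marshall's three classes; the names are illustrative stand-ins, not any language's actual primitives.

```python
# 1. Number primitives: work at the level of a single number.
def add(x, y): return x + y
def mod(x, y): return x % y

# 2. Array primitives: work on whole arrays.
def reverse(xs): return xs[::-1]
def catenate(xs, ys): return xs + ys

# 3. Function primitives (the combinators): take functions, return new functions.
def swap(f):            # flip the two arguments
    return lambda x, y: f(y, x)

def compose(f, g):      # apply g, then f
    return lambda x: f(g(x))

def sub(x, y): return x - y
print(swap(sub)(2, 10))                     # 8: arguments flipped
print(compose(reverse, sorted)([3, 1, 2]))  # [3, 2, 1]: sort, then reverse
```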

01:00:43 [AB]

But even the fold would be that right?

01:00:46 [ML]

They're kind of crossing the bridge. Bridging between function and array primitives.

01:00:53 [CH]

So yeah, in in my mind.

01:00:53 [ML]

I would probably call them array primitives.

01:00:55 [CH]

And scan and fold and reduce are in there too. The number and the array ones go under the algorithmic umbrella, and even scan and reduce, insert, whatever you want to call them, those are under the algorithmic ones. And then anything you'd call a function primitive is kind of a function manipulator. Technically both scan and, you know, atop or compose or whatever you want to call it, take functions as arguments and return functions as results. But one of them creates, in my mind, new types of operations: a scan and a fold, those are new types of operations. Whereas compose is just manipulating the order in which you apply these functions; they're function manipulators. But anyways, now I feel like we're in a separate episode.

01:01:48 [ML]

Have we passed the hour mark yet?

01:01:51 [CH]

We have passed the hour mark.

01:01:52 [ML]

All right we made it.

01:01:54 [AB]

Good luck, Bob.

01:01:57 [CH]

Stephen, Bob, do you want to weigh in with any extra acronyms we should throw out? So, I don't know what mine was...

01:02:03 [ML]

ACPL I think.

01:02:06 [CH]

ACPL: algorithmic combinator programming languages. ACPL.

01:02:11 [AB]

Why "combinator", and not just another type of algorithm as well?

01:02:15 [CH]

They are. But I think of them as fundamentally different and worth explicitly delineating, because I think combinators [13] are very, very important, and there are very few languages that have them, let alone recognize them as a part of their paradigm. You only ever hear about array programming languages; you don't hear about array-combinator programming languages. And the combinators, I think, are so, so, so important: your ability to build up terse expressions that are point-free depends on the combinators; you can't get that without the combinators. And there are very few languages... Haskell's one of them, but Haskell doesn't even have as strong built-in support. You can do anything you want in Haskell, but I think the array languages are the best paradigm in the world for combinator programming. So in my heart of hearts I want to point that out. I think Marshall was going to say something.

01:03:11 [ML]

Well, I mean since K is the only one that doesn't have combinators, you must say that K is the only APL there is.

01:03:18 [CH]

Well, K does have... it has very few. It has a couple of them, right? It's got that...

01:03:22 [ML]

Ohh, you're still going to say it's an ACPL and not an APL.

01:03:25 [CH]

It's got the famous example, well, famous because we talk about it every 12 episodes, in Stephen's blog [14], where he does the application of a two-element array to a binary function, which kind of maps to the B1 combinator. And actually, I saw, was it Phineas on Twitter? He replied to some K code or q code I had written and did a bunch of point-free stuff where they didn't mention any arguments, and I was like, what is this? I didn't know you could do this. So I think K and q do have some support for a subset of what is possible in the other languages. Stephen, you go ahead.

01:04:04 [ST]

Yeah, I would give you another class of languages, and those are the ones with terse folds in. We can call those the WTF languages.

01:04:13 [CH]

Terse folds? So does that mean just a slash?

01:04:16 [AB]

War if colon, colon.

01:04:17 [ML]

Presumably, if you used a different character it would... well, I mean, BQN's got the ticks.

01:04:21 [AB]

They're just superscript slashes, right?

01:04:23 [ML]

Well, yeah, but they're not slashes, they're not slash characters.

01:04:27 [AB]

Not if you ask Unicode, but who does?

01:04:30 [CH]

All right. Any last acronym, Bob, that you want to.

01:04:33 [BT]

Naming is hard. I'm not going to name anything.

01:04:38 [ML]

Well, this episode will just be published on a page with no title.

01:04:44 [BT]

So yeah, throw it back to me; I get to name it.

01:04:46 [ML]

The whole title: WTFPL. All right.

01:04:49 [ST]

I think you just did. Naming is hard.

01:04:52 [AB]

But what's the W for? "With."

01:04:57 [CH]

All right. I think with that, seeing as we're now solidly past the hour mark (whether we'll get cut down to shorter than an hour, only Bob knows), we will say happy array programming.

01:05:09 [ALL]

Happy Array Programming!