Transcript

Transcript prepared by Bob Therriault and Adám Brudzewsky
[ ] reference numbers refer to Show Notes

00:00:00 [Adám Brudzewsky]

Maybe Microsoft paint is not the right tool for this?

00:00:03 [Conor Hoekstra]

I will not, I will not entertain that.

00:00:06 [AB]

Real programmers use notepad and real presenters use Microsoft Paint.

00:00:10 [MUSIC]

00:00:20 [CH]

Welcome to another episode of Array Cast. I'm your host Conor, and today we have with us four panelists. We'll quickly go around and do brief introductions and then hop into a couple announcements. So we'll start with Bob, then go to Steve and then go to Marshall and then go to Adám.

00:00:36 [Bob Therriault]

I'm Bob Therriault. I am a J enthusiast. I'm not a professional. I'm working on the J Wiki and that is keeping me really, really, really busy.

00:00:45 [ST]

I'm Stephen Taylor. I'm a Q and APL programmer, and I also run something called Iverson College.

00:00:53 [ML]

I'm Marshall Lochbaum. I'm a former J programmer, then I worked at Dyalog, and now I make BQN.

00:00:59 [AB]

I'm Adám Brudzewsky, full time at Dyalog, doing APL programming and teaching.

00:01:06 [CH]

And as mentioned before, my name is Conor. I'm a research scientist slash C++ developer at NVIDIA and in my free time I'm a huge array language enthusiast. So I think for our announcements we're going to start with one announcement from Stephen, then we'll go to three from Adám, and then I've got kind of a postfix update on the meetups, and then we'll have another postfix thing by Bob, and then we'll hop into today's conversation.

00:01:29 [ST]

OK. On Sunday the 25th of September, from 2:00 to 4:00 PM London time, Iverson College is holding an online workshop session on getting into vector programming [01]. This is for q programmers who've made the transition from some other language to q and are looking for: how do you do things in a vector way? How do I do things without loops? How do I make sure that I'm using the facilities of q for iteration? If you'd like to join this, please contact me personally. My email address is very short: it's sjt at 5jt.com. That's SJT at, the number 5, JT dot com. That's Sunday the 25th of September, 2:00 to 4:00 PM London time.

00:02:22 [AB]

Right then. There's a new thing in the APL world, yet another new thing, called the Jot Dot Times [02]. It's actually a revival of an old APL newsletter title, but you can find it on apl.news.

That's the whole URL, and it's basically just a news aggregator for blog posts and things that are published around any type of APL and APL dialects. And then I have this weekly thing going on called the APL Quest, which consists of a meetup in the APL Orchard chatroom, and the people there were fairly advanced; then I make a follow-up video to that. And based on feedback, I've now started to lower the level to be really introductory. So if you're completely new to APL and array programming, in these new videos, beginning with what will be the last couple of videos by the time this gets out (the 2016 round, I think), everything will be explained, so you should have a good chance of being able to follow along [03]. And finally, there's a user group called APL in Barcelona [04], and it's global despite the name, for now at least, and they're just starting a new four-part series of meetups. The first one is September 17, so you might just make it if you hear this as it comes out, and it's a discussion on Iverson notation as a tool of thought. It might be something interesting to participate in. All of this, of course, goes in the show notes.

00:04:02 [CH]

That's awesome, yeah, and I would highly encourage folks to check out apl.news, AKA the Jot Dot Times. Do you know who runs this? 'Cause it's pretty cool.

00:04:13 [AB]

Yeah, I mean, it's one of my colleagues that has set it up, but I'll leave it at that.

00:04:21 [CH]

And I had no idea that Paul Mansour, I guess, has his own blog [06] and has been solving little leetcode problems, just in blog style. And they're very, very nice to look at, 'cause they have the correct Unicode font, etc. Anyways, I recognize some of the articles in here, I've definitely seen them, but not all of them for sure.

00:04:44 [AB]

So you might want to check out also the APL Wiki, which has a page called Blogs [07], and it lists all the blogs that we know of that publish APL content.

00:04:55 [CH]

Yeah, we'll throw, obviously, links for all this stuff in the show notes, but it's definitely cool to be exposed to content like this. I didn't even know about this, so I'm getting to learn things from my own podcast. The thing that I wanted to mention was just a short recap. If you're a regular listener, you will know that we had, or I helped run, along with Morten Kromberg and Dyalog and a couple of other presenters (Lib Gibson at the Toronto meetup and then Josh David at the New York meetup, who've both been guests on past episodes), the two APL meetups: September 1st was the Toronto one and September 7th was the New York one. I just want to say thanks to all the folks; if you happen to be a listener and you came out, it was a ton of fun from my point of view. I got to meet not only, you know, Josh and Morten, who have been past guests on the podcast, but also a ton of other folks from the last two-and-a-half-plus years that I've basically sort of entered the array language community. Folks like Rohan and Rick Proctor; I'm sure I'm forgetting a few names. And I met a couple of people that were also there. Devon McCormick I'd met before, but anyway, it's just awesome to meet people in person.

And then outside of those meetups, I got to have a couple of, you know, meetings with folks that either were or weren't at the meetups, that were in New York or in Toronto, because people were, you know, in town and close. And, yeah, I talked to Morten and we might be doing something like this going forward, not necessarily in Toronto and New York; we might try different cities, but maybe once a year, maybe two times a year, to try and do something like this, which would be super awesome. And for folks that weren't able to attend in person, my talk will be re-recorded and posted on my YouTube channel this coming Saturday, if everything goes according to plan [08]. So if you're listening to this, it will probably already be online, or if not, in a few hours it should be online. And I know that Morten is planning on re-recording, and we did manage to record Lib Gibson's, and we're going to be trying to put that online somewhere. And Josh David from the New York meetup said that he plans on giving his, potentially, at a future conference like APL Seeds. So all of the different talks should hopefully be available at some point in time, and whenever they become available we'll make sure that we have links in the show notes so that people can check them out if they're interested.

00:07:09 [BT]

When you do your talk, are you going to do it live like you did the last time with people watching you?

00:07:13 [CH]

I don't think so. I will do a premiere version where I'm in the comments answering questions, but it is my observation that a pre-recorded, premiering talk does better metrically, AKA with views afterwards, than a live talk does, because I think people view it as a live stream. And also you have less control: like, you can monitor your time, but if you're pre-editing and polishing it, you can make sure that it's nice and tight. Whereas my last live stream ended up, I think part of it was that, being a two-hour talk, because I realized that I can just go as long as I want, as, you know, people, if they're interested, will drop off or not drop off. But yeah, that's a long answer to what could have been a short answer: no, it won't be live, but I'll be in the chat to answer questions, and potentially, if folks want, I can think about setting up like a Jitsi or Zoom if they want to do a live Q&A after, to sort of replicate the fact that I can answer questions live in a live stream. That is the one advantage: there's a little bit of an opportunity to do more interaction.

00:08:15 [BT]

So I've just managed to create a whole bunch more work for you.

00:08:18 [CH]

Yeah, it's OK. Setting up a Jitsi link at the end is not too much work. Anyways, from that we'll skip over to Bob, whose announcement, I think the last sort of announcement, and conversation will bleed into today's topic, which we'll mention in a sec.

00:08:31 [BT]

Well, my announcement: you often see these lists of programming languages and, you know, how popular they are and everything. And in the most recent IEEE Spectrum list [09], J shows up. When I did a really quick count of the number of languages they had listed, it was 57, which, I don't know, is kind of a weird number, and J was sixth from the bottom. So we're either 51st, or if it was actually 50 and I've just missed by 7, then we're 44th in the list. But in any case, J shows up, and APL doesn't and K doesn't, which surprised me, but I always wonder about how they're doing these lists. In any case, J does show up and, hurray, wonderful. I guess we get a participant's badge or something. I haven't received anything yet; I don't know whether Eric has, but we're on the list. And so if that makes a big difference to you as a listener, well, then we've made the grade, and you can study the language in comfort knowing that we're not complete renegades. And I guess we've overcome that obstacle of being a single-letter language and actually showing up on the list, because that's...

00:09:50 [ML]

That may be why you showed up.

00:09:53 [BT]

What do you think, is it that they only had space for J?

00:09:54 [ML]

It goes one way sometimes and the other way sometimes, so.

00:09:58 [CH]

Hold on. Well, I mean, C is on the list? There's multiple single-letter ones that are there. There's D... I think those are the only ones.

00:10:08 [ML]

Alright, if they had V it would be very suspicious.

00:10:09 [CH]

V? Oh yeah, that's a brand new, that's a nascent language, yeah.

00:10:15 [ML]

'cause nobody uses V.

00:10:17 [CH]

Yeah, we'll leave a link to this. I'm not sure if everyone in the recording right now has a link to this, but there are sort of three different rankings out of these 50-plus languages. So there's Spectrum, which I just think is overall popularity, where it looks like J is seventh from last. And we'll just do the top three: first is Python, second is C, and third is C++. And then you've got other languages a few down, you know, Java, JavaScript, TypeScript, Go, what you might expect. And then the bottom seven go: J, then Elm (which, I'm surprised that J beat Elm), Raku AKA Perl 6, WebAssembly, CoffeeScript, and Eiffel. So that's on Spectrum. Then we go to Jobs, AKA, I think, hiring outlook for these, in which the top three are SQL, Java and Python, and then J comes second from last, above another single-letter language, D. And right above that, just to mention a couple of others, are F# and Eiffel. And then the third one is Trending, which I guess is, you know, up-and-coming, or, I don't know, popular, how much it's searched for; I don't know what it's actually based on. There, J, I think, comes 12th from last. So things are looking up for J, I guess, is what we can say: there are no jobs, but there's lots of people googling for the language.

00:11:47 [BT]

Uh, maybe they're more accurate than they thought they were.

00:11:51 [ML]

Well, I'm sure all the fleeing D programmers will pick up J.

00:11:57 [CH]

Yeah, I'm not really sure what to think of all this stuff. Oh yeah, and the top three for Trending are Python, Java and C. And, yeah, any thoughts from folks? You know, it's hard to read into these things, especially when you've got languages like Ladder Logic, which I've never heard of before, and, you know, Verilog... I mean, I've heard of Verilog, but...

00:12:22 [ML]

If you find out there's a flaw in their methodology which elevates these one-letter languages, keep it quiet, please.

00:12:31 [ST]

Do you think C as a trending language is evidence of that?

00:12:35 [CH]

Well, I mean, I think C's going nowhere. Although, it's very interesting in the low-level language space that there are some... I mean, let me search quickly... there's a ton of, I wouldn't say competitors, well, you could call them competitors, but languages like Nim. What's the other one that I... I mean, Crystal, I wouldn't really call a competitor, but there's all these languages that are sort of aimed at competing in the same space.

00:13:04 [ML]

Well, I think Nim is kind of higher-level too, so Zig is the only one that's really C-like.

00:13:07 [CH]

Is it? Yeah, I don't know much about Nim. There's another one, though, I'm forgetting. But does Nim...

00:13:14 [BT]

What about Rust?

00:13:16 [CH]

Rust operates more at the C++ level. Like, Zig is to C what Rust is to C++. You know, I just upset, you know, 10% or 50% of our listeners, because Zig doesn't have any of the safety stuff. The point being, you know, I think Zig is definitely trying to compete in the C space, whereas Rust is trying to compete in the C++ space, with, you know, asterisks on all the different language features and what they do well. But Nim doesn't show up, I mean. Zig, I think, is actually quite popular: like, if you're a C programmer, you've probably heard of Zig, and I do know quite a few folks that are doing things with Zig. So the fact that J is on this list... I'd be interested, you know, was this curated by a single person, or was there a...

00:13:57 [ML]

I think, if they're looking at academic sources at all, that would be a very trailing indicator, so you wouldn't see newer languages in there as much.

00:14:05 [CH]

Yeah, there's another indicator called TIOBE [10], The Importance Of Being Earnest, and I remember seeing a tweet, I'll have to find it, maybe we can throw it in the show note links, but someone was like, you know, we should completely disregard this. I can't remember why, but it has something very odd: you know, like Delphi/Object Pascal comes in at #13, above languages like Swift, and Visual Basic is a top-20 language. Anyways, there's PYPL, there's RedMonk, there's the GitHub ones, there's all these different ones, and they all base it on different stuff.

And depending on whether it's based on searches versus, you know, the number of files that are on GitHub, or lines of code, that would obviously impact the results.

00:14:59 [ML]

The Stack Overflow survey [11] is one that's actually good, because they have people actually saying "I use this language". If you can trust the respondents, then you have a better picture of, you know, what languages users of Stack Overflow use.

00:15:16 [CH]

Anyways, it's interesting. If folks are interested, check out the link, and I guess we'll kind of segue into today's topic, which we will start off by deciding what we should actually title it. I had suggested the title of this podcast being "The Essence of Array Programming Languages", and then alternatives were suggested, you know, like "What Is and What Isn't an Array Programming Language". And this is actually kind of similar to, I think, episode #1 or episode #2; I can't remember which order it was, but one was "what is array programming?" and the other one was "why do we like array programming?" [12] I think the "why we like it" was number...

00:15:56 [BT]

One. I think that was the first one, yeah.

00:15:58 [CH]

And then we talked, in the second episode, sort of about what makes a language an array programming language. So this is kind of the same thing, thirty-odd episodes later. But the reason I wanted to talk about this is that I recently, as a part of the talk that I gave at these meetups, created a repo called array-language-comparisons [13], where I'm comparing APL, J, BQN, Nial, Futhark, Single Assignment C, Julia, R, and NumPy, which technically isn't a language, but it fits in the space. And I basically started creating these little tables of small idioms, like how to do row reversals versus column reversals in a matrix, etc., and have run into a few things that exist in certain languages but not in other languages. And it got me to thinking: what really is the essence of an array programming language? Specifically, you know, one of the things that came up really quickly is that I didn't really realize that K and Q don't have (and I think this has been mentioned on the podcast, it just, you know, didn't stick in my memory when someone said it) is that in K [14] you don't have, I don't know what you call them, but true higher-dimensional arrays. And because of that, they don't really have a concept of rank the same way that, like, J, APL and BQN have. And from sort of diving into it a bit more, really the view of K and Q (and there are a couple of articles online that talk about this) is that, you know, Arthur Whitney really combined sort of two languages, the array language APL and Lisps, and it almost has me thinking of K and Q as, like, a hybrid array language that has some differences from APL, J and BQN. And then, looking at other languages, it wasn't clear to me, but when I first looked at Nial, which is spelled N-I-A-L, it's a language [15] that came out of Queen's University, which, fun fact, is where Ken Iverson was an undergrad.
I'm not sure there's any coincidence or relation there, but when I first did some, like, array math, where I, you know, tried to calculate whether the numbers in a vector, a rank-one array, are less than another number, you'd expect to get back a list of ones and zeros, AKA booleans. But I got back, like, l o l o l o, and I was like: what is this? I literally thought I had broken something, and at one point I thought: oh wow, they represent boolean arrays differently from numeric arrays. But later on I realized it's because a numeric array is just 1 space 0 space 1 space 0, etc., whereas this boolean one shows up as l o l o, which I realized is because l and o are the closest alphabetic letters to one and zero, and it's a terser way of representing, of basically writing, a boolean array. At first I thought: wow, this is so broken, I don't want two different things, they should be the same thing. But I think later on I realized that they actually are the same thing; they're just a different way of spelling it. Anyways, I'm going to stop my ramble, and we can now open the discussion, 'cause I'm interested to hear what the panelists here think really determines... like, what are the features on your little checklist of things you would expect to see in an array language, such that you're surprised when one doesn't show up? And we'll go from there. Stephen, go ahead.
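
[Editor's note: the point that boolean and numeric arrays are "the same thing, just spelled differently" has a direct analogue in Python, one of the languages in the comparison repo. A minimal sketch, using plain Python rather than Nial:]

```python
# Comparing a vector against a scalar, element by element.
xs = [1, 0, 1, 0]
flags = [x < 1 for x in xs]   # elementwise "less than"
print(flags)                  # [False, True, False, True]

# The booleans print as False/True (Nial prints l/o), but they are
# numerically identical to 0 and 1: a different spelling of the same
# values, not a different kind of array.
print(flags == [0, 1, 0, 1])  # True, since bool is a subtype of int
print(sum(flags))             # 2, so you can do arithmetic on them directly
```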

00:19:32 [ST]

This is really very, very simple; I just think it's kind of minimal. There has to be a vector notation. There has to be a way of writing out a vector which doesn't involve actually constructing it.

00:19:46 [AB]

Well, hold on. How do you even distinguish between those? In original Iverson notation [16], you would write vectors as (1 comma 2 comma 3 comma 4). So, by that account, Iverson notation was not an array language.

00:20:02 [ST]

Oh, I'm sorry. In the original Iverson notation, was that comma a function, like catenate, as it subsequently became? OK, so my point here is: if the comma is representing catenate, or join, then your vector, your array, is being constructed. And what I'm proposing here as, kind of, the basic criterion for an array language is that you can write a vector; you don't need it constructed out of operations.

00:20:43 [ML]

Well, so, part of the foundation of the APL2 [17] family: Jim Brown, one thing he was very interested in when starting this extension of APL was that, he said, stranding should be a general notation. And as part of that, I would interpret that as saying stranding is not a notation but an operation that works on multiple values and combines them all. So by that criterion, I'd have to say that APL2 is not, and that includes Dyalog and, pretty much, I mean, any actively developed APL today, and NARS2000.

00:21:23 [AB]

Even even BQN like that, right? Even though the users are simple, but if you could still call that an operation.

00:21:28 [ML]

Yeah, BQN is much more explicitly like that.

00:21:30 [ST]

Yeah, you gotta read my lips, I said vector, not array.

00:21:33 [AB]

No, but that is... we're only talking about vectors.

00:21:36 [ST]

Stranding.

00:21:39 [AB]

It doesn't matter whether it uses the explicit symbol or not, does it?

00:21:42 [ML]

Not after you parse it.

00:21:43 [AB]

Right. And I would disagree anyway with that whole thing. How often do you actually write literal arrays out when you're programming? Sure, for teaching and demonstration purposes, but when you're dealing with real data, you can easily have a large program that doesn't have a single array literal anywhere. Surely removing the ability to write explicit arrays, and forcing you to use concatenation or reshaping or whatever, would not make APL or any other array language any less of an array language.

00:22:15 [BT]

I'm going to take it in a bit of a different direction. I think of the array languages more conceptually. So although in an array language you don't have to program like you're using arrays (you can do things on a scalar level; it's quite permissible), it's not the way you would usually use them, because you're sort of, I guess, you know, driving a car around and only using one cylinder. But you know, if you really want to do that, that's fine. The thing I find is that what an array language allows you to do, and actually encourages you to do, is not think about scalars, but to think about groups of numbers and their relationships. And when you get to those groups of numbers, you know, you think, well, there's no difference: if I add one to an array, I get one plus every item in there, right? Even that really simple thing: you're no longer thinking about adding one to each item in the array; you're thinking about changing an array as a group by adding one. And to me that's array-style thinking, and that can be extended into very complex kinds of array operations such as transpose [18] or reverse or rank, so that you can control those arrays in ways: you can bend them, fold them and move them around, but the whole time you're not thinking about the single numbers; you're thinking about the relationships between the numbers in that array. So it's the structure itself that encourages you (I don't want to say forces, because you don't need to do this, but it encourages you) to think about the relationship between all the different numbers and the operations you're putting on them. And to me, a language that causes you to think, or gives you the tools you need to think, that way is an array language.
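
[Editor's note: the "add one to the whole array" idea, implicit elementwise iteration, can be sketched in plain Python, used here as a stand-in for an array language, where no explicit loop would appear at all. The helper name is mine, not from any of the languages discussed:]

```python
# In APL/J/BQN, 1 + 1 2 3 gives 2 3 4 with no visible loop.
# Plain Python needs the iteration spelled out once, in a helper;
# after that, code reads array-at-a-time, the way Bob describes.
def add_scalar(scalar, array):
    """Elementwise scalar + array: the 'implicit iteration' of array languages."""
    return [scalar + x for x in array]

print(add_scalar(1, [1, 2, 3]))  # [2, 3, 4]
```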

00:23:58 [ST]

Could we say that that is implicit iteration? So for a lot of the primitives, iteration is simply implicit. You do need to learn how the implicit iteration works, but once you grasp that, just as you say, you stop thinking about it. And you mentioned transpose, which arguably is not an iteration (I think Conor would call that an algorithm), but I think the same point goes.

00:24:29 [CH]

So just to recap, Stephen Taylor: your point, your observation, or, I don't know what the best word is, indicator, of an array language is being able to write a literal vector, for vector languages at least. And I kind of got lost when Adám was saying: if you go, you know, 3 4 reshape of a sequence of numbers, does that not count? Like, do you need to have a way of basically writing a matrix literally? So, just to clear that up: or did you say something slightly different than that?

00:25:05 [ST]

I did, but 3 4 reshape [19] is an excellent example. The 3 4 is a vector, and it would be a good counterexample to what Adám is saying, that you don't need to write vectors in code. Yeah, you write 3 4 and reshape.

00:25:20 [CH]

Oh, so specifically you were talking about rank-one arrays. You don't necessarily need to be able to represent higher-dimensional arrays literally, but you should have a convenient expression that doesn't require, you know, brackets or something like that.

00:25:33 [ST]

That's why I limited it to vector. So a vector is a rank-one array where all the items are the same data type. And I guess I'm really just saying: if you have a language in which you can't write a vector out, then you're not even in the running.

00:25:49 [CH]

And so does that exclude Q and K? Because there are a lot of languages (the first that comes to mind is C++) where you can reach for a library utility called, you know, vector, or one that is literally called array, but it requires you reaching for that. You know, it's written in the library, so that doesn't count; it has to be in the language, is what you're saying.

00:26:07 [ML]

Yeah, well, I mean, there's the question of, like, C array initializers, right? 'Cause when you compile the C code, it's just going to stick all the values that you gave it in the data segment of the program. It's not going to run any code in order to initialize that array. So if you have, you know, int, brackets, 3 (so a three-element array) equals curly braces 1 comma 2 comma 3: that has commas, but at the same time it is a static notation that specifies an array rather than giving you code to be run.

00:26:45 [CH]

Yeah, that's actually true of a lot of languages. Like, another one that comes to mind is Haskell: Haskell technically has a list notation, is what they call them.

00:26:56 [AB]

Uhm, loads of languages have that, right? You have JSON.

00:27:00 [CH]

Well, so I guess this is not.

00:27:02 [ML]

APL is one of the few that don't do it... or, well, I guess J and K would be the ones that... no, K does.

00:27:09 [AB]

Yeah, J doesn't have it.

00:27:10 [CH]

OK, so I guess we don't have to consider these as "as soon as you have this, it makes you an array language". These are criteria that you would expect an array language to have; just because you have this, it doesn't make you one. So that was just recapping sort of what Stephen said. Bob, yours, you know, was similar to what Stephen said: implicit iteration. I was thinking what you were kind of talking about was, you know, the five-cent word for it, rank polymorphism, but it sounded like you were saying more than just, you know, being able to add a scalar and a rank-two matrix; it was being able to operate on arrays holistically with operations. Am I capturing that correctly, or are there other things you want to add to my summarization?

00:27:55 [BT]

I think your summarization is pretty good. What I would extend it to is the way that it makes me approach a problem. You actually had... I'm trying to think... I think you used one of the examples for your meetup in Toronto. Anyway, I'm not sure you did the one in New York, but it's the one with the cross in the center of the, you know, the identity matrix, or the...

00:28:18 [CH]

Yeah, the X matrix, yeah.

00:28:20 [BT]

Yeah, yeah, the X matrix. And to me, that got me thinking about a whole bunch of things about how I would turn around and create that matrix, because the one I think you used was: essentially you create the identity matrix and then you reverse all the rows, so you get your X. And, I think in your case, because you were trying to take the least of them, you did the floor of it between the two.

00:28:46 [CH]

This will be coming out in the Saturday talk that I gave, but it was the max of the identity matrix and a reversed identity matrix, because then you get all the ones. And someone (I think it was you, Adám) pointed out I could do a logical or; it would be the same thing.
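
[Editor's note: the construction Conor describes, the max of the identity matrix and its row-reversed self, can be sketched in plain Python as a stand-in for the APL expression. The function names are mine:]

```python
def identity(n):
    """n-by-n identity matrix as a list of rows of 0/1 ints."""
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

def x_matrix(n):
    """Max of the identity matrix and its row-reversed self: the 'X' matrix."""
    eye = identity(n)
    reversed_rows = [row[::-1] for row in eye]   # reverse each row
    return [[max(a, b) for a, b in zip(r1, r2)]  # elementwise max (or logical or)
            for r1, r2 in zip(eye, reversed_rows)]

for row in x_matrix(5):
    print(row)
# [1, 0, 0, 0, 1]
# [0, 1, 0, 1, 0]
# [0, 0, 1, 0, 0]
# [0, 1, 0, 1, 0]
# [1, 0, 0, 0, 1]
```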

00:29:01 [BT]

Yeah, that's one way of doing it. And so that's sort of the base level, but what that got me thinking about is: how would I create that X matrix? 'Cause you don't have to just create it by reversing; there are other ways that you can do it. And I was thinking, is there a way, like in J, using imaginary numbers [20], where you can actually insert zeros between ones? So that if you had your top row as a one, say three zeros, and a one, you would, say, have a copy of 1j3 (the 3 would make the three zeros), and then it would go one, but then the next one would be 1j1. And whether you could do a count going up in your imaginary numbers: so 1j1 gives you 1 0, 1j3 gives you 1 0 0 0. Essentially you would create a vector (not a scalar) of your ones and zeros by using these imaginary numbers in ascending or descending scale. I've already played around with this a bit, but it just started to get me thinking that way: is there a way to actually create this X matrix a different way, and in that different way, are there advantages in solving some problems? A language that encourages me not just to go "OK, you do it this way", but: what are the relationships between these different positions? Do I index into it? Are there different counts I can use? Are there different ways I can get in and change the specific spots I want from ones to zeros? And can I do that in terms of thinking about a matrix? I could go in and just index, by saying this one turns to one, this one turns to one, but that would be almost a scalar way of doing it; I would be looking for ways to do it as a matrix. Is there a transformation I can do to create that? To me, that's what an array language allows me to do. It takes it up to a different level. I'm not saying I always do that, but often when I'm solving a problem, if I can solve it one way, I'm always looking for that other way. Is there a simpler way that this falls out?
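
[Editor's note: the J trick Bob refers to is copy (#) with a complex left argument: 1j3 # 1 copies the 1 once and then appends three fill zeros, giving 1 0 0 0. A rough plain-Python analogue of that behavior (copy_with_fill is my own name, not a J or Python builtin):]

```python
def copy_with_fill(specs, values, fill=0):
    """Analogue of J's complex copy (#): for each (copies, fills) pair,
    repeat the corresponding value `copies` times, then append `fills`
    fill items."""
    out = []
    for (copies, fills), v in zip(specs, values):
        out.extend([v] * copies + [fill] * fills)
    return out

# J: 1j3 # 1  ->  1 0 0 0   (one copy of 1, then three zero fills)
print(copy_with_fill([(1, 3)], [1]))             # [1, 0, 0, 0]

# Varying the fill counts builds rows of an X matrix directly,
# e.g. the top row of a 5-wide X: a 1, three zeros, and a 1.
print(copy_with_fill([(1, 3), (1, 0)], [1, 1]))  # [1, 0, 0, 0, 1]
```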
And one of the people I think of a lot in this case... well, a lot of the people on this panel can do this really, really, really well, but the other person I always think of is Roger Hui [21], who would, you know, create three characters and do something, and you suddenly realize: oh, oh, OK, I get what's going on here. You know, you do a sort and you do this and you do this, and suddenly something that would take maybe five or six operations can be done in three. And there's just an elegance to that, and that's what I see in array languages. They allow you to think that way; they allow you to do those kinds of things. Whereas when you go to languages that are not array languages but can represent arrays, you know, such as C++, you're doing an awful lot of work just to work the array. You're not even at that level; if you get it done, it's just like, oh, I'm done, thank goodness that's done, I'm moving on. But an array language allows you to work with an array and then play with an array, and to me, that's what an array language does.

00:32:07 [CH]

Yeah... well, I'm about to kick it to Adám, probably. I'll see if I can anticipate what you're about to say. What I was going to say, while you were saying that, Bob, is that I had tweeted out a preview of one of the slides in my talk.

And Adám had posted a couple of different solutions, specifically with respect to creating that identity matrix, one of which I would never have thought of, and it was beautiful. So I'll get Adám to explain that. And I'll check first: is that what you were going to say, or were you going to say something else?

00:32:37 [AB]

No, no, I was just going to come with my bid on what it means for something to be an array programming language.

00:32:43 [CH]

Alright, well we'll queue it up. Do you recall or do you need me to refresh what you did?

00:32:46 [AB]

I recall the thing, but you've got to give me a link to that tweet, 'cause I don't remember the details.

00:32:51 [CH]

Alright, I'll explain. I'm sure as I get three words into it, Adám will remember. So the way I did the identity matrix was by just going iota of a number, and then you do an outer product equals of that with itself. And then, mixing that with the row-wise reversal of it, you get your X matrix. Adám did it by basically doing the same outer product equals, but on a different sequence. He didn't use an iota sequence directly — he used a sequence that was based on iota: taking that iota sequence, reversing it, and then taking the minimum of those two sequences. So you basically get like a mountain, 1 2 3 2 1. And if you think about that and you do an outer product equals on it, you get your X matrix. When I first saw that, I worked through it and I was like, wait, that doesn't work. And then you look at it and you're like, holy smokes, that's actually perfect. But I didn't change my talk to use that example, because it ruined some of the points that I was making in my talk. So instead I said, if you're thinking there's a shorter solution, you know, go find the tweets, and you might find the one that folks like Adám found. But yeah — when you were explaining finding different ways to solve it — like, there are ways to solve things that I would never think of in a million years, and then you just start to see these patterns. I'm not sure, Adám, if you want to add anything to that, or just hop straight to what you think the essence of these languages is — what do you expect to see in them?
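
For readers following along, here is a small Python sketch (standing in for the APL) of both constructions Conor describes — the identity-plus-reversal route and Adám's min-of-iota-and-reverse "mountain" route; both produce the same X matrix:

```python
n = 5
i = list(range(n))  # iota

# Conor's route: outer product equals gives the identity matrix,
# then OR each row with its own reversal to get the X.
eye = [[int(a == b) for b in i] for a in i]
x1 = [[p | q for p, q in zip(row, reversed(row))] for row in eye]

# Adám's route: take the minimum of iota and its reverse — the
# "mountain" 0 1 2 1 0 — then do outer product equals on that.
mountain = [min(a, b) for a, b in zip(i, reversed(i))]
x2 = [[int(a == b) for b in mountain] for a in mountain]

assert x1 == x2
for row in x2:
    print(row)
# [1, 0, 0, 0, 1]
# [0, 1, 0, 1, 0]
# [0, 0, 1, 0, 0]
# [0, 1, 0, 1, 0]
# [1, 0, 0, 0, 1]
```

The mountain trick works because two positions get the same label exactly when they sit on the main diagonal or the anti-diagonal, which is precisely the X.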

00:34:25 [AB]

No, no, I think you covered that.

00:34:27 [CH]

If you didn't, we'll leave a link to the tweet. I'll find it later and it is a cool exercise to work through, yeah?

00:34:33 [AB]

I think I'd like to say that an array language is one where the array is the core data type. That doesn't mean there couldn't be other types of collections of things, but I see a lot of programming languages that have all kinds of collections — sets, and, I don't know, node collections, and tables — whereas array languages, I think, have in common that any collection of things is always going to be an array, and then you have this set of powerful operations that you can do on arrays. And that's your whole thing. Like, I've heard Conor on the ADSP podcast saying, you know, which data type do you reach for, for this thing, for that thing? For the array programmer in an array programming language, that question never comes up. How should I represent this? Well, I'm going to represent it as an array, 'cause that's pretty much the only thing I've got. In K and q you have — yeah.

00:35:30 [ML]

The question is which array?

00:35:32 [AB]

Yeah, right. And the exact structure of the array — in K and q you have a little bit more of a choice: they have tables and maps and so on, and J has sparse arrays. But basically we're always working on arrays, and this bothers me when I have to write some JavaScript, so my JavaScript looks very not-JavaScript. I do array programming there. So I'll do some query selector thing, and then I get all the nodes that match something, but unfortunately that's like an HTML node collection thingy, I don't know, and no array operations work on it.

But luckily JavaScript allows me to write square brackets with three dots in it, and magically it becomes an array, and now I can start working. And the same thing goes for string operations: obviously strings are just arrays of characters, but that doesn't work either. But luckily JavaScript allows me to put square brackets and three dots around it, and now it's an array and I can do my work — and then I just have to remember to do the join with quote quote at the end. That kind of thing, right? You can do this in other languages, but it's not the obvious way to do it. And that, I think, is the important thing of array programming. The exact vocabulary or exact notation —

that's not the main thing. It's thinking in arrays and always mapping over arrays, and things that reasonably can be automatically mapped over arrays are automatically mapped over arrays.
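
For readers unfamiliar with the idiom, here is a minimal sketch of what Adám describes, using a string rather than a browser NodeList so it runs anywhere (a `querySelectorAll` NodeList would be spread the same way):

```javascript
// A string, like an HTML NodeList, is iterable but is not an Array.
const s = "array";

// [...] spreads any iterable into a real Array, so array operations work.
const chars = [...s];
const upper = chars.map(c => c.toUpperCase());

// Join with '' at the end to get back to a string.
console.log(upper.join(''));  // ARRAY
```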

00:37:09 [CH]

Marshall, do you want to round out our panelists' opinions on the criteria you expect to find in an array language?

00:37:15 [ML]

Well, I guess I can. So I'm remembering that when I was at Dyalog and gave my talk at LambdaConf [22] — that was mainly on the outer product — I gave at some point a definition of array programming as meaning that everything is an array. This is true in APL and J, definitely. I think most K'ers would say this is also true of K and q. It's the case that even if you have one number, that number is inherently an array with zero axes. And in K this is a little weird, 'cause axes don't mean quite the same thing, but most programmers would interpret an atom that's not nested at all as having zero levels of nesting, and these levels of nesting are axes, so it's also a zero-axis array. And then I made BQN, which doesn't do this and has values that are not arrays. So maybe BQN just isn't an array language? But I mean, the way I think of it now is really just that an array language is one where the primary influence is Iverson's APL.

And that's kind of just all it is, 'cause there are several different features of these languages that come together. So you do have the focus on arrays — the use of arrays as a single data type, so there's only one way to represent things, mostly; I mean, if you know your layout, there's one way to represent that layout, though there may be different layouts that represent the same idea. And then there's the syntax that's based on, you know, the particular way that Ken kind of took mathematics and made it uniform by turning everything into a function or operator. So some languages like Nial don't quite work the same — they change things to words. I think Nial actually has evaluation order left to right, but it has a reasonably similar expression syntax; it just changes that up a lot.

But at the same time it's getting all its array ideas from APL, so it's still very similar. And then there are other things: you do have the symbols, which Nial definitely doesn't have — the idea that the most important functions, the ones that you use all the time, are primitives, and they're written with one character, or maybe two or three in J. So yeah, my definition would just be: something where the primary influence is the APL programming language.

00:39:52 [CH]

So now we've got one of the criteria from Stephen being arrays as a part of the language; Bob's was implicit iteration, and also — toward the end of it — something more like a school of thought: the way it changes how you approach problems, the playfulness of the language, where you see one solution and then immediately start thinking of the next solution. Adám said the core data structure being the array, and then Marshall's, maybe one of the most interesting ones, is a very loose definition: that the main inspiration is at least traceable back to Ken Iverson's APL [23]. So yours is probably the most generous definition, in that it would call the largest number of languages array languages.

00:40:44 [ML]

Well, I don't know, 'cause it rules out stuff like Julia and NumPy — even MATLAB, probably. I mean, those are more inspired by LAPACK and stuff like that [24].

00:40:56 [ST]

Let me toss something else into the mix here. Bob and I, I guess, were saying earlier that a key thing is that a lot of iteration is just done implicitly. And I would add to that that there must also be iteration primitives, for when the implicit iteration isn't the iteration that you need: you've got primitives — operators or functions, however you term them in the language — which specify how the iteration is to be done.

00:41:33 [AB]

No, no, I have to protest. That would rule out APL\360, which is an array language [25].

00:41:38 [ST]

Well, it doesn't. It doesn't have a reduce.

00:41:42 [AB]

Yeah, but you said mapping now.

00:41:44 [ST]

I didn't say map.

00:41:48 [BT]

No, I think Stephen was talking about iterators. I mean, I can see where you would take it to mapping, but I think if you're just talking about a way to use an operator that will allow you to do multiple things without too much effort, that's probably what we're thinking of as an iterator. Is that right, Stephen?

00:42:05 [ST]

Yeah, simplest example will be each.

00:42:07 [AB]

But APL\360 had no each and no rank, and no while or for loops. The only way you could iterate over an array was by initializing a counter and then indexing into the array. That's very not APL. If you wanted to do it using a primitive that did it by itself, there was nothing you could do.

00:42:26 [ST]

APL\360 had reduce and scan.

00:42:30 [ML]

Yeah, but the only operands you could have were arithmetic primitives.

00:42:34 [AB]

So let's say you define your own factorial function — you know, some languages have it built in. So you define the function, which uses iota, the index generator, on the argument, and then does a multiplication reduction over that. And now you want to apply this to every element of an array. And so yes, in modern languages you've got an each — actually J doesn't, but you can still manage with rank.

00:43:02 [ML]

Well, the standard library has each.

00:43:05 [ST]

I'm not sure where you're coming from with this. Are you arguing that APL\360, on my definition, cannot have been an array language? I'd say it's got the operators reduce and scan and so qualifies. I'm not arguing that APL\360 is a full and complete programming language.

00:43:26 [BT]

And Adám, you were saying, though, that in APL\360 the operands to the scan operator could only be arithmetic. Is that right?

00:43:36 [AB]

Well — it depends what you mean by arithmetic. In APL\360, and all the way up to APL+, what we today consider operators — higher-order functions, whatever — were really just special syntax. If we look at them as operators with operands, they could only take a very specific set of operands, and that's it. You could not supply your own custom function to use there.

00:44:04 [BT]

But would you need to do that to have an array language? I would think the fact that you could do it with some operands is enough to be able, within that context, to think of it as an array language.

00:44:15 [AB]

Well, in that case — instead of spinning it out separately — I would say any programming language that has a SUM function... except it's not actually SUM: it's spelt S, and then UM is the reduction. S means plus, so now you've got a plus reduction, so it's an array language.

00:44:35 [BT]

What I would say is you'd also need to have a PUM function, where the P means something else — and any number of things — because the UM becomes something that allows you to repeat.

00:44:46 [ST]

MUM for multiplication.

00:44:49 [AB]

Yeah, it's simply.

00:44:50 [ST]

DUM for division.

00:44:53 [CH]

Here's a question about q [26], because q doesn't really have a concept of rank. So is there a term for the equivalent of rank polymorphism? 'Cause I do know that in q and K they overload the meaning of the ASCII functions — if that's the correct word for them. I mean, in K they're ASCII functions; in q, I guess, there's no overloading because they're still ASCII — but yeah, there's a single word for each built-in.

00:45:22 [ML]

Well, they're still ASCII.

00:45:27 [CH]

Functions. But the fact is — 'cause I just tested that in order to sum the rows in a matrix, you do a sum each of that matrix. So there it looks like sums, or reductions, really just work on your rank-1 arrays, AKA vectors. Is there a parallel concept of your glyphs having different behaviors depending on the — I don't know if you want to call it data structure, 'cause I'm not sure if they're called tables or dictionaries in K or q — I know that they have sort of different behaviors if you're passing one of those to an ASCII function versus a vector or rank-1 array. Is there a terminology for that, or does that sort of not exist in K/q?

00:46:17 [ST]

Yeah, we do have a terminology for rank [27]. It's called rank. And what's more, it applies to functions and to data structures. So, applied to a function, the rank of a function is the number of arguments that it takes, and applied to a data structure, it's the number of different indexes you can give it. So a matrix has rank 2. The plus operator has got rank 2. And this concept is central to understanding explicit iteration, because just as you can reduce a list with a binary operator, you can reduce it with a matrix. More than that — you've probably come across the founding insight in q and K that arrays are functions: an array's values are its range, and its indexes are its domain. A dictionary — its domain is the keys of the dictionary, and its range is the values of the dictionary. If you take a dictionary in which all the values are also valid keys, then you've got a finite state machine, and you can use that for traversal. And if you take a list of unary functions, each of those unary functions has got rank 1; my list has rank 1, so my list of unary functions has rank 2.
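
Stephen's dictionary-as-function point can be sketched outside q — here in Python, with a hypothetical `converge` helper written for illustration: indexing a dict whose values are all valid keys plays the role of function application, and repeated indexing is traversal.

```python
# A dict whose values are all valid keys is a finite state machine:
# indexing is function application, repeated indexing is traversal.
fsm = {'a': 'b', 'b': 'c', 'c': 'c'}  # 'c' maps to itself: a fixed point

def converge(table, state):
    """Index the table repeatedly until the state stops changing
    (roughly what q's converge iterator does with a dict)."""
    while table[state] != state:
        state = table[state]
    return state

print(converge(fsm, 'a'))  # c
```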

00:48:31 [ML]

Now this is the thing that bothers me.

00:48:33 [ST]

Like, you can use that as a state machine, too, to converge on evaluation.

00:48:42 [ML]

What bothered me about this is that a function of 1 argument returning a list still has Rank 1, right?

00:48:50 [ST]

Yes, yeah.

00:48:52 [ML]

So it's very weird to me that the way that functions get higher rank is that they're just specified, all at once, to have multiple arguments, and the way that lists get higher rank is that they're nested. So that's just... I mean, it's not horrible or anything, but it bothers me that K does all this work to unify different things that have indices, but then they actually do act differently. And, in a more direct way, you have functions that just decide what they're doing based on the type of the argument.

00:49:32 [CH]

So q does have the concept of rank and rank polymorphism — it just doesn't have a rank operator.

00:49:40 [AB]

But you've got each left and each right, which you can pair up to do pretty much what you want.

00:49:46 [ML]

Yeah, I think Arthur [28] has mentioned before that adding a rank operator would be possible — you know, once in a while he says, well, rank would be nice to have here, or something like that.

00:49:57 [AB]

Well, I think he even said maybe time to bring back the rank operator. He did invent it after all, right?

00:50:06 [CH]

And it is an interesting thought exercise, playing the game of, well, did APL\360 have it? 'Cause APL\360 also didn't have a rank operator, yet it's become something that I think a lot of people would put on their list of criteria that they expect to see in an array language — if not a rank operator, at least the concept of rank.

00:50:30 [BT]

Yeah, I was going to say, if it had operators like, you know, reduce and scan, it's going to have a concept of rank.

00:50:38 [CH]

Right.

00:50:39 [AB]

And if I'm not mistaken — actually, I'm not really sure — did APL\360 have bracket axis for scalar functions? I'm not even sure.

00:50:51 [ML]

I don't think so. I think that was much later.

00:50:54 [AB]

OK, yeah. But see, it did have automatic mapping for scalars; it just didn't allow you to specify the cells that you were going to apply on. And it's actually fairly easy in any given situation to just use reshape to get what you want, and so if I have —

00:51:15 [ML]

Yeah, and replicate possibly, right?

00:51:17 [AB]

No, no, we shouldn't.

00:51:18 [ML]

Well, no, it didn't have replicate, did it?

00:51:20 [AB]

It did have replicate — it did.

00:51:22 [ML]

It had compress.

00:51:24 [AB]

Oh yeah, right, originally it didn't have replicate, you're right. But I don't need that; reshape should be fine. So let's say you have a matrix — that's a rank-2 array — and you have a rank-3 array, and you want to add them together. That doesn't work, right? You want this matrix to be added to every layer of the rank-3 array, the rank-3 array being basically a collection of layers, a collection of matrices. So what you would do — assuming, of course, that the matrix has the same shape as the individual layers in the rank-3 array, otherwise it's not going to work anyway — all you really need to do is take the shape of the rank-3 array and use that to reshape the matrix. Because reshape is cyclic, it will just reuse the same data over and over and over again, stacking layers on top of each other, and now they have the exact same shape, and the scalar operation will work.
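
The reshape trick Adám describes can be sketched in Python — `reshape` here is a hypothetical helper mimicking APL's cyclic dyadic rho, not any real library call:

```python
from itertools import cycle, islice
from math import prod

def reshape(shape, data):
    """Mimic APL's dyadic rho: cycle the ravelled data to fill the shape."""
    flat = list(islice(cycle(data), prod(shape)))
    for n in reversed(shape[1:]):          # regroup innermost axis first
        flat = [flat[k:k + n] for k in range(0, len(flat), n)]
    return flat

mat = [[1, 2], [3, 4]]                     # rank 2, shape 2 2
ravel = [x for row in mat for x in row]

# Reshape the matrix to the rank-3 shape 3 2 2: because reshape is
# cyclic, the same data repeats, stacking copies of the matrix.
stacked = reshape((3, 2, 2), ravel)
assert stacked == [mat, mat, mat]          # shapes now match; scalar + works
```

Once the shapes agree, the ordinary elementwise (scalar) addition Adám mentions goes through with no special machinery.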

00:52:19 [BT]

And the reshape is what gives you replicate because it's cycling through them.

00:52:23 [AB]

Well, it's not replicate, because replicate does n copies of this element, and then n copies of the next element, and then the next, and that's not it. What you could do is increase the rank by adding a leading axis and then replicating along the leading axis — but without replicate you couldn't do that. So the traditional way of doing it is reshaping.

00:52:44 [ML]

Yeah, I think people mostly use reshape, but it is possible to do this with outer product too, and then you can add an axis at the beginning or the end. You didn't have an identity function, so you do outer product addition with a vector of zeros on one side.
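
Marshall's zero-vector trick, sketched in Python: outer-product addition between a vector of k zeros and the matrix adds a new leading axis of length k without changing any values, with addition-by-zero standing in for the missing identity function.

```python
mat = [[1, 2], [3, 4]]
zeros = [0, 0, 0]  # length 3: the new leading axis

# Outer product addition: for each zero, add it to every element of the
# matrix — so each "layer" is an unchanged copy of the matrix.
stacked = [[[z + x for x in row] for row in mat] for z in zeros]

assert stacked == [mat, mat, mat]
```

Putting the zeros on the other side of the outer product would instead add the new axis at the end.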

00:53:02 [AB]

Well, you could do a dyadic transpose on the outer product.

00:53:07 [ML]

Yeah, probably.

00:53:08 [AB]

It'd be wasteful, but again, the traditional way of doing it was definitely reshape — I've seen that code in the wild. So you could do this, and it's only really slightly awkward. It's us array language enthusiasts — as opposed to the people who actually need to make money using the array languages — that are all enamored with rank polymorphism and the rank operator and all this fancy stuff. People just need to get the work done. It'll take them, OK, 30 seconds extra to write that reshape, at most.

00:53:41 [BT]

But actually, the exercise we just went through — where you were trying to figure out how you would do it without replicate, and you could do it with a dyadic transpose, and you could do it by adding zeros and increasing the dimension that way — that whole exercise is exactly what I was talking about. When you're working with an array language, you start thinking about these things in these concepts, because you have operators that work on the entire array. You're not thinking about the individual numbers; you're thinking about what you can do to transform this array. It's all in the transformation and the way it makes me think, and to me that's what makes it an array language. And it has to do with the terseness, because if there's too much work involved in doing that, I'm not going to be able to follow what you're talking about. But knowing array languages, as soon as you say dyadic transpose on the matrix I'm working with, I go: oh yeah, that would work, that would be interesting. Probably not efficient, but I know what you're talking about.

00:54:39 [AB]

I don't think the symbols are so important for that. If you choose them well, yes — but the code would automatically be compact even if you were to use relatively short names for these operations: so things like Nial, or even q — for all the monadic functions, if I remember right — and even K for certain things, in at least some versions. In fact, before Unicode, when there were all kinds of encoding problems for APL, it was fairly common to have various spelling schemes, and the code surely stayed array language code even if you spelled out the names of things.

00:55:22 [ML]

Yeah, well, even in J, since it has a standard library, it's pretty common to write like each instead of ampersand dot greater than.

00:55:31 [AB]

Even though that's shorter.

00:55:32 [ML]

Yeah, I always wrote the ampersand version.

00:55:35 [BT]

I tend to write the ampersand version because it doesn't disguise what it's doing, and that gives me more options when I think about breaking it apart. You know, it's an under open, in that way — it's an under of open and close — and there are so many things you can do where the under means that you do something and then you reverse it, and that way you get either end of the under, and that's a really powerful concept. So if you write each, that gets disguised. But one thing I was going to say is, I'm the last person who's going to argue that the actual symbols are really important, because over and over again what I hear about J is that it's not nearly as beautiful as APL.

00:56:20 [AB]

Are we talking beauty? We're talking about array programming.

00:56:23 [BT]

Well, that's what I'm saying — it doesn't really matter so much about the symbols. To me it's the concepts, the fact that they're short enough, and the primitives that are chosen.

00:56:34 [ML]

Yeah, I think the primitive choice is really important.

00:56:38 [ML]

And my complaint about your definition — I mean, I'm not going to say it's not a valid definition, but I think of it as a consequence rather than, you know... it's something that comes out of being an array programming language instead of something that makes it an array programming language. So I think it's —

00:56:58 [AB]

Which definition are you arguing with?

00:57:00 [ML]

The definition where an array programming language is one that, like, encourages you to think in terms of entire arrays and operations on those. I think of Iverson's vision as sort of one way to bring you into this way of thinking, but I don't think it's necessarily the only way. And so I don't think that consequence is what makes something an array programming language at its core.

00:57:26 [ST]

Oh, let me pick up on that. I think it's a really interesting point, Marshall, 'cause you were saying earlier that you were doing array programming in JavaScript — if I understood it right, it's like when you have to write some JavaScript, you naturally favor an array-based approach. And if I was following that, it reminded me of when I was learning PHP [29], and I got into trouble when I had to ask questions, 'cause I was reaching for array techniques and writing in a functional style. And so the answers I got back to my questions were aimed at a much more experienced and sophisticated PHP programmer than I was — I must admit I didn't understand them at all. It's been often said how influential Iverson's APL has been on other programming languages, and we've seen how array capabilities have been kind of copied or replicated in other languages. And in JavaScript, [30] for example, when I find myself using the array features, I often have the experience of: this is all right as far as it goes, but it's not the real thing. So I have a picture in my mind of the array paradigm having spread out, having rippled and diffused through the programming language space, but wherever it's moved to, it's kind of got stepped down. And so if you're working in another language, Python or JavaScript, and you're using the array facilities, it's sort of like you've got a half-broken version of them. I was thinking, wouldn't it be cool if we wrote a book in which we took some of the core array features that have been replicated through other languages, and we showed them as they were originally in their source language — in APL, J, whatever — how they work there, how they combine, and then followed them into some other popular languages: so this is how you can use the same thing in this other language.

Or, if not how — this is how far you can get with it. So that would be of interest, I guess, to a kind of computer language philologist, people who are interested in how these ideas spread, but it would also be of practical use to people who are working in other languages and would like to use array features, so that they know they're using them to the greatest extent that they're available.

01:00:05 [AB]

Yeah, let's do that.

01:00:12 [ST]

You in for this, Conor?

01:00:13 [ST]

You know more languages than anybody.

01:00:20 [CH]

I'll write a foreword. How about that? Yeah, a foreword.

01:00:23 [AB]

By host of the ADSP and the Array Cast.

01:00:27 [CH]

Rest of the book written by panelists of this podcast.

01:00:31 [ML]

Well, I should push back against this mindset that, you know, APL is the source and everything else is downstream. 'Cause, I mean, one: factually, I think a lot of this stuff was developed in other languages independently — I'm not sure whether you're saying that or not — but Lisp [31] definitely was pretty early to getting a lot of things like map, and I think it got reduce from APL, but filter it definitely got on its own. So a lot of this same style of thinking — I mean, they're pretty simple concepts; if you have a certain mindset approaching programming, they do just emerge. And the other thing is, I think the thing that we're approaching is not necessarily APL — like, APL is part of the approach. Where Iverson came from was linear algebra, and you can see that — I mean, he directly tells you where things come from in linear algebra. So we had the idea that the outer product is a specialization of the inner product, which Iverson definitely did eventually completely disavow: in J it's called table, and there's no relation. I did the same thing in BQN [32]. So I also think that APL, definitely throughout the 80s, was working on sort of purifying this vision of array programming, as opposed to this thing that happens to be derived from mathematics. And I do hope that work continues.

01:02:16 [AB]

Well, I just want to say it does — by the very fact of what you're doing.

01:02:20 [ML]

I hope so. Well, BQN is pretty much fixed now — it's not something I wanted to spend my whole life on, and, you know, at least the design of the primitives I think is good enough. Whoever makes the next language can add their own improvements.

01:02:35 [CH]

Yeah, I was going to say, this is where I hop in and say: this is fantastic. This is exactly what I hoped it would be. And I now consider this podcast my personal think tank, where whenever I need a question answered or I'm having thoughts in my head, I just come here, I throw it at you folks, and then I just sit back. 'Cause, yeah, this is —

01:02:58 [AB]

Turns out that Conor was actually preparing for some talk at some C++ conference or something, and he just couldn't figure it out, so he threw it at the think tank.

01:03:06 [ML]

He was literally sitting back, too. He was just taking it all in.

01:03:08 [CH]

No — I mean, I did sit back for a while. At one point there were a few of us that all had our eyes looking upwards at the ceiling. But yeah, this has been fantastic. The two things out of this that have really been Eureka moments — hence why I was sitting back and listening — is, I think, implicitly in this conversation there's a delineation that we've all been talking about but not explicitly naming. At least, it's a Eureka moment for me; maybe other folks will disagree. The first observation is that I think there's a delineation between array languages and Iversonian languages, and in my head it's a Venn diagram where array languages is a superset of Iversonian languages. And, at least for me, the school of thought and the way that it changes the way that you think goes inside the Iversonian... well, actually, yeah, it goes inside the Iversonian. Well, actually, no. Does this Venn diagram work? Maybe it's not a superset. I've got to slow down — because if it's a superset, all the array languages would contain the Iversonian languages, right?

01:04:33 [ML]

So you've got languages that are array languages but not Iversonian ones in your model, right? Something like NumPy — which is not a language.

01:04:41 [CH]

Correct, yes. Yeah, like NumPy, Julia.

01:04:45 [AB]

That's pretty explicitly based on the APL stuff, right? The authors of NumPy refer to APL.

01:04:54 [ML]

I think a lot of it comes out of MATLAB, which is not as APL-influenced as you would think. NumPy definitely has APL influences, but APL's not the primary influence, as far as I can tell.

01:05:08 [BT]

And Conor is now busily googling Venn diagrams [33] to.

01:05:11 [CH]

See, what — well, now I've opened mspaint.app, which is like a Microsoft Windows Paint application, Windows 97 style. Because I clearly can't visualize this in my head. So, yeah: the outer one is Iversonian languages — and for me... I mean, I'm not sharing my screen, unfortunately. Actually, I could share my screen and do this on my other computer.

01:05:49 [ML]

Now he just has to draw it up.

01:05:51 [BT]

We're seeing a circle in a circle. Well, it's not a circle, is it? No, no. OK.

01:05:55 [CH]

It's not a circle, it's an ellipse.

01:05:57 [ML]

2nd ellipse.

01:06:00 [CH]

All right, so now we'll grab — we've got an inscribed circle inside another one. Iversonian... and then — and this is where, you know, maybe I need to think a little bit harder. Where's my paintbrush to circle this Iversonian?

01:06:16 [AB]

Or maybe it's the area outside that's right.

01:06:18 [CH]

I've got this completely backwards. Let's do this again. Again, I've Iversonian languages is the inner circle.

01:06:31 [AB]

And so.

01:06:32 [CH]

I think that's what I said at first, but then somehow I confused myself. And then array languages is the outer circle and inside is APL.

01:06:41 [AB]

Oh, so then the title here refers to the circle —

01:06:44 [CH]

— inside it. And then definitely on the outside is NumPy, Julia, R. So then the question becomes: is Futhark...? I would put Futhark —

01:06:59 [ML]

Outside, yeah, Futhark [34] is really interesting to talk about 'cause.

01:07:04 [CH]

And I would also put Single Assignment C. And the reason — so this brings me to my second Eureka moment. The first Eureka moment is that there's the delineation between Iversonian languages and array languages, and I think the things inside Iversonian languages that differentiate them from array languages are the school of thought and the way it makes you think, that Bob was referring to. Not having that, in my head, doesn't make you ineligible to qualify as an array language, but I do think it's essential to what Ken Iverson preached with his languages. And I also think combinators are a massive part, you know? And the thing is — I know Dyalog APL ended up there, but, you know — and so actually, maybe there's actually a different circle here.

01:07:57 [AB]

What Conor has drawn now is a giant outer group called array languages, where he's put Futhark and Single-assignment C outside of it. Things like NumPy, Julia, R, and I suppose also MATLAB and Mathematica and so on, would be inside that. Inside that we've got Iversonian languages: APL, J, K, I suppose.

01:08:17 [AB]

APL\360. Now Conor has added an overlap with the Iversonian languages that he's called combinator languages [35], but the way it's drawn now, it's entirely inside array languages, and that can't be right, right?

01:08:34 [CH]

Oh yeah, I see where you're going with this. You're correct.

01:08:37 [AB]

You're correct, yeah, it has to overlap, but go all the way to the outside, yeah.

01:08:41 [CH]

I get it. I see where you're going.

01:08:42 [AB]

Maybe Microsoft Paint is not the right tool for this, see? Come on.

01:08:48 [CH]

I will not, I will not entertain that.

01:08:53 [AB]

Real programmers use notepad and real presenters use Microsoft Paint.

01:09:01 [CH]

That's all right. I mean, that's the problem with Paint: I have a single stack of backspace undo, and so I have to go back 17 steps to fix this. Combinator languages. Admittedly, I should have done this in PowerPoint, now that I...

01:09:17 [AB]

Are there any combinator languages that are array languages but not Iversonian?

01:09:21 [CH]

I don't think so. So yeah, there's maybe even a nicer version of this, but we're good with this for now. And I might put... where does...

01:09:31 [AB]

Where does Cosy fit in here?

01:09:33 [CH]

I don't know enough about Cosy [36].

01:09:36 [CH]

I'm going to put Haskell down here. And so those are actually the ones that we haven't listed yet. So q, I don't know enough about q, but I don't think it has rich support for combinators. Do they have the...?

01:09:53 [AB]

q and

01:09:54 [CH]

k? Neither does k, so...

01:09:56 [AB]

So I'd like to present two exhibits here. One is that in k it is fairly easy, because k has functions as first-class citizens, to write your own combinators. People generally don't, but there's nothing preventing you. And another exhibit would be APL2, which does have operators, user-defined operators. It doesn't have very many built-in ones, but it's very simple in the field to define your own combinators so that you have all the combinators that you want. Whereas in APL\360, and even APL+, I think, you cannot define your own operators and therefore you cannot create combinators. So when I try to make a distinction between the ability to make combinators and coming with combinators out of the box, you want to distinguish between those.

01:10:54 [ML]

Yeah, and I think what's built in should be the focus 'cause.

01:10:58 [CH]

And even beyond what's built in: this is like a conversation I had once with a bunch of C++ folks, a few of whom were sort of functional programmers, and it was about what makes a language functional. Honestly, this whole podcast conversation is similar to the "what is a functional programming language" discussion; this is the equivalent for "what is an array language." And I wish more podcasts did that; LambdaCast, I think, had a kind of episode on it, but not really a free-form discussion. Because that's the thing: at one point a couple of folks were saying, you know, "I disagree, but I don't want to invalidate that." I think when it comes to this, everyone is allowed, to a certain extent, their own opinion of what they think a functional language is, or an array language. I think it's just a super interesting discussion: what are the criteria that people expect? For me personally, when I hear Bob talking about the way you think, I am completely in love with that idea of the quote-unquote array languages, but I have sort of realized that I also think other languages are really array languages, languages that don't do that but still have a lot of the same properties, a superset or a subset of the properties that I would look for in sort of the quote-unquote Iversonian languages. And I think when it comes to languages like Dyalog APL, J, and BQN, it's very, very idiomatic to make use of the combinators because they're built in. Also, in Haskell you don't have a full set of combinators, but it's also very idiomatic to use, you know, flip, which is the C combinator, and join, which is the W combinator. And there's something called ap, or the applicative functor, which doubles as the S combinator. These things are super, super common in Haskell, even though technically they don't come with the language.
There's only one that comes with the language, which I believe is the B combinator, which is the dot operator. So I think it's less "does it come as a..."
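As an illustrative aside (not code from the episode), here is a rough sketch of the combinators Conor names, written as plain Python higher-order functions. The B/C/W/S names follow the combinatory-logic convention, and the comments note the Haskell counterparts mentioned above:

```python
def B(f, g):
    """B combinator: composition, Haskell's (.)"""
    return lambda x: f(g(x))

def C(f):
    """C combinator: swap the two arguments, Haskell's flip"""
    return lambda x, y: f(y, x)

def W(f):
    """W combinator: duplicate the argument, join on the function monad"""
    return lambda x: f(x, x)

def S(f, g):
    """S combinator: ap / <*> on the function applicative"""
    return lambda x: f(x, g(x))

inc = lambda x: x + 1
add = lambda x, y: x + y

print(B(inc, inc)(1))                  # inc(inc(1)) -> 3
print(C(lambda x, y: x - y)(1, 10))    # 10 - 1 -> 9
print(W(add)(5))                       # 5 + 5 -> 10
print(S(add, inc)(3))                  # 3 + inc(3) -> 7
```

Defining these takes only a few lines in any language with first-class functions; the question raised in the discussion is whether using them is idiomatic.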

01:12:49 [ML]

Hold on, if you install Haskell, then all you have to do is write an import statement in your program and you get it, right?

01:12:55 [CH]

Correct, but I.

01:12:57 [ML]

So it comes with the language, but it's not...

01:12:59 [CH]

It's in a library. But I think it's less important whether it's in the language versus in the library. The question is what is considered idiomatic usage: what does the average, you know, Haskell or BQN or APL programmer do? Because that's the thing: when I was having this conversation with the C++ devs talking about what is functional programming, that was someone's response. Like, technically, where does Swift fall? Or what about languages like F# [37] that are both object-oriented and functional? And even, for that matter, Haskell is a pure functional language, but there are corners of the language where you can do a bunch of unsafe stuff and start writing code like C. And his response was that because the majority of Haskell programmers program in a very sort of pure functional way, it's considered a pure functional language; but if the norm was for everyone to be doing, like, low-level C stuff and pointer math and whatnot, it wouldn't have the same reputation. Anyways, that's one take; whether it's correct or not is a different conversation. I'll stop talking, though, for a second. Do people have comments about anything that I've said in the last couple minutes?

01:14:03 [BT]

It sounds to me like what you're talking about, though, is the paradigms of the languages, right? There's an array programming paradigm, there's a Lisp-type programming paradigm, there's an object-oriented paradigm. Whenever you're talking about paradigms, you're talking about ideas that might not be as strongly represented in the other languages; the concepts could be there, it's just that that's not primarily what that paradigm addresses or works with. And to me it's interesting, like with combinators: you see them across different paradigms. You get functional programming like Haskell using combinators similar to the combinators used in APL or J or BQN. It's almost like combinators are something that could be considered cross-paradigm, but actually maybe they are a different paradigm as well, as opposed to, you know, the lambda calculus, which is again a different paradigm for looking at operations.

01:15:11 [AB]

Obviously a programming language needs some kind of sequential storage, and pretty much every programming language has arrays of some sort, or lists of some sort, to store data. But many languages don't really have any facilities to directly work on their arrays; some don't even have any, right? You have to loop over arrays to add them up. Having arrays is definitely not an indicator of being an array language; every language has arrays of some sort. The question is this: a lot of languages have some facilities for dealing with arrays, if nothing else reversal or things like that. If you take away all of those abilities, those languages will largely remain functional, functional in the sense that you can still use them; you'd have to write some things differently, but you could continue to use them. But if you take away the ability of a language to deal with arrays directly and that causes the language to entirely fall apart, like you can't do anything anymore, then I would say it's a proper array language, possibly even an Iversonian-type thing. And I would say that's the case for all of these APL-type languages and q and k. If I remove the ability to operate on entire arrays in q or k, you can't do anything anymore.

01:16:41 [CH]

Alright, we're at the, like, hour-and-twenty-minute mark here and we've got to start to wrap this up. So we will come back to this. This episode will be titled... you know, actually, we'll figure out the title offline, and there will be a Part 2 of this that we'll call "Iversonian Languages versus Array Languages." Maybe that'll be it, full stop, or maybe we'll do "versus Combinator Languages versus Functional" and get all the SEO optimization. But before we end, I think we should talk very quickly about Futhark and q and k. I think q and k will be quicker. Where do people think q and k belong? I think it's outside the combinator languages circle. The question is: is it inside the Iversonian languages circle?

01:17:25 [ML]

Well, I think it's actually more obviously an Iversonian language than it is an array language, because all these, NumPy and R, are definitely pretty focused on multidimensional arrays. Obviously APL, J, and so on are. And k and q, along with Julia, are mainly focused on one-dimensional arrays, I think. Like Julia: if you just ask for an array, you're going to get a one-dimensional one, right? That's the default. So I might even say that, yeah, q and k are Iversonian, which is the category that I've been calling array languages, but they don't fit into this other category, you know, depending on how strict you are about the boundaries. Certainly you can program as though you're using multidimensional arrays, you can think of them as multidimensional arrays, but they don't work in quite the same way as APL and J arrays.

01:18:25 [CH]

Should we quickly hear from our resident q and k expert? Stephen, do you want to agree or disagree with q and k being placed in the Iversonian languages?

01:18:39 [ST]

Yeah, absolutely. Solid core, Iversonian.

01:18:43 [CH]

All right. Well, we'll do 60 seconds from Marshall Lochbaum on his thoughts on Futhark and whether it lives outside of this. For folks that are watching this on YouTube, you'll see that it's living outside all the circles right now. Go ahead, Marshall.

01:18:59 [ML]

Yeah, so Futhark does call itself an array language, but from what I see from reading, this is much more about how it's implemented than what the programmer is exposed to. But it's interesting, because these levels have a lot of interface between each other. On the programmer side, Futhark looks like kind of a restricted ML-family language. It's a lot like Haskell or OCaml or all the other stuff in the ML family, but in order to make sure it runs on GPUs, you have a lot of restrictions on what your types can be, and a lot of things have to be resolvable at compile time. So when you're programming, you end up thinking about, you know, how are all my arrays laid out? You have to say what type this is, and the type has to fit into an array model, basically. So in one sense it's just like a sort of Haskell that's been cut down to fit into an array model, and in another it's like array programming that you express through Haskell. I don't know quite what to make of that.

01:20:16 [CH]

Yeah, I mean, I didn't even really get to talk about my second Eureka point. The first one was the delineation, and we're going to do a hard stop in a couple minutes here because, yeah, we've definitely gone over, but like I said, we'll promise a Part 2 of this, which we'll call something similar. The preview of the second Eureka point is that, you know, Stephen mentioned implicit iteration. We've talked about rank polymorphism [39] and just not having to be explicit about your indices, and I realize that I have semantically attached rank polymorphism to the rank polymorphism model inside of J, BQN and Dyalog APL, but I think really rank polymorphism is just an umbrella term for any way to deal with rank polymorphically. It doesn't necessarily need to be that kind. And the secondary thought to that is that, you know, a lot of these are leading axis rank polymorphism [38], so you could acronymise that to LARP, but there's also trailing axis rank polymorphism. And then there's also the rank polymorphism of Julia and NumPy, where the default of your operations is just to reduce everything to a single number, so when you sum a matrix you just get back a single number. I don't know what you call that; in my head I've been calling it axis agnostic, but it's really only axis agnostic by default. And then, you know, I think I said something in a past episode where I said, oh, NumPy's reverse function, flip, doesn't even work correctly. That was wrong. I was actually just calling it with two different axes, axis equal to 1 and axis equal to 0. Axis equal to None is the default; that sums everything in a matrix, so that would be the equivalent of raveling something and then summing it up in our Iversonian languages, which gives a scalar.
So if you sum up a matrix in NumPy or Julia, the default is that it gives you back a single scalar, a rank-0 array, which is the equivalent of raveling or razing something in our languages and then summing it up.

And if you then do axis equals 0, you get column-wise summation. And, I think I switched, I was going from reversing to summing, but yeah: reversing will reverse column-wise and summing will sum column-wise with axis equal to 0, and then axis equal to 1 is the row-wise one. And I think in a past episode I said I tried to do axis equal to 2 and it doesn't work, and they just have a hard-coded, you know, flipud for up-down. So even though that is a hard-coded thing, they do actually have the general concept of rank; it's just backwards. Anyways, I'm going to stop talking. We'll talk more about the fact that I think rank polymorphism is this umbrella term for having any facility in your language for dealing with rank such that you basically don't have to explicitly map, which is what Futhark makes you do, and which is why I have Futhark sort of outside of the array languages. I was very surprised that when I got to Futhark, in order to sum up your lists of lists (because they represent their matrices as just lists of lists) you end up having to map like you do in Haskell or any sort of functional language. Any last final thoughts before we wrap this up? We're close to this being our longest episode, after I said we were going to keep this short, but then I got carried away with my Venn diagram. It's all my fault.
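For listeners following along in text, the NumPy behavior described above can be sketched like this. It is a minimal illustration added for the transcript; the APL comparisons in the comments are approximate:

```python
import numpy as np

m = np.array([[1, 2, 3],
              [4, 5, 6]])

# Default (axis=None): reduce over every axis, the equivalent of
# raveling the matrix and then summing, down to a single scalar.
total = np.sum(m)          # 21

# axis=0 collapses the first axis: column-wise sums.
cols = np.sum(m, axis=0)   # array([5, 7, 9])

# axis=1 collapses the second axis: row-wise sums.
rows = np.sum(m, axis=1)   # array([ 6, 15])

# flipud ("up-down") is just a hard-coded spelling of the general form:
assert (np.flipud(m) == np.flip(m, axis=0)).all()

# Without rank polymorphism, e.g. plain lists of lists as in Futhark
# or Haskell, summing each row requires an explicit map:
rows_mapped = [sum(row) for row in [[1, 2, 3], [4, 5, 6]]]  # [6, 15]
```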

01:23:33 [BT]

And I'm not sure what we say to those people who might only be listening to this on their podcast app during their walk or something, because that's how I'm going to listen to it. If you get a chance: I mean, the diagrams show things fairly well, but having watched them being created, I don't think there's any way that you would be able to follow this on audio. We'll put a link up to the YouTube video of the diagrams actually being created, and honestly that's fairly entertaining as well. And even if you can't watch the video, we can certainly save this image and link to just that, say, if your bandwidth can't handle the video.

01:24:20 [CH]

Oh, I'm about to tweet this. I'm about to tweet this out in like 3 seconds.

01:24:23 [AB]

We'll link to the tweet. [40]

01:24:25 [CH]

And I plan on creating a polished version of this, and yeah, definitely look forward to the Part 2 of this, because we haven't really even talked about all the languages, Single-assignment C, and there were some topics I was hoping to get to, like, you know, representing Booleans as ones and zeros, and some small things like that. There's a bunch of things that I think we could still continue to talk about in this sort of domain. "I just think we should all be listed as collaborators on your dissertation when it gets published, if it ever gets published." Yes and no. I'm not doing a PhD currently; that was a joke from Bob, but if I ever do, I'll put this in.

01:25:08 [BT]

It'll probably be on this, yeah?

01:25:13 [CH]

Alright, any last thoughts, comments? I guess I said that a second ago. Folks can contact us at arraycast.

01:25:19 [BT]

Contact at arraycast dot com. [41] Show notes will be up, and there will be a YouTube video that people can take a look at to follow this along, or, as Adám pointed out, we'll put a link up to the actual still graphic as it is now, because it's sort of complete and is interesting.

01:25:39 [CH]

Well, I'd say it's probably complete by some definition, but I'm sure once I post this, people are going to say "how come you didn't include language X," and sure, that'll be good for people commenting and whatnot. But yeah, I think it'll be interesting as a reference going forward too, because hopefully we can get some folks to talk about languages that aren't really represented, like Nial, and even languages that are array languages but outside of Iversonian, because I think we've definitely been focusing on the Iversonian languages rather than the broader picture of array languages. So it'd be cool to talk to folks on, you know, Julia, NumPy and R and stuff like that.

01:26:18 [AB]

I'd like to just mention that the APL Wiki has a couple of family trees of array languages [42]. They're not really family trees as much as diagrams showing that there's some influence going from here to there. If you like this kind of content, you might like that.

01:26:34 [CH]

Yeah, and while we're mentioning links, Marshall himself also has, on the BQN site (I think we've mentioned this before on the podcast, before you were a regular panelist), I think it's the functional programming post, but it has a Venn diagram very similar to this one, of functional languages and array languages and how BQN fits into it. It also mentions a ton of other ones; it's a super interesting read if you like this kind of programming-paradigm stuff [43]. Alright, with that we will say: Happy Array Programming.

01:27:10 [ALL]

Happy array programming.

01:27:12 [MUSIC]