Transcript

Transcript prepared by Bob Therriault and Adám Brudzewsky
[ ] reference numbers refer to Show Notes

**This is part two of a discussion that began on Episode 36 of the ArrayCast.**

00:00:00 [Conor Hoekstra]

So you want to throw out Nial and q.

00:00:03 [Marshall Lochbaum]

I don't know if I can throw out Nial

00:00:05 [CH]

Bouncer Marshall, just standing at the front of the party, and not only not letting people in, he's going in and kicking people out.

00:00:12 [MUSIC]

00:00:54 [CH]

Welcome to another episode of ArrayCast. I'm your host Conor and today with me I have four different panelists. We're going to go around and do brief introductions before we hop into a couple announcements and then our topic for today, so we'll start with Bob then go to Steve and then go to Adam, then go to Marshall.

00:01:10 [Bob Therriault]

I'm Bob Therriault and I am a J enthusiast. I'm working like crazy on the J wiki, and in the next couple of weeks I may put something in the show notes. [01] In the next couple of weeks I may have something to show people, because it's coming along. It's kind of interesting and I hope it works out.

00:01:27 [Stephen Taylor]

I'm Stephen Taylor. I'm an APL and q programmer.

00:01:30 [Adám Brudzewsky]

Adám Brudzewsky, full-time APL programmer and teacher, and lots of stuff.

00:01:35 [ML]

I'm Marshall Lochbaum. I'm a former J programmer and Dyalog developer and now I'm the BQN creator.

00:01:42 [CH]

And as mentioned before, my name is Conor. I am a C++ software developer and a research scientist at NVIDIA, but in my free time I spend basically all of my time learning and researching array languages, and I guess I do that a little bit at work now as well too. So for our announcements, I believe we have four of them. We'll go to Adam first who's got 2 announcements, one which is very exciting. I guess both are exciting, but one is especially exciting and I'm excited for the listeners to hear about it. And then we'll go to Stephen for one more and then Marshall for a final one.

00:02:11 [AB]

OK, so by the time this comes out, I'm 99% sure that the first recordings from the Dyalog '22 user meeting will be out. [02] I'm looking forward to seeing those myself, as I couldn't make it there, and we'll include links, of course. And then I'm starting an APL podcast together with Richard Park, and we've just been recording the first episodes. It should be available by the time this comes out as well, and it should be available both in video and, eventually at least, in audio form as well.

00:02:46 [BT]

Well, as executive producer on this podcast, I want to know what two of my panelists are plotting here.

00:02:55 [CH]

A rival podcast? Actually, that's true. How much of a rival podcast can it be if it's the same people half of the time, at least? I guess you and Rich aren't on at the same time. So we'll call you our rival podcast, but really, it's not like a true rivalry.

00:03:10 [BT]

It's like a kid brother.

00:03:12 [AB]

No, I'm envisioning it as very different from this podcast. I mean, it's very much inspired by what Conor and Bryce are doing on the ADSP podcast: just Richard and I getting together, discussing either just some ideas or responding to something that somebody has written or said or done or whatever. So I don't think there will be much of an overlap there.

00:03:37 [ML]

Well, maybe if you develop some multiple personality thing and start disagreeing with what you've said on the other podcast, then we can get the rivalry going.

00:03:45 [AB]

Anything I say on this podcast should not be taken as my opinion on that other podcast, so go and listen there.

00:03:51 [ML]

Shouldn't it be the opposite of your opinion on the other podcast? Do you think the other thing?

00:03:56 [BT]

The anti-Adam would be pretty cool.

00:03:59 [ML]

Hi, I'm anti-Adam. I work for anti-Dyalog, where I teach anti-APL.

00:04:03 [AB]

That goes left to right, right? Yes.

00:04:07 [CH]

So when is the first?

00:04:09 [AB]

When is it coming out? I mean, I suppose it will be available online by the time we release this episode. We've already done the video and audio editing, so we just need to upload it and make a site and whatever.

00:04:20 [CH]

So I'd say pause this podcast right now, but I'm not going to say that. Finish listening to this episode and then go and listen to Adam's. Are we allowed to... are you allowed to drop the name of the podcast?

00:04:33 [AB]

Yeah, it's not a secret. Can you guess it?

00:04:36 [CH]

APLCast if it's not a secret.

00:04:42 [BT]

It's not a secret, you guess it.

00:04:45 [AB]

We want to go all the way. We call it by a nine-letter abbreviation or initialism.

00:04:50 [CH]

Now, this sounds confusing.

00:04:52 [AB]

APL Notation As A Tool of Thought.

00:04:55 [ML]

Yeah, yeah, all right, I was getting there.

00:04:59 [CH]

What that seems like Apple, oh, not, uh, not or something what?

00:05:02 [AB]

Well, who says you can pronounce the initialism? I didn't.

00:05:05 [CH]

I thought we were playing a game here.

00:05:08 [ML]

Well, I was supposed to figure out how to.

00:05:10 [AB]

I guess you could call us, the hosts... you could call us the Apple Tots if you want.

00:05:15 [CH]

All right, this sounds cute. OK, well, we will definitely have links in the description for that. And yeah, definitely excited; I always love podcasts, I was saying that before when Adam mentioned it. Pretty excited, and I will definitely be a listener. And yeah, we'll try and get some kind of rivalry going on, and this is just the beginning. I'm sure there's a listener out there now thinking: what, there's now an array language podcast and an APL podcast, so soon there's going to be a q podcast, a BQN podcast, a J podcast. I can't wait; the array language cinematic universe of podcasts is coming and it's going to be great. All right, we've got two more announcements. We go to Stephen and then to Marshall.

00:05:55 [ST]

All right, this is for everybody who's trying to learn or improve at q by themselves. There's kind of a gap in the resources available for solo students, and it happened because for the first 20 years or so of its history, q was pretty much confined to teams working in Wall Street investment banks and two or three consulting firms. So if you were joining those firms, you'd get an introduction to how to use the language and then you'd kind of learn from the people around you. But if you're a solo student, you can find the initial introductions online, but there's not much in the way of study resources if you're trying to improve by yourself. This issue was raised in the Vector Dojo quite recently, so we started pulling together some puzzles and solutions, fairly simple stuff, Project Euler and whatever, but not just the problem and the answer, because nobody actually needs the solution itself, but a discussion of the solution and some alternatives. And you can find this stuff on GitHub. It's all open source; it's at qbists/studyq, and we'll have it in the show notes. [03] Please do enjoy, and please contribute more. We will welcome just raw solutions, but as said, we prefer to see discussion of how the solution developed, so you can find your way into thinking about this, because the problem that we're trying to address here is the challenge that people who are coming from C-like languages face when they're trying to develop vector solutions.

00:07:42 [CH]

Yeah, this is super useful, and it looks like there are multiple contributors to this repo. I'm looking at the directories in the repo, and there's Advent of Code, LeetCode, Rosetta, Euler. So if you have any solutions of your own or, as Stephen mentioned, articles discussing them, I'm pretty sure if you go and add links to those in this repo, that would be totally welcome, as sort of a go-to source to find different q out in the wild, which is, yeah, definitely a great resource if you're looking to learn. All right, Marshall, last announcement.

00:08:19 [ML]

All right, some news regarding the tooling for BQN, which is maybe a little early, but Dzaima, who is the creator of CBQN if you don't know, has been busy working on integrating that with a command-line REPL tool called REPLXX. [04] So currently what we tell you to do is use rlwrap, which takes any executable and just wraps it in a simple command line that works like Bash: it has history, and you can enter lines and edit things, but it doesn't have nice stuff like syntax highlighting, which a lot of people have asked for. So Dzaima has been working with this REPLXX tool, which integrates a little more closely with the language, and it gives you syntax highlighting. He's added completion for system values too, and I think names in the program as well. So currently, as I'm saying this, what you do to build it is you clone REPLXX into your CBQN repository, you check out the REPLXX branch, which is not merged into the main branch yet, and then you build with an option that says use REPLXX, and then you have your BQN executable. If you just run it, BQN has a REPL built in that you can use normally, and it has syntax highlighting and all that stuff and tab completion. So that's pretty cool. And yeah, hopefully it'll be more officially supported pretty soon, so you'll have it in the main branch and you'll just build; you'll say, I want to build with REPLXX, and it'll do all the work for you, figure out how to get you running with REPLXX, and download the code for you.

00:10:09 [CH]

That sounds super neat. I'm trying to think of the REPLs that I know of that have syntax highlighting, and honestly, the only one that comes to mind is IPython, [06] which is a super nice interface for anyone that's used it before, and I do know a few, like the Haskell...

00:10:27 [ML]

Doesn't Node do it?

00:10:28 [CH]

I don't do a lot of JavaScript programming, so I do not know.

00:10:31 [ML]

Yeah, I know Node has the intermediate results, like it shows results as you type, which is pretty good. I don't remember if it does highlighting, actually.

00:10:41 [CH]

Yeah, I know the Haskell interpreter does error highlighting, like it will sort of highlight things in red, but it definitely doesn't have syntax highlighting throughout, which is really nice. So yeah, definitely, if you're using CBQN locally, this sounds like something to check out. All right, four announcements out of the way, which brings us... I mean, if you're listening to this episode, you probably read the title, which means you've been eagerly awaiting this conversation, which is part two of our "what makes a programming language an array language," which I believe we're going to title "Iversonian languages versus array languages." So if you haven't listened to part one of this conversation, it was two episodes ago; I believe we'll put a link to that at the top of our show notes. [00] We will assume that most folks, or all the folks, have listened to it already, but I would definitely go and listen to that one first, because it's going to probably inform a lot of our discussion today. And I guess to recap briefly: at the end of that episode, we had screen shared and I had made a small little Venn diagram, [06] of which I then went and made a nicer one in PowerPoint and tweeted it out, and since then I have added a bunch more things, which I think, hopefully, we'll try not to refer to a ton in detail and spend the next 60 minutes talking about. I think what we do want to talk about, though, is diving deeper down into the enumeration or delineation of things that make you fall into the small green Iversonian circle in this Venn diagram, versus the larger superset circle, which is a red circle called array languages. Inside the Iversonian languages are the languages J, APL\360 (that's the one we use the green apple logo for, in this case), Dyalog APL, BQN, and Nial, or Nial? I think I'm pronouncing that correctly.
Then Remora, which is a research language that was done by Justin Slepak, and then the KX logo, which stands for k and q. And then outside of that circle and inside the array languages there's R, NumPy, Julia, MATLAB, Fortran, Single Assignment C, and Dex. And so I think before I...

00:13:01 [AB]

I'll claim that the Wolfram language or Mathematica belongs in that circle as well.

00:13:05 [CH]

Yeah, I've been meaning to change that, 'cause Adam's pointed it out to me, I think twice now at least, that they've got, yeah...

00:13:12 [AB]

Third time. But who's counting?

00:13:15 [CH]

Rank-aware functions that conform to leading axis theory, so it should be in the red circle. But I spent so much time, 'cause this Venn diagram got so complicated, that I've just been kind of deferring doing that. But yeah, Wolfram should potentially be in there, and even Futhark should potentially be in the red circle.

00:13:35 [ML]

Yeah, well, so following the Futhark episode, we do need another circle for which ones we've discussed on ArrayCast. Troels says that makes them into array languages.

00:13:48 [CH]

Yeah, we should. I don't know if it needs a circle, 'cause that'll get too complicated, but we should put like a little emoji happy face or something up in the right corner. But yeah, the last thing I'll say before I hand it off to the panelists, 'cause I think that was one of the things when we started talking about this two episodes ago, is that this isn't necessarily a list of things... or maybe we can discuss what we really want the definition of this criteria to be. It's things that we are expecting the languages in the green circle to have. But, actually, I didn't think about how to word this. The thing is that just because you have it doesn't make you an array language. That's what I'm trying to articulate. Yes, so the thing we started talking about two episodes ago was a literal syntax for rank-1 arrays, AKA vectors, and then somebody pointed out, I can't remember who exactly, that C and C++ technically have a literal syntax for arrays, but that doesn't make them array languages. So these are a list of things that we are expecting array languages to have, and just because a language out in the wild has this one particular thing from the list doesn't make it an array language. So yeah, it's a list of things. Two lists of things, I guess: things for array languages and things for Iversonian languages. All right, I'm done talking, who wants to go first? All right, Adam.

00:15:26 [AB]

I'll actually maybe add another language to the list here, but it might be controversial; you'll fall off your chairs here. Texas Instruments BASIC, or TI-BASIC. [07]

00:15:39 [ML]

Not the way I used it.

00:15:40 [CH]

I've seen on the Wikipedia page that they do list it as an array language, but...

00:15:45 [AB]

They do? Oh really, OK, I didn't realize that. Because if you look at, say, the documentation for multiplication, then it will say value times value, where a value, I guess, is a number, or list times value, or list times list, and I think also value times list will work. So it does that kind of thing automatically, with no extra syntax at all.

00:16:07 [AB]

It just auto-maps, and I certainly have used it as a poor man's array language.
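For listeners following along, the auto-mapping Adám describes (value times list, list times list) is the same idea NumPy calls broadcasting. A minimal sketch, not from the episode:

```python
import numpy as np

# Arithmetic auto-maps over arrays with no extra syntax, just like
# TI-BASIC's value*list and list*list forms.
xs = np.array([1, 2, 3])
print(2 * xs)                     # value times list → [2 4 6]
print(xs * np.array([4, 5, 6]))   # list times list  → [ 4 10 18]
```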

00:16:13 [ML]

When I used it, I didn't know what an array was. I didn't know what a for loop was. I thought you had variables for every letter and that was all you got, because it has no multi-letter variables. And I wrote my code with gotos, using the variables, and when I needed to store a lot of data, I would store it mod base 10, packed into one variable.

00:16:38 [AB]

I actually have a fun anecdote of using TI-BASIC as an array language. When I was in high school (gymnasium, if you want), I had a philosophy course, and one of the things we had there was formal logic, and then we had tests on that. And I asked if I could take my TI graphing calculator with me on the test, because at the time, I don't know how it is today, in certain subjects we were allowed to take calculators for certain tests, in other subjects not, and nobody could find anywhere where it said anything about bringing calculators along to your philosophy tests. So they seemed to be OK with it, as long as I cleared the memory of it in advance, so it had no stored stuff, information you'd need to know by heart. So I did that, and for the formal logic part of the test we were given some crazy long statement about Flash Gordon or something, like "it was raining," I don't know what, and we were supposed to evaluate whether it could possibly be true or not. Instead of trying to figure this out by hand, which would be hard, I just looked at how many variables there were and then created lists of two to the power of N with all the different possibilities, as a binary counter thing. And then, unfortunately, only normal arithmetic works with these array things in TI-BASIC, but you can just define "and" in terms of multiplication, which is just multiplication, and you can define "or" as not-and of the nots, right? Sorry, whatever.

00:18:18 [ML]

You can actually do it from multiply, minus, and add too, I think, yeah?

00:18:29 [AB]

Well, there are many ways you can define it, and then "not" is defined as 1 minus the other value, and that all works with this array stuff. And so I wrote it all up like that as one giant formula, pressed enter, got the answer, and went on to the next question. Saved me a lot of time on that test. So you can definitely do array programming, that is, with it.
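Adám's trick translates directly to any array system. A hedged NumPy sketch (the formula and variable count are made up for illustration): enumerate all 2**n truth assignments as 0/1 arrays via a binary counter, then build the connectives out of plain arithmetic, which auto-maps over the arrays.

```python
import numpy as np

n = 3                                   # number of propositional variables
rows = np.arange(2 ** n)                # binary counter: 0 .. 2**n - 1
p, q, r = (rows >> np.arange(n)[:, None]) & 1   # one 0/1 row per variable

AND = lambda a, b: a * b                # "and" is just multiplication
NOT = lambda a: 1 - a                   # "not" is 1 minus the value
OR  = lambda a, b: NOT(AND(NOT(a), NOT(b)))     # "or" via De Morgan

# Is (p or q) and not r satisfiable? Any 1 in the result says yes.
result = AND(OR(p, q), NOT(r))
print(result.max() == 1)                # → True
```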

00:18:42 [CH]

Yep, all right. Well, I mean, I think the moral of that story is, if you can bring your graphing calculator to... what kind of course was it called, again?

00:18:50 [AB]

It's philosophy?

00:18:52 [CH]

I mean, I did take a philosophy course in university, but I don't think I had my graphing calculator then. But the point is, if you're in first-year university and you have a logic philosophy course coming up, bring your graphing calculator and learn array programming, and you can save yourself time, hopefully, as long as they don't ask you to show your work, in which case you'd probably need a more complicated program. All right, back to the lists of things. I'll start off, seeing as I'm going to say this at some point, so might as well say it now. I've been doing a lot of studying of all the languages, and I think one of the lowest criteria, and it actually separates quite a few of the languages, is that the language has to have a scan. [08] Then you'd think, well, don't all languages have scans? Wrong answer. MATLAB does not have a scan; it doesn't even have, I think, a generic reduce. MATLAB does technically have the ability to pass functions around, I think with an at symbol, but it's not really that ergonomic, but there are...

00:20:01 [ML]

Yeah, but what do you mean by generic reduce 'cause?

00:20:03 [CH]

A reduce that can take a binary operation.

00:20:05 [AB]

Any, or just arithmetic ones.

00:20:09 [CH]

I mean, I will say, 'cause APL\360 only took the built-in ones, yeah, so anything that can do the basic set of min, max, etc., multiply, divide. Ideally I'd like to see support for custom ones, but I'm not even talking about... so I guess that was the answer to what's a generic reduce, but just the ability to reduce and scan. And I think that already rules out several languages; so MATLAB, like I said, doesn't have a scan. I'll probably give a talk at one point called this, but I think an even higher bar is to be able to do a minus scan.
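A minimal sketch of the "generic reduce and scan" Conor is asking for, in Python rather than an array language: both take an arbitrary binary operation, not just hard-coded sum/min/max. Note that Python's versions fold left to right, which matches q and BQN rather than APL's right-to-left reduction discussed later in the episode.

```python
from functools import reduce
from itertools import accumulate
import operator

xs = [10, 3, 2, 1]
print(reduce(operator.sub, xs))            # minus reduce: ((10-3)-2)-1 → 4
print(list(accumulate(xs, operator.sub)))  # minus scan: [10, 7, 5, 4]
```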

00:20:50 [ML]

Yeah, well, see, I'd say really, I would think it's still an array language even if it only has hard-coded plus scan, min scan, and max scan, 'cause I think that's about all I ever use, other than doing more complicated functions using scan for control flow, but that's not really an array programming thing.

00:21:08 [AB]

Yeah, so if you have a sum and a max and a min function that just takes a list and automatically...

00:21:14 [ML]

Yeah, if you have like cumulative sum, cumulative minimum, cumulative maximum.

00:21:18 [CH]

Really? So you think the bar...

00:21:19 [ML]

I think that would be a comfortable array programming language still, uhm, yeah.

00:21:23 [CH]

Really?

00:21:24 [AB]

Right, and MATLAB has that: it has cumsum and cummin and cummax.

00:21:27 [CH]

Yeah, terrible names, but, I mean, the thing is that I have these small lists of problems that I'm trying to solve, and one of them is the maximum parentheses depth, and that uses a minus scan after it does the outer product.

00:21:43 [ML]

Uh, that's a reduce.

00:21:46 [CH]

Or sorry minus reduce.

00:21:48 [AB]

You said plus scan.

00:21:50 [CH]

Yeah, that's true, I did, but I said minus scan. But in this case I also mean minus reduce, which...

00:21:57 [AB]

But you're only using minus reduce on an axis of length 2, which is really not a real minus reduction. It's just splitting. It's basically just taking a binary operation and using it with the syntax of a single-argument operation, right? You're taking the two rows of a table and using them as the two arguments in a subtraction that's vectorized.

00:22:20 [CH]

How do you do that without a minus reduction, without having to separate your two-dimensional list into two 1-D lists and then...

00:22:28 [AB]

That's exactly how you do it.

00:22:29 [ML]

Well, you'd have to separate it, but BQN's pattern matching syntax makes that not too bad. 'Cause, you're right, you could write a function, you know, a brace, and then you'd write a bracketed list, a comma b, and then a colon, so that would destructure the argument into cells a and b, and then after the colon you'd give a minus b.

00:22:50 [AB]

But that's essentially the same as manually assigning it. Yeah, that's just using the built-in syntax.

00:22:56 [ML]

But I think that captures the entire array part of the computation, right?

00:22:56 [AB]

From the start.

00:23:00 [ML]

Like you're saying, 'cause the reduction only works on two elements, so it's not like a big array reduction; it just happens to go between the cells.

00:23:10 [BT]

And just for the listeners following along at home, what we're talking about here is, I believe, and correct me if I'm wrong about this, but essentially you set up one line which will be right parentheses and one line which will be left parentheses, and match them against those. That creates ones and zeros for right, separate from left, and then you subtract one from the other and you end up with a series of ones, minus ones, and zeros, and then you do a reduction across that to add it up.

00:23:41 [CH]

Plus scan first and then do a max reduce.

00:23:43 [BT]

Plus scan across that, and then it gives you the different numbers, which gives you your depths of parentheses.
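The idiom Bob and Conor are walking through can be sketched in NumPy (an illustration, not anyone's actual APL code): compare against each parenthesis to get two 0/1 vectors, subtract, plus-scan, then max-reduce.

```python
import numpy as np

s = np.array(list("(()(()))"))
# Ones and zeros for left parens minus ones and zeros for right parens.
changes = (s == "(").astype(int) - (s == ")").astype(int)
depths = np.cumsum(changes)   # plus scan: running nesting depth
print(depths.max())           # max reduce → 3
```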

00:23:49 [CH]

The key thing we're discussing is that if I have a 2-by-N matrix, can I combine those two rows with a minus binary operation, and Adam and Marshall are arguing, well, I mean, that's a...

00:24:07 [ML]

Yeah, well, I've also seen this written a lot of times. It's just, you know, x equals open parenthesis, minus, x equals close parenthesis, so.

00:24:13 [CH]

Right, right? Yeah, that's true.

00:24:17 [AB]

In k and q.

00:24:19 [ST]

Let me jump in here, 'cause q recognizes this distinction, and I suppose implicitly k as well. So you've got the reduction using the Over iterator. But if you just want to separate out the arguments of a function, which in APL might be dyadic, but in q can be up to eight arguments to a function, you can take a list or vector of the values, and you use the dot operator to apply a function to it. So for a function which takes five arguments, f, you can say f dot and then a list of five. So for your example of two elements, it's trivial, and it's neat to do that, and we discourage people from using reduce to do it, because reduce implies something more general, a longer list, whereas when you see the dot, the Apply operator, used, you know you're just doing it for two arguments.

00:25:35 [CH]

Yeah, that is really nice. Does that correspond to a combinator or?

00:25:42 [ML]

No, 'cause taking an array element isn't something a combinator can do.

00:25:46 [CH]

So is it, in the case where you have a 2-by-N matrix, is it taking the first column? So instead of doing, like, a column-wise minus reduce, how do you just spell that?

00:26:07 [ST]

There we use leading axis. So if you have a two-row matrix, you could pass that with dot, and the rows would be the left and right arguments of your function.

00:26:16 [CH]

Really? So minus dot is how you would spell it.

00:26:19 [ST]

I'm sorry?

00:26:21 [CH]

Or, sorry, so you're saying, because it's leading axis, if you just have a binary function and you pass the dot to it, it'll take the first row as your first argument and the second row as your second argument. So minus dot is literally how you spell subtracting the second row from the first row.

00:26:39 [ST]

Yeah, sorry, there's a nice example of it in a solution I was putting up on study q over the weekend, [09] where you set up an initial state, which is a list with two elements. The first is itself a list, and the second item is an index, which you're going to try. I think the problem was I wanted the last item in a list which passed a particular test. So my initial state is the list and the index of the last item in it, and then the function being iterated simply increments the second item of that list, or decrements, rather, 'cause we're working from the back. So I go dut, dut, dut, going from right to left along my list, and I'm using the While construct of the iterators, so the whole thing stops as soon as you find something that passes the test, and what it's going to return is the original list and the decremented index pointing to the item that passed the test. The leftmost element of the whole expression is simply dot, then in square brackets, At, which is the index At function. So it takes that list and the index, and indexes the list, and bingo, there you've got the item that was the last one to pass the test.

00:28:20 [AB]

Yeah, but you need square brackets for the dot, right?

00:28:22 [ST]

You do in that case, yes.

00:28:24 [AB]

So you can't, because I just tried in k writing plus dot 1 2, putting a space between the dot and the one so that it wouldn't parse as 0.1, and that failed. But if I write dot, open square bracket, plus, semicolon, 1 2, then I get 3.

00:28:43 [ML]

Can you put the plus in parenthesis maybe?

00:28:46 [ST]

You can use the infix syntax. You should be able to write at, space, dot, space, and it'll apply the indexing.

00:28:58 [AB]

Yeah, for indexing, but I'm not doing indexing, though.

00:29:01 [ST]

No, no, take it.

00:29:01 [AB]

So, Marshall, you said putting parentheses around the plus would work.

00:29:05 [ML]

Yeah, 'cause that should turn it from a function or from a verb into a noun.

00:29:09 [AB]

Oh yeah, that works.
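For listeners without a q interpreter handy, a rough Python analog of q's dot (Apply) as Stephen describes it: a function of n arguments applied to an n-element list, so a two-row matrix supplies the two arguments of a binary function directly, with no reduce involved. This is an illustration of the idea, not q syntax.

```python
import numpy as np
import operator

m = np.array([[5, 7, 9],
              [1, 2, 3]])
# Unpacking the rows plays the role of q's  .[-; m] : row0 - row1.
print(operator.sub(*m))   # → [4 5 6]
```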

00:29:13 [CH]

And if I recall correctly, this dot technique was used in another q solution that we discussed in an episode, I won't remember the number, where Marshall was explaining, if my memory serves correctly, how to take the range between two numbers. And you have a blog article, and I remember reading that blog article and staring at it and being confused, even having just had the conversation with you, being like, what's going on here? Because you can't do that in any of the other array languages; you have to do the trick of a two-element list and doing a reduction. And I think at some point, upon reading it and playing around with it, I figured it out. But now that we're talking about this again, it is jogging my memory that, oh yes, that's right, we discussed a solution that used this trick, which I read and learned temporarily, but then clearly it's been pushed to the back of my memory. [10] So yeah, we'll link that article as well, 'cause I think that's a really good example of a use case where it's the bread and butter thing that you want to reach for. All right, so to get back to the list of things: it sounds like, though, essentially we do not have unanimous agreement that a scan, or even a reduce, generic versions of those, are essential for being an Iversonian language, rather than just having hard-coded ones with no generic reduction in the background that you can access.

00:30:38 [ML]

Well, definitely not for being an array language. For being an Iversonian language, I think if you don't write, you know, sum as plus slash, then that's a pretty strong point against you.

00:30:49 [CH]

So that's what I was after, that's what I was talking about, is what we're talking about now. So I think we kind of talked about in the last episode what an array language was, and that's where we've got all these languages in these circles, and we've got a little, you know, asterisk; we need to decide officially, is Futhark in the group or not? That's the red circle, the array language group. But now we're trying to figure out what delineates Iversonian languages versus array languages. So this is the thing: the generic reduces and scans, and specifically the ability to do a minus reduction or a minus scan. And I think in order to get in the Iversonian club, you have to show up at the doorstep with your generic versions. And I thought that's what you were saying, is that the hard-coded ones are good enough, which I would say they are for array programming, OK, but, but...

00:31:40 [ML]

I mean, I think expressing it as an operator is much nicer, but if you're writing code that really is just dealing with big flat arrays, then plus, min, and max are really all you ever use for scans and reductions, I think. There are a few more, like you might use the product and probably some other stuff.

00:32:07 [AB]

Well, minus scan, depending on how it's defined, is useful, which is why it was defined that way. And division scan.

00:32:14 [CH]

Division scan, I mean I consider.

00:32:17 [AB]

Yeah, it's alternating series.

00:32:20 [ML]

Yeah, but also if you don't have a minus scan, you just write, like, minus one to the power of iota length, and multiply by that before you do your plus scan, so.

00:32:30 [AB]

Or raise to the power of that before you do here.

00:32:33 [ML]

Yeah, and a lot of the time you already have an exponent, so you actually just want to make the number inside the exponent negative.
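Marshall's trick for getting an alternating (minus-scan-style) sum out of only a plus scan can be sketched as follows, using NumPy as a stand-in for whichever array language lacks the minus scan:

```python
import numpy as np

xs = np.array([8.0, 4.0, 2.0, 1.0])
signs = (-1.0) ** np.arange(len(xs))   # [1, -1, 1, -1]: (-1) to the power of iota
alt = np.cumsum(signs * xs)            # 8, 8-4, 8-4+2, 8-4+2-1
print(alt)                             # → [8. 4. 6. 5.]
```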

00:32:38 [CH]

I consider scans... I mean, we should dedicate a whole episode to scans, but I consider the Dyalog APL and J scans broken, and q and BQN are the only ones that get it right, and I think Nial also adopted the APL version. Sure, it works nicely for continued fractions and, what's the other one called? The minus one column... alternating sums is what it's called, something like that.

00:33:11 [AB]

Alternating products.

00:33:12 [CH]

Sure, you get those patterns, but in one of my favorite problems, the maximum consecutive ones, it breaks because of the fact that you're doing a repeated fold, right? You need to be doing it from left to right in one pass. So yeah, it breaks my heart. I'm sure, like Marshall said, in rare cases it's what you want, but...
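The "maximum consecutive ones" problem Conor mentions is a good test case for why a genuine left-to-right scan matters: you carry a running count that resets to zero at each 0, then take the max. A sketch with Python's left-to-right `accumulate`:

```python
from itertools import accumulate

bits = [1, 1, 0, 1, 1, 1, 0, 1]
# (acc + b) * b: extend the run on a 1, reset to 0 on a 0.
runs = list(accumulate(bits, lambda acc, b: (acc + b) * b))
print(runs)        # → [1, 2, 0, 1, 2, 3, 0, 1]
print(max(runs))   # → 3
```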

00:33:39 [ML]

Yeah, well, and even for stuff like continued fractions, there's a representation that uses a matrix internally, so that you can compute all the convergents of the continued fraction with a left-to-right scan, and then you get it in linear time instead of quadratic time, right? So you have to do more math for that. But still, the ideal form is expressed in terms of a left-to-right scan. Right?
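The matrix representation Marshall alludes to is presumably the standard 2x2 continuant recurrence for convergents; here is a hedged Python sketch (helper names are mine):

```python
from fractions import Fraction
from itertools import accumulate

def matmul2(A, B):
    # 2x2 matrix product, tuples of tuples.
    (a, b), (c, d) = A
    (e, f), (g, h) = B
    return ((a * e + b * g, a * f + b * h), (c * e + d * g, c * f + d * h))

def convergents(terms):
    # Each term a contributes the matrix [[a,1],[1,0]]; a left-to-right scan
    # of matrix products yields [[h_n, h_{n-1}], [k_n, k_{n-1}]], so each
    # prefix gives one convergent h_n/k_n. One O(1) product per step: linear.
    mats = (((a, 1), (1, 0)) for a in terms)
    return [Fraction(M[0][0], M[1][0]) for M in accumulate(mats, matmul2)]

# Golden ratio, continued fraction [1; 1, 1, 1, ...]: Fibonacci ratios.
print(convergents([1, 1, 1, 1, 1]))  # [1, 2, 3/2, 5/3, 8/5]
```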

00:34:05 [AB]

But J has now that generalized fold, with lots of dots and colons and stuff, right, that you can do this with. [11]

00:34:12 [ML]

Yeah, I'm sure the right thing is in there somewhere.

00:34:14 [AB]

Yeah, you just have to figure it out. The fold is a bit confusing, but yeah, I believe it is.

00:34:20 [BT]

And the other thing I wonder about: does your algorithm break for Conor's consecutive ones if you do the reverse scan in J? 'Cause you can do it the opposite direction, is that...

00:34:31 [ML]

Well, no, it doesn't. The number of consecutive ones is the same forwards or backwards. So yeah, I think we've kind of glossed over what J does, but J doesn't actually have a scan operator, even. It has a prefixes operator [12] and a suffixes operator, and so you write the APL scan as reduce-prefixes. But you can also write reduce-suffixes, and that saves its results. So, right, the problem is that an APL reduction goes right to left, and J doesn't have a left-to-right reduction, except I guess in the fold family of functions, which are newer. So J doesn't have a left-to-right reduction, but it has both prefixes and suffixes operators, so if you do the right-to-left reduce with the suffixes operator, then it's allowed to save all the results in between, and then it can run in linear time.
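Marshall's description can be sketched as two Python functions: one that literally re-reduces every prefix (the APL scan semantics, quadratic if done naively) and one left-to-right scan that reuses each previous result (linear). The names are mine:

```python
from operator import sub

def apl_scan(f, xs):
    # APL semantics: the n-th item is a RIGHT-to-left reduction of the
    # first n items ("reduce over prefixes"); done literally, quadratic.
    def right_reduce(ys):
        acc = ys[-1]
        for y in ys[-2::-1]:
            acc = f(y, acc)
        return acc
    return [right_reduce(xs[:n]) for n in range(1, len(xs) + 1)]

def ltr_scan(f, xs):
    # Left-to-right scan: each step reuses the previous result, linear time.
    out = [xs[0]]
    for x in xs[1:]:
        out.append(f(out[-1], x))
    return out

print(apl_scan(sub, [10, 3, 2]))  # [10, 7, 9]: 10, 10-3, 10-(3-2)
print(ltr_scan(sub, [10, 3, 2]))  # [10, 7, 5]: 10, 10-3, (10-3)-2
```

For an associative function like plus the two agree; for minus they differ, which is the whole disagreement here.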

00:35:28 [CH]

Is APL reduce left to right or right to left? I always thought it was left to right.

00:35:33 [ML]

No, it's right to left.

00:35:36 [AB]

In principle. I mean, if it understands that it can do it left to right, then, yeah, whatever. Lots of footnotes there.

00:35:43 [CH]

'Cause I recall doing a test once that I thought confirmed that it was going left to right, which is why I thought it was even more confusing that their scan was a repeated right-to-left reduce. But you're saying that in general it's right to left, but there's some optimization sometimes where it ends up going left to right.

00:36:00 [ML]

Well, there's like one case in Dyalog where, if you're doing a plus scan on a simple array of floating point numbers, then it will do it left to right. And I mean, the only difference there is the way the rounding error goes. So for most lists of numbers it's a very small difference; it's just like a little precision error. But yeah, those plus scans go left to right, and everything else, I think, goes right to left.
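The rounding-order point can be demonstrated directly: a plus reduction over the same floats gives different answers left-to-right versus right-to-left, which is the only observable difference in the optimization being discussed. A small Python illustration:

```python
from functools import reduce

# 1e16 has a floating-point spacing of 2, so adding 1.0 to it is lost,
# but 1.0 + 1.0 = 2.0 survives being added to it.
xs = [1e16, 1.0, 1.0]
left = reduce(lambda a, b: a + b, xs)             # (1e16 + 1.0) + 1.0
right = reduce(lambda a, b: b + a, reversed(xs))  # 1e16 + (1.0 + 1.0)
print(left, right)  # 1e+16 1.0000000000000002e+16
```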

00:36:30 [CH]

OK, alright, we're going to step back a couple steps, 'cause we're not even close to building our list here. So where were we? I said we weren't in unanimous agreement, but then it sounded like there was confusion; people thought I was talking about array languages. I was talking about minus scan, minus reduce, and sort of generic reduce and scan, being something you need to get in the green door, the Iversonian language door.

00:36:57 [AB]

I think it's the generalization of it. It's not whether or not you can do it, or whether it has a name, but the fact that it is thought of as a single concept.

That's what makes it Iversonian. Well, I mean, we have to reckon with the fact that Ken Iverson's J decomposes scan into two concepts, and I actually went this way, yeah.

00:37:21 [AB]

Yes, but it's still a generalization, yeah?

00:37:23 [ML]

Uhm, but it's a different one. It's not the same one that's used by the original APL.

00:37:30 [BT]

That's true, but I think we have to allow that the languages have evolved, right? So what the original APL might have done has changed as the languages evolved. That doesn't mean that a language becomes non-Iversonian because it evolves.

00:37:46 [ML]

Oh, I mean they might.

00:37:48 [AB]

But Iverson notation is surely not an Iversonian language then, right? [13]

00:37:53 [ML]

Uhm, what would... I mean, it has a scan and all that.

00:37:57 [AB]

It doesn't have generalized multidimensional arrays, does it?

00:38:01 [ML]

I mean the book, uh, A Programming Language, doesn't use them, but it doesn't really preclude them, does it?

00:38:11 [AB]

It has different symbols for the length along the first and second axes of a matrix, instead of having a single symbol for the length along all axes, which means there is no definition for lengths along further axes; you can't access them in any way.

00:38:27 [ML]

So yeah, it would have to be extended to deal with rank-3 arrays. I mean, you'd just pick the next Greek letter after mu, I think.

00:38:36 [CH]

All right, so clearly my criterion is not the best one. Let's go around before even discussing it, and you've gotta think on your feet. If you had to choose one or two things: you're standing at the Iversonian language party of the century, all the Iversonian languages are invited, and the array languages want to get in, but you've got your list, and you've got a couple questions you want to ask them, and they don't get in based on these questions or criteria. Does anyone have... all right, Bob's got his hand up, and everyone can think in the background, and we'll try not to discuss until we go around, unless you can't think of one. Bob, go ahead. Sorry.

00:39:19 [BT]

Did you want me not to say what I think the distinguishing criterion is?

00:39:22 [CH]

No, we'll just say them, and then... 'cause I know what's going to happen; that's what happened last time. I said something, and now here we are 30 minutes later, and we haven't even used the criterion that I suggested as if it's on the list. So let's try and generate a list, and then we'll come back and discuss the items on the list.

00:39:39 [BT]

OK, so I would say to begin with: was your language originated by Ken Iverson?

00:39:47 [AB]

Yeah exactly.

00:39:49 [BT]

Because to me, that's... and then, on top of that, to even distinguish a bit more: is it what I would refer to as mathematical in origin? So in other words, it's based almost axiomatically; you're not trying to add on things to the language to allow it to do its operations. What you're doing is taking foundational elements and putting them together to give yourself a computer language, and to me that's what an Iversonian language does. And then the question I've got is whether that corresponds to Nial and Remora, 'cause I'm not sure too much about them; I don't know them very well.

00:40:28 [CH]

Interesting, so this is. Actually we got wait wait wait wait.

00:40:30 [ML]

Well, so that for the first point.

00:40:31 [CH]

Wait, wait, wait, we're not discussing yet. We're not discussing yet; we're going around. See, I told you it was going to happen.

00:40:37 [ML]

Well, I I was not. I was gonna discuss the parameters of our discussion so maybe we.

00:40:40 [CH]

Oh no, so the thing is, I was going to say it sounds like we've got two lists here. We've got a list of questions where, if the answer is no, you don't get to come in, but answering yes doesn't necessarily mean you get in; it just means you don't get rejected. And then there's another list, which is what Bob just added, which is basically like showing a VIP badge, you know: OK, Ken created me, I'm going in. It doesn't matter what the list says; Ken created me, I'm going through the door, and you go, OK, yeah, he's got the VIP pass. Which would mean, you know, J, APL 360, I guess Dyalog APL, even though technically it wasn't created by Ken, but it's, you know, a reimplementation of the exact same ideas. So those three get VIP badges. And does Nial count, does Remora count? [14] It's like, well, those weren't created by Ken. So yeah, we could discuss that. So I guess you can add criteria; Bob has implicitly created a new list. Feel free to add to that one as well if you think there's a criterion that gets you through the door no matter what. And Marshall, do you want to say something?

00:41:40 [ML]

Yeah, well, I'd like to propose for this discussion that we're not allowed to look at the history of the language. We have to just look at the design of the language, what it does, and say purely based on that whether it's Iversonian or not.

00:41:56 [CH]

So Marshall's invalidating Bob's?

00:41:59 [ML]

I don't think it's too interesting having a discussion to say, you know, what does it mean for something to be an Iversonian language...

00:42:05 [ML]

Well, it's Iversonian if Ken Iverson created it. I mean, that's a valid definition, but it's also not really a definition you can discuss, 'cause you just say, well, yeah, I mean, I guess we're pretty sure which languages Ken Iverson created.

00:42:20 [BT]

I did add the second criterion, which to me is the important part, where it's mathematical, that you're building from an axiomatic base. And then that raises a discussion about whether building it up that way disqualifies the earlier versions of the APLs, because they may not have gotten to that point before they evolved.

00:42:46 [CH]

All right, I will keep Bob's criterion as, like... what's that restaurant? Shake... In-N-Out? I'm thinking of Shake Shack. It's like an In-N-Out secret menu item, where you can get, like, peanut butter on your burger if you whisper the right incantation to, you know, the individual letting you into the party. He'll lift the page and say, oh, that's right, you know, you can come in, because you were created by Ken. But it's not on the first page; it's still on a secret list, though. All right, and the mathematical notation part we definitely will keep on to discuss.

Adam, Bob, Marshall... I just said Bob. Stephen, did you raise your hand too? Yeah.

00:43:26 [ST]

Implicit iteration on the primitives, that's the first. The second is a vector notation [15], as in our earlier discussion; you gotta have a way of writing vectors literally. And scan and reduce, as you were saying.

00:43:44 [CH]

And so those are all on the original list: if you can't answer yes to those three, you get turned away from the party. OK, I like that. I won't say anything, though, 'cause we'll discuss in a second. Adam and Marshall, have you got your answers ready?

00:44:00 [AB]

I would say if it's at least an attempt at a generalization of traditional mathematical notation, then it's Iversonian. Meaning, programming languages kind of trace lineages back, and the language has to be traced from... I know Marshall said we shouldn't look at the history, but I'm not talking about chronological history; we're talking about notational history.

00:44:26 [ML]

Yeah, but, you're right, you can't just say the criterion is that it's similar to this other entire school of thought. You've gotta, like, name one or two features.

00:44:35 [AB]

Yeah, but I very much see it as a generalization, especially of linear algebra, right? It's taking linear algebra and distilling it out to its very essence and generalizing it.

00:44:48 [ML]

But all right, so if you're at the party and ALGOL comes to you and says, oh yes, I was based on linear algebra, what do you do? Do you spend an hour arguing about what aspects of linear algebra are and are not represented in ALGOL, or...?

00:45:02 [AB]

No, no, we are not arguing about these criteria. I'm saying this is required for it to be Iversonian, and that's not necessarily sufficient to be Iversonian. But it just doesn't seem like an actual criterion.

00:45:12 [ML]

How do I apply that to the language ALGOL? 'Cause, I mean, if you're asking, you know, does ALGOL have reductions? I don't know, I don't think it does, but, you know, that's a yes-or-no question that's pretty easy to resolve.

00:45:24 [AB]

But does ALGOL attempt at being a generalized traditional mathematical notation? I don't think so.

00:45:32 [CH]

When you say notation generalization... like, when I hear that, I think of J, APL, and BQN, the languages that use symbols, basically, either digraphs or Unicode characters. Is that what you mean by generalized notation? Or can you have a...

00:45:50 [AB]

That helps, I would say, but I don't think it's necessarily necessary. They have symbols, but they're not necessarily glyphs; it could be names for things as well, that represent things you have in traditional mathematics but that have been generalized to do more. Whether the generalization is by collapsing multiple things into a single notational thing (for example, reduction is a generalization of the capital Pi and capital Sigma [16] and so on), or whether it is a decomposition into constituent parts for generalization, like the inner product is a generalization, and plus dot times, in APL terms, is an instance of it. And you could say mathematics generally deals with numbers and lists and sets and matrices, and here we're generalizing to any rank, or even nested stuff. So it's taking these core concepts, and I think those can be identified, and generalizing them.

Distilling them out to their pure essence and then expanding on them like that, I think, is a necessary requirement for something to be Iversonian.
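Adám's two kinds of generalization can be sketched in Python: Sigma and Pi collapse into one reduction concept, and the inner product decomposes into an f-reduction over a pairwise g, with plus-dot-times as one instance. The `inner` helper is my own illustrative name:

```python
from functools import reduce
import operator

def inner(f, g, xs, ys):
    # Iverson's decomposition: an inner product is an f-reduction over a
    # pairwise g. "Plus dot times" is just one instance: the dot product.
    return reduce(f, map(g, xs, ys))

# Capital Sigma and capital Pi are one reduction concept, two operators:
total = reduce(operator.add, [1, 2, 3, 4])  # sum -> 10
prod = reduce(operator.mul, [1, 2, 3, 4])   # product -> 24
dot = inner(operator.add, operator.mul, [1, 2, 3], [4, 5, 6])  # 32
print(total, prod, dot)
```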

00:47:17 [CH]

And I can see why Marshall is definitely pushing back a bit, 'cause, yeah, while I'm listening to that, I'm thinking of Haskell in my head. It's like, Haskell...

00:47:27 [ML]

Yeah, I think you've got some hidden circularity in that: what you think of as the pure essence is what APL thinks of as the pure essence. But, like, for example, I mean, the base function is a very compound thing. Why isn't that distilled to its pure essence?

00:47:45 [AB]

That doesn't mean there can't be add-ons. Certainly there can be more things added, but it has to have that aspect. At least, that's what I think. I mean, we can argue afterwards.

00:47:54 [CH]

Yeah, yeah, OK, so we've got Adam's generalized notation, tied in with sort of mathematical historical notation. Last but not least, Marshall.

00:48:05 [ML]

Alright, yeah I.

00:48:05 [CH]

All right!

00:48:06 [ML]

Think I've got mine together. Alright, first, I think for being an Iversonian language the syntax is very important. You have to have a syntax where your functions are prefix or infix functions [17], so they apply to one argument on the right, or an argument on the left and the right. And, I mean, maybe if you take that and flip it around, so you're going left to right, I might accept it, but even that to me is kind of questionable if you're saying it's really Iversonian. Uhm, I think probably it should use symbols for most of the functions; it should at least have, you know, some array functions that are represented with symbols, because that's really the distinction. Like, nearly every programming language has some symbol operators, but usually they're just arithmetic, so if you have array symbols as well... I would say you need at least some of those to be an Iversonian language. And then, in terms of the functionality, and this is not a complete criterion, I think outer product is pretty important for array languages: having a way that's simple to, you know, pair every element of one array with every element of another, pairwise, in addition to having the corresponding map that's just each. And scalar extension is really a kind of outer product, so that I'm not sure is even necessary.
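The outer product Marshall describes, pairing every element of one array with every element of another and keeping the array structure, might be sketched like this in Python (the `outer` name is mine):

```python
def outer(f, xs, ys):
    # Apply f to every pair (x, y), keeping the result as a nested array
    # rather than a flat set of pairs. J calls this "table"; APL writes
    # it as an outer product (jot dot f).
    return [[f(x, y) for y in ys] for x in xs]

# A multiplication table, the motivating example for J's name "table":
print(outer(lambda x, y: x * y, [1, 2, 3], [1, 2, 3]))
# [[1, 2, 3], [2, 4, 6], [3, 6, 9]]
```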

00:49:41 [AB]

Uh, yeah, that's a good one, the dual map, if you can call it that. The one that maps over both, right?

00:49:48 [ML]

Yeah, well, it's hard to talk about, 'cause in, you know, the wider programming world...

00:49:48 [AB]

And or both.

00:49:54 [ML]

There's really not a name for this kind of map. I think J went with "table", which describes what it does all right, but it's awkward grammatically; that's the real problem with it. But yeah, this idea of: I want to pair every element of the left side with every element of the right side, all the different pairs, and get that out as an array structure, not just as a set.

00:50:21 [BT]

And that, I think... J's "table" was because of, you know, multiplication tables or addition tables, and you get the same result, yeah.

00:50:28 [ML]

Yeah, Iverson explicitly said that, definitely.

00:50:29 [CH]

Alright, before we hop into... I feel like there's gonna be a part three to this now, seeing as we're close to the hour mark already. Before we hop into sort of freeform discussion, let me recap what everyone said. I didn't even put my own name on the list, and technically there's two lists here. So what I said was reduce and scan, which Stephen already said (and I should stop typing, 'cause it's going to ruin the audio of this). So I said generic reduces and scans, and then also a minus reduce and minus scan, because certain languages have reduces and scans but then the minus doesn't work. Bob, you initially said created by Ken Iverson, which we put on the secret list, and then mathematical notation, like it's an attempt...

00:51:25 [BT]

Yeah, I think actually Adam expressed it better than I did, but I sort of go along with his: that its origination, the language, has to do with original mathematical notation, and it's extended itself to be more general beyond that, so you can do the same sort of things with characters and different things. And this is where I think it gets interesting with K, you know, where Arthur [18] was bringing Lisp into the equation. So you've got sort of the mix of the two, and I think that still makes K and q Iversonian, but now they have this extra flavor of Lisp in them, which is interesting.

00:52:04 [CH]

Yeah, that is a good point, and it's good that I got you to re-articulate that, because then, yeah, it sort of simplifies it: Adam's and Bob's overlap, or are almost the same thing. So we'll go to Adam next, skipping Stephen, who was second. Adam's was the generalization of mathematical notation, tied in sort of with the historical evolution of those symbols. Stephen's was, number one, implicit iteration; number two, literal vectors; and number three, reduce and scan. I think I got that right. And then Marshall's was, very interesting: prefix and infix for unary and binary operations, mainly symbols for a lot of your operations, and then outer product, or the ability to do sort of the pairing of two different arrays.

00:52:53 [ML]

Yeah, so the second one I think I would change to: symbols for array operations. This is really the big thing; I wandered around a bit, but mainly symbols for array operations. And I think I'd like to tack on, 'cause we haven't mentioned it: compression [19], I mean, K has "where". So: some way of filtering out elements of an array based not on a function but on a boolean array. That, I think, is another really core aspect of APL.
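Compression, filtering by a boolean array rather than by a predicate function, and K's `where` can both be sketched in a few lines of Python (the function names are mine):

```python
def compress(mask, xs):
    # APL's compress: keep x where the corresponding mask element is 1.
    # The filter criterion is data (a boolean array), not a function.
    return [x for keep, x in zip(mask, xs) if keep]

def where(mask):
    # K's where: the indices of the 1s; indexing by these gives compress.
    return [i for i, keep in enumerate(mask) if keep]

print(compress([1, 0, 1, 1], "abcd"))  # ['a', 'c', 'd']
print(where([1, 0, 1, 1]))             # [0, 2, 3]
```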

00:53:27 [CH]

Interesting, alright. So I will kind of play moderator here, and we'll skip around. So, I think one of the most... two of the most... well, do we all agree with Stephen's entire list? 'Cause I do.

00:53:43 [AB]

No, I don't.

00:53:46 [CH]

And I thought that was going to be easy. Alright, we'll come back to Stephen's then. Well, how about

00:53:49 [ML]

I want to hear what you disagree about?

00:53:52 [CH]

All right, all right, I can't do that. Go ahead, Adam. Or, wait: raise of hands, who agrees with Stephen's entire list?

00:54:00 [ML]

Say it again? I think I do.

00:54:02 [CH]

So: implicit iteration, which can be rank polymorphism or something but doesn't need to be; literal vectors, so a way to spell rank-1 arrays; and reduce and scan.

00:54:11 [ML]

OK, yeah, I'm going to disagree with the second; Adam might as well.

00:54:14 [AB]

Yeah, same here.

00:54:15 [CH]

All right, so we'll go to Adam, Marshall, and then Bob, 'cause only Stephen and myself put our hands up, for the listener. Here I was thinking, as moderator of this debate, that it was going to be unanimous agreement. Of course the host would think that about his own personal opinions. But anyway, Adam, you go ahead.

00:54:35 [AB]

I mean, it's really that simple: no, you don't need a literal notation for vectors of things, if I had to construct them using explicit concatenation or reshapes or whatever. So, if I took away this notation today from K or q or APL or BQN or J, it would still be the exact same language, except you would write every such literal list as a parenthesis with the concatenation function between the elements, and so on.

00:55:05 [CH]

Isn't that the same... isn't that a literal list, though?

00:55:07 [ML]

Yeah, well, I mean, there's also the confusion about the literal list. I mean, I would not consider BQN's list notation to be in the same category as early APL, or J or K stranding [20], so, uhm.

00:55:21 [CH]

Things, so maybe there's a...

00:55:23 [ML]

I think, to the extent that you're really saying something there... it is definitely Iverson's thought that, you know, being able to write out just the numbers with spaces was pretty important.

00:55:36 [AB]

Wait, what? I thought Iverson was against that. Uh, his original Iverson notation used open paren, number, comma, number, comma, number, close paren, like that. And then when they implemented it, as I understand it and remember from what people have told me, he was against removing all that syntax, and there was somebody who was very proud of having implemented the ability to write open paren, number, comma, number, comma, number, close paren and not have it be very bad performance, because if you concatenate from the right then you end up having to reallocate memory over and over again. So originally they did that, and then the implementers added this stranding, I call it that even though it wasn't called that at the time, between just numbers with spaces in between, and Iverson was initially against that, and it caused all kinds of problems later.

00:56:33 [ML]

I have not heard that. I mean, definitely in J, I'm pretty sure this wouldn't have happened if Iverson didn't agree with it: it has, you know, lists of numbers with spaces in between as a single token. I'm not quite sure about this, but I think it's processed at, you know, the time the source is parsed instead of when it's evaluated. Doesn't matter, of course.

00:56:54 [AB]

That depends on the implementation, and it's not that it implies...

00:56:58 [ML]

Well, yeah, but there's only one J implementation.

00:57:01 [AB]

Ah, but this implies, in other APLs where you have more generalized stranding, or even without that, even without nested arrays: the way the binding rules are in, say, APL2 is that operators bind to their operands more strongly than adjacent elements bind to each other. So that means you cannot just blanket say, oh, in this expression it says 1 space 2 space 3, so it's one thing, I mean.

00:57:32 [ML]

But Iverson didn't like APL 2 so. [21]

00:57:34 [AB]

I don't know if he liked it or not; he didn't like that particular aspect of it. But that's a binding-strength decision.

00:57:39 [ML]

I definitely agree that changing this aspect is not enough to, like, kick you out of the Iversonian languages club, I think. I mean, even if it was important to Iverson, it's not really what makes programming in one of these languages the way it is. Like, it's not an important feature to the experience of using it.

00:58:03 [AB]

I can quote Roger Hui [22], who isn't Iverson, but close: he has an Iverson number of 1, right? As saying that stranding must die, or "strands delenda est", he said. And here stranding means this thing about adjacent elements becoming a single array. He said that at a time when the APL he was talking about did not have any other array notation. So if we just followed what he said and removed that, then there would be no literal notation; the only way to have an array would be to construct it using functions. And surely, if Roger says that should be removed, without him saying we should have something else instead, you wouldn't now say, oh, that's not Iversonian anymore.

00:58:54 [ST]

I remember this issue about stranding notation, but my understanding of it is we're not talking here about vector literals, but about the way in which, say, in Dyalog APL you can take three variables and make a list out of them just by typing their names together with spaces. That's a generalization of the thing that was in APL 360, but even APL 360 allows you to write 1 space 2 space 3. What I'm challenging is the implication from what you're saying that Roger's opposition to stranding extended right the way down to vector literals.

00:59:32 [AB]

Well, he used very strong language for it, and I even suggested to him to do something like BQN and use an explicit character for it, rather than just juxtaposition, and he said don't give it a slow lingering death or something like that; just kill it completely. I can find the exact words.

00:59:49 [ST]

Well, I want to toss into this, because I've been on record as saying that my primary motivation for being drawn to the Iversonian languages is aesthetic: if I have to type shit between the integers, I don't want to play.

01:00:03 [CH]

Interesting. So, I've gotta be honest, I'm a little bit confused about what we're all talking about here.

01:00:10 [BT]

I've got my... following Adam and Marshall, I think the reason that Stephen's second criterion isn't necessary is because all we're doing is determining the difference between array languages and Iversonian languages, and I think the array part of it, or the vector part of it, is implicit. So the difference between the two is the other two criteria. It's not whether you can represent something as an array, because we're only talking about array languages and we're only talking about Iversonian languages, so it's a pretty clear subset.

01:00:44 [ML]

Well, I think it's OK to say... I think it's incorrect but acceptable to say that an Iversonian language is one that presents arrays in the way Iverson's designs did. So you write them out in the source code with spaces.

01:01:03 [BT]

Sure, but we're not trying to define Iversonian only; we're just trying to distinguish between array languages and Iversonian languages. That's my argument to it.

01:01:13 [ML]

But the array languages don't do that. Like, none of them... Julia or MATLAB [23], maybe R does, I don't think so. None of those other languages let you write out an array as just 1 space 2 space 3.

01:01:25 [CH]

Yeah, I don't think so.

01:01:28 [AB]

Neither does BQN, so yeah... Oh, so here is the exact language from Roger. So I asked him, and he wrote an email; the entirety of the email was "strands delenda est". And then I said one could imagine explicit syntax, like 10 underscore 20 underscore 30, which might be OK, and his answer, the whole answer email, was "... no". And then he wrote a follow-up email six minutes later saying: I mean, if you're going to kill strands, kill them stone cold dead; don't give them a lingering death.

01:02:05 [CH]

This is great that we're getting quotes.

01:02:08 [AB]

This was by the way, the last e-mail that he wrote to me in his life.

01:02:11 Speaker 6

So wow, that is a bit more somber.

01:02:15 [BT]

That's got weight to it, for sure. Yeah, gravitas.

01:02:19 [AB]

His last words to me before his death were "death".

01:02:25 [CH]

All right, so this literal vector is literally no stranding, no square brackets or any type of brackets (even though the other ones aren't called brackets; they're called parentheses and braces and other things). So, like, you know, MATLAB uses bracket, number, comma, number, comma, and many of the other array languages outside of the quote-unquote Iversonian ones use that kind of notation for creating a list, but that is not considered a literal. Is that what we're discussing here? Because I think earlier in the episode I talked about the fact that C has quote-unquote array literals, but those array literals are bracket, number, comma, number, comma, number. So, like.

01:03:20 [ML]

Yeah, they're only literals in terms of the semantics, not in.

01:03:24 [CH]

Right, so there's a delineation between, basically, the juxtaposition of numbers being the way that you create rank-1 arrays, and having to use either stranding or some other thing, like, yeah, brackets and commas; and those are the two different camps. So where does... if my understanding of what folks were just discussing is right, where does BQN fall in there? Because BQN has both the sort of weird Unicode angle brackets with commas in between, and then also the bent underscore, however you refer to it.

01:04:07 [ML]

Smile character.

01:04:08 [CH]

The smile character. So, technically, does BQN not have literal arrays or literal vectors?

01:04:19 [ML]

Well, other than strings, no.

01:04:22 [CH]

So, and that informs why you disagree with point number two, then. And actually, is that the only one that folks are disagreeing with here? Implicit iteration and reduce and scan, is everyone on the same page?

01:04:32 [AB]

About... I don't just... I'd like to get to everybody else's things too.

01:04:36 [CH]

OK, so I think it's just this literal vectors one, which is interesting, because I sort of switched my answer. Whereas Stephen said, if I have to put, you know, commas (he used something a little bit more extreme) in between my numbers, I don't want to play; for me... I mean, I dabble around in too many other languages. Like, I'm used to writing in Haskell and Python, and in every language, basically, it's bracket, number, comma, number, comma, number, and if, like, APL, J and these languages made me do that, I would think it's slightly less nice, but, I mean, 80% or 90% of the time that's what I'm doing anyway. It's the same thing as in MATLAB when I have to type the brackets and commas; it's like, oh, you know, just like everything else. So it doesn't break my heart, and, yeah, based on that criterion we'd have to reject BQN, and I would want to let BQN through the door, so.

01:05:41 [ST]

Let me fight back on that a little, Conor. I just left the microphone to go and see if I still have, which I don't, 'cause I've lent them to a mathematician friend, Ken Iverson's early books on elementary analysis, which are full of the original, uhm, APL golf-ball stuff. And I wanted to hold them up to the camera, because there's pages and pages of examples which have the numbers represented... I mean, matrices are just tables of numbers separated by spaces, and the inputs are like lists of numbers separated by spaces, and there's a beauty to it. And like you, if I had to program putting in bloody commas and parentheses everywhere, I could live with that, yeah. But where's the joy? Isn't it the joy that brought you to this in the first place? Isn't that why we're here doing this?

01:06:41 [CH]

I mean, uh, I completely agree with you know the same view that you have. You know you've mentioned. Not sure if it was one episode or multiple episodes about the the poetry of APL and Array languages. For me, though, doesn't extend to the literal vectors like to me, the poetry is in the reduces and the scans and the outer products and even the compress but yeah like for me like reduce scan it's it's the functional you know, generalized notation for doing all this stuff like? I don't want to go on a tangent, but like I've spent in the last over the last two days a lot of time thinking about and sort of upset about the fact that we have in language in most programming languages vocabulary for plus reduction and multiply reduction in the form of sum and product or prod if you're in. I think Python calls it prod and in Q. This is a very beautiful thing about Q adds the vocabulary for every single. Or not every single but a lot of the reduces and scans, so there's sum and sums with an S at the end and prd 'cause they drop the oh and prdes for our product. And they have min and mens. And anyways, I won't go on but. I realized, and I guess this is becoming a tangent, that in the there's an in video library called Maddox and they call their Max reduction our Max and their min reduction. Rmin because the vocabulary for sort of a naming a reduction for plus reduction at some for multiplies reduction its product, but then for every other reduction there isn't really a name and it's the same irritation of why we gave infix binary operations to plus, minus divides and times but then stopped at Min and Max and we then have to use prefix named functions. And so my I kind of and like this is what I've been dealing with is do I want? Do I want there to be other shortcut names? It's like I do. I wish there was something like, you know Cain and Abel for like the Min and Max reduce like some in product you know. 
Sure, sum comes from language, but there's not really anything in the names of plus and sum, when you look at the characters, that relates them, other than how language evolved. And I just wish sum and product didn't exist, because then I wouldn't feel bad about the fact that there is no name for the others; then it would always just be plus reduce, times reduce, minus reduce, and it would all be regular. But because sum and product exist, and actually in most languages that don't have two-character spellings for a sum reduction, having a three-letter function with the name sum is very nice, it then makes me think: so what should we call the min and the max reduce? Haskell calls them minimum and maximum; this NVIDIA library just prefixes an r. It's frustrating. This is my two-minute tangent: there are no specialized names for these other reductions and I don't know what to do. What I really want is for the words sum and product to disappear from our vocabulary, so we could all just be having the reduces, which is what you have in APL, and that's why it's so beautiful: everything is two characters, the binary operation and the reduce, the binary operation and the scan. It's even nicer than sum, because the reason I really like sum is that it's so terse. Switching sum to reduce, parentheses, you know, then your binary operation plus, end parentheses: it is nicer to type sum, and so you don't actually want to get rid of it, and everyone coming from Python knows that sum is there, so they're going to be irritated: why do I have to spell reduce? Anyways, Bob had his hand up and I'll stop my tangent.
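The naming complaint above can be sketched in Python. This is an editorial illustration, not code from the episode: Python, like most languages, gives short names to only a couple of its reductions, while every one of them is just `reduce` with a different binary operation.

```python
from functools import reduce
import operator

data = [3, 1, 4, 1, 5]

# Python names only a few reductions...
assert sum(data) == 14
assert max(data) == 5

# ...but each one is just reduce with a different binary operation,
# which is the regularity being asked for here.
assert reduce(operator.add, data) == sum(data)
assert reduce(operator.mul, data) == 60    # math.prod(data) since Python 3.8
assert reduce(operator.sub, data) == -8    # 3-1-4-1-5: no named shortcut exists
assert reduce(min, data) == min(data)
```

The asymmetry is exactly the one discussed: `sum` is three characters, but the minus reduction has no name at all and must be spelled out.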

01:10:20 [BT]

Well, one of the things that I've heard Henry Rich say, and I think he actually wrote it down somewhere, is that J is a bit like a plumber's toolkit where you have connectors and you've got pipes: the operators, the verbs essentially, are the pipes, and the conjunctions and adverbs are the connectors, and you can connect them all in different ways. So in essence, what you're saying about sum is, in J, I think the way quite often people think about it is that you break it apart and think about it that way. So it's plus slash, and that's what you do for a sum; you don't call it sum, because that creates problems for all the other ways you would use that slash. You just say it's plus slash; that's the sum, that's what you read it as when you do that. And so I'm kind of on the same page as you with this.

01:11:12 [CH]

I suppose that is the problem that I have: Rust and Python and all these languages have sum, and so spelling it out literally is more verbose and kind of irritating. So that's my wish: I wish sum just didn't exist, so that the people from Python and Rust aren't like, how come I have to be more verbose when I come over?

Even you don't in J and APL. Sorry, I cut you off Marshall though.

01:11:32 [ML]

I was gonna say, yeah, so a strategy that I really like for these non-Iversonian languages where a function can have any number of arguments, and obviously this doesn't work if functions can't actually have any number of arguments, like in a lot of languages that limit it to, you know, some large number, is to say that the plus function actually doesn't take two arguments. It takes any number of arguments and it just adds them all together. And Lisp actually does this.

01:11:58 [CH]

Yeah, Lisp does that. I was actually thinking about this.

01:12:01 [ML]

And then, yeah, you would have the same thing you have in K, where you would have some special way of applying the plus function to a list and it would add all the numbers together. And then you could do that for min and max and all the things, and you could even do it for minus and that would make sense. Uhm, so then you're saying that, well, this plus reduction is not really something you necessarily want to express as a fold. And what that also lets you do is drop this strictly imposed argument ordering that we've been talking about, where you have to sum in a particular order, and then the language can do, like, you know, a binary tree, something that's faster. So that's a nice strategy, a different way of unifying things, of making all these reductions easy to write.
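The variadic strategy Marshall describes can be sketched in Python; this is an illustrative sketch (the function name `plus` is made up here), since Python's own `+` stays binary, but Lisp's `+` really works this way.

```python
# Sketch: instead of a binary + plus an explicit reduce operator,
# define plus as a variadic function that adds however many
# arguments it receives, the way Lisp's + does.
def plus(*args):
    total = 0          # identity element for addition
    for a in args:
        total += a
    return total

assert plus(2, 3) == 5          # ordinary two-argument use
assert plus(1, 2, 3, 4) == 10   # a "reduction" with no fold in sight
assert plus() == 0              # no arguments gives the identity

# Applying it to a whole list is then just argument spreading,
# analogous to Lisp's (apply #'+ xs):
xs = [1, 2, 3, 4]
assert plus(*xs) == 10
```

Because no argument ordering is promised, an implementation is free to add pairwise or in any faster order, which is the point made above about not imposing a fold order.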

01:12:51 [CH]

So that is actually something that I thought about, and it ties back to something I had, uh, I don't know if it was a year ago or months ago, about Lisp. I think about the four main higher-order functions as reduce, scan, map, and then technically there's find, but actually I might be getting the fourth one wrong. Anyways, you've got reduce, scan and map, and yeah, map is kind of implicit [24] in array languages; that's the implicit iteration, so you never need to spell out map. Technically there's each, but we'll ignore that. And so a key feature of these languages is that the map higher-order function is implicit, and this isn't the case for Lisp. But you could think: what if you changed which higher-order function was implicit in your operation? So instead of mapping, like when we add two matrices or whatever we just put plus, imagine if instead of that the reduction were implicit, which is kind of what Lisp does in certain cases, which is what Marshall is talking about. You don't actually have to mention the reduce; you just give it a number of arguments and it'll do the reduction by default. I think in some Lisps you have to do, like, an apply plus; in others you can just go plus and then either two numbers or a list of numbers. And that fixes my problem, because now I don't even need to mention reduce. But then how do you do scan?
Well, scan would be explicit once again, but the thing is that we don't actually have names for scans except in q, and q does a very beautiful job of naming them by just adding the s. But I don't have the problem with product that I have with sum; product is not as much of a problem, 'cause it's six characters, and you think, well, it's still a pretty short number of characters, but the difference between, like, reduce minus and product, versus reduce plus and sum, it's like a factor of two, and sum is really the problem. The point is, though, we don't have a vocabulary for scans, so the fact that I still have to be explicit about my scans solves my problem. The point being, I think that in order to get implicit reductions you have to give up implicit iteration, AKA mapping, which I don't really want to do.

01:14:57 [ML]

No, I don't think so, 'cause if you had the function plus that takes any number of arguments, that's just all the arguments it takes, and each one of those could be an array, and then it would figure out, you know, what's the common array structure of these?

01:15:11 [AB]

Well, let's say we didn't have these overloads that are common in Iversonian languages. Yes, hey, maybe that's another criterion: overloading monadic and dyadic meanings on a single symbol; q has something to say about that. But let's say we didn't have that, that we had separate symbols, and you just defined the plus symbol so that if you only give it one argument on one side, then it's a sum, and if you give it two arguments, then it's addition.

01:15:40 [CH]

My mind is like blowing right now.

01:15:43 [AB]

So then then if you would write minus ten one you get 9.

01:15:48 [CH]

So actually, that's wow, uhm.

01:15:52 [AB]

For all reductions. Or we could say you can translate any infix operation to a reduction over the concatenation of the arguments.
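Adám's observation can be sketched in Python (an editorial illustration, not from the episode): a dyadic application of an operation is the same as a reduction over the concatenation of its two arguments.

```python
from functools import reduce
import operator

# x f y is the same as an f-reduction over the concatenation [x] + [y],
# which is why "minus ten one" gives 9 under this reading.
assert 10 - 1 == reduce(operator.sub, [10] + [1]) == 9
assert 2 * 3 == reduce(operator.mul, [2] + [3]) == 6
```

Under this reading the infix form is just the two-element special case of the reduction, which is what makes dropping the explicit slash thinkable.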

01:16:00 [CH]

So wait, I'm missing something here. Why do we need the slash? That's a great point, Marshall. So you said... wow, I'm not sure, we have to figure out if it's a great point; it seems like a great point at the moment. Why do you actually need this? What's the purpose of the slash? Why not have exactly what I think Adám was just saying: in APL, instead of going plus slash on an iota sequence or list of numbers, you just go plus, and that sums everything up. And then when it's infix, in the binary case, you would have mapping, and in almost all situations where you're doing rank-polymorphic mapping, in the binary case you've got either a scalar and a matrix, or a matrix and a matrix, etc. So why is this slash there? What is the slash giving you? What is the reduce giving you there?

01:16:45 [ML]

Well, it explicitly tells you this pairwise order, but, uhm, I don't think anybody really cares about this pairwise order. It's actually a pain when you have a fold and you have to go in order, 'cause that's slower.

01:16:57 [CH]

Oh, so the type of reduce, the direction, you mean. Yeah, so, like, the fact that the slash means, uh, right fold, or it could mean left fold, or it could mean a nondeterministic reduce.

01:17:09 [ML]

I mean, it does break it down into this individual, you know, binary operation, 'cause, I mean, really, you have to first define what the sum of two numbers is in order to define the sum of many of them. I mean, you could go straight to an array, but that's kind of artificial; you'd basically be defining a reduction and building it into your definition. If you didn't want to use the reduction, I think you'd still have to do it with rank, wouldn't you?

01:17:41 [BT]

You'd still be applying it, so if you had a list of numbers and you say, apply the monadic star, which is signum in J, if you want to. If you didn't have a reduction, you'd still need to do rank 0 so that you could apply signum to each individual one, because otherwise, you know, rank 1 would sum up the whole list. You'd still need to specify rank, I think.

01:18:06 [ML]

Well, the default would just be full rank, which is usually what you want.

01:18:10 [CH]

I mean, that's ranked.

01:18:11 [ML]

Yeah, well, it's weird with multidimensional arrays 'cause like is?

01:18:15 [AB]

Yeah, and if it's a scalar function then you would apply to each.

01:18:17 [CH]

I think rank would still be fine on both of these cases.

01:18:20 [AB]

And and this is again we have to get rid of the overloading, pairing up monadic dyadic forms in order for such a thing like that to work.

01:18:29 [ML]

Yeah, well I'm not being able to write, you know, minus X to negate X is pretty annoying that people are not going to like.

01:18:36 [AB]

So is that any worse than not being able to write minus five to mean negative five? You have to write either high minus or low minus in these various languages.

01:18:46 [ML]

Well, you don't have to.

01:18:47 [CH]

All right, this is this is we gotta we'll come back I gotta think about this for like 2 weeks we're we're we're totally.

01:18:52 [ML]

Yeah, or this is not my Iversonian discussion.

01:18:55 [CH]

I'm being bad, but I mean, you said something that, like, broke my brain. And we will come back, I promise we will talk about this in the future: a language that replaces the overloading of two different meanings onto the monadic and dyadic cases of plus with, in the dyadic case, implicit mapping and, in the monadic case, implicit reduction. We'll revisit that. Put a pin in it. Back to Stephen.

01:19:22 [ST]

Let's see if I can help get this back on track. I would like to take two items from Marshall's list and Adám's to mine. One is the outer product and the other is the compression, the where. Are those uncontentious? Can we vote on those?

01:19:42 [CH]

Uh, yes.

01:19:44 [AB]

I mean, K doesn't have it, but Marshall did say there's an easy way to do it. Outer product in k: you don't have a built-in, but you only need a couple.

01:19:54 [ML]

Well, so they they both decompose outer product and the compression I I would say that they do this in a fundamentally Iversonian way, so it's not really.

01:20:05 [CH]

You can do an outer product in in Q I mean it doesn't look the same, but.

01:20:10 [AB]

What you can do, but there's no, there's no.

01:20:12 [CH]

Yeah, I mean it.

01:20:12 [AB]

Builtin for.

01:20:13 [CH]

It looks like a built-in to the uninformed, which is basically me when it comes to q. It's just that I spelt it wrong one time and then someone corrected me: you get it by combining.

01:20:23 [ML]

It's just each left each right for all the listeners who are pondering this.

01:20:26 [CH]

Yeah, yeah, so that was the thing: I had just done one of the special cases where it happened to work, but then someone pointed out that technically for an outer product you need to do the slash colon, slash colon, and then put your binary operation on it. But, I mean, yeah, so I vote that those two things are in, unanimously. Other people? We've got Bob's hand up.
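The decomposition Marshall names (q/k spell outer product with each-left and each-right rather than a single built-in) can be sketched in Python; the helper name `outer` is made up for illustration.

```python
# Sketch of outer product as "each left, each right": for every element
# of the left list, apply f against every element of the right list.
def outer(f, xs, ys):
    return [[f(x, y) for y in ys] for x in xs]

# A small times table, like APL's jot-dot-times:
table = outer(lambda x, y: x * y, [1, 2, 3], [1, 2, 3])
assert table == [[1, 2, 3], [2, 4, 6], [3, 6, 9]]
```

The nested comprehension is exactly the two "each" iterations composed, which is why only a couple of built-ins are needed rather than one dedicated primitive.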

01:20:48 [ML]

You can't vote for them to be unanimous.

01:20:50 [CH]

Adam and Marshall.

01:20:51 [CH]

Refused to be.

01:20:54 [ML]

No no.

01:20:56 [ST]

I took them from Marshall's list and Adám's to mine.

01:20:59 [AB]

No, I'm fine with the criterion, but having a very easy, straightforward two built-ins in order to do it, instead of one built-in, just to let q and k in, since they don't have a compress and they don't have a built-in outer product.

01:21:16 [CH]

Yeah, oh, do k and q not have compress?

01:21:19 [ST]

Q has where; it's the k form that is the ampersand.

01:21:23 [ML]

Well, q actually has the name, where, doesn't it?

01:21:25 [ST]

Yeah, it has the name where, the keyword where, and the k form of it is the ampersand, monadic ampersand.

01:21:33 [CH]

Alright, I'm gonna, as moderator... that seems like a unanimous enough agreement, so we're getting there: implicit iteration, reduce, scan, outer product, compress with a list of booleans. And we're way over time here, but I did want to... I mean, I said I was going to skip Marshall's initial thing, and then, or sorry, skip Adám's thoughts on Stephen's literal vectors, and then I was going to go to Marshall's thing, so the other two, and maybe, 'cause we're short on time, we'll just talk about the second one, mainly symbols for array operations. So does that mean Nial gets cut? 'Cause right now Nial is in the party, Nial's partying with the other array languages, but Nial doesn't have any symbols, as far as I know; they're completely wordy, yeah.

01:22:17 [ML]

That's a good question. I mean, I feel pretty willing to throw out q, because it does... I mean, it's clearly an attempt to make the Iversonian languages, you know, easier to work into for people with other experience, by making them less Iversonian.

01:22:32 [CH]

So you want you want to throw out Nial and Q.

01:22:35 [ML]

I don't know if I can throw out Nial and I don't know a whole lot about it as a language.

01:22:39 [CH]

So bouncer Marshall, just standing at the front of the party and not only not letting people in, he's going in and kicking people out.

01:22:48 [AB]

I mean this is this is a bit. Uhm well I, I mean, I think it's an interesting conversation.

01:22:55 [ML]

So yeah, I mean, to go back to my definition: an Iversonian language is one that's primarily influenced by Ken Iverson's ideas. Is that really true of Nial? Now, the weird thing with Nial is that it doesn't take a lot of outside influence, but it's influenced by APL and also Trenchard More's ideas, and some other people who were involved in creating Nial, so there are definitely not a lot of new things there. And I could ask about my own language, I, [25] in a similar vein: the main influence on I is definitely J, but I did a lot of stuff and I completely changed the syntax around. In I, you know, the functions are written with letters instead of characters, and all sorts of things like that, so I don't know if I really is Iversonian anymore, or if it's, you know, just some weird branch hanging off in space.

01:23:48 [CH]

So if you are listening to this and you're a Nial expert... I've interacted with some folks on the APL Farm Discord; we will bring someone from the Nial community on, and we'll get educated by them. Uhm, but from my dabbling around in Nial, it, like, conforms very closely to APL. I don't know much about the, uh, what are they called?

01:24:10 [ML]

But that's in terms of the semantics, right? And not the syntax.

01:24:13 [CH]

Not the syntax, so that's the thing. It's very similar in spirit, down to the way that scans work, which I consider broken, and copies and rank and everything. What I can do, basically, for simple Leetcode problems translates perfectly to Nial; it's just that everything is spelt with, you know, what do you call them, function names? I don't want to call them keywords, but they might be keywords if they're in the standard library. So, you know, instead of an iota symbol they have count, and instead of the slashes for reduce and scan they've got accumulate and reduce, and it goes on. And I do think, like Marshall mentioned, there's a couple extra ideas. I think they're called function trains, or however; they have a way of chaining functions.

01:24:58 [ML]

Oh, they have atlases. [26]

01:24:59 [CH]

Atlas is what they're called.

01:25:00 [CH]

Thank you yeah, so.

01:25:00 [ML]

Which are lists of functions, although these are actually very similar to function arrays in Dyalog APL, except it's more like an integrated part of the language, so they're easier to create and use.

01:25:12 [CH]

Yeah, so we'll have a Nial expert on, and they will also correct us if our pronunciation isn't correct. But so the big difference is just the words, and similar to q, it's just the words. And does that change things? 'Cause a lot of folks that I've talked to will argue that that sort of tool-of-thought dynamic of the symbols and stuff disappears, even a little bit, to an extent, when you're dealing with J. And so I'm sure there is a camp of people that believe that as soon as you add words and you're no longer dealing with a quote-unquote symbolic language, or notational language, it's not... anyways, Adám, you got something?

01:25:50 [AB]

Yeah, if I remember right, I grew up on APL+ [27], and at least there was some version of APL+ that I used that had an option you could switch on and off to expose English names for the primitives. OK, so if you switched this option on, then you could write, uhm, reduce-first or something like that instead of slash bar. Surely you will not say that if I'm in the middle of my programming session, maybe even suspended in the middle of my function running, and I switch this option on,

Now I'm not using an Iversonian language anymore.

01:26:28 [CH]

I mean, I think some people, some people would. Some people would though I think. And sorry, I cut you off, Marshall, what were you saying?

01:26:34 [ML]

It does have the symbols, right? It just adds names.

01:26:37 [AB]

Yeah, but then if you look at very old mailing lists and things for APL, because of encoding issues they would spell out primitive names in braces. Braces weren't really used for anything, so they would spell them out, or kind of picture them with ASCII characters. Like, surely, if you spell out "or"... or you could say, like this: you might take some APL code and I read it to you in English. Is my spoken form of the written code not Iversonian, while the written code is Iversonian? Seriously, give me a break.

01:27:09 [CH]

The thing is, you are losing information by changing the mode of communication. To say that there's absolutely zero difference is, like, demonstrably false; the visual representation of some of the glyphs is lost when you say the word.

01:27:25 [CH]

Like, how many times have I talked about this, or brought this up at talks that I've given? The fact that reduce and scan, or fold and scan, in the spoken version, or in the written version when you spell out reduce and scan, have nothing to do with each other. But the way that I discovered the relationship between those two algorithms, or functions, is that they're vertically symmetric when you're dealing with the slashes. That eureka moment, the discovery of the relationship between those two algorithms, came from looking at what they visually look like. So there definitely is a difference between saying plus scan and plus reduce, and looking at the plus slash and, like, plus backslash. It's even, in trying to...

01:28:08 [AB]

I I wouldn't read them like that. I would say, like Bob, I would say plus slash and plus back slash and you can hear right away that they're related.

01:28:17 [CH]

I think most people say plus reduce and plus scan like they.

01:28:20 [AB]

I don't, I said I say plus slash for sure.

01:28:22 [CH]

Well, we've got to get a Twitter poll going on here: how do you read the following, in APL speak or J speak?

01:28:29 [ML]

Well, I mean that is part of the thing about APL. Is that when people speak APL, they say you know wildly differing things for the names of different functions like.

01:28:41 [AB]

Almost nobody says index generator everybody says.

01:28:44 [ST]

Yeah, I don't want to weigh in on Conor's side of this, but when I was revising the q reference, I found that earlier versions of the documentation spoke about scan, and over, which is reduce, and documented them separately. And actually, it's the same computation going on in both cases. So it's a little tricky in terms of typography and then writing the articles, but now I think they're properly documented as basically the same thing, and it's just a question of whether you want all the interim results or just the last result.
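Stephen's point, that scan and over/reduce are the same computation differing only in whether the interim results are kept, can be sketched in Python (an editorial illustration using the standard library's generic spellings):

```python
from itertools import accumulate
from functools import reduce
import operator

xs = [3, 1, 4, 1, 5]

# Scan keeps every interim result of the left-to-right computation...
scanned = list(accumulate(xs, operator.add))
assert scanned == [3, 4, 8, 9, 14]

# ...and reduce/over is just the last of those interim results.
assert scanned[-1] == reduce(operator.add, xs) == 14
```

This is also the pattern behind q's naming convention mentioned earlier: sum/sums, prd/prds, min/mins pair each reduction with its scan.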

01:29:27 [AB]

Yeah, who doesn't say jot dot times?

01:29:32 [ML]

Well, I would usually say times table, but that's 'cause I use.

01:29:35 [AB]

You don't say jot dot times when you read it? What do you say?

01:29:38 [CH]

That actually depends. Sometimes you read it literally, as, like, the jot dot, but usually... well, yeah, it depends. Sometimes I say outer product. I think the meta point, or the point that I'm just trying to make, is that there is a difference between having words and having symbols, and I don't even know what camp I'm in, because, for me personally, I think the true, authentic array language, Iversonian language experience is in the symbols, you know, looking at the symbols. Even going to J, you lose something, and so I think that's the purest form. But once I've been swimming enough in that Iversonian lake of ideas, me going to q or to J or to even Nial, it's like I've already learned the purest form, and now I can just solve problems that way, and however I spell it doesn't really make a difference. But being, like, indoctrinated, or learning it through a language like Nial, I don't think is going to have the same impact, because, you know, they call their scan accumulate and they call their reduce reduce, and you're not going to have some of the same insights. And even when it comes down to a lot of the operators in APL, they use, like, the diaeresis, or however you pronounce that. So, like, the fact that, you know, the W combinator, which is, I think, referred to as self; some people call it selfie. [28]

It's got the double dots over the complement, and, like, even there's some symbolism in there, in that the complement is sort of like the flipped whatever. There's so much beauty in all these glyphs that gets lost when you wordify it. And that's not to say that we don't let the wordified languages into the party; I think those languages should come into the party. I just think there's, there's even like a... sorry, go ahead, Marshall.
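The "selfie" operator Conor mentions (APL's tilde diaeresis, applied monadically) can be sketched in Python; the helper name `selfie` follows the nickname used in the episode and is otherwise made up.

```python
# Sketch of the W combinator / "selfie": turn a two-argument function
# into a one-argument one by passing the same value on both sides.
def selfie(f):
    return lambda x: f(x, x)

double = selfie(lambda x, y: x + y)   # x + x
assert double(7) == 14

square = selfie(lambda x, y: x * y)   # x times x
assert square(7) == 49
```

In APL this is a single glyph modifying the function to its left, which is part of the terseness being discussed.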

01:31:25 [ML]

In terms of letting things into the party, I think, uh, we have all these criteria now, but we have to admit that, you know, if you miss one or two criteria, even if you miss them pretty badly, I mean, you're probably still an Iversonian language. Like, even if you took exactly APL, the language, but you scrambled all the glyphs, you assigned each one to a random Unicode character, we'd have to say, I mean, yeah, it's Iversonian, but why did you do that?

01:31:51 [AB]

That's BQN for you, right?

01:31:53 [CH]

No no no.

01:31:54 [AB]

I'm joking.

01:31:55 [ML]

No, it has a lot of characters in common, still. I mean, most of the changed characters are actually for substantially different functions.

01:32:01 [AB]

So can we let them in? But have them sit at the back of the.

01:32:04 [CH]

This is exactly my analogy: inside the party it's a club. I was picturing it like a house party, but it's not; we're at a club. And there's, like, a VIP booth where, you know, they've got to let you in, with the little sash ribbon: oh yeah, they're allowed in. So now inside our circle we have an even better circle, and I don't know who's in that circle.

01:32:22 [AB]

But yeah, right in the middle you've got to have the ones that Iverson literally made himself, like, they're sitting in these red plush chairs.

01:32:30 [CH]

Well, so now here's the question: does J even qualify, with the digraphs? It's like now we're getting into another version of this argument, yeah?

01:32:40 [ML]

Yeah, 'cause I mean J is kind of far from APL.

01:32:43 [ST]

Yeah, jot dot is 2 characters.

01:32:47 [AB]

No, but that's OK, that's fine. Another episode on why that is. It does have a real reason, actually; the more I explain it to people, the more comfortable I feel just keeping it around. It has a good reason. But you know about the controversy at the first APL conference where J was presented?

01:33:05 [CH]

Yeah, I think we covered this, and I put it in the show notes of my other podcast. I think I found... there was no slide deck, but there's a paper that was attached to it, right?

The APL slash question mark or whatever. [29] Is that what you were going to say? Yeah.

01:33:17 [AB]

But there was a whole large group of people arranging, or at, the conference that basically didn't want Iverson to present this, saying this doesn't belong at an APL conference.

01:33:27 [CH]

Yeah, that's sad.

01:33:30 [ST]

Yeah, but they weren't arguing it wasn't an Iversonian language. They were arguing it wasn't APL.

01:33:35 [AB]

I mean, I think it was kind of the same thing; APL and Iverson were the same. And I think Gitte Christensen, the CEO of Dyalog, she mentioned she was there, and she said, like, this is ridiculous: anything that Iverson has to say belongs at an APL conference.

01:33:53 [AB]

So basically she's redefining APL conference as Iversonian conference, and by definition whatever Iverson has to say belongs there, which means J is definitely Iversonian.

01:34:04 [ML]

Well, did anybody argue that J wasn't actually created by Iverson? You know, like some sort of, you know, Iverson was actually killed off and some duplicate was brought in so.

01:34:15 [CH]

I was going to say, are you, like, putting forth, like, a Ratatouille, like there was, you know, somewhere some Marvel alien?

01:34:21 [ML]

Which I'm not. I'm just saying, you know, many people are saying this kind of thing that perhaps there was a conspiracy to actually create an array language that appeared to be made by Iverson, but it wasn't.

01:34:33 [BT]

Oh God.

01:34:37 [ST]

I want to pick up on a point that Conor was making about the difference between using symbols and using English names. When we translate poetry from one language to another, you have to pay a lot of attention to the associations of a word, so it doesn't do just to pick a word from one language and find its exact equivalent in another language. You have to think about the associations, because the original poem is bringing in and pulling on all of those. When I'm reading program code, English words spark off all the associations they have in my language, which generally have very little to do with the programming problem. So when I'm using or reading symbols, I have this experience that I'm focusing tightly in on the problem. This extends to my choice of names for variables: I try to generally, or mostly, avoid using English words, because I don't mean the thing that the English word means. I may mean something like it, but I mean only and explicitly what I've defined it to be. So I might commonly use, say, an acronym. I might put in a comment, things waiting to be processed, and the variable name will be TWTB, like that: just enough to refer back to the original comment, but not enough to kick off that train of associations that comes when we use words from a natural language.

01:36:20 [CH]

Yeah, I mean, I completely agree; that's a great analogy, the poetry one. If you listen to, you know, I'm not a poet, but if you listen to poets talk about translating, some people will say that there's always something lost. If you're translating, you know, Russian poetry into English, in order to really appreciate the poem you need to not only speak the language, but be fluent enough to understand those word associations, and the depth of the poem is going to be lost to a certain extent when you translate it into a different language. Which I think is, like, a perfect analogy. It doesn't mean that you're not reading quote-unquote the same thing and you can't understand it, but there is some extra meaning that gets lost when you go from the symbols to the wordified versions. Yeah, that's a very apt analogy.

01:37:09 [AB]

Yeah, I can say that my linguistic, at least in, like, English or whatever human languages, center of brain and consciousness is not at all present when I concentrate on APL code. Things go directly from symbols to meanings, concepts in my head, without speaking it out.

01:37:32 [ST]

Now you're reminding me of an essay by the late American humorist P. J. O'Rourke called Ferrari Refutes the Decline of the West. [30] It's an essay in which he describes driving a Ferrari from the East Coast to the West Coast of America. When he gets into the Rocky Mountains, he says, there we found that the steering was frighteningly direct: straight from the left brain to the right.

01:37:59 [CH]

Bob, I think you were trying to say something earlier too, and it might have got lost in the commotion. Do you recall? No?

01:38:07 [BT]

No, I'm lost in the commotion. Actually, I guess one of the things is I was remarking on Marshall's idea of a conspiracy theory for the origins of J. And I thought, I didn't start out trying to make conspiracy podcasts. I really didn't start out trying to make conspiracy podcasts.

01:38:28 [CH]

Yeah, that's what it's become. I feel like, unless there's any last thing we want to say, 'cause we've way blown by the one-hour mark, as anyone who has made it to the end of this podcast will know. Are there any last comments we want to say? Is there going to be a part 3? No last comments. So my general sense, actually, is that the difference between array languages and Iversonian languages... it's interesting: a lot of this conversation has been around spelling, or, you know, this one function existing, and, you know, I suggested minus scans, there's outer products, reduces, scans. I don't actually think the delta between the two is as large as I had originally thought in my head, and it's more, I don't want to say ethereal, but it's more like style, the way it makes you think. At least, that's the sense I'm getting; the fact that a big chunk of this was a discussion on symbols is an interesting artifact of this conversation. And I'm sure there's actually some folks that are probably listening to this and being like, why are these five individuals spending so much time talking about stuff that just doesn't matter? Like, I've met people that...

01:40:00 [ML]

They don't know us at all.

01:40:02 [CH]

They're just like: syntax versus semantics. I definitely know, because Aaron Hsu has talked in a couple of talks about the importance of syntax, and there are some people who think that syntax is absolutely meaningless, that semantics is all that matters, that how you spell something means absolutely nothing. Why are we spending discussion on the way we're going to spell this? And it seems to be a trend that folks who have spent time in the array languages definitely care about syntax and the spelling of things. As we were just discussing, you can lose meaning in the syntax you choose. That maybe doesn't impact semantics, and arguably you could say semantics are more important at the end of the day, but that doesn't mean syntax is completely meaningless, especially when you're dealing in Unicode symbols and you can have insights because of the way these things are visually represented. But anyways, any last thoughts folks want to share?

01:41:03 [AB]

I'm just curious. If we now go through all these criteria, and I might be mistaken, but does the Wolfram language not end up being Iversonian then?

01:41:15 [CH]

Oh yeah, that was one we didn't get to talk about. I mean, I feel like we should. What is it missing?

01:41:20 [ML]

Well, it doesn't fit the syntax.

01:41:22 [AB]

Um, you said it has to have infix and prefix syntax. Well, it does have those.

01:41:28 [ML]

I meant that it has to be like that for every function. I didn't say that, but...

01:41:33 [AB]

Wait, but even then...

01:41:34 [ML]

Like, that should be the way you call functions. Because a general function in Wolfram Language you'll call with square brackets, right?

01:41:43 [AB]

OK, but then that rules q and K out.

01:41:48 [ML]

They could be.

01:41:53 [CH]

We have determined that Marshall is the official, or unofficial, bouncer of this club party.

01:42:01 [ML]

Yeah, well, in q and K you can write a function application...

01:42:04 [ST]

Oh well, I've got to jump in at this point. I can entertain an argument that q has strayed too far from the Iversonian principles, but I happen to know that Arthur Whitney is Iverson's spiritual heir. I've seen the robe and the bowl and the sandals, and I think any criterion which excludes K is going to be seriously problematic.

01:42:26 [BT]

Oh no.

01:42:31 [ML]

Well, it can be an Iverwhitnian language then.

01:42:38 [AB]

That's circular, no?

01:42:41 [ST]

The story continues.

01:42:43 [CH]

Marshall's the bouncer, the conspiracy theorist, and the creator of this new term. You heard it on this podcast first.

01:42:50 [AB]

Coiner of new terms for programming languages. Now Wikipedia needs an Iverwhitnian category for programming languages.

01:42:59 [ML]

Yeah, but in terms of the syntax: I was actually a little surprised to be the first one to bring up that kind of syntax criterion. I mean, I definitely don't want to say that the more Iversonian a language is, the better.

01:43:14 [ML]

I don't think that even about BQN. But you have to say that, to the people who were programming in Iversonian languages, the syntax was very important, and that's part of what makes this family of languages special. So as a distinguishing feature, yeah, you do have to point to the syntax as one thing that makes this family of languages what they are, even if you think that all you should care about when writing a program is the semantics.

01:43:43 [CH]

Yeah. I think what I'm going to do, similar to the last conversation where I made up that Venn diagram, [33] is create a markdown file on my array language comparisons repo, with this list of criteria on the left, or maybe at the top, and the list of languages, and then a table with little green and red marks for which languages have each criterion and which don't. Then we can stare at it next time. We don't need to dedicate a whole episode to it, but I think it would be good to talk about the prefix/infix one, 'cause that's something we didn't spend a lot of time on, or any time on. We'll have a discussion, go through and raise our hands, and get a tally of which of us think each of these criteria should or should not be included, and also what the result is: does q make it into the party at the end of the day? Does Nial make it into the party? And then maybe we can release it, not as a poll or a survey, but it would be curious what...

01:44:54 [ML]

As a mug. As merchandise.

01:44:57 [CH]

No no no, I was getting at gathering people's feedback: how the greater community at large feels. Because, as I think I mentioned before, I've had conversations about what makes a language functional, and one of the responses I got was that a lot of it is defined by how the community uses and views the language. You can have a functional language, but if it's not idiomatically or commonly used in a functional way, if there's a way to get at side effects and that's what's commonly being done all the time, then it's less of a functional language, because the community doesn't use it as one.

01:45:33 [ML]

Well, and that's the thing about Wolfram language. I mean, yeah, it fits most of these criteria because it fits almost every criterion; it just has so much stuff packed into it. I think it might not have the compression thing, where you're filtering an array by a Boolean list, although I'm not sure about that.

01:45:52 [AB]

Come on, it's got 6,000 functions.

01:45:55 [ML]

I mean, yeah, it might have a function for that. But if it's not written with this symbol, I think it's not so good. But then q doesn't meet that one either.
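[For listeners unfamiliar with the term: the "compression" Marshall mentions is filtering an array by a Boolean list, written in APL as mask/data. The following is a minimal illustrative sketch of the idea in plain Python, not any array language's actual implementation.]

```python
# Compression: filter an array by a Boolean mask.
# In APL this is written  mask/data  (e.g.  1 0 1 1 0/10 20 30 40 50);
# BQN's Replicate (/) works the same way for a 0-1 mask.
def compress(mask, data):
    """Keep each element of data whose corresponding mask entry is 1 (true)."""
    return [x for keep, x in zip(mask, data) if keep]

print(compress([1, 0, 1, 1, 0], [10, 20, 30, 40, 50]))  # prints [10, 30, 40]
```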

01:46:06 [CH]

Alright, stay tuned, listener, or listeners. I haven't made up my mind how to refer to the listeners.

01:46:15 [ML]

But you have to call them listeners because there aren't one of them. There are zero.

01:46:18 [CH]

At this point, we will have a follow-up of either half an episode or, let's be honest, it'll probably end up being a full episode. 'Cause here I am saying it'll just be 10 minutes, and this was supposed to be an hour and we're closing in on the two-hour mark at this point.

01:46:32 [AB]

Oh no, let's not go there.

01:46:34 [BT]

And that's contact AT ArrayCast DOT com if you have an opinion on this, and I'll be prepared to be flooded with opinions.

01:46:42 [CH]

Yeah, if people have made it to this point in the episode.

01:46:44 [CH]

This might be our longest episode at this point.

01:46:46 [BT]

Oh, this will be our longest, yeah.

01:46:49 [AB]

Well, we should encourage people to send in additional criteria, inclusion and exclusion criteria, that we might not have thought of.

01:46:57 [CH]

Yeah. Well, by the time people are listening to this, I will have made my markdown file. You can find it in the show notes, and if you're GitHub savvy, feel free to contribute. I will put a section for criteria proposed by the community, and you can open a pull request or even just leave an issue or comment, and I can add them, and maybe we'll bring those up in a future discussion, Part 3 of this. Yeah, this has been fun. My brain's been broken. I've got to go and think about some things, and either we'll be doing this again in two weeks, or maybe we'll have a guest. Stay tuned. And I think with that we will say: Happy Array Programming.

01:47:34 [ALL]

Happy Array Programming.

[MUSIC]