Transcript

Transcript prepared by Bob Therriault and Sanjay Cherian

00:00:00 [Bob Therriault]

How most physicists have learned about it isn't by that formula. It's by playing around with bicycle wheels or rolling things or hitting things or lifting rocks or anything like that. They have a tacit, and it actually is used as tacit knowledge, a feeling of how this works. And then they express that tacit knowledge into the formula. And now the formula expresses what they know tacitly. Anyway, Marshall, did you want to say something?

00:00:25 [Marshall Lochbaum]

Yeah. So then the way you express your tacit tacit knowledge is by not writing anything.

00:00:33 [Conor Hoekstra]

There's that cold open, folks. Well done, Marshall.

00:00:35 [Music Intro]

00:00:50 [CH]

Welcome to another episode of ArrayCast. I'm your host Conor and today with us I have our four panelists who we will go around and do brief introductions. We'll start with Bob, then go to Stephen, then to Adám, then to Marshall.

00:01:00 [BT]

I'm Bob Therriault. I am a J enthusiast, and I've always been interested in tacit, which I think is going to be important today.

00:01:08 [Stephen Taylor]

I'm Stephen Taylor. I'm an APL and q enthusiast, and I'm joining the session today from New Jersey, where we've just had Canadian Thanksgiving, which is where we give thanks for Canadians like Kenneth Iverson and Arthur Whitney and Conor and Bob. Thank you guys.

00:01:25 [CH]

You're welcome.

00:01:26 [BT]

I'm sorry about all of that. I'll have to take that out later. Oh, that's just Canadian.

00:01:31 [Adám Brudzewsky]

I'm Adám Brudzewsky. I'm also thankful for those Canadians. Awesome people. I've lived in Canada for a long time. They really are awesome people. Mostly, though, for this, I'm really excited about APL. I like tacit programming as well. I remember when I was first introduced to it and it was all very mystifying. And actually, by the time this episode comes out, it'll be the day before I'm running a workshop at the Dyalog user meeting on tacit programming.

00:02:00 [ML]

I'm Marshall Lochbaum. I started out in J writing forks there, and then I moved to Dyalog and continued to write forks, and then I made BQN so I could write more forks. And so I've been programming a lot of tacit.

00:02:14 [AB]

Hold on, you made the I language. That's like all tacit.

00:02:17 [ML]

Oh yeah, yeah, that was very much so that I could write forks. You can't do anything but a fork in I. Just forks. It's all forks.

00:02:26 [CH]

I'm so happy that forks are getting mentioned. As mentioned before, my name's Conor: research scientist, polyglot programmer, APL enthusiast, and, I don't mention this enough, but it's relevant today, a massive fan of combinatory logic and combinators, maybe the biggest fan. I'm not sure who's a bigger fan on the podcast. That's the same thing as tacit programming. I come to you today enlightened. Enlightened from a few conversations that happened over the weekend at the Minnowbrook Conference [01] and we're going to get into talking about all that. Guess what, folks? There's a whole new world of combinators. I've been so happy sitting around here with my normal combinators, and there's a whole new world that I discovered. We're gonna talk about it today, and I'm guessing within 10 minutes everyone's gonna be confused. I've been confused basically since then, but it's enlightening. Anyways, before we get to talking about this: this will be our fifth conversation, if I'm counting correctly, on tacit programming. So if you haven't listened to those episodes, I don't actually know the numbers, but they're entitled, you know, Tacit 1, 2, 3, 4, we will leave links at the top of the show notes for you to go back and pause this one. I'm not gonna say it's gonna help you be less confused with this, but it might, it might, no promises there. Anyways, over to Marshall for an announcement, and then I've got a couple.

00:03:50 [ML]

So I have an announcement that's relevant to iOS users. If you've got an iPhone or iPad, for some reason, and you would like to run an array language on it, we have just out an app called Arrayground, [02] which will run BQN and also k for you. So it uses CBQN and ngn/k, and it's a little interactive environment. I think it's fairly simple for those. It's from the creator of a cross-platform BQN IDE, if you've ever tried that. So if you've been wanting such a thing, I think that's your thing.

00:04:24 [CH]

Awesome. Links will be in the show notes. And my one and a half announcements: the one is, and thank you to Adám for reminding me, that since we interviewed Kai Schmidt on episode 63, I think it was, could be wrong about that, it's the last episode, I made three different YouTube videos. I know I was kind of in hibernation for a few months; you know, I posted here and there, but not very frequently, and then I got very excited by the Uiua language for a plethora of reasons. I mean, my favorite of the array languages is BQN, and Uiua was heavily inspired by that, so it was very interesting. But anyways, if you haven't seen those, my guess is that a large percentage of our listeners have already seen some of those videos; links will be in the show notes. And in two of the videos I specifically compare BQN and Uiua to show sort of the difference in combinators, and maybe we can chat about that later in the episode. The other half announcement is that the Minnowbrook conference, I think technically it's titled the APL Implementers Workshop, but I think they've rebranded it to the APL Futures Workshop, but colloquially everyone refers to it as Minnowbrook, it just happened over the last week. And there are a bunch of folks that I met and got to chat with that we are gonna hopefully be bringing on. Most excitingly is an individual by the name of Stanley Jordan. [03] Some of you may know that name. He is a four-time Grammy nominee, a musical genius who has had an album at the top of the charts for 51 weeks straight. Anyways, he gave a presentation where he was playing music and interactively exploring, you know, different electric guitar sounds with APL, and it was very, very cool, kind of reminiscent of, if you've seen Andrew Sengul's April compiler and what he does with sort of the light show, it was in that space, except, you know, completely different, because he's not just lighting up sort of the backdrop behind a DJ; he is actually playing music. Anyways, very excited. I chatted with him and he said he'd be happy to come on. And there's also a bunch of other folks, Bob Smith on NARS2000 and some others, that we're gonna hopefully be getting on. And that's the end of that announcement. On to... oh, actually, Bob's gonna say something.

00:06:31 [BT]

This is one of the reasons, actually. Uiua was the first one where I really had a sense of that. We recorded the Uiua episode before it became popular, like really popular, and it was due to Marshall actually hooking into it.

00:06:43 [ML]

I knew about Uiua in like April. Yeah, maybe other people knew, but I was following the development and all.

00:06:50 [CH]

You were way, way before the cool kids were there.

00:06:53 [BT]

But because Marshall's here, I found out about it, and within a couple of days we had Kai coming on, and then it started to pick up even before we released the episode. But then to watch what happened after the episode was just amazing. And it was one of the reasons that I really wanted to have this kind of a podcast put together: because it draws people in. And the Stanley Jordan thing at Minnowbrook is true as well, because there's people who don't know about what's going on in the array languages, even if they're in the array languages. And as a result, this kind of crossing over, as it grows, I mean, we're now into our second, third year, I guess, episode 64, but as this grows, I think you'll see more of that cross-pollination. I think that's really important, and that is one of the big things that I thought was missing in the array language community.

00:07:46 [CH]

Yeah, I definitely feel, since I've started, you know, taking an interest in / falling in love with array languages, like in the last little bit there seems to be an inflection point. We actually had multiple topics that we were discussing and fighting over which one we were going to talk about first, and so in the next couple episodes we're gonna be having the other discussion that we're not having today, which is about game programming in array languages, and there's been videos coming out of individuals that are basically showing how to make little mini games in BQN, you know, the Uiua excitement on Discord, and anyways. And then, you know, I think the vibe at Minnowbrook, I talked to a few people that said this APL conference feels kind of different than the ones in the past, where it seems like there's a lot of really exciting, interesting stuff that folks are doing, especially when you look at the number of younger employees that Dyalog Limited is hiring. Like, when Morten Kromberg, the CTO, was going through a slide deck, I was expecting like one or two slides, and then he went to a third slide, and then he went to a fourth slide, and literally the pictures of faces were sliding off the screen. I was like, holy smokes, Dyalog is hiring quite a few new folks, and a lot of them are, you know, just out of university. So very exciting, and looking forward to chatting with those folks and seeing, you know, in the next couple years, who's the next person to create an array-inspired language. That being said, here we go, folks. I'm enlightened. Let me tell you the little story of what happened. So we all know, hopefully, about forks. [04] You know, they are both monadic and dyadic, and there's two things I want to talk about; we'll get to the second one, but we're gonna talk about the first one first. Up until this last week, I thought that that was just it: you've got your forks, they take three functions, three verbs, whatever your language calls them, J calls them verbs, APL calls them functions, and they form one of two patterns. You know, the monadic fork takes two unary functions and applies those to the same argument, and then takes the results of those and passes that to a binary function. So that's your monadic fork. The dyadic fork is the same thing, except replace your unary functions with two dyadic functions, and then it takes two arguments and follows the same pattern. And there's a bunch of other combinators, you know, and I think I should clarify this: I was causing confusion in the last episode with Kai about Uiua, because I was referring to the BQN combinators as right and left when I meant to say before and after, which is what they're actually called. Correct, Marshall?
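
For readers who want to see the two patterns concretely, here is a sketch in J (the episode names no specific code, so the verbs chosen are just illustrations):

    NB. monadic fork: (f g h) y  <->  (f y) g (h y)
    mean =: +/ % #          NB. sum divided by tally
    mean 1 2 3 4            NB. 2.5
    NB. dyadic fork: x (f g h) y  <->  (x f y) g (x h y)
    span =: >. - <.         NB. larger-of minus smaller-of
    3 span 10               NB. 7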

00:10:16 [ML]

Yeah, yeah.

00:10:19 [CH]

And also, we need to be more explicit than that, because we have ambivalence, we have overloading: there's the monadic before and after and the dyadic before and after, which all correspond to four different combinators. And I think a lot of the times I was speaking, one, I was saying "right" when I meant "before," and then two, I was specifically referring to just the monadic one, not the dyadic one. And so a couple of the examples when Kai was talking about... I think, anyways, the point is we've got to be more explicit. I have to be more explicit, specifically. I just noticed that when I was editing it. But here's where things get interesting. While I was at Minnowbrook, a few people were talking about undenoted notation and whether that was a bad thing. I'd heard this from Henry Rich before, when he refers to some of these, you know, patterns as invisible modifiers. The composition is invisible, and it's just there because you're juxtaposing things together. And I was like, what did Henry actually call this? Because a couple people were referring to this as undenoted. And I was like, let me go look up what, you know, Henry called it. And in Googling, or on the J software site, I googled invisible modifiers and came across this page called modifier trains. And it's just a table with some color highlighting. And I stared at this table for a little bit, and I can't remember if it took five minutes or ten minutes, but I realized that this is a whole new set of forks, basically. What J refers to as the forks that I just explained, the ones that exist in APLs as monadic and dyadic forks, those are known as verb trains, because they consist of only verbs, only functions, and the arity of those functions determines the pattern. But in J, they also have something called modifier trains. And I think actually it was just a couple of episodes ago, it might have been you, Marshall, or it might have been you, Bob, who pointed out that modifier is actually the umbrella term for both adverbs and conjunctions. In APLs and BQN, we refer to those as operators and modifiers. And so J is the only language that actually has an explicit name for the monadic modifier and the dyadic modifier, which correspond to adverb and conjunction. And it's important to know this, because on the table they have letters: V for verb, A for adverb, C for conjunction, and N for noun. So in APL-speak, you can think of verb as function, adverb as monadic operator, conjunction as dyadic operator, and noun as array. And similar conversion for BQN. And at this point probably everyone's lost, but the point is that based on these four letters, depending on how you juxtapose them in a three-train, you get a modifier returned that can take functions as arguments. Which basically means, like the forks that we knew before, which J calls verb trains, I basically in my head call those combinators, and this set of modifier trains, aka adverb and conjunction trains, aka operator trains, are three things that when juxtaposed return you a modifier, an operator. And I think of those as higher-order combinators. They're combinators that basically follow the exact same pattern in terms of monadic and dyadic, but you can put operators in there, and then instead of forming a function that takes arrays, you form an operator that takes functions.
And this is just mind-blowing. And the last thing I'll say, 'cause Adám's got a comment, is I'm gonna give a talk at some point called the Holy Grail, the Holy Grail of Combinatory Logic. And that is, there's two of 'em. There's the triple conjunction. We're talking conjunction, conjunction, conjunction, folks. Me and Peter Mikkelsen, who's one of the C devs working on the APL interpreter at Dyalog, we spent probably two or three hours, can't remember if it was Friday night or Saturday night. You know, there might have been some drinks involved, so we might have been not at the top of our mental acuity, but we were trying to figure out: is there even a use for this? And also, hopefully everyone here can just suspend whether this is actually a good idea or useful. I'm not really interested. (laughs) I mean, I am. But whether or not this is a useful language feature is less interesting to me than the fact that this actually exists. And so what we were trying to do was figure out a triple conjunction that's actually useful. And the two outer conjunctions that I wanted were atop and over, [05] which correspond to the B1 combinator and the Psi combinator. And the reason I really wanted this one is because they're basically inverses. Atop applies a binary function first, followed by a unary function, whereas over applies a unary function, the same one, to two arguments, and then the binary one. So if you can find a triple conjunction where the two outer conjunctions on the left and right tines are atop and over, not only is that amazing, but you require both the monadic and dyadic definitions of each of the functions that you're passing as arguments. We spent, like I said, two to three hours trying to come up with something. We found ones that produced a result, but were not meaningful. And anyways, if you're following up to this point... I can see Adám smiling. I can see the gears in the rest of the panelists', you know, heads turning. You know, I don't actually think one exists. I also spent like half of the car ride with Morten and Gitte and Peter back to Toronto thinking about this. Like, I was so tired, but I couldn't go to sleep 'cause all I could think about was the holy grail of combinatory logic programming. And there's another one, the conjunction-verb-conjunction, which I think would be easier. But, you know, anyways, Adám, over to you. I'm gonna stop talking now. This is a whole new world of combinators. I haven't been this excited, I mean, probably since I discovered Uiua, actually, which was like a week ago. But, (laughs) anyways, Adám, what were you gonna say earlier?
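
For reference, atop and over themselves, sketched in J (these are the at-colon and ampersand-colon conjunctions Conor mentions later):

    3 (*: @: +) 4      NB. atop: square of the sum: *: (3 + 4)  ->  49
    3 (+ &: *:) 4      NB. over: sum of the squares: (*: 3) + (*: 4)  ->  25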

00:16:22 [AB]

You were saying it's three things next to each other. In fact, about a third of the table on the J wiki actually only has two things next to each other. Just like you can have verb, verb, or function, function, whatever, to make a two-train, so too can you have two other things to make this invisible modifier train thing. But mostly I'm just surprised. Henry Rich spoke about this, about these things that were taken out of J and then recently put back in. And now you come and say, "Oh wow, look at this. The table was there all along."

00:16:59 [CH]

That is the beauty of the continuous learning path: sometimes you hear about something and it sounds cool, you might be a bit confused, but it kind of bounces off of you. And I remember Henry talking about that. [06] I mean, the first time when we had Henry on, he actually just talked about how it was there and then got taken out. And I was like, "Oh man, that sounds awesome." We talked to Henry a year later and he's like, "Guess what? We put it back in, baby." And then I was like, "Oh my goodness, I've got to go check this out." And I think I did check it out, but I do not remember stumbling across this table. I think I remember stumbling across some other documentation. And if you scroll down past the table, and we will link to this in the show notes if you want to pause the podcast, the visual descriptions and the text descriptions of this stuff mean absolutely nothing to me. Like, I am not smart enough to go, "Oh, that makes sense." I think when I looked into it, I was like, "Hmm, I'm going to need to spend some more time to really digest and understand what's happening here." I didn't take that time, which is like the art of learning stuff: sometimes you look into something and you bounce off of it. Go ahead, Bob.

00:18:04 [BT]

I was late getting onto this recording because I was reading a blog post about Seymour Papert. Seymour Papert was really important in a lot of learning education in the 1990s and wrote Mindstorms and a bunch of other things. But the thing that this blog talked about was that Seymour Papert talked a lot about people being ready to learn. And what that means to him was: most people, if they're given a formula, say Newton's formula F equals ma, you know, that's how you're taught, maybe, in physics. They say this is how these things relate: force equals mass times acceleration. But how most physicists have learned about it isn't by that formula. It's by playing around with bicycle wheels or rolling things or hitting things or lifting rocks or anything like that. They have a tacit, and it actually is used as tacit knowledge, a feeling of how this works, and then they express that tacit knowledge into the formula, and now the formula expresses what they know tacitly. And I think this is actually really important for this episode as well, because a lot of people might not be ready to look at things this way, and that's okay. And the way that Seymour Papert suggested is: get in and start working with some of these things, really simple things, and just get a feeling of how they go together. And as you do that, you'll start to build up a mental model, and then these tacit expressions will make a lot more sense, because they're consistent. And we'll talk about that later, because there's been a lot of discussion in the last week about how consistent they are. They're not as consistent as you'd like, but they could be. Anyway, Marshall, did you want to say something?

00:19:43 [ML]

Yeah. So then the way you express your tacit tacit knowledge is by not writing anything.

00:19:51 [CH]

There's our cold open, folks. Woo! (laughing) Well done, Marshall. Well, what I'll interrupt to say is: definitely revisit. 'Cause I know, the same way that I watched that Point-Free or Die talk and didn't understand stuff, but I started noodling on it and then got it two months later: if you're listening to this and you're already confused, which I guarantee a percentage or proportion, if not most, of our listeners are, come back and listen to this episode a couple months later.

00:20:20 [BT]

And I'm just quickly gonna do a roadmap. [07] If you go to the NuVoc site on the wiki, NuVoc is the new vocabulary, it's sort of the reference for the J language. Down towards the bottom of it, there's forks and then modifier trains. Or, I can't remember what they call invisible modifiers, 'cause this term is actually up for discussion as well. But when you go to the modifier trains page, at the bottom of that there's another link, and in that link it gets into a lot more detail. Be careful before you go down there if you're not ready for it, because it'll be quite confusing. And part of the reason it'll be quite confusing is, if you go to that page and you go to the discussion on that page, there's a really good discussion about a suggestion of actually changing, not so much how these are constructed, but how we talk about them, because there's a way to do this that makes them simpler, but we're using the more complicated version, and it's just a difference of parenthesizing. And if you go to the discussion page on the wiki, you'll see somebody suggesting a way to look at these by parenthesizing that makes the whole thing much clearer. And in fact, I think it was done by Cameron Chandok, and he did a great job on it, 'cause there were things he pointed out that I didn't realize a week ago. A lot of things came clear to me when I looked at how he's proposed things. But we might get into that later. Otherwise, it can be really confusing, and it's admittedly confusing. But one of the things that's really good about that discussion page is that it does clear up a way of thinking about it that makes it much clearer, and I think maybe the way going forward with these to make them easier to use.

00:22:14 [CH]

All right, back to you, Adám.

[AB]

I wanted to help the listeners a little bit by giving some examples. Let's say I want to name the each operator, or adverb, or 1-modifier. I can say: each gets the symbol for it. Occasionally, I want to apply a function not to the elements of an array, but to the elements of the elements of an array. So I might want to do each each. So it isn't really fundamentally more complicated than being able to say each-each gets assigned, and then the each glyph twice. That is a combinator train. That is an adverb adverb.
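
A sketch of Adám's example in J, where, as Marshall notes next, this adverb-adverb form was never removed (the names each and eacheach are just for illustration; each is defined this way in J's standard library):

    each =: &.>                NB. J's each: apply a verb inside each box
    eacheach =: each each      NB. an adverb-adverb train, deriving a new adverb
    >: eacheach (<1;2) , <3;4  NB. increments two levels deep: (<2;3) , <4;5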

00:22:57 [ML]

Well, and that form actually was never taken out of J. So that was the one combinator, or the one adverb-modifier combination, that you could do for a while.

00:23:09 [CH]

That survived the purge, yeah.

00:23:11 [AB]

I think that has something to do with it. That one is very easy to understand, and that one was the one that was kept. I'm bringing that one up. I know that it was not one that was removed, but I think it's very easy to conceptually understand what's going on here. And that derives a new adverb, or 1-modifier, or monadic operator, which just does an each-each. It's not really that difficult. And also, I'm going to blow your mind, Conor.

00:23:40 [CH]

Ohh I'm so ready.

00:23:41 [AB]

You've been using Dyalog APL for a while. And here's the thing. Dyalog APL also has modifier trains.

00:23:51 [CH]

Does it actually?

00:23:52 [AB]

Well, it has a very small subset of them.

00:23:54 [CH]

Does it have? Does it have the triple conjunction and the conjunction verb conjunction?

00:23:59 [AB]

No

00:24:00 [CH]

Oh. Because that's the thing. I don't actually want to spell the... And henceforth, we shall refer to CCC as the triple conjunction and CVC as the Oreo conjunction, because it's got two conjunctions on the sides and the verb in the middle. Technically there's a couple other Oreo conjunctions, but the Oreo conjunction we care about, because what we want is atop and over, and in J that's the at-colon and ampersand-colon, which is not as beautiful as the hoof and the paw, [08] aka the small circle double dot and the big circle. And imagine the beauty, the beauty! If you can get the hoof and the paw on either side of either a verb or a conjunction, it's the holy grail, folks. Start thinking about it. If you find one... I've spent hours at this point. One, I can't even find one, and two, if I find one, I'm pretty sure it's not gonna be useful. And I found ones that work, but not ones that you could argue you would ever, you know... anyways. Which ones does Dyalog APL have?

00:24:59 [AB]

So the ones that are there have these short codes in the J wiki table called CN and CV. So that's conjunction-noun and conjunction-verb, which in APL terms is a dyadic operator followed by a function, or followed by an array. So for example, you can do this in Dyalog APL: you write the word twice, and then the assignment arrow, and then the power operator, and the number two. So notice here, the entity is a dyadic operator followed by something. And that results in a monadic operator. And it is a monadic operator that takes an operand and applies it twice.
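
Adám's example is Dyalog APL (twice ← ⍣2). The same CN shape in J, as a sketch:

    twice =: ^:2    NB. a conjunction bound to a noun derives an adverb
    >: twice 5      NB. increment, applied twice  ->  7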

00:25:40 [ML]

And in BQN, if you try this you get a syntax error. It will tell you missing operand.

00:25:46 [CH]

Ah, listen, Marshall, you already have the number one: my number one favorite language.

00:25:51 [ML]

Well, I'm... I'm sure I can fix that.

00:25:53 [CH]

To increase your lead here, you know. And actually, I take that back. Let's give it a couple months.

00:26:00 [ML]

Yeah, I was wondering about that.

00:26:07 [CH]

You know, I said, I've said, we've suspended, you know, whether this is actually a good idea. I think it's amazing. I think it's beautiful. That being said, neither Peter nor I have actually come up with a useful example. I think actually Peter did mention like a triple each. He's like, if you wanted to do an each, each, each, you could assign that to a name. And I think that's even what the docs say: a lot of the time you want to name these patterns, because if you do them inline, it actually is the exact same length or maybe even a character longer, and just makes it harder to parse. But if you're naming these patterns and then you add it to your vocabulary, it might be more useful. But I'm looking for the inline triple conjunction, the holy grail, folks. We know it's out there. We're going to find it.

00:26:46 [ML]

So actually, one thing I did do in J a lot was partial conjunction application, which is what Adám was talking about. He says if you do it inline, it just applies directly, so it's useless. But what I would do is actually write a function, and I'd often write under open, which is each. And then I'd write parentheses, semicolon, at-colon. So raze atop. And that means after you do this function on the left, then you're going to raze the results: turn it from a list of boxed lists into one combined list, or join those together. And what that lets me do is, instead of writing this function under open, and then putting that in parentheses, and then at the left having to write the raze atop, it means the two parts, this each and the raze atop, come together. And I think actually what I'd do a lot is write not under open, but just... no, I'm not sure. But I would definitely have something that would work with the boxes. And then...

00:27:58 [CH]

Wow. Marshall confused. You know, it's confusing when even Marshall is at a loss.

00:28:06 [ML]

I know what all the functions do; I don't know which ones I used, 'cause I haven't written J in a while. But yeah, I would do something to create the boxes and then something at the end to combine those boxes together. And I didn't want those two parts separated, because then it wouldn't be as clear what I meant to do, but putting them together makes it clear that it's kind of one super conjunction. It's sort of a flat-mapping idea: I want to map this thing over all these different arguments, but I want the results all jammed together into one result list.
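
Marshall doesn't pin down the exact spelling he used, but one plausible J rendering of the idea (the name flatidx and the verb inside are just for illustration):

    NB. apply a verb inside each box, then raze the boxed results together
    flatidx =: ;@:(i.&.>)     NB. raze atop (iota each): a flat map
    flatidx 2;3;1             NB. 0 1 0 1 2 0

The partial-application form Marshall describes would put the two parts side by side, e.g. (i.&.>)(;@:), which derives the same verb.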

00:28:39 [CH]

I mean, that's literally, in functional languages, there's multiple names for it, but I think the most common is flatMap. It's exactly that. You've got a structured sequence of sequences or something. You're mapping some operation on each of the sub-sequences, but you don't care about that structure afterwards. You just want to flatten it. And if you don't have flatMap, you end up having to do that in two operations. And so basically what you're saying is like, this is a way to spell flatMap, because these two operations I can just put together instead of having to combine or do them after the operation, which makes it less explicit that what I'm trying to do is a flatMap kind of operation. Bob?

00:29:17 [BT]

I think you've actually struck on one of the uses of these tacit modifiers is that it makes you think almost like a meta level above what you would do. You're not so concerned about what you're putting in, but you're looking at the structure of how these operators are going in, or these, I guess, verbs, I suppose, in J, verbs or nouns, are going in and the structure that they're being put together into. So you're no longer focusing on what it's doing, but maybe more how it's doing it. And in that way...

00:29:48 [ML]

Oh, I mean, I hope not. I think it's by paying a little attention to how you're doing it, you can more easily write what it's doing. And then, you know, after you've done that a few times, you stop thinking about, "Oh, this is this special way to write it." And you just think about what it's doing and you think about, and then you write down whatever it is.

00:30:10 [BT]

But do you think you'd get to that next level as easily with—

00:30:13 [ML]

I think I was there. So I was just thinking, "Okay, I want to do this flat map thing. So I'll write this combination and then I'll write parenthesis, raze atop, after it."

00:30:23 [BT]

But do you think you get there as easily without thinking about the tacit stuff?

00:30:28 [ML]

Well, you wouldn't write it in the same way, and then you'd have to. I mean, if you don't have that way of writing, it's not as easy to conceptually combine these two operations together. So the tacit combinations are the way you get there, but it's about more clearly expressing what you are thinking and thinking things that are more clear.

00:30:49 [CH]

Yeah. I mean, that is why flatMap is such a common operation in these kind of pipeline-esque libraries. Adám, yeah, you were going to say something earlier.

00:30:57 [AB]

Yeah, so I mentioned this thing about twice. You know, there's a monadic operator, a 1-modifier in BQN, that does the inverse. There's one that does cells. We don't have those in APL, but I can easily define them. I can define cells as just rank negative one. I can define inverse as the power operator with negative one. I could define a limit operator, that applies a function until nothing more changes, as power match. And an issue I'm sure you hit at some point is: you have some dyadic operator, and the right operand is an array, and then it clashes with the right argument, forming a strand when you didn't want to. So people tend to either put a right tack in between them to break them up, or they could parenthesize. So I could write, for example, f rank 2, and put that in parentheses, the f rank 2, and then that doesn't clash with the right argument. It could actually also happen on the left argument, if your left operand is an array. It's rare, but it could happen. Now, most people that I've seen do this with parentheses, they will parenthesize the entire derived function.

00:32:10 [CH]

Yeah, I was just thinking that. But if you parenthesize it differently, you're accidentally using this.

00:32:15 [AB]

Yeah, you're actually using this. And that actually brings me... for a long time, I didn't even think about that. Yeah, okay, it's a quirk, you can do that, whatever. Or technically, that's kind of what happens: when it gets to that part, it first binds together the operator with the right operand, and then it binds the left operand to that. However, I have slowly been converging towards intentionally doing this: parenthesizing only the operator with its right operand. And why is that? Because that allows me to do a form of inline concatenative programming. Let's say I want to have a function with multiple modifications. Let's say I want to do f rank one each. Or f rank one power operator two. Now there is a mismatch in the structure. If I write open paren, f, rank 1, close paren, power operator 2, maybe even parenthesizing that whole thing, I have to go from the inside and dig my way out, or outside in, with the parentheses. But if I write f, open paren, rank 1, close paren, open paren, power 2, close paren, then I can keep adding more segments to the right: dyadic operator, right operand, dyadic operator, right operand. Maybe throw some monadic operators in there too. And one place where this comes up, it's maybe a little bit ugly, but we have these system functions in Dyalog APL, and some of them have additional options that you can set using a very special operator, the variant operator, which has both a glyph and a system name, but that doesn't really matter so much. You can either specify all your options as name-value pairs, or you can do one at a time. So you'd write system function, variant operator, name of the option, value for the option, then another variant operator, and then another name of an option and a value of an option, and another, and so on. But eventually you run into the problem that you need to apply this to an argument, and you have this stranding problem. So we tend to need to parenthesize, or give it a right tack, or something like that. But if you parenthesize instead the variant operator and the name-value pair on its own, then you have these building blocks, like LEGO bricks. You just piece them together. And that means you can take these modifications to the original system functions and move them around without worrying about breaking any parenthesization or causing any stranding issues. Each one is a standalone unit, a modifier.
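
Adám's examples are Dyalog-specific, but the shape carries over. A sketch in J of the "operator plus right operand as a standalone unit" idea:

    f =: +/
    g =: f ("1) (^:2)    NB. each (conjunction, operand) pair parenthesized alone
    g i. 2 3             NB. same as ((f"1)^:2) i. 2 3  ->  15

Each parenthesized segment derives an adverb, so more segments can be appended on the right without re-nesting any parentheses.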

00:34:49 [CH]

Yeah. And so, okay, two things I want to comment, because that makes a lot of sense. One: Peter, once again, coming up with the actual useful cases. I was purely focused on the triple conjunction, because I immediately was like, that's the holy grail. But while I was doing that, Peter, who works alongside Adám, found the modifier train that you could use to spell the shortcuts that exist in BQN. And he was like, well, I guess, you know, if you wanted to name like a rank-one operation, you could do that. And I was like, oh yeah, that's a good point. And then I immediately said, that's uninteresting though, because BQN already has those shortcuts. Back to the triple conjunction. So that was the first thing I wanted to say. The second thing, and I actually glazed over this and meant to mention it, is the reason, because you mentioned that this had come up, and then it's been, I don't know, a year, a couple years until I really discovered it and wrapped my brain around it. And it is because this table is so intuitive, in my opinion. And that's because at the very top, the first three lines of the three-trains table... the first one is noun, verb, noun. And that's just the application of a dyadic verb or function. Everyone knows how to do that if they've at least played around with APL or BQN or J. The next two say, you know, this creates a fork, and note this is color-coded, so pink is the top line, and then the next two lines are purple. It says it creates a fork. One is verb, verb, verb, which immediately I was like, yes, of course, that's both the monadic and dyadic fork. And then the next one is noun, verb, verb, and I've always taken issue with the fact that we don't really have a name for this. I think I asked you, Adám, once, and you said that we refer to that as the capital "A g h" fork. Like, I don't actually think it has a name. It's a fork where the left tine is not a function, but an array or a noun. And the fact that that doesn't have a name is slightly irritating. But the point is, these, the monadic and dyadic fork, and also the monadic and dyadic versions where the left tine is a noun, correspond once again to four different combinators. And then you can do the same thing, I guess, not with the two-trains, but we'll stick to three-trains here. But then after those three lines, which are pink and purple, it then switches to color coding: green for forming a conjunction and blue for forming an adverb. And at first I actually thought that the reason, and I could be wrong about this, hopefully, you know, Adám or Marshall or anyone here can correct me if I'm wrong, I thought that the reason they called it a verb train is because it's made up of verbs, and the reason they call it a modifier train is because it's made up of adverbs and conjunctions. But then I realized, wait, you can also include verbs in your modifier trains. And so then I realized, or I think this is what they did, is that a verb train is called a verb train because the train forms a verb. What is returned to you by that pattern is a function. And then I can see Bob shaking his head. And the reason that a modifier train is called a modifier train is because it returns a modifier. Bob's about to correct me, I think.

00:37:44 [BT]

Well, in a way, because actually if you think about it, if you... well, a verb train is composed of verbs or nouns. There's no modifiers in a verb train. So that's key.

00:37:57 [ML]

Well, if there are, they're immediately applied to form verbs.

00:38:02 [BT]

So yeah, and this is because this is somewhat recursive. If you say a thing is a verb and you put it in parentheses, you can do anything you want to create that verb, and it's a verb. So you could have conjunctions or adverbs mixed in there.

00:38:18 [CH]

I mean, the most famous three-train of all (average) technically includes the reduce operator, but that is bound immediately with plus [Bob agrees], which forms a verb, and then you just have "verb, verb, verb" again.

00:38:30 [ML]

Yeah, so the components of the train are three verbs.

00:38:33 [BT]

The components of that train at that level are three verbs. But what happens is, verbs are parsed right to left; so you move along right to left, just as you would expect. You know, if they're in a train, of course, that train has a different format. You mentioned it before: you've got the outside tines applied either monadically or dyadically; the center tine [is] always dyadic. So that's right to left. But modifiers are parsed left to right. This is where it gets very complicated, and this is where one of the simplifications that Cameron Chandok has come up with [helps]. [It] is: if you initially go in and say, "You're not allowed to have more than two verbs or nouns together in this modifier train" (now, the other spaces could be adverbs or conjunctions)... If they're all verbs, or just verbs and nouns, then they're a verb train, and it's just going to go right to left, and you get that thing that a lot of people hate with forks: that if it's an even number in a train, it's a hook, and if it's an odd number, it's a fork. That is what happens when you actually end up with verb trains. But if you have to also include the left-to-right part of it, what he's suggesting is that the easiest way to make sense of it is to go in, and whenever you've got more than two verbs or nouns together, parenthesize those, so you're sort of isolating them. Then, when you've got all those taken care of, go back and work your way from left to right, taking three items at a time. And that will consistently create what you want to create. If you don't do that, there are some very complicated rules about greedy algorithms [chuckles] grabbing things at different times, depending on the even and odd and the components involved. So his suggestion is: we can leave those in there, [but] just let's not talk about those. [Instead] tell people to go this other route, where if you've got more than two verbs or nouns juxtaposed, parenthesize them, and then go back and work left to right. And that makes everything a lot easier to understand. That is because of the difference between a fork (which is a verb train) and a modifier train (a modifier train will always have some kind of a modifier in it at that level). Does that make sense?
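
Bob's even/odd point about verb trains, illustrated in J:

    (+/ % #) 1 2 3 4    NB. odd length: a fork: (+/ y) % (# y)  ->  2.5
    (+ *:) 3            NB. even length: a hook: 3 + (*: 3)  ->  12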

00:41:11 [CH]

That makes total sense, but does that mean that what I was saying is incorrect? Because I think they technically [can] both mutually work together. I guess the simplified version of what I was saying is that it's not about the internals of what you spell, even though it'll always have a modifier. The key thing is that a verb train produces a verb, whereas a modifier train produces a modifier. Even if that wasn't what caused the naming, is that consistently true? Or is there an example of something where that is not the case: a verb train forming something other than a verb, or a modifier train forming something other than a modifier?

00:41:54 [BT]

Well, I guess the part of this that is a little different is that some of these groups of threes are not trains. So for instance, with that column that you're referring to where it says whether it's a verb or a modifier, for "noun, verb, noun", that's not a train. That's because it's executed right away, right?

00:42:18 [CH]

Yes, but I mean I ... [sentence left incomplete].

00:42:20 [AB]

Why would we need to call it differently though? I don't understand. Why isn't a normal executable expression just a noun train? It's a sequence of tokens that result in a noun.

00:42:31 [ML]

Well what I do is just use the word "expression". I mean this is in terminology. So first, what you said about the type of expression being named after the result type, I think that's entirely correct. In BQN, we have a broader class, because not everything's an array. We call the things that functions might apply to "subjects". So you have a subject expression [which] is your sort of normal evaluation. And then you have a function expression, which might return a function. And I guess there are modifier expressions but in BQN, all they can really be is one thing. Well, they can have assignments, is what else they can have. So you can have "modifier, gets, jot", and that whole thing is a modifier expression. It's not terribly interesting.

00:43:28 [AB]

Can the function return an operator or a modifier in BQN?

00:43:35 [ML]

Well, yes, so in BQN, that refers purely to what the syntax is. A function expression is an expression which is syntactically a function. And then what it actually does at runtime, it could do all sorts of things. It could, on one invocation, return an array. And on another, it could return a namespace or a modifier or whatever. So the name comes from the syntactic role. In APL and J, the syntactic role is always the same as the type: they're kind of unified.

00:44:15 [BT]

And actually, Conor, thinking about it, the way you phrased it the second time, now I get you. Yes, you're right. A modifier train will always produce either an adverb or a conjunction. And a verb train produces a verb.

00:44:28 [CH]

Yeah. I mean, it's similar to the confusion initially... not the confusion, but the misnomer (which I think I've mentioned on this podcast, but I've definitely mentioned in a couple talks) of monadic and dyadic forks, [which] technically are not either of those things. Both forks are always triadic. They always take three functions. And the monadic- and dyadic-ness is referring to the arity of the function returned. This is one of the points that I made in my Minnowbrook talk: that we don't have a vocabulary for talking about the arity of the function that a combinator returns, or of the functions it takes as arguments. When we say "arity", [09] we're only referring to the number of arguments our top-level function takes. But in combinator-land, your function takes only functions as arguments (except now we have modifier trains, where they can be operators and stuff, and there's exceptions for the tacks and whatnot). We've got the arity of our functions, which is important, because it's the arity of "unary, binary, unary" that forms a monadic fork. So we technically have a triadic fork, and the arities of our arguments are "one, two, and one", and then that returns you a monadic fork. Which technically is actually a simplification, because I made this point a couple times, and then people said, "Well, actually...", and very quickly I was corrected when I showed the difference between explicit and tacit code. I showed "plus, arrow, brace, alpha, plus, omega, end brace". And then I said: "this is explicit". Then I had "plus, arrow, plus", and I was like: "this is tacit". Then like four people shot their hands up: "yeah, but plus is a bad name". Because technically that's got two meanings. It can be called in the dyadic case (which is plus), but then in the monadic case, it's [complex] conjugate or whatever it is. So technically, all these times when I'm talking about a monadic fork (this is the arity), well, there's never ever just a unary function. It's got a unary definition and a binary definition. And this brings us to the second point, which has to do with ambivalence. But I'll let... Adám, you've got your hand up here.
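
The explicit-versus-tacit contrast Conor describes, written out in J rather than APL (a sketch; meanE and meanT are made-up names, and the {{ }} direct definition needs J 9.x):

    meanE =: {{ (+/ y) % # y }}    NB. explicit: the argument y is named in the body
    meanT =: +/ % #                NB. tacit: no argument names anywhere
    (meanE 1 2 3) , meanT 1 2 3    NB. 2 2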

00:46:35 [AB]

Well, I think I have a vocabulary for this. The way I speak about it is this thing (for example, circle diaeresis) is a dyadic operator deriving an ambivalent function.

00:46:53 [CH]

"Deriving", that's kind of nice.

00:46:54 [AB]

Yeah, then I'm splitting up what it itself takes and what it derives. For example, the backslash ["\"] is a monadic operator (when used as an operator, whatever) deriving a monadic function, strictly. "+\" you cannot use dyadically in Dyalog APL. Whereas forward slash ["/"] is a monadic operator deriving an ambivalent function. And sometimes it can only derive a dyadic function.

00:47:22 [CH]

And this, I mean, that is nice, but "deriving", at first it sounded nice, and then three seconds later, I realized that's just a different word for "returning". [Adám objects]. So technically, you could just say ... [sentence left incomplete]

00:47:32 [AB]

No, it doesn't return. That's not true.

00:47:35 [ML]

Deriving never does computation.

00:47:37 [AB]

And I think that's kind of the distinction between a train and non-train in the J wiki vocabulary, too. There is the construction phase, where things are just being bound together, but no code is actually running; then we call it a train. Then there is something where there are actually data that's being modified by functions, whether derived or not. Then it's not a train; we can call it an expression or whatever. So there's a big distinction there. When I say "+/", nothing happened. When I say "+/..." [sentence left incomplete]

00:48:10 [CH]

You've formed a function, though, haven't you? That's what I mean. When you return a function, you are forming that function, no?

00:48:16 [AB]

But it didn't ... [sentence left incomplete]. Right. So you could call it that. But there's no actual code execution happening. It's only binding; we're only gluing stuff together without ever applying anything anywhere. We're not, you can't trace through the formation. Nothing happened.
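
The distinction Adám is drawing, in J terms (a sketch):

    sum =: +/      NB. derivation only: the adverb binds to plus; nothing executes
    sum 1 2 3      NB. application: only now does any computation happen  ->  6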

00:48:30 [ML]

Yeah, well, and this is a difference. So what BQN and J have is you can have a modifier that does computation as soon as its operands are passed in. In APL, you can't; applying a modifier never does anything (it always just forms this derived function).

00:48:48 [AB]

I never understood that in J, or BQN for that sake. How does that work? How do you know whether or not to start running code or not?

00:48:56 [ML]

Well, it just depends on what the modifier is.

00:48:58 [AB]

Yeah, see? That I don't understand because then I can't ... [sentence left incomplete]. I have to look into what the operator is in order to know whether something will happen right now or not. I don't understand that. I want to be able to just look at the structure from the outside.

00:49:15 [ML]

Well, yeah, so the way that I think of this is that all operators ... [sentence left incomplete]. Well, I mean, I could define a block modifier that all it does is it takes its operands and it passes them to some other modifier, which derives a modifier. The result is the same as that other modifier. They both give you a derived modifier. So, I think of this as just a modifier is always like a function and that it does something. But [with] some modifiers, the thing that they do is just bind the operands and, pass that as a result.
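
A sketch of Marshall's point in J, using an explicit conjunction that runs code the moment its operands are bound (the name c is arbitrary):

    c =: 2 : 0
    echo 'operands bound'    NB. executes at binding time, not application time
    u @: v                   NB. the derived verb that is the result
    )
    g =: +/ c >:    NB. prints 'operands bound' here, before any argument exists
    g 1 2 3         NB. applies +/ @: >:  ->  9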

00:49:59 [CH]

Yeah. I mean, coming from functional languages, I think returning a function from a function, people don't think of as execution, but it is (like what Marshall said): you are doing something. Like, there's a code path that your compiler/interpreter takes that is going to form something that will perform "an evaluation". But really we're splitting hairs now, because there's application (which is, you know, one code path), but the forming of something to be applied is just another code path that, you know, is less frequently taken. But Stephen, you had your hand up earlier. What were you going to say?

00:50:32 [ST]

I just wanted to chime in with Adám. It was, was it 2017 or 2018, [when] there was a fair amount of confusion in the vocabulary used for describing the syntax of q. And we settled on pretty much the same formulation that Adám was just describing. The few iterators (we don't have such a rich vocabulary of operators as APL) are operators that are monadic (that's to say, unary: they take one argument, postfix notation), and they derive a function that you can assign. But like Marshall says, no actual computation happens at that point.

00:51:15 [ML]

Yeah, so q is kind of more APL-like in that it doesn't have these modifiers that can do things. And I mean, q is (from a tacit perspective), or from the maybe modifier-level programming perspective (which is not necessarily good) is very limited in that the only things that can have modifier syntax are these built-in modifiers. Clearly, if you go through the set of modifiers and you say: "all any of the adverbs do is to form a derived function", then you know, well, there can't be any other modifiers that do anything because you can't define more modifiers than that.

00:51:55 [CH]

Yeah. All right. So now that q has been brought up and we mentioned ambivalence earlier (and I realized we're already closing in on the hour mark) ... [sentence left incomplete]. All right. Bob's got one last comment before we pivot.

00:52:07 [BT]

Just one point of vocabulary: a "verb train" is a fork, but forks are not what these modifier trains are. So these groupings are trains, and if you have three verbs or a "noun verb verb" combination, then you have a fork. And in fact, you can extend that, so you can have forks of different lengths, but not all trains are forks.

00:52:32 [CH]

Right, right. That's a great. I'm glad you brought that up before we slightly pivot (we're still going to be talking about tacit programming). That's part of the reason folks why the Holy Grail of combinatory logic and combinators is the Oreo conjunction and the triple conjunction. Because those are forks. The triple conjunction is three conjunctions where the two ... [sentence left incomplete]. Well, Bob's shaking his head.

00:52:57 [BT]

No, it's not. They're not forks. They are trains. They're modifier trains.

00:53:02 [CH]

Let me clarify [chuckles]. The pattern that a fork corresponds to (the monadic fork and the dyadic fork correspond to), is the Phi and the Phi-one combinator, which the parenthesization of those things are applying the two outside tines, either to one argument or two arguments (in the modifier train case, these will be functions; in the verb train case, these will be arrays or nouns). And then taking the result of that and using those as arguments or operands to the thing in the middle. This is the same pattern that the Oreo conjunction and the triple conjunction form. Maybe we shouldn't refer to those as forks because in J-speak or SharpAPL-speak, they aren't forks in the classical sense, but in the sense that what the monadic and dyadic forks correspond to in combinatory logic, that composition pattern is the same thing that the Oreo conjunction and the triple conjunction (aka CVC and CCC) do follow that same pattern. That was what I was trying to say, if that makes sense.

00:54:01 [BT]

Yeah, the pattern's the same, but the reason it's distinct is because of the parsing. A fork parses right to left and modifier trains go left to right.

00:54:13 [CH]

Right, OK, yeah, that distinction does make sense.

00:54:16 [BT]

It does make a difference, yeah [chuckles].

00:54:17 [AB]

So there's been this proposal, and an implementation, by some people. Some people are upset about the three "function, function, function" forks, saying that it just complicates the syntax and so on, and saying that you could actually achieve this using two dyadic operators. You would write "function, operator, function, operator, function", and the two operators together would combine to create the same pattern, the same overall combinator, as what we know as the classic forks. However, here's the big distinction. For example, the KAP language, [10] which is quite APL-y but uses k-style (q-style) trains (so "function, function, function" is just atops, no forks), does provide a syntax so that you write "function, some other symbol, function, some other symbol, function" to get this pattern, the fork pattern. There are two different ways to look at that. If they are operators, then if you write a long sequence (a long train) using these operators, then it will group in threes from the left. So the leftmost three become a derived function, and that becomes the leftmost tine in the next fork, going on. If you make them [a] special syntax, so that they're just some glue, then you can choose, and you can choose to make them group from the right, just like APL and J do. In principle, APL and J could have chosen to make forks bind from the left instead of from the right. [It] probably doesn't feel very natural to you, now that you're used to it, but it doesn't contradict anything. If you see the fork as just another operator (but a triadic, invisible operator), then why should it be different from all the other operators? Why should it group from the right instead of grouping from the left, like all the other operators?

00:56:33 [CH]

I mean, it's a good question and something that could be explored in a different array language. And actually, it's a good segue.

00:56:42 [AB]

No, no, you don't need it in a different array language, because in APL, for example, today, you can define such operators that do exactly this. In fact, you don't even need to define them. They're in the dfns workspace already. And if you just get them from the dfns workspace, you can write your forks using these explicit operators. They become kind of explicit forks. And they will behave in every way like any fork you're used to, except they will bind from the left instead of binding from the right.
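
One way to model the fork with two explicit conjunctions in J, just to make the idea concrete (lt and rt are made-up names; Dyalog's dfns-workspace operators are defined differently):

    lt =: 2 : 'u`v'                NB. capture left tine and centre as a gerund (a noun)
    rt =: 2 : '(m@.0) (m@.1) v'    NB. rebuild them into an ordinary fork with the right tine
    avg =: +/ lt % rt #            NB. groups from the left: ((+/ lt %) rt #)
    avg 1 2 3 4                    NB. 2.5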

00:57:07 [CH]

Maybe I'm confused then. Are you saying that basically (I mean, this isn't a library, but it's easier for my head to wrap around) if you were to introduce a glyph that basically is triadic and takes three things on the right, you're basically coming up with a new operator syntax that doesn't require parentheses? I'm confused, yeah.

00:57:28 [AB]

No, no, no, no, no. We stick to the old syntax. Operators only take two operands. But imagine we had "f a g b h".

00:57:42 [CH]

OK

00:57:43 [AB]

And a and b are dyadic operators. Then this is a valid derived function, right?

00:57:48 [CH]

Oh, and you're just saying, why not do that instead of ... [sentence left incomplete]

00:57:50 [AB]

If you define a and b such that "f a g b h" is f applied to the argument or arguments and h applied to the argument or arguments and their results given as arguments to g, then you have effectively defined the fork, but using good old-fashioned dyadic operator syntax. Nothing new added to the language. Before forks were added to Dyalog APL, it was possible to model this behavior using exactly these two operators, which is why they exist in the defns workspace. You can actually go and play with them. The only difference being the binding. So imagine we have the classic average, right? The sum divided by the tally. Let's say we don't want the average, we want the absolute average. So you write "absolute value of the sum divided by the tally". That all seems natural to you now, but imagine if forks were binding from the left, then we would take the absolute value, apply it to the argument, and we would take the reciprocal and apply it to the argument. And their results would be given as arguments to the sum. And then that would be applied to the result of tally.
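
A rough sketch of such a pair of operators in Dyalog APL, for the monadic case only (the names _a_ and _b_ are invented here; they are not necessarily what the defns workspace calls its operators):

    _a_ ← {(⍺⍺ ⍺) ⍵⍵ ⍵}    ⍝ f _a_ g: original argument comes in on the left, h's result on the right
    _b_ ← {⍵ ⍺⍺ (⍵⍵ ⍵)}    ⍝ (f _a_ g) _b_ h: feeds ⍵ and (h ⍵) to the derived function
    avg ← +/ _a_ ÷ _b_ ≢   ⍝ behaves like the fork (+/ ÷ ≢) on one argument
    avg 1 2 3 4            ⍝ 2.5

And because operators take their operands leftward, a longer chain such as f _a_ g _b_ h _a_ i _b_ j groups from the left, with the leftmost fork becoming the left tine of the next one: exactly the binding difference being described here.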

00:59:07 [AB]

Whoa! Start over again. First, we apply a tally, and that gives us a result. And on the result of tally, we apply this three fork, which is the "absolute-value, sum, reciprocal", which doesn't make any sense. But that's how it would bind. Following?

00:59:32 [CH]

No [chuckles]. I think I understand what you're saying. Basically there's an alternative way to spell forks and three-trains that involves dyadic operators, which actually exist in the defns workspace. And it wouldn't require any change to this kind of three-train parsing. You could do without the three-trains and just have dyadic operators and get what we have. But then the trade-off is that, where you're inlining a three-train needing parentheses for parsing ambiguity (or in the tacit case, where you don't need those parentheses and can just assign it to a function), you would now need two extra characters in the tacit case and zero extra characters in the inline case. But the trade-off is that you're gonna have two dyadic operators in between your three functions, if I'm following along.

01:00:24 [AB]

Yeah, that's correct. But the important part I'm pointing out here is the binding difference. So whenever you have a fork applied to the result of a function, you currently have to parenthesize (even if you're entirely tacit); you have to parenthesize the fork, right? You would write "(f g h)" and then some other function. If we had the operator binding rules for trains, then you wouldn't need parentheses in that case. You would just write "f g h function".

01:00:52 [CH]

Yes, I see what you mean now. I would definitely need to noodle on that some more to know what I ... [sentence left incomplete]. I mean two of my favorite pieces of code ever: one is the Kadane's solution in BQN,[11] and the other one recently was the solution to, "is a function monotonically increasing or decreasing?". Marshall, you're not on Twitter, so you won't have seen it, but it's "match, after, sort, logical-or, reverse-sort".

01:01:26 [ML]

I think I did see that. I check your Twitter through an alternate viewer.

01:01:29 [CH]

Oh yeah. And it's very symmetric. It actually came out of a blog (I don't have their name off the top of my head, but they sent me a blog via LinkedIn exploring Uiua), and that was the first or second problem they solved. And then I was like: "oh, we've got the sort primitives though in BQN", and then I solved it once. Then I was like: "wait a second, I can make this beautifully symmetric, where both of the matches are on the outside and then you've got the vertically symmetric before and after". Anyways, the point being, I've got some beautiful solutions to these problems I like. And thinking of spelling that Kadane's one, which uses a dyadic fork, I think it would decrease the beauty. Although, I don't know. You'd have to show me what those combinators [are]. Maybe if they were before-and-after vertically symmetric, and the way that you can compose those to get different behavior, maybe it would look nicer. But yeah, I'd have to think about it.
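
The original is BQN, but the idea renders in APL as a sketch like this (APL has no sort primitives, so grade-and-index stands in for sort and reverse-sort; Mono is just an illustrative name):

    Mono ← {(⍵≡⍵[⍋⍵]) ∨ ⍵≡⍵[⍒⍵]}   ⍝ matches its ascending sort, or matches its descending sort
    Mono 1 2 5    ⍝ 1
    Mono 3 1 2    ⍝ 0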

01:02:29 [ML]

One thing is, if you have modifiers that do computation, you could have modifiers that work this way, that build up a train for you. Or at least the way that's obvious to me to do it would be to have one modifier that you place at every position except the rightmost, and then one modifier at the right that finalizes it. And what that first modifier does is just keep kind of a list of all the things it's been applied to, building it up into a big list. And what the final one does is look at the list of all the functions and then build a train out of that.

01:03:11 [AB]

Or any other pattern. Then you could have that final modifier. So essentially this is like the gerunds in J, right?

01:03:18 [ML]

Yeah, pretty much.

01:03:19 [AB]

You just build up a list of verbs, and then you can finally take that list and do whatever you want with it: a combinator at the end that works with any number of operands and makes any structure whatsoever.

01:03:30 [ML]

Yeah, well, this is one reason why I don't really like this method so much: you can make nonsense things as easily as you can make something that has a solid theoretical basis like trains, which are based on kind of the idea of mapping over functions. Given powerful enough language features, you can make something like this, but it seems much better to have it built in.

01:03:55 [AB]

Well, in BQN it's very easy to get at this, right? You just write the literal list of functions using the list notation, and then finally you have something that applies it in whichever pattern you want.

01:04:02 [ML]

Yeah, and I mean, I've used that for some patterns, if I have something specific that I want to do. But as something that you use for general-purpose programming, it's kind of-- I mean, it's like you're building yourself a language. And if it's that important, it seems like it should be part of the language to start.

01:04:26 [BT]

Well, I was going to say, on the J forums he's known as Pepe. He's really, really informed, very good at this. In fact, he has his own version of J, I think he refers to it as JX, which has some things that he's called the wicked functions. And the wicked functions allow you to do things that J doesn't allow you to do anymore. But I refer back to a 2017 posting, [12] where he first talks about how you can create this verb that can create adverb modifiers and conjunction modifiers, which at that time you really couldn't do, because the modifier trains we're talking about hadn't been put back in yet. That's what Henry put back in, you know, just recently. But now that they're in, I'm really interested to look back and maybe play with this, because I was just looking at it last night and I'm like, he's doing all this just because there were no modifier trains, except for the ones that you mentioned that were still left in, which were adverb-adverb, and conjunction with noun or verb, which end up creating an adverb. So those ones were left in, but now the other ones are all back in. It'll be really interesting if I go in and play with the things he's suggesting, to see whether that's simplified. But what Pepe does is all tacit; he only works with tacit, and he uses the fact that things are tacit to be able to manipulate these things programmatically. And it's really amazing. Yes, I think I asked him at one point if he wanted to be on, and he said, depending on his schedule, he'd be willing to do it. We'll just have to figure out when.

01:06:18 [CH]

So yes, I think we can get him on; we'll just have to figure out when. I think for tacit number six, because, as we talked about, I remember that was in the interview with Kai, or he might have said it after the interview, that some of his favorite episodes were the tacit ones. And he's not the first person. So we always get scared of having these conversations, but we love to have them, clearly. I mean, I've been trying to pivot to the second question for like 30 minutes now, but we just can't successfully do it, because I think there's so much stuff to think about. Already everything Adám said, I'm going to have to go and play around with in my mind. But yeah, when we do tacit number six, we'll have to try and get Pepe on, because I'm sure talking to him will unlock a whole other set of ideas. Speaking of the pivot that I've been failing to make, we've blown past the hour mark, so I don't actually think we can discuss it in full. But at minimum I want to mention the thought that I had out of a discussion with Morten, the CTO of Dyalog, mentioned previously. And maybe we can get some brief comments and brief discussion, but obviously we're not going to double the length of this podcast because we're already currently over. And that is the question of ambivalence, you know, glyphs having both a monadic and a dyadic definition: is that a good idea? And there was a lot of discussion at Minnowbrook about that, because I was kind of pointing out, and the reason I started thinking about this was, I'd had the thought before: why does every language, J, BQN, APL, why did they all adopt the ambivalence of APL? Which was motivated by a limited set of characters: famously, some of the characters created had to be overstruck. You'd have to type one character, hit backspace, and then type another character to get the tally symbol. So they were constrained by the number of symbols they had, and so they had to make new symbols out of existing symbols. And that's what motivated the ambivalence: to get more out of the symbols.

01:08:08 [AB]

Are you sure that's the only thing that motivated it? I don't think so.

01:08:11 [ML]

I don't think so.

01:08:12 [CH]

All right, all right. Well, pause on whether that's the only reason; it's one of the reasons at least. And the discussion that I ended up having with Morten was: I love tacit programming. I think the most beautiful code, the epitome of elegance when done right, is tacit code. And Morten's not really a big fan. And I showed him that expression that was symmetric, solving the monotonically increasing or decreasing, and I was like, this has gotta be my second favorite piece of code after Kadane's. And he looked at it and he said to me, you know what, I think I've realized what it is about tacit programming that I don't like: when you combine the ambivalence with tacit, it becomes incredibly hard to read. And the reason he noticed that was because I was using the logical or and sort, or reverse-sort, which are the same symbol in BQN. [13] And so when he saw it, he immediately was like, wait, those are two different things. And then I said, oh, it's monadic or. And he's like, it's not monadic or, those are two different things. And then I said it for the next hour and it was irritating him, because, you know, or and sort are completely different things, so to call it monadic or, he was like, what are you talking about? And I was like, that's what it is, Morten. And so then he kind of had this realization that it's not tacit per se; it's tacit combined with ambivalence, where these things can have multiple definitions, that makes it a lot harder to read. Which then leads to the question: would a tacit array language potentially benefit if we got rid of the ambivalence? Which we might see a bit with Uiua. Anyways, Adám and Marshall both said that it wasn't the only motivation, so there's immediately some thoughts. I'll go to Adám first and then maybe Marshall afterwards.

01:09:50 [AB]

A couple of different things here. One is: is that really the only motivation? I think, with APL growing out of traditional mathematical notation, you immediately have this. In traditional mathematics you've got the same symbol used in different ways, all over the place.

01:10:08 [ML]

All over the place?

01:10:09 [AB]

Well, yes, because every child learns it so well. If you were to make subtraction and negation different symbols, that would be a very fragile choice in programming language design.

01:10:18 [CH]

I mean, Uiua did it. Can you think of any mainstream language that doesn't have subtraction and negation on the same symbol? I mean, Haskell does use the same symbol, but you have to add parentheses around the negation case a lot of the time to disambiguate, so it is an issue. And Uiua, well, it's not mainstream. So I actually chatted with a bunch of folks at Minnowbrook, and I think it was Roy Sykes who literally took his drink and walked away when I was proposing (I'm not saying this is a good idea, I'm saying it's just an unexplored space): what if we got rid of ambivalence? And he was like, what about minus and negate? And I was like, I mean, we would add, like, a high bar for negation. He was like, no. And he had a beer in his hand and he walked away as a joke. Like, he wasn't actually upset, but he was just appalled by the idea that we would even consider doing this. Adám?

01:11:09 [AB]

Well, I think actually TI BASIC uses two different symbols for negation and subtraction.

01:11:15 [ML]

Oh, it's been too long since I used TI BASIC.

01:11:17 [AB]

I know, right? I believe that TI Basic actually uses the high minus for negation.

01:11:26 [CH]

What does q do, Stephen? Is there a word for negate? Is it N-E-G or something?

01:11:33 [ST]

I'm inclined to say positive. Yes, affirmative. But it's yes, it's negate.

01:11:38 [CH]

And I think that is because, I mean, when Stephen brought up q earlier, that was what made me think: perfect way to pivot. Because q is the one (well, I guess now there's two, now that Uiua's been added), and this was mentioned I think in the episode with Kai: q actually does have fixed arity, for its keyword functions at least. And I think that actually, if q wanted to add a bunch more combinator facilities, they could design a language feature where, inside your functions, if you don't mention x, y, and z, then based on the arity and the order that you pass things, you could form different patterns. Once again, I'm not saying that's a better language choice, but getting rid of the ambivalence, one, would definitely make things easier to read, I think, and two, could create more possibilities. Let's see if Stephen can respond, and then we'll go to Adám.

01:12:28 [ST]

That's actually one of the motivations for q over k. It took a lot of ambivalent glyphs, I'm speaking carefully here. So plus in its dyadic form, in its binary form, is the familiar add, but in its monadic or unary use it's flip, or transpose in APL. And the experience with Kx's users was that this was a source of confusion. So the unary versions got replaced with keywords.

01:13:05 [CH]

Yeah, I mean, that was actually one of the responses. I didn't say that to Roy 'cause he walked away. But when I was talking with other folks, I was like: if I think of the monadic definitions of the common infix binary operations, people were just kinda adding them. I mean, some of them do make sense. Like reciprocal with divide does logically make sense, but do we really need a reciprocal monadic function? [14] It's like, because we were overloading, we were finding useful things. But especially for plus and conjugate, and what about times and sign? It's nice to have a sign function, but once again, in q, I'm pretty sure they just have a monadic keyword that is sign, which I actually prefer, 'cause it's not common to use the multiplication symbol in mathematics for the sign operation. We just had a free monadic definition; sign's a common thing, and it's kind of similar to multiplication; there's some connection. So the question is: if there are some overloaded operations that would confuse users, what is actually the short list of those? Minus and negate.

01:14:09 [AB]

There's a list of connected ones on the APL wiki. But, and I think it was me that came up with the idea of using monadic "and" and "or" for sorting, because they kind of point up and point down. And since then, I've regretted it. And if we were to add sort primitives, I would probably use monadic less-than-or-equal-to and greater-than-or-equal-to, because those do have a connection: if you're sorting ascending, that means that adjacent elements are less than or equal to each other.
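
A small model of that proposal in Dyalog APL (no such monadic primitives exist today; the names here are just for this sketch):

    SortUp ← {⍵[⍋⍵]}     ⍝ what a monadic ≤ might mean
    SortDn ← {⍵[⍒⍵]}     ⍝ what a monadic ≥ might mean
    ∧/ 2 ≤/ 1 3 3 7      ⍝ 1: ascending means adjacent elements satisfy dyadic ≤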

01:14:40 [CH]

Listen, that's fine with me. I was asking more, and I literally asked him, I think, 14 times at Minnowbrook. And I actually started... I asked Peter. I was like, Peter, can you just, like, slip those in for me? You know, I give you authority. I'm pretty sure that's just as good as the thumbs up from Morten. And he said he'd think about it. But I think he was lying to me. Huh.

01:14:59 [AB]

But so I think there's value. I actually have another couple of examples of languages, and they are all kind of APL-y, Iversonian languages. So we're completely breaking this whole idea that all the APL languages inherit this. No, they don't, because we have q, which doesn't do it. We have Nial, which doesn't do it either; it also gives separate names. We have Jelly, which is a J-inspired, all-tacit language.

01:15:23 [CH]

Yeah, yeah.

01:15:24 [BT]

And J uses the underscore and the minus. The thing with J is the underscore indicates a negative number, but they still have the monadic minus for negating something.

01:15:36 [AB]

No, no, I was saying in Jelly.

01:15:38 [BT]

Ohh, in Jelly. Sorry, sorry.

01:15:40 [AB]

The Jelly language, which is inspired by J, takes the idea of differentiating between underscore and minus, using those two symbols, but it uses underscore for subtraction and minus for negation. Because, if I understood right, that allows you to import data from the world, where a dash is used to indicate negation or negative numbers. So it allows you to just take the data in and you don't have to worry about conversion; they'd rather use a different symbol for the operation. So we actually have quite a few different languages. And Jelly [15] is an entirely tacit language, and it doesn't use the three-train rule for constructing things, because everything has fixed arity. And therefore you don't need parentheses. A lot of people make the mistake of thinking that Jelly is stack-based, because it looks like this concatenative thing; because you can actually construct whatever you want to express, relying on the arity to do the right thing. So I think there's a lot of possibility there.

01:16:50 [CH]

I was going to say, you could go the one direction, like Jelly, and not add trains. Or you could go the other direction and say: we've got trains, and now we've got fixed arity, so we don't have to disambiguate; the middle function, you know, no longer always has to be binary. "Unary, unary, unary" could mean something else. You could have an even wider plethora of invisible patterns. Probably a terrible idea, but you could do it.

01:17:15 [AB]

Exactly. That is exactly what Jelly does. A lot of people find Jelly a bit overwhelming in how it works, but it has all these rules. It says, like: you have this-arity function with this-arity function, and this function... The funny thing is also that you can have, I think, even more arguments than just two, and they will just be taken off, stack-like, except it's not actually a stack. It's interesting; look into it. But I think, as we do in mathematics, we have related things sometimes use the same symbol, sometimes different positions, sometimes in superscripts and subscripts. For example, we can put the exclamation point on the left or the right of a number to make some factorial-like things. We have it in chemistry and physics, where we put little superscripts and subscripts to the top left and bottom left and top right and bottom right, and they all mean kind of related things, with the number of subatomic particles in the core of an atom, or the overall weight, and the charges. Having related things have related notations seems to help people grasp the notation.

01:18:28 [CH]

You're saying related, though, but in the APLs (minus q and Nial) they're identical. And that's the thing that I realized coding with Uiua.

01:18:36 [AB]

What do you mean, they're identical? They're not identical, because it's the positional thing that gives the difference.

01:18:41 [CH]

I mean, the glyph is identical.

01:18:44 [AB]

Yeah, but that is exactly what you have, right? If you have a number or a symbol, like an exclamation point, on the left or the right of an x in traditional mathematical notation, it's the same symbol, but it's the context in which it appears that makes the difference.

01:19:00 [CH]

I don't even know what the prefix exclamation means.

01:19:05 [BT]

Factorial

01:19:05 [CH]

No, I mean the postfix is factorial. Prefix, what does that mean? And this is the key thing: I agree that the overloadedness is beautiful, it's nice, there are the related meanings. I think that the care that Iverson and friends put into the definitions of these is immaculate. And I love the fact that the logical or in BQN is overloaded, because then I get this nice mountainy thing, and it's just a coincidence that the problem calls for both the monadic and dyadic definitions. But would I like the solution less if reverse-sort and sort had primitives that weren't overloaded? I think I'd like it just as much, as long as there was this kind of symmetry between those two glyphs. And when I program in Uiua, I don't actually miss the connection between the two. The thing that I do miss is that, because there's roughly the same number of glyphs, there are certain things in BQN that don't exist in Uiua; they don't have the full set of the functionality that array languages have. I don't miss the overloading at all. It's easier to read, arguably, and I don't really miss it. It's something that is beautiful and that I like, but, minus (pun intended) the minus and negate glyphs, I can't think of anything where it's irritating, like: oh, now I need to find a different symbol for this.

01:20:34 [AB]

And I think... it's not just primitives. How do you like plus slash? Monadically, it's a sum. Dyadically, it's the windowed sum. That wouldn't be possible if you didn't do this whole ambivalence thing.
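
Concretely, in Dyalog APL the one glyph does both jobs:

    +/ ⍳6       ⍝ monadic: sum of 1..6, giving 21
    3 +/ ⍳6     ⍝ dyadic: 3-wise windowed sum, giving 6 9 12 15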

01:20:47 [CH]

Totally fine if they came up with another glyph for the dyadic case, because that's something that I didn't even really discover or find super useful until I came to APL. A windowed reduction: I love that pattern, and it does make me sad that I have to do a windows and then a three, whatever, or an each with the operator, if I need to do a plus-reduce over those things. I love that APL has the n-wise reduction, and it's sad that BQN and Uiua don't have it. But you can still spell it, just with a couple extra characters. But if APL added a new glyph and deprecated the dyadic definition of reduce, not a problem at all. And technically you could design it as an operator that takes a function, so you can still get something; I mean, technically that's what reduce is anyway. So yeah, just change the glyph to be something else, like, you know, a slash with... you know, actually, we already have that as tally; I was going to say a slash with, like, two or three lines in it, and I was like, wait, we've got those both already. So you just come up with a different symbol that has some kind of slash, or some relation to it.

01:21:51 [AB]

But it's not just that. I don't know what you do with APL and array languages in general, but I think there's kind of a big split between what you see online, of people doing toy problems in these array languages, and people who are actually doing bread-and-butter programming. I personally find it incredibly useful that I can write some kind of utility function that makes some assumptions, some, we'd say, default settings, and then you can give it an optional left argument to change those things. And if we remove the notion of ambivalence (and you can't just remove it from the primitives and not remove it from user-defined functions, because then how could you use user-defined functions in your trains and have them rely on the arity? Everything would have to be fixed arity), then that option goes away as well.
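
For example, the default-left-argument pattern looks like this in a Dyalog dfn (Log is just an illustrative name):

    Log ← {⍺←10 ⋄ ⍺⍟⍵}   ⍝ ⍺← supplies a default left argument (base 10)
    Log 100               ⍝ 2, the monadic call uses the default
    2 Log 8               ⍝ 3, the dyadic call overrides it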

01:22:40 [CH]

I mean, definitely there are trade-offs, and that's a good example: if you rely on the fact that you can design an ambivalent function where the unary case kind of assigns a default, that's definitely a trade-off. You're losing that; you would have to either have two functions, or always specify that second argument where you might not want to.

01:22:57 [AB]

Or you would have to have some kind of a nothing left argument, like BQN has, to indicate: I want to call this dyadic function, but I don't want to give a left argument, and it will check whether the left argument is actually nothing. And it all seems kind of stilted to me.

01:23:14 [ML]

Well, I mean, presumably you'd just use Zilde, which kind of-- I mean, especially if you want to have a list of options, that makes sense. That's just natural.

01:23:23 [AB]

No, but maybe... the left side might be a value that you want inserted in places.

01:23:28 [ML]

Yeah, and then you've got a problem.

01:23:43 [AB]

And then Zilde [16] would be a value. Then you say, yeah, so that doesn't really work. There's a good reason why you have a nothing in BQN rather than just giving it a zero, right? Because you can't just use zero: many things would actually use a zero as left argument, and it means something. And I think, from a pragmatic perspective, that would be a relatively large loss in the ease of notation, of expression.

01:24:01 [ML]

I mean, I'm not convinced, because this--

01:24:04 [CH]

Convinced of what? What Adám's saying or what I said?

01:24:06 [ML]

Yeah, that it's a relatively large loss, if you take it relative to all the distortion that having ambivalent functions adds to the language. I mean, it would be a huge change. And so, how big of a change is having ambivalent functions versus whatever else you would use? It's not that big, relative to it. I mean, yeah, I think overloading is a pretty big negative point in the sort of Iversonian paradigm. It comes with the territory and it causes a lot of issues. I have a page on the BQN site about that. So I don't know; it's kind of hard to say until somebody's tried to do something. And I mean, q is a pretty good example that you can remove ambivalence and come up with a pretty good language, but q also has a lot of differences relative to the more APL-like languages. So there's kind of a question of how close you can remain to the Iversonian paradigm and remove ambivalence, and I don't really know the answer to that.

01:25:17 [ST]

Oh, excuse me, Marshall. q does not remove ambivalence. There's a table of overloads in the reference, and how to distinguish the number of different operators assigned to the dollar glyph, for example: most impressive.

01:25:36 [CH]

I was just gonna ask: is there an easy way to delineate in q what is ambivalent and what is not? Is it that the ASCII symbols it adopts from k are still overloaded, and all of the keywords that are added are not? Or is it fuzzier than that; there are corner cases?

01:25:57 [ML]

Well, so the functions you define can only ever be called using the--

01:26:03 [CH]

Fixed arity.

01:26:04 [ML]

Using the APL syntax, where you just juxtapose with the argument. You can't use a function dyadically that way. You have to call it with the brackets if you're gonna call anything that's named with more than one argument.

01:26:20 [ST]

Yeah, let me intervene just a little bit here. So we stopped using the terms monadically and dyadically because you can have up to eight arguments.

01:26:33 [CH]

Why didn't you just add, you know, words for three, four, five, six, seven, and eight, Stephen? That would have been a good idea.

01:26:39 [ST]

Oh, so we did that. So we have, you know, ternary functions, like SSR, string search and replace; that's a ternary function. And there are quaternary definitions of some of the operators, like the @ symbol. And they're all carefully laid out and documented, because they're enormously handy. It's a ripe source of confusion if you... I just stumbled into this as a newbie.

01:27:17 [CH]

I was kind of joking, but that is good to know. And I did see a very impressive use of the @ symbol. It was a partial-application scan of a quaternary function that was adding four arguments together: it scanned over a list of four elements using some @ scan, which I didn't fully understand, and you end up with a list of partially applied functions, where the first one has three arguments left to supply, the second one has two, the third one has one, and then the final one is evaluated. And I was like, holy smokes, that is not something that I knew was possible in q. I mean, there are too many languages to learn, but it makes me wanna go and learn q more, because that is definitely not something you can do in any of the other APL-like languages, ending up with a list of partially applied functions. That's even tricky to do in a functional language like Haskell.

01:28:15 [ST]

Yeah, I think Marshall is in the process of saying that you can't make a user-defined function, a lambda, that is binary and has infix syntax; that is, with the first argument on its left.

01:28:31 [ML]

Yeah, well, so it's not really a matter of what you define. You can define any function you want, but you can't call a named function with infix syntax because that is not part of the syntax. There is no such thing.

01:28:44 [ST]

But if you derive a function by, say, scanning it, so you have your lambda scanned, then the derived function has infix syntax.

01:28:56 [ML]

Yeah, that's true.

01:28:58 [CH]

Interesting.

01:28:59 [ST]

Or lambda and lambda each.

01:29:01 [CH]

We're definitely gonna... we were talking about this before we started recording, but we have still yet, even though we reached out to at least one, maybe a couple of the q folks from kxCon, we're gonna have to get those folks on and explore. 'Cause that is definitely a missing part of my array knowledge, the differences. I mean, Michael Higginson, [17] when we had him on, talked about a few of the things that he really likes about q. And I think projections is what those are technically called, what I was talking about in terms of the partial applications. But it seems like there's a bunch of stuff that... I mean, we spend time talking about all the languages, but because APL, J and BQN are all a lot closer to each other, and k is slightly different, there are fewer times when the language features of q come up, because q is the only language that has that language feature. Whereas if we're talking about APL or BQN or J, a lot of them share the same thing, so it comes up. So yeah, I definitely want to make sure we have some folks on and we get to talk about that stuff. 'Cause you made that comment too, Stephen, when we were talking to Lynn Sutherland about Nial: you were listening, and then you had that great clip where you were like, "This is very, very interesting, because you're listing off all these things that q also has." Potentially there was some, I'm not sure if Arthur was aware of Nial, but maybe there was some idea-sharing or conversations that happened where both languages ended up implementing them, or Nial predates q. It would be interesting to know more about that. Anyways, we have blown past... Adám's got his hand up. We should have predicted this was going to happen.

01:30:35 [AB]

I want to throw a spanner in the works here.

01:30:38 [CH]

What's a spanner in the works?

01:30:39 [AB]

Spanner, it goes "spanner in the--"

01:30:41 [BT]

A wrench.

01:30:41 [ST]

Spaniard? Spanish? I think it's a Spaniard, you said.

01:30:42 [AB]

I think it's a spanner. So, Conor, you're looking at this table of the J invisible modifiers and you're going, like: wow, this can derive new modifiers, derive new adverbs and derive new conjunctions.

01:31:01 [CH]

I can put an operator in my train. Who knew? Not me. Well, now I know. I didn't know before.

01:31:07 [AB]

Ever heard of hyperators?

01:31:08 [CH]

Hyperators?

01:31:10 [AB]

No, hyperoperators.

01:31:11 [CH]

Hyperoperators. Actually, I do recall seeing that in the title of an Iverson paper. I can't remember if I perused the first page, but I don't recall; I either was confused or, you know, I didn't actually read it, I just saw the title. But I have...

01:31:29 [AB]

So something like how you reacted to originally seeing J's table of invisible modifiers.

01:31:37 [CH]

Or the first time I bounced off it.

01:31:39 [AB]

Yeah. So the same kind of reaction I'm hearing from you here. So think about this. You can have functions adjacent to an operator, and the operator builds a new function based on those functions. And we've seen that "function, function, function" can invisibly combine into some new function; so, in a sense, we're deriving a new function from some old functions, an invisible combinator there. And we've seen in the J table even the extreme combination, what was it called, conjunction, conjunction, conjunction, deriving--

01:32:22 [CH]

The triple conjunction, baby, the holy grail. Woo.

01:32:24 [AB]

Wouldn't it then also be possible to have some kind of entity that took conjunctions and modified them or combined them? Because think about the parallel here: we can have an operator that takes functions and derives a new function, and then we can have this train of functions that becomes a new function. Similarly, we have the train of conjunctions that becomes a new conjunction. We should also be able to have a new type of modifier that takes conjunctions and derives a new conjunction, right? Hyperoperators. If functions take arguments and operators take functions, then what is it that takes operators? Hyperoperators, or hyperators.

01:33:08 [CH]

Is that actually what a hyper operator is?

01:33:10 [AB]

That's what a hyperoperator would be. Now, they were theorized for a while and... so you said "should"; hold on, I'm getting to that. So if you look at how Dyalog APL, for example, has the dfns syntax: we refer to arguments as alpha and omega, and we refer to operands as alpha-alpha and omega-omega. We could refer to hyperands as triple-alpha and triple-omega. And you know what? NARS 2000 does exactly that. It implements hyperators. Conor's mind was just blown; for the listener here, they can't see the picture, but Conor now has his hands on his forehead, covering his eyes partially, and his mind is literally blown.
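
For orientation, the ladder being described, in Dyalog terms (the operator below is real; the hyperator rung is hypothetical in Dyalog, and NARS 2000's actual syntax is not shown here):

    ⍝ functions take arrays; operators take functions and derive functions
    Twice ← {⍺⍺ ⍺⍺ ⍵}    ⍝ a monadic operator: apply its operand function twice
    (⍟ Twice) 10         ⍝ ln ln 10, about 0.834
    ⍝ a hyperator would take operators (like Twice) and derive a new operator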

01:33:49 [CH]

Whoa! Oh my goodness.

01:33:50 [AB]

So that would be for another episode. Right.

01:33:53 [CH]

OK.

01:33:54 [ML]

Well, that's not the direction I was going with "we should".

01:33:57 [CH]

Yeah, well, I mean, even with modifier trains, a.k.a., you know, operator trains, whatever you want to call them: that's the problem, every language comes up with a different name. I mean, I will admit, Marshall actually borrowed "modifiers" (I'm not sure if you borrowed it, I'm guessing you did) from the umbrella term that is in J for both adverbs and conjunctions. But yes, we keep on renaming these things. And even in my mental model, while you were describing that stuff, I consider the verb trains as combinators, and the conjunction or modifier trains as higher-order combinators: they're combinators that take higher-order functions; they form things that take functions as their arguments. Now you're telling me that technically you could just keep going, and if we've got verb trains and we've got modifier trains, we technically could have whatever the name is for the next thing, which you're saying is hyperators. And you're saying that in NARS 2000... so, we were saying before this episode, I don't think it got mentioned that Bob Smith was one of the folks at the Minnowbrook conference. And he presented a talk on the most recent features that have been added in the last two years to his open-source, GPL-licensed, free APL, not on GitHub, but on SourceForge.net, which I think is SVN. And you can go download this. And I was very sad (I didn't think to make the request, 'cause I didn't download the program until I got back) that it doesn't have the input method that I'm used to, which is either a backtick for the RIDE editor or a backslash for BQN. I'm not a big fan of the alt, control or shift overloading anyways. But you're saying now that in this free APL from Bob Smith, hyperators are implemented.

01:35:37 [ML]

And now, so that's a lot of preoccupation with whether you could. I would like to suggest, and I think I also make this suggestion on behalf of k somewhat, that in fact you make more progress if, rather than trying to extend this framework and add more and more combinations, you cut back to what is really necessary. One of the things I've really found when working with BQN is that it's much better to improve the functionality of functions, and make it so that functions can be applied in better ways and are easier to work with (and first-class functions is really the core thing that you need to start doing this), than to make all these mechanisms, like the modifiers and the modifier trains, for working at a higher level on functions. Because once you step up to this higher level, modifiers are never going to be as nice to program with as functions are. If you have functions that manipulate functions, well, of course you can make a train out of those, because they're just functions. Like I said, BQN doesn't even support partially applying a two-modifier: you can't pass in one operand, you have to pass in both at once. And so roughly three or four times in the past three years, I've had to write... I have had to suffer the indignity of writing a one-modifier whose whole job is to take the operand and pass it to a two-modifier with another fixed operand. So that's the way you bind an operand into a modifier if you really have to. But I mean, it's so rare that you need this. And even the times that I'm actually doing this, I'm thinking: well, isn't this a sign that maybe I shouldn't have been using a modifier? So there is this other perspective: rather than trying to make more things that you can do, try instead to do more with the functionality that you already have with these functions.

01:37:37 [ST]

That's a really good point. It would be wonderful if we could get Arthur to talk about this. I know he's recently been thinking a lot about combinators, but they're pretty much just not there in q. And the challenge, I mean, the question, is: how useful are these things? They're really fun, but how useful are they? 'Cause utility is what q and k are all about.

01:38:03 [CH]

And that is... I mean, I did say at some point, it wasn't right at the top of the episode, but about 10 or 15 minutes in: "Please do suspend [judgment]." Some of you might be thinking, "What is the point?" And I am not even sure. Like I said, I've spent hours at this point over the last few days, since I discovered this on Saturday or Friday night (and today is Tuesday, that's when we record), trying to come up with useful examples, and I have yet to come up with any. Specifically, as I mentioned, I'm focusing on the Oreo and the triple conjunctions, because they're the ones that appeal to me the most. And maybe the answer is that this is a very, very cool thing, but the utility of it is really small, and there's the complication of having... like, there's no way most folks are gonna memorize this full table. Just of the three-trains, it looks like there are 10 to 20 of them, and that's excluding the two-trains in this modifier-train world. So potentially, I think it makes sense: if you think about the history of J, what Henry said is they added them and decided at a certain point, yeah, it's probably not worth the complexity that we're adding to the language. But then, for whatever reason, they decided to add them back, and I love it. And actually, this will excite you, Bob: I think J is gonna be entering my top five favorite programming languages. And this is a small little monologue here, but this is why I love the array languages so much. It's been since basically the end of 2019, so we're going on a full four years here, and I'm still discovering stuff. There's no paradigm out there that has this depth. Like... who was it? It was you, Adám, that mentioned that NARS 2000 has hyperators? That is not where I thought this conversation would be ending. It's like, oh, let me blow your mind even more. So there's just so much to explore. There are papers that haven't been implemented. One of the things that came up at Minnowbrook was: why is partition from APL not an operator? They have cut in J, which is kind of the equivalent of partition, and it would be nice if I had, like, a partition operator. And then I'm talking to folks there, and Bob Smith says, "Oh yeah, you know, check out my 1978 paper. I've got a section in it that mentions the partition operator." Hasn't gotten around to implementing it, you know, in the last 40, 50 years, and I'm just like, "What?" So I had this idea at the conference, and then I'm talking to someone, the implementer of NARS 2000, who then says, "Oh yeah, I mean, I had that same idea 40, 50 years ago." And it's just amazing. And on a sort of negative note: just 10 minutes ago, when you were describing hyperators, Adám, I do think that the depth this tacit programming stuff goes to is the equivalent of the category theory behind functional languages like Haskell; this is the category theory of array languages, [18] unfortunately. And I say that unfortunately because I've tried to learn category theory, and I got to a certain extent, but it just starts to blow your mind at some point, and then most folks get confused. And I can see now that this might be the same thing. And it's like, is it even useful? I mean, category theory experts will be like, it's totally useful: you unlock all these kinds of patterns.
Other folks will tell you: well, there's the learning curve to wrapping your head around what a monad and a monoid and an applicative functor and a functor are. And that's just the beginning; that's not even talking about left and right Kan extensions, which I do not understand myself. I read the chapter on them, but I'm still confused by it. It's like if you listen to someone like Edward Kmett, who is a category theory god and just kind of whips through everything, or Bartosz Milewski, you know, the guy that wrote the book on it. You listen to them and you're like: "Okay, you sound excited, and it does seem useful what you're doing, but I have no idea." Potentially that's where we're at here. We're having this very, very in-depth conversation; we're all excited about it; we're saying it's super cool; we're questioning whether it's useful; and probably most people are lost at this point. We're risking making this the longest episode and giving Bob more work than he wants to do.

01:41:57 [ML]

We need to know specifically should King Arthur go out in search of the Holy Grail?

01:42:06 [CH]

Ohh, is that the cold open you got?

01:42:08 [ST]

There's the cold open.

01:42:11 [ST]

How does the session on tacit wind up being the longest session recorded?

01:42:18 [CH]

Yeah, I don't know.

01:42:19 [ML]

Well, we can call it the fifth and part of the sixth if we want.

01:42:25 [CH]

Oh my goodness. I hope people don't see the length of this episode and then decide to skip it, because, you know, hopefully, yeah, we put the King Arthur bit at the beginning.

1:42:34 [BT]

The length isn't a problem because I don't think you'll get that many people getting to this point.

01:42:39 [CH]

Yeah, I'll have to find a way to plug this; maybe I'll make a YouTube video or something. But yeah, I mean, just the fact that J has this, is that a good reason for it now being in my top five? Probably not, but listen, it's my top five. I get to choose what motivates languages for being there. And even if it's bad motivation, that's fine with me. Bob?

01:43:01 [BT]

I've got three things. First thing: I think your metaphor of category theory is absolutely bang on, because category theory is a meta-theory that ties together ways of putting things together in other disciplines. And that's what the use of it is: you start to see these patterns that extend out to other applications. I think tacit actually does the same sort of thing. You start to see patterns of how things go together, and they can be extended into other areas. And it creates this extra level of understanding you have to reach. It's not easy. But when you get to that point, which I think Marshall got to without having these modifier trains, but I think it's a rare person who can get to that point, and I think the modifier trains may help that a little bit. Second thing: if you want to get in touch with us, ArrayCast, contact@arraycast.com, [19] because I'm going to get us moving towards wrapping up. And we love to hear from people, and actually some of the most recent comments have been just excellent and very heartwarming, and we really appreciate them, and very interesting as well. Third thing: your CCC. What if you wanted to reverse two of the operators? In J you've got LEV, which refers to the left operator, and DEX, which refers to the right operator, and those are both conjunctions. And if you swapped your outside tines, you'd be reversing the two operators.

01:44:37 [CH]

So this is, once again (I mean, Peter's name has been mentioned like four or five times): while I was off focusing on the triple conjunction and the Oreo conjunction, he came up with all these useful things. He was on the NuVoc page and came across... was it Dex and Lev?

01:44:52 [BT]

Dex and Lev.

01:45:11 [CH]

Because he was looking for the equivalents of right and left, the tacks, basically, in APL, and then very quickly found those. And then the comment that he made was: probably a lot of the useful modifier trains that you end up spelling are actually simpler than the one that you're trying to find, because one of them is just basically the identity, or the left or right equivalent for these modifier trains. And I said, oh yeah, that's probably true, 'cause even a lot of the forks you write in APL (because you don't have a hook) are three-trains where one of the tines is one of the tacks. And so yeah, a lot of the things that you're saying, Peter, very quickly... I mean, if I hadn't been so busy wasting my time trying to find the triple conjunction, the holy grail. Maybe King Arthur... I mean, if we get him on and the first thing he says is, "I'm actually an avid listener, and episode 64 I really loved. Five minutes in, though, after you had mentioned the triple conjunction, I already had a good one, and I was yelling at my podcast for the rest of the time." Wouldn't that be... that would be the highlight of my life, folks. Arthur, if you're listening, please find the triple conjunction and come on the podcast, and we will name it King Arthur and the Holy Grail. And it'll be the best thing. It'll be, like I said, the highlight of my life until I have kids. And even then, who knows?

01:46:11 [BT]

I think you have to pull the sword out of a stone first, Conor.

01:46:15 [CH]

Oh, oh my goodness. Well, this might be hands down my favorite episode so far. Hyperators, King Arthur, the triple conjunction. Is any of this stuff useful? Email Bob to let us know what you think; we avidly await what you have to say. And if you did enjoy it, and you did make it through the whole two hours ('cause basically that's where we're at right now), let us know. And if it was too long, also let us know. We do try to keep these closer to an hour, or at least under an hour and a half. We failed this time.

01:46:48 [ML]

If it was too short, just don't say anything.

01:46:52 [CH]

[both laughing] Yeah, but yeah, I mean, we talked about a four-hour podcast in the last episode, and look at us now, we're approaching two hours. But don't worry, the next one will be shorter. And yes, thanks to everyone here. This was amazing. I hope the listeners enjoyed it; I hope they made it through. And with that, we will say: happy array programming.

01:46:52 [ALL]

Happy array programming.

01:46:54 [BT]

Tacitly

01:46:55 [MUSIC]