# The third way

The original version of this article was posted in the comp.lang.misc newsgroup. This version has some minor editorial changes and a notational change.

In the text, function invocations always use Lisp-style () parentheses, sequents always use {} (braces) as delimiters, and arrays always use [] (brackets) as indexing delimiters. This differs from actual usage in many languages; in San, for example, brackets are used both for delimiting sequents and for delimiting array indices.

There was a fair bit of response to the article, the bulk of which was about the differences between sequents and lazy lists. This topic is explored in a second article, Sequents and lazy lists.

One of the features in the San language that I am developing is something that I call a sequent – an alternative title might be a piped action sequence. It represents a “third way” to approach a wide variety of programming tasks, the other two being iteration and function nesting using higher order functions. Here is a simple task and its resolution in three different styles:

We are given a vector A and a function f. We wish to apply function f to each element of A and place the result in vector B.
One way to do this is to use an iteration loop; this would be the normal choice in languages such as C, Fortran, and Pascal. The resulting code might look something like this:
```
for (int i = 0; i < n; i++) { B[i] = f(A[i]); }
```
I have several objections to this kind of code. It requires an auxiliary index variable (i). It supposes that A is an array, though it could just as easily be a linked list or some other kind of structure. And it divorces information about the vector (its length, n) from the vector itself.

An alternative approach in a bastardized fp language might look something like this:

```
B[] = (map f A[])
```
Map, aka mapcar, takes its first argument, treats it as a function, applies it to each of the remaining arguments, and returns the results as a vector. Now I like this kind of code much better. There are no superfluous indices. The information about the organization of A and its size is left under the hood where it belongs. (But sometimes you need to look under the hood.)
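For comparison, here is the same higher-order-function style in Python (a sketch, not San; the squaring `f` is just an illustrative stand-in):

```python
def f(x):
    # An illustrative transform; any one-argument function works here.
    return x * x

A = [1, 2, 3, 4]
B = list(map(f, A))  # B == [1, 4, 9, 16]
```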

There is a third path, that of connecting pipes. Pipes are connectors that pipe the output of one computing element to the input of another. In a pipe oriented language we might say something like

```
B[] = {A[] | f}
```
or even
```
{A[] | f | B[]}
```
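In languages without pipe syntax, this shape can be approximated with lazy iterators; here is a minimal Python sketch, where `pipe` is a hypothetical helper of my own, not part of San:

```python
def pipe(source, *stages):
    # Feed each item of `source` through the given one-argument stages,
    # collecting the final results into a list (the B[] of the example).
    for stage in stages:
        source = map(stage, source)  # map is lazy: items flow one at a time
    return list(source)

B = pipe([1, 2, 3], lambda x: x + 1)  # roughly B[] = {A[] | f}
```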
I like this code too. It has the virtues of the higher order function approach (map). One nice thing about it is that applying a series of operations is cleaner. For example, suppose we have some data, call it A, and we want to perform operations f1, f2, f3, f4 on it. Ordinary function nesting looks like this
```
(f4 (f3 (f2 (f1 A))))
```
This may make a Lisp programmer happy, but it doesn’t enthrall me. In an fp language we might define a “nest” function to get something like
```
(nest f4 f3 f2 f1 A)
```
or even
```
f4 $ f3 $ f2 $ f1 A
```
where ‘$’ is a right-associative application operator, as in Haskell.
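Such a `nest` is easy to define in any language with first-class functions; here is a Python sketch (the name `nest` comes from the article, the implementation is my own):

```python
from functools import reduce

def nest(*args):
    # nest(f4, f3, f2, f1, x): apply f1 to x first, then f2, and so on,
    # matching the nesting (f4 (f3 (f2 (f1 x)))).
    *funcs, x = args
    return reduce(lambda acc, f: f(acc), reversed(funcs), x)

result = nest(lambda x: x * 10, lambda x: x + 1, 5)  # (5 + 1) * 10 == 60
```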

In a pipe oriented language you might say something like this:

```
{A | f1 | f2 | f3 | f4}
```
There are problems with these simple formulations.

One is that in most languages functions can have multiple arguments but they only have one return value (it may be an aggregate of some kind). The format of these sequences of functions implies that they have one input and one output. How do we handle functions that expect more than one argument?

One way is to treat the arguments as being a “tuple”, i.e., the function is passed a container that contains the arguments. The catch is that some arguments will be explicit and some implicit.

Another way is to use a type of closure called partially instantiated functions. The general idea is that we make a new function from a base function that fixes some of the base function’s arguments. In some languages this construction can be done syntactically. Here is an example:
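Python’s `functools.partial` builds exactly this kind of partially instantiated function; here is a sketch of the idea behind `(max 5)` (the name `clamp_to_5` is mine):

```python
from functools import partial

# Fix the first argument of max at 5; the result is a one-argument
# function that returns the larger of 5 and whatever it is given.
clamp_to_5 = partial(max, 5)

out = list(map(clamp_to_5, [1, 2, 3, 4, 5, 6, 7, 8, 9]))
# out == [5, 5, 5, 5, 5, 6, 7, 8, 9]
```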

```
(map, (max 5), 1,2,3,4,5,6,7,8,9)
```
The expression (max 5) is called a closure; it returns the maximum of one additional argument (successively supplied by map) and 5. In this kind of usage we don’t actually need closures; it suffices to treat (max 5) and its kin as syntactical devices.

A third way is to use pronouns; a pronoun is a conventional symbol that stands for an element to be supplied. Here I will use the at sign, @, as a pronoun. Using pronouns, our example becomes

```
(map, (max 5 @), 1,2,3,4,5,6,7,8,9)  // HOF style
{1,2,3,4,5,6,7,8,9 | max 5 @ }       // Pipe style
```
Using pronouns requires writing a trifle more text than is needed for closures, but it is more general. For example, instead of having a function in one of our stages we could have an expression, e.g.,
```
{1:9 | @*(@+1)/2}
```
which generates the triangular numbers 1, 3, 6, 10, 15, 21, 28, 36, 45. We can do the same thing in HOF-enabled languages, but the construction is likely to be clumsier.
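To make the comparison concrete, here is the triangular-number pipe rendered in Python, with an inline expression playing the role of the @ pronoun:

```python
# {1:9 | @*(@+1)/2} with a comprehension standing in for the pipe stage.
triangular = [n * (n + 1) // 2 for n in range(1, 10)]
# triangular == [1, 3, 6, 10, 15, 21, 28, 36, 45]
```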

So far the two constructs (nested functions and pipe sequences) appear to be equivalent. However, there is a fundamental difference between them: nested functions communicate by argument lists, whereas pipe sequences communicate by streams. Each intermediate stage in a pipe sequence receives items one at a time from the input pipe and emits items one at a time through the output pipe. That is why the map function is needed in

```
(map (max 5 @) (1:9))
```
but not in
```
{1:9 | max 5 @}
```
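This one-item-at-a-time behaviour is what generators give for free; here is a Python sketch of a pipe stage (my own rendering, not San semantics):

```python
def stage(items, f):
    # A pipe stage: receive items one at a time from the input pipe,
    # emit transformed items one at a time on the output pipe.
    for item in items:
        yield f(item)

result = list(stage(range(1, 10), lambda x: max(5, x)))  # like {1:9 | max 5 @}
```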
How can we process the entire input stream in one gulp? One way is to extend the @ pronoun usage with array notation. Thus
```
{1:9 | sum @[]}
```
sums the numbers from 1 to 9.
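A whole-stream stage corresponds to a function that consumes its entire input iterator at once; a small Python sketch (the helper name `gulp_stage` is mine):

```python
def gulp_stage(items, f):
    # Unlike an item-at-a-time stage, this hands the whole input
    # stream to f in one gulp, as `sum @[]` does.
    return f(list(items))

total = gulp_stage(range(1, 10), sum)  # total == 45
```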

The intermediate stages can add and delete items as well as simply transform them. For example,

```
{1:3 | a, @ }
```
generates the sequence a,1,a,2,a,3.
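A stage that inserts items maps naturally onto a generator that can emit more than one item per input; here is a Python sketch of the stage `a, @` (the name `interleave` is mine):

```python
def interleave(items, marker):
    # For every input item, first emit `marker`, then the item itself,
    # like the stage `a, @` in the example.
    for item in items:
        yield marker
        yield item

out = list(interleave([1, 2, 3], "a"))  # out == ["a", 1, "a", 2, "a", 3]
```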

As a final thought, we can generalize the concept of a function by thinking of persistent computing elements that have input and output channels.
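One concrete reading of such a persistent computing element is a generator that keeps local state between the items it receives; a hedged Python sketch:

```python
def running_total(items):
    # A persistent element: `total` survives between items, so this stage
    # is more than a pure function applied independently to each input.
    total = 0
    for item in items:
        total += item
        yield total

out = list(running_total([1, 2, 3, 4]))  # out == [1, 3, 6, 10]
```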

This page was last updated June 1, 2006.