# APL: An Array Oriented Programming Language | by A Kaleberg

*by* Phil Tadros

APL was invented by Ken Iverson in the early 1960s as a notation for array oriented calculations. It was precise and useful enough that it was implemented as a programming language, one of the older ones and one of the more obscure ones.

The 1950s and 1960s were a golden age of language design. Many ideas in modern programming languages were first developed in those early years. For example:

- FORTRAN and Algol set the standard for many languages, and most modern languages use precedence based expressions and distinguish between expressions and statements.
- LISP bared the structure of computation with programs as a first order data type, and there has recently been a round of rediscovery, with languages adopting features like garbage collection, first class procedures and linearized data representation.
- COMIT and SNOBOL are still present in languages like Perl and anywhere regular expressions are used for string matching.
- COBOL is still used in business oriented data processing, and there are many “report generation” languages and packages sharing COBOL’s presentation oriented data philosophy.
- FORTH, as a stack oriented language, lives on in PostScript and SVG for graphical presentation.

The primitive reptilian brains that evolved in that formative era still pulsate within the modern organism.

In contrast, APL has almost vanished. There are implementations. There is GNU APL, and Dyalog has produced a modernized version. There was a brief APL revival back in the 1980s, when many of the language’s features were adopted for programming the Connection Machine. The last Connection Machine I saw was at the Museum of Modern Art, and while its numerous LEDs were still blinking, its internal architecture and programming paradigm have been all but forgotten.

We are in a new golden era of computer language development. I’ll call it the silver age, with new languages like Python, Ruby and Javascript built on the foundations of the golden age but offering greater convenience and expressiveness. These languages were informed by FORTRAN, Algol, LISP, COMIT and others. They offer the best features of golden age languages in modern packages to help modern software developers. Many of those features were ahead of their time and are only now being recognized and incorporated as processors and systems have grown more advanced and more powerful. In contrast, APL has been neglected. It was way too far ahead of its time, but it might be time to reassess this.

One big development in modern computing has been the widespread adoption of SIMD processors. SIMD stands for single instruction, multiple data. In conventional processors there is a single instruction stream, and each instruction manipulates a single set of values. An add instruction might add two numbers to get a third. In a SIMD processor, an add instruction might add two arrays of numbers to get a third array. If you have lots of numbers to add, that sounds great, but where are these processors today?

As it turns out, SIMD processors are everywhere. Graphics processors, GPUs, are SIMD, and they can perform the massive computation needed for 3D graphics, compression, real time composition and other processor intensive operations. Even a modestly priced laptop or smartphone has more computing power in its GPU than could be found in a 1990s supercomputer center. The problem with GPUs, however, is that they are hard to program. Much of the code they run has been specifically developed to take advantage of the raw power available. No one writes Javascript code to run on a GPU, and there really isn’t a language that makes programming GPUs as easy as using Javascript in a browser.

The first SIMD processors were built back in the 1960s. The Illiac IV was probably the first supercomputer to support SIMD processing. It had a main conventional processor and an additional 64 floating point units that were controlled by a single instruction stream. In the early 1970s, when it was finally completed, it delivered 50 MFLOPS. In a sense it was the first of its kind, and it would be decades before SIMD processors became common.

Meanwhile, APL lurked in the shadows. It had a bit of a cult following but never moved into the mainstream. One of the few signs of its influence was in a language for programming the Connection Machine, a highly parallel processor for artificial intelligence developed in the 1980s. The Connection Machine had tens of thousands of processing elements but a single instruction path. Danny Hillis, its inventor, developed a programming language for the machine that supported “alpha” parallelism and “beta” reduction. One had to think in terms of an intrinsically parallel data type, the “xector”, to program it. These ideas appear today in modern parallel programming systems that use map and reduce. The primary influence was from LISP, but Guy Steele, who developed the programming language for the commercial form of the Connection Machine, had written an APL interpreter in LISP and almost certainly adopted ideas from APL into the new programming paradigm.

There really isn’t much more historical background. Perhaps it’s time for a bit of time travel. We’ll set the dial back to 1970 and find ourselves in a small room with an IBM 2741 terminal, an antique, but then modern, teletype connected to an antique computer, perhaps an IBM 360 running CP-67 and an APL subsystem. This may be hard to imagine nowadays. Old fashioned computer terminals were simply computerized typewriters. They had a keyboard and a platen and a ribbon for ink. Unlike a typewriter, paper was fed from a box or roll rather than in single sheets, but otherwise an IBM 2741 looked like a typewriter.

The IBM 2741 was different from most typewriters in that it had a single “golf ball” print head rather than individual striking elements for each letter. The “golf ball” would turn and tilt, then slam the inked ribbon against the page and print a character. This had advantages. Unlike a more conventional typewriter where each key had its own mechanism, there was only one moving part, the “golf ball”. This made it easier for a computer to control. Even more amazing, the “golf ball” could be replaced with another “golf ball”, so the typewriter could change fonts. If you were typing with the Roman alphabet and wanted to type something in Cyrillic, you just changed the “golf ball”. We take the ability to shift fonts for granted these days, but IBM had done something quite impressive.

The point of this was that APL used its own special character set. There was a special APL “golf ball” and a special keyboard mapping for it. APL had started out as a mathematical notation, but the “golf ball” helped it become a computer language. Mathematicians had been using symbols like little arrows, triangles and comparison operators for ages, but computers back then couldn’t handle things like a greater-than-or-equal sign or even things like curly braces. This meant languages like FORTRAN used awkward notations: greater-than-or-equal was rendered “.GE.” It worked. It was useful. It was ugly. The whole point of APL as a programming language was to turn good looking mathematical notation into good looking programming language. Turning its special characters into EBCDIC compatible sequences would have defeated its aesthetic purpose.

That’s enough preamble. Now for some APL.

APL had two data types. There were numbers and there were characters. It had one aggregate data type. If a datum wasn’t a scalar, it was an array. There were no structures, sets, hash tables or classes. APL was about arrays. APL was about thinking different.

One obvious difference was that APL used an unusual notation for numeric constants. Most computer languages use the same character to indicate negative numbers as for negation or subtraction. APL had a special character to indicate negative numbers: ¯ as in ¯3.5. That meant one could type in the values of a one dimensional array as a simple sequence of numbers.

For instance:

1 ¯2 3 ¯4

is a one dimensional array of length 4 containing the values one, minus two, three and minus four in that order. Using a more conventional notation for negative numbers would have introduced ambiguity.

A single number appearing alone would be a scalar. A sequence of numbers would be a one dimensional array. Characters were delimited by single quotes, as they were in many languages. A single character in quotes would be a scalar. Multiple characters in quotes would be a one dimensional array of characters.

APL has the usual arithmetic operators, but the symbols for multiplication and division are the characters familiar from arithmetic, not from programming languages. Most modern languages use an asterisk for multiplication. APL uses a multiplication sign, as in 2×3. Division is indicated by the familiar division sign, as in 2÷3. In APL, most operators have a monadic, single argument form and a dyadic, two argument form. Some of this should be familiar, but APL adds a few twists. For example, monadic division returns the reciprocal, so that:

÷4 → 0.25

The arithmetic operators work on scalars as one might expect:

2+3 → 5

4÷5 → 0.8

The operators also work on arrays, even in the dyadic form, as long as the two arrays have the same dimensions:

1 2 3+3 ¯5 4 → 4 ¯3 7

-1 0 ¯1 → ¯1 0 1

The operation is applied to each element for a monadic operation and to each corresponding pair of elements for a dyadic operation.

One can also use a dyadic operator with a vector and a scalar:

1 2 3÷7 → 0.1428571429 0.2857142857 0.4285714286

7+ 1 2 3 → 8 9 10
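As a rough sketch of these rules, in Python rather than APL and with plain lists standing in for APL arrays, the elementwise application and scalar extension might look like this. The `dyadic` helper is a name invented for this illustration:

```python
from operator import add, truediv

def dyadic(op, left, right):
    """Apply op elementwise; extend a scalar argument to match an array."""
    if isinstance(left, list) and not isinstance(right, list):
        right = [right] * len(left)
    if isinstance(right, list) and not isinstance(left, list):
        left = [left] * len(right)
    if isinstance(left, list):
        if len(left) != len(right):
            raise ValueError("length error")  # APL is fussy about shapes too
        return [op(a, b) for a, b in zip(left, right)]
    return op(left, right)

print(dyadic(add, [1, 2, 3], [3, -5, 4]))  # → [4, -3, 7]
print(dyadic(add, 7, [1, 2, 3]))           # → [8, 9, 10]
print(dyadic(truediv, [1, 2, 3], 7))       # each element divided by 7
```

In APL none of this machinery is visible; the extension and pairing are simply part of what every operator means.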

Already, one can see how this encourages thinking in terms of parallel processing. Since array operations are implicit, they can be compiled into efficient SIMD code. APL encourages array based SIMD thinking by making it a central, natural part of the language.

APL has a lot of operators, most having both a monadic and a dyadic form.

| Operator | Monadic | Dyadic |
| --- | --- | --- |
| + | Identity | Addition |
| - | Negation | Subtraction |
| × | Sign, as ¯1 0 1 | Multiplication |
| ÷ | Reciprocal | Division |
| * | Exponential, power of e | Raise to a power |
| ⍟ | Natural logarithm | Logarithm in a given base |
| \| | Absolute value | Remainder |
| ⌈ | Ceiling, round up | Maximum |
| ⌊ | Floor, round down | Minimum |
| = ≠ ≤ < > ≥ | No monadic form | Comparison operators |
| ∨ ∧ | No monadic form | Logical or, and |
| ! | Factorial or gamma | Combinatoric C |
| ? | Random integer | Deal a random hand |

The comparison operators return 0 for false or 1 for true, much as they do in C and related languages. The logical and and or operators treat zero and negative numbers as false and positive numbers as true.

With all these operators, operator precedence becomes a problem. Does one perform a maximum before a multiplication? It could get very complicated, but APL finesses the issue. The order of evaluation is strictly right to left unless modified by parentheses. This means that:

2×3+4 → 14

APL adds 3 and 4, then multiplies the result by 2. Ordinary expressions may look cryptic, but even exotic expressions will be comprehensible if one keeps this rule in mind.
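A toy illustration of the rule, not an APL implementation: a flat expression of numbers and dyadic operators can be folded strictly right to left. The token-list representation and the `eval_rtl` name are inventions for this sketch:

```python
from operator import add, sub, mul, truediv

OPS = {'+': add, '-': sub, '×': mul, '÷': truediv}

def eval_rtl(tokens):
    """Evaluate [value, op, value, op, ...] strictly right to left."""
    result = tokens[-1]
    for i in range(len(tokens) - 2, 0, -2):
        result = OPS[tokens[i]](tokens[i - 1], result)
    return result

print(eval_rtl([2, '×', 3, '+', 4]))  # → 14, not 10: 3+4 happens first
```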

APL may seem exotic, but it does have ordinary things like variables, which are strings of letters and numbers as long as the first character is a letter. Assigning a value to a variable is no different than in most familiar programming languages:

abc←3 4 5

OK, the use of a left pointing arrow is a bit exotic. The result of an assignment, considered as a dyadic operator, is the value assigned. The example above would yield a three element array containing 3, 4 and 5 in order.

Let’s consider a simple example. Suppose we want to composite two images stored in the variables imaginatively named img1 and img2. Perhaps we are fading from one image to another, so we have the fading factor, a number between 0 and 1, stored in the variable fade. We can composite the two images with the expression:

(img1×fade)+img2×1-fade

Evaluating from right to left, APL computes 1-fade, a scalar. Then it multiplies each element of img2, the second image, by that number. The next part of the expression is in parentheses, so APL then multiplies each element in img1, the first image, by the fading value and adds that array to the array computed using img2.

There is no iteration, no looping. There is nothing about the dimensions of the images. This would work on two dimensional gray scale images or on three dimensional color images where each pixel might have three RGB values or four CMYK values. It could even be used for a volumetric grayscale fade or a volumetric color fade. APL is based on array notation, so the details of the array structure are hidden.
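A sketch of the fade composite in Python, with nested lists standing in for APL arrays. Because `blend` (a name invented here) recurses through the nesting, it is indifferent to the number of dimensions, much as the APL expression is:

```python
def blend(img1, img2, fade):
    """(img1×fade)+img2×(1-fade), elementwise, at any nesting depth."""
    if isinstance(img1, list):
        return [blend(a, b, fade) for a, b in zip(img1, img2)]
    return img1 * fade + img2 * (1 - fade)

gray1 = [[10, 20], [30, 40]]      # hypothetical 2x2 grayscale images
gray2 = [[50, 60], [70, 80]]
print(blend(gray1, gray2, 0.25))  # → [[40.0, 50.0], [60.0, 70.0]]
```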

Although APL is an old language, this notation could be compiled efficiently for a modern SIMD processor. I doubt anyone has such an APL compiler, but anyone who programs such systems will recognize a natural fit with the APL thought process.

One technique for rapid rendering is to use what is called a Z-buffer. A Z-buffer records the distance of each pixel in a rendered image from the camera. Suppose each image, img1 and img2, in the above example had a Z-buffer with a distance value for each pixel. Combining the images properly requires using the nearest pixel value from one image or the other. In APL this could be done with a simple expression:

(img1×zbuf1>zbuf2)+img2×zbuf1≤zbuf2

The comparisons of zbuf1 and zbuf2 return arrays of the same size and shape as the original Z-buffers, but containing only 0s and 1s according to which Z-buffer contained the nearer or farther pixel. The images are combined using multiplication so that either a pixel from img1 or a pixel from img2 is selected.
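The same mask-and-multiply trick can be sketched in Python (the `zselect` name and the tiny one dimensional test images are inventions for this illustration; in Python, a comparison conveniently multiplies as 0 or 1, just like APL's comparison results):

```python
def zselect(img1, img2, z1, z2):
    """(img1×z1>z2)+img2×z1≤z2: comparisons yield 0/1 masks, then multiply."""
    if isinstance(img1, list):
        return [zselect(a, b, c, d) for a, b, c, d in zip(img1, img2, z1, z2)]
    return img1 * (z1 > z2) + img2 * (z1 <= z2)

img1, z1 = [1, 2, 3], [5, 1, 4]
img2, z2 = [9, 8, 7], [2, 6, 4]
print(zselect(img1, img2, z1, z2))  # → [1, 8, 7]
```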

Programming in APL involves thinking in terms of array operations, operations which are intrinsically parallel. So far, it has all been about the map part of map and reduce. APL also has an operator for reduction, which reduces the dimensionality of its result. The reduction operator, written as a slash, takes another APL operator and turns it into a reduction operator. This is best explained with some examples:

+/1 2 3 4 5 → 15

⌈/2 ¯1 3 → 3

In the first example, the addition operator is turned into a summation operator which sums the array. In the second example, the maximum operator is turned into a maximization operator which finds the largest value in the array.
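The slash behaves much like a fold, which Python spells `functools.reduce`. (One caveat in this analogy: APL's reduction folds right to left, which matters for non-associative operations like subtraction; for addition and maximum the difference is invisible.)

```python
from functools import reduce
from operator import add

print(reduce(add, [1, 2, 3, 4, 5]))  # +/1 2 3 4 5 → 15
print(reduce(max, [2, -1, 3]))       # ⌈/2 ¯1 3 → 3
```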

This would be a good time to discuss APL array structure. APL supports arrays with various numbers of dimensions, but APL array dimensions can be zero, so it is possible to have a three dimensional array that is 3 x 0 x 4. It would have no elements, but it would have a shape, so it could match the shape of other arrays. APL is often fussy about the shapes of the arrays it will work with.

There is an operator for dealing with the shapes of arrays. Its monadic form returns the shape of a value, and its dyadic form produces an array of the desired shape. The monadic form works like this:

⍴3 →

⍴3 4 5 → 3

The shape of a scalar is a one dimensional array of length zero. That’s why there is nothing to the right of the arrow. The shape of a one dimensional array is an array with one element, the length of the original array.

The dyadic form of the reshape operator lets one build and reshape arrays, changing the size and the number of dimensions. The first argument is a one dimensional array containing the dimensions, and the second argument is a scalar or array which is repeated to create an array of the desired shape with the desired content. We can create a 4 by 5 array full of random values between 1 and 10 quite simply:

z←?4 5⍴10

That question mark is the random number operator, so each value in the resulting array would range from 1 to 10. The resulting array might print out as:

4 3 10 3 6

3 6 8 10 4

1 7 1 8 10

7 1 3 9 9

Your random numbers might be different.
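Dyadic ⍴ can be sketched in Python as cycling the data to fill the requested shape. The `reshape` name is an invention for this illustration, and a fixed fill is used instead of random numbers to keep the example deterministic:

```python
from itertools import cycle, islice

def reshape(shape, data):
    """Repeat data to fill a row-major array of the given dimensions."""
    total = 1
    for d in shape:
        total *= d
    flat = list(islice(cycle(data), total))
    for d in reversed(shape[1:]):  # fold the flat list into nested rows
        flat = [flat[i:i + d] for i in range(0, len(flat), d)]
    return flat

print(reshape([2, 3], [1, 2, 3, 4, 5, 6]))  # 2 3⍴⍳6 → [[1, 2, 3], [4, 5, 6]]
print(reshape([2, 2], [7]))                 # 2 2⍴7 → [[7, 7], [7, 7]]
```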

We can use reduction to take the sums of the rows or the columns in a two dimensional array:

+/z → 26 31 27 29

+/[1]z → 15 17 22 30 29

+/[2]z → 26 31 27 29

The default form takes the sums of the rows. It turns a 4 by 5 array into a one dimensional array of length 4. APL reduction can also reduce along other axes, so it can be used to take the sums of the columns. APL is no ordinary computer language, and its origin as an array notation permeates it.
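The row and column sums of the matrix above can be checked with a short Python sketch, where `zip(*z)` transposes the nested-list matrix:

```python
z = [[4, 3, 10, 3, 6],
     [3, 6, 8, 10, 4],
     [1, 7, 1, 8, 10],
     [7, 1, 3, 9, 9]]

print([sum(row) for row in z])        # +/z, sums along rows
print([sum(col) for col in zip(*z)])  # +/[1]z, sums along columns
```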

There is another operator that APL programmers find useful, the iota operator for generating a list of integers from 1 to a specified value. This operator makes it easy to evaluate series. For example:

+/÷!¯1+⍳20 → 2.718281828

That’s how one would compute the value of e using a Taylor series in APL. Evaluating from right to left, APL would generate a list of the integers from 1 to 20. Then it would subtract one from each of them. Then it would take the factorial of each of those values. Then it would take the reciprocal of each of those factorials. Finally it would take the sum of that array, with the result shown.
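The same series written out in Python, summing 1/(k-1)! for k from 1 to 20:

```python
from math import factorial

e = sum(1 / factorial(k - 1) for k in range(1, 21))
print(round(e, 9))  # → 2.718281828
```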

Reduction isn’t the only APL operator that modifies another APL operator. One useful operator, the inner product operator, combines two operators. It can be used to turn the addition and multiplication operators into a matrix multiplication operator. Consider the example:

a←3 2⍴⍳6

b←2 3⍴⍳6

a+.×b →

9 12 15

19 26 33

29 40 51
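The generalized inner product f.g can be sketched in Python: combine row i of a with column j of b elementwise using g, then reduce the results with f. The `inner` name is an invention for this illustration:

```python
from functools import reduce
from operator import add, mul

def inner(f, g, a, b):
    """f.g inner product of matrices a and b, as nested lists."""
    return [[reduce(f, [g(x, y) for x, y in zip(row, col)])
             for col in zip(*b)] for row in a]

a = [[1, 2], [3, 4], [5, 6]]  # 3 2⍴⍳6
b = [[1, 2, 3], [4, 5, 6]]    # 2 3⍴⍳6
print(inner(add, mul, a, b))  # +.× → [[9, 12, 15], [19, 26, 33], [29, 40, 51]]
```

With f as addition and g as multiplication this is ordinary matrix multiplication, but any pair of dyadic operations can be plugged in.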

This shouldn’t be surprising. You’ve probably already figured out that you could compute the dot product of two one dimensional arrays using +/a×b. However, once one starts thinking in terms of array operations, other applications arise. For example, searching:

txt←3 10⍴'abcdefghijklmnopqrstuvwxyz1234'

This would set the variable txt to a two dimensional array:

abcdefghij

klmnopqrst

uvwxyz1234

If we use the inner product operator to combine the logical and with the equality comparison operator, we test each column against a selected string:

'dnx'∧.=txt → 0 0 0 1 0 0 0 0 0 0

APL encourages this sort of thinking.
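The column search can be sketched in Python: for each column index, compare the character of 'dnx' against that column of each row and and the results together, yielding the same 0/1 vector:

```python
rows = ["abcdefghij", "klmnopqrst", "uvwxyz1234"]  # the txt array above

hits = [int(all(c == row[j] for c, row in zip("dnx", rows)))
        for j in range(len(rows[0]))]
print(hits)  # → [0, 0, 0, 1, 0, 0, 0, 0, 0, 0]
```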

There is a lot more to APL. There are other operators for modifying and combining operators. There are user defined functions that can be combined using APL’s operator combining operators. There is workspace management, searching, sorting and a bunch of other things. One glaring omission is the lack of control structure. Where are the loops? Where are the conditionals? For the most part, the loops are embedded in the array operations, and the conditionals are embedded in the logical operators.

To be fair though, APL does have one explicit operator for control structure. The right arrow operator is used as a goto operator. That’s right. APL has a goto operator, but no other control structures. If that doesn’t encourage thinking in terms of array operations, nothing will. Even the goto in APL is weird. User defined functions normally execute each line of the definition in turn. Lines may have labels indicated by a terminating colon character, as is done in many assembly languages. These labels define integer constants equal to their line numbers.

The goto operator is monadic and takes an argument, the line number. If the line number is a scalar, then execution continues at that line. If the line number is an array of length zero, that is, an empty array, then the goto operator does nothing and the next line is executed. There you have it, looping and conditionals, unless you choose to exploit the array operations instead.

APL is a curious throwback, but it presents a challenge to the way one thinks about computing. It embeds its parallelism in its arrays, much as the Connection Machine embedded its parallelism in its xectors.

APL can be used as a production language. It is surprisingly powerful for both numeric work and text processing. A friend of mine even wrote a LISP interpreter in it. Still, APL may be more useful as a puzzle, a language that challenges one to think in a new way and provides the tools for exploring this unfamiliar way of thinking.