

FROM THE 'SKETCHPAD' TO 'SKETCHING' -

CONTEXTUALISING 'PROCESSING' : THE PROGRAMMING LANGUAGE FOR ARTISTS AND DESIGNERS .

SUBMITTED BY HEMANT SREEKUMAR,

TO THE INTER-UNIVERSITY MASTER’S PROGRAMME OF DIGITAL MEDIA, HOCHSCHULE FÜR KÜNSTE, BREMEN

IN PARTIAL FULFILMENT OF THE REQUIREMENTS FOR THE DEGREE - MASTER OF ARTS (MA) IN DIGITAL MEDIA

15 MAY 2011 .

SUPERVISORS : PROF. DR. FRIEDER NAKE, UNVERSITÄT BREMEN

PROF. PETER VON MAYDELL, HOCHSCHULE FÜR KÜNSTE BREMEN


DECLARATION OF AUTHORSHIP

I, Hemant Sreekumar, author of this MA thesis - titled "FROM THE 'SKETCHPAD' TO 'SKETCHING' - CONTEXTUALISING 'PROCESSING' : THE PROGRAMMING LANGUAGE FOR ARTISTS AND DESIGNERS" - declare that all literary sources that have been used have been correctly referenced.

: ............................... :

BREMEN, 15 MAY 2011


ACKNOWLEDGEMENTS

I owe my deepest gratitude to Prof. Dr. Frieder Nake for the intense motivation and courage that he has provided me.

I am extremely thankful to Prof. Peter Von Maydell for helping me focus and showing confidence in this project.

I am much obliged to Compart Bremen for being my nest for nearly a year and for providing the apt context & exposure without which this thesis would have been impossible.

I would like to thank Jukka Boehm and Stefan Kreitmayer for helping me gain confidence with computer programming.

I am heavily indebted to my wife for the divine patience she has shown since I filed for this thesis.


ABSTRACT

This thesis examines 'Processing' – the programming language for artists and designers created by Casey Reas and Ben Fry – as a cultural artefact. The title refers firstly to the 'Sketchpad' (the man-machine interaction paradigm conceived by Ivan Sutherland in 1963) and secondly to 'sketching' (which denotes the act of using the Processing IDE). Tracing the act of using the Processing IDE – 'sketching', a verb – back to an important interface of digital media history, the 'Sketchpad' – a noun – covers the last four decades of the history of the digital computer. More importantly, this history conveys the societal and technological causes for the development of a 'programming language' specially catering to the needs of the contemporary creative labour class.

Using the word 'contextualising' before the name of a software product (as has been done in the title of this thesis) also makes it convenient to locate key historical reference points from the context of digital media that bridge the two-culture divide between science + technology (esp. computer programming) and the arts + humanities. To achieve such a 'contextualising', the main research aspects are: evaluating the pedagogical significance of this development (a software) and defining the deeper cultural history of this technical product.


INDEX

0 GROUND
  0.1 INTRODUCTION
  0.2 METHODOLOGY
  0.3 ON SKETCHING
  0.4 ON PROGRAMMING
  0.5 ON DESIGN

1 ARTEFACT
  1.1 META-MEDIUM
  1.2 THE 'SYSTEMATIC DOMAIN' OF MEDIA DESIGN
  1.3 SYNTAX
  1.4 ABSTRACTION
  1.5 ECO-SYSTEM
    1.5.1 INTERFACES
    1.5.2 COMMONS
    1.5.3 OPEN
  1.6 ARTEFACT CONCLUSION
    1.6.1 SELECTION
    1.6.2 ESSENCE

2 TECHNOLOGY
  2.1 CONTINGENCIES
  2.2 ART AND TECHNOLOGY
  2.3 THE ARTIST PROGRAMMER
  2.4 ARTEFACTS
  2.5 CLOSE CONTEXT 1
    2.5.1 SKETCHPAD
  2.6 CLOSE CONTEXT 2
    2.6.1 ALAN KAY
    2.6.2 SMALLTALK
    2.6.3 FROM THE OPERATOR TO THE USER
  2.7 A DEEP HISTORY OF DIGITAL MEDIA
    2.7.1 ROOTS OF THE 'VIRTUAL' MACHINE
    2.7.2 LABOUR MACHINE
    2.7.3 MILITARY MACHINE
    2.7.4 COMMUNICATIONS MACHINE
  2.8 TECHNOLOGY CONCLUSION

3 LANGUAGE
  3.1 CONTAINER
  3.2 ESSENCE OF COMPUTATIONAL MEDIA
  3.3 PEDAGOGY
  3.4 CONCLUSION

4 FINAL THOUGHTS

5 BIBLIOGRAPHY


0 – GROUND

0.1 INTRODUCTION

This thesis has been entitled "From the 'Sketchpad' to 'sketching' - Contextualising 'Processing' – the programming language for artists and designers".

The first section of the title, "From the 'Sketchpad' to 'sketching'", makes a reference to the "Sketchpad" - the man-machine interaction paradigm conceived by Ivan Sutherland in 1963 at the Lincoln Labs, MIT 1. The first section also makes a reference to 'sketching' - a word denoting a quickly summarised figure, generally drawn on paper. In the context of this title, the word 'sketching' is used to denote the act of using the Processing IDE 2. Also known as the Processing Development Environment (PDE), 'sketching' in this context consists of writing code in the text editor to create programs. [ReF 2007, 9]

The second part of the title directly refers to 'Processing'. In the FAQ section of its main wiki page 3, it is described as an open-source programming language developed by Casey Reas and Ben Fry at the MIT Media Lab under the tutorship of John Maeda. The designers of the language claim it to be a programming language primarily intended for artists and designers.

So, the title charts out a time-line – from the creation of the "Sketchpad" in 1963 till recent times, when a software product like Processing calls the user-coded document 'a sketch' and by default saves it as 'sketch_$$$$$' in a default folder called 'Sketchbook'.

Processing is an open source programming language and environment for people who want to create images, animations, and interactions. Initially developed to serve as a software sketchbook and to teach fundamentals of computer programming within a visual context, Processing also has evolved into a tool for generating finished professional work. Today, there are tens of thousands of students, artists, designers, researchers, and hobbyists who use Processing for learning, prototyping, and production. 4

This wiki page FAQ also states that it is an 'introductory language' enabling newer audiences from the arts and media fields to learn the fundamentals of digital computer programming using a simplified Java syntax and visual output. Initially hosted at http://proce55ing.net, it is now available at http://processing.org/. The current stable version of Processing at the time of writing this document is version 1.5, released on 17th April 2011. For the first time, this release features the convenience of running programs on the open source mobile platform Android 5.

1 | Sutherland, Ivan E. "Sketchpad, A Man-Machine Graphical Communication System." Ph.D Thesis. MIT. (1963)
2 | IDE (integrated development environment) - a software application to assist computer programmers in software development. Ideally it consists of: a source code editor; a compiler/interpreter; a debugger.
3 | http://wiki.processing.org/w/FAQ | 27FEB2011
4 | Quoted from the processing.org/ homepage | 27FEB2011
5 | Operating System and other key libraries for Mobile devices supported by Google Inc. http://www.android.com/ | 08MAY2011


Tracing the act of using the Processing IDE – 'sketching', a verb – back to an important interface of digital media history, the 'Sketchpad' – a noun – covers the last four decades of the history of the digital computer. More importantly, this history conveys the societal and technological causes for the development of a 'programming language' specially catering to the needs of designers and artists. The aim here is to examine 'Processing – the software' as a cultural artefact and not just as a piece of technology.

The second part of the title also has the word 'Contextualising'. The word 'context' comes from the early 15th century Latin word contextus - "a joining together". It is also semantically related to the word textura, which meant a "web, texture, structure". 6

Using this word 'Contextualising' before the given title of a software allows us to locate a specific survey into the various social and technological conditions out of which such an entity (this software - Processing) has emerged. At the same time, the larger focus will be on the cultural history of the personal computer and of software applications catering to artists and designers. To achieve such a 'contextualising' the three research aspects are:

ARTEFACT - Evaluating the significance of this development (a software).
TECHNOLOGY - Defining the deeper cultural history of this technical product.
LANGUAGE - Enumerating the facts that would corroborate the pedagogical claims of this product.

The sections 1, 2 and 3 of this thesis deal with these respective research aspects. The third research aspect will also be interspersed within the first two sections. This section, 'The Ground' – 0, lays the base for this thesis. I define the research methodology, introduce 'Processing' and look at how terms like 'computer programming' and 'design' can be understood.

The over-arching theme of this MA thesis is to find key historical reference points from the context of digital media that bridge the two-culture divide between science + technology (esp. computer programming) and the arts + humanities. 'Processing' – the programming language – is just a recent and exciting platform that allows the convergence of these two separate fields of activity [Manov 2008, 78].

0.2 METHODOLOGY

In terms of the textual presentation, this thesis is extremely terminological. There is an overall effort to understand various key terms from the digital media design context – such as 'tools', 'language', 'technology', 'medium', 'abstraction', 'labour' etc. – from different perspectives. All terminological entities that are treated as temporarily special are signified in single quotes. Double quotes denote direct references from a text, and special paragraphs which start from quarter page-width are longer, conceptually pertinent extracts. Direct references that do not fit within the main narrative body are placed in the footnotes along with other comments and internet citations.

6 | http://www.etymonline.com/ > term = context & texture | 08MAY2011


For research aspect number one, I define the socio-technical 'ecosystem' upon which Processing exists by way of looking at its syntax. I look at the aims of the creators and locate how effectively they are being met. This section will deal primarily with defining the artefact per se. Artefact – this term implicitly presumes some intentionality and authorship. This fusion of the Latin words arte "by skill" with factum "thing made" has since the early 19th century meant anything made by humans (art) 7. The term intends an artificiality (as opposed to 'natural', made by nature), an object with an anthropocentric purpose. [Hil 2008]

For research aspect number two, I trace a history of the significant software tools and technologies directly catering to the needs of artists and designers. 'Cultural software' – as we currently understand it because of its mass appeal – consists predominantly of computer operating system and internet browser based applications, which 'users' 'use' to interact with other users, to interact with various media/information, and to share these interactions with other users. [Manov 2008, 13]

This 'cultural software', most importantly, rests on a massive, largely invisible communications network infrastructure. Thematically this section is aligned with the term "Software studies" [ibid, 5], which denotes the field of study that seeks to understand contemporary society and its constituents through their mutual and unique use of various software to interface with each other. It presupposes that 'computer science' is part of culture (academic, industrial, social), and that unique ways to examine the socio-cultural factors which go into the formation of this cultural or infrastructural software must come from a convergence of media theory, digital media archaeology and cultural studies, apart from the various technology studies domains.

.. if we want to understand contemporary techniques of control, communication, representation, simulation, analysis, decision-making, memory, vision, writing, and interaction, our analysis can't be complete until we consider this software layer.

Which means that all disciplines which deal with contemporary society and culture - architecture, design, art criticism, sociology, political science, humanities, science and technology studies, and so on need to account for the role of software and its effects in whatever subjects they investigate. [Manov 2008, 8]

For research aspect number three, I look at some literature from which evidence can be gathered to substantiate the pedagogical aspects of Processing. Building on the level set by research aspect one, this section looks at past pedagogical attempts which unite computer programming with a visual feedback structure. I will be focussing only on products catering to 'creative labour' like artists and designers, but similar key developments in the field of children's education will also be looked at. This section thrusts upon the pedagogical ideas behind 'Procedural Literacy' 8, a term which highlights radical notions of understanding what it could mean to be computer literate in the near future.

7 | http://www.etymonline.com/ > term=artefact | 27FEB2011
8 | See the section 3.2 - Term used from [Mat 2005].


Literacy has to do with the ability to access a language, to be literate. The word 'literate' derives from the 15th century Latin word lit(t)eratus, which meant literally "one who knows the letters". I refer to this because the present popular definitions of media-literacy and computer-literacy vary distinctly.

Computer-literacy is commonly understood as the ability to use a computer efficiently (using an operating system running some software) while media-literacy is generally understood as “…the process of accessing, analysing, evaluating and creating messages in a wide variety of media modes, genres and forms. It uses an inquiry-based instructional model that encourages people to ask questions about what they watch, see and read.” 9

The proponents of ideas like 'Code Literacy' and 'Procedural Literacy' situate their ideas in the middle of these two understandings. The creators of Processing have laid great stress on the software being first and foremost a pedagogical medium, created to invite participation from people who are not trained as computer programmers. Unlike other professional IDEs (Microsoft Visual Studio, Eclipse, Netbeans etc.), they claim that as trained designers they wanted the act of coding to be similar to traditional creative sketching. So, metaphorically and literally, the vocabulary of 'sketchbook', 'sketch' etc. refers to the presupposition that the long code needed to set up a graphics context is taken care of. 10
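To make this presupposition concrete, here is a minimal sketch of my own (an illustration assuming the Processing 1.5 syntax discussed in this thesis, not an example taken from its sources). These four statements form a complete, runnable program; window creation, the rendering context and the event loop are all handled behind the scenes once the 'Run' button is pressed.

    size(300, 300);                // open a 300 x 300 pixel display window
    background(255);               // clear it to white
    stroke(0);                     // draw outlines in black
    ellipse(150, 150, 100, 100);   // a circle at the centre of the window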

0.3 ON SKETCHING

In the essay introducing 'Processing' for the 2003 Ars Electronica catalogue, its creators Casey Reas and Ben Fry provide a clear reasoning for why they developed such a non-commercial product.

The target audience for this product is ideally 'new' to computer programming (i.e. not traditionally trained to script and design software) and is constituted from a hybrid of 'creative' labour domains - visual arts, graphic design and animation. They also claim an intention to connect electronic art concepts to a programming audience, thereby laying the foundations of a software ecosystem where there is mutual learning from the exclusive skill sets of the opposites (interfacing the technically able with the aesthetically able 11).

Processing, according to Reas & Fry, converges three concepts – a 'simple' programming language, a 'local' development environment and a pedagogical structure. The programming syntax is simplified from Java but supports important Java structures, and the development environment exports programs as internet-distributable Java applets.

. . . built specifically for learning and prototyping...

… shifts the focus of programming away from technical details like threading and double-buffering and places emphasis on communication..

9 | Websters online dictionary – Definitions - literacy / Computer+literacy / Media+literacy | 27FEB2011
10 | The idea of just writing a short piece of code that runs very easily (via a little run button) is a direct descendant of John Maeda's work in Design By Numbers. - http://wiki.processing.org/w/FAQ | 27FEB2011
11 | 'Technically', here, means the ability to read and write computer programmes.


...a text programming language specifically for making responsive images, rather than creating a visual programming language [ReF 2003]

Citing nearly two decades of the GUI's prevalence, Reas & Fry, writing about their intentions behind creating Processing, mention that the existing status quo of teaching programming through the CLI (command line interface) can be progressed by teaching programming through graphical/visual output. They expect this would bring the learning of computer graphics and interaction principles up to the surface for the new audience.

Catering to advanced users of script-based popular media design software like Director and Flash, the option to get deeper into more complex languages like C++ and Java is made more pleasant by removing the drudgery of coding out an efficient graphics context. They claim that the possibility to compose fairly "sophisticated visual and responsive structures", attained from a balance of available features and simplification of syntax, bridges into a faster familiarity with "vector/raster drawing, 2D/3D transformations, image processing, color models, events, network communication, information visualization, etc." [ibid]

Furthermore, the skill set gathered from using/learning Processing is deemed viable for accessing more advanced languages and APIs

...including web authoring (ActionScript), networking and communications (Java), microcontrollers (C), and computer graphics (OpenGL). The project is built around Java so that the programming skills learned using Processing can be directly transferable to these more advanced environments once the time is appropriate.[ReF 2003]

Though the particular IDE is called Processing, the designers state explicitly that the IDE is just one unique instance developed around a Java environment which connects to graphics features from PostScript and OpenGL 12. All text/code typed into Processing gets converted/interpreted as simplified Java code and compiled into Java byte code [ReF 2003]. This byte code is stored as an applet which can be displayed on all popular operating systems and web browsers. Since the word Java has appeared so many times, some references from the person who developed Java are due. Gosling, writing the 'Java white paper' in 1995, uses a couple of keywords to describe it.

simple, object oriented, distributed, interpreted, robust, secure, architecture neutral, portable, high performance, multi-threaded, dynamic language. [Gos 1995]

I will look at some of these keywords in the sections below, but first I want to re-phrase what he intends by the word 'simple'. He elaborates 'simple' as a paradox - a complicated system which is strong enough to be programmed by those who do not know the 'esoteric' nuances of C++ 13.

12 | OpenGL > 2D/3D graphics API + PostScript > a page description language.
13 | JAVA omits many rarely used, poorly understood, confusing features of C++ that in our experience bring more grief than benefit. This primarily consists of operator overloading (although it does have method overloading), multiple inheritance, and extensive automatic coercions. [Gos 1995]


He states that the object-oriented features 14 of C++ were used for their convenience, but he also provides one instance of improvement by citing the constant allocation and freeing of memory inherent in C and C++, which is automated in Java, thereby making the programming 'easier' and cutting down on bugs. He further elaborates 'simple' as the ability to create software small enough to work independently on small machines. 15

This aspect of the cause of 'Processing' is also made clear in the introductory section of the Processing handbook [ReF 2007], where it is stated that each programming language, being a unique medium, is distinct in some ways - all of them suitable for some particular tasks and not useful for many others, 'Processing' being just one among the many. Quoting Larry Cuba's self-proclaimed situation, in which each of his animation works has been developed on a different system, the authors reiterate this inherent aspect of programming languages in general: the final decision - a choice of which features to use - depends solely on the project goals.

In support of the decision to use Java as the base for developing the Processing language, they offer their intention to serve Processing's pedagogical concerns by opting for a syntax and coding style that offers a sustainable learning path leading into other programming languages. The alignment with the intended audience's involvement in applied creativity is compared with how architects, musicians and artists have traditionally used processes like prototyping, demo tapes and small-scale models - like a 'sketch', a form-finding process - before dealing with the presentation logistics of the final output.

So, to bring the demo-medium (in which this form-finding process occurs) closer to the medium of the final output when composing software, Processing is validated, given its convenience for exploring various ideas within relatively little time. As mentioned before, this convenience exists primarily because the advanced skills required to structure programming syntax well enough to provide a graphics context are taken care of. On the other hand, being a descendant of the C programming language 'heritage', Processing inherits a valuable amount of programming language refinements, improving access for people already familiar with writing software. [ReF 2007, 7]

Processing can be obtained for Linux, Windows and Macintosh OSs using any web browser pointed to www.processing.org/download, where installation instructions are also linked. The Processing IDE, also called the Processing Development Environment (PDE), is constituted of a minimal set of GUI utilities: a text box for typing code, a text console to output reports, a space below it to show activity status messages, a top section with tabs for code sections, and a toolbar featuring menus listing generic actions.

The text box permits code to be copied and pasted (Ctrl+C, Ctrl+V) and has the usual text search-and-replace option (Ctrl+F). The message area mentions errors and other generic save and export actions. Once the 'Run' button is pressed, a new display window opens to show the result after the code is successfully compiled; otherwise an error message is shown in the output console. The output console also displays complete error messages and any text output generated by running programs.
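A small illustrative sketch (mine, not from the handbook) shows this division of labour between the display window and the output console:

    void setup() {
      size(200, 200);
      println("setup complete");            // appears in the output console
    }

    void draw() {
      background(255);
      rect(50, 50, frameCount % 100, 20);   // drawing goes to the display window
      if (frameCount % 60 == 0) {
        println("frame " + frameCount);     // periodic report to the console
      }
    }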

14 | This term is looked into more deeply in section 1.4
15 | The size of the basic interpreter and class support is about 30K bytes; adding the basic standard libraries and thread support (essentially a self-contained micro-kernel) brings it up to about 120K. [Gos 1995]


The code typed into the text box - the software - when saved is by default 'saved as' "sketch_$$$$" (hence the literal connotation with the term 'sketching'), in a folder of the same name consisting of a main .pde file and a data folder [ReF 2007, 10]. This data folder contains regular media files - images, fonts, music, video and documents in plain text, .pdf or spreadsheet formats - which can be accessed while the software is running. These same media formats can also be created as output by the software while it is running or at its termination.
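As an illustration of this folder convention (my own sketch; 'photo.jpg' is a hypothetical file assumed to have been placed in the sketch's data folder beforehand):

    PImage photo;   // media loaded from the sketch's data folder

    void setup() {
      size(400, 300);
      photo = loadImage("photo.jpg");   // file names resolve relative to the data folder
    }

    void draw() {
      image(photo, 0, 0);               // display the loaded media file
    }

    void keyPressed() {
      saveFrame("frame-####.png");      // media can also be written back out at runtime
    }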

This 'sketch', or .pde file, can also be 'exported' as a Java applet embedded in an index.html file within the sketch folder. All the .pde files containing the code sections are compiled into a unique Java Archive (JAR) file within the sketch folder, which includes all the contents of the data folder. The index.html file opens in the default browser of the OS and displays the visual as it would run in the original window. To make things more convenient, this export command creates separate folders containing unique Java applications for the Linux, Windows and Macintosh OSs, including the necessary libraries, apart from the source code and the applet itself.
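The relationship between a 'sketch' and the Java it becomes can itself be sketched (an indicative illustration under my own assumptions: the pre-processor genuinely wraps sketch code in a class extending PApplet, but the exact generated source differs in detail, and 'sketch_110515a' is a hypothetical auto-generated sketch name):

    // What is typed into the PDE:
    void setup() {
      size(200, 200);
    }

    void draw() {
      background(0);
      line(0, 0, mouseX, mouseY);   // a line following the cursor
    }

    // Roughly what the pre-processor hands to the Java compiler
    // before byte code is produced:
    //
    //   import processing.core.*;
    //
    //   public class sketch_110515a extends PApplet {
    //     public void setup() { size(200, 200); }
    //     public void draw() { background(0); line(0, 0, mouseX, mouseY); }
    //   }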

0.4 ON PROGRAMMING

A digital computer can be described, at its simplest, as a piece of hardware with interconnected switches that can be switched on and off. This hardware can be 'operated' with just a binary (digits 0 or 1: bits) sequence of signs that can control the switches' state. Within the paradigm of the von Neumann design 16, both data and instructions have the same format. For this hardware, which reacts to only these two signs - themselves just "signifiers of voltage differences" [kit3 1992] - software is 'literally' meaningless.

When meanings come down to sentences, sentences to words, and words to letters, there is no software at all. Rather, there would be no software if computer systems were not surrounded any longer by an environment of everyday languages.[kit3 1992]

To program this hardware means to manage sequences of bits (signs), either as data or as instructions. A programming language is the linguistic interface which allows humans to efficiently operate the on/off switches of this machine. It is a notation to represent algorithms and data structures such that they can be manipulated. Programming is the task of composing series of inter-related, step-by-step instructions - algorithms - textual descriptions of a computational procedure. The 'program' is an implementation of an algorithm in some programming language. For the hardware, an algorithm and data are indistinguishable if not differentiated by an 'interpreter' which converts human-understandable text into a binary string.
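To ground this distinction in the language under discussion, a small example of my own: Euclid's procedure for the greatest common divisor is the algorithm (a textual description of a computational procedure), while the function below is a program implementing it in Processing's Java syntax; the final line hints at the bit sequences the hardware actually manipulates.

    // Euclid's algorithm, implemented as a program
    int gcd(int a, int b) {
      while (b != 0) {
        int t = b;
        b = a % b;   // replace the pair (a, b) with (b, a mod b)
        a = t;
      }
      return a;
    }

    void setup() {
      int result = gcd(1071, 462);
      println(result);              // prints 21 to the output console
      println(binary(result, 8));   // the same value as a bit sequence: 00010101
    }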

16 | Von Neumann's design postulated the following three system elements: firstly, a central processing unit for command-controlled processing of alphanumeric data by either mathematical or logical rules; secondly, a write-read memory for variable data and a read-only memory for programmed commands; thirdly, a bus system for sequential transmission of all these data and commands as bi-univocally indicated through binary addresses by pages and columns. With these three parts, von Neumann machines articulated the fundamental structure of information technology as a functional interrelationship of hardware elements. [Kit1 1996]


Deciphering C. S. Peirce's definition of the sign as a triadic relation between a representamen, an interpretant and an object at this early juncture would be fruitful.

A representamen stands for an object by virtue of an interpretant. The interpretant is an expression of what the sign means; the object is an expression of what the sign signifies; the representamen is an expression of what the sign is made of. [Nak 2001, 6]

The representamen is what is created in the human mind (the idea of 0 being 0, 1 being 1, and what they signify), the interpretant is the 0 and 1 itself, and the voltage differences caused by the 1 and 0 are the object. When the object and interpretant coincide, a 'signal' is created out of the 'sign', losing its genuine corporeal character as a sign. [ibid, 7] Since this internal representation of data, command and structure occurs with binary numbers, all the other finite rules of structuring programs are governed by social processes.

Which is why computers in principle comprehend all other media and can subject their data to the mathematical procedures of signal processing. Data throughput and access time depend solely on physical parameters. Since 1948, when the transistor replaced the tubes/printed circuits of the second World War, and 1968 when integrated circuits replaced the single transistor, in each case reducing the space and time requirement by a factor of ten, real time analyses and real time syntheses of one-dimensional data flows (of speech or music for example) are no longer any problem. [Kit1 1996, 19]

To learn a programming language means gaining familiarity with composing algorithms (primitive computational steps) using a finite syntax to represent abstract mathematical functions/relations.

When he considered the labour process exclusively, without its 'social formation', Marx listed three simple elements which constitute it. I would like to consider these in the context of programming and software development.

Firstly, the labour process defines 'the purposeful activity', the work itself, which in this context would be typing higher- or lower-level language code, defining mental models and processes in explicit detail using combinations of alphanumeric characters within the restricted vocabularies of syntaxes.

Secondly, the labour process defines 'the objects on which that work is performed', the raw materials; taken literally, this would be the keyboard for input, other hardware, a compiler, other code libraries, technical software frameworks, the main project requirements themselves and the broad digital communications infrastructure.

Thirdly the labour process defines 'the instruments of that work', which in this context would be a combination of the first two.


Marx had noted that the labour process was the 'cultural universal' which develops and changes throughout history. The 'labour process' appropriates everything that exists in nature for the requirements of man, which in this context is extremely broad. Apart from the coded algorithms, which are modelled on 'natural' processes and strategies of feasible abstractions, the hardware (storage, memory, processors etc.) is also engineered by machines using refined natural minerals from the earth. To make the coded processes and hardware 'active', electricity is used, again generated by transforming the 'natural conditions' of water, wind or minerals. [Mac 1984, 9]

Marx did not separate the valorisation process from the labour process; he saw both as different aspects of the same process of production. The labour process, as defined above, is associated with the ways in which people work, with instruments (code), on raw material (code), to create a product with certain given properties (code). The valorisation aspect is important to understand since it causes the product to be of larger value than the sum of the three elements of the labour process.

In a programming context, the valorisation process is the complete vision of the software and its efficient implementation by the project management (including 'bug' sorting, testing and marketing). This valorisation of the 'software making' process is implicit in open source software and more explicit in commercial software. Since the 'identity' of a new software is basically, and only, this aspect of valorisation, I note that Marx regarded the valorisation process as the 'social form' of the production process specific to capitalism. [Mac 1984, 10]

0.5 ON DESIGN

A spider conducts operations which resemble those of the weaver, and a bee would put many a human architect to shame by the construction of its honeycomb cells. But what distinguishes the worst architect from the best of bees is that the architect builds the cell in his mind before he constructs it in wax. . . . Man not only effects a change of form in the materials of nature; he also realizes his own purpose in those materials.

Karl Marx (1844) 17

In Bruno Latour's keynote lecture on 'a philosophy of design' [Lat 2008] for the "Networks of Design" meeting, he explains how the usage of the word 'design' has changed, now that it is applied to a wide range of domains.

"Design", he says, in its old meaning of "to relook", was a cosmetic aspect of an efficient but boring, commercially engineered artefact. Design was similar to "adding a veneer of form to their creations", basically a surface feature associated with "taste and fashion". When design was admired, it was always as an alternative to the function of the object. This old conception looks at the object in two ways - through its intrinsic materiality and through its "more aesthetic or symbolic aspects". 18

17 | Karl Marx, Economic and Philosophic Manuscripts of 1844 (London, 1973), p. 113 : [Mac 1984, 6]

18 | To "relook" means to give a new and better “look” or shape to something – a chair, a knife, a car, a package, a lamp, an interior – which would otherwise remain too clumsy, too severe or too bared if it were left only to its naked function. [Lat 2008, 1]


… look not only at the function, but also at the design. 19

The idea of 'design' has shifted. From being a surface feature in the hands of a not-so-serious profession that added features under the purview of much-more-serious professionals (engineers, scientists, accountants), design has been spreading continuously, so that it increasingly matters to the very substance of production.

What is more, design has been extended from the details of daily objects to cities, landscapes, nations, cultures, bodies, genes, and, as I will argue, to nature itself – which is in great need of being re-designed. [Latour 2008, 2]

He finds this shift very revealing of a change in how western civilization deals with objects and action. He states that his old claim that "we have never been modern" is true, since "matters of fact" have turned into "matters of concern". He observes that the modernist divide (between materiality and design) is gradually dissolving away 20. He also offers the provocation that "design is one of the terms that has replaced the word revolution".

If everything can come under design and redesign (including nature), he concludes, things will "neither be revolutionized, nor will it be modernized". [ibid 3] Latour feels that this broad and expanding concept of design signifies a deeper shift in human emotional responses 21.

He claims that design always implies a superficiality, something transitory and relatively linked to fashion and tastes. Considering objects in terms of good or bad design means that "their appearance as matters of fact weakens, their place among matters of concern are strengthened." Latour considers this change in perception to be connected to the increasingly universal use of computers. Digitization has 'expanded semiotics into the core of objectivity' and finalised the transformation of objects into signs, leaving no room for any difference between form and function.

when almost every feature of digitalized artefacts is “written down” in codes and software, it is no wonder that hermeneutics have seeped deeper and deeper into the very definition of materiality. If Galileo’s book of nature was written in mathematical terms, prodigiously expanding the empire of interpretation and exegesis, this expansion is even truer today when more and more elements of our surroundings are literally and not metaphorically written down in mathematical (or at least in computer) terms. [ibid, 4]

19 | This dichotomy was true even though the best design was one that, in good modernist fashion (as it did in "functionalism"), approximated function as closely as possible. "Design" was always taken in this "not only... but also" balance. [Lat 2008, 1]
20 | The more objects are turned into things – that is, the more matters of facts are turned into matters of concern – the more they are rendered into objects of design through and through. [ibid 2]
21 | The modification is so deep that things are no longer "made" or "fabricated", but rather carefully "designed", and if I may use the term, precautionarily designed. It is as though we had to combine the engineering tradition with the precautionary principle. [ibid 4]


The five conceptual bases of 'design' (its advantages) he defines are:

The first conceptual basis of the word 'design' is that it evokes a humility, unlike the words "construction" or "building". Sourcing this modesty from the history of 'design' in its subservient role 22, he remarks that saying 'to design something' is safer than saying 'to build something', since 'design' presupposes not laying a foundation.

The second conceptual basis of design is 'skills', "a mad attentiveness to details that is completely lacking in the heroic" 23. Based on this conception, 'design' has close connotations to words like 'skill', 'art' and 'craft'.

The third conceptual basis of the word 'design' is its semiotic skill. It creates a semantic layer on an object 24. 'Being designed', an object is open to interpretation and analysis 25.

The fourth conceptual basis of the word 'design' is that it presupposes something that already exists. 'Design' is never a process which starts from scratch; it is always remedial.

Designing is the antidote to founding, colonizing, establishing, or breaking with the past. To design is always to redesign. To design is never to create ex nihilo. The most intelligent designers never start from a tabula rasa. 26

The fifth conceptual basis of the word 'design' is that it necessarily involves an ethical dimension 27. In the modernist conception, ethical qualities of goodness and badness were those which matters of fact could not possess.

22 | ...to the "real" practicality, sturdy materiality and functions of daily objects. [Lat 2008, 4]
23 | ...old way - to build, to construct, to destroy, to radically overhaul: "Après moi le déluge!" [ibid]
24 | ...be it symbolic, commercial, or otherwise. Design lends itself to interpretation; it is made to be interpreted in the language of signs. [ibid]
25 | It is thus of great import to witness the depths to which our daily surroundings, our most common artefacts are said to be designed. [ibid]
26 | A given, as an issue, as a problem. Design is a task that follows, to make that something more lively, more commercial, more usable, more user-friendly, more acceptable, more sustainable, and so on, depending on the various constraints to which the project has to answer. This is the advantage of the "not only... but also" feature. [ibid 5]
27 | ...which is tied into the obvious question of good versus bad design. [ibid]


Considering programming code as software design is viable as well, because software is literally cheap to 'build': the actual building is done by compilers and linkers 28. From a development point of view, every aspect of software development - coding, testing, debugging - is part of the design process.

Reeves has stated that, in spite of being cheap to 'build', software is very expensive to design. Due to its complexity, the various design aspects of software have their related design views 29.

The sections featured next in the main body are -

1 | ARTEFACT

2 | TECHNOLOGY

3 | LANGUAGE

28 | We often refer to the process of compiling and linking a complete software system as "doing a build". The capital investment in software construction equipment is low—all it really takes is a computer, an editor, a compiler, and a linker. [Reeves 92]
29 | In software engineering, we desperately need good design at all levels. In particular, we need good top level design. The better the early design, the easier detailed design will be. Designers should use anything that helps. Structure charts, Booch diagrams, state tables, PDL, etc.— if it helps, then use it. We must keep in mind, however, that these tools and notations are not a software design. Eventually, we have to create the real software design, and it will be in some programming language. [Reeves 92]


1- ARTEFACT

1.1 META-MEDIUM

When describing the viable scope of 'software studies', Lev Manovich makes a distinction between 'cultural software' and logistics/industrial software. This distinction is important for understanding the types of software he looks at. He is specifically targeting a special subset of application software used to create digital media content. Apart from typical instances of such media-content creating and accessing software - MS Word, Mozilla Firefox, Adobe Photoshop, Dreamweaver, Flash, After Effects etc. - he holistically even includes complete programming environments, along with entire GUI and post-multimedia interfaces, within the scope of 'software studies'. [Manov 2008, 12]

He notes that "like alphabet, mathematics, printing press, combustion engine, electricity, and integrated circuits, software re-adjusts and re-shapes everything it is applied to – or at least, it has a potential to do this." [ibid, 15]

This first decade of the new millennium, he remarks, has seen a substantial increase in media-design students using programming and coding in their compositions, as well as the use of scripting languages like JavaScript, ActionScript, PHP and Processing by the digital media industry. By keeping their APIs (Application Programming Interfaces) accessible, he claims, web 2.0 companies have made programming more efficient, if not easier. The added benefits which easy access to APIs permits (as through Processing, JavaScript etc.) motivate young designers to take up programming more seriously. Compounded by the existence of 'social software' and 'social media' - platforms/tools which allow rich inter-human communication like "web browsers, email clients, instant messaging clients, wikis, social bookmarking, social citation tools, virtual worlds" - he does not foresee in the near future a significant long tail of this democratisation of software development, but he does state that such a situation is coming around. [ibid, 9]

In section two I will deal more closely with the techno-cultural lineage of Processing, but before going further into this artefact itself, I want to briefly summarise Manovich's idea of the meta-medium. Reflecting upon Alan Kay's association with the Smalltalk programming language in the 1970s at Xerox PARC, he notes that Processing is close to Kay's original vision of a meta-medium. [ibid, 78]

The characteristics of a 'meta-medium' he defines are: the ability to represent various other media, the ability to add new properties to these various media, to provide constant feedback to the user, to establish a two-way dialogue, to satisfy any information-related need, to act as a programming and problem-solving tool, etc. [ibid, 40]


In Smalltalk, he claims, a pioneering approach to creating a programming language was achieved: all the default GUI and media-editing applications contained in the Xerox PARC computer were written in Smalltalk itself, which enabled rapid learning of the language even by beginners, by allowing them to write their own custom software applications through studying and modifying existing ones.

Smalltalk's contribution is a new design paradigm--which I called object-oriented--for attacking large problems of the professional programmer, and making small ones possible for the novice user. Object-oriented design is a successful attempt to qualitatively improve the efficiency of modeling the ever more complex dynamic systems and user relationships made possible by the silicon explosion. - Alan Kay, 1993 30

So, in the spirit of Alan Kay's understanding of the computer as a meta-medium, Processing is classified as a meta-medium (whose 'content' is "a wide range of already-existing and not-yet-invented media"), similar to PHP, Python, ActionScript, VBScript, JavaScript etc., and unlike traditional GUI-based software for digital image, video and shape manipulation catering to designers and artists. This distinction is relevant since a meta-medium which can "simulate the details of any other medium", even non-existent ones, is also implicitly a meta-tool through which other existing as well as unique tools can be created. [Manov 2008, 78]

1.2 THE 'SYSTEMATIC DOMAIN' OF MEDIA DESIGN

I now describe some ideas about creating an ontological cleanliness, to put into perspective the choice of the 'Processing' syntaxes presented in the section after. According to the ideas described below, 'Processing' can be seen as a computational tool applied to a profession-oriented systematic domain 31 of media design.

The content of the domain of computational media design is unique (digital media objects), but there are common elements that cross its boundaries into computer science as well - which is the cross-over the creators of this product hope for. The choice of syntaxes (the ontological plan) used in Processing can now be seen as an analysis of the user's context of activity. The user's context of activity can itself be seen as the result of an analysis of the domains of breakdown.

In the final chapter, titled "Using Computers: A Direction for Design", of their seminal book 32, Winograd and Flores state that the most important 'designing' to assist human interaction with computer-based systems is ontological.

30 | http://www.smalltalk.org/smalltalk/TheEarlyHistoryOfSmalltalk_Abstract.html
31 | A systematic domain is a structured formal representation that deals with things the professional already knows how to work with, providing for precise and unambiguous description and manipulation. The critical issue is its correspondence to a domain that is ready-to-hand for those who will use it. [WFlo 1986, 176]
32 | [WFlo 1986] - Winograd, T. & Flores, F. "Understanding computers and cognition. A new foundation for design." (1986) Norwood, NJ: Ablex. Print


They define ontological designing as "an intervention in the background for our heritage, growing out of our already-existent ways of being in the world, and deeply affecting the kinds of beings that we are." [WFlo 1986, 163] They see computers as "wonderful devices for the rule-governed manipulation of formal representations" and computer science as "the design of mechanisms that can carry out complex sequences of symbolic manipulations automatically, according to a fixed set of rules." [ibid 174]

Winograd and Flores claim that ontologically oriented design is reflective as well as political, since it permits a duality: carrying forth a tradition which has been instrumental in creating the current state, while also carrying ahead into transforming the undeveloped tomorrow. Ontologically designed novel tools alter the users' self-perception, causing new technological development. Lamenting the ambiguity of terms like 'user-friendly', 'easy-to-learn' and 'self-explaining', they remark that these terms, despite their vagueness, do present notions to be extremely concerned about.

Under the heading of "Readiness-to-hand" they interpret futuristic ideas of easy-to-use computers by making an analogy to the convenience of inter-human conversation. After a domain of conversation is established, the mutual pre-understanding permits the use of minimal words and conscious effort. Self-awareness of the structure of the conversation itself occurs only when a breakdown of this pre-understanding occurs. Based on this analogy, they infer that if a machine's understanding were to become human-like, then interaction with a computer could turn transparent as well. They claim 'the transparency of interaction' to be very significant for the design of tools, especially computer systems, but warn against mimicking human faculties. Further, citing the example of driving a car with a steering wheel, they note how this process occurs without the person becoming completely absorbed in operating the controls, allowing her to concentrate on the road.

Phenomenologically, you are driving down the road, not operating controls. The long evolution of design of automobiles has led to this readiness-to-hand. It is not achieved by having a car communicate like a person, but by providing the right coupling between the driver and action in the relevant domain (motion down the road). [ibid 164]

In reference to designing computer tools, they state that the issues are the same. A good tool allows a user to focus on the work itself instead of being bothered with the awareness of formulating and giving commands. They lay greater stress on the 'design of the domain' in which a user is situated. A bad design would thereby be one which forces the user to deal with complexities that belong to a different domain, while a careful negotiation of the user's domain of understanding leads to a 'readiness-to-hand' and the ontological relevance of the given tools.

However, they also write that a breakdown is not entirely a negative situation which ought to be avoided, since the breakdown achieves an unconcealed situation in which the user realises hidden aspects of an 'erstwhile good design'.


They suggest instead designing aids for those who work within a 'domain of breakdowns', since it is impossible to avoid breakdowns altogether by means of design. This would involve anticipating all possible future breakdowns and providing the relevant training to help with understanding each of them.

Winograd and Flores reiterate that, akin to breakdown, blindness to future possibilities cannot be avoided by the designer. The designer, according to them, must stay aware of the possibilities being eliminated in relation to the new possibilities being created. They use the term 'systematic domains' to emphasise the fundamental condition of using language. Since words signify the objects that they distinguish, they conclude that "language does not describe a pre-existing world, but creates a world about which it speaks" [WFlo 1986, 174].

They cite the example of the 'systematic domain' of financial markets, which uses terms like "'shares', 'options', and 'futures' whose existence is purely linguistic, based on expressions of commitment from one individual to another". The spreadsheet program VisiCalc 33 was noted as an effective computer tool in this 1986 book because of its clear connection to activities within the domain of commerce. They declare that the next challenge for future designers of computer tools will be to proceed beyond efficient interfaces for manipulating the 'superficial structures' of text and data, towards interfaces for manipulating "the domains generated by what people are doing when they manipulate those structures". [ibid 176, 165]

1.3 SYNTAX

In 1974 Ted Nelson had already noticed that computers were making it possible to transform (revise and improve) all old (cultural) media. He felt that the fundamental issues to be understood were not technical issues with a scientific basis; the computer-based transformation of all old media was a matter of "media consciousness, not technical knowledge". [Nel 1974, 18]

He foresaw that an idea of the wholeness of a system can occur to the user only if its unnecessary complexities have been streamlined away. He takes the example of a text-editing system which was simple for the 'user' but irritated computer people, since it hid what to them was the most significant preoccupation of computer programming: storage.

He remarks that computer and electronics people were like generals preparing for the last war, and that storage allocation was precisely the sort of detail which needed to be kept in the background if the system were to work for regular people. He considered technicalities important, but useful only as the "foundations of a cathedral". [ibid, 5]

33 | Visicalc is a microcomputer program that lets a person manipulate an 'electronic spreadsheet' with rows and columns of related figures. It is one of the most commercially successful pieces of software ever created, and is credited with motivating the purchase of more small home and business computers than any other single program. [WFlo 1986, 175]

THE TECHNICALITIES MATTER A LOT, BUT THE UNIFYING VISION MATTERS MORE. [Nel 1974, 5]

'A Modest Proposal'

Instead of devising elaborate systems permitting the computer or its instructional contents to control the situation, why not permit the student to control the system, show him how to do so intelligently, and make it easy for him to find his own way?

Discard the sequences, items and conversation, and allow the student to move freely through materials which he may control. Never mind optimizing reinforcement or validating teaching sequences. Motivate the user and let him loose in a wonderful place.

Let the student control the sequence, put him in control of interesting and clear material, and make him feel comfortable, interested, and autonomous. Teach him to orient himself: not having the system answer questions, all typed in, but allowing the student to get answers by looking in a fairly obvious place.

Enthusiasm and involvement are what really count. This is why the right to explore far outweighs any administrative advantages of creating and enforcing “subjects” and curriculum sequences. The enhancement of motivation that will follow from letting kids learn anything they want to learn will far outweigh any specialization that may result. Ted Nelson [ibid 13]

'Processing' provides three different modes of programming, each one more structurally complex than the previous. In the most basic mode, programs are single-line commands for drawing primitive shapes to the screen. In the most complex mode, Java code may be written within the environment. The intermediate mode allows for the creation of dynamic software in a hybrid procedural/object-oriented structure.
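
In the most basic mode, a complete sketch can indeed be a single statement; for instance (a minimal illustration of my own, with arbitrary coordinates):

// A complete sketch in the basic mode: one statement drawing one shape
// into the default 100 x 100 pixel display window.
line(10, 10, 90, 90);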

Processing strives to achieve a balance between features and clarity, which encourages experimentation and flattens the learning curve. The Processing homepage contains exceptionally clear documentation to guide a novice into learning computer programming, starting with the rudimentary basics of computer graphics.

One such section, processing.org/learning/basics/, features a comprehensive list of simple examples designed to acclimatise the learner to the fundamentals of composing a program. Categorised into sections grouping contextually similar features of the Processing syntax, each section lists ideas ranging from the very simple to the relatively complex.

The first category, entitled 'Structure', starts by demonstrating the syntax for distinguishing statements and comments ( ; , // ), setting location co-ordinates for graphical entities, and structuring program flow for animation with the void34 setup() and void draw() functions, featuring iterations, loops and recursions. Code placed within the draw() function is executed continuously (a new frame appears on the display window at a speed of 60 fps by default) until the display window is closed. This section clearly shows that a Processing sketch can range from just one line of code creating a fundamental graphical entity like a point or a line, to a more elaborate structure in which features like the size of the display window, background colour, line thickness, frame rate etc. are set once within the setup() function, while more dynamic qualities like the placement, alteration and movement of graphical entities are structured within the draw() function. Bringing the learner into the 'abstract realm' starts as soon as they use a universal algebraic programming concept like 'variable x represents a value, and this value is considered when the code is run'.
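
A minimal sketch illustrating this division of labour between setup() and draw() (my own example, with arbitrary values):

// A minimal continuous sketch: setup() runs once, draw() loops
// at the default rate of 60 frames per second.
void setup() {
  size(200, 200);        // fixed quality: window size, set once
  frameRate(60);         // the default, stated here explicitly
  stroke(0);             // fixed quality: black lines
}

void draw() {
  background(255);                 // dynamic: the frame is cleared and
  line(100, 100, mouseX, mouseY);  // redrawn on every iteration
}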

I have used the term 'function'35 repeatedly above. What it means in the context of Processing, and of programming in general, is "a self-contained programming module" [ReF 2007, 181]. The Processing environment allows users to access a variety of functions which create the feeling of a 'systematic domain' for media design.

Functions have parameters which define specifically what needs to be achieved. For example, line(10,10,50,50); calls the function named 'line', which literally means line, and the remaining parameters set the co-ordinates for the start and end positions on the x and y axes. In Processing, the functions dealing with primitive shapes ( line(), ellipse(), rect(), vertex() ) all work in this way. The naming convention of these functions removes any ambiguity about their intended purpose, and they are easy to recollect after a little usage.

The examples featured in the 'Structure' section already distinguish the draw() function from loop() and noLoop() through simple demonstrations of mouse-click based compositions. There is also an example showing how to design a unique function that can be made to work just like the pre-existing functions by adding some parameters. The next two examples, demonstrating a simple recursion principle, bring in a relatively more complicated scenario at this early stage.
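
A hedged sketch along those lines; the function drawTarget, its name and its parameters are my own illustration, not taken from the website examples:

// A user-defined function, used here with noLoop():
// draw() runs once, and again only when the mouse is pressed.
void setup() {
  size(200, 200);
  noLoop();                            // stop the automatic draw() cycle
}

void draw() {
  background(255);
  drawTarget(mouseX, mouseY, 80, 4);   // called like a built-in function
}

void mousePressed() {
  redraw();                            // run draw() exactly once more
}

// A function of my own design: concentric circles from four parameters.
void drawTarget(float x, float y, float diameter, int rings) {
  noFill();
  for (int i = rings; i > 0; i--) {
    ellipse(x, y, diameter * i / rings, diameter * i / rings);
  }
}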

The inherent nature of the internet document means that this list will probably have changed by the time this thesis is read, but it does summarise the core ideas behind Processing. Providing a visual context for learning programming principles greatly accelerates the grasp of the syntactical and logical features which drive all software.

The later categories of http://www.processing.org/learning/basics/ follow a similar route, starting with easy-to-comprehend applications of primitive graphical entities and moving on to more complex code structures. The category titled 'Form' charts the syntax and functions needed to compose points, lines, primitive shapes (rectangles, ellipses) etc., and then features more advanced constructs like composite shapes ( using | vertex(), beginShape() | ), Bézier curves and Bézier ellipses.

34 | Void - "Keyword used to indicate a function which returns no value. Each function must either return a value of a specific datatype or use the keyword void to specify it returns nothing." - http://processing.org/reference/void.html
35 | 'functions' - "allocating separate tasks as reusable code blocks to be reused throughout a program. Typically, one is concerned only with what a function does, not how it works. This frees the mind to focus on the goals of the program rather than on the complexities of infrastructure." - [ReF 2007, 395]

Similarly, the category titled 'Data' shows form generation by explaining rudimentary concepts of 'variable' usage, data type allocation (integer, byte, floating point, string, character) and boolean (true, false) logic, up to more advanced concepts of datatype conversion and variable scope. Further categories cover control structures (iterations, conditionals), digital images, fonts, digital colour applications, mathematical functions (increment, decrement, modulo, trigonometry, random), interaction (mouse based, keyboard based, system clock based), transformations (translate, scale, rotate), arrays (normal, 2D, object based), linking to the internet and, most importantly, object-oriented structuring.
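
A brief sketch of my own illustrating datatypes, scope and boolean logic at work (names and values are arbitrary):

// Datatypes and scope: a sketch-level (global) variable versus local ones.
int count = 0;                          // global: visible in setup() and draw()

void setup() {
  size(200, 200);
}

void draw() {
  float diameter = count % 200;         // local to draw(); int widens to float
  boolean visible = (count % 60 < 30);  // boolean logic drives blinking
  background(0);
  if (visible) {
    ellipse(width/2, height/2, diameter, diameter);
  }
  count = count + 1;                    // the global keeps its value across frames
}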

In the Processing handbook36 by Reas & Fry, however, the draw() function comes up only at page 173, at the start of the second section of the book. Though the two indices of the book (linear and categorical) implicitly suggest that the authors do not expect the audience to read from the first page to the last, such a format shows another way of teaching, in which a thorough survey of the core features is made before any animation features are explained.

Before introducing the draw() function, this text covers generic syntactical structures and data types ( int, float, char, String - distinguishing whole numbers, decimal numbers, letters and words ). Unlike the website examples, many examples here make use of the print() and println() functions to produce a primitive CLI-type effect through the output console.

In these initial sections a substantial number of examples introduce primitive arithmetic operations37 and decision operations38. Constant use of these primitive operations throughout the other examples prepares learners for the introduction of relatively more advanced functions like | ceil(), floor(), round(), min(), max() |, which deal with primitive selection. Further chapters on mathematics introduce trigonometry-based syntax like | PI, QUARTER_PI, HALF_PI, TWO_PI, radians(), degrees(), sin(), cos(), arc() | and randomness functions | random(), randomSeed(), noise(), noiseSeed() |.
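
A small sketch of my own combining a few of these mathematical functions (the motion and values are arbitrary):

// Trigonometry and randomness: a point orbiting the centre with jitter.
float angle = 0.0;

void setup() {
  size(200, 200);
  randomSeed(42);                        // a fixed seed makes the jitter repeatable
}

void draw() {
  background(255);
  float r = 60 + random(-5, 5);          // random() jitters the orbit radius
  float x = width/2 + cos(angle) * r;    // cos() and sin() give the orbit
  float y = height/2 + sin(angle) * r;
  ellipse(x, y, 10, 10);
  angle += TWO_PI / 120;                 // one full revolution every 120 frames
}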

More interesting than the learning structure is how the authors constantly reference artists, artworks and the history of technology throughout these chapters, to put the programming concepts into a relevant media-arts context. Other introductory sections introduce image-related functions | loadImage(), image(), tint(), noTint() |, colour-related functions | color(), colorMode() | and typography functions | loadFont(), textFont(), text(), textSize(), textLeading(), textAlign(), textWidth() |. Towards the end of this introductory section, displacement and scaling features are examined through the functions | translate(), pushMatrix(), popMatrix(), rotate(), scale() |.
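
As a minimal sketch of how these displacement functions combine (my own illustration, not an example from the handbook):

// Transformations: pushMatrix()/popMatrix() isolate a rotated coordinate system.
void setup() {
  size(200, 200);
}

void draw() {
  background(255);
  pushMatrix();                        // save the current coordinate system
  translate(width/2, height/2);        // move the origin to the centre
  rotate(radians(frameCount));         // rotate the grid, one degree per frame
  rect(-25, -25, 50, 50);              // drawn relative to the moved, rotated origin
  popMatrix();                         // restore: later drawing is unaffected
  line(0, height/2, width, height/2);  // stays fixed in screen coordinates
}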

This book clearly demonstrates a teaching methodology in which, before the animation and interaction sections, substantial space is devoted to examples dealing only with the basic mode of programming. Learning to use a variety of such domain-specific 'functions' further orients the novice towards the concept of 'abstraction' in the context of software.

36 | [ReF 2007] Processing: a programming handbook for visual designers and artists. Casey Reas, Ben Fry. The MIT Press. 2007.
37 | OPERATORS + (add), - (subtract), * (multiply), / (divide), % (modulus), () (parentheses), ++ (increment), -- (decrement), += (add assign), -= (subtract assign), *= (multiply assign), /= (divide assign)
38 | OPERATORS > (greater than), < (less than), >= (greater than or equal to), <= (less than or equal to), == (equality), != (inequality), if, else, {} (braces), || (logical OR), && (logical AND), ! (logical NOT)

The mathematics used inside functions can be daunting, but the beauty of using functions is that it’s not necessary to understand how they work. It’s usually enough to know how to use them — to know what the inputs are and how they affect the output. This technique of ignoring the details of a process is called abstraction. It helps place the focus on the overall design of the program rather than the details.[ReF 2007, 181]

In stark opposition to this approach of gently hand-holding the learner through the fundamentals of programming, the book by Ben Fry, "Visualizing Data - Exploring and Explaining Data with the Processing Environment" [Fry 2007], skips straight into the topic of data visualisation, as its title suggests. He explains in the preface that the intended audience of the book are those who are primarily interested in working with data (programmers and non-programmers), not those who want to learn Processing. It just happens that, because of the author's situation, 'Processing' is used to explain the data visualisation methods; he does state that individuals with prior knowledge of how to code in Java, C++ or ActionScript would have the most success using it, especially with the chapters beyond the first four. [ibid, Preface viii]

The second chapter of this book acquaints the reader with Processing within ten pages, the second example already showing code for primitive mouse-based interaction. The section of this chapter titled 'Functions' provides a list of functions relevant to dealing with data, creating a sub-systematic domain catering to data visualisation.

The default functions are classified into: acquire | loadStrings(), loadBytes() |, parse | split() |, filter | for(), if (item[i].startsWith()) |, mine | min(), max(), abs() |, represent | map(), beginShape(), endShape() |, refine | fill(), strokeWeight(), smooth() | and interact | mouseMoved(), mouseDragged(), keyPressed() |. [ibid 27]
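
A minimal sketch of my own in that spirit, assuming a hypothetical plain-text file data.txt (one number per line) placed in the sketch's data folder:

// Acquire, parse, mine, represent: the smallest possible visualization pipeline.
void setup() {
  size(400, 200);
  background(255);
  noFill();

  String[] lines = loadStrings("data.txt");    // acquire: read the raw file
  float[] values = new float[lines.length];
  for (int i = 0; i < lines.length; i++) {
    values[i] = float(lines[i]);               // parse: text to numbers
  }

  float lo = min(values);                      // mine: find the extremes
  float hi = max(values);

  beginShape();                                // represent: map data to pixels
  for (int i = 0; i < values.length; i++) {
    float x = map(i, 0, values.length - 1, 20, width - 20);
    float y = map(values[i], lo, hi, height - 20, 20);
    vertex(x, y);
  }
  endShape();
}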

The following chapters orient the reader towards techniques for using geographical maps, pulling data from formatted files39, websites and databases, cleaning the data into useful formats, organising it into useful sections etc. Since the demonstration language is Processing, the author can conveniently show various styles of graphical representation of the mined data, along with sophisticated mouse-based interaction, throughout.

This book is unique among the pedagogical literature that has developed around Processing, since it was written as a follow-up to the author's Ph.D. dissertation40, to make its ideas accessible to a wider audience through practical examples. He also states that the featured examples are composed from a clean slate, without simply using existing software libraries to create generic charts or graphs. Since the code is provided for all the examples, with links to a website for the related data files, readers are naturally encouraged to customise what they develop out of these examples.

39 | .txt, .csv, .xls etc.
40 | Fry, Ben. "Computational Information Design". Program in Media Arts and Sciences. PhD Thesis. MIT. (2004)

A tool that has generic uses will produce only generic displays, which can be disappointing if the displays do not suit your data set. Data can take many interesting forms that require unique types of display and interaction; this book aims to open up your imagination in ways that collections of bar and pie charts cannot. [Fry 2007, preface viii]

Another book which takes a unique pedagogical stance on teaching Processing is "Algorithms for Visual Design Using the Processing Language" [Ter 2009] by Kostas Terzidis, an architect and designer. While it resembles the Processing handbook by Reas and Fry in its learning curve, gently hand-holding the reader through the basics of the programming language, the focus of the book is on teaching the design of algorithms. Unlike a computer science text, which would have focussed on the creation of efficient algorithms, the aim here is to assist the design process itself by using algorithmic means to create new design tools.

The author mentions his hope of imparting the knowledge needed to understand complex contemporary design problems, and chapter seven of the book features 'Processing' implementations of algorithms such as Voronoi tessellation, stochastic search, morphing, cellular automata and evolutionary algorithms, which strive towards this end.41

In spite of the abstract realm in which such algorithms function, the author claims that they can be used to deal with matters of design if seen as metaphors or inspirations for design projects. Other Processing books include "Processing: Creative Coding and Computational Art" and "The Essential Guide to Processing for Flash Developers" by Ira Greenberg, "Processing for Visual Artists: How to Create Expressive Images and Interactive Art" by Andrew S. Glassner, "Learning Processing: A Beginner's Guide to Programming Images, Animation, and Interaction" by Daniel Shiffman, and "Programming Interactivity: A Designer's Guide to Processing, Arduino, and openFrameworks" by Joshua Noble, all of which, as their titles suggest, provide appropriate hand-holding for non-programmers into the fields of computational design, digital-hardware-based interaction and computer graphics.

The latest book at the time of writing this thesis is the collaboratively authored FORM+CODE42. Unlike any of the aforementioned books, this one does not explain code examples linearly. Instead it divides its chapters in a manner similar to the website basics section described above, providing a fast-paced history of technology and of artworks which employ coding concepts, literally or conceptually.

41 | Voronoi tessellation is shown as a method of subdividing the screen into multiple areas using pixels as finite elements. Stochastic search is a method of random search in space until a given or an optimum condition is met. Fractals are recursive patterns that subdivide an initial base shape into sub elements and then repeat the process infinitely. Hybridization is a procedure in which an object changes its shape in order to obtain another form. Cellular automata are discrete elements that are affected by their neighbouring elements' changes. Finally, evolutionary algorithms use biological Darwinian selection to optimize or solve a problem. [Ter 2009, xxii]

42 | Reas, Casey, McWilliams, Chandler & Lust. "Form+Code in Design, Art, and Architecture". (2010). Princeton Architectural Press.

The book is sectioned into themes illustrating examples from media-art history which highlight the computational principles most directly relevant to designers and artists. It takes a unique approach to art history by gently classifying artworks through the computational principles of repeat, transform, parametrize, visualize and simulate. [ReW 2010] A similar classification can also be found earlier in the article Reas wrote for the 2003 Ars Electronica catalogue, titled "Programming Media".

It can be inferred from such a plurality of techniques for structuring this 'curriculum' that the various authors are catering to learners ranging from those with no background in programming at all to various levels of semi-advanced learners. The ease and efficiency of achieving advanced graphics output also makes Processing highly useful for conventional computer programmers (not from a media-design background), since the entire context of having a graphics output window is a given.

Michael Mateas has noted how, while teaching an introductory course, 'computation as a medium', to art and design students, he once used raw Java. His reason for not using specialised tools created for artists was that they always make some projects easier at the cost of making others difficult or even impossible. The problem he confronted was that the standard Java classes for input/output, along with the Swing library43 for graphical windows, expected a learner to become acquainted with class and object relations at the very beginning, well before opening even a window with a point in it.

1.4 ABSTRACTION

All the reference and pedagogical literature on Processing syntax emphasises the paradigm of object-oriented programming. Initially these texts demonstrate single-file sketches44 featuring one void setup() and one void draw(), with all the constant graphic properties in the setup() function and all the dynamic graphic properties in the draw() function. After going through various code examples in this format, the literature features a section dealing with composing objects.

Objects and classes, simply put, are software programming models which make the coding process more manageable when the code gets longer and various 'functions' are required to work together on some 'variables'. A class is, most literally, a separate text file which forms part of the saved sketch folder and features the code relevant to itself. It is a composite data type, since it can hold various methods and variables within it.

Generally each class conceptually does a unique task and hides its complexity from the outside. These classes are summoned through their 'methods' (unique 'functions' within the class) when references to them are made from the main functions, setup() and draw().45

43 | The complexities of Swing forced sophisticated object-oriented concepts too early in the course, and resulted in students only being able to complete four out of the six projects and a reduced number of readings. [Mateas 2005]
44 | Programs that reside in one single file with the same name - [Ter 2009, 63]
45 | "If we start adding more and more procedures in the file, it will become bigger, more complex, and less efficient to search, edit, and organize. To avoid such an accumulative complexity and to establish organization and clarity, we will break the one-file program structure into multiple files, and inside each file we will write the code for organized sets of variables and methods called classes." - [Ter 2009, 64]

The most appropriate keywords to describe the object-oriented paradigm in programming are 'modularity' and 'abstraction'. With 'variables' being the most primitive way of reusing and altering code within a program, and 'functions' allocating autonomous tasks within a reusable code module, 'classes' + 'objects' enhance this modularity further by combining the use of unique 'variables' and 'functions' within a higher conceptual module. [ReF 2007, 395]

The unique 'variables' of the 'class' (its 'fields') and the unique 'functions' of the 'class' (its 'methods') ideally stay private and inaccessible from outside the class, unless accessed by 'objects' of that class from the main code body using the dot operator.

Object-oriented programming uses objects and classes as building blocks. A class defines a group of methods (functions) and fields (variables). An object is a single instance of a class. The fields within an object are typically accessible only via its own methods, allowing an object to hide its complexity from other parts of a program.

This resembles interfaces built for other complex technologies; the driver of a car does not see the complexity of the engine while in motion, although the speed and RPM are readily visible on the console. The same type of abstraction is used in object-oriented programming to make code easier to understand and reuse in other contexts. [ibid]

In the section titled 'The Structure of Shapes', Kostas Terzidis introduces 'object-oriented programming' using the example of a 'hierarchical geometric structure' in which each hierarchical unit has autonomy within itself and also connects to other hierarchical units in its proximity. He describes the idea of 'objects' in programming as interlinked elements in a hierarchical structural arrangement which make a complex entity perform tasks efficiently. The tasks themselves can be monitored separately from their relation to the complex whole.

He provides the example of geometric structures as complex entities which can be arranged hierarchically. “ … by extrusion, points can form lines, lines can form surfaces, and surfaces can form solids. In reverse, a complex geometric solid shape can be composed of surfaces that are composed of curves that are composed of control points. ”[Ter 2009, 63]

More importantly, the act of pressing the Run button (Ctrl+R) itself stands for wrapping the entire sketch as an object. This presupposition of a fresh sketch as one unique 'object' from a 'meta-class' also keeps the initial learning curve of Processing gentle, since the default object-oriented features stay hidden.

The Java white paper classifies Java as an 'object-oriented programming language' since it provides the following four features46:

- Encapsulation : Information hiding and modularity.

- Polymorphism : The same message can produce different behaviour, depending on the nature of the receiving object.

- Inheritance : New classes and behaviour can be based on existing classes to obtain code re-use and code organization.

- Dynamic binding : Flexibility to send messages to objects without having to know their specific class during coding.

Modularity allows each object's source code to function independently, within the system and in other coding projects. Since 'objects' are used only through their 'methods', their internal workings stay concealed, which reduces code clutter. This re-usability is very significant for a software ecosystem to evolve around a code project, Processing being just one realisation of a Java code project. Various software developers can re-use each other's 'classes' and 'objects', allowing for an efficient distribution of work within a software development process. Similar to replacing a mechanical component within a bigger machine, 'objects' can be conveniently replaced and exchanged according to need.

The white paper describes 'objects' by conceptually citing a broad range of real-world objects, "cars, coffee machines, ducks, trees, buttons on user interfaces, spreadsheets and spreadsheet cells, property lists, menus": basically anything which can be described by its state and behaviour.

The references from java.sun.com featured below demonstrate how ubiquitously these concepts can be applied.

... a car can be modelled by an object. A car has state (how fast it's going, in which direction, its fuel consumption, and so on) and behavior (starts, stops, turns, slides, and runs into trees).

You drive your car to your office, where you track your stock portfolio. In your daily interactions with the stock markets, a stock can be modelled by an object. A stock has state (daily high, daily low, open price, close price, earnings per share, relative strength), and behavior (changes value, performs splits, has dividends).

After watching your stock decline in price, you repair to the cafe to console yourself with a cup of good hot coffee. The espresso machine can be modelled as an object. It has state (water temperature, amount of coffee in the hopper) and it has behavior (emits steam, makes noise, and brews a perfect cup of java). 47

46 | Java meets these requirements nicely, and adds considerable run-time support to make your software development job easier. - http://java.sun.com/docs/white/langenv/Object.doc1.html#2414
47 | What Are Objects? - http://java.sun.com/docs/white/langenv/Object.doc1.html#2414

Dogs have state (name, color, breed, hungry) and behavior (barking, fetching, wagging tail). Bicycles also have state (current gear, current pedal cadence, current speed) and behavior (changing gear, changing pedal cadence, applying brakes). Identifying the state and behavior for real-world objects is a great way to begin thinking in terms of object-oriented programming.

... you'll notice that real-world objects vary in complexity; your desktop lamp may have only two possible states (on and off) and two possible behaviors (turn on, turn off), but your desktop radio might have additional states (on, off, current volume, current station) and behavior (turn on, turn off, increase volume, decrease volume, seek, scan, and tune). You may also notice that some objects, in turn, will also contain other objects. 48

A 'class' conceptually works like a blueprint from which individual 'objects' are created. Various objects may have commonalities in their state and behaviour. To economise on this common code, object-oriented technology allows classes to inherit commonly used state and behaviour from other classes.

// EXAMPLE 1 - a simple class structure. The contents of this class
// are stored as a separate file titled MyPoint.

class MyPoint {                    // create the class with the name MyPoint

  float x, y;                      // section 1 - list all variables associated
                                   // with the class, a.k.a. its members

  MyPoint(float xin, float yin) {  // section 2 - CONSTRUCTOR - assign the
    x = xin;                       // parameter values to the members
    y = yin;                       // using new variables
  }

  void plot() {                    // define the method 'plot'
    rect(x, y, x, y);              // define a shape and its parameters
  }
}                                  // class MyPoint over

// The class MyPoint is called from the main sketch body by the code below.

MyPoint p;                         // declare an object 'p' of type MyPoint

void setup() {                     // constant graphic features are set
  p = new MyPoint(20, 20);         // create the object 'p'; the term 'new'
                                   // allocates memory, and the parameter
                                   // values are passed to the constructor
  println("x = " + p.x + " y = " + p.y);  // the object 'p' reaches the class
                                          // members with the dot operator
  p.plot();                        // display the shape by connecting the
                                   // object 'p' to the method 'plot'
} 49

48 | What Is an Object? - http://download.oracle.com/javase/tutorial/java/concepts/object.html 49 | Based on the example - [Ter 2009, 65]

'Constructors' are part of the conventional class structure, where values are assigned for objects belonging to that class. 'Constructors' have the same name as the class and work like methods, assigning values to the members of the class. The keyword 'new', whenever used in the main body while creating a new object, signifies the invocation of a 'constructor' of the class. A class can have more than one 'constructor' of the same name, differentiated only by the number of parameters each takes.
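
A hedged illustration of such constructor overloading; the class Spot and its values are hypothetical, not taken from the cited texts:

// Two constructors sharing the class name, told apart only by
// their parameter lists.
class Spot {
  float x, y, diameter;

  Spot() {                          // no-argument constructor: default values
    x = 50;
    y = 50;
    diameter = 10;
  }

  Spot(float xin, float yin, float din) {  // parameterised constructor
    x = xin;
    y = yin;
    diameter = din;
  }
}

// new Spot() calls the first constructor; new Spot(10, 20, 30) the second.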

In Java, as mentioned earlier, any class can have one direct superclass, and each superclass can have various subclasses. When a class is declared, the keyword 'extends' is used before the name of the 'superclass'. A class which inherits fields and methods from another is the subclass; the class which gets extended in this relationship is the superclass.

All the methods and fields from the superclass are available to be used by the subclass, alongside its own unique ones. If the names and parameters of methods in these classes coincide, only the subclass methods are utilised.

// EXAMPLE 2 - a simple subclass structure. The contents of this class
// are stored as a separate file titled NuPoint, placed in a new tab
// of the Processing IDE.

class NuPoint extends MyPoint {  // make the subclass inherit from the superclass

  float z = 40;

  NuPoint(float d, float f) {    // constructor to receive the parameter values
    super(d, f);                 // pass the values on to the superclass constructor
  }

  void display() {
    rect(x + z, y + z, 15, 15);  // make a shape using a local variable along
  }                              // with the fields of the superclass
}

// Altered main sketch body

MyPoint p;
NuPoint o;                       // declare an object 'o' of type NuPoint (the subclass)

void setup() {                   // constant graphic features are set
  o = new NuPoint(20, 20);       // the parameter values are passed to the constructors
  println("x = " + o.x + " y = " + o.y);  // the object 'o' reaches the
                                          // superclass members
  o.plot();                      // display a shape with the method 'plot'
                                 // inherited from the superclass
  o.display();                   // display a shape with the method from the subclass
} 50

50 | Based on the example - [ReF 2007, 456]

1.5 ECOSYSTEM

The first traces of the notion of a software ecosystem come up in the Ars Electronica 2003 article, where a reference to a peer-learning environment is made51. Along with the learning examples featured on its website (with the visual outputs and the source code alongside), a wide pedagogical network can be seen active through sites like learningprocessing.com/, openprocessing.org/ and generative-gestaltung.de/code, where one could claim the difference between the aesthetic/visual artefact and the pedagogical source code gets blurred. All of these show a primary concern with sharing and discussing 'Processing' sketches in a collaborative, open-source environment.

One of the crucial causes for the growth of the Processing project is the supplementary 'Libraries'52 composed by other software developers. A library53 consists of code documents (classes + objects + methods) which can be used within the Processing IDE, providing new features to users. Developer-contributed libraries are not part of the core Processing API; the core API comes with some essential libraries by default. To use any of these default libraries in a project, the 'import' command is used, followed by the library name | import libraryname.*; |. This command also makes explicit what code is packaged with a sketch when it is exported as an applet or application. The 'this' keyword accompanies newly instantiated library objects, handing them a reference to the sketch itself so that the library can communicate with it. [Fry 2007, 27]

// EXAMPLE 3 - a simple library import structure.

import processing.video.*;   // import the default video library

Movie myMovie;               // declare a 'myMovie' object of the 'Movie' class
                             // featured in the library code

void setup() {               // constant graphic features are set
  size(200, 200);
  background(0);
  myMovie = new Movie(this, "totoro.mov");  // create the object and link it
                                            // to a file in the data folder
  myMovie.loop();            // connect the object with a method from the library
} 54

51 | Hundreds of students, educators, and practitioners across five continents are involved in using the software. As of June 2003, more than 1000 people have signed up to test the pre-release versions. An active online discussion board is a platform for discussing individual programs and future software additions to the project. The software has been used at diverse universities and institutions in cities including: Boston, New York, San Francisco, London, Paris, Oslo, Basel, Brussels, Berlin, Bogota (Colombia), Ivrea (Italy), Manila, Nagoya and Tokyo. [ReF 2003]
52 | http://processing.org/reference/libraries
53 | Every authoring and editing software comes with such libraries. In addition, both software manufacturers and third parties sell separate collections which work as "plug-ins," i.e. they appear as additional commands and ready-to-use media. [Manov 2001, 120]
54 | http://processing.org/reference/this.html

1.5.1 INTERFACES

'Interfaces' are another crucial concept which helps sustain and evolve the Processing software ecosystem. 'Interfaces' can be understood as an agreement to facilitate collaborative software development. In Java an 'interface' is a structural entity, similar to a 'class', which holds a list of 'constants' and 'method' declarations. 'Interfaces' are not part of the class hierarchy; they are used by instantiating a 'class' which provides a 'method' body for each of the 'methods' declared in the interface.55
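
A hedged sketch of this mechanism in Processing's Java mode; the interface Drawable and the classes Box and Dot are hypothetical illustrations of my own:

// An interface as an 'agreement': any implementing class must
// provide a body for the display() method.
interface Drawable {
  void display();                // declaration only - no body here
}

class Box implements Drawable {
  void display() {               // Box's fulfilment of the agreement
    rect(30, 30, 40, 40);
  }
}

class Dot implements Drawable {
  void display() {               // a different fulfilment, same contract
    ellipse(130, 130, 16, 16);
  }
}

// A variable of the interface type can reference objects of any
// class that implements the interface.
Drawable[] shapes = { new Box(), new Dot() };

void setup() {
  size(200, 200);
}

void draw() {
  background(255);
  for (int i = 0; i < shapes.length; i++) {
    shapes[i].display();         // the caller knows only the interface
  }
}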

'Interfaces' are also known commercially in the software industry as 'Application Programming Interfaces' (APIs). These APIs are the public connections between different software products, allowing them to send and receive data. The provider of the API composes 'class hierarchies' to implement an interface; the user of the API invokes the 'methods' and 'constants' declared in this 'public interface'. The implementation of the API can remain a closed secret and be constantly refined. What the Processing environment provides as 'Libraries' are simply such stacks of Interfaces - Classes - Objects, which the user can connect within the sketch to add to the capabilities of the core syntax.

Software systems developers invent abstract models, which are rendered in computable form, i.e. in programs and data. Users apply such programs to their contexts, which may, to some extent, be represented in the form of data. The use situation is called human-computer interaction. The interface is where the two interacting systems, human and computer, meet. [NaG 2006]

The object-class-interface format is a stack of metaphors and abstractions between the human and the hardware. It embodies conventions which have historically proven the most efficient for converting digital information (images, websites, videos or text documents) into a linear string of bytes, which is the only thing computers know how to work with. Previously, when 'end-users' worked with teletypes or the command line interface (CLI), they were closer to the bottom of this stack of metaphors.

In conventional GUI-based operating systems and media-manipulation software, user interaction with the hardware is a heavily mediated, interpreted and translated process. Developments in GUI-based interaction since the Macintosh OS were revolutionary, since the CLI was not accessible to everyone, especially the less technical audience who constituted the vast majority of the market. The first Macintoshes, built as one box containing both the CPU and the monitor screen, embodied the philosophical notion of the 'personal computer': a consumer appliance. [Steph 1999, GUI's]

55 | In Java, a class can inherit from only one class but it can implement more than one interface. Therefore, objects can have multiple types: the type of their own class and the types of all the interfaces that they implement. If a variable is declared to be the type of an interface, its value can reference any object that is instantiated from any class that implements the interface. http://download.oracle.com/javase/tutorial/java/IandI/createinterface.html

In the context of 'Processing', the sound library Minim by Damien Di Fede56 demonstrates an effective way of structuring interfaces, classes and objects. It defines all the commonly used functionalities of digital audio processing in four interfaces, "Playable, Effectable, Polyphonic, and Recordable". These functions are usually created and accessed from the two base classes, "Controller and AudioSource". The interfaces are structured to contain only function declarations, for the classes and their possible subclasses to implement. The Minim library itself builds on the 'Line' interface defined in the JavaSound API, which provides access to audio data passing between the system and the application. The functions defined in 'Line' provide control over features like pan, volume and balance; these features are accessed through the methods defined in the base class 'Controller'. 'AudioSnippet' and 'AudioSource' are subclasses of 'Controller' and hence inherit these methods.57
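
A minimal sketch of how these layers are typically used in practice; it assumes a hypothetical audio file groove.mp3 in the sketch's data folder, and uses the library's documented loadFile() route:

import ddf.minim.*;              // the Minim sound library

Minim minim;
AudioPlayer player;              // AudioPlayer implements the Playable interface

void setup() {
  size(512, 200);
  minim = new Minim(this);       // 'this' hands the sketch to the library
  player = minim.loadFile("groove.mp3");
  player.play();                 // a method declared by 'Playable'
}

void draw() {
  background(0);
  stroke(255);
  // player.left is the left channel's sample buffer, inherited via AudioSource
  for (int i = 0; i < player.left.size() - 1 && i < width - 1; i++) {
    float y1 = height/2 + player.left.get(i) * 50;
    float y2 = height/2 + player.left.get(i + 1) * 50;
    line(i, y1, i + 1, y2);
  }
}

void stop() {
  player.close();                // release the audio resources
  minim.stop();
  super.stop();
}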

1.5.2 COMMONS

The next stage of procedural literacy is learning to navigate the huge tower of abstraction that exists in any computer system, with each layer defining its own little process universe, and with all layers, including the programming languages themselves, contingent human-authored artefacts, each carrying the meanings, assumptions, and biases of their authors, each offering a particular set of affordances. [Mat 2005]

In the essay titled "Beyond the Computer", Gabriel Pickard equates technological development with the question of power over the virtual and actual infrastructure of new media. He claims that the pertinent task is to consider equal and easy accessibility an important part of the right to the freedom of information, with the re-creation of the user interface and its adaptation to human needs being a viable aspect of this new-media accessibility. Commenting on 'user interaction' he notes the 'developer vs the user' divide, where the former view the digital computer (the medium) as the goal and the latter work towards 'real life' goals using the medium. He points out that "free programmability has always been the most important feature of the digital computer, but users do not change programmes distributed as software, so for them free programmability is a 'lost feature'". [Pic 2003, 264]

He states that the hardware/software binary was caused by the distributional business model and should not be seen simply as a technical architecture arising from the requirements of sharing and distributing common algorithms. He cites this as the reason why the power to freely program was transferred from the user to the developer. From such a viewpoint, the Processing libraries are, in terms of their ease of use, their 'systematic domain' and their exquisitely documented resources, a step towards making aspects of the digital-media infrastructure and user interface accessible.

56 | code.compartmental.net/tools/minim/
57 | http://code.compartmental.net/minim/javadoc/ddf/minim/Controller.html

Apart from the default libraries listed below, there are many other highly useful user/developer-contributed libraries58 on the Processing website. The libraries featured under the section 'Data and Protocols' especially address Pickard's claim. These libraries include facilities to send and receive data via Bluetooth and wireless networks, access the Yahoo! Search API from within Processing, communicate with MySQL or SQLite databases, sniff TCP/UDP packets, communicate with open-source hardware59, read data from xls (Excel) files, access the Twitter streaming API, send and receive MIDI information, convert an HTML document from a linear string into a tree structure, read and write XML files, manage syndicated feeds etc.

The default libraries included with Processing consist of the following at the time of writing:60

VIDEO | Import, export and playback of Apple's QuickTime file formats + connecting to cameras.

NETWORK | Sending and receiving data via the Internet through the creation of simple clients and servers.

SERIAL | Supports sending data between Processing and external hardware via serial communication.

PDF EXPORT | Generates PDF files.

OPENGL | Support for OpenGL-accelerated sketches. Utilizes the JOGL library.

DXF EXPORT | Lines and triangles from the P3D or OPENGL rendering modes can be sent directly to a DXF file.

MINIM | An easy-to-use audio library built on the JavaSound API.

ARDUINO | Allows direct control of an Arduino board through Processing.

NETSCAPE.JAVASCRIPT | Methods for interfacing between JavaScript and Java applets exported from Processing.

^ http://processing.org/reference/libraries/

58 | A list of these libraries, their authors and their intended uses is provided on page 37.
59 | The monome 40h device - a reconfigurable grid of 64 backlit keypads which connects to a computer.
60 | This situation is altering, as can be noted from the case of a previous library, 'Candy SVG Import', which has been moved into the Processing core code as of version 149; functions like | PShape(), loadShape(), and shape() | are provided to load SVG files.

The development of open-source61 hardware like Arduino62, which presents a microcontroller on a circuit board that can be programmed through an IDE based on Processing, also forms an important nexus in the ecosystem. Designed by a team led by Massimo Banzi as convenient hardware and software counterparts for creating interactive objects and environments, Arduino projects can be stand-alone or can communicate with software running on a computer (Pure Data, Processing, MaxMSP etc.). At the time of writing this document, the most exotic library featured on the Processing website would be the "gml4u" library by Jérôme Saint-Clair, which provides a 'Graffiti Markup Language' (GML)63 library for Processing.
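
As a hedged illustration of this nexus, using the Arduino (Firmata) library for Processing; the pin number and port index are assumptions, and the board is assumed to be running the standard Firmata firmware:

// Processing talking to an Arduino board over serial:
// the mouse position switches an LED (assumed on pin 13) on and off.
import processing.serial.*;
import cc.arduino.*;

Arduino arduino;

void setup() {
  size(200, 200);
  arduino = new Arduino(this, Arduino.list()[0], 57600);  // first serial port
  arduino.pinMode(13, Arduino.OUTPUT);
}

void draw() {
  if (mouseX > width/2) {
    arduino.digitalWrite(13, Arduino.HIGH);
  } else {
    arduino.digitalWrite(13, Arduino.LOW);
  }
}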

There is even a hybrid category titled "Compilations", which features among others the 'generativedesign' library by Hartmut Bohnacker and Benedikt Groß. Composed as part of the pedagogical design publication "Generative Gestaltung"64, it provides access to highly sophisticated and well-documented classes and functions for computational design, with 3D modules, physics modules (nodes, springs, attractors), access to graphic tablets, and interfacing with xml, html and images together. To stress the pedagogical strength and the creative possibilities of this ecosystem, the various community-developed libraries featured on the 'Processing' website are listed below.

- 3d + animation
surfaceLib | Andreas Köberle, Christian Riekoff | 3D surfaces, a library of surfaces, an extendible class.
Gestalt | Patrick Kochlik, Dennis Paul | Prototyping.
Shapetween | Lee Byron, Golan Levin | Animation.

- compilations
unlekkerLib | Marius Watz | STL import and export, graphics.
gicentreUtils | gicentre.org | Data visualization sketches.
toxiclibs | toxi | Motion graphics.
victamin | Victor Martins | 3D, SMS protocol.
generativedesign | Hartmut Bohnacker, Benedikt Groß | Generative design, interactivity, graphics.

61 | "Being open source, these electronics boards can be built individually or purchased readymade and the software can be downloaded for free. The hardware reference designs (CAD files) are available under an open-source license; anyone is free to adapt them to their needs." - http://arduino.cc/en/
62 | "An open-source electronics prototyping platform. The microcontroller on the board can be programmed to sense the environment by receiving input from a variety of sensors and can affect its surroundings by controlling lights, motors, and other actuators." - http://arduino.cc/en/
63 | GML is an open file format designed to store graffiti motion data.
64 | The programs in the book illustrate generative techniques on the basis of four areas of design: colour, shape, typography and image. By combining these basics in six larger-scale examples, a number of principles and advanced techniques are explained. - generative-gestaltung.de/

- computer vision + video
GSVideo | Andres Colubri | GStreamer movie playback and camera capture.
OpenCV | Stephane Cousot, Douglas Edric Stanley | Computer vision: blob detection, face recognition etc.
integralhistogram | Giovanni Tarducci, Alessio Barducci | Color, graphics.
P-SURF | Claudio Fantacci, Alessandro Martini | SURF (Speeded Up Robust Features) implementation.
tuioZones | jLyst | Device-specific implementation; TUIO messages sent from a tracking application.
Face Detect (PC) | Bryan Chung | Computer vision, face detection.
TUIO Client | Martin Kaltenbrunner | Interactivity; device-specific implementation: TUIO, reacTIVision.

- graphic interface
controlP5 | Andreas Schlegel | Dynamic GUI elements.
Interfascia | Brendan Berg | Interface widgets (text fields, buttons, checkboxes, sliders).
G4P | Peter Lager | 2D GUI components (buttons, sliders, labels, text boxes etc.).
ezGestures | Elie Zananiri | Gesture recognition library.

What has the essence of technology to do with revealing? The answer: everything.

Instrumentality is considered to be the fundamental characteristic of technology. If we inquire step by step into what technology, represented as means, actually is, then we shall arrive at revealing. The possibility of all productive manufacturing lies in revealing. Martin Heidegger [Hei 1954, 3]

- hardware interface
Most Pixels Ever | Daniel Shiffman and Chris Kairalla | Sketch across multiple screens.
NXTComm | Jorge Cardoso | Device-specific: Lego Mindstorms NXT robots.
proCONTROL | Christian Riekoff | Control joysticks and joypads.
Apple SMS | Daniel Shiffman | Device-specific: Apple Sudden Motion Sensor.
novation launchpad | Tobias Bielohlawek | Device-specific: Novation's Launchpad.
JTablet | Cellosoft | Interface graphics tablets to Java applets.
ProTablet | Andres Colubri | Interface graphics tablets to Java applets with the JPen package.

- import export
unzipit | Yonas Sandbaek | Reading Bytes, Strings, PFonts or PImages from a Zip file.
postToWeb | Yonas Sandbaek | Upload pdf, png, jpeg, gif and tiff files to a web server.
MRI3DS | Victor Martins | Load 3D Studio .3ds files.
OBJ Loader | Tatsuya SAITO and Polymonkey | Loads .OBJ files (geometry and texture data).
supercad | Guillaume Labelle | Export code to 3D formats (AutoCAD, Rhino, or SketchUp).

- simulation + math
MatrixMath | Francis Bitonti | Matrix operations.
Cellular Automata | Francis Bitonti | Cellular automata calculations.
Cell Noise | Carl-Johan Rosen | Cell noise (Worley noise), a pattern generation algorithm useful for animation.
Eliza | Andres Colubri | Implementation of the classic A.I. bot, Eliza.
LSystem Utilities | Martin Prout | Lindenmayer systems in 2D and 3D.
Physics | Jeffrey Traer Bernstein, Aaron Steed | Simple particle-system physics engine (particles, springs, gravity & drag).
MSAFluid | Memo Akten | Fluid dynamics simulations.
AI Libraries | Aaron Steed | Genetic algorithms and the AStar algorithm.

- sound
Ess | Krister Olsson | Sound data (loaded, streamed, generated in real time, manipulated, saved, analyzed, played back).
ttslib | Nikolaus Gradwohl | Sonification.
Sonia | Amit Pitaru | Sound playback and synthesis.
Tactu5 | Alessandro Capozzo | Algorithmic music.

- tools
Proclipsing + P5Exporter | Brian Ballantine, Daniel C. Howe and Matt Parker | Using Processing in the Eclipse IDE.
Keystone | David Bouchard | Video mapping.
proDOC | Christian Riekoff | Documentation generator.
fullscreen api | Hansi Raber | Fullscreen + dual-screen support.
Timeline | D. Rifkin | Timeline tool to draw curves representing variables over time.

- typography
NextText | Elie Zananiri / Obx Labs | Dynamic, interactive text-based applications.
Vertext | Michael Chang | Giant, detailed typography at high frame rates.
wordookie | Michael Ogawa | Word clouds.

^ http://processing.org/reference/libraries/

Sister projects like Fritzing65 (developed at the Interaction Design Lab, University of Applied Sciences Potsdam, Germany) and Wiring66 (developed at the Interaction Design Institute Ivrea, Italy + Universidad de Los Andes, Colombia), which provide convenient interfaces for electronics programming, hardware control and tangible media design, are also important constituents of the 'Processing' ecosystem. Before the current release, mobile.processing.org/ had also provided hand-holding towards designing for mobile platforms, but it has now been replaced by additional functionality that helps connect to the Android platform. The idea of learners and practitioners working freely with explicit 'processes' designed by others does drastically change the social relations: what is foregrounded is a generative pedagogical process of mutual support. The users/community of Processing generally do credit, and are encouraged to credit, the original authors of the processes (libraries) which they use in their works.

65 | Fritzing - an Electronic Design Automation software for physical computing and prototyping. [ fritzing.org/ ]
66 | Wiring - Programming environment and electronics i/o board for exploring the electronic arts, tangible media and prototyping with electronics. [ wiring.org.co/hardware/ ]

Marx had written that the manufacturing system of his time, as well as the social relations within which people worked, had changed drastically [Mac 1984, 11], while the technical content of their work remained unaltered. He also noted that conditions latent in industrial technology itself would dissolve the division of labour. [Win 1986, 7] These libraries can also be seen, conceptually, as sophisticated instruments which perform precise computational tasks and can be endlessly modified and re-used by the community.

The development of this library base has also answered a criticism made six years earlier: that though Processing provides appropriate scaffolding, its libraries do not deal with "other forms of procedurality such as text manipulation/generation, web parsing and recombination, and AI and Artificial Life models of behaviour"67.

There have also been dialogues between the developers and others in the community about the question of authorship when using such enhanced graphics and utility libraries within an institutionally sponsored product. A segment of one such internet dialogue from 2006 shows both sides of the argument.

Being focused on small code sketches/experiments and used by various respected artists the tool created an huge amount of interest fairly quickly. In retrospect (well, for me after almost 3 years) I also think it encouraged a slightly superficial view of computational design by quickly gaining cult status amongst people never been exposed to programming before. I think it's dangerous and a sign of crisis if every recycled L-System, Neural Network, Wolfram automata or webcam tracking experiment automatically is considered art (by their authors), simply because it's been "(Re)Built with Processing"...

In response to that I also believe it might hurt Processing as platform in future if experienced users will find themselves forced to breakout and leave the tool behind. To pre-empt this to happen, I think the community at large should pay more attention and spend time on extending the current library base. Above all, library authors should also respect the tremendous amount of work put in by Ben+Casey so far and too embrace the open source mentality of their core tool. 68

The current depth of the 'Processing' ecosystem ensures its longevity, "even if users will slowly outgrow the initial proposal of the tool and only continue to use the library itself." [ibid]

It is not in the scope of this thesis to deal further with such a topic, but I end this section about 'Processing' with an anecdote. Langdon Winner notes the historical case of Frederick Engels who, in defence of authority, told the anarchists in 1872 that even if they could crush capitalism and establish the ideal socialist state, the 'authority' would have merely shifted hands. [Win 1986, 6]

67 | For projects that required such capabilities, I gave students starter code to work from. Since Processing is built on Java, in future iterations of this course it would be possible to provide such capabilities as library extensions to Processing, though it's still useful to have students look at the source code so as to understand how such capabilities can be added. [Mat 2005, 13]
68 | http://toxi.co.uk/blog/2006/01/note-this-article-is-using.htm

1.5.3 OPEN

The Processing software and all its libraries are open source. In fact all its direct products, the sketches, are themselves the source code which generated the media object. In the lineage of this trend, commenting on the situation from the mid-1980s when Microsoft started selling software, Neal Stephenson mentions how the corporation was criticised by hackers (from the scientific and academic world) on both moral and practical grounds, since they saw software as just information. Since software could be copied easily, they wanted it to be free. Even people from commerce found it hard at the time to conceive the idea of selling executable code.

Citing Richard Stallman's 'Free Software Foundation' and the GNU project69 (which started in 1984, along with the Macintosh) to build a free Unix version against Microsoft's and Apple's operating systems, he emphasises that the idea of re-creating an operating system from scratch evolved into a feasible plan. Since the duplication of effort was seen as highly disagreeable to the hacker, ideas of generalized modular coding practice were deeply ingrained, i.e. re-constituting large problems into small subroutines which can be re-used in different future contexts as well. He cites this coding practice as the direct cause of why operating systems became mandatory, since an operating system can be seen as a large library consisting of the most commonly needed code (written once, hopefully written well, and then made available to every coder who needs it). [Steph 1999]
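
To make this modular practice concrete in the idiom of this thesis, here is a minimal Processing sketch (my own illustration, not Stephenson's; the subroutine name ring() is hypothetical) which re-constitutes a drawing problem into one small reusable subroutine:

void setup() {
  size(400, 400);
  background(255);
  for (int i = 0; i < 8; i++) {
    // the subroutine is re-used in a different context at every call
    ring(random(width), random(height), random(10, 60));
  }
}

// written once (hopefully written well), available to every later sketch
void ring(float x, float y, float r) {
  noFill();
  stroke(0);
  ellipse(x, y, 2 * r, 2 * r);
  ellipse(x, y, r, r);
}

This is the library logic Stephenson describes, in miniature: the common code is factored out once, and every future sketch that needs a ring simply calls for it.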

In the context of Processing: though a proprietary software like Photoshop is a 'closed system' and its source code is an official 'secret', all its constituent processes (effects, filters, selection logic) which do explicitly defined tasks can now be reverse engineered ('hacked') into open source code in Processing libraries, keeping only the needed actions and improving upon them. Examples of such situations in the past would be ProDOS, a rival product to the commercial MS-DOS which was written from scratch, and WINE, a re-implementation of Microsoft Windows for the Linux operating system. [ibid]
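
As an illustrative sketch of this 'opening up' of a closed effect (my own example, not from any Processing library; "photo.jpg" is an assumed placeholder in the sketch's data folder), a Photoshop-style threshold filter reduces to a few lines of open Processing code operating directly on the pixels:

PImage img;

void setup() {
  size(400, 300);
  img = loadImage("photo.jpg"); // assumed placeholder image
  img.loadPixels();
  for (int i = 0; i < img.pixels.length; i++) {
    // each pixel is just a number; the 'filter' is plain arithmetic on it
    img.pixels[i] = (brightness(img.pixels[i]) > 128) ? color(255) : color(0);
  }
  img.updatePixels();
  image(img, 0, 0);
}

Unlike the menu item it imitates, every step of the effect is readable and open to improvement.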

The cases of free software like GIMP and Inkscape can also be mentioned here as providing the same functionalities as any commercial raster/vector software. The corporate instincts of proprietary software manufacturers force them to add more and more new features into their operating systems, causing ever more demand for enhanced hardware to support these features.

knowledge, instead of remaining the handmaid of labour in the hand of the labourer to increase his productive powers ... has almost everywhere arrayed itself against labour. ... Knowledge [becomes] an instrument, capable of being detached from labour and opposed to it. 70

69 | GNU is an acronym for Gnu's Not Unix, but this is a joke in more ways than one, because GNU most certainly IS Unix. Because of trademark concerns ("Unix" is trademarked by AT&T) they simply could not claim that it was Unix, and so, just to be extra safe, they claimed that it wasn't. [Steph 1999, honey-pot, tar-pit, whatever]
70 | Thompson, William. An Inquiry into the Principles of the Distribution of Wealth (1824), as quoted in [Mac 1984, 14]


One of the major hurdles in the proliferation of open source software has been standardisation and compatibility, a compound problem of economic, technical and social struggles. Pickard mentions that “innovations have been thwarted, monopolies upheld and idiosyncrasies painstakingly maintained for the sake of compatibility”. He attributes the cause to the computer industry's failure to organise itself as an interactive network with a multiplicity of interfaces, in which a single incompatibility would not pose a problem. [Pic 2003]

The open source community, in contrast, has fostered such a network, in which the maintenance, production and use of newer technologies are more easily possible. The simplicity of open source (and of Processing) is attained by making deep layers of the software architecture transparent through a literal unification of data formats that allows flexible representation. This removes the discrimination between the 'executable', the 'data', the 'application' and the 'document'. Since all digital entities are at their essence discrete, a software system, being an information package, is a collective discretion in which the relationship between data and information is flexible. [Pic 2003, 264]

The open source movement seems to embody a social revolution, a stage of development which Marx stated occurs “when the material productive forces of society come into conflict with the existing relations of production.” 71 Many of the developers who contribute to the various open source products (esp. the various distributions of Linux) work a day job within the corporate software setup.

Marx had written that apart from “Direct Control” of labour power, management could also use a “Responsible Autonomy” strategy “to harness the adaptability of labour power by giving workers leeway and encouraging them to adapt to changing situations in a manner beneficial to the firm”. [Mac 1984, 23] In the context of open source development, according to the report by Corbet et al. [Lin 2010], the largest share of Linux development is now done by corporations themselves, to make customised products and to keep some free products for others to innovate on.

Software production under the capitalist structure is not simply an intellectual labour process; it is also, as Marx noted about industrial production, a valorisation process for adding value. The closed-source software firm, like the industrial capitalist, still wants to produce a software product greater in value than the sum of the values of the commodities used to produce it. In contemporary contexts the means of production and the labour power are nearly the same: a complex system of many programmers' code running synchronously. But to expect software to embody surplus value means to deny the inherent aspect of computer code that it can be copied and distributed for free. The contradiction lies here: the software industry is 'selling' a product which can technically be copied endlessly and made to run on any compatible system. [Mac 1984, 10]

In making and selling software, valorisation, which Marx described more precisely as a social relation, [ibid 12] takes the form of the project-management hierarchy which co-ordinates the work of many different coders and markets this tested aggregate code. Free software tools which automatically manage diverse code projects between different coders around the globe - SVN, Google Code and GitHub - have now reversed this valorisation process by automating the management. Linux, GNOME and Processing have all developed out of such mutually accessible code builds.

71 | Marx, Karl . A Contribution to the Critique of Political Economy (1859) [Mac 1984, 10]


The open source movement addresses Marx's criticism of the alienation of the collective nature of work as it occurred under capitalism. The entire open source ideology rests on the idea of mutual co-operation in the 'manufacturing' of software, of the type which existed prior to mechanization when Marx was writing, involving rudimentary differentiation of tasks and labour. Marx had written that 'the basis of manufacturing' would remain the handicraft skill - the act of coding, in the current context. It is this handicraft skill, fragmented and specialized, which Marx claimed would be used in the struggle against capital. [Mac 1984, 13 & 15]

Marx had written about valorisation in the labour process making many previously necessary human competences obsolete and opening the scope for new competences. This can be noted in how the past two decades of a GUI-based commercial media-design software culture have altered the skills required of graphic designers and video editors. Now, with the emergence of 'Processing', it can be understood that new competencies will slowly be expected of digital media designers as the old process of valorisation comes to a close. [ibid 22]

In the context of open source development, with developers spread across various locations, the collective nature of this predominantly voluntary work appears like a reversal of what Marx noted as the manufacturing worker's loss of the intellectual command over production which the handicraft worker had. The deeper division between head and hand is enforced by large closed-source commercial packages, since what the 'user' loses is what appears as the capital confronting him in the task. [ibid 14]

1.6 CONCLUSION

The most obvious inference which can be made from the previous sections is that the Processing software is completely unlike the GUI-based commercial software sold and widely used for media design, like Adobe Photoshop, Premiere, Flash, Director etc.72 In “The Language of New Media” [Manov 2001] Manovich remarks on the new logic of computer culture, in which new media objects are generally constituted from ready-made parts and genuine creation from scratch has been superseded by selection from a menu. More than a 'creator', the new-media designer is a 'selector' whose skill is choosing pre-defined elements - “3D models and texture maps, sounds and behaviours, background images and buttons, filters and transitions”. [Manov 2001, 120]

Everywhere everything is ordered to stand by, to be immediately on hand, indeed to stand there just so that it may be on call for a further ordering. Whatever is ordered about in this way has its own standing. We call it the standing-reserve (Bestand). [Hei 1954, 6]

72 | It is not in the scope of this thesis to describe the functionalities of these software and a general awareness of these is expected from the reader.


1.6.1 SELECTION

... each possible combination you choose has different precise structures implicit in it, many of these possibilities remain unnoticed or unseen, for a variety of social or economic reasons. [Ted Nelson 1974]

The identity of the selector allows 'end users' to assume a sense of authorship over the media objects they produce instead of just feeling like passive consumers, and allows professionals to work through routine tasks more efficiently. [Manov 2001, 121] Viewing a constantly changing screen based on her selections, the user traverses fixed branching tree structures constituted of pre-defined objects73 within a “cut and paste” logic. This logic evolved from software like Photoshop in the 1980s, which Manovich points out was the “decade when contemporary culture became post-modern.” [ibid 126]

He summarises that though the approach of combining existing media to create a new one existed even in pre-digital technologies like film editing and music recording, this aspect has become standardized and legitimized74 in the workflow of contemporary software packages.

This technology-enabled transfer from the material object to a signal is regarded as a paradigmatic shift towards digital media, since mediated permanence is replaced by devices with constantly modifiable signals. [ibid 122]

Citing the first electronic instrument, the Theremin75, which worked on the principle of altering a pre-existing signal's frequency and amplitude, he writes that electronic art was from its genesis different from the aesthetic blankness which modern art (painting and drawing) expected of itself. Noting the exception of the montage techniques of the early twentieth-century avant-garde, he observes that the perception of the artist as an accessory to the machine had changed by the 1960s when, alongside Pop artists, video artists started synthesizing video signals. Apart from the pre-defined elements, he notes the convenience of using/selecting various pre-defined filters and “effects” (algorithmic modifications of data)76, best seen at work in the contemporary DJ aesthetics of the mix77.

73 | “While more complex types of interactivity can be created via a computer program which controls and modifies the media object at run time, the majority of interactive media uses fixed branching tree structures.” [Manov 2001, 123]
74 | … by encoding the operations of selection and combination into the very interfaces of authoring and editing software, new media “legitimizes” them. Pulling elements from databases and libraries becomes the default; creating them from scratch becomes an exception. The Web acts as a perfect materialization of this logic. It is one gigantic library of graphics, photographs, video, audio, design layouts, software code and texts; [Manov 2001, 125]
75 | Designed in 1920 by the Russian scientist and musician Leon Theremin; a generator producing a sine wave; [Manov 2001, 122]
76 | All these filters, be it manipulating image appearance, creating a transition between moving images, or applying a filter to a piece of music, involve the same principle: algorithmically modifying the existing media object or its parts. Since computer media consist of samples which are represented in a computer as numbers, a computer program can access every sample in turn and modify its value according to some algorithm. [ibid 125]
77 | The essence of DJ’s art is the ability to mix the selected elements together in rich and sophisticated ways. In contrast to “paste and cut” metaphor of modern GUI which suggests that selected elements can be simply, almost mechanically combined, the practice of live electronic music demonstrates that true art lies in the “mix.” [ibid 129]


The central theme of his commentary is that contemporary computer operations encode existing social and economic practices and conventions in their design. He claims that “The logic of selection” 78 is a new form of control, “soft but powerful”. [ Manov 2001, 125]

Writing along the same lines as Manovich, Pickard notes that GUI-based software follows from the inherent nature of digital computers as “invocational media”79, and likens programming languages to ritual formulae that invoke certain algorithms. The ability to program (the efficient representation of an idea or model in interface-dependent invocations) thereby allows access to the wide range of a digital computer's processes.

The artist was no longer a romantic genius generating a new world purely out of his imagination; he became a technician turning a knob here, pressing a switch there. [Manov 2001]

1.6.2 ESSENCE

In contrast to the 'logic of selection' I now present Heidegger's ideas about the essence of technology. Heidegger has noted that the 'essence of technology' is nothing technological, and that we can never realise this essence by pursuing the technological. He saw persons as forever chained to technology and considered this situation at its 'worst' when the bounded person considers technology to be neutral, since by that definition he is immediately blinded to the 'essence of technology'.

Commenting on his contemporary world's conception of technology which saw it as a means (instrumental) and a human (anthropological) activity, he writes that this instrumental conception of technology, as a means to an end, controls man's relation to it and prevents access to the essence of technology.

What matters to him is the manner in which technology is manipulated as a means. He claims that it is the human will to “get” technology, “intelligently in hand”, to master it. This will 'to mastery' he says becomes pertinent when technology “threatens to slip from human control”. [Hei 1954]

To access this essence, Heidegger states that technology should be understood as a 'revealing' and not just as a 'means', so that it opens us up to the truth. He notes that 'Technikon' in Greek means 'that which belongs to techne', and that techne is the term used not just for crafts-work but also for the “art of the mind and fine arts”.

78 | Although software does not directly prevent its users from creating from scratch, its design on every level makes it "natural" to follow a different logic: that of selection. [ibid 125]
79 | The user interface not only includes the HCI (Human Computer Interface) but also deeper invocational levels in both hardware and software, which present themselves as interfaces to the developer-user. [Pick 2003]


He points out that since the time of Plato the word 'techne' has been closely related to the word 'episteme', and that both have been understood as “knowing in the widest sense”, to be an expert in something.

In the context of this thesis, the learning of 'Processing' by artists and designers can be seen as a process of “getting” technology “intelligently in hand”, i.e. learning a programming language, a 'meta-medium'. Over-reliance on 'tool-like' software products is a scenario similar to the idea of 'technology slipping away'. I will look at ideas of 'techne' more closely in section 2.2, but at present it can be equated to what was shown earlier as the 'ontologically clean systematic domain'. 'Knowing' and 'revealing' can be imagined as occurring when a learner develops an advanced level of confidence with the system. This holistic 'knowing' contributes to the 'revealing'.

The same, Heidegger says, holds true for modern technology80: it too is a revealing, and its newness81 can be seen if we accept modern technology as a 'revealing and bringing forth'. Modern technology 'reveals' when it 'challenges' nature by unlocking its energies, transforming them, storing them, redistributing them and switching them about. This revealing (a process of semiosis) never ends, nor does it run off into the indeterminate. According to Heidegger all of nature82, even man (when seen as a 'human resource'), becomes almost like a standing-reserve.

Heidegger's conclusion was that since 'the essence of technology is nothing technological', thinking about and confronting technology must occur in the domain of art, since art is “akin to the essence of technology and, on the other hand, fundamentally different from it.” [Hei 1954, 12] As a pedagogical medium for designers and artists, 'Processing' provides the right environment in which to approach the essence of digital technology.

From the very moment its electric soul ignites at the touch of the power button, the computer drags you into an infinite loop of “yes-no-cancel” queries that hints at our future existence as a species requiring only one finger for clicking and a brain as optional accessory hardware. The common acceptance of the term “computer user” to describe such daily interaction hides a fundamental philosophical question.

Are we using the computers, or are the computers using us? In the complex infinite loops of today’s highly interactive software, how far could a computer go without us, constantly tending to its incessant needs to confirm its manner of being?

Lastly, were we to let it go along its infinite ways without any need for human input, would we know which keys to press to terminate the madness? [Maeda 2003]

80 | Modern technology - that which puts exact science to use. [Hei 1954, 4]
81 | It is said that modern technology is something incomparably different from all earlier technologies because it is based on modern physics as an exact science. Meanwhile, we have come to understand more clearly that the reverse holds true as well: modern physics, as experimental, is dependent upon technical apparatus and upon progress in the building of apparatus. The establishing of this mutual relationship between technology and physics is correct. [ibid]
82 | Thus when man, investigating, observing, pursues nature as an area of his own conceiving, he has already been claimed by a way of revealing that challenges him to approach nature as an object of research, until even the object disappears into the objectlessness of standing-reserve. [Hei 1954, 6]


2 - TECHNOLOGY

2.1 CONTINGENCIES

MacKenzie has raised the question - “Does the design of machinery reflect the social relations within which it develops? Do capitalists (or men) merely abuse machinery for their own purposes, or do those purposes somehow shape the machine itself?” [Mac 1984, 28]

He finds it fortunate that historians have found partial and tentative answers about the effect of social relations on technical design. He considers that the best evidence would come from documenting “the contingency of design”, identifying instances where “things could have been different,” where, for example, “the same artifact could have been made in different ways, or differently designed artifacts could have been constructed.” Once a contingency is noted, a historical enquiry can meditate upon why one design was chosen over another, so that the “question of the effect of social relations becomes a matter for empirical inquiry as well as for theory.” [ibid 29]

He cautions that nothing new or substantially gainful is achieved by just spotting contingencies without also being able to explain the cause of the choices that were made. He finds Marx's theory useful for this purpose since it points to where to look for contingencies - “in the area of the technology of production”.83 He points out, after Marx, that the labour process in a capitalist society, apart from being a 'material process of production', is also a 'valorisation process'. All production technology is thereby designed to accomplish successful valorisation,

… and valorisation will typically not simply be a matter of "profit maximizing" but will involve the creation and maintenance of desired social relations. [ibid 30]

His second caution, a little contradictory to the first, is that Marx's idea of 'valorisation' may not be helpful in identifying contingency, since in general practice claiming some process or artefact as “technically necessary” legitimises the choice of a technique or design.

A vested interest can disguise the extent of a contingency, and further, with the formation of habits and routine, “our minds may be closed to the very possibility of doing things otherwise.” He cites work being done in “alternative technologies” as a positive way of revealing contingencies, especially the connecting of technology to the virtues of small scale, decentralization, and ecological awareness.

He also notes that even within high-technology industry there are processes which want to fundamentally alter “what is produced and how it is produced.” [ ibid 31]

83 | In any society, the design of production technology will reflect the need for that technology to be part of a labour process that is a functioning whole. This implies obvious physical constraints: thus the instruments of production must be compatible with the raw material available. But it also implies social constraints. [Mac 84, 30]


2.2 ART AND TECHNOLOGY

Mumford claims that aesthetic invention, which connected symbols to a 'meaningful' world, was a major stimulus to 'technics'. All historic large architecture has been the site of aesthetic inventions using a combination of “volume, mass, colour, ornamental pattern, texture” to symbolise human and cosmic relationships. The most reminiscent architectural forms - the “pyramid, the obelisk, the tower, the arch, the dome, the steeple, the groined vault, the flying buttress, the stained-glass window” - are all achievements in “technical audacity” realised only for their 'significance' and not solely to satisfy an engineering purpose. The mechanical inventions of the eighteenth century succeeded this chain of aesthetic inventions, but by the twentieth century 'art' and 'technics' had become separated. [Mum 1967, 252]

The 'separation of art and technics' (not seeing a creative expression of a subjective form as an invention) is a modern idea, a result of the way modernity has been defined as 'the machine age, the industrial age etc.'. Mumford recalls the importance of 'technics' by citing one of the earliest observations in mathematical physics - Pythagoras' relation between a vibrating string and a musical note. He also cites the series of historical textile inventions which each created a technical innovation: the Damascus weavers, the tapestries of the Middle Ages and the ornamental patterns of the Jacquard loom, which popularised punch-card instructions. [ibid 256] By the sixteenth century the printing press slowly diminished the class monopoly on knowledge and democratised image making through the techniques of printing, etching and engraving. The printing press is amongst the earliest 'tools' in which the mechanization of the worker was transferred to the mechanization of the work itself. [ibid 273]

Mumford, in his theorisation of the 'mega-machine', notes how in the present times 'technics' has become disconnected from the larger cultural whole to which a person belongs. He states:

The classic Greek term 'tekhne' characteristically makes no distinction between industrial production and 'fine' or symbolic art; and for the greater part of human history these aspects were inseparable, one side respecting the objective conditions and functions, and other responding to subjective needs. [ibid 9 ]

The word 'technical', used often in this thesis, has developed out of the word 'technic' (which Mumford uses significantly), which dates back to early sixteenth century British usage, where it meant “skilled in a particular art or subject esp. to do with the mechanical arts”. Its root lies in the Latin 'technicus', which itself inherits meaning from the Greek words 'tekhnikos', meaning “of art or concerning the arts”, and 'tekhne', meaning “art, skill, craft, method, system.” 84

Mumford points out that in Graeco-Roman usage “technics” stood for man's whole technical environment and meant broadly “life-centered, not work-centered or power-centered”. He considers that language - man's symbolic expression - evolved out of the mega-machine's 'order of ritual'. This point will be elaborated further in sections 2.7.1 and 3.

84 | etymonline.com/ term=technic | 01APR2011


Technics (involving both language and tool use) thereby supported the capacities of human expression; the specific human achievement that distinguished man from his closest anthropoid predecessors was the 'shaping' of a new self. The product of 'technics' - human 'culture' (ritual, speech, costume, social organisation and tool usage) - is ultimately a remodelling of the human organism and of the idea of the human personality.

There was a time when it was not technology alone that bore the name techne. Once the revealing that brings forth truth into the splendor of radiant appearance was also called techne. There was a time when the bringing-forth of the true into the beautiful was called techne. The poiesis of the fine arts was also called techne.

At the outset of the destining of the West, in Greece, the arts soared to the supreme height of the revealing granted them. They illuminated the presence [Gegenwart] of the gods and the dialogue of divine and human destinings. And art was called simply techne. It was a single, manifold revealing. It was pious, promos, i.e., yielding to the holding sway and the safekeeping of truth.

The arts were not derived from the artistic. Artworks were not enjoyed aesthetically. Art was not a sector of cultural activity.

What was art—perhaps only for that brief but magnificent age? Why did art bear the modest name techne?

Because it was a revealing that brought forth and made present, and therefore belonged within poiesis. It was finally that revealing which holds complete sway in all the fine arts, in poetry, and in everything poetical that obtained poiesis as its proper name.

Heidegger on 'Techne' [Hei 1954, 12]


2.3 THE ARTIST PROGRAMMER

Writing in 1991 Craig Hickman asks the awkward question “If photographers do not make their film, poets do not build their typewriters and painters do not weave their canvasses, why should artists working with computers write software?” [Hick 1991, 49]

I want to explore some of his ideas since he claims that “ the most exciting and important applications for computers in the arts will come through artists' programming. The reason lies fundamentally in the nature of the computer … . ”[ibid]

Even at the dawn of the World Wide Web and the 'networked society', the digital culture we live in today was foreseeable quite precisely. He writes that the definition of 'computer graphics' has changed from earlier connotations of "crude but promising" artistic works to the then-present (1991) condition in which 'computer graphics software' is commonly employed by designers to create 'graphic design', signifying the success of that artistic category.

The same, he notes, is the case with 'computer animation' (which has become a category of animation), 'computer writing' (the act of using text-formatting or reading software) and photography, since we increasingly think of 'a photo' as a digital image seen on a computer monitor and not just as a print made by a chemical process.

He notes that 'photographers' remain distinguishable from 'computer artists working with the digital image' because of the higher resolution and colour accuracy of the former's products, but 'photographs' would come to include the products even of consumer snapshot digital cameras, since high-resolution digital images sent back by spacecraft are already called photographs. He does state, however, that the early computer artists, explorers of forms and ideas, developed new connections by using computers and digital media for completely artistic purposes.

Referring to the article in Scientific American by the computer theorist and digital media visionary Alan Kay, where the idea of the computer as a meta-medium appears for the first time, he explains that a computer can be programmed to respond like a typewriter, a calculator, a musical instrument or like 'drawing paper and pencil'. Even though the simulation of traditional art materials and of the 'typewriter with correcting fluid' by paint and word-processing programs may not be perfect, the digital computer adds its own unique capabilities to the application.

He summarises that artists should learn to program.

Since the 1960s, media have been a primary focus of artists (in conjunction with thinkers in other areas of culture including sociology, communication theory and literary theory); this attention has expanded their roles from simple vehicles of ideas to objects of commentary. In this context, what could be more exciting than the computer, the quintessential metamedium, and who could be better disposed than artists to discover its new applications?


If artists are going to work with media as subject matter, create new media and realign relationships between traditional media, they probably will do so, in large part, through programming because it offers artists the flexibility they need. Most software today was modeled on some existing medium or application, what artists must do is expand upon what already exists. To do so they have to be able to create their own software. [Hick 1991, 50]

In the early days of creative usage of computers (before light pens and data tablets were easily available), lacking any interactive means of controlling the computer, artists wrote their own programs, which, once executed by the machine, left no space for creative invention. Importantly for the context of this thesis, 'the design process' took place exclusively in the conceptualization in the artist's mind, prior to running a program. [Die 1986, 164]

Artist-written software for producing graphics with a computer historically fell into two categories: in the first were artworks produced from graphics subroutines coded in an available programming language, and in the second were works coded in machine language (with a unique syntax, commands and vocabulary). [ibid 163]

Graphics extensions did away with programming in machine code and could be programmed and operated from Fortran or ALGOL 60 - for example Georg Nees' graphics extensions G1, G2 and G385 and Leslie Mezei's SPARTA86. Examples of the second category would be Frieder Nake's language COMPART ER 5687 and Kenneth Knowlton's languages BEFLIX88 and EXPLOR89. This “first generation of computer artists had to focus on logic and mathematics - in short, rather abstract methods.”

All these early graphics languages stayed obscure due to their machine dependency and minimal scope, but they were highly useful for the goals of their artist-programmers. BEFLIX, since it was available in several art departments, was used more frequently in programmer-artist collaborations.

These early computers were 'un-interactive', inaccessible, isolated in sterile air-conditioned spaces and were operated with coded punch cards.

Visual output was generated with pen plotters, microfilm plotters (for animation) and line printers. Colour variations were achieved by using different inks in the plotting pens or during reproduction. These prints and reproductions (serigraphs) were signed and exhibited as art and visual research material.

It was in an international artistic context like this that Ivan Sutherland developed the 'Sketchpad' in 1963, an interactive graphics system which can be traced as the starting point of the contemporary situation of artists drawing images directly into the computer's memory.

85 | Programmed in ALGOL 60 and contained commands for pen control, random number generators. [Die 1986, 163]
86 | A system of Fortran calls incorporating graphics primitives (line, arc, rectangle, polygon, etc.), different pen attributes (dotted, connected, etc.), and transformations (move, size, rotate). [ibid]
87 | Written for the Standard Elektrik ER 56 and contained three sub-packages: a space organizer, a set of different random number generators, and selectors for the repertoire of graphic elements. [ibid]
88 | Designed to produce animated movies on a Stromberg-Carlson 4020 microfilm recorder. Points within a 252-by-184 coordinate system could be controlled, each having one of eight different shades of gray. Images resided in the computer's main memory. Provided instructions for motion effects and camera control. [ibid]
89 | Images from EXplicit Patterns, Local Operations and Randomness, written for artist Lillian Schwartz. [ibid]


Artists, even when they code themselves, focus on the visual output - 'the digital image' - and a few preliminary ideas about it are mentioned here before proceeding further. Kittler defines the computer image, at its simplest, as a two-dimensional additive mixture of three base colours shown in a frame. He highlights that 'a digital image' derives the precise addressability of all its elements (all coordinates within an x-y axis) from the early-warning radar systems that countered the threat of ballistic missiles, although it has replaced the polar coordinates of the radar screen with Cartesian coordinates. [kitt4 2001, 31]

At any rate, the generation of 2000 likely subscribes to the fallacy - backed by billions of dollars - that computers and computer graphics are one and the same. Only ageing hackers harbour the trace of a memory that it wasn't always so. There was a time when the computer screen's display consisted of white dots on an amber or green background, as if to remind us that the techno-historical roots of computers lie not in television, but in radar, a medium of war. [ibid]

Unlike the television, where just the horizontal line mattered, in computers both the horizontal and the vertical lines are addressed as basic units, forming the two-dimensional matrix of 'pixels'. Configuring the pixel into the red, green and blue colours produces each point of a digital image, where the discreteness of the geometric coordinates and chromatic values can be fixed and modulated. Pixels in mass deceive the eye by appearing uniform and create the illusion of an image, yet stay individually addressable. This feature lets a computer monitor effortlessly switch between text and graphics modes.

Kittler states that the fundamental problem of the digital image is the Shannon-Nyquist sampling rate. Since nature cannot be resolved into discrete (digital) units, digitisation presupposes distortion.90 Pixels, being elements of algebraic two-dimensional matrices and geometric orthogonal grids, always have more than one neighbour, and all algorithms which work on image content work with these neighbour relationships to produce an image. He emphasises that all technical media, from the camera obscura to the television camera, have converted the ancient law of reflection and the modern law of refraction into hardware.
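
A minimal Processing sketch (my own illustration; the image file name is an assumed placeholder) makes both of Kittler's points concrete: every pixel is addressable through Cartesian (x, y) coordinates mapped into a one-dimensional array, and image algorithms operate on a pixel's neighbour relationships, here a crude 3x3 averaging blur:

PImage src, dst;

void setup() {
  size(400, 300);
  src = loadImage("photo.jpg"); // assumed placeholder image
  dst = createImage(src.width, src.height, RGB);
  src.loadPixels();
  dst.loadPixels();
  // borders are left unprocessed; every interior pixel is computed from its 8 neighbours
  for (int y = 1; y < src.height - 1; y++) {
    for (int x = 1; x < src.width - 1; x++) {
      float r = 0, g = 0, b = 0;
      for (int dy = -1; dy <= 1; dy++) {
        for (int dx = -1; dx <= 1; dx++) {
          color c = src.pixels[(y + dy) * src.width + (x + dx)]; // Cartesian (x, y) to 1D address
          r += red(c);
          g += green(c);
          b += blue(c);
        }
      }
      dst.pixels[y * src.width + x] = color(r / 9, g / 9, b / 9); // mean of the neighbourhood
    }
  }
  dst.updatePixels();
  image(dst, 0, 0);
}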

He concludes that digital computer graphics are, at their root, 'indoctrinated' with the 'western mode' of perception, which relies on reflection, linear perspective, refraction and aerial perspective. What is seen through LCD screens and display monitors is a simulation of these laws themselves and not just a representation of the effects of these phenomena, as in analogue optical media. [kitt4 2001, 34-35]

90 | noise, "quantization hiss" looming in digitally recorded music occurs in computer images as a stepped effect or interference, as an illusory discontinuity or continuity. The sampling effect of Nyquist and Shannon does not just chop flowing curves or forms into building blocks, known among computer graphics specialists as Manhattan-block geometry. [kitt4 2001, 33]


2.4 ARTEFACTS

Winner starts his insightful text from 198691 by claiming the existence of a controversy about the notion that technical things have political qualities.

At issue is the claim that the machines, structures, and systems of modern material culture can be accurately judged not only for their contributions to efficiency and productivity and their positive and negative environmental side effects, but also for the ways in which they can embody specific forms of power and authority. [Win 1986, 19]

Cautioning against the urge shown by contemporary critics to judge technical artefacts in political terms, since all the great contributions of science and technology92 have at some time been hailed as the “best guarantees of democracy, freedom, and social justice”, he notes:

To discover either virtues or evils in aggregates of steel, plastic, transistors, integrated circuits, chemicals, and the like seems just plain wrong, a way of mystifying human artifice and of avoiding the true sources, the human sources of freedom and oppression, justice and injustice. Blaming the hardware appears even more foolish than blaming the victims when it comes to judging conditions of public life. What matters is not technology itself, but the social or economic system in which it is embedded.[ibid]

He recalls this central premise of the 'social determination of technology' theory to counter an uncritical assumption about technical devices brought out by a typical 'technological determinism' point of view93. The important thing he says is to enquire into the social conditions which allow a technology to be developed, deployed and used.

A theory of 'technological politics'94 would look at the meanings of some of the characteristics of technical objects, thereby complementing the 'social determinism' side of the argument, and would claim some technologies to be political phenomena in themselves.

He states that commonly held notions about 'technologies' are about the 'things' used to construct order in the world. 'Technologies' are part of the existing 'social structures' which allow different people different levels of accessibility.

91 | [Win 1986, “Do Artifacts have Politics”]
92 | The factory system, automobile, telephone, radio, television, space program, and of course nuclear power have all at one time or another been described as democratizing, liberating forces. [ibid]
93 | … the idea that technology develops as the sole result of an internal dynamic and then, unmediated by any other influence, moulds society to fit its patterns. [ibid]
94 | By the term “politics” I mean arrangements of power and authority in human associations as well as the activities that take place within those arrangements. For my purposes here, the term “technology” is understood to mean all of modern practical artifice, but to avoid confusion I prefer to speak of “technologies” plural, smaller or larger pieces or systems of hardware of a specific kind. [ibid]


He cites technological innovation as comparable to a legislative act which establishes a framework of 'public order' that lasts for generations. This analogy provides the reason to give all technologies95 the same attention as is given to the “rules, roles, and relationships of politics”.

The issues that divide or unite people in society are settled not only in the institutions and practices of politics proper, but also, and less obviously, in tangible arrangements of steel and concrete, wires and semiconductors, nuts and bolts. [Win 1986, 23]

He notes that the wide-scale deployment of any technical system also defines a structure of human relationships that will be sustained around it. These human relationships will be political by being either “centralized or de-centralized, egalitarian or in-egalitarian, repressive or liberating”. Technologies are thus not, by their very nature, political in any specific way; rather, the notion that some technologies are inherently political assumes that a given technology requires, as a pre-condition, a social environment structured in a particular way. Basically, a given technology's compatibility with some particular social and political relationship is not circumstantial. [ibid]

In his thesis MacKenzie highlights that the cause stated by Marx for complex technical changes (like the coming of large-scale industry) was 'social relations' moulding technology, and not the other way around. [Mac 1984, 481]

He explains Marx's argument further: to claim technology and technical change as the “prime mover” or the “independent variable” behind social change is to assume that technical change is itself unmotivated by social factors. If 'labour power' is included as a force of production, human agency automatically enters as a constructor of history. The definite relations of production, which men enter into independent of their will, define man's social existence. The aggregate of these relations of production creates the economic structure of society, which acts as a base for the legal and political superstructure. This whole structure provides a certain form of social consciousness.

2.5 CLOSE CONTEXT - 1

Since 'Pearl Harbor, 1941' the United States has claimed an enhanced sensitivity to the threat of unprovoked attacks of which it might be unaware. Compounded by the post-Second-World-War scenario of an intercontinental ballistic missile pileup and Cold War confrontation with the USSR, the United States stayed on a permanent war footing. With a massive standing army, heavy defence investment and sophisticated weapons research, this ‘military-industrial’ complex also generated many civilian jobs and further investment. [Ger 2008, 64] Computers were the best available tools for military planners to make simulations of probable violent scenarios. Cybernetics research was used and further funded by the American military for purposes of automating aspects of warfare. [ibid 65] The key focus of all this research at that stage was to define an efficient integration of humans with machines to improve military operations.

95 | … the building of highways, the creation of television networks, and the tailoring of seemingly insignificant features on new machines.[Win 1986]


It is not the consciousness of men that determines their existence, but their social existence that determines their consciousness. [Mac 1984, 477]

By 1961, after a decade of military-funded research projects, a complex, expensive, powerful and reliable computer technology became operational: SAGE, the early-warning system for air defence. By then newer technologies like transistors and integrated circuits had already made it obsolete, but the technological paradigm which had been achieved helped firms like IBM develop System/360, a complete suite of compatible computer systems, the airline ticket booking system SABRE, etc. The resultant conception of the computer as a 'symbol'-processing machine for the manipulation of information has determined the technical structure of all digital technology. Various core technologies96 came out of it that ensured ‘real-time’ computing, where messages could be acted upon immediately, unlike the then-contemporary batch-processing model (itself based on the older analogue punched-card technology). These historical developments in the 1960s forced a re-understanding of how computers were to be used in the future. [Ger 2008, 67]

In 1958, to counter the Soviet Sputnik, the Eisenhower administration constituted ARPA (the Advanced Research Projects Agency) and provided it the flexibility to conduct long-term, high-risk research. ARPA received a computer built as backup for SAGE and researched military applications of information processing. Around this time John McCarthy, the AI pioneer, developed the concept of time-sharing, which allows the computer to do the work of many users together by allotting a tiny slice of time to each task in rotation.
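
The round-robin logic of time-sharing can be paraphrased in a toy Processing sketch (my own illustration, not McCarthy's implementation): three 'tasks' share one processor, each receiving one slice of work per rotation, fast enough that all appear to run continuously:

int[] taskProgress = new int[3]; // three 'users' sharing one machine
int turn = 0;                    // whose time slice it is

void setup() {
  size(300, 150);
}

void draw() {
  taskProgress[turn]++;     // the scheduled task does one slice of work
  turn = (turn + 1) % 3;    // rotate to the next task
  background(255);
  fill(0);
  for (int i = 0; i < 3; i++) {
    text("task " + i + ": " + taskProgress[i], 20, 40 + i * 30);
  }
}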

The concept of a private personal engagement with the computer on a one-to-one basis starts from here. This technology also established a requirement for progress in graphic displays towards a graphical user interface which used symbols.

In 1960 J.C.R. Licklider, based on this new paradigm of 'one-to-one' interaction with a computer, wrote the paper “Man-Computer Symbiosis”, which looked at a post-automation model of the computing machine integrated with human operators. This paper espoused ideas of interactivity and networking. In 1962, when he formed the Information Processing Techniques Office (IPTO), he funded the PhD work of Ivan Sutherland at Lincoln Laboratory.97

In his research Sutherland employed cathode-ray displays and light pens - technologies developed at Lincoln Laboratory - and built the ‘Sketchpad’98, an interactive application which allowed the user to draw onto the computer screen and manipulate the drawing, storing the images as data in the computer’s memory. Since this drawing was basically mathematical data, the 'Sketchpad' demonstrated the future viability of the computer as a visual and virtual medium.

96 | magnetic memory, video displays, computer languages, graphic display techniques, simulation techniques, analogue-to-digital and digital-to-analogue conversion techniques, multiprocessing and networking [Ger 2008, 66]
97 | Secret facility connected to MIT and responsible for much of the work on SAGE. [Ger 2008, 68]
98 | Sutherland had been a PhD student of Claude Shannon’s while at Lincoln Laboratory and succeeded Licklider as director of IPTO. [ibid]


Unforeseen roots of the conception of the computer as a visual medium were already present by 1945, when Vannevar Bush published “As We May Think” in the Atlantic Monthly.99 As a solution to the increasing demands of information retrieval he formulated a device, the 'Memex', which would let the operator input text, drawings and notes with a dry photocopier or through head-mounted stereo-camera spectacles, stored in a microfiche filing system. Conceived as a photo-mechanical technology, the 'Memex' formalised in thought the future possibilities of graphical computing and hypermedia.100 These ideas were taken up by Douglas Engelbart101 at the Augmentation Research Center (ARC) to develop techniques now taken for granted in computing, such as word processing, cutting and pasting, separate windows, hypertext, multimedia, outline processing, computer conferencing and the mouse. [Ger 2008, 70]

In his aforementioned paper from 1960, J.C.R. Licklider defines “Man-Computer Symbiosis” [Lic 1960] as a conceptual close coupling - the next step in 'cooperative interaction between man and computer'. He foresaw computers facilitating formulative thinking instead of just providing solutions to pre-formulated problems. He defines 'cooperative interaction' as decision-making in complex situations without depending on predetermined programs, where “men will set the goals, formulate the hypotheses, determine the criteria, and perform the evaluations” and “computing machines will do the routinizable work that must be done to prepare the way for insights and decisions in technical and scientific thinking.” In support of the ideas of this thesis, he remarks on the then generally held notion that programming for a computing machine disciplines the thought process and forces one to think clearly. His ideas, however, go beyond this situation, since he writes that if the user can already formalise his problem in advance, a symbiotic relation to a computer is not necessary. He narrows symbiosis down to making the computer formulate parts of technical problems - to “convert hypotheses into testable models and then test the models against data”.

This conception looked at making the existing hardware - SAGE - answer questions, simulate mechanisms and models, carry out technical procedures and display the results to the human operator. According to this conception, machines would transform data, plot graphs, interpolate, extrapolate, transform, allow humans to study the behaviour of dynamic models and provide statistical inference. He does highlight that the machine would remain inferior to humans in decision-theoretic or game-theoretic diagnosis102, in matching patterns and in recognizing relevance.
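
Licklider's division of labour can be paraphrased in a small Processing sketch (my own illustration, not Licklider's): the human supplies rough judgements by clicking points, and the machine does the routinizable work of plotting and interpolating a smooth curve through them:

ArrayList<PVector> pts = new ArrayList<PVector>();

void setup() {
  size(600, 400);
}

void mousePressed() {
  pts.add(new PVector(mouseX, mouseY)); // the human 'formulates': a rough data point
}

void draw() {
  background(255);
  stroke(0);
  noFill();
  if (pts.size() >= 4) { // curveVertex() needs at least four control points
    beginShape();
    for (PVector p : pts) curveVertex(p.x, p.y); // the machine interpolates
    endShape();
  }
  fill(0);
  for (PVector p : pts) ellipse(p.x, p.y, 5, 5);
}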

Remarkably, he also envisions machines providing permutations and combinations of options when the human operator cannot imagine any, since this procedure could be routinised into a clerical operation performed during the intervals between decisions taken by the human operator.

99 | Bush - pioneer of the scientific calculating machine; was the special scientific advisor to President Roosevelt at the time. [ibid 69]
100 | Data compression, information exchange with other users, voice recognition, 'associative indexing' - the user leaves recordable trails through the mass of information which can be followed and annotated by other users. [ibid 70]
101 | Ex-radar technician; founded the ARC, funded by IPTO, to research how computers might be used to augment human intelligence. [ibid]
102 | Machine to make elementary evaluations of suggested courses of action whenever there is enough basis to support a formal statistical analysis. [Lic 1960, 14]


He lists the contemporary technological handicaps to achieving this vision as:
- the speed mismatch between men and computers
- memory hardware requirements
- memory organization requirements
- language problems
- input and output equipment

As a solution to the first problem he envisions that after a decade and a half, by 1975, there would be a system of networked computing centres103 which would balance the speed of computers, reducing the cost of memory and dividing sophisticated programs among a number of users. He also positively notes that the existing developments in interpretive programs like FORTRAN would improve further and adapt computers to human language forms. He writes that the maximum advancement would occur in the department of data processing, since it deals with the input and output equipment - displays and controls - for which the engineering of fast information retrieval and visual feedback techniques had already developed greatly through the research at the Lincoln Laboratory. His vision from 1960 is extremely pertinent to understanding the classic idea of “man-computer interaction”.

Certainly, for effective man-computer interaction, it will be necessary for the man and the computer to draw graphs and pictures and to write notes and equations to each other on the same display surface. The man should be able to present a function to the computer, in a rough but rapid fashion, by drawing a graph. The computer should read the man’s writing, perhaps on the condition that it be in clear block capitals, and it should immediately post, at the location of each hand-drawn symbol, the corresponding character as interpreted and put into precise type-face.

With such an input-output device, the operator would quickly learn to write or print in a manner legible to the machine. He could compose instructions and subroutines, set them into proper format, and check them over before introducing them finally into the computer’s main memory. He could sketch out the format of a table roughly and let the computer shape it up with precision.

He could correct the computer’s data, instruct the machine via flow diagrams, and in general interact with it very much as he would with another engineer, except that the “other engineer” would be a precise draughtsman, a lightning calculator, a mnemonic wizard, and many other valuable partners all in one.[Lic 1960, 19]

Apart from such visions of screen-based interaction, he also foresaw the future existence of large wall displays enabling symbiotic cooperation between a computer and a team of men, and the development of speech recognition104.

103 | Inter-connected to one another by wide-band communication lines and to individual users by leased-wire services. [ibid 15]
104 | One can hardly take a military commander or a corporation president away from his work to teach him to type. If computing machines are ever to be used directly by top-level decision makers, it may be worthwhile to provide communication via the most natural means. [Lic 1960, 20]


2.5.1 SKETCHPAD

Manovich remarks that Ivan Sutherland’s 'Sketchpad' can be historically reckoned as a working prototype of the Licklider's vision applied to the 'systematic-domain' of image making and design. It was presented in Sutherland's thesis as a 'unique communication dimension' between the human and hardware. This new communication dimension was later referred to by Alan Kay and Adele Goldberg as a 'two-way conversation' between an individual human and an 'active' 'metamedium'. [Manov 2008, 63]

Created by Sutherland as a part of his PhD thesis at MIT, Sketchpad deeply influenced all subsequent work in computational media not only because it was the first interactive media authoring program but also because it made it clear that computer simulations of physical media can add many exciting new properties to the media being simulated.

Sketchpad was the first software that allowed its users to interactively create and modify line drawings. Sketchpad redefined graphical elements of a design as objects105.[Manov 2008]

Some direct extracts from this historic PhD document106 supervised by Claude E. Shannon are featured next.

Towards the end of the summer of 1962 the third and final version of Sketchpad was beginning to show remarkable power. I had the good fortune at this time to obtain the services of Leonard M. Hantman, a Lincoln Laboratory Staff Programmer, who added innumerable service functions, such as magnetic tape manipulation routines, to the system. He also cleaned up some of the messy programming left over from my rushed efforts at getting things working. Also, towards the end of the summer the plotting system began to be able to give usable output. Hantman added plotting programs to Sketchpad. Computer time began to be spent less and less on program debugging and more and more on applications of the system. It was possible to provide preliminary services to other people, and so a user group was formed and informal instruction was given in the use of Sketchpad. [Suth 1963, 35]

The Sketchpad system was made to allow a man and a computer to 'converse' rapidly. The medium of this conversation was line drawings, as against the only then-existing communication method of typed statements. This new 'conversation' was deemed important for improving the efficiency of tasks like describing the shape of a mechanical part or the connections of an electrical circuit. The 'Sketchpad' used typed statements only for making legends.

105 | Computational objects which could be manipulated, constrained, instantiated, represented, copied, recursively operated upon and recursively merged. [Manov 2008, 62]
106 | Sutherland, Ivan E. "Sketchpad, A Man-Machine Graphical Communication System." PhD Thesis. MIT. (1963). PDF file.


... it was implicit in the research nature of the work that simple new facilities should be discovered which, when implemented, should be useful in a wide range of applications, preferably including some unforeseen ones. [ibid 17]

Through the research Sutherland discovered that properties of a computer drawing can be completely different from drawing on paper. Apart from the accuracy, 'ease of drawing' and the speed of erasing, the computer allowed for moving the drawing parts around on a computer screen without erasing them. The research work consisted of discovering and implementing widely applicable facilities like -

a subpicture capability for including arbitrary symbols on a drawing, a constraint capability for relating the parts of a drawing in any computable way, and a definition copying capability for building complex relationships from combinations of simple atomic constraints. [ibid]

These facilities when interfaced with a light pen produced a powerful system useful in a wide range of applications, even unforeseen ones.

In the thesis “Sketchpad: A man-machine graphical communication system” Sutherland demonstrates the 'Sketchpad' system on the TX-2 computer107 at the Lincoln Laboratory using the example of a hexagonal pattern. He gives commands using push buttons, switches functions on and off, indicates the coordinates of parts of a drawing with a light pen, rotates and magnifies the drawing using knobs, and views all these activities on a display system. The drawings could be inked on paper using a plotter connected to the computer. Pointing the light pen at the display and pressing a button called 'draw' presents an elastic straight line segment stretching from the initial to the current location of the pen. Six more presses of the 'draw' button produce six lines connected as an irregular hexagon. The shape is closed by pointing the light pen at the end of the first line, where it 'locks on'. A slight jerk of the pen across the screen terminates the segment and completes the drawing.

Then he draws a circle in order to make the hexagon regular: placing the light pen at a point and pressing the 'circle centre' button displays a point on the screen. Next, selecting a point to fix as the radius limit, he presses the 'draw' button to display an arc whose length is dynamically controlled by the light pen's position.

Pointing the light pen at a corner of the hexagon, he presses the 'move' button and shifts all six points elastically into the circle shape. He gives a final pen-jerk to indicate the completion of the drawing, which results in all six vertices of the hexagon locating themselves on the circle. He goes on to make the inner shape a regular hexagon by pointing to one side of the shape with the light pen, pressing the 'copy' button, pointing to the other side of the shape and giving the termination jerk. The button copies a definition of equal-length lines and applies it to the lines on the other side.108 The circle is deleted by pointing to it and pressing the 'delete' button.

107 | TX-2 – a new version of a larger computer MIT had constructed for SAGE. [Manov 2008, 71]
108 | We have said, in effect, make this line equal in length to that line. We indicate that all six lines are equal in length by five such statements. The computer satisfies all existing conditions (if it is possible) whenever we turn on a toggle switch. [Suth 1963, 24]


He demonstrates the 'sub-picture'109 concept by attaching many hexagons together. Designating their individual corners as attachment points by pointing to each and pressing a button, he files away the shape as a sub-picture. On a fresh screen110, which he calls a “sheet of paper”, he assembles seven sub-picture hexagons using a button. He fastens the corners of these sub-picture hexagons (point at a corner, press a switch), since they were already indicated to be attachment points, and connects two corners of each outer hexagon to the related corners of the inner hexagon. Such a composite shape, itself consisting of sub-picture drawings, could in turn be used as a unique sub-picture on a fresh screen, alongside other sub-pictures.
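The sub-picture logic – one master definition, many placed instances – survives directly in the practice of 'sketching' with Processing. As a purely interpretive analogy (my own sketch, not a reconstruction of Sketchpad's code), the following Processing fragment files the hexagon away as a single definition, drawHexagon() (a hypothetical name of my own), and stamps out the seven-hexagon figure described above; a change to the one definition would change every instance at once, just as a change to a Sketchpad master symbol changed all its copies.

    // A minimal Processing analogy to Sketchpad's 'sub-picture' concept:
    // one master definition, many positioned instances.
    void setup() {
      size(400, 400);
      noFill();
    }

    void draw() {
      background(255);
      translate(width/2, height/2);
      drawHexagon(0, 0, 40);            // the inner hexagon
      for (int i = 0; i < 6; i++) {     // six outer hexagons around it
        float a = TWO_PI * i / 6;
        drawHexagon(80 * cos(a), 80 * sin(a), 40);
      }
    }

    // The 'master' sub-picture: every instance above refers to this
    // single definition, so a change here changes all instances.
    void drawHexagon(float x, float y, float r) {
      beginShape();
      for (int i = 0; i < 6; i++) {
        float a = TWO_PI * i / 6;
        vertex(x + r * cos(a), y + r * sin(a));
      }
      endShape(CLOSE);
    }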

The concept of atomic constraints is what provides the sense of collaborating with the machine. These constraints automatically give a temporary identity to a shape or a particular shape feature, and they were noticeable in 'automatic' behaviours like elasticity and re-alignment.111 He claimed that it was very convenient to program new specialised constraint types, and that the default set of atomic constraints was increased from five to seventeen within two days.

The concept of definition copying was demonstrated above when the hexagon was made regular by making the lengths of its sides uniform. This definition – make lengths equal, a composite operation – could equally effortlessly have been anything else.112 Both concepts (constraints and definition copying) complement each other.

Sutherland states that the implication of 'drawing' with the 'Sketchpad' is that, unlike the carbon trail left on a paper 'sketch', the information about the structure of the drawing and the information about the constraints which give the drawing its particular appearance are stored in the computer. He describes the 'structure of the drawing' as how the “drawing is tied together”. This structure keeps the other parts of a composite shape intact when some other (individual or composite) part of the shape is moved.

This modularity – the ability to store information about the parts of a drawing, and about how these parts are related, separately from the features being manipulated – is what makes 'Sketchpad' special. Since it saved the structure of a drawing, it could explicitly reference similar symbols113 and had no need to relate to the semantics of a symbol, which are necessary only for the human user. It stored only the topology data of a drawing and the behaviour of the composite vertices. Sutherland foresaw successful deployment in simulating circuit diagrams.
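To make this separation concrete, here is a minimal sketch in Processing – my own toy model, not Sketchpad's actual data structure – in which lines store no coordinates of their own, only references to shared points (the topology). Dragging any point with the mouse moves every line attached to it, so the drawing stays 'tied together':

    // Lines store references to shared points, not coordinates of
    // their own - a toy model of a drawing that is 'tied together'.
    PVector[] pts = new PVector[4];
    int[][] segs = { {0,1}, {1,2}, {2,3}, {3,0}, {0,2} }; // topology
    int grabbed = -1;

    void setup() {
      size(400, 400);
      pts[0] = new PVector(100, 100);
      pts[1] = new PVector(300, 120);
      pts[2] = new PVector(280, 300);
      pts[3] = new PVector(120, 280);
    }

    void draw() {
      background(255);
      for (int[] s : segs) {
        line(pts[s[0]].x, pts[s[0]].y, pts[s[1]].x, pts[s[1]].y);
      }
    }

    void mousePressed() {
      for (int i = 0; i < pts.length; i++) {
        if (dist(mouseX, mouseY, pts[i].x, pts[i].y) < 10) grabbed = i;
      }
    }

    void mouseDragged() {
      // moving one shared point drags every line attached to it
      if (grabbed >= 0) pts[grabbed].set(mouseX, mouseY);
    }

    void mouseReleased() { grabbed = -1; }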

109 | Sub-pictures – any unique or composite shape – may be positioned, each in its entirety, with the light pen, rotated or scaled with the knobs and fixed in position by the pen flick termination signal; but their internal shape is fixed. [ibid]
110 | The “new sheet” / “sheet of paper” is obtained by changing a switch setting. [ibid]
111 | Atomic Constraints – “make lines vertical, horizontal, parallel or perpendicular; to make points lie on lines or circles; to make symbols appear upright, vertically above one another or be of equal size; and to relate symbols to other drawing parts such as points and lines have been included in the system.” [Suth 1963, 24]
112 | The number of operations which can be defined from the basic constraints applied to various picture parts is almost unlimited. Useful new definitions are drawn regularly; they are as simple as horizontal lines and as complicated as dimension lines complete with arrowheads and a number which indicates the length of the line correctly. [ibid 25]
113 | In an electrical drawing, for example, all transistor symbols are created from a single master transistor drawing. If some change to the basic transistor symbol is made, this change appears at once in all transistor symbols without further effort. Most important of all, the computer “knows” that a “transistor” is intended at that place in the circuit. [ibid 25]


In the introduction section titled “Sketchpad and the design process” [Suth 1963, 28], calling the process of constructing a drawing with the 'Sketchpad' a model of the design process itself, Sutherland points out that the locations of the points and lines of the drawing model the variables of a design, while the geometric constraints applied to these points and lines model the design constraints. The design constraints in this set-up restrict the values of the design variables.

Further, Sketchpad's ability to satisfy the geometric constraints applied to parts of a drawing was seen by him as modelling “the ability of a good designer to satisfy all the design conditions imposed by the limitations of his materials, cost, etc.” [ibid]

Noting that designers in many fields produce only 'a drawing of a part' anyway, he mentions that since design concerns itself with the 'drawing of a part' rather than with 'the part itself', “Sketchpad’s vocabulary of constraints” could be used to make “sound design” as well. [ibid]

Since no one had ever made engineering drawings on a computer display before, no one knew what the result would be. After over a hundred hours of experience with a working system, Sutherland states four broad categories of application in which the 'Sketchpad' could be used.

He knew that the 'Sketchpad' could produce a labour-saving 'library' of drawings. Every drawing made on it physically with a light pen existed as a mathematical description of that drawing, transferred onto magnetic tape. He foresaw a 'library' of such drawings developing over time, to be re-used in other drawings. Since time was saved, this stored information was 'potent labour'. And since the stored information also contained the constraints acting upon these graphic entities, a change in even a small section could lead to changes in the composite structure. Sutherland expected this feature to prove very useful in simulating mechanical linkages for study. [ibid 29]

Apart from its use as a circuit simulator and for producing repetitive drawings, he claimed that Sketchpad's most exciting use would be as “an input program for other computation programs. The ability to place lines and circles graphically, when coupled with the ability to get accurately computed results pictorially displayed, should bring about a revolution in computer application.” [ibid 106] He also noted that new test users with no programming experience could make simple drawings with it “if a skilled user (myself) prepared the building blocks necessary.”

It happened that the relaxation analysis built into Sketchpad is exactly the kind of analysis used for many engineering problems. By using Sketchpad’s relaxation procedure we were able to demonstrate analysis of the force distribution in the members of a pin connected truss. We do not claim that the analysis represented ... is accurate to the last significant digit. What we do claim is that a graphical input coupled to some kind of computation which is in turn coupled to graphical output is a truly powerful tool for education and design.” [ibid]
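'Relaxation' here names an iterative procedure: nudge the variables of the drawing a little at each step so that the total constraint error shrinks. The following Processing sketch is a minimal illustration of that general principle, assuming the 'points on circles' atomic constraint of footnote 111; it is my own sketch, not Sutherland's actual numerical method. An irregular hexagon is relaxed, frame by frame, onto a circle:

    // Relaxation: repeatedly reduce each constraint's error a little.
    // Constraint used here: every vertex must lie on a circle of
    // radius R around the centre.
    int N = 6;
    float R = 120;
    PVector[] v = new PVector[N];

    void setup() {
      size(400, 400);
      // start from an irregular hexagon around the origin
      for (int i = 0; i < N; i++) {
        v[i] = new PVector(random(-100, 100), random(-100, 100));
      }
    }

    void draw() {
      background(255);
      translate(width/2, height/2);
      noFill();
      ellipse(0, 0, 2*R, 2*R);
      for (int i = 0; i < N; i++) {
        PVector a = v[i], b = v[(i+1) % N];
        line(a.x, a.y, b.x, b.y);
      }
      relax();
    }

    // One relaxation step: pull each vertex 10% of the way towards
    // its nearest point on the constraint circle. Repeated every
    // frame, the error shrinks towards zero.
    void relax() {
      for (PVector p : v) {
        float m = p.mag();
        if (m > 0) {
          PVector target = PVector.mult(p, R / m); // nearest circle point
          p.lerp(target, 0.1);
        }
      }
    }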


He stated that 'Sketchpad' could also be applied to the artistic purposes of animation and cartooning, not only to engineering drawings. Since it stored previously drawn information on magnetic tape, “every cartoon component ever drawn is available for future use”. He writes of techniques like substituting elements and constraints from memory onto the display screen to make 'primitive' animations, and explains how real pictures can be digitised114 with the 'Sketchpad'.

Using a computer to partially automate an artistic process has brought me, a non-artist, some understanding of the effect of certain features on the appearance of a face. It is the understanding that can be gained from computer drawings that is more valuable than mere production of a drawing for shop use. [Suth 1963, 110]

He cited future work on a “Sketchpad Three” which rendered interactive solid objects on the screen. This was being done in a thesis by Timothy Johnson115, which addressed the problem of converting several two-dimensional drawings into a three-dimensional shape by storing the 'drawing' in three dimensions from the start.

As the first manifestation of Licklider's idea of coupling a machine with a brain, the 'Sketchpad' – Ivan Sutherland's thesis – marked a high point in visual technology. Developed on the TX-2 machine to display visual feedback of points and lines with infinite zoom, the 'Sketchpad', apart from being a prototyping system, also literally prototyped the programming paradigm of the object-oriented software system – based on initial conditions, rules, constraints and multiples, using instances of rotatable and scalable objects. It also contains the genes of all modern graphics applications. [Manov 2008, 71] This achievement of making the computer into a problem solver by simulating and extending existing media was based on a historical trajectory of making computers control other machines in real time, run mathematical simulations and simulate aspects of human intelligence.

'Sketchpad' achieved a new understanding of computational media. Instead of just imitating existing media like paper and pen, the act of simulation itself added fundamentally new features which were unique to this new computational media. Though the contents of 'Sketchpad' were older media, it used the old media as a base for creating unforeseen “unimaginable representational and information structures, creative and thinking tools, and communication options.” [ibid]

More generally, Manovich notes that the 'new' in 'new media' lies in the ease with which new aspects can be added to it. These 'new' aspects are of course the new software techniques that can be added to all 'new media' artefacts. In previous industrial media technologies, very differently from 'new media', the hardware was the software – the machine and the mechanism were inseparable. This separation of hardware and software in the digital computer 'legitimises' experimentation with media.

114 | The 'girl figure' ... was traced from a photograph into the Sketchpad system. The photograph was read into the computer by a facsimile machine … and shown in outline on the computer display. This outline was then traced with wax pencil on the display face. Later, with Sketchpad in the computer, the outline was made into a Sketchpad drawing by tracing the wax line with the light pen. Once having the tracing on magnetic tape many things can be done with it. In particular, the eyes and mouth were erased to leave the featureless face. [Suth 1963, 107]
115 | When Johnson is finished it should be possible to aim at a particular place in the three dimensional drawing through two dimensional, perspective views presented on the display. [Suth 1963, 114]


The digital computer has become so popular and relatively democratised because different programs for various purposes can be written to work on the same hardware. So a constantly evolving 'media software' is one example of a general principle of experimentation116 which is inherent in 'new media'. While earlier terms like 'experimental' and 'avant-garde' were in strict binary opposition to a normal stable centre, in the software culture this opposition ends with these terms defining the norm.

The software industry, since it has evolved from the industrial production ideas of the past, retains monopolistic practices and hampers its own faster evolution through restrictive file-format policies and other legal methods.

The open source movement and its 'deep remixability'117 ethos is a direct consequence of this inherent 'experimental' nature of software – whereby any software with accessible source code can be evolved by anyone with programming skills, which is what the industry does anyway behind closed doors. This accessibility to software development was among Alan Kay’s core goals in designing a programming tool for ordinary users. Titled Smalltalk, this programming language formalised the paradigm of object-oriented programming. It was created to give a uniform appearance to all applications and the interface of the XEROX PARC system, and to allow its users to program their own media tools. [Manov 2008, 35]

Charting the historical development from the 'Sketchpad' in 1962 to 'Processing' in 2003 takes us through the “commercial desktop applications that made software-based media authoring and design widely available to members of different creative professions and, eventually, media consumers as well – Word (1984), PageMaker (1985), Illustrator (1987), Photoshop (1989), After Effects (1993), and others.” [ibid 25]

It is not within the scope of this thesis to pursue the history of the GUI in great depth, but I now look at some texts and situations from which Alan Kay developed some of his ideas. Manovich calls Alan Kay “the key protagonist of the 'cultural software movement'”, since Kay had the awareness to claim that computers are the first 'meta-medium' whose content was “a wide range of already-existing and not-yet-invented media.” [ibid 24]

116 | Experimentation is a default feature of computational media. In its very structure it is “avant-garde” since it is constantly being extended and thus redefined. [Manov 2008, 68]
117 | Deep remixability – the software production environment allows designers to remix not only the content of different media, but also their fundamental techniques, working methods, and ways of representation and expression. [Manov 2008, 105]


2.6 CLOSE CONTEXT 2

The post-WW2 socio-technical developments had led to the idea of a 'Silicon Valley', which can be seen as 'the' fertile ground that made the conception of the computer as a personal creative medium pragmatic. The invention of the 'transistor'118 caused the area south of San Francisco around Stanford University to develop as the locus of a burgeoning micro-electronics industry. Firms like Shockley Semiconductors, Fairchild Semiconductors and Texas Instruments started producing these transistors in bulk and got involved in research to improve the technology. What resulted in 1959 was the invention of ICs – integrated circuits, or 'stand-alone' circuits composed from semiconductors. This technology was deeply useful in guided weapons systems and the American space programme, so the research towards further miniaturisation and efficiency of these devices was in part funded by the American military and NASA. Further improvements in the production processes of these ICs had brought prices down by 1965 to a thousandth of the original. The wide possibilities of this technology in consumer electronics drew many other production firms to flock into this area around Palo Alto, California, which has since been referred to as the 'Silicon Valley'. By 1969 'Intel', a new corporation started by the Fairchild alumni Robert Noyce and Gordon Moore, had invented the idea of a microprocessor, which half a decade later proved highly efficient in designing affordable 'personal computers'.119

The greatest accessibility to a technology generally exists when the technology is new; this flexibility of access disappears when an erstwhile 'choice' or initial commitment becomes fixed because particular material equipment or economic investment comes to rely on the technology. [Win 1986, 9]

In a complete reversal of the Eisenhower government's creation of the ARPA project with 'unlimited defence research funding' in 1958, a decade later, during and after the Vietnam war, the role of military funding in university research began to be seen as a threat to academic independence in an atmosphere of civil disobedience. By 1970 the 'Military Procurement Authorization Bill' had been amended by a committee headed by Senator Mike Mansfield. This 'Mansfield amendment' prevented funds from reaching any open-ended ARPA-funded research unless it had a direct relationship “to a specific military function or operation”. [Ger 2008, 134] This legal act had the ripple effect of creating a surplus of computer scientists and researchers, who would otherwise have been involved in long-term research on open-ended defence projects, and who converged on Silicon Valley, California – the booming centre of micro-electronics and counter-cultural thinking.

118 | A miniature, low-electricity-consuming technical device (that greatly improved upon the electronic switching of the valve), invented in 1947 and used ubiquitously in all electronics industries – televisions, radios, cars, computers.
119 | In 1969 an Intel engineer, Ted Hoff, was asked to design a set of twelve ICs for a Japanese calculator. He reasoned that rather than design different sets of chips for different purposes, why not design one set of chips which could be programmed to do any task, much like a computer. [Ger 2008, 120]


Simultaneously the photocopying machine producer Xerox120, reacting to the business computer revolution and alarmed by ideas of a 'paperless office', set up the Xerox PARC research facility in Palo Alto in 1970 to develop 'the office of the future'. Its first head, Robert (Bob) Taylor121, brought 'blue corduroy beanbags' into the research centre, pioneering the 'youthful, out-of-the-box thinking' image of the computer software corporation which has been retained to the present day in the Apple, Facebook and Google offices. He also recruited many talented ex-ARPA-funded computer researchers who had previously designed pioneering interactive augmentation systems and the first packet-switching computer networks.

Robert Taylor had previously collaborated with J.C.R. Licklider on an article titled “The Computer as a Communication Device”, in which they redefined computers as a communication medium. Seeing telecommunication networks as more than platforms for sending and receiving information, they claimed that the communicating machines are active participants in the communication process and predicted that “in a few years, men will be able to communicate more effectively through a machine than face to face.”122 Their thesis was that the digital computer is a flexible, interactive medium which can best be used to support cooperative human communication. They also introduced the idea of using mental models to look at computer-based communication. [Lic 1968]

Alan Kay123, one among these young researchers, came in 1971 with an exposure to Seymour Papert’s experiments124 with children using computers at MIT. He and the others at Xerox-PARC merged such ideas of learning with the techniques demonstrated by Ivan Sutherland (real-time visual computing) and Douglas Engelbart (bit-mapped graphics) to create 'user-friendly' 'personal computers'.

By 1972, in a 'Rolling Stone' article, the counter-cultural icon Stewart Brand125 had predicted the future popularity of the personal computer after seeing the research at Xerox PARC. These personal computers were divided into business computers (like the Xerox Alto126) and notebook-sized computers (especially for teaching children). Lacking proper marketing, neither machine was ever a commercial hit, though the paradigm they achieved has dictated the personal computer scenario ever since.

120 | Xerox Corporation (1906 to the present): manufactures and sells predominantly printers, photocopiers and digital printing presses.
121 | J.C.R. Licklider’s successor at IPTO, ARPA’s computing research arm. [Ger 2008, 120, 134]
122 | Licklider, J.C.R. & Taylor, R. "The computer as a communication device". Science and Technology. (April 1968). PDF file.
123 | Kay had formerly worked with Ivan Sutherland at the University of Utah and had witnessed Douglas Engelbart’s famous Augmented Knowledge Workshop in 1968. [Ger 2008, 135]
124 | Experiments influenced by the structuralist cognitive psychologist Jean Piaget. [ibid]
125 | Editor of the “Whole Earth Catalogue”, an American counterculture magazine, 1968 to 1972.
126 | A new kind of computer which, through the use of windows, a mouse, and a graphic interface, could be, at least in theory, used intuitively and easily by anyone. [ibid]


2.6.1 ALAN KAY

Alan Kay joined Xerox PARC to build the second type of personal computer – Dynabooks. He set up the Learning Research Group at PARC and is credited with the legendary slogan “the best way to predict the future is to invent it!” [Kay 1989, 131] The Learning Research Group simultaneously worked on the graphical user interfaces and the Smalltalk programming language for the 'Alto' business computers. Kay had studied biology and mathematics before being exposed to computers127 while working for the US Air Force. Later, during his PhD years at the University of Utah computer science department, in 1965 he learnt of Ivan Sutherland's 'Sketchpad' and of the newly developed object-oriented programming concepts of the Simula programming language128, and theorised a biological analogy between the two.129 Such theorisation laid the conceptual blueprint for his later development of the Smalltalk language.
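Footnote 129 compresses the analogy into three claims: cells conform to 'master' behaviours, act autonomously, and communicate by sending messages. In object-oriented terms the master behaviour is a class, each cell an instance, and a message a method call. A toy illustration in Processing/Java syntax – my own sketch of the analogy, not Kay's or Simula's code – might read:

    // Kay's biological analogy, rendered as a toy class.
    // The class is the 'master' behaviour; each instance is an
    // autonomous 'cell'; calling a method is 'sending a message'.
    class Cell {
      float energy = 1.0;

      // receiving the message 'feed' changes this cell's own state
      void feed(float amount) {
        energy += amount;
      }

      // cells communicate by sending messages to one another
      void share(Cell other) {
        float half = energy / 2;
        energy -= half;
        other.feed(half);
      }
    }

    void setup() {
      Cell a = new Cell();
      Cell b = new Cell();
      a.feed(2.0);   // a message sent to a
      a.share(b);    // a sends a message to b
      println("a: " + a.energy + "  b: " + b.energy);
    }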

Before that, by 1969, in collaboration with the hardware designer Edward Cheadle, he had designed a personal computer for a graduate project called FLEX130, focusing on clear graphics and windowing features. The FLEX machine had a user interface which took instructions from a RAND tablet with a pointing device, and featured a high-resolution display for text and animated graphics with multiple windows. His PhD dissertation, titled “The Reactive Engine”, already included visualisations of a sophisticated single-user machine. By 1971, while teaching at the Stanford Artificial Intelligence Laboratory, he began formulating a book-sized computer which could be used by children. Initially titled 'Kiddiekomp', this idea was later realised in the conception of a laptop computer – the Dynabook131 – the visionary prototype of the notebook computers later developed by Apple and other hardware companies. In naming it 'Dynabook' he implied that he foresaw a cultural impact for the device similar to that of the printing press.

In 1968 he had witnessed Seymour Papert's work at M.I.T.’s Artificial Intelligence Laboratory. Inspired by the work of the psychologist Jean Piaget in the field of children's education, Papert was using computers as an aid to help children explore their environment.

127 | He was given a two-week IBM programming course and he gained experience working with everything from a Burroughs 5000 to a Control Data 6600. [Bar 2007, 1]
128 | Developed in 1965 by Kristen Nygaard and Ole-Johan Dahl, Simula was a language used to develop computer models for production-line and manufacturing systems. Simula supported discrete-event simulation construction and it introduced object-oriented programming concepts. [ibid]
129 | First, cells conform to basic “master” behaviours. Second, cells are autonomous and they communicate with each other by sending messages. Third, cells become different parts of the organism, depending upon the context. [ibid 2]
130 | FLEX was the first personal computer to directly support a graphics- and simulation-oriented language. It was based on previous work done by others, including Wesley Clark’s LINC, a small computer that weighed several hundred pounds; the Rand Corporation’s JOSS, a system designed for economists; Douglas Engelbart’s interactive Augmentation System; and Seymour Papert’s Logo project. [ibid]
131 | The Dynabook was described as a very powerful, portable electronic device about the size of a notebook. It would carry an encyclopedia of information inside its circuits and plug into a computer network. A key characteristic of the Dynabook would be the size of the medium. Inspired by Moore’s law, which states that the number of transistors per integrated circuit chip would double each year, Kay physically envisioned this new medium as the size of a three-ring binder with a touch-sensitive screen. [ibid 3]


Papert's software LOGO was appropriating the representational and responsive abilities of interactive computers to teach children to construct simulated 'micro-worlds'. This exposure caused Kay to believe in designing systems and programming languages which could be used (read and written) intuitively by anyone, especially children.

Kay’s theory for designing graphical interfaces was primarily based on the work of Jerome Bruner, who by 1966 had developed a theory of learning mentalities after studying children. Kay reformulated it into the working model “Doing with Images makes Symbols.” [Kay 1989, 128]

ALAN KAY's Graphical Interface Logic

DOING    | Mouse              | Enactive – learning through action; haptic.
IMAGES   | Icons & Windows    | Iconic – a system of representation using visual and other senses, relying upon summarizing images to recognize, compare, configure.
SYMBOLS  | Smalltalk language | Symbolic – representation through language.132

2.6.2 SMALLTALK

The Learning Research Group by 1973 consisted of eight members, including Adele Goldberg, Dan Ingalls and Alan Kay. Initially conceived as an iconic/visual programming language to be developed in two years, 'Smalltalk' ended up being a scripted language which was used to build an operating system and a graphical user interface.

The entire programming environment which developed around 'Smalltalk' consisted of the code editors, debuggers and compilers which were used to produce larger application systems for music and animation. The first versions of 'Smalltalk' were tested with children, based on Kay's belief that personal computing meant creating interactive tools. This tool-authoring literacy was enabled by teaching children to solve programmable problems.133

Smalltalk was designed to work on the desktop Alto computer with an 8½-by-11-inch display monitor giving visual feedback in clear text. Neat 'typeset clarity', slated to be an important feature of the 'Dynabook', was developed by the Alto's designers. The Learning Research Group thought of the screen as a metaphoric simulation of a real physical desktop. The concept of overlapping, effortlessly shuffled windows was developed here as an analogy to pages piled on top of each other.

132 | A variation of Engelbart’s mouse would be used as a form of enactive representation to actively navigate and manipulate text and icons displayed on a computer screen. Icons and windows were incorporated into the design as a level of iconic representation. The Smalltalk programming language was the symbolic level of the design. [Bar 2007, 4]
133 | For example, the children were taught how to animate a simple box. Soon the children were creating paint, music, illustration, and animation tools. Over four years, Kay and his team invited over 250 children aged six to fifteen and 50 adults to try versions of Smalltalk with its interface. [ibid 5]


Accentuated by a haptic mouse which interacted with icons, folders and documents, the 'Alto' was a pioneering concept134 which set the future standard for Macintosh and Windows machines. By 1976 'Smalltalk-76' had evolved into a complete programming language, two hundred times faster than the Smalltalk of 1972.

In 1976 Kay also started work on the 'Notetaker' project – a prototype of a notebook-sized computer with a touch screen which inherited the Alto's functionality within the bounds of a portable, compact machine. The 'Notetaker' conceptually did away with the mouse and featured built-in stereo audio speakers and a microphone, an ethernet port, 128,000 bytes of main memory and a rechargeable battery.

Manovich points out that by the mid-seventies artists, film-makers and architects had already been using specialised computers for more than a decade, and many large-scale exhibitions of computer art had already been shown in major museums135 around the world. In the early 1970s more technically radical developments were occurring at other places as well, especially in 3D computer graphics at the University of Utah. In physical proximity to the university existed the firm 'Evans and Sutherland'136, which had already started using 3D graphics for flight-simulator prototyping, pioneering research in navigable 3D virtual space.

To understand the reason behind the contemporary (2011) identity of the digital computer as 'cultural media'137, Manovich cites Kay's 1977 essay “Personal Dynamic Media” as the best theoretical perspective. Unlike the visionary stances of Vannevar Bush, J.C.R. Licklider and Douglas Engelbart, which were concerned with using computers to augment intellectual and scientific work, Alan Kay, he notes, saw computers as a new creative medium which enhanced the potential of drawing, painting, animation and music composition.

This 'creativity enhancing media' perspective resulted in the coupling of an easy-to-use language, 'Smalltalk', with a unified user interface, ensuring that various 'hybrid, multi- and hyper-media' could be simulated on one machine. Written in collaboration with the PARC computer scientist Adele Goldberg, Kay's essay described their ideas about a 'personal notebook-sized computer' which could do all the tasks required by anyone, since it contained various different softwares in one machine.

Kay’s paradigm was not to simply create a new type of computer-based media which would co-exist with other physical media. Rather, the goal was to establish a computer as an umbrella, a platform for all already existing expressive artistic media. [Manov 2008, 44]

134 | Based on the work of Engelbart and his Research Center for Augmenting Human Intellect, which throughout the 1960s developed hypertext (independently of Ted Nelson), the mouse, the window, the word processor, mixed text/graphics displays, and a number of other 'firsts'. [Manov 2008, 51]
135 | 'Cybernetic Serendipity' at the Institute of Contemporary Art, London, 1968; 'Software', 1969, at The Jewish Museum, New York, and the Los Angeles County Museum of Art; 'Tendencies 3', Zagreb, 1969; 'Tendency 4', Computer and Visual Research, 1968, Center for Culture and Information, Zagreb.
136 | Headed by Ivan Sutherland, who was also teaching at the University of Utah. [ibid 42]
137 | Facebook, Twitter, social media, email, Web 2.0, blogs, user-generated news, folksonomy, Wikipedia, iStore etc.


2.6.3 FROM THE OPERATOR TO THE USER

Stating their138 holistic concern with every aspect of “the communication and manipulation of knowledge”, Kay and Goldberg proclaim that they are designing a solo device suitable for anyone, irrespective of age and profession. This solo device, which they called the 'Dynabook', was defined as “a personal dynamic medium the size of a notebook which could be owned by everyone and could have the power to handle virtually all of its owner’s information-related needs.” [Kay 1977, 31]

At the time of the essay they had implemented 'Smalltalk' on a working prototype titled the “interim Dynabook” and were exploring its use as

a programming and problem solving tool; as an interactive memory for the storage and manipulation of data; as a text editor and as a medium for expression through drawing, painting, animating pictures, and composing and generating music. [ibid]

They state that a 'message' (since it is always an evocation of an idea), whether representational or abstract, constructs the 'essence of a medium' depending on how the messages are stored, altered and sensed. So in principle 'digital media' could be a simulation of any other media, as long as this other media could be described and the description stored and evoked. Since the 'digital computer medium' can only acknowledge 'digital media', they see the computer as the new 'meta-medium'. The device they visualised, which could change the then-existing concept of a computer from a tool into a 'meta-medium', was a small, mobile, information input-output system with high-fidelity audio-visual feedback and, like a flute, no gap between cause and effect. [Kay 1977, 32]

To the user, it appears as a small box in which a disk memory can be inserted; each disk contains about 1500 page-equivalents of manipulable storage. The box is connected to a very crisp high-resolution black and white CRT or a lower-resolution high-quality colour display. Other input devices include a typewriter keyboard, a “chord” keyboard, a pointing device called a “mouse” which inputs position as it is moved about on the table, and a variety of organ-like keyboards for playing music. New input devices such as these may be easily attached, usually without building a hardware interface for them. Visual output is through the display, auditory output is obtained from a built-in digital-to-analog converter connected to a standard hi-fi amplifier and speakers.

In a very real sense, simulation is the central notion of the Dynabook. [Kay 1977, 33]

138 | The Learning Research Group at Xerox Palo Alto Research Center. 1977.


The interim Dynabook simulated 'drawing, painting and photo-editing' with a pointing device and an 'iconic editor'. The 'picture' was presented to the user as a manipulable object, and 'the book' was presented as a searchable, non-sequential archive. It provided users easy access to various fonts, smoothed within a high-resolution display. In fact it was made explicit that all simulated old media (text, image, animation, sound, fonts) were basically manipulable software objects (coded descriptions) in this system. A “menu” of generic commands allowed deletion, transposition and structuring of all these objects. It featured novel methods of accessing documents using multiple windows, and metaphoric representations of some 'old media tasks'139.

So though they see the simulation of 'the book' on the screen as a new sub-medium, they preserve the work flow of the real world in some cases140. The 'Smalltalk' language provided the common framework that linked all the simulations of 'old media' and their new re-animation. They present examples like SHAZAM141, OPUS142 and an instance of electronic circuit design143 (done by a high-school student) as successful implementations by new users of the 'Smalltalk' language in the context of the Dynabook.

Concluding their essay, they meditate on the consequences if everyone in the world had a Dynabook, and remark that if the design allowed all owners to alter everything based on their needs, then a totally new medium could indeed be announced – a conception they term the “meta-medium, whose content would be a wide range of already-existing and not-yet-invented media.” They also caution about the danger, inherent in such an all-inclusive design144, of self-collapse under the pressure of so many applications, and position their device along the lines of malleable natural media like clay and paper, unlike mass-consumer commodities like cars and television.

The meta-medium as defined by Alan Kay sees the computer primarily as a system featuring an assortment of different 'old media' simulations. This set-up automatically makes the system ripe for designing new 'media tools' and new 'types of media'.

139 | A brush can be grabbed with the “mouse,” dipped into a paint pot, and then the halftone can be swabbed on as a function of the size, shape, and velocity of the brush. [Kay 1977, 35]
140 | In the Dynabook, pens are members of a class that can selectively draw with black or white (or colored) ink and change the thickness of the trace. Each pen lives in its own window, careful not to traverse its window boundaries but to adjust as its window changes size and position. [ibid]
141 | A simulation of an animation tool produced with 'Smalltalk'. [ibid 37]
142 | A musical score capture system that produces a display of a conventional musical score from data obtained by playing a musical keyboard. [ibid 40]
143 | Using several kinds of iconic menus, this student system lets the user lay out a sophisticated electronic circuit, complete with labels.
144 | But if the projected audience is to be “everyone,” is it possible to make the Dynabook generally useful, or will it collapse under the weight of trying to be too many different tools for too many people? The total range of possible users is so great that any attempt to specifically anticipate their needs in the design of the Dynabook would end in a disastrous feature-laden hodgepodge which would not be really suitable for anyone. [ibid 41]


This conception of a meta-medium, says Manovich, is tied to Kay's notion of literacy [Manov 2008, 76], which claims that:

The ability to ‘read’ a medium means you can access materials and tools generated by others. The ability to write in a medium means you can generate materials and tools for others. You must have both to be literate. [Kay 1989, 193]

This idea of a medium which enhanced literacy and made it accessible was demonstrated through the working of the 'Smalltalk' programming language, which was used to create all the media-editing softwares featured in the 'Alto' and the 'Dynabook' computers. Apart from providing an aesthetic consistency that let users learn new programs easily, this mutual coupling between the code and the user interface was expected to encourage more beginners to create their own custom tools by studying and analysing the existing ones.

This pedagogical aspect is noted by Manovich as missing in the Apple Macintoshes when they went on sale in 1984. The Macintosh, though predominantly modelled after the machines created at PARC, did not include an accessible programming language with which to de-construct and learn from the existing software.

HyperCard145, a later Macintosh application, permitted users to create some types of 'new' applications but totally lacked the holistic scope which 'Smalltalk' ensured. Titled to evoke the metaphor of the Rolodex card index, the 'HyperCard' environment enabled users to create unique software constructs integrating different forms of media and control devices.

Such an outlook on a tool – radical enough to be claimed as a medium – was also an indirect result of the mid-twentieth-century American 'hacker' culture which developed at the research facilities at MIT and Stanford. This culture consisted of young males dedicating great time and energy to enquiring into the unknown or hidden possibilities of computing machinery and other technical challenges. It involved a “monastic devotion” to complex computer programs, often at the cost of dispensing with personal relations with society at large and with accepted social norms of work, rest and physical appearance146. It has been traced to an existing American tradition of electronics hobbyists and radio enthusiasts from the earlier part of the twentieth century, who were catered to by retailers dealing in electronics peripherals. This hacker lineage led to the 'Homebrew' club147 meetings which had started by 1975 in Silicon Valley. The club catered to a growing need among individuals – one not addressed by the big computer corporations – to own a computer, even one of limited function. These needs were sustained at meetings centred around simple technology sharing and on making computers do things, to measure what was achievable.

145 | HyperCard – a hypermedia programming environment with a scripting language, HyperTalk, written for the Macintosh in 1987 by Bill Atkinson, a PARC alumnus. [Ger 2008, 141]
146 | The early hackers at MIT and Stanford established one of the central archetypes of computing subculture, which continues to this day: that of the intellectually advanced but socially and sexually awkward male, who is prepared to devote most of his time to an engagement with the possibilities of digital technology, to the exclusion of almost anything else. This involved working for days and nights at a stretch, living on junk food and paying scant attention to personal hygiene and grooming. [ibid 136]
147 | An informal gathering of electronics enthusiasts to exchange news of new technology and smart tinkering techniques, centred around the MITS 'Altair' kit (a hard-to-assemble computer sold through the Popular Mechanics magazine) which was programmed with switches and had no output device. [Gere 2008, 137]


Steve Wozniak and Steve Jobs unveiled their rudimentary computer, the 'Apple 1'148, at one of these meetings in 1976. The Apple II computer was already a 'stand-alone' consumer product, sold in an elegant casing to hobbyists and tinkerers. It could be programmed in BASIC and provided visual output in colour graphics when plugged into a television. By the start of the 80's Apple had registered an exponential turnover under a more professional management149 and had alerted IBM, which launched the IBM PC in 1981 to reap the potential of the 'personal computer' market. To hasten the speed of development IBM 'open sourced' the development, providing technical specifications for free and inviting others to write software and manufacture peripherals for its machines150. To compete with this IBM strategy, Apple licensed an agreement to use the Xerox PARC research work comprised in the 'Alto' and a GUI-based OS in its own machines, starting with the 'Lisa' in 1983 and the revolutionary151 'Macintosh' computer in 1984. Microsoft immediately followed this approach, launching the 'Windows' OS, which could function on any machine of the IBM architecture. This paradigm of WIMP (windows, icons, mouse, pointer) has since become the standard method, sold and distributed to the public, for accessing computer hardware. In compliance with Alan Kay's visions from a decade before, this accessibility permitted the computer to become truly 'personal'.

By the late 80's a majority of the creative media industries (photography, music, print publishing) and all other commercial businesses were completely computerised. This period also saw the rise of the commercial CD-ROM, a compact-disc-shaped digital media storage platform, which has since become the standard way to sell and distribute software, games and any other 'content'.

More interestingly from the context of this thesis, this WIMP paradigm opened 'multimedia', 'hypertext' and 'hypermedia' content creation to small-scale commercial and individual users. As already mentioned, this was the time when Apple released the 'pseudo meta-medium' 'HyperCard'. Other commercial authoring software with similar aspirations, like “Macromedia Director, Asymetrix Toolbook, and Silicon Beach’s SuperCard”, followed this Apple product; some of these continued to stay in production much longer and were widely popular in the new multimedia departments of commercial media enterprises. [Gere 2008, 142]

148 | Apple 1 – a board loaded with chips, but owing to Wozniak’s programming and hardware skills it was recognized as an excellent piece of hardware which, when plugged into a keyboard and a TV monitor, could allow the user to achieve what then seemed extraordinary things, such as displaying graphics. [Gere 2008, 138]
149 | Steve Jobs employed a marketing director, Mike Markkula, from Intel, and a manager, Mike Scott, from Fairchild Semiconductor. Older, more experienced and more respectable than either Wozniak or Jobs, they changed Apple from two men in a garage into a fully-fledged company whose turnover had, in five years, exceeded hundreds of millions of dollars. [ibid 139]
150 | The main beneficiary of this strategy was Microsoft. They supplied the operating system for the PC, for a flat fee, but were able to make lucrative licensing deals with producers of PC clones – machines that used the same off-the-shelf parts as the IBM but were considerably cheaper. [ibid]
151 | The Apple Macintosh, with its bit-mapped graphics, graphical user interface, ease of interaction and stylish look, defined the shape of the personal computer. [ibid 140]


In a technical memo from 1997 titled “Digital Paint Systems: Historical Overview”, Alvy Ray Smith covers the lineage of graphics software development from the late 60's to the early 80's. He distinguishes between 'digital paint programs' and 'digital paint systems' to measure the depth of functionality provided.

A digital paint program does essentially no more than implement a digital simulation of classic painting with a brush on a canvas. A digital paint system will take the notion much farther, using the “simulation of painting” as a familiar metaphor to seduce the artist into the new digital, and perhaps forbidding, domain. [Smi 1997, 1]

He clarifies further that a 'digital paint system' comprises a certain set of tools for dealing with pixels, of which the 'digital paint program' happens to be just one. He writes that 'digital painting' could range from a simulation of classic painting to any pixel-processing function applied to pixels directly, using a haptic extension like a mouse or a stylus on a tablet. Listing modern consumer products – “Adobe Photoshop, Microsoft Image Composer, the Corel Draw suite etc.” – as descendants of the early digital paint systems, he classifies them as 'digital paint programs' since they lack the advanced functionalities of the systems.

His timeline of paint systems starts with Richard Shoup’s 'SuperPaint'152, developed at Xerox PARC in 1972, and lists various other 8-bit systems and frame-buffer technologies from the 70's, including his own work 'Paint'153. In 1977 he implemented a 24-bit system, 'Paint3'154, and the following year Jim Blinn155 at the JPL adapted an existing 8-bit program and added the metaphorical 'airbrush' tools. By the early 80's 'Paintbox'156, a commercial paint system, had begun selling, and several others developed from research work at Lucasfilm Ltd. and Pixar – 'LayerPaint'157 by Mark Leather and 'Photoshop'158 by the Knoll brothers, which was sold to Adobe.

Manovich highlights that the new creative medium which allowed this entire development to occur in the 70's was the 'digital frame buffer'159 – a technology developed by Evans & Sutherland – which sustains the entire 'metaphorical' paint analogy, since an artist 'painting' on a computer screen is basically only altering values in the frame buffer.
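Processing, fittingly, still exposes such a frame buffer directly as its pixels[] array, so the observation can be demonstrated in a few lines. The sketch below is a minimal illustration of my own (not a reconstruction of any historical paint system): 'painting' consists of nothing more than writing colour values into the buffer wherever the mouse is dragged.

    // 'Painting' as direct alteration of frame-buffer values:
    // dragging the mouse writes colours straight into pixels[].
    void setup() {
      size(400, 400);
      background(255);
    }

    void draw() {
      // nothing to do per frame; the buffer persists between frames
    }

    void mouseDragged() {
      loadPixels();
      int r = 8; // brush radius in pixels
      for (int dy = -r; dy <= r; dy++) {
        for (int dx = -r; dx <= r; dx++) {
          int x = mouseX + dx;
          int y = mouseY + dy;
          if (dx*dx + dy*dy <= r*r && x >= 0 && x < width && y >= 0 && y < height) {
            pixels[y * width + x] = color(0); // set the buffer value
          }
        }
      }
      updatePixels();
    }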

152 | SuperPaint – 1972-73, the first complete 8-bit paint system, including hardware and software, at Xerox Palo Alto Research Centre. [Smi 1997]
153 | Paint – 1975, 8-bit paint system made at the New York Institute of Technology for an Evans & Sutherland frame buffer, later for the Genisco frame buffer. Sold to Ampex in late 1976. [ibid]
154 | Paint3 – 1977, the first 24-bit (RGB) paint system. New York Institute of Technology, for three Evans & Sutherland or Genisco framebuffers in parallel. [Smi 1997]
155 | 1977 – Jim Blinn adapts Bloomenthal’s (University of Utah) paint program to his needs at the Jet Propulsion Laboratory in Pasadena. In particular, in 1977 he adds airbrushing that is later used to generate texture maps for his Voyager fly-by simulation movies in 1978. [ibid]
156 | PaintBox – Richard Taylor and others, 24-bit commercial video paint product for the English company Quantel, the first system to implement special-purpose hardware acceleration for digital painting. They later extend the resolution in the product Graphic PaintBox for use in print and film. [ibid]
157 | 1985 – LayerPaint, Mark Leather, 32-bit paint program on the Pixar Image Computer at the Computer Division of Lucasfilm Ltd. He is awarded a technical Academy Award for its use in “wire removal” in 1994. [ibid]
158 | 1985 – Photoshop, 24-bit, the Knoll brothers (Thomas and John), working at Lucasfilm, adapt Computer Division imaging functions, including painting, to create a commercial imaging product (originally for the Mac), later purchased by Adobe. [ibid]
159 | Computer memory designed to hold images represented as an array of pixels (today a more common name is graphics card).


I mention this history of 'paint softwares' to conclude this narrative about the history of pioneering ideas on the creative usage of the computer. The scope of this thesis limits me from looking any further into the development of these GUI-based media-manipulation softwares, and I next look at the deeper history of computer media.

The historical conditions for the development of a particular software need not ordinarily be rendered with as much texture as has been done in this case. Often its immediate precursor and some other products from the same time are enough to realise what a software product is worth.

The reason I go into a deeper history of this technical artefact is to locate links further back in time which were important in defining the contemporary 'digital-technological era' that shaped this particular technical artefact ('Processing').

2.7 A DEEP HISTORY OF DIGITAL MEDIA

In the chapter noting the significant technical developments which produced the digital culture we currently live in, Charlie Gere is quick to point out – by page 23 – that capitalism offered the major context in which computers could develop.

Contrasting it with the conceptualisation of a 'universal machine' by Charles Babbage and Ada Lovelace, he mentions Alan Turing's choice of a typewriter as the input device. Though by then a common device, which Turing chose only because of its ubiquitous accessibility and without any concern for future socio-cultural implications, the technical nature of the 'type-writer' was already the product of a 'capitalist' conception160. This device, which could be configured in either the upper or the lower case using the same keyboard, embodied the mechanics of efficient capitalism. Capitalism, in order to function as 'the' universal 'meta-machine', requires “abstraction, standardization and mechanization ... treating disparate phenomena as equal and interchangeable”161, which is the core of the digital paradigm.

Turing’s imaginary device not only invokes the typewriter, one of the paradigmatic information technologies of nineteenth-century capitalism, but also, in the tape and writing head assemblage, the very model of the assembly line. Moreover, the algorithmic method which his machine was intended to automate is itself a model of the division of labour, which, as both Adam Smith and, later, Marx realized, lies at the heart of efficient capitalist production. [Ger 2008, 25]

160 | The technology he chose, and that was so readily available to him, was in fact deeply embedded in a network of social and cultural meaning, one derived from contemporary capitalism. [Ger 2008, 23]
161 | This is found in its emphasis on the exchange value of commodities, rather than their use value, the introduction of credit, paper money, and ‘fiduciary’ money, the division of labour into discrete and repeatable parts, and the standardization of components. This abstraction enables the flow of goods, money, and people crucial to capitalism’s continuous quest for expansion and profit. [ibid 24]


Gere recalls Karl Marx's thesis that 'the commodity' offers the ideological nucleus upon which capitalism exists. Since the commodity, in order to be circulated efficiently, is summoned only in terms of an abstract and temporal 'exchange value', this conversion of the materiality of the 'commodity' into a 'sign', a process of “semiotization”, is the inherent assumption upon which capitalism rests.

As a discrete sign, a commodity can be compared, measured and exchanged in value with other discrete signs (commodities - including paper money, human labour, legal documents). Furthermore, the division of labour, whereby each worker is a relative specialist in an extremely small section of the production process, causes each section (an activity) to become a repeatable and interchangeable sign, thereby making the process controllable and hence mechanizable.

Mumford states that capitalism162 is not a modern practice. Capitalism appears initially in a post-monarchical, primitive mercantile form and grows into a corporate form when capital investment rises. Profit-generating assets initially meant land and rents, but later included ship construction, maritime trading, mining, smelting and anything else requiring large investments which a bureaucratic state could not finance. According to Mesopotamian and Egyptian records, state capitalism, where mercantile activity was an activity of the state, may have preceded private capitalism. From the earliest times, alphabets, minted money and numbers came from peoples who were long-distance traders and colonial exploiters. [Mum 1967, 275]

This new accountancy looked only at those factors of an event which could be judged objectively and quantitatively. The capitalist spirit was established once a general familiarity with the abstractions used in activities like precise timing, weighing and measuring was established in a community. The requirement of correct information and precise forecasts, in order to trade in commodities that arrive or exist in the future, improved the practice of quantifying in various fields, especially the quantification of distances using astronomy for navigation. The rise of a bureaucratic order from the counting houses laid the foundations for the discipline and impersonal regularity of Western life. This bureaucratic management level gained further importance since the log-keeping it did allowed all operations to be standardized uniformly throughout the mega-machine. This order has now been abstracted from human clerks into utilities like ATMs and computer hardware. [Mum 1967, 278]

The three canons which capitalism has established for economic structure are: calculating quantity, to develop a universal accountancy of profit and loss; regimenting time, to ensure productive efficiency in men as well as machines; and concentrating on abstract pecuniary rewards as a healthy motivation that justified all means. [Mum 1967, 279-280]

162 | One means here by capitalism the translation of all goods, services, and energies into abstract pecuniary terms, with an intensified application of human energy to money and trade, for the sake of gains that accrue primarily to the owners of property, who ideally were prepared to risk their savings on new enterprises as well as to live off the income of established industrial and commercial organizations. [Mum 1967, 274]


From the crude incision featured in Jacquard's loom163 of 1804, which coded an elementary human task on a 'readable' wooden surface, to the sophisticated engineering ideas imagined by Ada Lovelace164 of composing algebraic patterns and music, what has remained the constant technique for converting a labour process into fixed capital has been abstraction, standardisation and mechanisation.

Gere remarks that efficient calculating machines for dealing with large quantities of information and rational industrial management for increasing profits have stayed closely connected since the time of Charles Babbage. Even the later ideas of scientifically managed assembly-line manufacture developed by Frederick Taylor and Henry Ford were developed within this existing paradigm of machine logic, which clearly separated production processes by dividing the work of labour.

This division of labour was the advantage Marx had pointed out when he wrote about valorisation. Labour costs less, as Babbage had noted in 1832 in the principle Marx later took up:

. . . the master manufacturer, by dividing the work to be executed into different processes, each requiring different degrees of skill or of force, can purchase exactly that precise quantity of both which is necessary for each process; whereas, if the whole work were executed by one workman, that person must possess sufficient skill to perform the most difficult and sufficient strength to execute the most laborious, of the operations into which the art is divided. [Mac 1984, 13]

The process of abstracting a commodity into a sign can of course be traced back to the earliest written languages used in trade in the pre-Christian civilizations, but from the context of the digital computer, traces of abstraction can be seen in the symbolic logic and formalisation of thought meditated upon in the 17th century by Gottfried Leibniz. Imagined for representing concepts as numbers in order to construct calculating apparatuses, these ideas were later formalised in George Boole's reduction of algebraic logic to just 1 and 0. This Boolean logic is among the core primitive operations that can occur on hardware; it was demonstrated by Claude Shannon using simple telegraph switching relays in his MA thesis of 1937 and still provides the logic that makes integrated circuits work. [Ger 2008, 33][kit3 1992]
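
To make these primitives tangible, here is a minimal 'Processing' sketch (my own construction, not drawn from any cited source) which prints the truth tables of the Boolean operations that Shannon realised with relays and that integrated circuits still compute:

void setup() {
  // Print the truth tables of the Boolean primitives over Boole's 1 and 0.
  println("a b | AND OR XOR NOT a");
  for (int a = 0; a <= 1; a++) {
    for (int b = 0; b <= 1; b++) {
      boolean A = (a == 1);
      boolean B = (b == 1);
      println(a + " " + b + " |  " + bit(A && B) + "   " + bit(A || B)
        + "   " + bit(A != B) + "    " + bit(!A));
    }
  }
}

// Render a boolean as Boole's 1 / 0.
int bit(boolean v) {
  return v ? 1 : 0;
}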

Gere also notes how this process of abstraction found perfect housing in the development of communication technologies. From the semiotic use of fire signals in pre-industrial times, through the semaphore (optical) telegraph which revolutionised the Napoleonic wars in the late 18th century, to the development of electric telegraphs, undersea telephone cables and then radio waves, abstracting a message into signs has had a long history. [kit1 1996]

163 | Joseph-Marie Jacquard’s pattern-weaving loom controlled the lifting of each of the warp threads for each pass of the shuttle through a system of wooden cards punched with holes. The actions of the human weaver were codified and converted into marks on the wooden card, which were then ‘read’ by the machine in order to repeat them. [Ger 2008, 26]
164 | She (Lovelace) also suggested that the engine might compose elaborate and scientific pieces of music of ‘any degree of complexity or extent’. Remarking on the use of punched cards as used in the Jacquard Loom she wrote ‘We may say most aptly that the Analytical Engine weaves algebraical patterns just as the Jacquard-loom weaves flowers and leaves’. [ibid 28]


Primarily driven by war and by capitalism’s need to communicate information between markets at national and then global scales, these communication technologies standardised mutually agreed notions of abstraction between sender and receiver.

The father of all transmission-technological innovations, however, has been war itself. In a strategic chain of escalation, the telegraph appeared in order to surpass the speed of messenger postal services; radio was developed to solve the problem of vulnerable undersea cables; and the computer emerged to make possible the codification of secret - and interceptable - radio communications.

[kit2 1999, 4]

Monopoly capitalism, with its global reach by the late nineteenth century, also generated an excessive amount of information which required sophisticated calculating machines165 to be improved constantly. Their manufacturers were the best fit to take on the possibilities shown by post-WW2 military research - such as the C-T-R (Computing-Tabulating-Recording Company), later known as IBM (International Business Machines). With the rapid urbanisation of the late nineteenth century, a system of punched cards began to be used in America for the 1890 census, whereby each person was represented by a card with their personal details encoded as a series of holes. The mechanical apparatus could count, sort and tabulate these cards, having converted the mass of individuals into data.

The individual's reduction into a sign facilitated the ideological conversion of a 'unique' human into a machine-describable, analysable information object which could be measured, classified and compared with other such entities. Apart from the census bureau, the insurance firms and railways were the first to adopt this machine, which prototyped the link between digital technology and surveillance. [Ger 2008, 39-41] By the start of the Second World War, a few years after Alan Turing published his Entscheidungsproblem paper, prototypical proof-of-concept digital calculating machines like the 'Z1'166, the 'ABC'167 and the 'Harvard Mark 1'168 had been designed.

With the developments in mechanised cryptography, and especially the cryptanalysis achieved during the Second World War, the need to store data within computing machines was recognised. The solution employed was to store data using cathode ray tubes; this was demonstrated in the 'Manchester Mark 1' by 1948, which was soon re-designed for a commercial audience and sold by the firm Ferranti. [Ger 2008, 47]

American defence research in the 1940s was primarily concerned with calculating ballistic trajectories and with the complex calculations needed in the 'Manhattan Project', where the atomic bomb was produced. After the war, the mathematician John von Neumann, who had been involved with the latter project, set the hardware architectural paradigm with the EDVAC169 machine.

165 | Calculators, such as the Comptometer developed by Felt and Tarrant, the Adder-Lister from the Burroughs Adding Machine Company, and cash registers, such as those developed by National Cash Register, helped businesses and shops manage their affairs in ever faster and more intricate markets. [Ger 2008, 39]
166 | Zuse 1, 1938, built by Konrad Zuse, Germany.
167 | ABC / Atanasoff-Berry Computer, 1939, built by John V. Atanasoff & Clifford Berry, United States.
168 | Harvard-IBM Automatic Sequence Controlled Calculator, 1939, Howard Aiken, IBM.
169 | Electronic Discrete Variable Automatic Computer.


Also among the first machines which could store data, this machine featured the hardware configuration of a memory unit (to store data and instructions), an arithmetic unit, input and output units and a control unit - since known as the von Neumann architecture. The possibility of storing instructions as data ensured that this configuration could be applied universally to any task.

Gere cites these machines as “the embodiment of capitalist modernity, with its emphasis on abstraction, exchangeability and self-regulation.” [Ger 2008, 50]
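
As a toy illustration of this configuration (entirely my own sketch; the three-instruction 'machine' and its opcodes are invented for the purpose), the following 'Processing' code caricatures the von Neumann cycle - instructions and data share one memory, and a control unit fetches, decodes and executes them in sequence:

// Instructions and data share the same memory, as in the von Neumann scheme.
int[] memory = {
  1, 6,   // LOAD  m[6]  (fetch a datum into the accumulator)
  2, 7,   // ADD   m[7]
  3,      // PRINT       (output unit)
  0,      // HALT
  40, 2   // the data, stored alongside the program
};

void setup() {
  int pc = 0;           // control unit: program counter
  int acc = 0;          // arithmetic unit: accumulator
  boolean running = true;
  while (running) {
    int op = memory[pc++];                         // fetch
    switch (op) {                                  // decode + execute
      case 1: acc = memory[memory[pc++]]; break;   // LOAD addr
      case 2: acc += memory[memory[pc++]]; break;  // ADD addr
      case 3: println(acc); break;                 // prints 42
      case 0: running = false; break;
    }
  }
}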

By 1949, when Claude E. Shannon formalised a linear conceptualisation170 of the communication process, he had provided another layer of abstraction which has since proved extremely useful in all forms of technical communication. He separated the semantics of a message from the technical problems of delivering it, allowing further innovation in the transmission channels without concern for the message they contained. He dealt with the problems of encoding 'information' (any message with semantic value) and reducing 'noise' (non-messages without semantic importance) within a communication system whose efficiency was measured in terms of 'entropy' (the statistical properties of the message source).
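
As a small worked example of this last term (my own sketch; the function name entropyOf is invented for illustration), Shannon's entropy H = -sum(p * log2(p)) of a message source can be estimated in 'Processing' from the character frequencies of a string:

void setup() {
  String msg = "abstraction standardisation mechanisation";
  println("entropy: " + entropyOf(msg) + " bits per character");
}

// Estimate Shannon's H = -sum(p * log2(p)) from character frequencies.
float entropyOf(String s) {
  int[] counts = new int[128];           // assumes plain ASCII input
  for (int i = 0; i < s.length(); i++) {
    counts[s.charAt(i)]++;
  }
  float h = 0;
  for (int n : counts) {
    if (n == 0) continue;
    float p = n / (float) s.length();
    h -= p * log(p) / log(2);            // log() is natural log; convert to base 2
  }
  return h;
}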

This 'Information Theory' has since been used in areas beyond engineering, even in non-technical fields like cognition, linguistics, psychology and economics, and has provided 'the' popular term 'Information Technology / IT'. The post-war industrial-academic scenario was dominated by a cybernetics171-influenced systems approach to all domains of co-ordinated human activity.

Apart from the manufacturing industries, the term 'system' was applied to business strategy, machines, cities, society etc., all of which were thereby apt to be analysed through a scientific 'systems analysis'. A decade later, by the late 1960s, parallel to the growth of 'Silicon Valley' in California, such cybernetics-based 'systems' thinking was introduced to the general public by 'The Whole Earth Catalogue'. This magazine created the context whereby erstwhile military and control technology could be used by civilians, especially the youth, for peaceful self-gratification. The ideas it espoused about the personal computer, interactivity, feedback, enlightened technology and tool use were taken up positively by the American counter-culture. Its editor, Stewart Brand, who was also connected with Alan Kay, Douglas Engelbart and others at Xerox PARC, was the leading spokesperson for the positive revolutionary effects of personalised computing machines and the networked society, if applied in a holistic, thoughtful way. [Ger 2008, 131]

In particular the counter-culture was instrumental in creating the context in which the real-time interactive technologies developed by the military, or through military funding in the context of the Cold War, could be stripped of their militaristic, technocratic aura, repainted with a gloss of cybernetic idealism, taken in part from the post-war avant-garde, and re-purposed as gentler, kinder tools for a new generation.

[Ger 2008, 129]

170 | “A Mathematical Theory of Communication” - the source of a message; the device that encodes the message for transmission; the channel along which the message is sent; the device that decodes the message; and the message’s destination. [Ger 2008, 52]
171 | Cybernetics - a field invented by Norbert Wiener through his 1948 book 'Cybernetics', which studied feedback principles ranging across missile technology, natural phenomena, man and electronics.


Even before this American counter-cultural espousal of military-business technology, artists and theoreticians in Europe had started merging cybernetics, systems theory and information theory with aesthetics, as can be seen in texts like Abraham Moles’ “Théorie de l’information et perception esthétique”172 and Max Bense’s “Programmierung des Schönen”173.

Nicolas Schöffer, a Hungarian-French sculptor, had by 1954 pioneered the application of cybernetics to the aesthetic traditions of Kinetic Art174. With the application of cybernetics, he exhibited sculptures which fused light and movement within a sophisticated autonomous structure.175 By 1961, artworks related to such ideas, and also inheriting the concrete and constructivist art traditions, had begun to be displayed at the 'Nove Tendencije' series of shows in Zagreb.

Thematically, the artistic works of this era merged the new theories about 'information' with a post-war desire to create objective, rational and scientific art using cybernetic principles. [Ger 2008, 101]

In America, meanwhile, Marshall McLuhan had started meditating on the cultural lineages of communication technologies in order to contextualise the changes occurring in western societies, with nuclear families living under constant exposure to radio, television and print technologies. With novel concepts such as the “global village” and catchphrases like “the medium is the message”, published in his books “The Gutenberg Galaxy: The Making of Typographic Man” from 1962 and “Understanding Media” from 1964, his impact on the ideology of the 'Silicon Valley' counter-culture was substantial.

The term 'computer graphics' was used for the first time in 1960 by William Fetter of Boeing to describe his human-factors cockpit drawings. The term was soon used ubiquitously across the aeronautics and automobile manufacturing industries, which had started using computers as a design tool176. By 1965 the first computer art exhibition had been held at the Technische Hochschule in Stuttgart, featuring A. Michael Noll, Frieder Nake and Georg Nees. Computer graphics research and computer art at this stage remained indistinguishable, as can be noted from the work which came out of Bell Labs177, New Jersey. The conclusive manifestation of all this computer-generated art was the 1968 exhibition “Cybernetic Serendipity: The Computer and the Arts”, curated by Jasia Reichardt at the Institute of Contemporary Arts in London178.

172 | 1958 - “Théorie de l’information et perception esthétique” [Information Theory and Aesthetic Perception], Paris.
173 | 1960 - Aesthetica IV – Programmierung des Schönen. Allgemeine Texttheorie und Textästhetik [Programming of Beauty. General Text Theory and Text Aesthetics].
174 | Which was heralded by the motion sculptural works of Marcel Duchamp, Naum Gabo and László Moholy-Nagy.
175 | Sound-equipped art structure built in 1954 for the Philips Corporation, followed by his two dynamic responsive works CYSP I (illus. 26) and CYSP II (both 1956). [Ger 2008, 100]
176 | 1963 - DAC-1, the first commercial CAD (computer-aided design) system, built for General Motors by IBM, was shown publicly. In the same year Lockheed Georgia started using computer graphics. [ibid 104]
177 | Bell Labs - Michael A. Noll, Kenneth Knowlton, Manfred Schroeder and others.
178 | Cybernetic Serendipity might be considered the apogee of computer-aided art, considered as a mainstream art form. It consisted of all forms of computer-influenced and aided art, including work in music, interactivity, cybernetic art, computer-generated film and computer graphics, and involved an eclectic mixture of contributors, including Cage and many others mentioned above, and scientists from university laboratories, and was highly successful. [ibid 110]


“The Machine as Seen at the End of the Mechanical Age”, curated by Pontus Hulten at the MoMA, New York in 1968, and the 1970 show “Software, Information Technology: Its New Meaning for Art”, curated by Jack Burnham at the Jewish Museum, New York, were two other art-historically significant shows for media art, which marked the end of the early phase of cybernetics in the arts and heralded the arrival of conceptual art179.

Gere mentions that the new technology-based avant-garde art practices were largely marginalized or ignored by the mainstream art world, and no large art institution indulged in putting up such shows after 1970. He also points out that in the 'Software' exhibition it was already difficult to distinguish between conceptual art and cybernetic art, since it featured work by artists like Hans Haacke and Joseph Kosuth (later regarded as leading conceptual artists) alongside works by pioneers of multimedia computing like Ted Nelson and Nicholas Negroponte. [Ger 2008, 111]

Two years before he participated in this 'Software' exhibition, Nicholas Negroponte had founded the 'Architecture Machine Group'180 at MIT, his alma mater, where he was teaching at the time. He has ever since been one of the strongest proponents of the positive aspects of new technologies like the personal computer and the internet, through his publications and articles.

By 1980 he had collaboratively made a proposal defining a new institution at MIT - the Media Lab - conceived as an interdisciplinary space to connect computer programmers with psychologists, anthropologists and other creative professions in order to study the possibilities of information and communication technologies. In the context of this thesis, it is also the institution which supported John Maeda in developing 'Design by Numbers' and, later, Casey Reas and Ben Fry in developing 'Processing'.

The last decade of the twentieth century, with the concrete realisation of the networked society in the developed nations, also saw a concurrent subversion of this technology by various artists who fall under the broad label of internet art or net.art181. Artworks whose existence was based upon the innovative possibilities offered by the internet replayed many strategies of the post-war avant-garde182.

179 | On the whole, cybernetic and computer art was, rightly or wrongly, regarded as marginal in relation both to the traditional art establishment and to avant-garde art practice. In light of both the use of cybernetics and information technology for military strategy and the increasing computerization of society for the purposes of capital, the utopian belief in their potential was hard for many to sustain. [ibid 107, 108]
180 | The Architecture Machine Group was intended as a combination laboratory and think tank for studying issues of human-computer interface. In 1978, the Architecture Machine Group created the Aspen Movie Map - a system that simulated driving through the city of Aspen in Colorado, via a computer screen. Built using videodiscs, it enabled the user to navigate through the city, and even choose the season in which they are travelling. It was, and indeed remains, one of the most sophisticated systems ever built. [Ger 2008, 132]
181 | Among them are artists such as Vuk Cosic, Olia Lialina, Jodi, Alexei Shulgin, Heath Bunting, Rachel Baker and the Irational [sic] Organization and many others. [ibid 115]
182 | Practically every trope or strategy of the post-war avant-garde has found new expression through net.art, including Lettriste-style hypergraphology, involving the representation of codes and signs, Oulipian combinatorial and algorithmic games, Situationist pranks, Fluxian or Johnsonian postal strategies, staged technological breakdowns such as previously rehearsed in video art, virtual cybernetic and robotic systems, parodies and political interventions. [ibid]


2.7.1 ROOTS OF THE 'VIRTUAL' MACHINE

Turing’s conceptual machine, capable of being reconfigured in an infinite number of different states, is the perfect, idealized model of capitalism as a universal machine, in which different phenomena, labour and commodities are homogenized in order to be exchanged, manipulated and distributed. [Ger 2008, 50]

In the chapter titled “Enter the Machine”183, MacKenzie describes Marx's notion of a 'machine' as a rejection of the definitions given by mathematicians and mechanicians, which see it as continuous with the 'tool'.

Marx states that describing a machine in terms of its parts, or differentiating it from a tool by the source of power184, completely misses the machine's most important aspect, its historical element.

The machine, Marx noted, undermined the basis on which workers had resisted capital. As an instrument of large-scale industry, the machine embodied a totally objective layout of production and confronted labour as a pre-existing material condition of production, taking over their skill. [Mac 1984, 16]

When Engels analysed railways and ships at sea, he remarked that both mandate the complete subordination of the worker to authority. Instead of seeing authority as a condition of capitalist social organization, he felt that relationships of authority and subordination exist independently of any social organization, their root cause being the material conditions under which production and circulation occur. He thereby considers the unavoidable roots of authoritarianism to be deeply embedded in human involvement with technology and science. [Win 1986, 6]

Mumford's basic thesis to explain the rise of machines is that just listing a chronology of technical artefacts185 could never explain the cultural transformations which occurred in the river valleys of Egypt, Mesopotamia and India, and later in other parts of the planet.

He claims that the Machine Age had its origin long before the so-called Industrial Revolution of the eighteenth century, and writes of an archetypal machine composed of human parts which was the first form of organization. [Mum 1967, 11] In his view the causes of modern technics were changes that first occurred in the human mind and were translated only later into institutions, mechanisms and technical artefacts. [ibid 278]

The automatic machine, working freely without human supervision, was rooted in the abstract model of the 'archetypical mega-machine'. The tasks done earlier by imperfect human substitutes on a large scale preceded the mechanical tasks which can be applied on a small scale within a machine. [Mum 1967, 227]

183 | Marx's chapter on "Machinery and Large-Scale Industry" opens with a discussion of the definition of "machine". [Mac 1984, 14]
184 | Nor does it suffice to differentiate the tool from the machine on the basis of the power source (human in the case of the former, nonhuman in the case of the latter). [ibid]
185 | Wheeled wagon, the plow, the potter's wheel, the military chariot. [Mum 1967, 11]


The greatest contribution of the mega-machine was the myth of the machine itself.186

To explain how the invisible machine anticipated 'the machine', three instances of 'archetypal machines', which still exist as the identifying marks of a mega-machine, are listed below.

2.7.2 LABOUR MACHINE

With the success of the river-valley civilisation model and the resultant rise in population, a system of public organisation evolved to manage land irrigation and canalization. The use of script by the temple and the palace came about to manage this order through taxation of the quantities produced from the land. Script-based operations caused a change of pattern and scale. To manage these activities, the earliest sensibilities of “mechanical order, mathematical exactitude, specialized skill and knowledge, and, above all, centralized intelligence” developed. These sensibilities grew out of the systematic observation of the stars, the plotting of the paths of planets and the sensing of the seasons, as can be proven from the earliest calendars187.

The primary practitioners of this script, the early priesthood, had the power of astronomical and meteorological prediction, which made their authority supernatural as the 'interpreters' of the cosmic effects on humans. The temple as an institution emerged because of its utility in managing large-scale farming by synchronizing agricultural operations. The order implied by a routine of repetitive work, which started with neolithic grinding and polishing and can be seen in the later 'civilisation' marks of geometric patterns and decoration, could be scaled up onto entire landscapes. The basic shapes and forms still used for computer graphics - rectangles, triangles, pyramids, straight lines, bounded fields - were the first mega-architectural forms which showed an awareness of an astronomic order and of human control. With a kingship in place, the standardization of a new royal economy started. [Mum 1967, 167] Before the Second Millennium B.C. the use of the colossal labour machine had achieved the high point of mechanical efficiency which produced the Great Pyramids. [Mum 1967, 226]

Clandestine knowledge of the script was a necessity of the priestly class for self-preservation, since, if everyone had access to the sources of knowledge and could interpret it, it would have been impossible to preserve infallibility and conceal errors.

Secret knowledge is the key to any system of total control. Until printing was invented, the written word remained largely a class monopoly. Today the language of higher mathematics plus computerism has restored both the secrecy and the monopoly, with a consequent resumption of totalitarian control. [Mum 1967, 199] 188

186 | The notion that this machine was, by its very nature, absolutely irresistible – and yet, provided one did not oppose it, ultimately beneficent. That magical spell still enthrals both the controllers and the mass victims of the megamachine today. [Mum 1967, 224]
187 | The formulation of the Egyptian calendar at the beginning of the Third Millennium B.C. indicates the culmination of a long and widespread process of exact observation and some kind of mathematical notation. Concern with the heavenly bodies and the discovery of a dynamic pattern of order in their seemingly random distribution may have been one of civilized man's earliest triumphs. [ibid 166]
188 | This essential coalition between royal military power and often dubious supernatural authority anticipated a similar alliance between scientists and mathematical games-theorists with the higher agents of government today; and was subject to similar corruptions, miscalculations and hallucinations. [Mum 1967, 176]


2.7.3 MILITARY MACHINE

'Mechanical' control of an army meant having a single mind with a singular aim at the top of the organization, and a chain of command for passing messages through a hierarchical network down to the smallest unit. The message stays the same from the source to the final destination. Written script, a 'method' of translating speech into graphic record, was the most important invention aiding the army to function efficiently, by allowing the smooth transmission of impulses and messages throughout the system. The ability to fix 'accountability' when a scripted command was not implemented was very effective for carrying out both constructive and coercive tasks. This concept of 'accountability' - to be accounted for - has always been tied to script, from the earliest historical evidence189. Using written words to control work involving large numbers, the network of scribes and swift messengers allowed action to be taken at a distance, and thereby formed a professional group which controlled the encoding and decoding of regal messages. In such a scheme, speed eventually becomes the signifier of power, a tendency which can be noticed even in contemporary notions of technology. Regal commands and military commands always have an urgency implicit in them; current fascinations with the speeds of trains, the internet, CPUs, graphics, stock prices etc. have only accelerated this scheme. [ibid 205]

The military machine in the service of warfare produced the greatest number of mechanical inventions up until the thirteenth century A.D. Especially with regard to metallurgical and chemical applications, the military machine was always more innovative than civilian culture190. The ethics of discipline and self-sacrifice from military culture proved mandatory for any society which looked beyond the village, and the order and accountancy instituted by the temple and the palace have been at the core of all larger-scale cooperation and trade.

2.7.4 COMMUNICATIONS MACHINE

The hierarchic structure of the military machine formed the blueprint for the exponential scale of the human machine, which has no theoretical limits to the number of hands it can control. Making the human dimension and organic limits invisible constitutes the chief boast of the authoritarian machine. Its productivity is achieved by physical coercion to overcome human laziness and bodily fatigue. Occupational specialization was a logical step in the constitution of the human machine, since only by intense, skilled specialisation of processes was the accuracy and perfection of the final monument achieved. All division of labour in modern industrial society and post-modern information society starts at this root point. Like modern technology, the human machine (the social order) dictated the purposes to be served by its constituent human parts. The human machines were impersonal and dehumanized since they operated on a large scale. The control achieved by the collective human machines was confined to mass enterprises and large-scale architectural and military operations. [Mum 1967, 200]

189 | The earliest uses of writing were not to convey ideas, religious or otherwise, but to keep temple records of grain, cattle, pottery, fabricated goods, stored and disbursed. This happened early, for a pre-dynastic Narmer mace in the Ashmolean Museum at Oxford records the taking of 120,000 prisoners, 400,000 oxen, and 1,422,000 goats. [ibid 192]
190 | The scythe was attached to chariots for effective warfare before it was attached to machines for agricultural purposes; while Archimedes' knowledge of mechanics and optics was applied to destroying the Roman fleet attacking Syracuse before it was put to any more constructive industrial use. From Greek fire to atom bombs, from ballistas to rockets, warfare was the chief source of those mechanical inventions that demanded a metallurgical and chemical background. [ibid 226]


The bureaucratic link between the source of power - the king - and the actual human machines was the most important one for the mega-construction and destruction work which the human machine performed. It was the bureaucracy - the invisible machine - which carried out the taxation supporting the social hierarchy and assembled the 'manpower' to form the new mechanical fabric. The bureaucracy was the communications machine, working in tandem with the military and labour machines to form a core part of the totalitarian structure. Classic bureaucracy is supposed to originate nothing, only to pass on a message from above without any changes; only during times of corruption or rebellion does this organization change. This administrative process represses all aspects of the personality and orients the performance of daily tasks like a ritual. [ibid 201]

In slight connection to the section above, where the word 'virtual' has been applied metaphorically, the Java language system, which 'Processing' inherits, has aspects of both a compiled and an interpreted language191. In the 'Processing' IDE, once the play button is pressed, the code is compiled into byte code that can run on the Java Virtual Machine (JVM)192. The byte code allows 'Processing' code to run on various platforms without the loss in speed of an interpreted language. [ReF 2007, 680]
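
As a hedged illustration of this pipeline (the wrapped class shown in the comment is approximate and its name invented; the exact rewriting varies between Processing releases), a sketch typed into the IDE goes through the following stages:

// A sketch as typed into the 'Processing' IDE:
void setup() {
  size(200, 200);
}

void draw() {
  background(255);
  ellipse(mouseX, mouseY, 20, 20);
}

// Pressing play first has the pre-processor rewrite the sketch into ordinary
// Java, roughly of the form (class name invented here):
//
//   public class MySketch extends PApplet {
//     public void setup() { size(200, 200); }
//     public void draw() { background(255); ellipse(mouseX, mouseY, 20, 20); }
//   }
//
// That Java source is then compiled into .class byte code, which the JVM
// executes on whichever platform it is installed on.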

2.8 CONCLUSION

Capitalism, post thirteenth century, appropriated the discipline and anti-sexualism of monastic organization, building on habits of regimentation existing from pre-historic times. Due to the regularity and efficiency of its functioning, the monastery laid the foundation of capitalism and mechanization. More importantly, it prepared the ground by fixing a moral value on the process of work itself, besides and apart from the final reward. Monasticism had achieved this efficiency by abstracting the human problem, omitting the woman from its direct scope. The Benedictine system had demonstrated the efficiency of doing daily work by co-operating on a planned order. By moralising work itself, instead of seeing labour as slavery, the monastery had increased productivity, and the term “le travail bénédictin” (Benedictine work) came to mean efficiency and formal perfection. [Mum 1967, 266, 267]

An ordered, routinised life and technical mastery made the Benedictine monasteries prosper and trade their surplus across Europe. This order of the monasteries was retained by the medieval guilds, since it connected the aesthetic and moral values attached to labour with the Christian religion. [ibid 271, 272]

The medieval guilds were the prototype of autonomous corporate bodies that had a set agreement about the performance of work and the values of wage and measure. The sixteenth-century printing revolution reduced this mercantile-religious knowledge monopoly by democratising accessibility. [ibid 272, 273]

191 | A compiled-language program must be converted (compiled) from a human-readable form into machine-readable instructions before it is run. A program called a compiler makes this transition. An interpreted-language program is interpreted (each statement is analysed) by another program while it runs. The former is faster than the latter, while the latter can be modified during runtime. [ReF 2007, 680]

192 | The JVM is a software processor that acts as a buffer between the byte code and the machine’s physical microprocessor. Because there is a standardized JVM for many different types of computers, the same Java code can theoretically run on all of these different machines without requiring platform-specific changes. [ibid]


The strongest icon of 'western civilized society', the town clock, embodied Benedict's original insistence upon the obligation to perform manual labour. Though the rules imposed by monasticism were claimed to serve spiritual purposes and not to add to the power of the ruling classes, it was here that slave labour from dawn to dusk shifted to five hours of labour, leaving space for leisure. Aesthetically improved living was initiated by constructing spacious buildings, manicured gardens and thrifty fields. Balancing this regimen of work was the intellectual labour of reading, writing, discussion and the planning of community life. The order of monastic life, with its seven 'canonical hours', was visualised and quantified by the hour-glass, the sun-dial, and later the clock. It was from the monastery that time-keeping was transferred to the marketplace; in classical times this would have emerged from the mercantile classes.

By the fourteenth century complete towns could be regulated by the tower clock's bell. Contained in the social hierarchy of the monastery was the hierarchical structure of a collective labour machine, but at the same time the monastery had rationalized and humanized this discipline by the abstraction of spirituality. [ibid 265] These were the traits which were inherited by the scientific ideology of the seventeenth century. In contrast, under capitalism the monastic sins of Christianity - pride, envy, greed, avarice, and lust - became positive social virtues, mandatory incentives for all economic enterprises. [ibid 277]

This section 2.7, about the 'deep history' of digital media, may seem superficially quite vague and off on a tangent in the context of 'Processing'. Yet it points out routes for further, more elaborate enquiries. Pragmatic constraints of time and convention are the only causes for bringing this section to an abrupt halt. In the next section, no. 3, certain aspects of this history will be taken up to get an overview of the term 'Language', which is central to the ideas of pedagogy and 'procedural literacy'.

The internet and hardware architecture as they exist now are a complete pre-condition for a product like 'Processing'. From the original installer files, libraries, reference manuals and help examples to problem solving and peer learning, everything is supported solely by the internet infrastructure. I have tried to put across a very broad spectrum of causes to elucidate why an artefact like 'Processing' has come into being.

Contrary to the original visions of Alan Kay, the developments in software technology since the 1980s have not put pedagogical aspects at the forefront. What has come about is a total divide between 'programmers', who are solely concerned with writing code to fulfil certain tasks, and 'media designers', who predominantly use products sold by global corporations to create aesthetic visual interfaces. The academic syllabuses and the labour markets (the design and software industries) have sustained this dual abstraction, with both counterparts dealing solely with their respective fields.


'Processing', as an idea, is a resistance to this status quo of designers mentioning GUI-based tools in their CVs and programmers mentioning programming languages in theirs. It also resists the monopoly-capitalist model employed by the software firms of adding more and more sophisticated packages in each product release.

The approach supported by 'Processing' is to provide a set of elementary tools whose only means of access is language - the simplified Java syntax. The earlier claims of 'Processing' as a meta-medium are justified by this convergence of techno-linguistic empowerment with a pedagogical environment, which is not present in commercial design software.

I close this section about the deep history of Digital Media with an observation by Langdon Winner.

The available evidence tends to show that many large, sophisticated technological systems are in fact highly compatible with centralized, hierarchical managerial control. The interesting question, however, has to do with whether or not this pattern is in any sense a requirement of such systems, a question that is not solely empirical. The matter ultimately rests on our judgements about what steps, if any, are practically necessary in the workings of particular kinds of technology and what, if anything, such measures require of the structure of human associations.

[Win 1986, 9]


3 - LANGUAGE

I now look at some of the ways in which the term 'Language' has been defined by various authors. This is done keeping in mind its meaning when preceded by the term 'Programming' (section 0.4), i.e. a 'Programming Language' - 'Processing' - 'a programming language for artists and designers'. Ferdinand de Saussure differentiates between 'the language' and 'the faculty of language' to show that the former is necessarily a 'social product'. [Sau 1910]

The faculty of articulated language is permanently available in all persons in the form of the biological apparatus of the vocal organs. This faculty cannot be utilised without the existence of a language, which necessarily comes to the individual from society.

In its own right a language is an abstract thing which requires the human being for its 'realisation'. It is simultaneously independent of the person, since no single person can create a language; for its very being, the language presupposes collectivity.

Saussure considers that the only 'essential feature' of language is 'the combination of sound and acoustic image193 with an idea'. He classes language (like writing) as a semiological institution194 dealing with a vast system of signs, along with other semiological signs like the church bell, ships' signals, army bugle calls etc. The psychology of languages is part of social psychology and of a psychology of sign systems.

Mumford sees the evolution of language, a method of expressing and transmitting meaning, as having been phenomenally more important for human development than stone chipping or tool making. Compared with the rudimentary muscular coordination required for tool-using, the strenuous co-ordination of multiple organs required to create speech was a more important advance. Creating a useful language would have taken far more effort: even in early civilizations like Egypt and Mesopotamia the spoken language was much more advanced than the kit of tools.

In pre-historic times cultural 'work' gained value over manual work. Though tool-making provided the discipline of hand, muscle and eye, culture formation necessitated a control over man's natural functions.195

193 | (The acoustic image is the impression that remains with us, the latent impression in the brain (D.)). There is no need to conceive it (the language) as necessarily spoken all the time. [Sau 1910]
194 | Nearly all institutions, it might be said, are based on signs, but these signs do not directly evoke things. In all societies we find this phenomenon: that for various purposes systems of signs are established that directly evoke the ideas one wishes; it is obvious that a language is one such system, and that it is the most important of them all; but it is not the only one, and consequently we cannot leave the others out of account. [ibid]
195 | Including his organs of excretion, his upsurging emotions, his promiscuous sexual activities, his tormenting and tempting dreams. [Mum 1967, 7]


Constantly exploring the organic capabilities of nose, eyes, ears, tongue, lips and sexual organs, pre-historical man gave them new roles to play196.

Tool-technics is a fragment of bio-technics: man's total equipment for life. [Mum 1967, 7]

Citing the work of the Dutch historian J. Huizinga, whose concept of 'Homo Ludens' suggested that 'play', more than 'work', was the formative element in human culture, Mumford reiterates that man's most 'serious activity' was in the realm of make-believe.197 The thesis of 'Homo Ludens' was shocking at the time compared to the then prevalent notion of 'Homo Faber', man the tool-maker, as the most important characteristic defining Homo sapiens.

Mumford theorises that man's sophisticated symbolic culture, with its emphasis on 'minding' and 'symboling', was more immediate and much older than the desire for control over the environment. 'Tools' were subsidiary instruments in this process of his self-discovery and self-transformation, never 'the main operative agent' in man's development.

In his theorisation of the 'mega-machine' he develops the view that man is primarily “a mind-making, self-mastering, and self-designing animal; and the primary locus of all his activities lies first in his own organism, and in the social organization through which it finds fuller expression.” [Mum 1967, 9]

Mumford considers that, had the technically overlooked long phase of the invention of languages by palaeolithic man not existed, no future technology would have had its genesis. He locates the cause of the underestimation of language use, in comparison to tools and machines, in the sophisticated abstractions involved in the earliest inventions - "in the ritual, social organization, morals, and language"198 - which, unlike stone tools, left no physical remains.

He even claims that the dawn of the idea of 'creativity' occurred in this early palaeolithic phase, when man orchestrated a symbolic universe of meaning and thereby started the process of 'culture'. [Mum 1967, 34]

He foresees that, with the miniaturization of electronics, the numerical superiority of the human brain will diminish in comparison to the computer, but also states that machines can never be human.

196 | Even the hand was no mere horny specialized work-tool: it stroked a lover's body, held a baby close to the breast, made significant gestures, or expressed in shared ritual and ordered dance some otherwise inexpressible sentiment about life or death, a remembered past, or an anxious future. [Mum 1967, 8]
197 | On this showing, ritual and mimesis, sports and games and dramas, released man from his insistent animal attachments; long before he had achieved the power to transform the natural environment, man had created a miniature environment, the symbolic field of play, in which every function of life might be re-fashioned in a strictly human style, as in a game. [ibid 9]
198 | But if tools were actually central to mental growth beyond purely animal needs, how is it that those primitive peoples, like the Australian Bushmen, who have the most rudimentary tools, nevertheless exhibit elaborate religious ceremonials, an extremely complicated kinship organization, and a complex and differentiated language? Why, further, were highly developed cultures, like those of the Maya, the Aztecs, the Peruvians, still using only the simplest handicraft equipment, though they were capable of constructing superbly planned works of engineering and architecture, like the road to Machu Picchu and Machu Picchu itself? And how is it that the Maya, who had neither machines nor draught animals, were not only great artists but masters of abstruse mathematical calculations? [ibid 23]


But purely quantitative comparison does not begin to reveal qualitative uniqueness of the brain's responses - the richness of odor, taste, color, tone, emotion, erotic feeling which underlies and suffuses both the reactions and the projections that take place in and through the human mind. Were these eliminated, the brain's creative capabilities would be reduced to the level of a computer, able to deal accurately and swiftly with pure abstractions, but paralysed when faced with those organic concretions that are fatally lost by isolation or abstraction. While most of the 'emotional' responses to colour, sound, odour, form, tactile values predate man's rich cortical development, they underlie and enrich his higher modes of thought. [Mum 1967, 39]

3.1 CONTAINER

The long and difficult process of 'cultural fabrication' using language served the function of being a 'container' for man's creativity and reduced its negative manifestations. The period it took to form a culture was exhausting, and the causes are more diverse than tool use alone. This process occurred faster after man discovered 'fantasy', the multifaceted aspects of his own mind. [Mum 1967, 41]

Advances in the art of symbol-making in speech and images allowed the transmission of acquired habits - 'culture' and 'knowledge' more effectively than ever before. [ibid 112]

MacKenzie notes the problem of seeing Marx's “1859 preface” as an endorsement of “technological determinism”, since the verb "to determine" (or the German bestimmen, which is what the English translations of Marx are generally rendering when they write "determine") [Mac 1984, 7] is linguistically complex and would be more aligned with “determination by an authority ... set bounds or limits.”

So if, as Mumford claims, 'language' was the prime technology which brought about social changes, then this would fit with Marx's original conception of 'determinism'. Dance, song, cave walls and language were the first 'creative mediums'199 that became detached from 'ritual', being sacred community efforts for aesthetic purposes by palaeolithic man in the Ice Age. These can be viewed as predecessors of 'media storage' devices like contemporary 'film' and the 'hard drive', which are used to store abstract 'images'. The 'art' which developed in these 'creative mediums' evolved out of the lived 'knowledge' of a confident, emotional, aggressive, angry and excited hunter's mind. [ibid 119]

The neolithic cultivator changed the 'natural' environment with the axe, dams, reservoirs, terraced hills, permanent fields and designed clay or wooden dwellings. An increasingly humanized habitat started developing, in which large-scale works could occur. The most significant neolithic trait which contemporary times have inherited is 'industriousness', “the capacity for assiduous application to a single task, sometimes carried over years and generations.” [Mum 1967, 128]

199 | The kind of graphic line achieved in the paintings of the bisons of Altamira of the deer of Lascaux implies fine sensory-muscular coordination, along with the sharpest kind of eye for subtle detail. Hunting requires a high degree of visual and aural alertness to the least quiver of movement in leaves or grass, along with hair-trigger readiness to react promptly. That the Magdalenian hunter had attained this state of sensory vividness and esthetic tension is shown, not merely by the evocative realism of his highly abstract representations, but by the fact that many of his animals are depicted in motion. This was a higher achievement than static symbolism. [Mum 1967, 119]


This idea, applied to the format of 'daily work', emerged from neolithic times. The terms used, from the palaeolithic 'grinding' to the neolithic 'shaping' of tools, are important for understanding the change between these epochs. 'Shaping' implies the slow application of a single task using monotonous motions over a long time. 'Grinding', also a laborious process, required a willingness to endure drudgery, “ritual repetition pushed almost beyond endurance”. [Mum 1967, 137] He also notes that rituals, because of their efficiency, fall back into “the automatism of unconscious existence, and so arrest human development”. [ibid 123]

It should be noted that Mumford's use of the term 'civilization' includes all the institutions200 and behaviours common in notions of 'capitalism'. Hence he does not approach the neolithic 'civilisation' as 'backward' compared to present 'standards'.

3.2 ESSENCE OF COMPUTATIONAL MEDIA

The essence of modern technology, its 'enframing', lies according to Heidegger in its employment of exact physical science, though he also cautions against the deceptive understanding of modern technology as applied physical science itself. [Hei 1954, 7] The essence of modern technology shows itself when man sees nature and everything 'actual' as standing-reserve. [ibid 8]

He understood “essence” as “what something is; in Latin, quid”, its whatness. The example he gave was that the essence of tree is “what pertains to all kinds of trees - oaks, beeches, birches, firs”, its “treeness”. Heidegger saw the essence of something as the way in which it “essentially unfolds, administers itself and remains”, its idea (eidos): the real is transitory, and it is the idea that is constant. With “enframing” he did not mean a tool or an apparatus, neither materially nor conceptually; he meant 'the essence', but definitely not 'the essential' (a genus).201

Heidegger's final thesis was that, because 'the essence of technology is nothing technological', thinking about and confronting technology must occur in the domain of art: art is, on the one hand, “akin to the essence of technology and, on the other, fundamentally different from it”. [ibid 12]

200 | Its chief features, constant in varying proportions throughout history, are the centralization of political power, the separation of classes, the lifetime division of labour, the mechanization of production, the magnification of military power, the economic exploitation of the weak, and the universal introduction of slavery and forced labour for both industrial and military purposes. The invention and keeping of the written record, the growth of visual and musical arts, the effort to widen the circle of communication and economic intercourse far beyond the range of any local community: ultimately the purpose to make available to all men the discoveries and inventions and creations, the works of art and thought, the values and purposes that any single group has discovered. [Mum 1967, 186]
201 | The machines and apparatus are no more cases and kinds of enframing than are the man at the switchboard and the engineer in the drafting room. Each of these in its own way indeed belongs as stockpart, available resource, or executor, within enframing; but enframing is never the essence of technology in the sense of a genus. [Hei 1954, 10]


Mentioning the general notion sustained by the 'humanities'202, which sees computer programming as a narrow technical skill unrelated to the “theoretical and aesthetic concerns of new media”, Mateas points out that the technical details of code are never included when the 'technical artefact' is considered. [Mat 2005]

He calls this state 'procedural illiteracy', a remnant of the twentieth-century separation of art and 'techne' (section 2.2). His argument is that without a sensitivity to how code operates, media artefacts are approached by the humanities as a 'black box', thereby completely missing the nexus of “authorship, code, and audience reception.”

Mateas argues that 'procedural literacy' allows new media practitioners and scholars to work with the essence of computational media. He is careful to state that 'computer programming' is important, but only as a part of 'procedural literacy'.

By procedural literacy I mean the ability to read and write processes, to engage procedural representation and aesthetics, to understand the interplay between the culturally-embedded practices of human meaning-making and technically-mediated processes. [Mat 2005, 1]

The 'ability to read and write processes' means programming a computer to "embody any conceivable process". He uses the words 'craft skill of programming' to name the most important aspect of procedural literacy, but cautions that this 'craft skill' should not be confused with the details of a particular programming language. What matters are the more 'general concepts and structures' which are uniform across all languages.
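
What such language-independent 'general concepts and structures' look like in practice can be sketched in a few lines of Processing (a minimal, hypothetical example, not taken from Mateas): a variable, a loop and a conditional together already describe a process rather than a finished picture, and the same skeleton could be written in any imperative language.

// A process, not a picture: iteration, branching and a primitive
// drawing operation - structures common to all imperative languages.
void setup() {
  size(400, 400);
  background(255);
  for (int x = 0; x < width; x += 10) {   // iteration
    if (x % 20 == 0) {                    // conditional branching
      line(x, 0, x, height);              // primitive operation
    } else {
      line(x, height/2, x, height);
    }
  }
}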

His challenge to new media scholars is that they must read code beyond the trivial level (primitive operations and control flow), at the sophisticated level of "the procedural rhetoric, aesthetics and poetics encoded in a work." He cites the new understanding in game studies which sees the semantics of a digital game ("including the gameplay, the rhetoric and ideology of the game") as encoded in its procedural rules.

As an endorsement of this idea, he positively notes the novel field of 'software studies', which explicitly looks at how code functions within culture, and the developments in the field of interactive art and design, where 'procedurality' is 'the interactive artefact' itself.

His thesis is that:

New media artists, game designers and theorists, media and software studies theorists, and generally anyone involved in cultural production on, in or around a computer, should know how to read, write and think about programs.

[Mat 2005, 2]

202 | New media practitioners, scholars, designers and artists, including game designers and game studies scholars. [Mat 2005, 1]


In his own work as a teacher he has employed these ideas, teaching students to create "procedurally expressive artefacts" in order to help them deal with the 'expressive power of computation'. The skills gained, he claims, are useful across a range of professions, including game design, media art and theory. At the time of writing the article he was developing undergraduate and graduate courses on "programming for artists" at the Georgia Institute of Technology.

He notes similar endeavours in computer educational projects such as the Rapunzel project 203 (Flanagan & Perlin), media design courses for women 204 (Guzdial 2003), Design By Numbers (Maeda 1999) and Processing (Fry & Reas 2003) 205.

Computing is currently taught in media arts programs as a black-box tool (using specific commercial software) and not as algorithmic processes for working with a meta-medium that has its own unique conceptual possibilities. Mateas mentions that general scripting environments (Flash, Director, Visual Basic, [old] Hypercard) offer only a meagre amount of "procedural authorship" and obscure the full potential of a computational medium.

But none of these place the 'describing a complex process' paradigm upfront; what is provided is a metaphoric simulated environment for creating some 'media'. By hiding all the complexity and keeping the software user at a 'friendly' distance, the considerable mental work needed to describe a computational process is not accommodated. Mateas states that programming will always involve labour, so the more a 'media-design' software simplifies a process with a GUI, the less useful that product becomes for describing processes in their entirety. He considers a tool like Flash to follow a general pattern: very efficient for some types of animation but completely powerless to achieve some other type of effect or media.

In Mateas' conception, computation ought to be seen as "a universal representational medium for describing structure and process". Not all tools can be meta-tools, but procedural literacy enables better coordination with media software through an understanding of their inherent constraints, since all these software products are themselves commercially sold descriptions of complex processes. What is crucial in all this is that 'media design' should be seen as describing a computational process and not just as creating a file in some media format using some software.

203 | An agent-based programming environment intended to appeal to middle school girls, precisely at an age when many girls, for a variety of social and cultural reasons, start losing interest in math and science. Flanagan, M. and Perlin, K. Rapunzel, maryflanagan.com/rapunsel/about.htm. [Mat 2005, Bib]

204 | A university freshmen media computation course that introduces computer science from the perspective of manipulating media objects such as still images, sound and video. His course is designed to address the high withdraw-or-failure rates in introductory computer science courses, particularly among women. Guzdial, M. 2003. A media computation course for non-majors, In ITiCSE Proceedings p. 104-108. ACM: New York. [ibid]

205 | Both projects of the Aesthetics and Computation Group at the MIT Media Lab.


He cautions against putting new media students into introductory CS courses, since these feature a context outside the visual aesthetics of digital media design 206. Because such courses assume prior knowledge of programming and cater to "the adolescent male geek, emphasizing narrow technical mastery disconnected from broader social and cultural issues", they isolate the 'new media' student. By not emphasising computation as a creative medium, worthy of enquiry for its own sake, these courses preserve the two-culture divide between engineering/science and the arts/humanities.

He also clarifies that dumbing down a CS course by merely including visual examples to teach computation should be avoided. Instead he suggests an alternative curriculum which goes beyond the usual CS focus on mathematical abstractions and formalised models and also teaches programming as an expressive "concrete craft practice" involving writing and tinkering.

Learning 'computational design' means moving away from generating exotic playback of traditional media files (.jpg, .mov, .wav etc.) and towards generating these same effects using coded procedures. Mateas brings in the term 'process intensity' 207 to distinguish an interactive action that causes a pre-existing media file to be evoked from the computational generation of effects which does not rely on fetching media files.
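
The distinction can be sketched in Processing itself (a hypothetical contrast, not Mateas' own code; the file name "sunset.jpg" is an assumption): the left half of the sketch merely evokes a stored asset, while the right half computes every pixel of a comparable gradient and therefore carries a far higher 'crunch per bit'.

// Left half: low process intensity - playback of a stored media file
// (assumes "sunset.jpg" exists in the sketch's data folder).
// Right half: high process intensity - every pixel is computed.
PImage img;

void setup() {
  size(800, 400);
  img = loadImage("sunset.jpg");
  image(img, 0, 0, 400, 400);                  // fetch and display
  for (int y = 0; y < 400; y++) {              // generate procedurally
    for (int x = 400; x < 800; x++) {
      stroke(lerpColor(color(250, 150, 60), color(40, 20, 90), y / 400.0));
      point(x, y);
    }
  }
}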

In fact, the expressive power of computation lies precisely in the fact that, for any crazy contraption you can describe in detail, you can turn the computer into that contraption.

What makes programming hard is the extreme attention to detail required to realize the contraption. A “loose idea” is not enough - it must be fully described in great detail before it will run on a computer.

A New Media introduction to CS should be a difficult course, with the challenge lying not in programming conceived of as applied mathematics, but in connecting new media theory and history with the concrete craft practice of learning to read and write complex mechanical processes. [Mat 2005, 7]

206 | CS courses feature examples from engineering, mathematical and business applications (e.g. teaching recursion using the Fibonacci sequence, teaching functional abstraction using examples from physics, teaching object-oriented design using simple database-like models of people with attributes such as name and age). [Mat 2005, 7]

207 | Process intensity is the "crunch per bit", the ratio of computation to the size of the media assets being manipulated by the system. [Mat 2005, 8]


3.3 PEDAGOGY

This section cites some instances in which a tutor or the institutional apparatus has had success in attempts at teaching computer programming to designers and artists.

From his personal experience of teaching a subject titled "Computation as an Expressive Medium", Mateas remarks on including theoretical readings from "The New Media Reader" (Wardrip-Fruin & Montfort 2003) alongside training in craft-like skills for the projects. His audience for this course consisted of MA, MSc and PhD students from information design, technology, HCI etc.; the skill sets and prior education of this audience were vastly diverse.

From this experience he notes that learning 'game design' (a native computational form) worked well as a steady domain for creating a New Media introduction to a CS syllabus. Designing games necessitates focusing attention on responding 'procedurally' to a player's interaction, while also keeping a layer of simulated visual aesthetics intact. Most importantly, he found that authorial intentionality is best foregrounded in this domain. [Mat 2005, 7]

The earliest argument Mateas cites which makes a claim for a "universal procedural literacy" 208 is a talk given by A.J. Perlis 209 two years before Ivan Sutherland invented the 'Sketchpad'.

In 1961, at a symposium organised to celebrate the 100th anniversary of M.I.T., in a talk titled "The Computer in the University", Perlis argued for "procedural literacy" as a holistic idea at the core of a university education, as against just another credit course in a programming language. He expressed his belief that students should be exposed to computers at the earliest, during their freshman year itself, and described a course then being developed at Carnegie Tech.

A two-semester computer science course in the freshman year (something infrequent even now), as per his plan, comprised a first term in which students programmed in machine code (to learn the mechanical algorithms of code analysis) and a second term in which they would write code in GATE (the Carnegie algebraic language system), which automates what they had learnt earlier. What is 'revealed' to the students is the de-construction, step by step, of an intuitive human task and the design of a machine-readable algorithm out of it.

The skill they gain is:

decoding complex logical relations to produce branching codes and manual decoding of complex formula evaluations by mechanical processes. [Per 1962, 189] [Mat 2005, 5]
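
What this de-construction of an intuitive task into a machine-readable algorithm amounts to can be illustrated with a small, hypothetical Processing sketch (not from Perlis' course; the file name "photo.jpg" is an assumption): a person 'just sees' the darkest spot of a picture, but the machine must be told to visit every pixel, compare and record.

// 'Find the darkest spot' decomposed into explicit mechanical steps
// (assumes "photo.jpg" exists in the sketch's data folder).
PImage img;

void setup() {
  img = loadImage("photo.jpg");
  img.loadPixels();
  float darkest = 256;                       // step 1: nothing found yet
  int dx = 0, dy = 0;
  for (int y = 0; y < img.height; y++) {     // step 2: visit every pixel
    for (int x = 0; x < img.width; x++) {
      float b = brightness(img.pixels[y * img.width + x]);
      if (b < darkest) {                     // step 3: compare and record
        darkest = b;
        dx = x;
        dy = y;
      }
    }
  }
  println("darkest pixel at " + dx + ", " + dy);
}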

Perlis' course in programming was not conceived to teach one particular way of programming a computer. It was not about teaching one particular language or working on a particular machine.

208 | Earlier cited in Guzdial, M. and Soloway, E. 2003. Computer science is more important than calculus: The challenge of living up to our potential, In Inroads-The ACM SIGCSE Bulletin, Vol. 35(2), pp. 5-8. [Mat 2005, Bib]

209 | Published in the collection Management and the Computer of the Future. Greenberger, M. (Ed.) 1962. Management and the Computer of the Future, The MIT Press. [ibid]


Importance was given to providing a general grounding in developing a sensitivity for describing a (computational) process, since programming was the best 'medium' for describing processes. Mateas notes this as the relevant skill for today which new media persons should possess - the ability, and the knowledge of basic concepts, necessary to construct complex processes. [Per 1962, 206] [Mat 2005, 5]

Such ideas of 'democratising' techniques can also be noted in Alan Kay's prototype of the meta-medium (the FLEX machine) from 1967-69, which was made with children as the focal "user community"; after seeing the results he declared that "children really can write programs that do serious things". [Kay 1977, 32] Kay and his colleagues were encouraged by the creators of a CRT version of the MIT Logo work (a programmable robot turtle that draws on paper).

The programs for the children used symbols that represented objects but, more importantly, also contained loops, recursion, visualisation of alternative strategies, interactive discovery and "bug" sorting.

the kids love it! The interactive nature of the dialogue, the fact that they are in control, the feeling that they are doing real things rather than playing with toys or working out "assigned" problems, the pictorial and auditory nature of their results, all contribute to a tremendous sense of accomplishment to their experience. Their attention spans are measured in hours rather than minutes. [Kay 1977, 32]
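
The flavour of that Logo-style turtle drawing can be recreated in Processing (a minimal, hypothetical sketch; Processing has no built-in turtle, so a tiny one is improvised here). The child-friendly vocabulary of 'forward' and 'turn' is itself only a thin layer over loops and state:

// An improvised turtle: a position, a heading, and two commands.
float tx, ty, heading;

void setup() {
  size(400, 400);
  background(255);
  tx = width / 2;
  ty = height / 2;
  heading = 0;
  for (int i = 0; i < 72; i++) {   // a loop the learner can play with
    forward(100);
    turn(95);                      // change 95 and a new figure appears
  }
}

void forward(float d) {
  float nx = tx + cos(radians(heading)) * d;
  float ny = ty + sin(radians(heading)) * d;
  line(tx, ty, nx, ny);            // the turtle draws as it moves
  tx = nx;
  ty = ny;
}

void turn(float angleDeg) {
  heading += angleDeg;
}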

Kay at that time was very excited about the 'meta-medium' being 'active' like a human teacher. He defined being 'active' as responding to processes (queries and experiments), involving the learner in a two-way conversation. His conclusion about the vast and compelling implications of this first working model of a 'meta-medium' can be seen manifested in all the 'interactive' software present today.

On the other side of the Atlantic, in Britain, Mason has noted that by the late 1960s, with technological progress and the formation of polytechnics, a limited number of artists had taken up computer programming. [Mas 2004] In her essay she mentions various British art institutions that were setting up departments which encouraged the multi-disciplinary use of computers, especially as a design aid. She notes that in 1960, when computers were still popularly labelled 'electric brains' and rarely seen as a creative medium which could assist the arts, and following certain education reforms, the British conceptual artist Roy Ascott had by 1961 designed a new course at the Ealing Art School based on cybernetic principles of behaviour and process.

Regarding the 1968 exhibition 'Cybernetic Serendipity' as the culmination of British artists' interest in cybernetics and 'systems' theories, she notes that the main characteristic of British computer artists in the 1970s was that they knew how to programme or had a working relationship with scientists, engineers or technicians. She attributes this emergence of the 'artist-programmer' to the creation, from 1967 onwards, of a few well-funded polytechnics (multi-disciplinary adult educational centres), some of which even had art schools merged into them. The result of these administrative shifts was that some artists gained pioneering access to "expensive and specialist" computers for the first time.


This was a unique feature of British education at this time - as an art student, one could learn to programme. Thus, at the Polytechnic, it was theoretically possible to study art and craft (technology) together again, ... [Mas 2004]

She mentions a scenario of those times from Middlesex Polytechnic, where John Vince, a programmer, became attached to the arts community during his tenure as a data processing lecturer. Realising the interest in the computer among artists and designers, he developed a software package specially for artists, titled 'PICASO' (Picture Computer Algorithms Subroutine Orientated) and written in Fortran. Vince would demonstrate the coding structure to artists on a blackboard with chalk; the artist-students could copy this code onto a coding sheet and add their alterations, which Vince would then execute on the computer. [ibid]

Writing in the August 1976 issue of the "Computer Graphics and Art" magazine, Dr. Kai Chu states that at the time the aim was primarily to overcome the fear of the machine among students from a non-mathematical background. He wrote that the 'traditional' way of teaching programming through solving complex mathematical problems tended to put off many students, who lost interest very soon. He considered 'computer graphics' a viable means to train 'applied art' and design students in "computer experience", as opposed to teaching a specific language like FORTRAN.

In his article he describes his success in raising students' interest in programming by allowing them to explore the creative domain of designing patterns for use on fabrics. Having a context the students felt passionate about made the techniques of programming much easier to grasp. Since X-Y plotters were too expensive in his situation, he used a line printer which could print rolls of paper, with the colour aspects simulated by painting on top of the printout sheet. [Chu 1976, 20]

Having chosen FORTRAN for its capacity for sophisticated character-string manipulation, he wrote a few subprograms and subroutines to create a system fit for the domain of computerised fabric design. Using an IBM 1130 machine, a console typewriter and a line printer for the display of the design, the new system allowed designers to alter and add new elements to an existing design.
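
The character-printed fabric patterns Chu describes can be approximated in a few lines of Processing (a hypothetical reconstruction, not Chu's FORTRAN system): a motif stored as character strings is tiled to the console, much as a line printer would repeat it down a roll of paper.

// A fabric-like repeat pattern built from character strings,
// echoed to the console in the manner of a line printer.
String[] motif = {
  "X..X",
  ".XX.",
  ".XX.",
  "X..X"
};

void setup() {
  for (int row = 0; row < 16; row++) {    // sixteen printed lines
    String printLine = "";
    for (int rep = 0; rep < 8; rep++) {   // tile the motif across the roll
      printLine += motif[row % motif.length];
    }
    println(printLine);
  }
}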

He attributes the success of his methodology of using computer graphics to the pride students felt at achieving a satisfying level of creativity. Concluding that this method is a painless way to introduce programming before proceeding to complex concepts, he states that the joy of learning is more important than getting facts and information correct.

In the article "Research and Teaching in Art and Science", from the same magazine a year later, Bonacic writes a blueprint for a novel way of teaching art and science. He considers that, by programming computers with a new, complex and heuristic programming language, a common language for art and science can be created.


In his plan for a common curriculum he writes that, firstly, seminars would include "awareness of ethics, ethics of art and science, ethics and politics etc.". His second point was the inclusion of teaching 'generative principles' by utilising "generators and transformers" that generate a "predictable non-limited numbers of structures in an n-dimensional coordinate system, with the assumption that the parameters of the generator, transformer, and resulting pattern are controllable." [Bon 1977, 5]
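
A modest Processing sketch can hint at what such a parameter-controlled generator and transformer might mean in practice (a hypothetical reading of Bonacic's terms, not his system): a generator produces a family of points, a transformer maps them, and both are governed by parameters the learner can vary.

// Generator: a parametric curve producing a non-limited family of figures.
// Transformer: a rotation applied to every generated point.
float a = 3, b = 5;      // generator parameters - try other integers
float angle = 0.4;       // transformer parameter

void setup() {
  size(400, 400);
  background(255);
  translate(width / 2, height / 2);
  stroke(0);
  for (float t = 0; t < TWO_PI; t += 0.002) {
    float x = sin(a * t) * 150;                    // generate
    float y = sin(b * t) * 150;
    float rx = x * cos(angle) - y * sin(angle);    // transform
    float ry = x * sin(angle) + y * cos(angle);
    point(rx, ry);
  }
}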

In another article from the same 1977 issue, "Computergraphics for Interior design students at Purdue University", Wu and Willis describe how interior design students were being trained to use a hidden-line removal program that allows the student to receive electrostatic and/or ink plots of architectural forms and spaces as three-point perspective views. [Kin 1977, 16]

3.4 CONCLUSION

It can be inferred from this small list of past work (in bridging the two-culture divide) that ideas similar to those of the creators of 'Processing' were explored by various others since the early 1960s. Yet the initiation of the 'Processing' project in the first decade of the 2000s signifies the fact that all these older efforts never got institutionalised. Rather, what occurred (section 2.6.3) was a rise in the competence of designers and artists in using GUI-based point-and-click software mediums. In the current scenario (art/design institutions + the commercial world), being technologically able has come to mean a very narrow, domain-specific expertise in using some particular software product; it does not mean a wide sensitivity about explicitly structuring processes and interactions.

Looking at language itself as a technology, as Mumford has done, works very well in the context of the broader ideas around a product like 'Processing'. Working and prototyping with a high-level programming language opens up creative ways of accessing the entire digital communications infrastructure. This fact can be ascertained from the list of contributed libraries in section 1.5.2, which shows the different types of hardware and computational accessibility that can be downloaded and used. Viewing, more literally, the programming language itself as the human interface of digital technology is valid as well; otherwise digital technology is seen more narrowly as only the mass-produced consumer hardware.
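
How little stands between a sketch and the machine's communication hardware can be indicated with Processing's serial library (a core library rather than a contributed one, but the mechanism of importing and using a library is the same; whether a device is actually attached is, of course, an assumption):

// One import statement exposes the machine's serial ports to the sketch.
import processing.serial.*;

Serial port;

void setup() {
  println(Serial.list());   // enumerate the available hardware ports
  // If a device is attached, open the first port at 9600 baud:
  // port = new Serial(this, Serial.list()[0], 9600);
}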


4 - FINAL THOUGHTS

After describing the research methodology and the idea of 'sketching', Processing 'the artefact' was introduced in Section 1 as a manifestation of an ontologically designed domain of breakdowns. The meaning of the term 'design', in the context of its current use, was elaborated to make sense of the designing which occurs in the act of using the Processing IDE. Since 'Processing' foregrounds the coded processes, acting like a meta-tool, it was shown how it makes interaction with various media objects transparent. The transparency achieved was deemed essential for the design of unique new 'tools' and new 'media'. This transparency of interaction was contrasted with the standard GUI-based software which mimics older media and tools by invoking their functionalities.

Viewing 'Processing' as a meta-medium, whose content comprises pre-existing and novel media, was validated by a close scrutiny of its creators' ambitions, learning structure and syntax. Processing's adherence to ideas about creating specialised aids serving those who work within a 'domain of breakdowns' was demonstrated by highlighting its vocabulary of functions. This vocabulary, in which each function invokes specific algorithms, was shown to be the result of an anticipation of various future possibilities of breakdowns. Its base of library packages was listed to showcase the wide range of access to various computational processes. Further validation was achieved by describing the technical concepts of 'abstraction', 'ecosystem' and 'interfaces' in the context of Processing. The base was laid here for the later section, which would list historical references to demonstrate how computer operations and programming code encode existing socio-economic practices and conventions in their design. Notions of a collective and mutual 'networked learning' environment were explored by stressing the concepts of 'libraries' and open source. In this section Marx's ideas about 'valorisation' were brought in to understand the paradigmatic shifts in the creative labour process which a product like 'Processing' denotes.

Scrutinising the artefact concluded by contrasting the 'logic of selection', prevalent in popular consumer software products, with the 'essence of technology' as expounded by Heidegger. His idea that thinking about and confronting technology should occur in the domain of art was shown to lend credibility to this pedagogical medium for designers and artists. Conventional GUI-based software for designers and artists follows what Heidegger noted as the instrumental conception of technology, as a means to an end, and thereby severs man's access to the essence of technology. To attain this access, he wrote, technology should be understood as a 'revealing' which opens into the truth. 'Techne', the classical term, was used to denote not just craft-work but also the arts of the mind, especially the fine arts. He stated that 'knowing' and 'revealing' occurred when a learner developed an advanced level of confidence with the system, since a holistic 'knowing' contributed to the 'revealing'. He held this to be true even for modern technology, which reveals by unlocking the energy within nature. This process of revelation (an endless semiosis) allows all (in this case digital) processes to be seen as a standing-reserve. Section 1 ends by stating that 'Processing' provides the correct environment for attaining the essence of digital technology.


Section 2 opened with the question of locating human purpose in the design of technical systems and set out to seek contingencies. The central premise of the 'social determination of technology' theory was used to counter the 'technological determinism' point of view. The social conditions which allowed digital technology to be developed, deployed and used were listed through a chronology of historical activities occurring predominantly within the American capitalist-military-industrial structure. Digital technology was shown to have evolved from, and to be an inherent part of, the capitalist social structure, which allowed different people different levels of access. Notions of 'technical artefacts' being political or neutral were stated to be an incorrect way of dealing with the topic, since this denies human agency as an independent variable. The meanings of the characteristics of technical objects were examined, and digital technology was shown to be a political phenomenon in itself.

Relations of production, in aggregate, were concluded to be the core factor which creates a society's economic, legal and political superstructures. These superstructures, which form the social consciousness, were claimed to be the cause of aggregate and particular technical artefacts. Marx's ideas about the inaccuracy of describing a machine in terms of its parts, and of differentiating it from a tool on account of the source of power, were cited since they lay stress on looking at the historical element of the machine. This historical element was elaborated by glancing, superficially, at the deeper history of digital media using Lewis Mumford's ideas of the 'mega-machine'. It was also pointed out that the unavoidable roots of capitalist authoritarianism are deeply embedded in human involvement with technology and science.

Two contingencies were examined for their historical significance - Ivan Sutherland's 'Sketchpad' and Alan Kay's work in the 1970s. Both these contingencies were shown to be key in shaping the digital computer as a creative meta-medium, and the roots of the personal computer and the object-oriented programming paradigm were shown through them. It was shown how the evolution from time-sharing systems (which allowed the computer to do the work of many users) to the concept of a private, personal engagement with the computer (on a one-to-one basis) necessitated the rapid progress of graphic displays towards a graphical user interface that used symbols.

Aspects of Ivan Sutherland's research, the 'Sketchpad', which employed cathode ray displays and light pens as an interactive drawing application, were elaborated. Since the 'Sketchpad' stored the drawing as mathematical data, it demonstrated the future viability of the computer as a visual and virtual medium. This research work, apart from being the first manifestation of existing ideas of coupling a machine with a brain, was also noted for its literal prototyping of the 'object-oriented' programming paradigm. The shift in the perception of the computer, from a sophisticated calculator to a machine that simulated aspects of human intelligence, was referenced with many historical causes. Other aspects of this contingency were listed which qualify it as a proto new-media tool, since it added unique, previously non-existent features to simulations of old media. As a break from the earlier industrial media, where the hardware and software were the same, the separation of these two aspects was remarked upon at length.


The second contingency was made out to be the work of Alan Kay, which gave birth to the notion of the computer as a 'creativity enhancing medium' and not just a medium for engineering and scientific work. The direct roots of 'Processing' were shown to come from Alan Kay's vision of the computer as a ubiquitous platform that could sustain various other artistic media. The vision of a technical device along the lines of malleable materials like clay and paper, and unlike mass-produced consumer commodities like cars and television, was noted as validating its classification as a meta-medium. Being an assortment of different 'old media' simulations, Alan Kay's work was presented as the seed of all future new 'media tools' and new 'types of media'. Simultaneously, ideas about the two-culture divide (art and technology) were examined and the earliest cases of artists working with computers to create aesthetic objects were mentioned. 'Processing' was hence shown to be the latest instance that inherits from this lineage, and one that poses a radical reversal back from a 'simulated metaphorical' domain into the 'process-ual' domain. The final part of Section 2 attempted to locate deeper links in history which were important in defining the contemporary 'digital-technological era' that shaped this particular technical artefact ('Processing').

This section predominantly provided a history of the 'personal' computer and listed some of the socio-historical roots of technical concepts like 'abstraction' (Section 1.4) and the 'virtual machine'. This second section essentially attempted to show the social causes which created the technical preconditions (the internet, hardware architecture and the software paradigm) vital for a product like 'Processing' to emerge.

The final section used Ferdinand de Saussure's differentiation between the 'faculty of language' (the hardware) and 'the language' (the programming language) as the base for situating ideas about viewing 'the language' (a semiological institution) itself as technology. The evolution of language and script was traced through the river valley civilisations, population increase and improvements in the ordering of public organisation. It was mentioned how the use of script was pertinent in managing a social system based on centralised intelligence and specialised knowledge.

The secrecy aspect of the script was also mentioned, to connect the historical priestly class' hegemony with the current clandestine nature of 'digital technology', which enables totalitarian control of networks and public consent. Script had always allowed 'control from a distance', an aspect completely internalised in contemporary digital technology. Viewing 'the language' metaphorically as a container that shapes culture and effects cultural fabrication helps in understanding the concept of 'procedural literacy'.

The sub-section 'essence of computational media', after re-stating Heidegger's ideas about the essence of technology, goes on to explore ideas connected with teaching 'procedurality' to art and design students. Viewing programming languages, in an expanded way, as a universal medium for describing structure and process leads to a redefinition of 'media design' as the describing of a computational process (and not just the creating of a file in some media format using some software). Alan Kay's statements about the ability to write in a medium versus the ability to read a medium are elaborated to explain how digital literacy means the power to generate materials and tools for others to use, and not just to access the material created by others.


The sub-section on pedagogy lists some references to past work since the early 1960s which bridge the two-culture divide and are conceptually similar to the ideas of 'Processing'. All these references contradict the current convention of teaching computing to designers and artists through specific commercial software rather than as algorithmic processes for working with a medium that has its own unique conceptual possibilities. This section cites instances from recent history which look at creative computing as an act of describing a complex process and not just as expert usage of a metaphoric simulated environment for creating some 'media'. References are provided which critique the conventional popular paradigm of keeping the software user at a 'friendly' distance by not accommodating the considerable mental work required to describe a computational process.

This thesis concludes in complete agreement with the statement that "computer code is the most versatile, general process language ever created." [Mat 2005, 1]


BIBLIOGRAPHY

[Ban 2009] Banzi, Massimo. "Getting Started with Arduino" . (2009) . Print

[Bar 2007] Barnes, Susan B.. "Alan Kay: Transforming the Computer Into a Communication Medium” . Annals of the History of Computing, IEEE Volume: 29 , Issue: 2 (2007): 18 - 30 . PDF file

[Bon 1977] Bonacic, Vladimir. "Research and Teaching in Art and Science" in "Computer Graphics and Art" - edited by Grace Hertlein. Vol. 2, No. 3. August, (1977) . Scanned copy . PDF file

[Chu 1976] Chu, Kai. "Computer Graphics and High School Education" in "Computer Graphics and Art" - edited by Grace Hertlein. Vol. 1, No. 3. August, (1976) . Scanned copy . PDF file

[Die 1986] Dietrich, Frank. “Visual Intelligence: The First Decade of Computer Art (1965-1975)” . Leonardo, Vol. 19, No. 2 (1986) : 159-169 . JSTOR PDF file

[Fry 2007] Fry, Ben . "Visualizing Data" . (2007) . O’Reilly Media. Print

[Ger 2008] Gere, Charlie. "Digital Culture" . (2002, 2008) : Reaktion Books. Print

[Gos 1995] Gosling, James. "Java: an Overview " / The Java White paper / (February 1995) . PDF file

[Hei 1954] Heidegger, Martin. "The Question Concerning Technology" / translated from "Die Frage nach der Technik" in Vorträge und Aufsätze. / (1954) . PDF file

[Hil 2008] Hilpinen, Risto, "artefact", The Stanford Encyclopedia of Philosophy (Fall 2008 Edition), Edward N. Zalta (ed.)

[Hick 1991] Hickman, Craig. "Why Artists Should Program" . Leonardo, Vol. 24, No. 1 (1991) : 49-51. The MIT Press . JSTOR PDF file

[Kay 1977] Kay, Alan and Goldberg, Adele. "Personal Dynamic Media" . Computer 10(3):31–41. (March, 1977). PDF file

[Kay 1989] Kay, Alan. “User Interface: A Personal View” -- in The Art of Human-Computer Interface Design. (1989) . Addison-Wesley . PDF file

[Kin 1977] Wu, Kingsley K. & Willis, Victoris J. "Computergraphics for Interior design students at Purdue University" in "Computer Graphics and Art" - edited by Grace Hertlein. Vol. 2, No. 3. August, (1977) . Scanned copy . PDF file


[kit1 1996] Kittler, Friedrich . "The History of Communication Media" . Editors - Arthur and Marilouise Kroker, in: Ctheory. (June 30, 1996). PDF file

[kit2 1999] Kittler, Friedrich Adolf. "On the Implementation of Knowledge- Toward a Theory of Hardware." in: Josephine Bosma (Editor). Readme! (nettime): ASCII culture and the revenge of knowledge, Automedia. (1999): 60 - 68. PDF file

[kit3 1992] Kittler, Friedrich . "There is No Software" . in Stanford Literature Review. 9,1, Spring (1992) : 81-90 . PDF file

[kit4 2001] Kittler, Friedrich and Ogger, Sara. "Computer Graphics: A Semi-Technical Introduction" . Grey Room, No. 2 ( 2001) : 30-45 . MIT PRESS . JSTOR PDF file

[Lat 2008] Latour, Bruno. "A Cautious Prometheus? A Few Steps Toward a Philosophy of Design (with Special Attention to Peter Sloterdijk) ". Keynote lecture for the Networks of Design meeting of the Design History Society Falmouth, Cornwall, (3rd September 2008) . PDF file

[Lic 1968] Licklider, J.C.R. & Taylor, R. "The computer as a communication device" . Science and Technology . (April, 1968). PDF file

[Lic 1960] Licklider, J.C.R. "Man-Computer Symbiosis". IRE Transactions on Human Factors in Electronics, volume HFE-1 (March, 1960) : 4-11 . PDF file

[Lin 2010] Corbet, J., Kroah-Hartman, G. and McPherson, A. "Linux Kernel Development. How Fast it is Going, Who is Doing It, What They are Doing, and Who is Sponsoring It" . (December 2010) . PDF file

[Mac 1984] MacKenzie, Donald. "Marx and the Machine" . Technology and Culture, Vol. 25, No. 3. (Jul., 1984): 473-502. JSTOR PDF file

[Mae 2003] Maeda, John. "The Infinite Loop" - in The Ars Electronica catalogue 2003 - "Code - The Language of Our Time". Ed. Gerfried Stocker, Christine Schöpf. (2003) . Hatje Cantz . Print

[Mas 2004] Mason, Catherine . “A Computer in the Art Room” . Futures past: Twenty years of arts computing . CHArt conference (2004) . PDF file

[Mat 2005] Mateas, M. "Procedural Literacy: Educating the New Media Practitioner." On The Horizon. Special Issue. Future of Games, Simulations and Interactive Media in Learning Contexts, v13, n1 (2005). PDF file

[Manov 2001] Manovich, Lev. "The Language of New Media." (2001) : MIT Press . Print

[Manov 2008] Manovich, Lev. "Software takes command". (Version 11/20/2008) . Creative Commons Microsoft Word file.


[Mum 1967] Mumford, Lewis. "Technics and Human Development." The Myth of the Machine (Volume 1) . Harcourt Brace Jovanovich . (1967) . Print

[Nak 2001] Nake, Frieder. "Data, information, and knowledge - a semiotic view of phenomena of organization" . Organizational Semiotics(2001) : 41-50 . PDF file

[NaG 2001] Nake, Frieder + Grabowski, Susanne. "Human-computer interaction viewed as pseudo-communication" . Knowledge-Based Systems 14 (2001): 441 - 447. PDF file

[NaG 2006] Nake, Frieder and Grabowski, Susanne. "The Interface as Sign and as Aesthetic Event." -- in Paul Fishwick (ed.), “Aesthetic Computing” . ( 2006): 53-70 . MIT Press . PDF file

[Nel 1974] Nelson, Ted. "Computer Lib" Self-published. (1974). / 2nd ed., Redmond, Washington: Tempus Books/Microsoft Press, 1987. 302-338 / PDF file

[Pic 2003] Pickard, Gabriel. "Beyond the Computer" - in Sarai Reader 2003: Shaping Technologies. (2003) : 264-272 . PDF file

[ReF 2003] Reas, Casey and Fry, Ben. "Processing" - in The Ars Electronica catalogue 2003 - "Code - The Language of Our Time". Ed. Gerfried Stocker, Christine Schöpf. (2003) . Hatje Cantz . Print

[ReF 2007] Reas, Casey and Fry, Ben. "Processing: a programming handbook for visual designers and artists" (2007) . MIT Press . Print

[Res 2003] Reas, Casey. "Programming Media" - in The Ars Electronica catalogue 2003 - "Code - The Language of Our Time". Ed. Gerfried Stocker, Christine Schöpf. (2003) . Hatje Cantz . Print

[Rev 2005] Reeves, Jack W. "What Is Software Design: 13 Years Later" / based on an orig. 1992 article. / -- in Robert C. Martin's book "Agile Software Development: Principles, Patterns, and Practices" . (2005) . PDF file

[ReW 2010] Reas, Casey , McWilliams, Chandler & Lust. "Form+Code in Design, Art, and Architecture" . (2010) Princeton Architectural Press . Print

[Sau 1910] de Saussure, Ferdinand. "Third Course of Lectures on General Linguistics" / orig. publ. Pergamon Press, 1993. / (1910) . PDF file

[Smi 1997] Smith, Alvy Ray. "Digital Paint Systems. Historical Overview" (May 30, 1997) : Microsoft Tech Memo 141. PDF file

[Suth 1963] Sutherland, Ivan E. “Sketchpad, A Man-Machine Graphical Communication System.” Ph.D Thesis. MIT. (1963). PDF file


[Steph 1999] Stephenson, Neal."In the Beginning was the Command Line". http://www.cryptonomicon.com/beginning.html (1999) . TXT file

[Ter 2009 ] Terzidis, Kostas. "Algorithms for Visual Design Using the Processing Language" . (2009) . Wiley Publishing, Inc. Print

[WFlo 1986] Winograd, T. & Flores, F. "Understanding computers and cognition. A new foundation for design." (1986) Norwood, NJ: Ablex . Print

[Win 1986] Winner, Langdon. "Do Artifacts Have Politics?" . from "The whale and the reactor: a search for limits in an age of high technology." (1986) : 19-39. University of Chicago Press . PDF file