Declarative Programming Techniques


Page 1: Declarative Programming Techniques

S. Haridi and P. Van Roy 1

Declarative Programming Techniques

Seif Haridi

KTH

Peter Van Roy

UCL

Page 2: Declarative Programming Techniques


Overview

• What is declarativeness?
– Classification, advantages for large and small programs
• Iterative and recursive programs
• Programming with lists and trees
– Lists, accumulators, difference lists, trees, parsing, drawing trees
• Reasoning about efficiency
– Time and space complexity, big-oh notation, recurrence equations
• Higher-order programming
– Basic operations, loops, data-driven techniques, laziness, currying
• User-defined data types
– Dictionary, word frequencies
– Making types secure: abstract data types
• The real world
– File and window I/O, large-scale program structure, more on efficiency
• Limitations and extensions of declarative programming

Page 3: Declarative Programming Techniques


Declarative operations (1)

• An operation is declarative if whenever it is called with the same arguments, it returns the same results independent of any other computation state

• A declarative operation is:
– Independent (depends only on its arguments, nothing else)
– Stateless (no internal state is remembered between calls)
– Deterministic (calls with the same arguments always give the same results)

• Declarative operations can be composed together to yield other declarative components
– All basic operations of the declarative model are declarative, and combining them always gives declarative components

Page 4: Declarative Programming Techniques


Declarative operations (2)

[Figure: a declarative operation takes Arguments and produces Results, isolated from the rest of the computation]

Page 5: Declarative Programming Techniques


Why declarative components (1)

• There are two reasons why they are important:

• (Programming in the large) A declarative component can be written, tested, and proved correct independently of other components and of its own past history.
– The reasoning complexity of a program composed of declarative components is the sum of the complexities of the components
– In general, the reasoning complexity of programs composed of nondeclarative components explodes because of the intimate interaction between components
• (Programming in the small) Programs written in the declarative model are much easier to reason about than programs written in more expressive models (e.g., an object-oriented model).
– Simple algebraic and logical reasoning techniques can be used

Page 6: Declarative Programming Techniques


Why declarative components (2)

• Since declarative components are mathematical functions, algebraic reasoning is possible, i.e. substituting equals for equals
• The declarative model of chapter 4 guarantees that all programs written in it are declarative
• Declarative components can be written in models that allow stateful data types, but then there is no guarantee

Given   f(a) = a²
we can replace f(a) in any other equation:
b = 7f(a)² becomes b = 7a⁴

Page 7: Declarative Programming Techniques


Classification of declarative programming

[Figure: classification tree]
Declarative programming
– Descriptive
– Programmable
  – Observational: nondeterministic logic programming
  – Definitional: declarative model, functional programming, deterministic logic programming

• The word declarative means many things to many people. Let’s try to eliminate the confusion.

• The basic intuition is to program by defining the what without explaining the how

Page 8: Declarative Programming Techniques


Descriptive language

s ::= skip                  empty statement
   |  x = y                 variable-variable binding
   |  x = record            variable-value binding
   |  s1 s2                 sequential composition
   |  local x in s1 end     declaration

Other descriptive languages include HTML and XML

Page 9: Declarative Programming Techniques


Descriptive language

<person id=”530101-xxx”>
  <name> Seif </name>
  <age> 48 </age>
</person>

Other descriptive languages include HTML and XML

Page 10: Declarative Programming Techniques


Kernel language

The following defines the syntax of a statement; s denotes a statement:

s ::= skip                                      empty statement
   |  x = y                                     variable-variable binding
   |  x = v                                     variable-value binding
   |  s1 s2                                     sequential composition
   |  local x in s1 end                         declaration
   |  proc {x y1 … yn} s1 end                   procedure introduction
   |  if x then s1 else s2 end                  conditional
   |  {x y1 … yn}                               procedure application
   |  case x of pattern then s1 else s2 end     pattern matching

Page 11: Declarative Programming Techniques


Why the KL is declarative

• All basic operations are declarative

• Given that the components (substatements) are declarative,
– sequential composition
– local statement
– procedure definition
– procedure call
– if statement
– try statement
are all declarative

Page 12: Declarative Programming Techniques


Structure of this chapter

• Iterative computation
• Recursive computation
• Thinking inductively
• Lists and trees
• Control abstraction
• Higher-order programming (procedural abstraction)
• User-defined data types
• Secure abstract data types (data abstraction)
• Modularity
• Functors and modules
• Time and space complexity
• Nondeclarative needs
• Limits of declarative programming

Page 13: Declarative Programming Techniques


Iterative computation

• An iterative computation is one whose execution stack is bounded by a constant, independent of the length of the computation

• Iterative computation starts with an initial state S0, and transforms the state in a number of steps until a final state Sfinal is reached:

S0 → S1 → … → Sfinal

Page 14: Declarative Programming Techniques


The general scheme

fun {Iterate Si}
   if {IsDone Si} then Si
   else Si+1 in
      Si+1 = {Transform Si}
      {Iterate Si+1}
   end
end

• IsDone and Transform are problem dependent

Page 15: Declarative Programming Techniques


The computation model

• STACK : [ R={Iterate S0}]

• STACK : [ S1 = {Transform S0},R={Iterate S1} ]

• STACK : [ R={Iterate S1}]

• STACK : [ Si+1 = {Transform Si},R={Iterate Si+1} ]

• STACK : [ R={Iterate Si+1}]

Page 16: Declarative Programming Techniques


Newton’s method for thesquare root of a positive real number

• Given a real number x, start with a guess g, and improve this guess iteratively until it is accurate enough

• The improved guess g’ is the average of g and x/g:

g' = (g + x/g)/2

g'² − x = ((g + x/g)/2)² − x = (g² − 2x + x²/g²)/4 = ((g² − x)/(2g))²

For g' to be a better guess than g: |g'² − x| < |g² − x|,
i.e. ((g² − x)/(2g))² < |g² − x|, which holds when |g² − x| < (2g)²

Page 17: Declarative Programming Techniques


Newton’s method for thesquare root of a positive real number

• Given a real number x, start with a guess g, and improve this guess iteratively until it is accurate enough

• The improved guess g’ is the average of g and x/g:• Accurate enough is defined as:

| x – g2 | / x < 0.00001

Page 18: Declarative Programming Techniques


SqrtIter

fun {SqrtIter Guess X}
   if {GoodEnough Guess X} then Guess
   else
      Guess1 = {Improve Guess X}
   in
      {SqrtIter Guess1 X}
   end
end

• Compare to the general scheme:
– The state is the pair Guess and X
– IsDone is implemented by the procedure GoodEnough
– Transform is implemented by the procedure Improve

Page 19: Declarative Programming Techniques


The program version 1

fun {Sqrt X}
   Guess = 1.0
in
   {SqrtIter Guess X}
end

fun {SqrtIter Guess X}
   if {GoodEnough Guess X} then Guess
   else {SqrtIter {Improve Guess X} X}
   end
end

fun {Improve Guess X}
   (Guess + X/Guess)/2.0
end

fun {GoodEnough Guess X}
   {Abs X - Guess*Guess}/X < 0.00001
end

Page 20: Declarative Programming Techniques


Using local procedures

• The main procedure Sqrt uses the helper procedures SqrtIter, GoodEnough, Improve, and Abs

• SqrtIter is only needed inside Sqrt
• GoodEnough and Improve are only needed inside SqrtIter
• Abs (absolute value) is a general utility

• The general idea is that helper procedures should not be visible globally, but only locally

Page 21: Declarative Programming Techniques


Sqrt version 2

local
   fun {SqrtIter Guess X}
      if {GoodEnough Guess X} then Guess
      else {SqrtIter {Improve Guess X} X} end
   end
   fun {Improve Guess X}
      (Guess + X/Guess)/2.0
   end
   fun {GoodEnough Guess X}
      {Abs X - Guess*Guess}/X < 0.000001
   end
in
   fun {Sqrt X}
      Guess = 1.0
   in
      {SqrtIter Guess X}
   end
end

Page 22: Declarative Programming Techniques


Sqrt version 3

• Define GoodEnough and Improve inside SqrtIter

local
   fun {SqrtIter Guess X}
      fun {Improve}
         (Guess + X/Guess)/2.0
      end
      fun {GoodEnough}
         {Abs X - Guess*Guess}/X < 0.000001
      end
   in
      if {GoodEnough} then Guess
      else {SqrtIter {Improve} X} end
   end
in
   fun {Sqrt X}
      Guess = 1.0
   in
      {SqrtIter Guess X}
   end
end

Page 23: Declarative Programming Techniques


Sqrt version 3 (continued)

The program has a single drawback: on each iteration two procedure values are created, one for Improve and one for GoodEnough

Page 24: Declarative Programming Techniques


Sqrt final version

fun {Sqrt X}
   fun {Improve Guess}
      (Guess + X/Guess)/2.0
   end
   fun {GoodEnough Guess}
      {Abs X - Guess*Guess}/X < 0.000001
   end
   fun {SqrtIter Guess}
      if {GoodEnough Guess} then Guess
      else {SqrtIter {Improve Guess}} end
   end
   Guess = 1.0
in
   {SqrtIter Guess}
end

The final version is a compromise between abstraction and efficiency

Page 25: Declarative Programming Techniques


From a general schemeto a control abstraction (1)

fun {Iterate Si}
   if {IsDone Si} then Si
   else Si+1 in
      Si+1 = {Transform Si}
      {Iterate Si+1}
   end
end

• IsDone and Transform are problem dependent

Page 26: Declarative Programming Techniques


From a general schemeto a control abstraction (2)

fun {Iterate S IsDone Transform}
   if {IsDone S} then S
   else S1 in
      S1 = {Transform S}
      {Iterate S1 IsDone Transform}
   end
end

Compare with the problem-specific scheme:

fun {Iterate Si}
   if {IsDone Si} then Si
   else Si+1 in
      Si+1 = {Transform Si}
      {Iterate Si+1}
   end
end

Page 27: Declarative Programming Techniques


Sqrt using the Iterate abstraction

fun {Sqrt X} fun {Improve Guess} (Guess + X/Guess)/2.0 end fun {GoodEnough Guess} {Abs X - Guess*Guess}/X < 0.000001 end Guess = 1.0in {Iterate Guess GoodEnough Improve}end

Page 28: Declarative Programming Techniques


Sqrt using the Iterate abstraction

fun {Sqrt X}
   {Iterate 1.0
      fun {$ G} {Abs X - G*G}/X < 0.000001 end
      fun {$ G} (G + X/G)/2.0 end}
end

This could become a linguistic abstraction
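As a rough illustration of the same control abstraction outside Oz, here is a Python transcription (a sketch, not part of the slides; the names are mine, and a loop replaces the tail call because Python does not optimize tail recursion):

```python
def iterate(state, is_done, transform):
    """Generic iteration: apply transform until is_done(state) holds."""
    while not is_done(state):
        state = transform(state)
    return state

def sqrt(x):
    # The two lambdas play the role of the anonymous fun {$ G} ... end values.
    return iterate(1.0,
                   lambda g: abs(x - g * g) / x < 0.000001,
                   lambda g: (g + x / g) / 2.0)
```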

Page 29: Declarative Programming Techniques


Recursive computations

• A recursive computation is one whose stack size grows linearly with the size of some input data

• Consider a secure version of the factorial function:

fun {Fact N}
   if N==0 then 1
   elseif N>0 then N*{Fact N-1}
   else raise domainError end
   end
end

• This is similar to the definition in chapter 3, but guarded against domain errors (and looping) by raising an exception
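The guarded factorial can be transcribed into Python (my sketch, with ValueError standing in for the slide's domainError exception):

```python
def fact(n):
    """Recursive factorial, guarded against domain errors."""
    if n == 0:
        return 1
    elif n > 0:
        return n * fact(n - 1)   # stack grows linearly with n
    else:
        raise ValueError("domainError: factorial of a negative number")
```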

Page 30: Declarative Programming Techniques


Recursive computation

proc {Fact N R}
   if N==0 then R=1
   elseif N>0 then R1 in
      {Fact N-1 R1}
      R = N*R1
   else raise domainError end
   end
end

Page 31: Declarative Programming Techniques


Execution stack

• [{Fact 5 r0}]

• [{Fact 4 r1} , r0=5* r1]

• [{Fact 3 r2}, r1=4* r2 , r0=5* r1]

• [{Fact 2 r3}, r2=3* r3 , r1=4* r2 , r0=5* r1]

• [{Fact 1 r4}, r3=2* r4 , r2=3* r3 , r1=4* r2 , r0=5* r1]

• [{Fact 0 r5}, r4=1* r5, r3=2* r4 , r2=3* r3 , r1=4* r2 , r0=5* r1]

• [r5=1, r4=1* r5, r3=2* r4 , r2=3* r3 , r1=4* r2 , r0=5* r1]

• [r4=1* 1, r3=2* r4 , r2=3* r3 , r1=4* r2 , r0=5* r1]

• [r3=2* 1 , r2=3* r3 , r1=4* r2 , r0=5* r1]

• [r2=3* 2 , r1=4* r2 , r0=5* r1]

Page 32: Declarative Programming Techniques


Substitution-basedabstract machine

• The abstract machine we saw in Chapter 4 is based on environments– It is nice for a computer, but awkward for hand calculation

• We make a slight change to the abstract machine so that it is easier for a human to calculate with– Use substitutions instead of environments: substitute identifiers by

their store entities

– Identifiers go away when execution starts; we manipulate store variables directly

Page 33: Declarative Programming Techniques


Iterative Factorial

• State: {Fact N}
• (0,{Fact 0}) → (1,{Fact 1}) → … → (N,{Fact N})
• In general: (I,{Fact I})
• Termination condition: I equal to N

fun {IsDone I FI} I == N end

• Transformation: (I,{Fact I}) → (I+1, (I+1)*{Fact I})

proc {Transform I FI I1 FI1}
   I1 = I+1
   FI1 = I1*FI
end

Page 34: Declarative Programming Techniques


Iterative Factorial

• State: {Fact N}
• (0,{Fact 0}) → (1,{Fact 1}) → … → (N,{Fact N})
• Transformation: (I,{Fact I}) → (I+1, (I+1)*{Fact I})

fun {Fact N}
   fun {FactIter I FI}
      if I==N then FI
      else {FactIter I+1 (I+1)*FI} end
   end
in
   {FactIter 0 1}
end
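The same accumulator scheme in Python (my transcription; the loop plays the role of the tail call):

```python
def fact(n):
    """Iterative factorial: state (i, fi) with invariant fi == i!."""
    i, fi = 0, 1
    while i != n:
        i, fi = i + 1, (i + 1) * fi   # (I,FI) -> (I+1, (I+1)*FI)
    return fi
```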

Page 35: Declarative Programming Techniques


Iterative Factorial

• State: {Fact N}
• (0,{Fact 0}) → (1,{Fact 1}) → … → (N,{Fact N})
• Transformation: (I,{Fact I}) → (I+1, (I+1)*{Fact I})

fun {Fact N}
   {Iterate t(0 1)
      fun {$ t(I FI)} I == N end
      fun {$ t(I FI)} J = I+1 in t(J J*FI) end}
end

Page 36: Declarative Programming Techniques


Iterative Factorial

• State: {Fact N}
• (1,5) → (1*5,4) → … → ({Fact N},0)
• In general: (I,J)
• Invariant: I*{Fact J} == (I*J)*{Fact J-1} == {Fact N}
• Termination condition: J equal to 0

fun {IsDone I J} J == 0 end

• Transformation: (I,J) → (I*J, J-1)

proc {Transform I J I1 J1}
   I1 = I*J
   J1 = J-1
end

Page 37: Declarative Programming Techniques


Programmingwith lists and trees

• Defining types

• Simple list functions

• Converting recursive to iterative functions

• Deriving list functions from type specifications

• State and accumulators

• Difference lists

• Trees

Page 38: Declarative Programming Techniques


User defined data types

• A list is defined as a special subset of the record datatype

• A list Xs is either– X|Xr where Xr is a list, or

– nil

• Other subsets of the record datatype are also useful; for example, one may define a binary tree (btree) to be:
– node(key:K value:V left:LT right:RT) where LT and RT are binary trees, or

– leaf

• This begs for a notation to define concisely subtypes of records

Page 39: Declarative Programming Techniques


Defining types

list ::= value | list [] nil
• defines a list type where the elements can be of any type

list T ::= T | list T [] nil
• defines a type function that, given the parameter type T, returns a type, e.g. list int

btree T ::= node(key: literal value:T left: btree T right: btree T)
         [] leaf(key: literal value:T)

• Procedure types are denoted by proc {T1 … Tn}
• Function types are denoted by fun {T1 … Tn}: T, which is equivalent to proc {T1 … Tn T}
• Example: fun {list list}: list

Page 40: Declarative Programming Techniques


Lists

• General lists have the following definition:

list T ::= T | list T [] nil

• The most useful elementary procedures on lists can be found in the Base module List of the Mozart system
• Induction method on lists — assume we want to prove a property P(Xs) for all lists Xs:
1. Basis: prove P(Xs) for Xs equal to nil, [X], and [X Y]
2. Induction step: assume P(Xs) holds, and prove P(X|Xs) for arbitrary X of type T

Page 41: Declarative Programming Techniques


Constructive method forprograms on lists

• General Lists have the following definitionlist T ::= T | list T [] nil

• The task is to write a program {Task Xs1 … Xsn}

1. Select one or more of the arguments Xsi

2. Construct the task for Xsi equals to nil, [X], and [X Y]

3. The recursive step: assume {Task … Xsi …} is constructed, and design the program for {Task … X|Xsi …} for arbitrary X of type T

Page 42: Declarative Programming Techniques


Simple functions on lists

• Some of these functions exist in the library module List:
1. {Nth Xs N}: returns the Nth element of Xs
2. {Append Xs Ys}: returns a list which is the concatenation of Xs followed by Ys
3. {Reverse Xs}: returns the elements of Xs in reverse order, e.g. {Reverse [1 2 3]} is [3 2 1]
4. Sorting lists: MergeSort
5. Generic operations on lists, e.g. performing an operation on all the elements of a list, or filtering a list with respect to a predicate P

Page 43: Declarative Programming Techniques


The Nth function

• Define a function that gets the Nth element of a list

• Nth is of type fun {$ list T int}: T
• Reasoning: select N; there are two cases, N=1 and N>1:
• N=1: {Nth Xs 1} → Xs.1
• N>1: assume we have the solution for {Nth Xr N-1} on the smaller list Xr; then {Nth X|Xr N} → {Nth Xr N-1}

fun {Nth Xs N}
   X|Xr = Xs
in
   if N==1 then X
   elseif N>1 then {Nth Xr N-1}
   end
end

Page 44: Declarative Programming Techniques


The Nth function

fun {Nth Xs N}
   X|Xr = Xs
in
   if N==1 then X
   elseif N>1 then {Nth Xr N-1}
   end
end

fun {Nth Xs N}
   if N==1 then Xs.1
   elseif N>1 then {Nth Xs.2 N-1}
   end
end

Page 45: Declarative Programming Techniques


The Nth function

• Define a function that gets the Nth element of a list

• Nth is of type fun {$ list T int}: T

fun {Nth Xs N}
   if N==1 then Xs.1
   elseif N>1 then {Nth Xs.2 N-1}
   end
end

• There are two situations where the program fails:
– N > length of Xs (we reach a situation where Xs is nil), or
– N is not positive (we reach the missing else case)

• Getting the nth element takes time proportional to n
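A Python sketch of Nth (my transcription, not the slides' code; Python lists are arrays, so xs[1:] copies the tail rather than sharing it as Xs.2 does):

```python
def nth(xs, n):
    """Return the nth element of xs, 1-based, in the style of the slide."""
    if n == 1:
        return xs[0]                 # Xs.1
    elif n > 1:
        return nth(xs[1:], n - 1)    # recurse on the tail, like Xs.2
    else:
        raise ValueError("n must be positive")   # the missing else case
```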

Page 46: Declarative Programming Techniques


The Member function

• Member is of type fun {$ value list value}: bool

fun {Member E Xs}
   case Xs
   of nil then false
   [] X|Xr then
      if X==E then true else {Member E Xr} end
   end
end

• X==E orelse {Member E Xr} is equivalent to if X==E then true else {Member E Xr} end
• In the worst case the whole list Xs is traversed, i.e. the worst-case behavior is proportional to the length of Xs; on average half the list is traversed

Page 47: Declarative Programming Techniques


The Append function

fun {Append Xs Ys}

case Xs

of nil then Ys

[] X|Xr then X|{Append Xr Ys}

end

end

• The inductive reasoning is on the first argument Xs

• Appending Xs and Ys is proportional to the length of the first list

• declare Xs0 = [1 2] Ys = [a b] Zs0 = {Append Xs0 Ys}

• Observe that Xs0, Ys and Zs0 exist after Append

• A new copy of Xs0, call it Xs0’, is constructed with an unbound variable attached to the end: 1|2|X’, thereafter X’ is bound to Ys

Page 48: Declarative Programming Techniques


The Append function

proc {Append Xs Ys Zs}
   case Xs
   of nil then Zs = Ys
   [] X|Xr then Zr in
      Zs = X|Zr
      {Append Xr Ys Zr}
   end
end

• declare Xs0 = [1 2] Ys = [a b] Zs0 = {Append Xs0 Ys}

• Observe that Xs0, Ys and Zs0 exist after Append

• A new copy of Xs0, call it Xs0’, is constructed with an unbound variable attached to the end: 1|2|X’, thereafter X’ is bound to Ys
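The tail-sharing behavior is easiest to see with explicit cons cells. In this hypothetical Python representation (my own choice: (head, tail) tuples with None as nil), append copies the spine of the first list and makes its last tail refer directly to the second, exactly the 1|2|X' picture above:

```python
def append(xs, ys):
    """Append two cons lists: copy the spine of xs, share ys unchanged."""
    if xs is None:                    # nil case: Zs = Ys
        return ys
    head, tail = xs
    return (head, append(tail, ys))   # fresh cell, like Zs = X|Zr

def from_list(l):
    """Build a cons list from a Python list."""
    out = None
    for x in reversed(l):
        out = (x, out)
    return out

def to_list(xs):
    """Turn a cons list back into a Python list."""
    out = []
    while xs is not None:
        out.append(xs[0])
        xs = xs[1]
    return out
```

Note that the result's tail is the second list itself, not a copy, so append runs in time proportional to the length of the first list only.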

Page 49: Declarative Programming Techniques


Append execution (overview)

Stack: [{Append 1|2|nil [a b] zs0}] Store: {zs0, ...}

Stack: [ {Append 2|nil [a b] zs1} ] Store: {zs0 = 1|zs1, zs1, ... }

Stack: [ {Append nil [a b] zs2} ] Store: {zs0 = 1|zs1, zs1=2|zs2, zs2, ...}

Stack: [ zs2 = [a b] ] Store: {zs0 = 1|zs1, zs1=2|zs2, zs2, ...}

Stack: [ ] Store: {zs0 = 1|zs1, zs1=2|zs2, zs2= a|b|nil, ...}

Page 50: Declarative Programming Techniques


Reverse

fun {Reverse Xs}

case Xs

of nil then nil

[] X|Xr then {Append {Reverse Xr} [X]}

end

end

[Figure: Xs0 = 1|Xs1 with Xs1 = [2 3 4]; the reverse of Xs1 is [4 3 2]; appending [4 3 2] and [1] gives the result]

Page 51: Declarative Programming Techniques


Length

fun {Length Xs}

case Xs

of nil then 0

[] X|Xr then 1+{Length Xr}

end

end

• Inductive reasoning on Xs

Page 52: Declarative Programming Techniques


Merging two sorted lists

• Merging two sorted lists

• {Merge [3 5 10] [2 5 6]} → [2 3 5 5 6 10]
• fun {$ list T list T}: list T, where T is either int, float, or atom

fun {Merge Xs Ys}
   case Xs # Ys
   of nil # Ys then Ys
   [] Xs # nil then Xs
   [] (X|Xr) # (Y|Yr) then
      if X =< Y then X|{Merge Xr Ys}
      else Y|{Merge Xs Yr} end
   end
end

Page 53: Declarative Programming Techniques


Sorting with Mergesort

• {MergeSort list T }: list T ; T is either int, float, atom• MergeSort uses a divide-and-conquer strategy

1. Split the list into two smaller lists of roughly equal size

2. Use MergeSort (recursively) to sort the two smaller lists

3. Merge the two sorted lists to get the final result

Page 54: Declarative Programming Techniques


Sorting with Mergesort

[Figure: mergesort execution — the list L is split into L1 and L2, then into L11, L12, L21, L22; the sorted pieces S11, S12, S21, S22 are merged into S1 and S2, and finally into the sorted list S]

Page 55: Declarative Programming Techniques


Sorting with Mergesort

• {MergeSort list T}: list T; T is either int, float, or atom
• MergeSort uses a divide-and-conquer strategy:
1. Split the list into two smaller lists of roughly equal size
2. Use MergeSort recursively to sort the two smaller lists
3. Merge the two sorted lists to get the final result

fun {MergeSort Xs}
   case Xs
   of nil then nil
   [] [X] then Xs
   else Ys Zs in
      {Split Xs Ys Zs}
      {Merge {MergeSort Ys} {MergeSort Zs}}
   end
end

Page 56: Declarative Programming Techniques


Split

proc {Split Xs Ys Zs}
   case Xs
   of nil then Ys = nil Zs = nil
   [] [X] then Ys = Xs Zs = nil
   [] X1|X2|Xr then Yr Zr in
      Ys = X1|Yr
      Zs = X2|Zr
      {Split Xr Yr Zr}
   end
end
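The three steps can be sketched in Python (my transcription; as in the Oz Split, the elements are distributed alternately over the two halves):

```python
def merge(xs, ys):
    """Merge two sorted lists into one sorted list."""
    if not xs:
        return ys
    if not ys:
        return xs
    if xs[0] <= ys[0]:
        return [xs[0]] + merge(xs[1:], ys)
    return [ys[0]] + merge(xs, ys[1:])

def split(xs):
    """Distribute the elements alternately over two lists."""
    return xs[0::2], xs[1::2]

def merge_sort(xs):
    """Divide and conquer: split, sort recursively, merge."""
    if len(xs) <= 1:
        return xs
    ys, zs = split(xs)
    return merge(merge_sort(ys), merge_sort(zs))
```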

Page 57: Declarative Programming Techniques


Exercise

• Merge sort, described above, is an example of the divide-and-conquer problem-solving strategy. A simpler (but less efficient) sorting algorithm is insertion sort, defined as follows:

1. To sort a list Xs of zero or one element, return Xs

2. To sort a list Xs of the form X1|X2|Xr,

1. sort X2|Xr to get Ys

2. insert X1 in the right position in Ys, to get the result

Page 58: Declarative Programming Techniques


Converting recursiveinto iterative computation

• Consider the view of state transformation instead: S0 → S1 → S2 → … → Si → … → Sf

• At S0 no elements are counted

• At S1 one element is counted

• At Si i elements are counted

• At Sn, the final state, all elements are counted

• The state is in general a pair (I, Ys)

• I is the length of the initial prefix of the list Xs that is counted, and Ys is the suffix of Xs that remains to count

Consider:

% Length :: fun {$ list T}: int
fun {Length Xs}
   case Xs
   of nil then 0
   [] X|Xr then 1+{Length Xr}
   end
end

Page 59: Declarative Programming Techniques


Converting recursiveinto iterative computation

• The state is in general a pair (I, Ys)

• I is the length of the initial prefix of the list Xs that is counted, and Ys is the suffix of Xs that remains to count

fun {IterLength I Xs}
   case Xs
   of nil then I
   [] X|Xr then {IterLength I+1 Xr}
   end
end

fun {Length Xs}
   {IterLength 0 Xs}
end

[Figure: Xs = x1 x2 … xi xi+1 … xn; the prefix x1 … xi has been counted, the suffix Ys = xi+1 … xn remains]

• Assume Ys is Y|Yr
• (I, Y|Yr) → (I+1, Yr)
• {Length Xs} = I + {Length Y|Yr} = (I+1) + {Length Yr} = … = N + {Length nil}
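In Python (my sketch) the accumulator I travels as an argument, so each call finishes with no pending work; note that CPython does not turn the tail call into a loop the way Mozart does:

```python
def iter_length(i, xs):
    """State (i, xs): i elements counted so far, xs remain to count."""
    if not xs:
        return i
    return iter_length(i + 1, xs[1:])   # tail call: no pending multiplication

def length(xs):
    return iter_length(0, xs)
```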

Page 60: Declarative Programming Techniques


State invariants

• Given a state SI that is (I, Ys): the property PP(SI) defined as {Length Xs} = I + {Length Ys} is called a state invariant

fun {IterLength I Xs}
   case Xs
   of nil then I
   [] X|Xr then {IterLength I+1 Xr}
   end
end

fun {Length Xs}
   {IterLength 0 Xs}
end

[Figure: Xs = x1 x2 … xi xi+1 … xn, with remaining suffix Ys = xi+1 … xn]

• Induction using state invariants:
1. Prove PP(S0), where S0 = (0, Xs)
2. Assume PP(SI), and prove PP(SI+1)
3. Conclusion: PP(SI) holds for all I (in particular, PP(Sfinal) holds)

Page 61: Declarative Programming Techniques


Reverse

fun {Reverse Xs}

case Xs

of nil then nil

[] X|Xr then {Append {Reverse Xr} [X]}

end

end

• This program does a recursive computation

• Let us define another program that is iterative (Initial list is Xs)

• (nil, x1|x2|…|xi|xi+1|…|xn|nil)
• (x1|nil, x2|…|xn|nil)
• (x2|x1|nil, x3|…|xn|nil)

Page 62: Declarative Programming Techniques


Reverse

fun {IterReverse Rs Xs}
   case Xs
   of nil then Rs
   [] X|Xr then {IterReverse X|Rs Xr}
   end
end

fun {Reverse Xs}
   {IterReverse nil Xs}
end
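The same transformation in Python (my transcription, written as the loop an iterative computation amounts to):

```python
def reverse(xs):
    """State (rs, xs): rs is the reversed prefix, xs the remaining suffix."""
    rs = []                # nil
    for x in xs:
        rs = [x] + rs      # X|Rs: push each element onto the front
    return rs
```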

Page 63: Declarative Programming Techniques


Nested lists

nestedList T ::= nil
              [] T | nestedList T
              [] nestedList T | nestedList T

• We add the condition that T ≠ nil and T ≠ X|Xr for any X and Xr
• Problem: given an input Ls of type nestedList T, count the number of elements in Ls

• We write the program by following the structure of the type nestedList T

Page 64: Declarative Programming Techniques


Nested lists

fun {IsCons Xs}
   case Xs of X|Xr then true else false end
end

fun {IsLeaf X}
   {Not {IsCons X}} andthen X \= nil
end

fun {LengthL Xs}
   case Xs
   of nil then 0
   [] X|Xr andthen {IsLeaf X} then 1 + {LengthL Xr}
   [] X|Xr andthen {Not {IsLeaf X}} then {LengthL X} + {LengthL Xr}
   end
end

Page 65: Declarative Programming Techniques


Nested lists

fun {Flatten Xs}
   case Xs
   of nil then nil
   [] X|Xr andthen {IsLeaf X} then X|{Flatten Xr}
   [] X|Xr andthen {Not {IsLeaf X}} then {Append {Flatten X} {Flatten Xr}}
   end
end
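A Python sketch of the same case analysis (mine, not the slides'; isinstance(x, list) plays the role of the IsLeaf test):

```python
def is_leaf(x):
    """A leaf is anything that is not a (nested) list."""
    return not isinstance(x, list)

def flatten(xs):
    """Flatten a nested list, following the three cases of the type."""
    if xs == []:
        return []
    head, rest = xs[0], xs[1:]
    if is_leaf(head):
        return [head] + flatten(rest)
    return flatten(head) + flatten(rest)
```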

Page 66: Declarative Programming Techniques


Accumulators

• Accumulator programming is a way to handle state in declarative programs. It is a programming technique that uses arguments to carry state, transform the state, and pass it to the next procedure.

• Assume that the state S is transformed during the program execution

• Each procedure P is called as {P Sin Sout}, where the state is represented as a pair: the first argument Sin is the input state of P and the second argument Sout is the output state after P has terminated

• The pair (Sin,Sout) of arguments is called an accumulator. It is used to thread state throughout the program

Page 67: Declarative Programming Techniques


Accumulators

• Assume that the state S consists of a number of components to be transformed individually:

S = (X,Y,Z)

• Assume P1 to Pn are procedures:

proc {P X0 X Y0 Y Z0 Z}
   {P1 X0 X1 Y0 Y1 Z0 Z1}
   {P2 X1 X2 Y1 Y2 Z1 Z2}
   :
   {Pn Xn-1 X Yn-1 Y Zn-1 Z}
end

• The procedural syntax is easier to use than the functional syntax if there is more than one accumulator

Page 68: Declarative Programming Techniques


Example

• Consider a variant of MergeSort with an accumulator

• proc {MergeSort1 N S0 S Xs}
– N is an integer

– S0 is an input list to be sorted

– S is the remainder of S0 after the first N elements are sorted

– Xs is the sorted first N elements of S0

• The pair (S0, S) is an accumulator

• The definition is in a procedural syntax because it has two outputs S and Xs

Page 69: Declarative Programming Techniques


Example (2)

fun {MergeSort Xs}
   Ys in
   {MergeSort1 {Length Xs} Xs _ Ys}
   Ys
end

proc {MergeSort1 N S0 S Xs}
   if N==0 then S = S0 Xs = nil
   elseif N==1 then X in
      S0 = X|S
      Xs = [X]
   else %% N > 1
      S1 Xs1 Xs2
      NL = N div 2
      NR = N - NL
   in
      {MergeSort1 NL S0 S1 Xs1}
      {MergeSort1 NR S1 S Xs2}
      Xs = {Merge Xs1 Xs2}
   end
end

Page 70: Declarative Programming Techniques


Multiple accumulators

• Consider a stack machine for evaluating arithmetic expressions

• Example: (1+4)-3

• The machine executes the following instructions

push(1)
push(4)
plus
push(3)
minus

[Figure: stack contents after each step — 1; 1 4; 5; 5 3; 2]

Page 71: Declarative Programming Techniques


Multiple accumulators (2)

• Example: (1+4)-3

• The arithmetic expressions are represented as trees:

• minus(plus(1 4) 3)

• Write a procedure that takes arithmetic expressions represented as trees, outputs a list of stack machine instructions, and counts the number of instructions

• proc {ExprCode Expr Cin Cout Nin Nout}

• Cin: initial list of instructions

• Cout: final list of instructions

• Nin: initial count

• Nout: final count

Page 72: Declarative Programming Techniques


Multiple accumulators (3)

proc {ExprCode Expr C0 C N0 N}
   case Expr
   of plus(Expr1 Expr2) then C1 N1 in
      C1 = plus|C0
      N1 = N0 + 1
      {SeqCode [Expr2 Expr1] C1 C N1 N}
   [] minus(Expr1 Expr2) then C1 N1 in
      C1 = minus|C0
      N1 = N0 + 1
      {SeqCode [Expr2 Expr1] C1 C N1 N}
   [] I andthen {IsInt I} then
      C = push(I)|C0
      N = N0 + 1
   end
end

Page 73: Declarative Programming Techniques


Multiple accumulators (4)

proc {SeqCode Es C0 C N0 N}
   case Es
   of nil then C = C0 N = N0
   [] E|Er then N1 C1 in
      {ExprCode E C0 C1 N0 N1}
      {SeqCode Er C1 C N1 N}
   end
end

Page 74: Declarative Programming Techniques


Shorter version (4)

proc {ExprCode Expr C0 C N0 N}
   case Expr
   of plus(Expr1 Expr2) then
      {SeqCode [Expr2 Expr1] plus|C0 C N0+1 N}
   [] minus(Expr1 Expr2) then
      {SeqCode [Expr2 Expr1] minus|C0 C N0+1 N}
   [] I andthen {IsInt I} then
      C = push(I)|C0
      N = N0 + 1
   end
end

proc {SeqCode Es C0 C N0 N}
   case Es
   of nil then C = C0 N = N0
   [] E|Er then N1 C1 in
      {ExprCode E C0 C1 N0 N1}
      {SeqCode Er C1 C N1 N}
   end
end

Page 75: Declarative Programming Techniques


Functional style (4)

fun {ExprCode Expr t(C0 N0)}
   case Expr
   of plus(Expr1 Expr2) then
      {SeqCode [Expr2 Expr1] t(plus|C0 N0+1)}
   [] minus(Expr1 Expr2) then
      {SeqCode [Expr2 Expr1] t(minus|C0 N0+1)}
   [] I andthen {IsInt I} then
      t(push(I)|C0 N0+1)
   end
end

fun {SeqCode Es T}
   case Es
   of nil then T
   [] E|Er then
      T1 = {ExprCode E T} in
      {SeqCode Er T1}
   end
end
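A Python transcription of the accumulator pair (mine; expressions are encoded as tuples ('plus', e1, e2) / ('minus', e1, e2) or a plain int). As on the slide, instructions are prepended and the children are visited right to left, so the finished list reads head-first in execution order:

```python
def expr_code(expr, code, n):
    """Thread the accumulator (code, n) through the expression tree."""
    if isinstance(expr, int):
        return [('push', expr)] + code, n + 1     # C = push(I)|C0
    op, e1, e2 = expr                             # op is 'plus' or 'minus'
    return seq_code([e2, e1], [op] + code, n + 1)

def seq_code(exprs, code, n):
    """Generate code for a sequence of expressions."""
    for e in exprs:
        code, n = expr_code(e, code, n)
    return code, n
```

For the (1+4)-3 example, expr_code(('minus', ('plus', 1, 4), 3), [], 0) yields the five instructions push 1, push 4, plus, push 3, minus.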

Page 76: Declarative Programming Techniques


Difference lists (1)

• A difference list is a pair of lists, each of which might have an unbound tail, with the invariant that one can get the second list by removing zero or more elements from the first list

• X # X % Represent the empty list

• nil # nil % idem

• [a] # [a] % idem

• (a|b|c|X) # X % Represents [a b c]

• [a b c d] # [d] % idem

Page 77: Declarative Programming Techniques


Difference lists (2)

• When the second list is unbound, an append operation with another difference list takes constant time

fun {AppendD D1 D2}
   S1 # E1 = D1
   S2 # E2 = D2
in
   E1 = S2
   S1 # E2
end

• local X Y in {Browse {AppendD (1|2|3|X)#X (4|5|Y)#Y}} end
• displays (1|2|3|4|5|Y)#Y

Page 78: Declarative Programming Techniques


A FIFO queue with difference lists (1)

• A FIFO queue is a sequence of elements with an insert and a delete operation
   – Insert adds an element to one end and delete removes it from the other end

• Queues can be implemented with lists. If L represents the queue content, then inserting X gives X|L and deleting X gives {ButLast L X} (all elements but the last)
   – Delete is inefficient: it takes time proportional to the number of queue elements

• With difference lists we can implement a queue with constant-time insert and delete operations
   – The queue content is represented as q(N S E), where N is the number of elements and S#E is a difference list representing the elements

Page 79: Declarative Programming Techniques


A FIFO queue with difference lists (2)

• Inserting ‘b’:
   – In: q(1 a|T T)
   – Out: q(2 a|b|U U)

• Deleting X:
   – In: q(2 a|b|U U)
   – Out: q(1 b|U U) and X=a

• Difference list allows operations at both ends

• N is needed to keep track of the number of queue elements

fun {NewQueue} X in q(0 X X) end

fun {Insert Q X}
   case Q of q(N S E) then E1 in
      E = X|E1
      q(N+1 S E1)
   end
end

fun {Delete Q X}
   case Q of q(N S E) then S1 in
      S = X|S1
      q(N-1 S1 E)
   end
end

fun {EmptyQueue Q}
   case Q of q(N S E) then N==0 end
end
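The Oz version depends on unbound dataflow variables, which Python lacks. A common functional analogue (not the dataflow technique itself) is the two-list queue, which gives amortized constant-time operations; a hedged sketch:

```python
# Queue as a pair (front, back): insert pushes on back, delete pops from
# front, reversing back into front only when front runs empty.
def new_queue():
    return ([], [])

def insert(q, x):
    front, back = q
    return (front, [x] + back)

def delete(q):
    front, back = q
    if not front:
        # One O(n) reversal pays for n earlier O(1) inserts: amortized O(1).
        front, back = list(reversed(back)), []
    return front[0], (front[1:], back)
```

Unlike the difference-list queue, the cost here is amortized rather than worst-case constant.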

Page 80: Declarative Programming Techniques


Flatten (revisited)

fun {Flatten Xs}
   case Xs
   of nil then nil
   [] X|Xr andthen {IsLeaf X} then X|{Flatten Xr}
   [] X|Xr andthen {Not {IsLeaf X}} then
      {Append {Flatten X} {Flatten Xr}}
   end
end

Remember the Flatten function we wrote before? Let us replace lists by difference lists and see what happens.

Page 81: Declarative Programming Techniques


Flatten with difference lists (1)

• Flatten of nil is X#X

• Flatten of X|Xr is Y1#Y where

– flatten of X is Y1#Y2

– flatten of Xr is Y3#Y

– equate Y2 and Y3

• Flatten of a leaf X is (X|Y)#Y

Here is what it looks like as text

Page 82: Declarative Programming Techniques


Flatten with difference lists (2)

proc {FlattenD Xs Ds}
   case Xs
   of nil then Y in Ds = Y#Y
   [] X|Xr then Y0 Y1 Y2 in
      Ds = Y0#Y2
      {FlattenD X Y0#Y1}
      {FlattenD Xr Y1#Y2}
   [] X andthen {IsLeaf X} then Y in Ds = (X|Y)#Y
   end
end

fun {Flatten Xs} Y in {FlattenD Xs Y#nil} Y end

Here is the new program. It is much more efficient than the first version.
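For comparison, a Python sketch of the same idea: one shared mutable accumulator plays the role of the difference list's open tail, so every leaf is added in constant time and the whole flatten is linear (illustrative code, not from the slides):

```python
def flatten(xs):
    # Append each leaf to one shared accumulator instead of concatenating
    # intermediate lists; total cost is linear in the number of leaves.
    out = []
    def go(x):
        if isinstance(x, list):
            for item in x:
                go(item)
        else:
            out.append(x)
    go(xs)
    return out
```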

Page 83: Declarative Programming Techniques


Reverse (revisited)

• Here is our recursive reverse:

• Rewrite this with difference lists:
   – Reverse of nil is X#X
   – Reverse of X|Xs is Y1#Y, where
      • reverse of Xs is Y1#Y2, and
      • Y2 is equated with X|Y

fun {Reverse Xs}
   case Xs
   of nil then nil
   [] X|Xr then {Append {Reverse Xr} [X]}
   end
end

Page 84: Declarative Programming Techniques


Reverse with difference lists (1)

• The naive version takes time proportional to the square of the input length

• Using difference lists in the naive version makes it linear time

• We use two arguments Y1 and Y instead of Y1#Y

• With a minor change we can make it iterative as well

fun {ReverseD Xs}
   proc {ReverseD Xs Y1 Y}
      case Xs
      of nil then Y1 = Y
      [] X|Xr then Y2 in
         {ReverseD Xr Y1 Y2}
         Y2 = X|Y
      end
   end
   R
in
   {ReverseD Xs R nil}
   R
end

Page 85: Declarative Programming Techniques


Reverse with difference lists (2)

fun {ReverseD Xs}
   proc {ReverseD Xs Y1 Y}
      case Xs
      of nil then Y1 = Y
      [] X|Xr then {ReverseD Xr Y1 X|Y}
      end
   end
   R
in
   {ReverseD Xs R nil}
   R
end
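The same iterative pattern can be written in Python over cons cells, with an accumulator holding the already-reversed prefix (the role of the extra argument Y); a hedged sketch with illustrative names:

```python
# Lists as cons cells: None is nil, (head, tail) is head|tail.
def from_py(xs):
    cell = None
    for x in reversed(xs):
        cell = (x, cell)
    return cell

def to_py(cell):
    out = []
    while cell is not None:
        x, cell = cell
        out.append(x)
    return out

def reverse_d(xs):
    # y accumulates the reversed prefix; each step is O(1), so the whole
    # reverse is linear and iterative.
    y = None
    while xs is not None:
        x, xs = xs
        y = (x, y)
    return y
```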

Page 86: Declarative Programming Techniques


Difference lists: summary

• Difference lists are a way to represent lists in the declarative model such that one append operation can be done in constant time
   – A function that builds a big list by concatenating together lots of little lists can usually be written efficiently with difference lists
   – The function can be written naively, using difference lists and append, and will be efficient when the append is expanded out

• Difference lists are declarative, yet have some of the power of destructive assignment
   – Because of the single-assignment property of the dataflow variable

• Difference lists originate from Prolog

Page 87: Declarative Programming Techniques


Trees

• Next to lists, trees are the most important inductive data structure

• A tree is either a leaf node or a node containing one or more subtrees

• While a list has a linear structure, a tree can have a branching structure

• There are an enormous number of different kinds of trees. Here we will focus on one kind, ordered binary trees. Later on we will see other kinds of trees.

Page 88: Declarative Programming Techniques


Ordered binary trees

btree T ::= tree(key:T value:value btree T btree T)
         [] leaf

• Binary: each non-leaf node has two subtrees
• Ordered: keys of left subtree < key of node < keys of right subtree

Example (shown as a figure on the slide):

tree(key:3 value:x
   tree(key:1 value:y leaf leaf)
   tree(key:5 value:z leaf leaf))

Page 89: Declarative Programming Techniques


Search trees

• Search tree: A tree used for looking up, inserting, and deleting information

• Let us define three operations:
   – {Lookup X T}: returns the value corresponding to key X
   – {Insert X V T}: returns a new tree containing the pair (X,V)
   – {Delete X T}: returns a new tree that does not contain key X

Page 90: Declarative Programming Techniques


Looking up information

• There are four cases:
   – X is not found
   – X is found
   – X might be in the left subtree
   – X might be in the right subtree

fun {Lookup X T}
   case T
   of leaf then notfound
   [] tree(key:Y value:V T1 T2) andthen X==Y then found(V)
   [] tree(key:Y value:V T1 T2) andthen X<Y then {Lookup X T1}
   [] tree(key:Y value:V T1 T2) andthen X>Y then {Lookup X T2}
   end
end

Page 91: Declarative Programming Techniques


Inserting information

• There are four cases:
   – (X,V) is inserted immediately
   – (X,V) replaces an existing node with the same key
   – (X,V) is inserted in the left subtree
   – (X,V) is inserted in the right subtree

fun {Insert X V T}
   case T
   of leaf then tree(key:X value:V leaf leaf)
   [] tree(key:Y value:W T1 T2) andthen X==Y then
      tree(key:X value:V T1 T2)
   [] tree(key:Y value:W T1 T2) andthen X<Y then
      tree(key:Y value:W {Insert X V T1} T2)
   [] tree(key:Y value:W T1 T2) andthen X>Y then
      tree(key:Y value:W T1 {Insert X V T2})
   end
end
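A hedged Python analogue of this non-destructive insert: a tree is None (leaf) or a (key, value, left, right) tuple, and insertion copies only the path from the root to the changed node (names are illustrative):

```python
def insert(x, v, t):
    # Returns a new tree; the input tree t is never modified.
    if t is None:
        return (x, v, None, None)
    y, w, l, r = t
    if x == y:
        return (x, v, l, r)
    if x < y:
        return (y, w, insert(x, v, l), r)
    return (y, w, l, insert(x, v, r))

def lookup(x, t):
    while t is not None:
        y, v, l, r = t
        if x == y:
            return v
        t = l if x < y else r
    return None  # plays the role of notfound
```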

Page 92: Declarative Programming Techniques


Deleting information (1)

• There are four cases:
   – (X,V) is not in the tree
   – (X,V) is deleted immediately
   – (X,V) is deleted from the left subtree
   – (X,V) is deleted from the right subtree

• Right? Wrong!

fun {Delete X T}
   case T
   of leaf then leaf
   [] tree(key:Y value:W T1 T2) andthen X==Y then leaf
   [] tree(key:Y value:W T1 T2) andthen X<Y then
      tree(key:Y value:W {Delete X T1} T2)
   [] tree(key:Y value:W T1 T2) andthen X>Y then
      tree(key:Y value:W T1 {Delete X T2})
   end
end

Page 93: Declarative Programming Techniques


Deleting information (2)

• The problem with the previous program is that it does not correctly delete a non-leaf node

• To do it right, the tree has to be reorganized:
   – A new element has to take the place of the deleted one
   – It can be the smallest of the right subtree or the largest of the left subtree

fun {Delete X T}
   case T
   of leaf then leaf
   [] tree(key:Y value:W T1 T2) andthen X==Y then
      case {RemoveSmallest T2}
      of none then T1
      [] Yp#Wp#Tp then tree(key:Yp value:Wp T1 Tp)
      end
   [] tree(key:Y value:W T1 T2) andthen X<Y then
      tree(key:Y value:W {Delete X T1} T2)
   [] tree(key:Y value:W T1 T2) andthen X>Y then
      tree(key:Y value:W T1 {Delete X T2})
   end
end

Page 94: Declarative Programming Techniques


Deleting information (3)

• To remove the root node Y, there are two possibilities:

• One subtree is a leaf. Result is the other subtree.

• Neither subtree is a leaf. Result is obtained by removing an element from one of the subtrees.

(figure: if one subtree is a leaf, the result is the other subtree T1; otherwise Y is replaced by Yp, the smallest element of T2, leaving the remaining tree Tp)

Page 95: Declarative Programming Techniques


Deleting information (4)

• The function {RemoveSmallest T} removes the smallest element from T and returns the triple Xp#Vp#Tp, where (Xp,Vp) is the smallest element and Tp is the remaining tree

fun {RemoveSmallest T}
   case T
   of leaf then none
   [] tree(key:X value:V T1 T2) then
      case {RemoveSmallest T1}
      of none then X#V#T2
      [] Xp#Vp#Tp then Xp#Vp#tree(key:X value:V Tp T2)
      end
   end
end

Page 96: Declarative Programming Techniques


Deleting information (5)

• Why is deletion complicated?

• It is because the tree satisfies a global condition, namely that it is ordered

• The deletion operation has to work to maintain the truth of this condition

• Many tree algorithms rely on global conditions and have to work hard to maintain them

Page 97: Declarative Programming Techniques


Depth-first traversal

• An important operation for trees is visiting all the nodes

• There are many orders in which the nodes can be visited

• The simplest is depth-first traversal, in which one subtree is completely visited before the other is started

• Variants are infix, prefix, and postfix traversals

proc {DFS T}
   case T
   of leaf then skip
   [] tree(key:X value:V T1 T2) then
      {DFS T1}
      {Browse X#V}
      {DFS T2}
   end
end

Page 98: Declarative Programming Techniques


Breadth-first traversal

• A second basic traversal order is breadth-first

• In this order, all nodes of depth n are visited before nodes of depth n+1

• The depth of a node is the distance to the root, in number of hops

• This is harder to implement than depth-first; it uses a queue of subtrees

proc {BFS T}
   fun {TreeInsert Q T}
      if T\=leaf then {Insert Q T} else Q end
   end
   proc {BFSQueue Q1}
      if {EmptyQueue Q1} then skip
      else X Q2={Delete Q1 X}
         tree(key:K value:V L R)=X
      in
         {Browse K#V}
         {BFSQueue {TreeInsert {TreeInsert Q2 L} R}}
      end
   end
in
   {BFSQueue {TreeInsert {NewQueue} T}}
end
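In Python the same traversal is usually written with a double-ended queue from the standard library; a sketch over trees encoded as None or (key, value, left, right) tuples:

```python
from collections import deque

def bfs_keys(t):
    # Visit all nodes of depth n before any node of depth n+1.
    out = []
    q = deque()
    if t is not None:
        q.append(t)
    while q:
        k, v, l, r = q.popleft()
        out.append(k)
        if l is not None:
            q.append(l)
        if r is not None:
            q.append(r)
    return out
```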

Page 99: Declarative Programming Techniques


Reasoning about efficiency

• Computational efficiency is two things:– Execution time needed (e.g., in seconds)

– Memory space used (e.g., in memory words)

• The kernel language + its semantics allow us to calculate the execution time up to a constant factor– For example, execution time is proportional to n2, if input size is n

• To find the constant factor, it is necessary to measure what happens in the implementation (e.g., « wall-clock » time)
   – Measuring is the only way that really works; there are so many optimizations in the hardware (caching, super-scalar execution, virtual memory, ...) that calculating the exact time is almost impossible

• Let us first see how to calculate the execution time up to a constant factor

Page 100: Declarative Programming Techniques


Big-oh notation

• We will give the computational efficiency of a program in terms of the « big-oh » notation O(f(n))

• Let T(n) be a function that is the execution time of some program, measured in the size of the input n

• Let f(n) be some function defined on nonnegative integers

• T(n) is of O(f(n)) if
   – T(n) ≤ c.f(n) for some positive constant c, for all n except some small values n ≤ n0

• Sometimes this is written T(n)=O(f(n)). Be careful!– If g(n)=O(f(n)) and h(n)=O(f(n)) then it is not true that g(n)=h(n)!

Page 101: Declarative Programming Techniques


Calculating execution time

• We will use the kernel language as a guide to calculate the time

• Each kernel operation has an execution time (that may be a function of the « size » of its arguments)

• To calculate the execution time of a program:
   – Start with the function definitions written in the kernel language

– Calculate the time of each function in terms of its definition

• Assume each function’s time is a function of the « size » of its input arguments

– This gives a series of recurrence equations

• There may be more than one equation for a given function, e.g., to handle the base case of a recursion

– Solving these recurrence equations gives the execution time complexity of each function

Page 102: Declarative Programming Techniques


Kernel language execution time

s ::=
    skip                                           k
  | x = y                                          k
  | x = v                                          k
  | s1 s2                                          T(s1) + T(s2)
  | local x in s1 end                              k + T(s1)
  | proc {x y1 … yn} s1 end                        k
  | if x then s1 else s2 end                       k + max(T(s1), T(s2))
  | {x y1 … yn}                                    Tx(size(I({y1,…,yn})))
  | case x of pattern then s1 else s2 end          k + max(T(s1), T(s2))

Execution time T(s) of each kernel language operation s:

• Each instance of k is a different positive real constant
• size(I({y1,…,yn})) is the size of the input arguments, defined as we like
• In some special cases, x = y and x = v can take more time

Page 103: Declarative Programming Techniques


Example: Append

proc {Append Xs Ys Zs}
   case Xs
   of nil then Zs = Ys
   [] X|Xr then Z Zr in
      Zs = Z|Zr
      {Append Xr Ys Zr}
   end
end

• Recurrence equation for the recursive call:
   – TAppend(size(I({Xs,Ys,Zs}))) = k1 + max(k2, k3 + TAppend(size(I({Xr,Ys,Zr}))))
   – TAppend(size(Xs)) = k1 + max(k2, k3 + TAppend(size(Xr)))
   – TAppend(n) = k1 + max(k2, k3 + TAppend(n-1))
   – TAppend(n) = k4 + TAppend(n-1)

• Recurrence equation for the base case:
   – TAppend(0) = k5

• Solution:
   – TAppend(n) = k4.n + k5 = O(n)
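One way to check such a solution is to evaluate the recurrence directly and compare it against the closed form; a small Python check with arbitrarily chosen constants (k4 = 3, k5 = 7 are my own, purely illustrative):

```python
def t_append(n, k4=3, k5=7):
    # The recurrence itself: T(0) = k5, T(n) = k4 + T(n-1).
    return k5 if n == 0 else k4 + t_append(n - 1, k4, k5)

# The claimed closed form is k4*n + k5.
```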

Page 104: Declarative Programming Techniques


Recurrence equations

• A recurrence equation has one of two forms:
   – Define T(n) in terms of T(m1), …, T(mk), where m1, …, mk < n

– Give T(n) directly for certain values of n (e.g., T(0) or T(1)).

• Recurrence equations of many different kinds pop up when calculating a program’s efficiency, for example:
   – T(n) = k + T(n-1)            (T(n) is of O(n))
   – T(n) = k1 + k2.n + T(n-1)    (T(n) is of O(n2))
   – T(n) = k + T(n/2)            (T(n) is of O(log n))
   – T(n) = k + 2.T(n/2)          (T(n) is of O(n))
   – T(n) = k1 + k2.n + 2.T(n/2)  (T(n) is of O(n.log n))

• Let us investigate the most commonly-encountered ones– For the others, we refer you to one of the many books on the topic

Page 105: Declarative Programming Techniques


Example: FastPascal

• Recursive case:
   – TFP(n) = k1 + max(k2, k3 + TFP(n-1) + k4.n)
   – TFP(n) = k5 + k4.n + TFP(n-1)

• Base case– TFP(1) = k6

• Solution method:
   – Assume it has the form a.n2 + b.n + c
   – This gives three equations in three unknowns:
      • k4 - 2.a = 0
      • k5 + a - b = 0
      • k6 = a + b + c
   – It suffices to see that this is solvable and a ≠ 0

• Solution: TFP(n) is of O(n2)

fun {FastPascal N}
   if N==1 then [1]
   else
      local L in
         L = {FastPascal N-1}
         {AddList {ShiftLeft L} {ShiftRight L}}
      end
   end
end

Page 106: Declarative Programming Techniques


Example: Mergesort

• Recursive case:
   – TMS(size(Xs)) = k1 + k2.n + TMS(size(Ys)) + TMS(size(Zs))
   – TMS(n) = k1 + k2.n + TMS(n/2) + TMS(n/2)
   – TMS(n) = k1 + k2.n + 2.TMS(n/2)
     (assuming n is a power of 2)

• Base cases:– TMS(0) = k3– TMS(1) = k4

• Solution:
   – TMS(n) is of O(n.log n)
     (one can show it also holds if n is not a power of 2)

• Do you believe this?
   – Run the program and measure it!

fun {MergeSort Xs}
   case Xs
   of nil then nil
   [] [X] then Xs
   else Ys Zs in
      {Split Xs Ys Zs}
      {Merge {MergeSort Ys} {MergeSort Zs}}
   end
end
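For reference, a direct Python transcription of the same algorithm, where Split becomes list slicing and Merge is the usual linear merge of two sorted lists (a sketch, not the book's code):

```python
def merge_sort(xs):
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    return merge(merge_sort(xs[:mid]), merge_sort(xs[mid:]))

def merge(ys, zs):
    # Merge two sorted lists into one sorted list in linear time.
    out, i, j = [], 0, 0
    while i < len(ys) and j < len(zs):
        if ys[i] <= zs[j]:
            out.append(ys[i]); i += 1
        else:
            out.append(zs[j]); j += 1
    return out + ys[i:] + zs[j:]
```

Timing this for growing inputs is one way to check the O(n.log n) claim empirically.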

Page 107: Declarative Programming Techniques


Memory space

• Watch out! There are two very different concepts

• Instantaneous active memory size (memory words)
   – How much memory the program needs to continue executing successfully
   – Calculated by reachability (memory reachable from the semantic stack)

• Instantaneous memory consumption (memory words/sec)
   – How much memory the program allocates during its execution
   – A measure of how much work the memory management (e.g., garbage collection) has to do
   – Calculated with recurrence equations (similar to execution time complexity)

• These two concepts are independent

Page 108: Declarative Programming Techniques


Calculatingmemory consumption

s ::=
    skip                                           0
  | x = y                                          0
  | x = v                                          memsize(v)
  | s1 s2                                          M(s1) + M(s2)
  | local x in s1 end                              1 + M(s1)
  | if x then s1 else s2 end                       max(M(s1), M(s2))
  | {x y1 … yn}                                    Mx(size(I({y1,…,yn})))
  | case x of pattern then s1 else s2 end          max(M(s1), M(s2))

Memory space M(s) of each kernel language operation s:

• size(I({y1,…,yn})) is the size of the input arguments, defined as we like
• In some cases, if, case, and x = v take less space

Page 109: Declarative Programming Techniques


Memory consumption of bind

• memsize(v):
   – integer: 0 for small integers, else proportional to the integer size
   – float: 0
   – list: 2
   – tuple: 1+n (where n = length(arity(v)))
   – other records (where n = length(arity(v))):
      • Once for each different arity in the system: k.n, where k depends on the run-time system
      • For each record instance: 1+n
   – procedure: k + memsize(s), where <s> is the procedure body and k depends on the compiler

• memsize(s):
   – Roughly proportional to the number of statements and identifiers. The exact value depends on the compiler.

Page 110: Declarative Programming Techniques


Higher-order programming

• Higher-order programming = the set of programming techniques that are possible with procedure values (lexically-scoped closures)

• Basic operations
   – Procedural abstraction: creating procedure values with lexical scoping
   – Genericity: procedure values as arguments
   – Instantiation: procedure values as return values
   – Embedding: procedure values in data structures

• Control abstractions
   – Integer and list loops, accumulator loops, folding a list (left and right)

• Data-driven techniques
   – List filtering, tree folding

• Explicit lazy evaluation, currying

• Later chapters: higher-order programming is the foundation of component-based programming and object-oriented programming

Page 111: Declarative Programming Techniques


Procedural abstraction

• Procedural abstraction is the ability to convert any statement into a procedure value
   – A procedure value is usually called a closure, or more precisely, a lexically-scoped closure
   – A procedure value is a pair: it combines the procedure code with the environment where the procedure was created (the contextual environment)

• Basic scheme:
   – Consider any statement <s>
   – Convert it into a procedure value: P = proc {$} <s> end
   – Executing {P} has exactly the same effect as executing <s>

Page 112: Declarative Programming Techniques


A common limitation

• Most popular imperative languages (C, C++, Java) do not have procedure values

• They have only half of the pair: variables can reference procedure code, but there is no contextual environment

• This means that control abstractions cannot be programmed in these languages
   – They provide a predefined set of control abstractions (for and while loops, the if statement)

• Generic operations are still possible
   – They can often get by with just the procedure code. The contextual environment is often empty.

• The limitation is due to the way memory is managed in these languages
   – Part of the store is put on the stack and deallocated when the stack is deallocated
   – This is supposed to make memory management simpler for the programmer on systems that have no garbage collection
   – It means that contextual environments cannot be created, since they would be full of dangling pointers

Page 113: Declarative Programming Techniques


Genericity

• Replace specific entities (zero 0 and addition +) by function arguments

• The same routine can do the sum, the product, the logical or, etc.

fun {SumList L}
   case L
   of nil then 0
   [] X|L2 then X+{SumList L2}
   end
end

fun {FoldR L F U}
   case L
   of nil then U
   [] X|L2 then {F X {FoldR L2 F U}}
   end
end
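The same genericity in Python: SumList's zero and + become parameters of a right fold, so one routine yields sum, product, and so on (an illustrative sketch):

```python
def foldr(f, u, xs):
    # Right fold: computes f(x1, f(x2, ... f(xn, u) ...)).
    acc = u
    for x in reversed(xs):
        acc = f(x, acc)
    return acc

def sum_list(xs):
    return foldr(lambda x, a: x + a, 0, xs)

def prod_list(xs):
    return foldr(lambda x, a: x * a, 1, xs)
```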

Page 114: Declarative Programming Techniques


Instantiation

• Instantiation is when a procedure returns a procedure value as its result

• Calling {FoldFactory fun {$ A B} A+B end 0} returns a function that behaves identically to SumList, which is an « instance » of a folding function

fun {FoldFactory F U}
   fun {FoldR L F U}
      case L
      of nil then U
      [] X|L2 then {F X {FoldR L2 F U}}
      end
   end
in
   fun {$ L} {FoldR L F U} end
end

Page 115: Declarative Programming Techniques


Embedding

• Embedding is when procedure values are put in data structures

• Embedding has many uses:
   – Modules: a module is a record that groups together a set of related operations
   – Software components: a software component is a generic function that takes a set of modules as its arguments and returns a new module. It can be seen as specifying a module in terms of the modules it needs.
   – Delayed evaluation (also called explicit lazy evaluation): build just a small part of a data structure, with functions at the extremities that can be called to build more. The consumer can control explicitly how much of the data structure is built.

Page 116: Declarative Programming Techniques


Control Abstractions

declare

proc {For I J P}
   if I > J then skip
   else
      {P I}
      {For I+1 J P}
   end
end

{For 1 10 Browse}

for I in 1..10 do {Browse I} end

Page 117: Declarative Programming Techniques


Control Abstractions

proc {ForAll Xs P}
   case Xs
   of nil then skip
   [] X|Xr then
      {P X}
      {ForAll Xr P}
   end
end

{ForAll [a b c d] proc{$ I} {System.showInfo "the item is: " # I} end}

for I in [a b c d] do {System.showInfo "the item is: " # I} end

Page 118: Declarative Programming Techniques


Tree folding
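The transcript preserves only this slide's title. A plausible sketch of tree folding in Python, over trees encoded as None (leaf) or (key, value, left, right) tuples — the function and encoding are my own, not from the slides:

```python
def fold_tree(f, u, t):
    # Combine each node's value with the folds of its two subtrees,
    # bottom-up; u is the result for a leaf.
    if t is None:
        return u
    k, v, l, r = t
    return f(v, fold_tree(f, u, l), fold_tree(f, u, r))
```

With `f = lambda v, a, b: v + a + b` this sums all values; with `f = lambda v, a, b: 1 + a + b` it counts nodes.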

Page 119: Declarative Programming Techniques


Explicit lazy evaluation
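The transcript preserves only this slide's title. Explicit lazy evaluation can be sketched with zero-argument functions (thunks) at the extremity of the structure: the consumer calls the function to build more. An illustrative Python version (names are my own):

```python
def ints_from(n):
    # A stream: a pair of the first element and a thunk for the rest.
    # Nothing beyond the first element exists until the thunk is called.
    return (n, lambda: ints_from(n + 1))

def take(stream, k):
    # The consumer controls explicitly how much of the structure is built.
    out = []
    while k > 0:
        head, rest = stream
        out.append(head)
        stream = rest()
        k -= 1
    return out
```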

Page 120: Declarative Programming Techniques


Currying
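The transcript preserves only this slide's title. Currying transforms a function of n arguments into nested one-argument functions, so partial applications are themselves useful values; a Python sketch:

```python
def add(x):
    # Curried addition: add(x) returns a function awaiting y.
    return lambda y: x + y

# A partial application is a first-class value:
increment = add(1)
```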

Page 121: Declarative Programming Techniques


Abstract data types

• A datatype is a set of values and an associated set of operations

• A datatype is abstract if it is completely described by its set of operations, regardless of its implementation

• This means that it is possible to change the implementation of the datatype without changing its use

• The datatype is thus described by a set of procedures

• These operations are the only thing that a user of the abstraction can assume

Page 122: Declarative Programming Techniques


Example: stack

• Assume we want to define a new datatype stack T whose elements are of any type T

fun {NewStack} : stack T
fun {Push stack T T} : stack T
proc {Pop stack T T stack T}
fun {IsEmpty stack T} : bool

• These operations normally satisfy certain conditions:

• {IsEmpty {NewStack}} = true

• for any E and S, {Pop {Push S E} E S} holds

• {Pop {NewStack}} raises an error

Page 123: Declarative Programming Techniques


Stack (implementation)

fun {NewStack} nil end

fun {Push S E} E|S end

proc {Pop E|S E1 S1} E1 = E S1 = S end

fun {IsEmpty S} S==nil end

proc {Display S} {Browse S} end

Page 124: Declarative Programming Techniques


Stack (another implementation)

% First implementation (lists):
fun {NewStack} nil end
fun {Push S E} E|S end
proc {Pop E|S E1 S1} E1 = E S1 = S end
fun {IsEmpty S} S==nil end

% Second implementation (records):
fun {NewStack} emptyStack end
fun {Push S E} stack(E S) end
proc {Pop stack(E S) E1 S1} E1 = E S1 = S end
fun {IsEmpty S} S==emptyStack end
fun {Display S} ... end
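A hedged Python sketch of the same point: two interchangeable implementations behind one interface (here, a dict of operations), with client code written only against the interface — all names are illustrative:

```python
def make_list_stack():
    # Stack as a Python list.
    return {
        "new": lambda: [],
        "push": lambda s, e: [e] + s,
        "pop": lambda s: (s[0], s[1:]),
        "is_empty": lambda s: s == [],
    }

def make_pair_stack():
    # Stack as nested pairs: None is empty, (top, rest) otherwise.
    return {
        "new": lambda: None,
        "push": lambda s, e: (e, s),
        "pop": lambda s: (s[0], s[1]),
        "is_empty": lambda s: s is None,
    }

def top_after_pushes(ops, items):
    # Client code: works unchanged with either implementation.
    s = ops["new"]()
    for x in items:
        s = ops["push"](s, x)
    top, _ = ops["pop"](s)
    return top
```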

Page 125: Declarative Programming Techniques


Dictionaries

• The datatype dictionary is a finite mapping from a set T to value, where T is either atom or integer

• fun {NewDictionary}
   – returns an empty mapping

• fun {Put D Key Value}
   – returns a dictionary identical to D except that Key is mapped to Value

• fun {CondGet D Key Default}
   – returns the value corresponding to Key in D, otherwise returns Default

• fun {Domain D}
   – returns a list of the keys in D

Page 126: Declarative Programming Techniques


Implementation

fun {NewDictionary} nil end

fun {Put Ds Key Value}
   case Ds
   of nil then [Key#Value]
   [] (K#V)|Dr andthen Key==K then (Key#Value)|Dr
   [] (K#V)|Dr andthen K>Key then (Key#Value)|(K#V)|Dr
   [] (K#V)|Dr andthen K<Key then (K#V)|{Put Dr Key Value}
   end
end

Page 127: Declarative Programming Techniques


Implementation

fun {CondGet Ds Key Default}
   case Ds
   of nil then Default
   [] (K#V)|Dr andthen Key==K then V
   [] (K#V)|Dr andthen K>Key then Default
   [] (K#V)|Dr andthen K<Key then {CondGet Dr Key Default}
   end
end

fun {Domain Ds}
   {Map Ds fun {$ K#_} K end}
end
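A hedged Python analogue of this sorted association-list dictionary, where a dictionary is a list of (key, value) pairs kept in increasing key order and every operation returns a new list:

```python
def put(ds, key, value):
    # Insert or replace, preserving sorted key order; ds is unchanged.
    for i, (k, _) in enumerate(ds):
        if k == key:
            return ds[:i] + [(key, value)] + ds[i + 1:]
        if k > key:
            return ds[:i] + [(key, value)] + ds[i:]
    return ds + [(key, value)]

def cond_get(ds, key, default):
    for k, v in ds:
        if k == key:
            return v
        if k > key:  # keys are sorted, so key cannot appear later
            return default
    return default

def domain(ds):
    return [k for k, _ in ds]
```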

Page 128: Declarative Programming Techniques


Tree implementation

fun {Put Ds Key Value} % ... similar to Insert end
fun {CondGet Ds Key Default} % ... similar to Lookup end

fun {Domain Ds}
   proc {DomainD Ds S1 S0}
      case Ds
      of leaf then S1=S0
      [] tree(key:K left:L right:R ...) then S2 S3 in
         S1=K|S2
         {DomainD L S2 S3}
         {DomainD R S3 S0}
      end
   end
   R
in
   {DomainD Ds R nil}
   R
end

Page 129: Declarative Programming Techniques


Software components

• What is a good way to organize a large program?
   – It is confusing to put everything in one big file

• Partition the program into logical units, each of which implements a set of related operations
   – Each logical unit has two parts, an interface and an implementation
   – Only the interface is visible from outside the logical unit, i.e., from other units that use it
   – A program is a directed graph of logical units, where an edge means that one logical unit needs the other for its implementation

• Such logical units are vaguely called « modules » or « software components »

• Let us define these concepts more precisely

Page 130: Declarative Programming Techniques


Basic concepts

• Module = part of a program that groups together related operations into an entity that has an interface and an implementation

• Module interface = a record that groups together the language entities belonging to the module

• Module implementation = a set of language entities accessible by the interface but hidden from the outside (hiding can be done using lexical scope)

• Module specification = a function whose arguments are module interfaces, that creates a new module when called, and that returns this new module’s interface
   – Module specifications are sometimes called software components, but the latter term is widely used with many different meanings
   – To avoid confusion, we will call our module specifications functors and give them a special syntax (they are a linguistic abstraction)

Page 131: Declarative Programming Techniques


Standalone applications

• A standalone application consists of one functor, called the main functor, which is called when the application starts
   – The main functor is used for its effect of starting the application and not for the module it creates
   – The modules it needs are linked dynamically (upon first use) or statically (upon application start)
   – Linking a module: first see whether the module already exists; if so, return it. Otherwise, find a functor specifying the module according to some search strategy, call the functor, and link its arguments recursively.

• Functors are compilation units, i.e., the system has support for handling functors in files, in both source and compiled form

Page 132: Declarative Programming Techniques


Constructing a module

• Let us first see an example of a module, and then we will define a functor that specifies that module

• A module is accessed through its interface, which is a record

• We construct the module MyList

• MergeSort is inaccessible outside the module

• Append is accessible as MyList.append

declare MyList in
local
   proc {Append …} … end
   proc {MergeSort …} … end
   proc {Sort …} … {MergeSort …} … end
   proc {Member …} … end
in
   MyList = 'export'(append: Append
                     sort: Sort
                     member: Member
                     …)
end

Page 133: Declarative Programming Techniques


Specifying a module

• Now let us define a functor that specifies the module MyList

• The functor MyListFunctor will create a new module whenever it is called

• The functor has no arguments because the module needs no other modules

fun {MyListFunctor}
   proc {Append …} … end
   proc {MergeSort …} … end
   proc {Sort …} … {MergeSort …} … end
   proc {Member …} … end
in
   'export'(append: Append
            sort: Sort
            member: Member
            …)
end
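A hedged Python sketch of the same idea: a "functor" as a plain function that builds and returns a fresh module interface (here, a dict of operations) each time it is called. The hidden helper stays invisible outside the function body; all names are illustrative:

```python
def my_list_functor():
    # Hidden helper: not exported, so inaccessible outside the module.
    def merge_sort(xs):
        return sorted(xs)  # stand-in for a real mergesort

    def append(xs, ys):
        return xs + ys

    def member(x, xs):
        return x in xs

    # The interface record: only these operations are visible.
    return {"append": append, "sort": merge_sort, "member": member}
```

Calling `my_list_functor()` plays the role of instantiating the module from its specification.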

Page 134: Declarative Programming Techniques


Syntactic support for functors

• We give syntactic support to the functor

• The keyword export is followed by the record fields

• The keyword define is followed by the module initialization code

• There is another keyword not used here, import, to specify which modules the functor needs

functor
export
   append: Append
   sort: Sort
   member: Member
   …
define
   proc {Append …} … end
   proc {MergeSort …} … end
   proc {Sort …} … {MergeSort …} … end
   proc {Member …} … end
end

Page 135: Declarative Programming Techniques


Example standalone application

• Compile Dict.oz giving Dict.ozf

• Compile WordApp.oz giving WordApp*

File WordApp.oz:

functor
import Open Dict QTk
define
   … (1. Read stdin into a list)
   … (2. Calculate word frequencies)
   … (3. Display result in a window)
end

File Dict.oz:

functor
export new:NewDict put:Put condGet:CondGet entries:Entries
define
   fun {NewDict} ... end
   fun {Put Ds Key Value} … end
   fun {CondGet Ds Key Default} … end
   fun {Entries Ds} … end
end

Page 136: Declarative Programming Techniques


Library modules

• A programming system usually comes with many modules
   – Built-in modules: available without needing to specify them in a functor
   – Library modules: have to be specified in a functor

Page 137: Declarative Programming Techniques


Balanced trees

• We look at ordered binary trees that are approximately balanced

• A tree T with n nodes is approximately balanced if the height of T is at most c.log2(n+1), for some constant c

• Insertion and deletion operations are then O(log n)

• Red-black trees preserve this property: the height of a red-black tree is at most 2.log2(n+1), where n is the number of nodes in the tree

Page 138: Declarative Programming Techniques


Red Black trees

• Red-black trees have the following properties:
   1. Every node is either red or black
   2. If a node is red, then both its children are black
   3. Each path from a node to its descendant leaves contains the same number of black nodes

• It follows that the shortest path from the root to a leaf has all black nodes and is of length at most log2(n+1) for a tree of n nodes

• The longest path has alternating black and red nodes and is of length at most 2.log2(n+1)

• The insertion and deletion algorithms preserve the ’balanced’ tree property

Page 139: Declarative Programming Techniques


Example

(figure: an example red-black tree with root 26)

Page 140: Declarative Programming Techniques


Insertion at leaf

(figure: inserting a new red node at a leaf; when the parent is also red, the red-black property is violated and has to be fixed)

Page 141: Declarative Programming Techniques


Insertion at internal node (left)

(figure: inserting into the left subtree can leave two red nodes in a row)

Insertion may lead to violation of the red-black property

Page 142: Declarative Programming Techniques


Insertion at internal node (left)

(figure: right rotation — Z with left child Y, which has left child X, becomes Y with children X and Z; the subtrees T1 T2 T3 T4 keep their left-to-right order)

Correction is done by rotation (to the right in this case)

Page 143: Declarative Programming Techniques

S. Haridi and P. Van Roy 143

Insertion at internal node (left)


Page 144: Declarative Programming Techniques


Insertion at internal node (right)

[figure: left rotation — the mirror image of the right rotation: a red-red pair in the right subtree of X is fixed by rotating to the left, yielding a node Y with children X and Z and subtrees T1, T2, T3, T4]


Page 146: Declarative Programming Techniques


Program

• <color> ::= red [] black

• <tree> ::= nil [] node(<color> <int> <tree> <tree>)

declare
fun {Insert X T}
   node(C Y T1 T2) = {InsertRot X T}
in
   node(black Y T1 T2)
end

Page 147: Declarative Programming Techniques


Program

fun {InsertRot X T}
   case T
   of nil then node(red X nil nil)
   [] node(C Y T1 T2) andthen X == Y then T
   [] node(C Y T1 T2) andthen X < Y then
      T3 = {InsertRot X T1} in
      {RotateRight node(C Y T3 T2)}
   [] node(C Y T1 T2) andthen X > Y then
      T3 = {InsertRot X T2} in
      {RotateLeft node(C Y T1 T3)}
   end
end

Page 148: Declarative Programming Techniques


Insertion at internal node (left)

[figure: right rotation — node Z with a red-red pair in its left subtree becomes a red node Y with black children X and Z, over subtrees T1, T2, T3, T4]

fun {RotateRight T}
   node(C Z LT T4) = T
in
   case LT
   of node(red Y node(red X T1 T2) T3) then
      node(red Y node(black X T1 T2) node(black Z T3 T4))
   [] node(red X T1 node(red Y T2 T3)) then
      node(red Y node(black X T1 T2) node(black Z T3 T4))
   else T end
end
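For readers more comfortable with Python, here is a transliteration of RotateRight (my own sketch, not the authors' code): nodes are tuples (color, key, left, right) and 'nil' is the empty tree. The two branches correspond to the Oz patterns for the left-left and left-right red-red configurations:

```python
def rotate_right(t):
    # t corresponds to node(C Z LT T4) in the Oz version
    c, z, lt, t4 = t
    if lt != 'nil' and lt[0] == 'red':
        _, k, a, b = lt
        # node(red Y node(red X T1 T2) T3): red-red pair on the left-left
        if a != 'nil' and a[0] == 'red':
            y, t3 = k, b
            _, x, t1, t2 = a
            return ('red', y, ('black', x, t1, t2), ('black', z, t3, t4))
        # node(red X T1 node(red Y T2 T3)): red-red pair on the left-right
        if b != 'nil' and b[0] == 'red':
            x, t1 = k, a
            _, y, t2, t3 = b
            return ('red', y, ('black', x, t1, t2), ('black', z, t3, t4))
    return t  # no violation: leave the tree unchanged

# Left-left violation: red 20 under black 30, with red 10 under 20
t = ('black', 30, ('red', 20, ('red', 10, 'nil', 'nil'), 'nil'), 'nil')
print(rotate_right(t))
# ('red', 20, ('black', 10, 'nil', 'nil'), ('black', 30, 'nil', 'nil'))
```

Both branches rebuild the same shape: a red Y over black X and black Z, exactly as in the rotation figure.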

Page 149: Declarative Programming Techniques


Insertion at internal node (right)

[figure: left rotation — node X with a red-red pair in its right subtree becomes a red node Y with black children X and Z, over subtrees T1, T2, T3, T4]

fun {RotateLeft T}
   node(C X T1 RT) = T
in
   case RT
   of node(red Z node(red Y T2 T3) T4) then
      node(red Y node(black X T1 T2) node(black Z T3 T4))
   [] node(red Y T2 node(red Z T3 T4)) then
      node(red Y node(black X T1 T2) node(black Z T3 T4))
   else T end
end
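Putting the pieces together, here is a Python sketch of the whole insertion algorithm (my own transliteration; the names insert, insert_rot, rotate_left, and rotate_right are assumptions, not the authors' code). Whatever rotations occur on the way back up, the inorder sequence stays sorted, and the final recoloring in insert keeps the root black:

```python
NIL = 'nil'  # empty tree; nodes are tuples (color, key, left, right)

def rotate_right(t):
    c, z, lt, t4 = t
    if lt != NIL and lt[0] == 'red':
        _, k, a, b = lt
        if a != NIL and a[0] == 'red':        # left-left red-red pair
            _, x, t1, t2 = a
            return ('red', k, ('black', x, t1, t2), ('black', z, b, t4))
        if b != NIL and b[0] == 'red':        # left-right red-red pair
            _, y, t2, t3 = b
            return ('red', y, ('black', k, a, t2), ('black', z, t3, t4))
    return t

def rotate_left(t):
    c, x, t1, rt = t
    if rt != NIL and rt[0] == 'red':
        _, k, a, b = rt
        if a != NIL and a[0] == 'red':        # right-left red-red pair
            _, y, t2, t3 = a
            return ('red', y, ('black', x, t1, t2), ('black', k, t3, b))
        if b != NIL and b[0] == 'red':        # right-right red-red pair
            _, z, t3, t4 = b
            return ('red', k, ('black', x, t1, a), ('black', z, t3, t4))
    return t

def insert_rot(x, t):
    if t == NIL:
        return ('red', x, NIL, NIL)           # new nodes start out red
    c, y, t1, t2 = t
    if x == y:
        return t                              # key already present
    if x < y:
        return rotate_right((c, y, insert_rot(x, t1), t2))
    return rotate_left((c, y, t1, insert_rot(x, t2)))

def insert(x, t):
    _, y, t1, t2 = insert_rot(x, t)
    return ('black', y, t1, t2)               # the root is always black

def inorder(t):
    if t == NIL:
        return []
    _, key, left, right = t
    return inorder(left) + [key] + inorder(right)

t = NIL
for key in [26, 17, 41, 14, 21, 30, 47, 10, 19]:
    t = insert(key, t)
print(inorder(t))   # [10, 14, 17, 19, 21, 26, 30, 41, 47]
print(t[0])         # 'black'
```

Note how rotation is attempted on every level of the recursion, mirroring the Oz code: each return from insert_rot passes the rebuilt node through RotateRight or RotateLeft, which leaves it unchanged unless it matches a red-red pattern.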

Page 150: Declarative Programming Techniques


The real world

• File I/O

• Window I/O

• Programming with exceptions

• More on efficiency
   – Garbage collection is not magic
   – External references

Page 151: Declarative Programming Techniques


Large-scale program structure

• Modules: an example of higher-order programming

• Module specifications

Page 152: Declarative Programming Techniques


Limitations and extensions of declarative programming

• Is declarative programming expressive enough to do everything?
   – Given a straightforward compiler, the answer is no

• Examples of limitations

• Extensions
   – Concurrency
   – State
   – Concurrency and state

• When to use each model
   – Use the least expressive model that gives a simple program (without technical scaffolding)
   – Use the declarative model whenever possible, since it is by far the easiest to reason about, both in the small and in the large