CS221 Week 13 - Wednesday

Transcript of Week 13 - Wednesday (46 pages).

Page 1:

CS221 Week 13 - Wednesday

Page 2:

Last time

What did we talk about last time? NP-completeness

Page 3:

Questions?

Page 4:

Project 4

Page 5:

Assignment 6

Page 6:

Sorting
Student Lecture

Page 7:

What do we want from sorting?

Page 8:

Characteristics of a sort

- Running time: best case, worst case, and average case
- Stable: will elements with the same value get reordered?
- Adaptive: will a mostly sorted list take less time to sort?
- In-place: can we perform the sort without additional memory?
- Simplicity of implementation: relates to the constant hidden by Big Oh
- Online: can it sort the numbers as they arrive?

Page 9:

Insertion Sort

Page 10:

Insertion sort

Pros:
- Best case running time of O(n)
- Stable
- Adaptive
- In-place
- Simple implementation (one of the fastest sorts for 10 elements or fewer!)
- Online

Cons:
- Worst case running time of O(n²)

Page 11:

Insertion sort algorithm

- We do n rounds
- For round i, assume that elements 0 through i – 1 are already sorted
- Take element i and move it up the list of already sorted elements until you find the spot where it fits

Page 12:

Insertion sort example

Starting array:       7  45   0  54  37 108  51
After inserting 45:   7  45   0  54  37 108  51
After inserting 0:    0   7  45  54  37 108  51
After inserting 54:   0   7  45  54  37 108  51
After inserting 37:   0   7  37  45  54 108  51
After inserting 108:  0   7  37  45  54 108  51
After inserting 51:   0   7  37  45  51  54 108

Page 13:

Insertion Sort Implementation
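As a point of reference for this slide, here is a minimal sketch of the algorithm described above, written for an int array; the method name and signature are illustrative, not the course's official implementation.

public static void insertionSort(int[] values) {
    for (int i = 1; i < values.length; i++) {
        int current = values[i];                 // element to insert into the sorted prefix
        int j = i - 1;
        while (j >= 0 && values[j] > current) {  // shift larger elements one spot right
            values[j + 1] = values[j];
            j--;
        }
        values[j + 1] = current;                 // drop the element into its spot
    }
}

Because the inner loop stops as soon as it finds a smaller element, a mostly sorted list does little work, which is why insertion sort is adaptive and runs in O(n) on already sorted input.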

Page 14:

Merge Sort

Page 15:

Merge sort

Pros:
- Best, worst, and average case running time of O(n log n)
- Stable
- Ideal for linked lists

Cons:
- Not adaptive
- Not in-place
  - O(n) additional space needed for an array
  - O(log n) additional space needed for linked lists

Page 16:

Merge sort algorithm

- Take a list of numbers and divide it in half
- Recursively merge sort each half
- After each half has been sorted, merge them together in order

Page 17:

Merge sort example

Starting array: 7 45 0 54 37 108
Split into halves: 7 45 0 and 54 37 108
Keep splitting: 7 | 45 0 and 54 | 37 108, then 45 | 0 and 37 | 108
Merge back up: 0 45 and 37 108, then 0 7 45 and 37 54 108
Final merge: 0 7 37 45 54 108

Page 18:

Merge Sort Implementation
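Likewise, a minimal sketch of merge sort on an int array, with an O(n) scratch array for merging (matching the space cost on the pros/cons slide); the names are illustrative rather than official course code.

public static void mergeSort(int[] values) {
    mergeSort(values, new int[values.length], 0, values.length - 1);
}

private static void mergeSort(int[] values, int[] scratch, int left, int right) {
    if (left >= right)
        return;                                  // zero or one element: already sorted
    int mid = (left + right) / 2;
    mergeSort(values, scratch, left, mid);       // sort the left half
    mergeSort(values, scratch, mid + 1, right);  // sort the right half
    int i = left, j = mid + 1, k = left;
    while (i <= mid && j <= right)               // merge the two sorted halves
        scratch[k++] = (values[i] <= values[j]) ? values[i++] : values[j++];
    while (i <= mid)
        scratch[k++] = values[i++];
    while (j <= right)
        scratch[k++] = values[j++];
    for (k = left; k <= right; k++)              // copy the merged run back
        values[k] = scratch[k];
}

Taking from the left half on ties (the <= in the merge) is what keeps the sort stable.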

Page 19:

Quicksort

Page 20:

Quicksort

Pros:
- Best and average case running time of O(n log n)
- Very simple implementation
- In-place
- Ideal for arrays

Cons:
- Worst case running time of O(n²)
- Not stable

Page 21:

Quicksort algorithm

1. Pick a pivot
2. Partition the array into a left half smaller than the pivot and a right half bigger than the pivot
3. Recursively quicksort the left half
4. Recursively quicksort the right half

Page 22:

Partition algorithm

Input: array, index, left, right
- Set pivot to be array[index]
- Swap array[index] with array[right]
- Set index to left
- For i from left up to right – 1:
  - If array[i] ≤ pivot:
    - Swap array[i] with array[index]
    - index++
- Swap array[index] with array[right]
- Return index (so that we know where the pivot is)
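Here is a sketch in Java of the partition steps above together with the quicksort that calls it; the pivot here is the leftmost element of the range, and the helper names are illustrative.

public static void quicksort(int[] array, int left, int right) {
    if (left >= right)
        return;                                            // ranges of size 0 or 1 are sorted
    int pivotIndex = partition(array, left, left, right);  // pivot = first element in the range
    quicksort(array, left, pivotIndex - 1);                // sort everything left of the pivot
    quicksort(array, pivotIndex + 1, right);               // sort everything right of the pivot
}

private static int partition(int[] array, int index, int left, int right) {
    int pivot = array[index];            // set pivot to be array[index]
    swap(array, index, right);           // move the pivot out of the way
    index = left;
    for (int i = left; i < right; i++) {
        if (array[i] <= pivot) {
            swap(array, i, index);
            index++;
        }
    }
    swap(array, index, right);           // put the pivot into its final spot
    return index;                        // so that we know where the pivot is
}

private static void swap(int[] array, int i, int j) {
    int temp = array[i];
    array[i] = array[j];
    array[j] = temp;
}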

Page 23:

Quicksort Example

Starting array: 7 45 0 54 37 108
After partitioning around the pivot 7: 0 7 45 54 37 108
After recursively quicksorting the remaining halves: 0 7 37 45 54 108

Page 24:

Quicksort issues

- Everything comes down to picking the right pivot
- If you could get the median every time, it would be great
- A common choice is the first element in the range as the pivot
  - This gives O(n²) performance if the list is sorted (or reverse sorted); why?
- Another implementation is to pick a random location
- Another well-studied approach is to pick three random locations and take the median of those three (see the sketch below)
- An algorithm exists that can find the median in linear time, but its constant is HUGE
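A sketch of that median-of-three idea: sample three random indices in the range and return whichever one holds the median of the three values, which can then be passed to partition as the pivot index. The class and method names and the use of java.util.Random are assumptions, not a prescribed implementation.

import java.util.Random;

public class PivotChooser {
    private static final Random RANDOM = new Random();

    // Pick three random indices in [left, right] and return the one holding
    // the median of the three sampled values.
    public static int medianOfThreeIndex(int[] array, int left, int right) {
        int a = left + RANDOM.nextInt(right - left + 1);
        int b = left + RANDOM.nextInt(right - left + 1);
        int c = left + RANDOM.nextInt(right - left + 1);
        if ((array[a] <= array[b] && array[b] <= array[c]) ||
            (array[c] <= array[b] && array[b] <= array[a]))
            return b;
        if ((array[b] <= array[a] && array[a] <= array[c]) ||
            (array[c] <= array[a] && array[a] <= array[b]))
            return a;
        return c;
    }
}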

Page 25:

Heaps

Page 26:

Heaps

A maximum heap is a complete binary tree where:
- The left and right children of the root have key values less than the root
- The left and right subtrees are also maximum heaps

We can define minimum heaps similarly

Page 27:

Heap example

      10
     /  \
    9    3
   / \
  0   1

Page 28:

How do you know where to add?

Easy! We look at the location of the new node in binary.
Ignore the first 1; then each 0 means go left and each 1 means go right.
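As a small illustration of that trick, here is a sketch (the method name is made up for this example): for the 1-based position where the new node will go, drop the leading 1 of its binary form and read the remaining bits as left/right turns.

public static String pathToPosition(int n) {
    String bits = Integer.toBinaryString(n);   // e.g., 6 -> "110"
    StringBuilder path = new StringBuilder();
    for (int i = 1; i < bits.length(); i++) {  // ignore the first 1
        path.append(bits.charAt(i) == '0' ? "left " : "right ");
    }
    return path.toString().trim();             // 6 -> "right left"
}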

Page 29:

New node

Location: 6
In binary: 110
Right, then left

      10
     /  \
    9    3
   / \
  0   1

Page 30:

Add 15

Oh no! 15 is larger than its parent 3, so the heap property is broken:

      10
     /  \
    9    3
   / \   /
  0   1 15

Page 31:

After an add, bubble up

After adding 15:

      10
     /  \
    9    3
   / \   /
  0   1 15

Swap 15 with its parent 3:

      10
     /  \
    9    15
   / \   /
  0   1 3

Swap 15 with its parent 10:

      15
     /  \
    9    10
   / \   /
  0   1 3

Page 32:

Only the root can be deleted

      10
     /  \
    9    3
   / \
  0   1

Page 33:

Replace it with the “last” node

After removing the root 10, the "last" node (1) moves up to take its place:

      1
     / \
    9   3
   /
  0

Page 34:

Then, bubble down

Before bubbling down:

      1
     / \
    9   3
   /
  0

After swapping 1 with its larger child, 9:

      9
     / \
    1   3
   /
  0

Page 35:

Operations

Heaps only have:
- Add
- Remove Largest
- Get Largest

Which cost:
- Add: O(log n)
- Remove Largest: O(log n)
- Get Largest: O(1)

Heaps are a perfect data structure for a priority queue

Page 36:

Priority queue implementation

Page 37:

Priority queue

A priority queue is an ADT that allows us to insert key values and efficiently retrieve the highest priority one

It has these operations:
- Insert(key): put the key into the priority queue
- Max(): get the highest value key
- Remove Max(): remove the highest value key

Page 38:

Implementation

It turns out that a heap is a great way to implement a priority queue

Although it's useful to think of a heap as a complete binary tree, almost no one implements them that way

Instead, we can view the heap as an array

Because it's a complete binary tree, there will be no empty spots in the array

Page 39:

Array implementation of priority queue

public class PriorityQueue {
    private int[] keys = new int[10];
    private int size = 0;
    …
}

Page 40:

Array view

Illustrated:
- The left child of element i is at 2i + 1
- The right child of element i is at 2i + 2

      10
     /  \
    9    3
   / \
  0   1

Index:  0   1   2   3   4
Value: 10   9   3   0   1

Page 41:

Insert

public void insert(int key)

Always put the key at the end of the array (resizing if needed)

The value will often need to be bubbled up, using the following helper method

private void bubbleUp(int index)
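One way these two methods could look inside the PriorityQueue class sketched earlier; the doubling resize and the 0-based parent formula (index – 1) / 2 are standard choices, but treat this as a sketch rather than the assigned solution.

public void insert(int key) {
    if (size == keys.length)                      // out of room: double the array
        keys = java.util.Arrays.copyOf(keys, keys.length * 2);
    keys[size] = key;                             // always put the key at the end
    bubbleUp(size);                               // restore the heap property
    size++;
}

private void bubbleUp(int index) {
    while (index > 0) {
        int parent = (index - 1) / 2;             // parent of index i is at (i - 1) / 2
        if (keys[index] <= keys[parent])
            break;                                // heap property holds; stop
        int temp = keys[index];                   // otherwise swap child and parent
        keys[index] = keys[parent];
        keys[parent] = temp;
        index = parent;
    }
}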

Page 42:

Max

public int max()

Find the maximum value in the priority queue

Hint: this method is really easy
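It is: with the array layout above, the largest key of a max-heap always sits at index 0, so a sketch is a one-liner (a more defensive version would check that size > 0 first).

public int max() {
    return keys[0];    // the root of a max-heap is its largest key
}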

Page 43:

Remove Max

public int removeMax()

- Store the value at the top of the heap (array index 0)
- Replace it with the last legal value in the array
- This value will generally need to be bubbled down, using the following helper method
- Bubbling down is harder than bubbling up, because you might have two legal children!

private void bubbleDown(int index)
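A matching sketch of removeMax and bubbleDown under the same assumptions as the insert sketch; bubbleDown compares the node with the larger of its (up to) two children at each step.

public int removeMax() {
    int max = keys[0];                 // store the value at the top of the heap
    size--;
    keys[0] = keys[size];              // replace it with the last legal value
    bubbleDown(0);                     // restore the heap property
    return max;
}

private void bubbleDown(int index) {
    while (true) {
        int left = 2 * index + 1;      // children of index i are at 2i + 1 and 2i + 2
        int right = 2 * index + 2;
        int largest = index;
        if (left < size && keys[left] > keys[largest])
            largest = left;
        if (right < size && keys[right] > keys[largest])
            largest = right;
        if (largest == index)
            break;                     // no child is larger; heap property holds
        int temp = keys[index];        // otherwise swap with the larger child
        keys[index] = keys[largest];
        keys[largest] = temp;
        index = largest;
    }
}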

Page 44:

Upcoming

Page 45:

Next time…

- Heapsort
- TimSort
- Counting sort
- Radix sort

Page 46:

Reminders

- Work on Project 4
- Work on Assignment 6
  - Due on Friday
- Read sections 2.4 and 5.1