Algorithm Analysis Guide
Petar Petrov
10.12.2015
 Algorithm Analysis
 Sorting
 Searching
 Data Structures
 An algorithm is a set of instructions to be
followed to solve a problem.
 Correctness
 Finiteness
 Definiteness
 Input
 Output
 Effectiveness
There are two aspects of algorithmic
performance:
 Time
 Space
 First, we count the number of basic
operations in a particular solution to assess its
efficiency.
 Then, we will express the efficiency of algorithms
using growth functions.
 We measure an algorithm’s time requirement
as a function of the problem size.
 The most important thing to learn is how
quickly the algorithm’s time requirement
grows as a function of the problem size.
 An algorithm’s proportional time requirement
is known as growth rate.
 We can compare the efficiency of two
algorithms by comparing their growth rates.
 Each operation in an algorithm (or a program) has a
cost.
 Each operation takes a certain amount of time.
count = count + 1;  takes a certain amount of time, but that
time is constant
A sequence of operations:
count = count + 1; Cost: c1
sum = sum + count; Cost: c2
 Total Cost = c1 + c2
Example: Simple If-Statement
Cost Times
if (n < 0) c1 1
absval = -n c2 1
else
absval = n; c3 1
Total Cost <= c1 + max(c2,c3)
Example: Simple Loop
Cost Times
i = 1; c1 1
sum = 0; c2 1
while (i <= n) { c3 n+1
i = i + 1; c4 n
sum = sum + i; c5 n
}
Total Cost = c1 + c2 + (n+1)*c3 + n*c4 + n*c5
 The time required for this algorithm is proportional
to n
Example: Nested Loop
Cost Times
i=1; c1 1
sum = 0; c2 1
while (i <= n) { c3 n+1
j=1; c4 n
while (j <= n) { c5 n*(n+1)
sum = sum + i; c6 n*n
j = j + 1; c7 n*n
}
i = i +1; c8 n
}
Total Cost = c1 + c2 + (n+1)*c3 + n*c4 +
n*(n+1)*c5+n*n*c6+n*n*c7+n*c8
 The time required for this algorithm is proportional to n²
 Consecutive Statements
 If/Else
 Loops
 Nested Loops
 Informal definitions:
◦ Given a complexity function f(n),
◦ O(f(n)) is the set of complexity functions that are
upper bounds on f(n)
◦ Ω(f(n)) is the set of complexity functions that are
lower bounds on f(n)
◦ Θ(f(n)) is the set of complexity functions that,
given the correct constants, correctly describe f(n)
 Example: If f(n) = 17n³ + 4n - 12, then
◦ O(f(n)) contains n³, n⁴, n⁵, 2ⁿ, etc.
◦ Ω(f(n)) contains 1, n, n², n³, log n, n log n, etc.
◦ Θ(f(n)) contains n³
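For the example, the Θ claim can be checked directly; a standard sanity
check, not part of the original slides:

f(n) = 17n³ + 4n - 12
Upper bound:  f(n) <= 17n³ + 4n³ = 21n³             for all n >= 1
Lower bound:  f(n) >= 16n³  (since n³ + 4n >= 12)   for all n >= 2
So 16·n³ <= f(n) <= 21·n³ for n >= 2, i.e. n³ describes f(n) up to
constant factors.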
Example: Simple If-Statement
Cost Times
if (n < 0) c1 1
absval = -n c2 1
else
absval = n; c3 1
Total Cost <= c1 + max(c2,c3)
O(1)
Example: Simple Loop
Cost Times
i = 1; c1 1
sum = 0; c2 1
while (i <= n) { c3 n+1
i = i + 1; c4 n
sum = sum + i; c5 n
}
Total Cost = c1 + c2 + (n+1)*c3 + n*c4 + n*c5
 The time required for this algorithm is proportional
to n
O(n)
Example: Nested Loop
Cost Times
i=1; c1 1
sum = 0; c2 1
while (i <= n) { c3 n+1
j=1; c4 n
while (j <= n) { c5 n*(n+1)
sum = sum + i; c6 n*n
j = j + 1; c7 n*n
}
i = i +1; c8 n
}
Total Cost = c1 + c2 + (n+1)*c3 + n*c4 +
n*(n+1)*c5+n*n*c6+n*n*c7+n*c8
 The time required for this algorithm is proportional to n²
O(n²)
Function    Growth Rate Name
c           Constant
log N       Logarithmic
log²N       Log-squared
N           Linear
N log N     Linearithmic
N²          Quadratic
N³          Cubic
2ᴺ          Exponential
 Input:
◦ A sequence of n numbers a1, a2, . . . , an
 Output:
◦ A permutation (reordering) a1’, a2’, . . . , an’ of the
input sequence such that a1’ ≤ a2’ ≤ · · · ≤ an’
 In-Place Sort
◦ The amount of extra space required to sort the data
is constant with the input size.
Sorted on first key:
Sort file on second key:
Records with key value
3 are not in order on
first key!!
 Stable sort
◦ preserves relative order of records with equal keys
 Idea: like sorting a hand of playing cards
◦ Start with an empty left hand and the cards facing
down on the table.
◦ Remove one card at a time from the table, and
insert it into the correct position in the left hand
◦ The cards held in the left hand are sorted
To insert 12, we need to
make room for it by moving
first 36 and then 24.
insertionsort (a) {
for (i = 1; i < a.length; ++i) {
key = a[i]
pos = i
while (pos > 0 && a[pos-1] > key) {
a[pos]=a[pos-1]
pos--
}
a[pos] = key
}
}
 O(n²), stable, in-place
 O(1) space
 Great with small number of elements
 Algorithm:
◦ Find the minimum value
◦ Swap with 1st position value
◦ Repeat with 2nd position down
 O(n²), in-place; the usual swap-based version is not stable
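A minimal sketch of those steps in Java (plain int array; the swap-based
variant described above):

// Selection sort: repeatedly select the minimum of the unsorted tail
// and swap it into the next position.
static void selectionSort(int[] a) {
    for (int i = 0; i < a.length - 1; i++) {
        int min = i;                                   // index of smallest value seen so far
        for (int j = i + 1; j < a.length; j++) {
            if (a[j] < a[min]) min = j;
        }
        int tmp = a[i]; a[i] = a[min]; a[min] = tmp;   // swap it into position i
    }
}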
 Algorithm
◦ Traverse the collection
◦ “Bubble” the largest value to the end using pairwise
comparisons and swapping
 O(n²), stable, in-place
 Totally useless?
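A minimal bubble sort sketch of the same idea, with the common early-exit
check added (an assumption, not shown on the slide):

// Bubble sort: each pass bubbles the largest remaining value to the end.
static void bubbleSort(int[] a) {
    for (int end = a.length - 1; end > 0; end--) {
        boolean swapped = false;
        for (int i = 0; i < end; i++) {
            if (a[i] > a[i + 1]) {                              // pairwise compare and swap
                int tmp = a[i]; a[i] = a[i + 1]; a[i + 1] = tmp;
                swapped = true;
            }
        }
        if (!swapped) return;   // no swaps in a full pass: already sorted
    }
}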
1. Divide: split the array in two
halves
2. Conquer: Sort recursively both
subarrays
3. Combine: merge the two sorted
subarrays into a sorted array
mergesort (a, left, right) {
if (left < right) {
mid = (left + right)/2
mergesort (a, left, mid)
mergesort (a, mid+1, right)
merge(a, left, mid+1, right)
}
}
 The key to Merge Sort is merging two sorted
lists into one, such that if you have two lists
X = (x1, x2, …, xm) and Y = (y1, y2, …, yn), the
resulting sorted list is Z = (z1, z2, …, zm+n)
 Example:
L1 = { 3 8 9 } L2 = { 1 5 7 }
merge(L1, L2) = { 1 3 5 7 8 9 }
Merge trace (X and Y hold the remaining input, Result the output so far):

X: 3 10 23 54   Y: 1 5 25 75   Result:
X: 3 10 23 54   Y: 5 25 75     Result: 1
X: 10 23 54     Y: 5 25 75     Result: 1 3
X: 10 23 54     Y: 25 75       Result: 1 3 5
X: 23 54        Y: 25 75       Result: 1 3 5 10
X: 54           Y: 25 75       Result: 1 3 5 10 23
X: 54           Y: 75          Result: 1 3 5 10 23 25
X:              Y: 75          Result: 1 3 5 10 23 25 54
X:              Y:             Result: 1 3 5 10 23 25 54 75
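A sketch of that merge step, working on two separate arrays for clarity
(the mergesort above merges two halves of the same array instead):

// Merge two already-sorted arrays x and y into a new sorted array z.
static int[] merge(int[] x, int[] y) {
    int[] z = new int[x.length + y.length];
    int i = 0, j = 0, k = 0;
    while (i < x.length && j < y.length) {
        z[k++] = (x[i] <= y[j]) ? x[i++] : y[j++];   // take the smaller head element
    }
    while (i < x.length) z[k++] = x[i++];            // copy whatever remains in x
    while (j < y.length) z[k++] = y[j++];            // ... or in y
    return z;
}

Calling merge(new int[]{3, 10, 23, 54}, new int[]{1, 5, 25, 75}) reproduces
the trace above.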
Merge Sort example on 99 6 86 15 58 35 86 4 0:

Split:  99 6 86 15 | 58 35 86 4 0
        99 6 | 86 15 | 58 35 | 86 4 0
        99 | 6 | 86 | 15 | 58 | 35 | 86 | 4 0
        4 | 0
Merge:  4 + 0             -> 0 4
        99 + 6            -> 6 99
        86 + 15           -> 15 86
        58 + 35           -> 35 58
        86 + 0 4          -> 0 4 86
        6 99 + 15 86      -> 6 15 86 99
        35 58 + 0 4 86    -> 0 4 35 58 86
        6 15 86 99 + 0 4 35 58 86 -> 0 4 6 15 35 58 86 86 99
Merge Sort runs in O(N log N) in all cases, because of
its Divide and Conquer approach.
T(N) = 2T(N/2) + N = O(N log N)
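Expanding that recurrence shows where the N log N comes from (a standard
expansion, not spelled out on the slides):

T(N) = 2T(N/2) + N
     = 4T(N/4) + 2N
     = 8T(N/8) + 3N
     ...
     = 2^k T(N/2^k) + kN
     = N·T(1) + N·log₂N        (taking k = log₂N)
     = O(N log N)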
1. Select: pick an element x
2. Divide: rearrange elements so
that x goes to its final position
• L elements less than x
• G elements greater than or equal
to x
3. Conquer: sort recursively L and G
(Diagram: after the divide step the pivot x sits in its final position,
with the subarray L to its left and the subarray G to its right.)
quicksort (a, left, right) {
if (left < right) {
pivot = partition (a, left, right)
quicksort (a, left, pivot-1)
quicksort (a, pivot+1, right)
}
}
 How to pick a pivot?
 How to partition?
 Use the first element as pivot
◦ if the input is random, ok
◦ if the input is presorted, it degenerates; shuffle in advance
 Choose the pivot randomly
◦ generally safe
◦ random numbers generation can be expensive
 Use the median of the array
◦ Partitioning always cuts the array in half
◦ An optimal quicksort (O(n log n))
◦ hard to find the exact median (a chicken-and-egg problem?)
◦ Approximation to the exact median...
 Median of three
◦ Compare just three elements: the leftmost, the
rightmost and the center
◦ Use the middle of the three as pivot
 Given a pivot, partition the elements of the
array such that the resulting array consists of:
◦ One subarray that contains elements < pivot
◦ One subarray that contains elements >= pivot
 The subarrays are stored in the original array
Partition example: a = 40 20 10 80 60 50 7 30 100, pivot_index = 0
(pivot value 40); indexes run from [0] to [8].

1. while a[too_big_index] <= a[pivot_index]
   ++too_big_index
2. while a[too_small_index] > a[pivot_index]
   --too_small_index
3. if too_big_index < too_small_index
   swap a[too_big_index], a[too_small_index]
4. while too_small_index > too_big_index, go to 1.
5. swap a[too_small_index], a[pivot_index]

Trace (too_big_index scans right from index 1, too_small_index scans left
from index 8):

Pass 1: too_big_index stops at 80 (index 3), too_small_index stops at 30
        (index 7); swap them  -> 40 20 10 30 60 50 7 80 100
Pass 2: too_big_index stops at 60 (index 4), too_small_index stops at 7
        (index 6); swap them  -> 40 20 10 30 7 50 60 80 100
Pass 3: too_big_index stops at 50 (index 5), too_small_index stops at 7
        (index 4); the indexes have crossed, so scanning stops
Step 5: swap the pivot 40 with a[too_small_index] = 7
        -> 7 20 10 30 40 50 60 80 100, and the pivot ends at index 4
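A sketch of that partition step in Java; the variable names follow the
slides, and the pivot is assumed to be the first element of the range:

// Partition a[left..right] around pivot a[left]; return the pivot's final index.
static int partition(int[] a, int left, int right) {
    int pivotIndex = left;
    int tooBigIndex = left + 1;    // scans right for an element > pivot
    int tooSmallIndex = right;     // scans left for an element <= pivot
    while (true) {
        while (tooBigIndex <= right && a[tooBigIndex] <= a[pivotIndex]) tooBigIndex++;
        while (a[tooSmallIndex] > a[pivotIndex]) tooSmallIndex--;
        if (tooBigIndex >= tooSmallIndex) break;     // indexes crossed: stop scanning
        int tmp = a[tooBigIndex]; a[tooBigIndex] = a[tooSmallIndex]; a[tooSmallIndex] = tmp;
    }
    // place the pivot between the two parts and report its position
    int tmp = a[pivotIndex]; a[pivotIndex] = a[tooSmallIndex]; a[tooSmallIndex] = tmp;
    return tooSmallIndex;
}

For the example array it returns 4, matching the quicksort pseudocode's
pivot = partition(a, left, right).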
 Running time
◦ pivot selection: constant time, i.e. O(1)
◦ partitioning: linear time, i.e. O(N)
◦ running time of the two recursive calls
 T(N)=T(i)+T(N-i-1)+cN where c is a
constant
◦ i: number of elements in L
 What will be the worst case?
◦ The pivot is the smallest element, all the time
◦ Partition is always unbalanced
 What will be the best case?
◦ Partition is perfectly balanced.
◦ Pivot is always in the middle (median of the array)
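Sketching both cases with the recurrence above (standard analysis, not
worked out on the slides):

Worst case (pivot is always the smallest element, so i = 0):
  T(N) = T(N-1) + cN  =>  T(N) = c(N + (N-1) + ... + 1) = O(N²)

Best case (perfectly balanced partition, i ≈ N/2):
  T(N) = 2T(N/2) + cN  =>  T(N) = O(N log N)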
 Java API provides a class Arrays with several
overloaded sort methods for different array
types
 Class Collections provides similar sorting
methods
Arrays methods:
public static void sort (int[] a)
public static void sort (Object[] a)
// requires Comparable
public static <T> void sort (T[] a,
Comparator<? super T> comp)
// uses given Comparator
Collections methods:
public static <T extends Comparable<T>>
void sort (List<T> list)
public static <T> void sort (List<T> l,
Comparator<? super T> comp)
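A small usage sketch of those methods (the sample data is made up):

import java.util.*;

public class SortDemo {
    public static void main(String[] args) {
        int[] numbers = {40, 20, 10, 80, 60};
        Arrays.sort(numbers);                       // natural order: 10 20 40 60 80

        List<String> names = new ArrayList<>(List.of("Ivan", "Anna", "Petar"));
        Collections.sort(names);                    // Comparable order: Anna, Ivan, Petar

        // custom order via a Comparator: longest name first
        names.sort(Comparator.comparingInt(String::length).reversed());
        System.out.println(Arrays.toString(numbers) + " " + names);
    }
}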
 Given the collection and an element to
find…
 Determine whether the “target”
element was found in the collection
◦ Print a message
◦ Return a value
(an index or pointer, etc.)
 Don’t modify the collection in the
search!
 A search traverses the collection until
◦ the desired element is found
◦ or the collection is exhausted
linearsearch (a, key) {
for (i = 0; i < a.length; i++) {
if (a[i] == key) return i
}
return -1
}
Linear search trace over 40 20 10 30 7:

Search for 20:  40 != 20,  20 = 20  -> found, return index 1
Search for 5:   40 != 5, 20 != 5, 10 != 5, 30 != 5, 7 != 5  -> return -1
 O(n)
 Examines every item in the worst case
 Locates a target value in a sorted array/list
by successively eliminating half of the array
on each step
binarysearch (a, low, high, key) {
while (low <= high) {
mid = (low+high) >>> 1
midVal = a[mid]
if (midVal < key) low=mid+1
else if (midVal > key) high=mid-1
else return mid
}
return -(low + 1)
}
Binary search trace over the sorted array 1 3 4 6 7 8 10 13 14:

Search for 4:
  left=0, right=8, mid=4 -> a[4]=7,  4 < 7  -> search the left half
  left=0, right=3, mid=1 -> a[1]=3,  4 > 3  -> search the right half
  left=2, right=3, mid=2 -> a[2]=4,  4 = 4  -> found, return index 2

Search for 9:
  left=0, right=8, mid=4 -> a[4]=7,  9 > 7   -> search the right half
  left=5, right=8, mid=6 -> a[6]=10, 9 < 10  -> search the left half
  left=5, right=5, mid=5 -> a[5]=8,  9 > 8   -> left=6
  right < left -> not found, return -7
 Requires a sorted array/list
 O(log n)
 Divide and conquer
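For reference, java.util.Arrays provides the same algorithm; a small usage
sketch (the array values are made up):

import java.util.Arrays;

public class BinarySearchDemo {
    public static void main(String[] args) {
        int[] a = {1, 3, 4, 6, 7, 8, 10, 13, 14};        // must already be sorted
        System.out.println(Arrays.binarySearch(a, 4));   // 2: index of the key
        int missing = Arrays.binarySearch(a, 9);         // negative: key not present
        System.out.println(missing);                     // -7, i.e. -(insertionPoint) - 1
        System.out.println(-missing - 1);                // 6: where 9 would be inserted
    }
}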
(Diagram: the core interfaces and their common implementations; interfaces
extend interfaces, classes implement them.)

Collection (interface)
  List (interface)         - implemented by LinkedList and ArrayList
  Set (interface)          - implemented by HashSet
    SortedSet (interface)  - implemented by TreeSet
Map (interface)            - implemented by HashMap
  SortedMap (interface)    - implemented by TreeMap
 Set
◦ The familiar set abstraction.
◦ No duplicates; May or may not be ordered.
 List
◦ Ordered collection, also known as a sequence.
◦ Duplicates permitted; Allows positional access
 Map
◦ A mapping from keys to values.
◦ Each key can map to at most one value (function).
Set             List         Map
HashSet         ArrayList    HashMap
LinkedHashSet   LinkedList   LinkedHashMap
TreeSet         Vector       Hashtable
                             TreeMap
 Ordered
◦ Elements are stored and accessed in a specific
order
 Sorted
◦ Elements are stored and accessed in a sorted
order
 Indexed
◦ Elements can be accessed using an index
 Unique
◦ Collection does not allow duplicates
 A linked list is a series of connected nodes
 Each node contains at least
◦ A piece of data (any type)
◦ Pointer to the next node in the list
 Head: pointer to the first node
 The last node points to NULL
(Diagram: a singly linked list Head -> A -> B -> C -> NULL; each node
holds a piece of data plus a pointer to the next node. Inserting a new
node D means pointing D at its successor and repointing the predecessor
at D; deleting the first node A means moving Head to the next node,
giving Head -> B -> C -> NULL. In a doubly linked list each node also
stores a previous pointer, and a Tail pointer references the last node.)
Operation                  Complexity
insert at beginning        O(1)
insert at end              O(1)
insert at index            O(n)
delete at beginning        O(1)
delete at end              O(1)
delete at index            O(n)
find element               O(n)
access element by index    O(n)
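A minimal singly linked list sketch of the structure above (an
illustration, not the java.util.LinkedList implementation, which is
doubly linked):

class SinglyLinkedList<T> {
    private static class Node<T> {
        T data; Node<T> next;          // a piece of data + pointer to the next node
        Node(T data, Node<T> next) { this.data = data; this.next = next; }
    }
    private Node<T> head;              // pointer to the first node

    void addFirst(T value) {           // insert at beginning: O(1)
        head = new Node<>(value, head);
    }

    boolean contains(T value) {        // find element: O(n)
        for (Node<T> n = head; n != null; n = n.next) {
            if (n.data.equals(value)) return true;
        }
        return false;
    }
}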
 Resizable-array implementation of the List
interface
 capacity vs. size
(Diagram: an ArrayList holding A B C in a backing array of capacity 4 has
capacity > size; adding D makes capacity = size; adding E forces the
backing array to grow and the elements to be copied over.)
Operation                  Complexity
insert at beginning        O(n)
insert at end              O(1) amortized
insert at index            O(n)
delete at beginning        O(n)
delete at end              O(1)
delete at index            O(n)
find element               O(n)
access element by index    O(1)
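A small sketch of the capacity vs. size distinction (the initial capacity
value is chosen arbitrarily for the example):

import java.util.ArrayList;

public class ArrayListDemo {
    public static void main(String[] args) {
        ArrayList<String> list = new ArrayList<>(4);   // capacity 4, size 0
        list.add("A"); list.add("B"); list.add("C");   // size 3, no resize yet
        list.add("D");                                 // size 4 = capacity
        list.add("E");    // exceeds capacity: backing array grows and is copied once, O(n)
        System.out.println(list.size());               // 5 (capacity itself is not exposed)
        System.out.println(list.get(0));               // "A": O(1) access by index
    }
}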
 Some collections are constrained so clients
can only use optimized operations
◦ stack: retrieves elements in reverse order as added
◦ queue: retrieves elements in same order as added
(Diagram: a stack with 1 at the bottom and 3 on top, where push, pop and
peek all work at the top; and a queue 1 2 3 with remove and peek at the
front and add at the back.)
 stack: A collection based on the principle of
adding elements and retrieving them in the
opposite order.
 basic stack operations:
◦ push: Add an element to the top.
◦ pop: Remove the top element.
◦ peek: Examine the top element.
(Diagram: elements 1, 2, 3 stacked with 3 on top; push adds at the top,
pop and peek operate on the top element.)
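A quick sketch using java.util.ArrayDeque as a stack (commonly used
instead of the legacy java.util.Stack class):

import java.util.ArrayDeque;
import java.util.Deque;

public class StackDemo {
    public static void main(String[] args) {
        Deque<Integer> stack = new ArrayDeque<>();
        stack.push(1);                      // bottom
        stack.push(2);
        stack.push(3);                      // top
        System.out.println(stack.peek());   // 3: examine the top element
        System.out.println(stack.pop());    // 3: remove the top element
        System.out.println(stack.pop());    // 2: reverse of insertion order
    }
}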
 Programming languages and compilers:
◦ method call stack
 Matching up related pairs of things:
◦ check correctness of brackets (){}[]
 Sophisticated algorithms:
◦ undo stack
 queue: Retrieves elements in the order they
were added.
 basic queue operations:
◦ add (enqueue): Add an element to the back.
◦ remove (dequeue): Remove the front element.
◦ peek: Examine the front element.
(Diagram: elements 1, 2, 3 queued with 1 at the front; add appends at the
back, remove and peek work at the front.)
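And the same class used as a FIFO queue:

import java.util.ArrayDeque;
import java.util.Queue;

public class QueueDemo {
    public static void main(String[] args) {
        Queue<Integer> queue = new ArrayDeque<>();
        queue.add(1);                        // front
        queue.add(2);
        queue.add(3);                        // back
        System.out.println(queue.peek());    // 1: examine the front element
        System.out.println(queue.remove());  // 1: remove the front element
        System.out.println(queue.remove());  // 2: same order as added
    }
}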
 Operating systems:
◦ queue of print jobs to send to the printer
 Programming:
◦ modeling a line of customers or clients
 Real world examples:
◦ people on an escalator or waiting in a line
◦ cars at a gas station
 A data structure optimized for a very
specific kind of search / access
 In a map we access by asking "give me the
value associated with this key."
 capacity, load factor
(Diagram: a hash function maps keys of any type, e.g. "Ivan",
"Ivan Ivanov", the date 5/5/1967, ivan@gmail.net or the character A -> 65,
to numeric hash values such as 555389085, 5122466556 or 12.)
 Implements Map
 Fast put, get operations
 hashCode(), equals()
(Diagram: a table with buckets 0-5; the key "BG" has hashCode() 2117, and
2117 % 6 = 5, so the entry ("BG", "359") is stored in bucket 5.)
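A small HashMap usage sketch (key/value data made up); keys depend on
consistent hashCode() and equals() implementations:

import java.util.HashMap;
import java.util.Map;

public class HashMapDemo {
    public static void main(String[] args) {
        Map<String, String> dialCodes = new HashMap<>();  // default capacity 16, load factor 0.75
        dialCodes.put("BG", "359");       // bucket chosen from "BG".hashCode()
        dialCodes.put("DE", "49");
        System.out.println(dialCodes.get("BG"));          // "359", expected O(1)
        System.out.println(dialCodes.containsKey("FR"));  // false
    }
}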
 What to do when inserting an element and
something is already present at that position?
 Could search forward or backwards for an
open space
 Linear probing
◦ move forward 1 spot; if occupied, try 2 spots, 3 spots, ...
 Quadratic probing
◦ probe offsets grow quadratically: 1, 4, 9, 16, ... spots
 Resize when load factor reaches some limit
 Each element of the hash table can be another data
structure
◦ LinkedList
◦ Balanced Binary Tree
 Resize at given load factor or when any chain
reaches some limit
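A minimal separate-chaining sketch under simplifying assumptions (fixed
bucket count, no resizing; not how java.util.HashMap is actually written):

import java.util.LinkedList;

class ChainedMap<K, V> {
    private static class Entry<K, V> {
        K key; V value;
        Entry(K k, V v) { key = k; value = v; }
    }

    private final LinkedList<Entry<K, V>>[] buckets;

    @SuppressWarnings("unchecked")
    ChainedMap(int bucketCount) {
        buckets = new LinkedList[bucketCount];
        for (int i = 0; i < bucketCount; i++) buckets[i] = new LinkedList<>();
    }

    private LinkedList<Entry<K, V>> bucketFor(K key) {
        return buckets[Math.floorMod(key.hashCode(), buckets.length)];  // hash -> bucket index
    }

    void put(K key, V value) {
        for (Entry<K, V> e : bucketFor(key)) {
            if (e.key.equals(key)) { e.value = value; return; }  // key already present
        }
        bucketFor(key).add(new Entry<>(key, value));             // collision: grow the chain
    }

    V get(K key) {
        for (Entry<K, V> e : bucketFor(key)) {
            if (e.key.equals(key)) return e.value;
        }
        return null;
    }
}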
 Implements Map
 Sorted
 Easy access to the smallest and biggest keys
 logarithmic put, get
 Comparable or Comparator
 0, 1, or 2 children per node
 Binary Search Tree
◦ node.left < node.value
◦ node.right >= node.value
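A small TreeMap usage sketch (data made up):

import java.util.TreeMap;

public class TreeMapDemo {
    public static void main(String[] args) {
        TreeMap<Integer, String> scores = new TreeMap<>();  // keys kept in sorted order
        scores.put(15, "Ivan");
        scores.put(4, "Petar");
        scores.put(42, "Anna");
        System.out.println(scores.firstKey());   // 4: smallest key, O(log n)
        System.out.println(scores.lastKey());    // 42: biggest key
        System.out.println(scores.keySet());     // [4, 15, 42]: sorted iteration
    }
}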
 A priority queue stores a collection of entries
 Main methods of the Priority Queue ADT
◦ insert(k, x)
inserts an entry with key k and value x
◦ removeMin()
removes and returns the entry with smallest key
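For reference, java.util.PriorityQueue provides these operations (insert
corresponds to offer/add, removeMin to poll):

import java.util.PriorityQueue;

public class PriorityQueueDemo {
    public static void main(String[] args) {
        PriorityQueue<Integer> pq = new PriorityQueue<>();  // min-heap by natural order
        pq.offer(15);
        pq.offer(4);
        pq.offer(42);
        System.out.println(pq.peek());   // 4: smallest key, without removing it
        System.out.println(pq.poll());   // 4: removes and returns the smallest key
        System.out.println(pq.poll());   // 15
    }
}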
 A heap can be seen as a complete binary tree (missing nodes on the
last level can be thought of as padding):

   level 0:  16
   level 1:  14 10
   level 2:  8 7 9 3
   level 3:  2 4 1

 In practice, heaps are usually implemented as arrays:

   16 14 10 8 7 9 3 2 4 1
 To represent a complete binary tree as an
array:
◦ The root node is A[1]
◦ Node i is A[i]
◦ The parent of node i is A[i/2] (note: integer divide)
◦ The left child of node i is A[2i]
◦ The right child of node i is A[2i + 1]
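The same index arithmetic in code, assuming the 1-based convention above
(a 0-based array would use 2i+1 and 2i+2 for the children):

// Heap navigation for a 1-based array layout (index 0 unused).
static int parent(int i) { return i / 2; }      // integer divide
static int left(int i)   { return 2 * i; }
static int right(int i)  { return 2 * i + 1; }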
Example: restoring the heap property ("sifting down") when the value 4
is out of place at the root of the left subtree:

   16 4 10 14 7 9 3 2 8 1     4 is smaller than its children 14 and 7
   16 14 10 4 7 9 3 2 8 1     swap 4 with its larger child 14
   16 14 10 8 7 9 3 2 4 1     swap 4 with its larger child 8; heap restored

Final heap as a tree: 16 / 14 10 / 8 7 9 3 / 2 4 1
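A sketch of that sift-down step for a max-heap stored 1-based in an array
(this mirrors the textbook MAX-HEAPIFY loop, not a particular library):

// Sift the value at index i down until the max-heap property holds again.
static void siftDown(int[] a, int heapSize, int i) {
    while (true) {
        int largest = i;
        int l = 2 * i, r = 2 * i + 1;                 // children in the 1-based layout
        if (l <= heapSize && a[l] > a[largest]) largest = l;
        if (r <= heapSize && a[r] > a[largest]) largest = r;
        if (largest == i) return;                     // both children are smaller: done
        int tmp = a[i]; a[i] = a[largest]; a[largest] = tmp;
        i = largest;                                  // continue one level down
    }
}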
 java.util.Collections
 java.util.Arrays exports similar basic operations for an
array.

binarySearch(list, key)     Finds key in a sorted list using binary search.
sort(list)                  Sorts a list into ascending order.
min(list)                   Returns the smallest value in a list.
max(list)                   Returns the largest value in a list.
reverse(list)               Reverses the order of elements in a list.
shuffle(list)               Randomly rearranges the elements in a list.
swap(list, p1, p2)          Exchanges the elements at index positions p1 and p2.
replaceAll(list, x1, x2)    Replaces all elements matching x1 with x2.
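A short usage sketch of a few of these helpers (data made up):

import java.util.*;

public class CollectionsDemo {
    public static void main(String[] args) {
        List<Integer> list = new ArrayList<>(List.of(40, 20, 10, 30, 7));
        Collections.sort(list);                                  // [7, 10, 20, 30, 40]
        System.out.println(Collections.binarySearch(list, 20));  // 2
        System.out.println(Collections.min(list) + " " + Collections.max(list));  // 7 40
        Collections.reverse(list);                               // [40, 30, 20, 10, 7]
        Collections.swap(list, 0, 4);                            // [7, 30, 20, 10, 40]
        System.out.println(list);
    }
}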
Algorithm Analysis Guide

Contenu connexe

Tendances

Advanced data structure
Advanced data structureAdvanced data structure
Advanced data structureShakil Ahmed
 
finding Min and max element from given array using divide & conquer
finding Min and max element from given array using  divide & conquer finding Min and max element from given array using  divide & conquer
finding Min and max element from given array using divide & conquer Swati Kulkarni Jaipurkar
 
Data Structure: Algorithm and analysis
Data Structure: Algorithm and analysisData Structure: Algorithm and analysis
Data Structure: Algorithm and analysisDr. Rajdeep Chatterjee
 
19. algorithms and-complexity
19. algorithms and-complexity19. algorithms and-complexity
19. algorithms and-complexityashishtinku
 
Advanced data structures slide 1 2
Advanced data structures slide 1 2Advanced data structures slide 1 2
Advanced data structures slide 1 2jomerson remorosa
 
Lowest common ancestor
Lowest common ancestorLowest common ancestor
Lowest common ancestorShakil Ahmed
 
LeetCode April Coding Challenge
LeetCode April Coding ChallengeLeetCode April Coding Challenge
LeetCode April Coding ChallengeSunil Yadav
 
Monadic Comprehensions and Functional Composition with Query Expressions
Monadic Comprehensions and Functional Composition with Query ExpressionsMonadic Comprehensions and Functional Composition with Query Expressions
Monadic Comprehensions and Functional Composition with Query ExpressionsChris Eargle
 
Leet Code May Coding Challenge - DataStructure and Algorithm Problems
Leet Code May Coding Challenge - DataStructure and Algorithm ProblemsLeet Code May Coding Challenge - DataStructure and Algorithm Problems
Leet Code May Coding Challenge - DataStructure and Algorithm ProblemsSunil Yadav
 
M|18 User Defined Functions
M|18 User Defined FunctionsM|18 User Defined Functions
M|18 User Defined FunctionsMariaDB plc
 

Tendances (20)

Advanced data structure
Advanced data structureAdvanced data structure
Advanced data structure
 
Sorting ppt
Sorting pptSorting ppt
Sorting ppt
 
finding Min and max element from given array using divide & conquer
finding Min and max element from given array using  divide & conquer finding Min and max element from given array using  divide & conquer
finding Min and max element from given array using divide & conquer
 
Lec23
Lec23Lec23
Lec23
 
Data Structure: Algorithm and analysis
Data Structure: Algorithm and analysisData Structure: Algorithm and analysis
Data Structure: Algorithm and analysis
 
19. algorithms and-complexity
19. algorithms and-complexity19. algorithms and-complexity
19. algorithms and-complexity
 
Advanced data structures slide 1 2
Advanced data structures slide 1 2Advanced data structures slide 1 2
Advanced data structures slide 1 2
 
Algorithm
AlgorithmAlgorithm
Algorithm
 
Priority queues
Priority queuesPriority queues
Priority queues
 
Lowest common ancestor
Lowest common ancestorLowest common ancestor
Lowest common ancestor
 
Lec2
Lec2Lec2
Lec2
 
LeetCode April Coding Challenge
LeetCode April Coding ChallengeLeetCode April Coding Challenge
LeetCode April Coding Challenge
 
Monadic Comprehensions and Functional Composition with Query Expressions
Monadic Comprehensions and Functional Composition with Query ExpressionsMonadic Comprehensions and Functional Composition with Query Expressions
Monadic Comprehensions and Functional Composition with Query Expressions
 
Lec16
Lec16Lec16
Lec16
 
Leet Code May Coding Challenge - DataStructure and Algorithm Problems
Leet Code May Coding Challenge - DataStructure and Algorithm ProblemsLeet Code May Coding Challenge - DataStructure and Algorithm Problems
Leet Code May Coding Challenge - DataStructure and Algorithm Problems
 
Arrays
ArraysArrays
Arrays
 
Unit 2 dsa LINEAR DATA STRUCTURE
Unit 2 dsa LINEAR DATA STRUCTUREUnit 2 dsa LINEAR DATA STRUCTURE
Unit 2 dsa LINEAR DATA STRUCTURE
 
Oracle sql ppt2
Oracle sql ppt2Oracle sql ppt2
Oracle sql ppt2
 
Recursive squaring
Recursive squaringRecursive squaring
Recursive squaring
 
M|18 User Defined Functions
M|18 User Defined FunctionsM|18 User Defined Functions
M|18 User Defined Functions
 

En vedette

History of Google Local from 2004-2011
History of Google Local from 2004-2011 History of Google Local from 2004-2011
History of Google Local from 2004-2011 Mike Blumenthal
 
Introduction to "Origins, Evolution & Creation"
Introduction to "Origins, Evolution & Creation"Introduction to "Origins, Evolution & Creation"
Introduction to "Origins, Evolution & Creation"John Lynch
 
Was There A Darwinian Revolution
Was There A Darwinian RevolutionWas There A Darwinian Revolution
Was There A Darwinian RevolutionJohn Lynch
 
Google at a glance 1998 - 2008
Google at a glance 1998 - 2008Google at a glance 1998 - 2008
Google at a glance 1998 - 2008Andreas Jaffke
 
A history of science (volume 1)
A history of science (volume 1) A history of science (volume 1)
A history of science (volume 1) Dipoceanov Esrever
 
National Air And Space Museum Washington DC
National Air And Space Museum Washington DCNational Air And Space Museum Washington DC
National Air And Space Museum Washington DCShivakumar Patil
 
History of Creationism, Parts II & III
History of Creationism, Parts II & IIIHistory of Creationism, Parts II & III
History of Creationism, Parts II & IIIJohn Lynch
 
Google Algorithm Change History - 2k14-2k16.
Google Algorithm Change History - 2k14-2k16.Google Algorithm Change History - 2k14-2k16.
Google Algorithm Change History - 2k14-2k16.Saba SEO
 
Introduction to Information Technology ch 01_b
Introduction to Information Technology ch 01_bIntroduction to Information Technology ch 01_b
Introduction to Information Technology ch 01_bShahi Raz Akhtar
 
Introduction to Information Technology ch 02_a
Introduction to Information Technology ch 02_aIntroduction to Information Technology ch 02_a
Introduction to Information Technology ch 02_aShahi Raz Akhtar
 
Ancient Ideas of Creation & Evolution
Ancient Ideas of Creation & EvolutionAncient Ideas of Creation & Evolution
Ancient Ideas of Creation & EvolutionJohn Lynch
 
Dc parent 14 2
Dc parent 14 2Dc parent 14 2
Dc parent 14 2mtaft
 

En vedette (20)

History of Google Local from 2004-2011
History of Google Local from 2004-2011 History of Google Local from 2004-2011
History of Google Local from 2004-2011
 
Chapter one
Chapter oneChapter one
Chapter one
 
Introduction to "Origins, Evolution & Creation"
Introduction to "Origins, Evolution & Creation"Introduction to "Origins, Evolution & Creation"
Introduction to "Origins, Evolution & Creation"
 
CSS 3, Style and Beyond
CSS 3, Style and BeyondCSS 3, Style and Beyond
CSS 3, Style and Beyond
 
Sorting algorithms
Sorting algorithmsSorting algorithms
Sorting algorithms
 
Was There A Darwinian Revolution
Was There A Darwinian RevolutionWas There A Darwinian Revolution
Was There A Darwinian Revolution
 
Google at a glance 1998 - 2008
Google at a glance 1998 - 2008Google at a glance 1998 - 2008
Google at a glance 1998 - 2008
 
A history of science (volume 1)
A history of science (volume 1) A history of science (volume 1)
A history of science (volume 1)
 
sPen Data Management
sPen Data ManagementsPen Data Management
sPen Data Management
 
Sorting pnk
Sorting pnkSorting pnk
Sorting pnk
 
National Air And Space Museum Washington DC
National Air And Space Museum Washington DCNational Air And Space Museum Washington DC
National Air And Space Museum Washington DC
 
History of Creationism, Parts II & III
History of Creationism, Parts II & IIIHistory of Creationism, Parts II & III
History of Creationism, Parts II & III
 
Algorithms - Aaron Bloomfield
Algorithms - Aaron BloomfieldAlgorithms - Aaron Bloomfield
Algorithms - Aaron Bloomfield
 
simple-sorting algorithms
simple-sorting algorithmssimple-sorting algorithms
simple-sorting algorithms
 
Google Algorithm Change History - 2k14-2k16.
Google Algorithm Change History - 2k14-2k16.Google Algorithm Change History - 2k14-2k16.
Google Algorithm Change History - 2k14-2k16.
 
Introduction to Information Technology ch 01_b
Introduction to Information Technology ch 01_bIntroduction to Information Technology ch 01_b
Introduction to Information Technology ch 01_b
 
Introduction to Information Technology ch 02_a
Introduction to Information Technology ch 02_aIntroduction to Information Technology ch 02_a
Introduction to Information Technology ch 02_a
 
Ancient Ideas of Creation & Evolution
Ancient Ideas of Creation & EvolutionAncient Ideas of Creation & Evolution
Ancient Ideas of Creation & Evolution
 
Ds 4
Ds 4Ds 4
Ds 4
 
Dc parent 14 2
Dc parent 14 2Dc parent 14 2
Dc parent 14 2
 

Similaire à Algorithm Analysis Guide

Algorithms - "Chapter 2 getting started"
Algorithms - "Chapter 2 getting started"Algorithms - "Chapter 2 getting started"
Algorithms - "Chapter 2 getting started"Ra'Fat Al-Msie'deen
 
Insersion & Bubble Sort in Algoritm
Insersion & Bubble Sort in AlgoritmInsersion & Bubble Sort in Algoritm
Insersion & Bubble Sort in AlgoritmEhsan Ehrari
 
Advance data structure & algorithm
Advance data structure & algorithmAdvance data structure & algorithm
Advance data structure & algorithmK Hari Shankar
 
Design and Analysis of Algorithms Lecture Notes
Design and Analysis of Algorithms Lecture NotesDesign and Analysis of Algorithms Lecture Notes
Design and Analysis of Algorithms Lecture NotesSreedhar Chowdam
 
lecture7.ppt
lecture7.pptlecture7.ppt
lecture7.pptEdFeranil
 
Unit-1 DAA_Notes.pdf
Unit-1 DAA_Notes.pdfUnit-1 DAA_Notes.pdf
Unit-1 DAA_Notes.pdfAmayJaiswal4
 
Analysis and design of algorithms part2
Analysis and design of algorithms part2Analysis and design of algorithms part2
Analysis and design of algorithms part2Deepak John
 
3.8 quick sort
3.8 quick sort3.8 quick sort
3.8 quick sortKrish_ver2
 
1_Asymptotic_Notation_pptx.pptx
1_Asymptotic_Notation_pptx.pptx1_Asymptotic_Notation_pptx.pptx
1_Asymptotic_Notation_pptx.pptxpallavidhade2
 
CSE680-07QuickSort.pptx
CSE680-07QuickSort.pptxCSE680-07QuickSort.pptx
CSE680-07QuickSort.pptxDeepakM509554
 
Algorithm And analysis Lecture 03& 04-time complexity.
 Algorithm And analysis Lecture 03& 04-time complexity. Algorithm And analysis Lecture 03& 04-time complexity.
Algorithm And analysis Lecture 03& 04-time complexity.Tariq Khan
 
An Experiment to Determine and Compare Practical Efficiency of Insertion Sort...
An Experiment to Determine and Compare Practical Efficiency of Insertion Sort...An Experiment to Determine and Compare Practical Efficiency of Insertion Sort...
An Experiment to Determine and Compare Practical Efficiency of Insertion Sort...Tosin Amuda
 
Algorithim lec1.pptx
Algorithim lec1.pptxAlgorithim lec1.pptx
Algorithim lec1.pptxrediet43
 
Algorithms with-java-advanced-1.0
Algorithms with-java-advanced-1.0Algorithms with-java-advanced-1.0
Algorithms with-java-advanced-1.0BG Java EE Course
 
Data Structure & Algorithms - Mathematical
Data Structure & Algorithms - MathematicalData Structure & Algorithms - Mathematical
Data Structure & Algorithms - Mathematicalbabuk110
 

Similaire à Algorithm Analysis Guide (20)

Algorithms - "Chapter 2 getting started"
Algorithms - "Chapter 2 getting started"Algorithms - "Chapter 2 getting started"
Algorithms - "Chapter 2 getting started"
 
Insersion & Bubble Sort in Algoritm
Insersion & Bubble Sort in AlgoritmInsersion & Bubble Sort in Algoritm
Insersion & Bubble Sort in Algoritm
 
Advance data structure & algorithm
Advance data structure & algorithmAdvance data structure & algorithm
Advance data structure & algorithm
 
Design and Analysis of Algorithms Lecture Notes
Design and Analysis of Algorithms Lecture NotesDesign and Analysis of Algorithms Lecture Notes
Design and Analysis of Algorithms Lecture Notes
 
lecture7.ppt
lecture7.pptlecture7.ppt
lecture7.ppt
 
Unit-1 DAA_Notes.pdf
Unit-1 DAA_Notes.pdfUnit-1 DAA_Notes.pdf
Unit-1 DAA_Notes.pdf
 
Analysis and design of algorithms part2
Analysis and design of algorithms part2Analysis and design of algorithms part2
Analysis and design of algorithms part2
 
Sorting
SortingSorting
Sorting
 
Unit 7 sorting
Unit 7   sortingUnit 7   sorting
Unit 7 sorting
 
3.8 quick sort
3.8 quick sort3.8 quick sort
3.8 quick sort
 
Cis435 week01
Cis435 week01Cis435 week01
Cis435 week01
 
1_Asymptotic_Notation_pptx.pptx
1_Asymptotic_Notation_pptx.pptx1_Asymptotic_Notation_pptx.pptx
1_Asymptotic_Notation_pptx.pptx
 
CSE680-07QuickSort.pptx
CSE680-07QuickSort.pptxCSE680-07QuickSort.pptx
CSE680-07QuickSort.pptx
 
Data Structures 6
Data Structures 6Data Structures 6
Data Structures 6
 
Algorithm And analysis Lecture 03& 04-time complexity.
 Algorithm And analysis Lecture 03& 04-time complexity. Algorithm And analysis Lecture 03& 04-time complexity.
Algorithm And analysis Lecture 03& 04-time complexity.
 
An Experiment to Determine and Compare Practical Efficiency of Insertion Sort...
An Experiment to Determine and Compare Practical Efficiency of Insertion Sort...An Experiment to Determine and Compare Practical Efficiency of Insertion Sort...
An Experiment to Determine and Compare Practical Efficiency of Insertion Sort...
 
Algorithim lec1.pptx
Algorithim lec1.pptxAlgorithim lec1.pptx
Algorithim lec1.pptx
 
Algorithms with-java-advanced-1.0
Algorithms with-java-advanced-1.0Algorithms with-java-advanced-1.0
Algorithms with-java-advanced-1.0
 
Data Structure & Algorithms - Mathematical
Data Structure & Algorithms - MathematicalData Structure & Algorithms - Mathematical
Data Structure & Algorithms - Mathematical
 
Lec35
Lec35Lec35
Lec35
 

Dernier

A Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy ReformA Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy ReformChameera Dedduwage
 
Science 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its CharacteristicsScience 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its CharacteristicsKarinaGenton
 
Sanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdfSanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdfsanyamsingh5019
 
Concept of Vouching. B.Com(Hons) /B.Compdf
Concept of Vouching. B.Com(Hons) /B.CompdfConcept of Vouching. B.Com(Hons) /B.Compdf
Concept of Vouching. B.Com(Hons) /B.CompdfUmakantAnnand
 
The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13Steve Thomason
 
Paris 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activityParis 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activityGeoBlogs
 
Separation of Lanthanides/ Lanthanides and Actinides
Separation of Lanthanides/ Lanthanides and ActinidesSeparation of Lanthanides/ Lanthanides and Actinides
Separation of Lanthanides/ Lanthanides and ActinidesFatimaKhan178732
 
Alper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentAlper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentInMediaRes1
 
Solving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptxSolving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptxOH TEIK BIN
 
CARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxCARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxGaneshChakor2
 
PSYCHIATRIC History collection FORMAT.pptx
PSYCHIATRIC   History collection FORMAT.pptxPSYCHIATRIC   History collection FORMAT.pptx
PSYCHIATRIC History collection FORMAT.pptxPoojaSen20
 
microwave assisted reaction. General introduction
microwave assisted reaction. General introductionmicrowave assisted reaction. General introduction
microwave assisted reaction. General introductionMaksud Ahmed
 
Introduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxIntroduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxpboyjonauth
 
How to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptxHow to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptxmanuelaromero2013
 
Crayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon ACrayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon AUnboundStockton
 
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...Krashi Coaching
 
BASLIQ CURRENT LOOKBOOK LOOKBOOK(1) (1).pdf
BASLIQ CURRENT LOOKBOOK  LOOKBOOK(1) (1).pdfBASLIQ CURRENT LOOKBOOK  LOOKBOOK(1) (1).pdf
BASLIQ CURRENT LOOKBOOK LOOKBOOK(1) (1).pdfSoniaTolstoy
 
Measures of Central Tendency: Mean, Median and Mode
Measures of Central Tendency: Mean, Median and ModeMeasures of Central Tendency: Mean, Median and Mode
Measures of Central Tendency: Mean, Median and ModeThiyagu K
 

Dernier (20)

A Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy ReformA Critique of the Proposed National Education Policy Reform
A Critique of the Proposed National Education Policy Reform
 
Science 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its CharacteristicsScience 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its Characteristics
 
Sanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdfSanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdf
 
Concept of Vouching. B.Com(Hons) /B.Compdf
Concept of Vouching. B.Com(Hons) /B.CompdfConcept of Vouching. B.Com(Hons) /B.Compdf
Concept of Vouching. B.Com(Hons) /B.Compdf
 
The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13
 
Paris 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activityParis 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activity
 
Separation of Lanthanides/ Lanthanides and Actinides
Separation of Lanthanides/ Lanthanides and ActinidesSeparation of Lanthanides/ Lanthanides and Actinides
Separation of Lanthanides/ Lanthanides and Actinides
 
Alper Gobel In Media Res Media Component
Alper Gobel In Media Res Media ComponentAlper Gobel In Media Res Media Component
Alper Gobel In Media Res Media Component
 
Solving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptxSolving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptx
 
CARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptxCARE OF CHILD IN INCUBATOR..........pptx
CARE OF CHILD IN INCUBATOR..........pptx
 
PSYCHIATRIC History collection FORMAT.pptx
PSYCHIATRIC   History collection FORMAT.pptxPSYCHIATRIC   History collection FORMAT.pptx
PSYCHIATRIC History collection FORMAT.pptx
 
microwave assisted reaction. General introduction
microwave assisted reaction. General introductionmicrowave assisted reaction. General introduction
microwave assisted reaction. General introduction
 
Introduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxIntroduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptx
 
How to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptxHow to Make a Pirate ship Primary Education.pptx
How to Make a Pirate ship Primary Education.pptx
 
Crayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon ACrayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon A
 
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
Kisan Call Centre - To harness potential of ICT in Agriculture by answer farm...
 
BASLIQ CURRENT LOOKBOOK LOOKBOOK(1) (1).pdf
BASLIQ CURRENT LOOKBOOK  LOOKBOOK(1) (1).pdfBASLIQ CURRENT LOOKBOOK  LOOKBOOK(1) (1).pdf
BASLIQ CURRENT LOOKBOOK LOOKBOOK(1) (1).pdf
 
Model Call Girl in Bikash Puri Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Bikash Puri  Delhi reach out to us at 🔝9953056974🔝Model Call Girl in Bikash Puri  Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Bikash Puri Delhi reach out to us at 🔝9953056974🔝
 
Measures of Central Tendency: Mean, Median and Mode
Measures of Central Tendency: Mean, Median and ModeMeasures of Central Tendency: Mean, Median and Mode
Measures of Central Tendency: Mean, Median and Mode
 
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
Model Call Girl in Tilak Nagar Delhi reach out to us at 🔝9953056974🔝
 

Algorithm Analysis Guide

  • 2.  Algorithm Analysis  Sorting  Searching  Data Structures
  • 3.  An algorithm is a set of instructions to be followed to solve a problem.
  • 4.  Correctness  Finiteness  Definiteness  Input  Output  Effectiveness
  • 5. There are two aspects of algorithmic performance:  Time  Space
  • 6.  First, we start to count the number of basic operations in a particular solution to assess its efficiency.  Then, we will express the efficiency of algorithms using growth functions.
  • 7.  We measure an algorithm’s time requirement as a function of the problem size.  The most important thing to learn is how quickly the algorithm’s time requirement grows as a function of the problem size.  An algorithm’s proportional time requirement is known as growth rate.  We can compare the efficiency of two algorithms by comparing their growth rates.
  • 8.  Each operation in an algorithm (or a program) has a cost.  Each operation takes a certain of time. count = count + 1;  take a certain amount of time, but it is constant A sequence of operations: count = count + 1; Cost: c1 sum = sum + count; Cost: c2  Total Cost = c1 + c2
  • 9. Example: Simple If-Statement Cost Times if (n < 0) c1 1 absval = -n c2 1 else absval = n; c3 1 Total Cost <= c1 + max(c2,c3)
  • 10. Example: Simple Loop Cost Times i = 1; c1 1 sum = 0; c2 1 while (i <= n) { c3 n+1 i = i + 1; c4 n sum = sum + i; c5 n } Total Cost = c1 + c2 + (n+1)*c3 + n*c4 + n*c5  The time required for this algorithm is proportional to n
  • 11. Example: Nested Loop Cost Times i=1; c1 1 sum = 0; c2 1 while (i <= n) { c3 n+1 j=1; c4 n while (j <= n) { c5 n*(n+1) sum = sum + i; c6 n*n j = j + 1; c7 n*n } i = i +1; c8 n } Total Cost = c1 + c2 + (n+1)*c3 + n*c4 + n*(n+1)*c5+n*n*c6+n*n*c7+n*c8  The time required for this algorithm is proportional to n2
  • 12.  Consecutive Statements  If/Else  Loops  Nested Loops
  • 13.  Informal definitions: ◦ Given a complexity function f(n), ◦ O(f(n)) is the set of complexity functions that are upper bounds on f(n) ◦ (f(n)) is the set of complexity functions that are lower bounds on f(n) ◦ (f(n)) is the set of complexity functions that, given the correct constants, correctly describes f(n)  Example: If f(n) = 17n3 + 4n – 12, then ◦ O(f(n)) contains n3, n4, n5, 2n, etc. ◦ (f(n)) contains 1, n, n2, n3, log n, n log n, etc. ◦ (f(n)) contains n3
  • 14. Example: Simple If-Statement Cost Times if (n < 0) c1 1 absval = -n c2 1 else absval = n; c3 1 Total Cost <= c1 + max(c2,c3) O(1)
  • 15. Example: Simple Loop Cost Times i = 1; c1 1 sum = 0; c2 1 while (i <= n) { c3 n+1 i = i + 1; c4 n sum = sum + i; c5 n } Total Cost = c1 + c2 + (n+1)*c3 + n*c4 + n*c5  The time required for this algorithm is proportional to n O(n)
  • 16. Example: Nested Loop Cost Times i=1; c1 1 sum = 0; c2 1 while (i <= n) { c3 n+1 j=1; c4 n while (j <= n) { c5 n*(n+1) sum = sum + i; c6 n*n j = j + 1; c7 n*n } i = i +1; c8 n } Total Cost = c1 + c2 + (n+1)*c3 + n*c4 + n*(n+1)*c5+n*n*c6+n*n*c7+n*c8  The time required for this algorithm is proportional to n2 O(n2)
  • 17. Function Growth Rate Name c Constant log N Logarithmic log2N Log-squared N Linear N log N Linearithmic N2 Quadratic N3 Cubic 2N Exponential
  • 18.
  • 19.
  • 20.  Input: ◦ A sequence of n numbers a1, a2, . . . , an  Output: ◦ A permutation (reordering) a1’, a2’, . . . , an’ of the input sequence such that a1’ ≤ a2’ ≤ · · · ≤ an’
  • 21.  In-Place Sort ◦ The amount of extra space required to sort the data is constant with the input size.
  • 22. Sorted on first key: Sort file on second key: Records with key value 3 are not in order on first key!!  Stable sort ◦ preserves relative order of records with equal keys
  • 23.  Idea: like sorting a hand of playing cards ◦ Start with an empty left hand and the cards facing down on the table. ◦ Remove one card at a time from the table, and insert it into the correct position in the left hand ◦ The cards held in the left hand are sorted
  • 24. To insert 12, we need to make room for it by moving first 36 and then 24.
  • 25.
  • 26.
  • 27.
  • 28. insertionsort (a) { for (i = 1; i < a.length; ++i) { key = a[i] pos = i while (pos > 0 && a[pos-1] > key) { a[pos]=a[pos-1] pos-- } a[pos] = key } }
  • 29.  O(n2), stable, in-place  O(1) space  Great with small number of elements
  • 30.  Algorithm: ◦ Find the minimum value ◦ Swap with 1st position value ◦ Repeat with 2nd position down  O(n2), stable, in-place
  • 31.  Algorithm ◦ Traverse the collection ◦ “Bubble” the largest value to the end using pairwise comparisons and swapping  O(n2), stable, in-place  Totally useless?
  • 32. 1. Divide: split the array in two halves 2. Conquer: Sort recursively both subarrays 3. Combine: merge the two sorted subarrays into a sorted array
  • 33. mergesort (a, left, right) { if (left < right) { mid = (left + right)/2 mergesort (a, left, mid) mergesort (a, mid+1, right) merge(a, left, mid+1, right) } }
  • 34.  The key to Merge Sort is merging two sorted lists into one, such that if you have two lists X (x1x2…xm) and Y(y1y2…yn) the resulting list is Z(z1z2…zm+n)  Example: L1 = { 3 8 9 } L2 = { 1 5 7 } merge(L1, L2) = { 1 3 5 7 8 9 }
  • 35. 3 10 23 54 1 5 25 75X: Y: Result:
  • 36. 3 10 23 54 5 25 75 1 X: Y: Result:
  • 37. 10 23 54 5 25 75 1 3 X: Y: Result:
  • 38. 10 23 54 25 75 1 3 5 X: Y: Result:
  • 39. 23 54 25 75 1 3 5 10 X: Y: Result:
  • 40. 54 25 75 1 3 5 10 23 X: Y: Result:
  • 41. 54 75 1 3 5 10 23 25 X: Y: Result:
  • 42. 75 1 3 5 10 23 25 54 X: Y: Result:
  • 43. 1 3 5 10 23 25 54 75 X: Y: Result:
  • 44. 99 6 86 15 58 35 86 4 0
  • 45. 99 6 86 15 58 35 86 4 0 99 6 86 15 58 35 86 4 0
  • 46. 99 6 86 15 58 35 86 4 0 99 6 86 15 58 35 86 4 0 86 1599 6 58 35 86 4 0
  • 47. 99 6 86 15 58 35 86 4 0 99 6 86 15 58 35 86 4 0 86 1599 6 58 35 86 4 0 99 6 86 15 58 35 86 4 0
  • 48. 99 6 86 15 58 35 86 4 0 99 6 86 15 58 35 86 4 0 86 1599 6 58 35 86 4 0 99 6 86 15 58 35 86 4 0 4 0
  • 49. 99 6 86 15 58 35 86 0 4 4 0
  • 50. 15 866 99 35 58 0 4 86 99 6 86 15 58 35 86 0 4
  • 51. 6 15 86 99 0 4 35 58 86 15 866 99 58 35 0 4 86
  • 52. 0 4 6 15 35 58 86 86 99 6 15 86 99 0 4 35 58 86
  • 53. 0 4 6 15 35 58 86 86 99
  • 54. Merge Sort runs O (N log N) for all cases, because of its Divide and Conquer approach. T(N) = 2T(N/2) + N = O(N logN)
  • 55. 1. Select: pick an element x 2. Divide: rearrange elements so that x goes to its final position • L elements less than x • G elements greater than or equal to x 3. Conquer: sort recursively L and G x x x L G L G
  • 56. quicksort (a, left, right) { if (left < right) { pivot = partition (a, left, right) quicksort (a, left, pivot-1) quicksort (a, pivot+1, right) } }
  • 57.  How to pick a pivot?  How to partition?
  • 58.  Use the first element as pivot ◦ if the input is random, ok ◦ if the input is presorted? - shuffle in advance  Choose the pivot randomly ◦ generally safe ◦ random numbers generation can be expensive
  • 59.  Use the median of the array ◦ Partitioning always cuts the array into half ◦ An optimal quicksort (O(n log n)) ◦ hard to find the exact median (chicken-egg?) ◦ Approximation to the exact median..  Median of three ◦ Compare just three elements: the leftmost, the rightmost and the center ◦ Use the middle of the three as pivot
  • 60.  Given a pivot, partition the elements of the array such that the resulting array consists of: ◦ One subarray that contains elements < pivot ◦ One subarray that contains elements >= pivot  The subarrays are stored in the original array
  • 61. 40 20 10 80 60 50 7 30 100pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_index too_small_index
  • 62. 40 20 10 80 60 50 7 30 100pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_index too_small_index 1. while a[too_big_index] <= a[pivot_index] ++too_big_index
  • 63. 40 20 10 80 60 50 7 30 100pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_index too_small_index 1. while a[too_big_index] <= a[pivot_index] ++too_big_index
  • 64. 40 20 10 80 60 50 7 30 100pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_index too_small_index 1. while a[too_big_index] <= a[pivot_index] ++too_big_index
  • 65. 40 20 10 80 60 50 7 30 100pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_index too_small_index 1. while a[too_big_index] <= a[pivot_index] ++too_big_index 2. while a[too_small_index] > a[pivot_index] --too_small_index
  • 66. 40 20 10 80 60 50 7 30 100pivot_index = 0 [0] [1] [2] [3] [4] [5] [6] [7] [8] too_big_index too_small_index 1. while a[too_big_index] <= a[pivot_index] ++too_big_index 2. while a[too_small_index] > a[pivot_index] --too_small_index
  • 67–85. Partition walkthrough, pivot_index = 0 (pivot = 40), indices [0]–[8]:
    1. while a[too_big_index] <= a[pivot_index]: ++too_big_index
    2. while a[too_small_index] > a[pivot_index]: --too_small_index
    3. if too_big_index < too_small_index: swap a[too_big_index], a[too_small_index]
    4. while too_small_index > too_big_index: go to 1
    5. swap a[too_small_index], a[pivot_index]
  • 67. 40 20 10 80 60 50 7 30 100 (too_big_index stops at 80, too_small_index stops at 30)
  • 68–74. 40 20 10 30 60 50 7 80 100 (after swapping 80 and 30; both indices keep scanning inward)
  • 75–83. 40 20 10 30 7 50 60 80 100 (after swapping 60 and 7; the indices meet and cross)
  • 84. 40 20 10 30 7 50 60 80 100 (the indices have crossed, so step 5 applies: swap the pivot with a[too_small_index])
  • 85. 7 20 10 30 40 50 60 80 100, pivot_index = 4 (the pivot 40 is now in its final sorted position)
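A minimal Java sketch of the first-element-pivot partition scheme illustrated above (the method name, bounds check and swap code are mine; it follows steps 1–5 of the slides):

    static int partition(int[] a, int low, int high) {
        int pivot = a[low];            // first element is the pivot, as in the walkthrough
        int tooBig = low + 1;          // scans right for an element greater than the pivot
        int tooSmall = high;           // scans left for an element <= the pivot
        while (true) {
            while (tooBig <= high && a[tooBig] <= pivot) tooBig++;   // step 1 (bounds check added)
            while (a[tooSmall] > pivot) tooSmall--;                  // step 2
            if (tooBig >= tooSmall) break;                           // step 4: indices crossed
            int tmp = a[tooBig]; a[tooBig] = a[tooSmall]; a[tooSmall] = tmp;   // step 3
        }
        int tmp = a[low]; a[low] = a[tooSmall]; a[tooSmall] = tmp;   // step 5: place the pivot
        return tooSmall;               // final position of the pivot (index 4 in the example)
    }

Quicksort then recurses on the two halves, a[low..p-1] and a[p+1..high], where p is the returned index.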
  • 86.  Running time ◦ pivot selection: constant time, i.e. O(1) ◦ partitioning: linear time, i.e. O(N) ◦ plus the running time of the two recursive calls  T(N) = T(i) + T(N-i-1) + cN, where c is a constant and i is the number of elements in the left partition L
  • 87.  What will be the worst case? ◦ The pivot is always the smallest element ◦ The partition is always completely unbalanced ◦ T(N) = T(N-1) + cN, which gives O(N²)
  • 88.  What will be the best case? ◦ Partition is perfectly balanced ◦ Pivot is always in the middle (median of the array) ◦ T(N) = 2T(N/2) + cN, which gives O(N log N)
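A short expansion of the two recurrences (not spelled out on the slides), with c the constant from slide 86, showing where the bounds come from:

    Worst case (pivot always smallest, i = 0):
        T(N) = T(N-1) + cN = T(N-2) + c(N-1) + cN = ... = c(1 + 2 + ... + N) = O(N^2)

    Best case (perfectly balanced, i = N/2):
        T(N) = 2T(N/2) + cN = 4T(N/4) + 2cN = ... = 2^k T(N/2^k) + k·cN
             = N·T(1) + cN·log2(N)  when k = log2(N), i.e. O(N log N)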
  • 89.  Java API provides a class Arrays with several overloaded sort methods for different array types  Class Collections provides similar sorting methods
  • 90. Arrays methods: public static void sort (int[] a) public static void sort (Object[] a) // requires Comparable public static <T> void sort (T[] a, Comparator<? super T> comp) // uses given Comparator
  • 91. Collections methods: public static <T extends Comparable<? super T>> void sort (List<T> list) public static <T> void sort (List<T> list, Comparator<? super T> comp)
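A small usage sketch of these library sort methods (the class name and sample values are mine, purely illustrative):

    import java.util.*;

    public class SortDemo {
        public static void main(String[] args) {
            int[] nums = {40, 20, 10, 80, 60};
            Arrays.sort(nums);                       // ascending order, in place

            List<String> names = new ArrayList<>(Arrays.asList("Petar", "Ana", "Ivan"));
            Collections.sort(names);                 // uses String's Comparable (alphabetical)

            // Sort with an explicit Comparator, here by string length
            Collections.sort(names, Comparator.comparingInt(String::length));

            System.out.println(Arrays.toString(nums) + " " + names);
        }
    }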
  • 93.  Given a collection and an element to find…  Determine whether the “target” element is in the collection, and report the result ◦ Print a message, or ◦ Return a value (an index or pointer, etc.)  Don’t modify the collection during the search!
  • 94.  A search traverses the collection until ◦ the desired element is found ◦ or the collection is exhausted
  • 95. static int linearSearch(int[] a, int key) {
            for (int i = 0; i < a.length; i++) {
                if (a[i] == key) return i;   // found: return its index
            }
            return -1;                       // not found
        }
  • 96. 40 20 10 30 7 Search for 20
  • 97. 40 20 10 30 7 Search for 20 40 != 20
  • 98. 40 20 10 30 7 Search for 20 20 = 20
  • 99. 40 20 10 30 7 Search for 20 20 = 20 return 1
  • 100. 40 20 10 30 7 Search for 5
  • 101. 40 20 10 30 7 Search for 5 40 != 5
  • 102. 40 20 10 30 7 Search for 5 20 != 5
  • 103. 40 20 10 30 7 Search for 5 10 != 5
  • 104. 40 20 10 30 7 Search for 5 30 != 5
  • 105. 40 20 10 30 7 Search for 5 7 != 5 return -1
  • 106.  O(n)  Examines every item in the worst case (key absent or in the last position)
  • 107.  Locates a target value in a sorted array/list by eliminating half of the remaining elements at each step
  • 108. static int binarySearch(int[] a, int low, int high, int key) {
             while (low <= high) {
                 int mid = (low + high) >>> 1;    // unsigned shift avoids overflow in (low + high) / 2
                 int midVal = a[mid];
                 if (midVal < key) low = mid + 1;
                 else if (midVal > key) high = mid - 1;
                 else return mid;                 // found
             }
             return -(low + 1);                   // not found: -(insertion point) - 1
         }
  • 109–116. Array: 1 3 4 6 7 8 10 13 14, search for 4:
    mid = [4] = 7: 4 < 7, continue in the left half
    mid = [1] = 3: 4 > 3, continue in the right half
    mid = [2] = 4: 4 == 4, found: return 2
  • 117–124. Array: 1 3 4 6 7 8 10 13 14, search for 9:
    mid = [4] = 7: 9 > 7, continue in the right half
    mid = [6] = 10: 9 < 10, continue in the left half
    mid = [5] = 8: 9 > 8, continue in the right half
    right < left: not found, return -7 (that is, -(insertion point) - 1, since 9 would belong at index 6)
  • 125.  Requires a sorted array/list  O(log n)  Divide and conquer
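For everyday use the JDK already provides this algorithm; a quick sketch (the class name and sample data are mine):

    import java.util.Arrays;

    public class BinarySearchDemo {
        public static void main(String[] args) {
            int[] a = {1, 3, 4, 6, 7, 8, 10, 13, 14};       // must already be sorted
            System.out.println(Arrays.binarySearch(a, 4));  // 2   (index of the key)
            System.out.println(Arrays.binarySearch(a, 9));  // -7  (-(insertion point) - 1)
        }
    }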
  • 127.  Set ◦ The familiar set abstraction. ◦ No duplicates; May or may not be ordered.  List ◦ Ordered collection, also known as a sequence. ◦ Duplicates permitted; Allows positional access  Map ◦ A mapping from keys to values. ◦ Each key can map to at most one value (function).
  • 128. Common implementations:
    Set:  HashSet, LinkedHashSet, TreeSet
    List: ArrayList, LinkedList, Vector
    Map:  HashMap, LinkedHashMap, Hashtable, TreeMap
  • 129.  Ordered ◦ Elements are stored and accessed in a specific order  Sorted ◦ Elements are stored and accessed in a sorted order  Indexed ◦ Elements can be accessed using an index  Unique ◦ Collection does not allow duplicates
  • 130.  A linked list is a series of connected nodes  Each node contains at least ◦ A piece of data (any type) ◦ Pointer to the next node in the list  Head: pointer to the first node  The last node points to NULL (diagram: Head → A → B → C → NULL; each node holds its data plus a next pointer)
  • 131. (diagram: adding and removing a node by re-linking the next pointers)
  • 134. Doubly linked list (diagram): Head → A ⇄ B ⇄ C ← Tail; each node holds its data plus next and previous pointers
  • 135. Operation / Complexity:
    insert at beginning: O(1)
    insert at end: O(1)
    insert at index: O(n)
    delete at beginning: O(1)
    delete at end: O(1)
    delete at index: O(n)
    find element: O(n)
    access element by index: O(n)
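A minimal sketch of a doubly linked node and an O(1) insert-at-head, just to make the table concrete (class and field names are mine; java.util.LinkedList is the ready-made equivalent):

    class Node<T> {
        T data;
        Node<T> next, prev;
        Node(T data) { this.data = data; }
    }

    class SimpleLinkedList<T> {
        private Node<T> head, tail;

        // O(1): link a new node in front of the current head
        void addFirst(T value) {
            Node<T> n = new Node<>(value);
            n.next = head;
            if (head != null) head.prev = n; else tail = n;
            head = n;
        }

        // O(n): walk the chain node by node until the index is reached
        T get(int index) {
            Node<T> cur = head;
            for (int i = 0; i < index; i++) cur = cur.next;
            return cur.data;
        }
    }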
  • 136.  Resizable-array implementation of the List interface  capacity vs. size (diagram: backing array holding A B C with spare slots after them)
  • 137. (diagram: while capacity > size, adding D simply fills the next free slot; once capacity = size, adding E forces a larger backing array to be allocated and the elements copied over)
  • 139. Operation / Complexity:
    insert at beginning: O(n)
    insert at end: O(1) amortized
    insert at index: O(n)
    delete at beginning: O(n)
    delete at end: O(1)
    delete at index: O(n)
    find element: O(n)
    access element by index: O(1)
  • 140.  Some collections are constrained so clients can only use their optimized operations ◦ stack: retrieves elements in the reverse of the order they were added ◦ queue: retrieves elements in the same order they were added (diagrams: stack with top 3, 2, bottom 1, where push, pop and peek act on the top; queue with front 1 2 3 back, where add acts on the back and remove, peek on the front)
  • 141.  stack: A collection based on the principle of adding elements and retrieving them in the opposite order.  basic stack operations: ◦ push: Add an element to the top. ◦ pop: Remove the top element. ◦ peek: Examine the top element. (diagram: stack with top 3, 2, bottom 1; push adds at the top, pop and peek act on the top)
  • 142.  Programming languages and compilers: ◦ method call stack  Matching up related pairs of things: ◦ check correctness of brackets (){}[]  Sophisticated algorithms: ◦ undo stack
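As a concrete illustration of the bracket-matching use case, a small sketch using java.util.ArrayDeque as the stack (class and method names are mine):

    import java.util.ArrayDeque;
    import java.util.Deque;

    public class BracketChecker {
        static boolean isBalanced(String s) {
            Deque<Character> stack = new ArrayDeque<>();
            for (char c : s.toCharArray()) {
                switch (c) {
                    case '(': case '[': case '{':
                        stack.push(c);                      // remember the opener
                        break;
                    case ')': case ']': case '}':
                        if (stack.isEmpty()) return false;  // closer with no opener
                        char open = stack.pop();
                        if ((c == ')' && open != '(') ||
                            (c == ']' && open != '[') ||
                            (c == '}' && open != '{')) return false;
                        break;
                }
            }
            return stack.isEmpty();                         // every opener must be closed
        }

        public static void main(String[] args) {
            System.out.println(isBalanced("(a[b]{c})"));    // true
            System.out.println(isBalanced("(]"));           // false
        }
    }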
  • 143.  queue: Retrieves elements in the order they were added.  basic queue operations: ◦ add (enqueue): Add an element to the back. ◦ remove (dequeue): Remove the front element. ◦ peek: Examine the front element. (diagram: queue with front 1 2 3 back; add at the back, remove and peek at the front)
  • 144.  Operating systems: ◦ queue of print jobs to send to the printer  Programming: ◦ modeling a line of customers or clients  Real world examples: ◦ people on an escalator or waiting in a line ◦ cars at a gas station
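A minimal sketch of those queue operations, using the java.util.Queue interface backed by ArrayDeque (the print-job example mirrors the slide; the data is mine):

    import java.util.ArrayDeque;
    import java.util.Queue;

    public class QueueDemo {
        public static void main(String[] args) {
            Queue<String> printJobs = new ArrayDeque<>();
            printJobs.add("report.pdf");                 // enqueue at the back
            printJobs.add("slides.pptx");
            printJobs.add("photo.png");

            System.out.println(printJobs.peek());        // report.pdf (front, not removed)
            while (!printJobs.isEmpty()) {
                System.out.println(printJobs.remove());  // dequeue in FIFO order
            }
        }
    }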
  • 145.  A data structure optimized for a very specific kind of search / access  In a map we access by asking "give me the value associated with this key."  capacity, load factor A -> 65
  • 147.  Implements Map  Fast put, get operations  hashCode(), equals()
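A quick HashMap sketch (the "A -> 65" mapping follows the earlier slide; the rest is mine). put and get are O(1) on average, provided the key type implements hashCode() and equals() consistently, as String does:

    import java.util.HashMap;
    import java.util.Map;

    public class HashMapDemo {
        public static void main(String[] args) {
            Map<String, Integer> ascii = new HashMap<>();
            ascii.put("A", 65);                              // average O(1) insert
            ascii.put("B", 66);

            System.out.println(ascii.get("A"));              // average O(1) lookup -> 65
            System.out.println(ascii.containsKey("C"));      // false
            System.out.println(ascii.getOrDefault("C", -1)); // -1
        }
    }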
  • 149.  What do we do when inserting an element and something is already present at that position (a collision)?
  • 150.  Open addressing: search forward (or backwards) for an open space  Linear probing ◦ move forward 1 spot; if occupied, 2 spots, then 3 spots, …  Quadratic probing ◦ offsets grow quadratically: 1, 4, 9, 16, 25, … spots  Resize when the load factor reaches some limit
  • 151.  Separate chaining: each element of the hash table can itself be another data structure ◦ LinkedList ◦ Balanced binary tree  Resize at a given load factor or when any chain reaches some limit
  • 152.  Implements Map  Sorted by key  Easy access to the smallest and largest keys  Logarithmic put, get  Keys must be Comparable, or a Comparator must be supplied
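A small TreeMap sketch showing the sorted-order access described above (sample data is mine):

    import java.util.TreeMap;

    public class TreeMapDemo {
        public static void main(String[] args) {
            TreeMap<Integer, String> scores = new TreeMap<>();   // keys kept in sorted order
            scores.put(40, "Ivan");
            scores.put(7, "Ana");
            scores.put(80, "Petar");

            System.out.println(scores.firstKey());   // 7   (smallest key, O(log n))
            System.out.println(scores.lastKey());    // 80  (largest key, O(log n))
            System.out.println(scores);              // {7=Ana, 40=Ivan, 80=Petar}
        }
    }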
  • 153.  0, 1, or 2 children per node  Binary Search Tree ◦ node.left < node.value ◦ node.right >= node.value
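A minimal sketch of the lookup that this ordering invariant enables (the node class and method are mine, not from the slides):

    class Bst {
        static class Node {
            int value;
            Node left, right;
            Node(int value) { this.value = value; }
        }

        // Follows the slide's invariant: left subtree < value, right subtree >= value
        static boolean contains(Node node, int key) {
            while (node != null) {
                if (key == node.value) return true;
                node = (key < node.value) ? node.left : node.right;
            }
            return false;   // fell off the tree: the key is absent
        }
    }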
  • 154.  A priority queue stores a collection of entries  Main methods of the Priority Queue ADT ◦ insert(k, x) inserts an entry with key k and value x ◦ removeMin() removes and returns the entry with smallest key
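In the JDK, java.util.PriorityQueue is the closest counterpart (a min-heap over the elements themselves, or over a supplied Comparator, rather than separate key/value pairs); offer roughly corresponds to insert and poll to removeMin. A quick sketch with made-up values:

    import java.util.PriorityQueue;

    public class PriorityQueueDemo {
        public static void main(String[] args) {
            PriorityQueue<Integer> pq = new PriorityQueue<>();
            pq.offer(15);                   // insert
            pq.offer(4);
            pq.offer(9);

            System.out.println(pq.peek());  // 4 (smallest element, not removed)
            System.out.println(pq.poll());  // 4 (removeMin)
            System.out.println(pq.poll());  // 9
        }
    }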
  • 155–156.  A heap can be seen as a complete binary tree: 16 14 10 8 7 9 3 2 4 1 (every level is full except possibly the last, which is filled from left to right)
  • 157.  In practice, heaps are usually implemented as arrays: A = [16, 14, 10, 8, 7, 9, 3, 2, 4, 1]
  • 158.  To represent a complete binary tree as an array: ◦ The root node is A[1] ◦ Node i is A[i] ◦ The parent of node i is A[i/2] (note: integer divide) ◦ The left child of node i is A[2i] ◦ The right child of node i is A[2i + 1]
  • 159–167. Example: restoring the heap property (sift-down) when A = [16, 4, 10, 14, 7, 9, 3, 2, 8, 1] and the 4 at node 2 violates it:
    16 4 10 14 7 9 3 2 8 1  →  swap 4 with its larger child 14
    16 14 10 4 7 9 3 2 8 1  →  swap 4 with its larger child 8
    16 14 10 8 7 9 3 2 4 1  (4 is now a leaf; the heap property holds everywhere)
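A small Java sketch of that sift-down step, assuming 0-based indexing (so parent = (i-1)/2 and children = 2i+1, 2i+2, slightly different from the 1-based slide formulas); the method name is mine:

    // Restore the max-heap property at index i, assuming both subtrees are already heaps.
    static void siftDown(int[] heap, int i, int size) {
        while (true) {
            int left = 2 * i + 1, right = 2 * i + 2;
            int largest = i;
            if (left < size && heap[left] > heap[largest]) largest = left;
            if (right < size && heap[right] > heap[largest]) largest = right;
            if (largest == i) return;             // heap property holds here
            int tmp = heap[i]; heap[i] = heap[largest]; heap[largest] = tmp;
            i = largest;                          // keep sifting the displaced value down
        }
    }

Calling siftDown at index 1 of {16, 4, 10, 14, 7, 9, 3, 2, 8, 1} reproduces the steps above, ending with {16, 14, 10, 8, 7, 9, 3, 2, 4, 1}.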
  • 168.  java.util.Collections (java.util.Arrays exports similar basic operations for an array):
    binarySearch(list, key): finds key in a sorted list using binary search
    sort(list): sorts a list into ascending order
    min(list): returns the smallest value in a list
    max(list): returns the largest value in a list
    reverse(list): reverses the order of elements in a list
    shuffle(list): randomly rearranges the elements in a list
    swap(list, p1, p2): exchanges the elements at index positions p1 and p2
    replaceAll(list, x1, x2): replaces all elements matching x1 with x2