
Chapter 2.
Comparison of Algorithms

modified from Clifford A. Shaffer and George Bebis


Which Costs More to Feed?

2
Algorithm Efficiency

• There are often many approaches (algorithms) to solve a problem.
• How do we choose between them?

• At the heart of computer program design are two (sometimes conflicting) goals:
• To design an algorithm that
• is easy to understand, code, and debug.
• makes efficient use of the resources.
3
Algorithm Efficiency (cont)

• Goal (1) is the concern of Software Engineering.

• Goal (2) is the concern of data structures and algorithm analysis.

• When goal (2) is important, how do we measure an algorithm's cost?

4
Estimation Techniques

• Known as "back of the envelope" or "back of the napkin" calculation
1. Determine the major parameters that affect the problem.
2. Derive an equation that relates the parameters to the problem.
3. Select values for the parameters, and apply the equation to yield an estimated solution.

5
Estimation Example

• How many library bookcases does it take


to store books totaling one million pages?

• Estimate:
• Pages/inch
• Feet/shelf
• Shelves/bookcase
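As a minimal sketch of the three-step technique, the following C program works through this estimate. All parameter values (500 pages/inch, 3 feet/shelf, 5 shelves/bookcase) are illustrative assumptions, not figures from the slides; plug in your own estimates.

#include <stdio.h>

int main(void) {
    /* Assumed parameter values -- back-of-the-envelope guesses. */
    double pages            = 1e6;   /* total pages to store      */
    double pages_per_inch   = 500;   /* paper thickness estimate  */
    double feet_per_shelf   = 3;     /* usable width of one shelf */
    double shelves_per_case = 5;     /* shelves in one bookcase   */

    double inches_needed = pages / pages_per_inch;        /* 2000 in  */
    double shelf_feet    = inches_needed / 12.0;          /* ~167 ft  */
    double shelves       = shelf_feet / feet_per_shelf;   /* ~56      */
    double bookcases     = shelves / shelves_per_case;    /* ~11      */

    printf("Estimated bookcases: %.0f\n", bookcases);
    return 0;
}

With these assumed values the answer comes out to roughly a dozen bookcases; the point of the technique is the order of magnitude, not the exact number.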

6
Analysis of Algorithms

• An algorithm is a finite set of precise instructions for performing a computation or for solving a problem.
• What is the goal of analysis of algorithms?
• To compare algorithms mainly in terms of running time, but also in terms of other factors (e.g., memory requirements, programmer's effort, etc.)
• What do we mean by running time analysis?
• Determine how running time increases as the size of the problem increases.

7
Input Size

• Input size (number of elements in the input)
• size of an array
• polynomial degree
• # of elements in a matrix
• # of bits in the binary representation of the input
• vertices and edges in a graph

8
Best, Worst, Average Cases

• Not all inputs of a given size take the same time to run.
• Sequential search for K in an array of n integers:
• Begin at the first element in the array and look at each element in turn until K is found (a sketch follows below).
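A minimal C sketch of this sequential search (the function name and signature are illustrative, not from the slides):

/* Return the index of K in A[0..n-1], or -1 if K is absent. */
int sequential_search(const int A[], int n, int K) {
    for (int i = 0; i < n; i++)
        if (A[i] == K)
            return i;   /* best case: K is A[0], 1 comparison   */
    return -1;          /* worst case: n comparisons, K absent  */
}

Best case is 1 comparison (K is first), worst case is n comparisons (K is last or absent), and the average over random positions of K is about n/2.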

9
Time Analysis

• Provides upper and lower bounds of running time.

Lower Bound ≤ Running Time ≤ Upper Bound

• Different types of analysis:
- Worst case
- Best case
- Average case

10
Worst Case

• Provides an upper bound on running time.
• An absolute guarantee that the algorithm will not run longer, no matter what the inputs are.

Lower Bound ≤ Running Time ≤ Upper Bound

11
Best Case

• Provides a lower bound on running time.
• Input is the one for which the algorithm runs the fastest.

Lower Bound ≤ Running Time ≤ Upper Bound

12
Average Case

• Provides an estimate of "average" running time.
• Assumes that the input is random.
• Useful when best/worst cases do not happen very often
• i.e., few input cases lead to best/worst cases.

Lower Bound ≤ Running Time ≤ Upper Bound

13
Which Analysis to Use?

• While average time appears to be the fairest measure, it may be difficult to determine.

• When is the worst case time important?
• e.g., in real-time applications, where a response must be guaranteed within a time bound.

14
How to Measure Efficiency?

• Critical resources:
• Time, memory, programmer effort, user effort

• Factors affecting running time:
• For most algorithms, running time depends on the "size" of the input.
• Running time is expressed as T(n) for some function T on input size n.

15
How do we analyze an algorithm?
• Need to define objective measures.

(1) Compare execution times?
Empirical comparison (run the programs).
Not good: times are specific to a particular machine.

(2) Count the number of statements executed?
Not good: the number of statements varies with programming language and programming style.

16
How do we analyze an algorithm? (cont.)
(3) Express running time t as a function of problem size n (i.e., t = f(n))

Asymptotic Algorithm Analysis
- Given two algorithms having running times f(n) and g(n),
- find which function grows faster.
- Such an analysis is independent of machine time, programming style, etc.
17
Comparing algorithms

• Given two algorithms having running times f(n) and g(n), how do we decide which one is faster?

• Compare the "rates of growth" of f(n) and g(n).

18
Understanding Rate of Growth

• Consider the example of buying elephants and goldfish:

Cost: (cost_of_elephants) + (cost_of_goldfish)

Approximation:
• Cost ~ cost_of_elephants

19
Understanding Rate of Growth
(cont’d)
• The low-order terms of a function are relatively insignificant for large n

n^4 + 100n^2 + 10n + 50

Approximation:

n^4

• For example, at n = 100 the n^4 term is 10^8, while 100n^2 contributes only 10^6 (about 1% of the total).

• The highest-order term determines the rate of growth!

20
Rate of Growth ≡ Asymptotic Analysis

• Using rate of growth as a measure to compare different functions implies comparing them asymptotically
• i.e., as n → ∞

• If f(x) is faster growing than g(x), then f(x) always eventually becomes larger than g(x) in the limit
• i.e., for large enough values of x
21
The Growth of Functions

• A problem that can be solved with polynomial worst-case complexity is called tractable

• Problems of higher complexity are called intractable

• Problems that no algorithm can solve are called unsolvable

22
Asymptotic Notation

O notation: asymptotic "less than":

f(n) = O(g(n)) implies: f(n) "≤" c·g(n) in the limit* (c is a constant)

(used in worst-case analysis)

23
Asymptotic notations
• O-notation

24
Asymptotic Notation

Ω notation: asymptotic "greater than":

f(n) = Ω(g(n)) implies: f(n) "≥" c·g(n) in the limit* (c is a constant)

(used in best-case analysis)

25
Asymptotic notations (cont.)
• Ω-notation

Ω(g(n)) is the set of functions with a larger or the same order of growth as g(n)

26
Asymptotic Notation

Θ notation: asymptotic "equality":

f(n) = Θ(g(n)) implies: f(n) "=" c·g(n) in the limit* (c is a constant)

(provides a tight bound on running time)
(best and worst cases are the same)
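For reference, the informal "≤ / ≥ / = in the limit" statements on these slides correspond to the standard formal definitions, written here in LaTeX (c and n_0 denote positive constants):

f(n) = O(g(n))      \iff \exists\, c > 0,\ n_0 > 0 : 0 \le f(n) \le c\,g(n) \ \text{for all } n \ge n_0
f(n) = \Omega(g(n)) \iff \exists\, c > 0,\ n_0 > 0 : 0 \le c\,g(n) \le f(n) \ \text{for all } n \ge n_0
f(n) = \Theta(g(n)) \iff f(n) = O(g(n)) \ \text{and} \ f(n) = \Omega(g(n))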

27
Asymptotic notations (cont.)

• Θ-notation

Θ(g(n)) is the set of functions with the same order of growth as g(n)

28
Relations Between Different Sets
• Subset relations between order-of-growth sets.

[Figure: within the set of functions R → R, the regions O(f) and Ω(f) overlap; their intersection is Θ(f), which contains f itself.]

29
Visualizing Orders of Growth

• On a graph, as you go to the right, a faster growing function eventually becomes larger... (a numeric sketch follows below)

[Figure: plot of fA(n) = 30n + 8 and fB(n) = n^2 + 1 against increasing n; fB eventually overtakes fA.]
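A minimal C sketch that makes the same point numerically, using the two functions from the figure (the loop bound of 40 is an arbitrary choice large enough to find the crossover):

#include <stdio.h>

/* The two functions plotted on the slide. */
static long fA(long n) { return 30 * n + 8; }   /* linear    */
static long fB(long n) { return n * n + 1; }    /* quadratic */

int main(void) {
    /* fB grows faster, so it must eventually overtake fA. */
    for (long n = 1; n <= 40; n++)
        if (fB(n) > fA(n)) {
            printf("fB(n) first exceeds fA(n) at n = %ld "
                   "(fA = %ld, fB = %ld)\n", n, fA(n), fB(n));
            break;
        }
    return 0;
}

For these two functions the crossover happens at n = 31 (fA = 938, fB = 962); before that, the "slower growing" function is actually larger.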

30
Common orders of magnitude

31
Complexity
• Let us assume two algorithms A and B that solve the same class of problems.
• The time complexity of A is 5,000n; the one for B is 1.1^n for an input with n elements.
• For n = 10,
• A requires 50,000 steps,
• but B only about 3,
• so B seems to be superior to A.

• For n = 1000, A requires 5×10^6 steps,
• while B requires 2.5×10^41 steps.

32
N      log2(N)   N·log2(N)   N^2       2^N

1      0         0           1         2
2      1         2           4         4
4      2         8           16        16
8      3         24          64        256
16     4         64          256       65,536
32     5         160         1,024     4.29×10^9
64     6         384         4,096     1.84×10^19
128    7         896         16,384    3.40×10^38
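A short C sketch that regenerates this table (the output formatting is illustrative; 2^N is printed in scientific notation since it overflows integers quickly):

#include <stdio.h>
#include <math.h>

int main(void) {
    printf("%8s %8s %12s %12s %14s\n", "N", "log2N", "N*log2N", "N^2", "2^N");
    for (int k = 0; k <= 7; k++) {
        double N = pow(2, k);            /* N = 1, 2, 4, ..., 128 */
        printf("%8.0f %8d %12.0f %12.0f %14.3g\n",
               N, k, N * k, N * N, pow(2, N));
    }
    return 0;
}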
33
A Common Misunderstanding
Confusing worst case with upper bound.

Upper bound refers to a growth rate.

Worst case refers to the worst input from among the choices for possible inputs of a given size.

34
Algorithm speed vs function growth
• An O(n^2) algorithm will be slower than an O(n) algorithm (for large n).
• But an O(n^2) function will grow faster than an O(n) function.

[Figure: plot of fA(n) = 30n + 8 and fB(n) = n^2 + 1 against increasing n.]
35
Running Time Examples
Algorithm 1                  Cost
arr[0] = 0;                  c1
arr[1] = 0;                  c1
arr[2] = 0;                  c1
...
arr[N-1] = 0;                c1
-----------
c1 + c1 + ... + c1 = c1 × N

Algorithm 2                  Cost
for(i=0; i<N; i++)           c2
    arr[i] = 0;              c1
-----------
(N+1) × c2 + N × c1 = (c2 + c1) × N + c2
36
Running Time Examples (cont'd)

                             Cost
sum = 0;                     c1
for(i=0; i<N; i++)           c2
    for(j=0; j<N; j++)       c2
        sum += arr[i][j];    c3
------------
c1 + c2 × (N+1) + c2 × N × (N+1) + c3 × N × N = O(N^2)

37
How do we find f(n)?
(1) Associate a "cost" with each statement.

(2) Find the total number of times each statement is executed.

(3) Add up the costs.

38
Big-O Visualization
O(g(n)) is the set of functions with a smaller or the same order of growth as g(n)

39
More Running Time Examples
i = 0;
while (i<N) {
    X = X + Y;              // O(1)
    result = mystery(X);    // O(N)
    i++;                    // O(1)
}
• The body of the while loop: O(N)
• The loop is executed: N times
N × O(N) = O(N^2)

40
More Running Time Examples (cont'd)

if (i<j)
    for ( i=0; i<N; i++ )   // O(N)
        X = X + i;
else
    X = 0;                  // O(1)

Max( O(N), O(1) ) = O(N)

41
Complexity Examples
What does the following algorithm compute?
int who_knows(int a[], int n) {
    int m = 0;
    for (int i = 0; i < n; i++)
        for (int j = i+1; j < n; j++)
            if ( abs(a[i] - a[j]) > m )
                m = abs(a[i] - a[j]);
    return m;
}
returns the maximum difference between any two numbers in the input array
Comparisons: n-1 + n-2 + n-3 + ... + 1 = (n-1)n/2 = 0.5n^2 - 0.5n
Time complexity is O(n^2)
42
Complexity Examples
Another algorithm solving the same problem (a usage sketch follows below):
int max_diff(int a[], int n) {
    int min = a[0];
    int max = a[0];
    for (int i = 1; i < n; i++)
        if ( a[i] < min )
            min = a[i];
        else if ( a[i] > max )
            max = a[i];
    return max - min;
}
Comparisons: 2n - 2
Time complexity is O(n).

43
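A minimal usage sketch comparing the two algorithms above (the main function and sample data are illustrative assumptions; it assumes who_knows and max_diff are defined in the same file):

#include <stdio.h>
#include <stdlib.h>   /* for abs() used by who_knows */

/* ... who_knows() and max_diff() as defined on the previous slides ... */

int main(void) {
    int a[] = { 7, 2, 9, 4, 11, 5 };
    int n = sizeof a / sizeof a[0];
    /* Both algorithms return the same answer, 9 (= 11 - 2),
       but max_diff does it in O(n) rather than O(n^2). */
    printf("who_knows: %d\n", who_knows(a, n));
    printf("max_diff:  %d\n", max_diff(a, n));
    return 0;
}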
Names of Orders of Magnitude
"Popular" functions g(n) are: n log n, log n, 1, 2^n, n^2, n!, n, n^3
Listed from slowest to fastest growth:

O(1)           bounded (by a constant) time
O(log2 N)      logarithmic time
O(N)           linear time
O(N·log2 N)    N·log2 N time
O(N^2)         quadratic time
O(N^3)         cubic time
O(2^N)         exponential time
44
Examples (cont'd)

45
Examples of Growth Rate
/* @return Position of largest value in "A" */
static int largest(int[] A) {
    int currlarge = 0;                 // position of largest so far
    for (int i = 1; i < A.length; i++)
        if (A[currlarge] < A[i])
            currlarge = i;             // remember new position
    return currlarge;                  // return position of largest
}

The loop body executes n-1 times for an array of length n, so the running time is Θ(n).

46
Examples (cont)

sum = 0;
for (i=1; i<=n; i++)
    for (j=1; j<=n; j++)
        sum++;

The inner statement executes n × n times, so this is Θ(n^2).

47
Time Complexity Examples (1)
a = b;

This assignment takes constant time, so it is O(1).

sum = 0;
for (i=1; i<=n; i++)
    sum += n;

The loop executes n times, with constant work per iteration, so it is O(n).

48
Time Complexity Examples (2)

sum = 0;
for (j=1; j<=n; j++)
    for (i=1; i<=j; i++)
        sum++;
for (k=0; k<n; k++)
    A[k] = k;

The nested loops execute 1 + 2 + ... + n = n(n+1)/2 steps and the last loop n steps, so the total is O(n^2).

49
Time Complexity Examples (3)
sum1 = 0;
for (i=1; i<=n; i++)
    for (j=1; j<=n; j++)
        sum1++;            // executes n*n times: O(n^2)

sum2 = 0;
for (i=1; i<=n; i++)
    for (j=1; j<=i; j++)
        sum2++;            // executes n(n+1)/2 times: also O(n^2)
50
Examples (cont'd)

51
Analyze the complexity of the following code segments

52
