Lecture 3: Insertion Sort and Complexity Analysis


Page 1: Lecture 3   insertion sort and complexity analysis

Lecture 3 : Analysis of Algorithms & Insertion Sort

Jayavignesh T

Asst Professor

SENSE

Page 2: Lecture 3   insertion sort and complexity analysis

Time Complexity

• The amount of computer time an algorithm requires to run to completion.

• It is difficult to express time complexity in terms of physically clocked time.

• Drawbacks of measuring running time in terms of seconds, milliseconds, etc. are:

– Dependence on the speed of the underlying hardware

– Number of other programs running (system load)

– Dependence on the compiler used to generate the machine code

Page 3: Lecture 3   insertion sort and complexity analysis

How to calculate running time then?

• Time complexity is given in terms of the FREQUENCY COUNT

• The count denotes the number of times each statement is executed.

for (i = 0; i < n; i++) {    // St1 (i=0) : 1 time, St2 (i<n) : n+1 times, St3 (i++) : n times
    sum = sum + a[i];        // n times
}

Total count = 1 + (n+1) + n + n = 3n + 2 = O(n), neglecting constants and lower-order terms
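A minimal runnable version of the same loop, with an explicit counter added to verify the 3n + 2 figure (the array contents, its size, and the count variable are illustrative additions, not part of the slide):

#include <stdio.h>

int main(void) {
    int a[] = {3, 1, 4, 1, 5};          /* sample input, n = 5 */
    int n = sizeof(a) / sizeof(a[0]);
    int sum = 0;
    long count = 1;                      /* St1: i = 0 runs once */
    for (int i = 0; i < n; i++) {
        count++;                         /* St2: one successful i < n test */
        sum = sum + a[i];  count++;      /* loop body runs n times */
        count++;                         /* St3: i++ runs n times */
    }
    count++;                             /* final, failing i < n test */
    printf("sum = %d, frequency count = %ld, 3n + 2 = %d\n", sum, count, 3 * n + 2);
    return 0;
}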

Page 4: Lecture 3   insertion sort and complexity analysis

How to calculate running time then?

for (i = 0; i < n; i++)          // i=0 : 1 ; i<n : n+1 ; i++ : n times
{
    for (j = 0; j < n; j++)      // j=0 : n ; j<n : n(n+1) ; j++ : n·n times
    {
        c[i][j] = a[i][j] + b[i][j];   // n·n times
    }
}

Total count = 1 + (n+1) + n + n + n(n+1) + n·n + n·n = 3n² + 4n + 2 = O(n²)

Page 5: Lecture 3   insertion sort and complexity analysis

Time Complexity

• Number of steps required by an algorithm varies with the size of the problem it is solving.

• Normally expressed as order of magnitude

– e.g., O(n²)

– If the size of the problem doubles, the algorithm takes about 4 times as many steps to complete
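To see why, a quick check assuming the running time is roughly c·n² for some constant c:

T(2n) ≈ c·(2n)² = 4·c·n² ≈ 4·T(n)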

Page 6: Lecture 3   insertion sort and complexity analysis

How to calculate running time then?

• Almost all algorithms run longer on larger inputs

• An algorithm's efficiency is therefore expressed as a function f(n) of the input size n

• Identify the most important operation of the algorithm – the BASIC operation

• Basic operation – the operation contributing most of the total running time

• Compute the number of times the basic operation is executed (it usually sits in the innermost loop)

Ex: Sorting algorithms – comparison (<, >)

Matrix multiplication, polynomial evaluation – arithmetic operations (*, +)

Other common basic operations: = (assignment), == (equality test), etc.

Page 7: Lecture 3   insertion sort and complexity analysis
Page 8: Lecture 3   insertion sort and complexity analysis

Order of Growth of Algorithm

• Measuring the performance of an algorithm in relation to the input size n

• We cannot say the running time equals n², only that it grows like n²

Page 9: Lecture 3   insertion sort and complexity analysis

EFFICIENCY COMPARISONS

Page 10: Lecture 3   insertion sort and complexity analysis

Rate of Growth of Algorithm as a Function of Input Size

Page 11: Lecture 3   insertion sort and complexity analysis

Determination of Complexities

• How do you determine the running time of a piece of code?

Ans: It depends on the kinds of statements used.

Page 12: Lecture 3   insertion sort and complexity analysis

1. Sequence of Statements

Statement 1;
Statement 2;
...
Statement k;

• Independent statements in a piece of code (not an unrolled loop)

• Total time: add up the time for all statements.

• Total Time = Time (Statement 1) + Time (Statement 2) + … + Time (Statement k)

• If each statement is simple (only basic operations), its time is constant, so the total time is also constant: O(1)

Page 13: Lecture 3   insertion sort and complexity analysis

1 (Constant Time)

• When the instructions of a program are executed once, or at most only a few times, the running time of the algorithm is said to be constant.

• It is independent of the problem size.

• It is represented as O(1).

• For example, the best-case complexity of linear search is O(1)
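A minimal linear search sketch (the function name and signature are illustrative, not from the slides); the best case occurs when the key sits at index 0, while the worst case, discussed later, scans all n elements:

/* Returns the index of key in a[0..n-1], or -1 if it is not present. */
int linear_search(const int a[], int n, int key) {
    for (int i = 0; i < n; i++) {
        if (a[i] == key)      /* basic operation: comparison */
            return i;         /* best case: key at i == 0  ->  O(1) */
    }
    return -1;                /* worst case: n comparisons  ->  O(n) */
}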

Page 14: Lecture 3   insertion sort and complexity analysis

Log n (Logarithmic)

• An algorithm that solves a large problem by repeatedly transforming it into smaller subproblems has a running time that is logarithmic in nature.

• It becomes only slightly slower as n grows.

• It does not process all the data elements of the input of size n.

• The running time does not double until n increases to n², since log(n²) = 2·log n.

• It is represented as O(log n).

• For example, the running time complexity of binary search is O(log n).
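A minimal iterative binary search sketch on a sorted array (the function name and signature are illustrative); each iteration halves the remaining search range, so at most about log₂ n comparisons are needed:

/* Returns the index of key in the sorted array a[0..n-1], or -1 if absent. */
int binary_search(const int a[], int n, int key) {
    int low = 0, high = n - 1;
    while (low <= high) {
        int mid = low + (high - low) / 2;   /* avoids overflow of low + high */
        if (a[mid] == key)
            return mid;
        else if (a[mid] < key)
            low = mid + 1;                  /* discard the left half */
        else
            high = mid - 1;                 /* discard the right half */
    }
    return -1;
}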

Page 15: Lecture 3   insertion sort and complexity analysis

2.For loops

for (i = 0; i < N; i++)

{

Sequence of statements

}

• Loop executes N times, Sequence of statements also executes N times.

• Total time for the for loop = N*O(1) = O(N)

Page 16: Lecture 3   insertion sort and complexity analysis

3.If-then-else statements

if (cond) {
    Sequence of statements 1
}
else {
    Sequence of statements 2
}

• Either Sequence 1 or Sequence 2 will execute.

• Worst-case time is the slower of the two possibilities

– Max { time (sequence 1), time (sequence 2) }

– If Sequence 1 is O(N) and Sequence 2 is O(1), the worst-case time for the if-then-else would be O(N)

Page 17: Lecture 3   insertion sort and complexity analysis

n (Linear)

• The complete set of instructions is executed once for each input element, i.e., the entire input of size n is processed.

• It is represented as O(n).

• This is the best one can do when the whole input has to be processed.

• In this situation the time requirement increases directly (linearly) with the size of the problem.

• For example, the worst-case complexity of linear search is O(n).

Page 18: Lecture 3   insertion sort and complexity analysis

4.Nested Loops

for (i = 0; i < N; i++) {
    for (j = 0; j < M; j++) {
        sequence of statements;
    }
}

Total Complexity = O(N*M)

= O(N²) when M = N

Page 19: Lecture 3   insertion sort and complexity analysis

5.Statement with function calls

• for (j = 0; j < N; j++) g(N); has complexity O(N²)

– The loop executes N times

– g(N) has complexity O(N), so the total is N · O(N) = O(N²)
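A sketch of what such a g might look like (the body of g is an illustrative assumption; the slide states only that g(N) is O(N)):

/* A hypothetical g that does work proportional to N on each call. */
void g(int N) {
    for (int k = 0; k < N; k++) {
        /* constant-time work per iteration */
    }
}

int main(void) {
    int N = 1000;
    for (int j = 0; j < N; j++)   /* N calls ...                               */
        g(N);                     /* ... each O(N)  ->  O(N*N) = O(N^2) total  */
    return 0;
}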

Page 20: Lecture 3   insertion sort and complexity analysis

n2 (Quadratic)

• The running time of an algorithm is quadratic in nature when it processes all pairs of data items.

• Such an algorithm will have two nested loops.

• For input size n, the running time will be O(n²).

• Practically, this is useful for problems with small input sizes or for elementary sorting problems.

• In this situation the time requirement increases quickly with the size of the problem.

• For example, the running time complexity of insertion sort is O(n²).

Page 21: Lecture 3   insertion sort and complexity analysis

Performance Classification

Page 22: Lecture 3   insertion sort and complexity analysis

Efficiency comparisons

Page 23: Lecture 3   insertion sort and complexity analysis

Function of Growth Rate

Page 24: Lecture 3   insertion sort and complexity analysis

Prob 1. Calculate the worst-case complexity!

• Nested Loop + Non-nested loop

for (i = 0; i < N; i++) {
    for (j = 0; j < N; j++) {
        sequence of statements;
    }
}

for (k = 0; k < N; k++) {
    sequence of statements;
}

• O(N²) + O(N) = O(max(N², N)) = O(N²)

Page 25: Lecture 3   insertion sort and complexity analysis

Prob 2. Calculate the worst-case complexity!

• Nested Loop

for (i = 0; i < N; i++) {
    for (j = i; j < N; j++) {
        sequence of statements;
    }
}

• N + (N-1) + (N-2) + … + 1 = N(N+1)/2 = O(N²)

Page 26: Lecture 3   insertion sort and complexity analysis

Approaches of Designing Algorithms

• Incremental Approach

• Insertion sort

– In each iteration one more element joins the sorted array

• Divide and Conquer Approach

– Recursively break the problem down into 2 or more subproblems until they become easy to solve; the subproblem solutions are then combined to give the solution to the original problem

• Merge Sort

• Quick Sort

Page 27: Lecture 3   insertion sort and complexity analysis

Insertion Sort

Example hand: 3 4 6 8 9 7 2 5 1 (array indices run 1 … n; j marks the card currently being inserted, i scans the already sorted cards to its left)

Strategy

• Start empty-handed

• Insert each card into the right position in the already sorted hand

• Continue until all the cards are inserted/sorted
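A minimal C sketch of this strategy (0-indexed arrays; the slides' own pseudocode on the following pages uses 1-based indexing, j = 2 … n):

/* Insertion sort: in each iteration a[j] is inserted into the sorted prefix a[0..j-1]. */
void insertion_sort(int a[], int n) {
    for (int j = 1; j < n; j++) {
        int key = a[j];                 /* the "card" being inserted */
        int i = j - 1;
        while (i >= 0 && a[i] > key) {
            a[i + 1] = a[i];            /* shift larger elements one slot to the right */
            i--;
        }
        a[i + 1] = key;                 /* place the card in its position */
    }
}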

Page 28: Lecture 3   insertion sort and complexity analysis

Analysis – Insertion Sort

Page 29: Lecture 3   insertion sort and complexity analysis

Insertion Sort – Tracing Input

Page 30: Lecture 3   insertion sort and complexity analysis

Analysis – Insertion Sort

• Assume that the i-th line takes time ci, which is a constant. (Since the third line is a comment, it takes no time.)

• For j = 2, 3, . . . , n, let tj be the number of times that the while loop test is executed for that value of j .

• Note that when a for or while loop exits in the usual way - due to the test in the loop header - the test is executed one time more than the loop body.

Page 31: Lecture 3   insertion sort and complexity analysis

Analysis – Insertion Sort – Running time
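The pseudocode and its line-by-line costs appear only as an image in the original slides. As a sketch, assuming the standard CLRS-style insertion sort pseudocode (lines numbered 1–8, with line 3 the comment), the total running time is:

T(n) = c1·n + c2·(n−1) + c4·(n−1)
     + c5 · Σ_{j=2..n} t_j
     + (c6 + c7) · Σ_{j=2..n} (t_j − 1)
     + c8·(n−1)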

Page 32: Lecture 3   insertion sort and complexity analysis

Best case Analysis
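A sketch of the standard best-case argument (the slide's own working is shown as an image): in the best case the array is already sorted, the while-loop test fails immediately, and t_j = 1 for every j, so the Σ(t_j − 1) terms vanish:

T(n) = c1·n + (c2 + c4 + c5 + c8)·(n−1)
     = a·n + b   for constants a, b   ->   linear, i.e. O(n)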

Page 33: Lecture 3   insertion sort and complexity analysis
Page 34: Lecture 3   insertion sort and complexity analysis
Page 35: Lecture 3   insertion sort and complexity analysis

Worst case Analysis
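A sketch of the standard worst-case argument: with the array in reverse sorted order, each a[j] is compared against every element of the sorted prefix, so t_j = j:

Σ_{j=2..n} t_j = n(n+1)/2 − 1,   Σ_{j=2..n} (t_j − 1) = n(n−1)/2

T(n) = a·n² + b·n + c   for suitable constants   ->   quadratic, i.e. O(n²)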

Page 36: Lecture 3   insertion sort and complexity analysis

Average Case
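A sketch of the standard average-case argument: on average about half of the elements in the sorted prefix are larger than a[j], so t_j ≈ j/2; the sums remain quadratic, and the average case, like the worst case, is O(n²):

Σ_{j=2..n} t_j ≈ (1/2)·(n(n+1)/2 − 1) = Θ(n²)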

Page 37: Lecture 3   insertion sort and complexity analysis
Page 38: Lecture 3   insertion sort and complexity analysis

Divide-and-Conquer

• The most well-known algorithm design strategy:

1. Divide instance of problem into two or more smaller instances

2. Solve smaller instances recursively

3. Obtain solution to original (larger) instance by combining these solutions

• Its running time is described by a characteristic type of recurrence relation
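For instance (a standard example, not taken from the slide): merge sort splits the input into two halves and merges the sorted halves in linear time, giving the recurrence

T(n) = 2·T(n/2) + c·n,   T(1) = d   =>   T(n) = O(n log n)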

Page 39: Lecture 3   insertion sort and complexity analysis

Divide-and-Conquer Technique (cont.)