Algorithm Complexity

Presentation Description

This presentation introduces algorithm complexity: how the time and space requirements of an algorithm are measured and how the efficiency of algorithms is compared.

Presentation Transcript

Algorithm Complexity:

Algorithm Complexity

Algorithm:

Algorithm Definition: An algorithm is a well-defined, step-by-step computational procedure that takes some set of values as input and produces some set of values as output. A computer program is a pervasive example of an algorithm: every program is a series of instructions executed in a specific order to perform a specific task. Algorithms are implemented in programming languages such as C, C++, and Java.
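
As a concrete illustration (an added sketch, not part of the original slides), here is a minimal C program implementing such a step-by-step procedure: it takes an array of values as input and produces the largest of them as output.

    #include <stdio.h>

    /* A small, concrete algorithm: take an array of values as input and
       produce the largest value as output, one well-defined step at a time. */
    int maximum(const int values[], int n)
    {
        int max = values[0];                /* assume the first value is the largest so far */
        for (int i = 1; i < n; i++)         /* examine every remaining value                */
            if (values[i] > max)
                max = values[i];            /* keep the larger one                          */
        return max;
    }

    int main(void)
    {
        int data[] = { 4, 9, 2, 7 };
        printf("%d\n", maximum(data, 4));   /* prints 9 */
        return 0;
    }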

Algorithm Complexity:

Algorithm Complexity The analysis of algorithms is the study of their efficiency. Two aspects of an algorithm's efficiency (complexity) are: 1. the amount of time (number of steps) required to execute the algorithm, and 2. the amount of memory space it consumes.

Dominant Operations:

Dominant Operations The following main operations appear in algorithms: assignment ( ← ), comparison ( =, ≠, <, >, ≤, ≥ ), arithmetic operations ( +, -, ×, ÷ ), and logical operations ( and, or, not ). How long does each of these operations take to execute? In general, assignments are very fast, while the other operations are slower; multiplication and division are often slower than addition and subtraction.
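
For illustration only (an added sketch, not from the slides), the short C program below contains one instance of each kind of dominant operation; the comments label them.

    #include <stdio.h>

    int main(void)
    {
        int a = 0;                    /* assignment                               */
        int sum = a + 5;              /* arithmetic operation (+) and assignment  */
        if (sum >= 5 && a == 0)       /* comparisons (>=, ==) and a logical "and" */
            sum = sum * 2;            /* arithmetic (x) and assignment            */
        printf("%d\n", sum);          /* prints 10                                */
        return 0;
    }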

Time Complexity:

Time Complexity The running time of an algorithm is an estimate of the number of operations it performs for a given number of input values. Let T(n) be a measure of the time required to execute an algorithm on a problem of size n; we call T(n) the time complexity function of the algorithm. If n is sufficiently small, the algorithm will not have a long running time.

Time Complexity:

Time Complexity How fast does T(n) increase for large n? This is called the asymptotic behavior of the time complexity function; essentially, we focus on the leading term of T(n). In our time analysis we restrict ourselves to the worst-case behavior of an algorithm, that is, the longest running time over all inputs of size n.

Example:

Example If T(n) = 4n³ + 2n² + n + 5, then T(n) = n³(4 + 2/n + 1/n² + 5/n³), and for large n we have T(n) ≃ 4n³. We say that T(n) has a growth of order n³. We say that one algorithm is more efficient than another if its worst-case running time has a lower order of growth.
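
A quick numerical check (an added sketch, not from the slides) makes the same point: evaluating T(n)/n³ for increasing n shows the ratio settling towards the leading coefficient 4.

    #include <stdio.h>

    /* Evaluate T(n) = 4n^3 + 2n^2 + n + 5 and show that for large n the
       leading term dominates: T(n)/n^3 approaches 4.                    */
    int main(void)
    {
        for (double n = 10; n <= 1e6; n *= 100) {
            double t = 4*n*n*n + 2*n*n + n + 5;
            printf("n = %8.0f   T(n)/n^3 = %f\n", n, t / (n*n*n));
        }
        return 0;
    }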

Exercise:

Exercise What is the run-time complexity, in terms of n, of the following algorithm segment:
1. for i = 1 to n do
1.1 for j = 1 to n do
1.1.1 A(i,j) ← x
Solution The inner loop 1.1 executes n times for each of the n iterations of the outer loop 1, so the assignment 1.1.1 executes n × n times. Hence T(n) = n² and the growth is of order n².
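
The same loops can be written out in C and instrumented with a counter (an added sketch; the size N and the value x are arbitrary choices for the demonstration) to confirm that the assignment runs exactly n × n times.

    #include <stdio.h>

    #define N 100   /* problem size n, arbitrary for the demonstration */

    int main(void)
    {
        static int A[N][N];
        int x = 7;                          /* arbitrary value to store          */
        long count = 0;                     /* how many times A[i][j] <- x runs  */

        for (int i = 0; i < N; i++)         /* outer loop: n iterations          */
            for (int j = 0; j < N; j++) {   /* inner loop: n iterations each     */
                A[i][j] = x;
                count++;
            }

        printf("assignments: %ld (n*n = %d)\n", count, N * N);   /* 10000 */
        return 0;
    }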

Exercise:

Exercise Estimate the time complexity of the following algorithm:
1. i ← 1
2. p ← 1
3. for j = 1 to n do
3.1 p ← p × i
3.2 i ← i + 1
Solution Statements 1 and 2 are two assignments that initialize the variables i and p. Loop 3 executes n times, and on each pass it performs two assignments and two arithmetic operations (3.1 and 3.2), i.e. four operations. Thus the time complexity of the algorithm is T(n) = 4n + 2, so the growth is of order n.
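
Translated into C with an explicit step counter (an added sketch, with n fixed at 10 purely for the demonstration), the count comes out to exactly 4n + 2.

    #include <stdio.h>

    int main(void)
    {
        int n = 10;                        /* problem size, arbitrary for the demo    */
        long steps = 0;                    /* counts assignments and arithmetic ops   */

        long i = 1;  steps++;              /* step 1: one assignment                  */
        long p = 1;  steps++;              /* step 2: one assignment                  */
        for (int j = 1; j <= n; j++) {     /* loop 3 executes n times                 */
            p = p * i;  steps += 2;        /* 3.1: one multiplication, one assignment */
            i = i + 1;  steps += 2;        /* 3.2: one addition, one assignment       */
        }

        printf("p = %ld, steps = %ld, 4n + 2 = %d\n", p, steps, 4*n + 2);
        return 0;
    }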

Example:

Example Two algorithms, A and B, have time complexities f and g respectively, where f(n) = 3n² - 3n + 1 and g(n) = n². Determine which algorithm is faster. Solution First, find the values of n for which f(n) = g(n):
n² = 3n² - 3n + 1
2n² - 3n + 1 = 0
(2n - 1)(n - 1) = 0
n = 1/2 or n = 1

Example:

Example Now consider values beyond the larger root, that is n > 1. For example, with n = 2 we get f(2) = 3(2)² - 3(2) + 1 = 7 and g(2) = 2² = 4. Thus g(n) < f(n) for all n > 1. Hence g grows more slowly, so algorithm B is faster.
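
The comparison is easy to check numerically (an added sketch, not part of the slides): tabulating f(n) and g(n) for a few values of n shows g(n) falling below f(n) as soon as n exceeds 1.

    #include <stdio.h>

    /* f(n) = 3n^2 - 3n + 1 is the cost of algorithm A,
       g(n) = n^2           is the cost of algorithm B. */
    int main(void)
    {
        for (long n = 1; n <= 5; n++) {
            long f = 3*n*n - 3*n + 1;
            long g = n*n;
            printf("n = %ld   f(n) = %2ld   g(n) = %2ld\n", n, f, g);
        }
        return 0;
    }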

Time-Space Trade-off:

Time-Space Trade-off Any algorithm can be analyzed in terms of its time complexity or its space complexity. The two are often in tension: reducing the time an algorithm takes may require more memory, and reducing its memory use may make it run longer. This behavior is known as the time-space trade-off.
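
As one common illustration of the trade-off (an added sketch, not from the slides), the two Fibonacci functions below trade memory for time: the plain recursive version uses almost no extra space but repeats work, while the memoised version spends extra space on a table so that each value is computed only once.

    #include <stdio.h>

    #define MAX_N 50

    /* Plain recursion: little extra memory, but the same subproblems are
       recomputed again and again, so the running time grows exponentially. */
    long fib_slow(int n)
    {
        return n < 2 ? n : fib_slow(n - 1) + fib_slow(n - 2);
    }

    /* Memoised version: spends extra memory on a table of results and in
       exchange computes each value only once, so the time is linear in n. */
    long fib_fast(int n)
    {
        static long memo[MAX_N + 1];       /* extra space traded for speed */
        if (n < 2) return n;
        if (memo[n] == 0)
            memo[n] = fib_fast(n - 1) + fib_fast(n - 2);
        return memo[n];
    }

    int main(void)
    {
        printf("%ld %ld\n", fib_slow(30), fib_fast(30));   /* both print 832040 */
        return 0;
    }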
