1. INTRODUCTION :

Two important reasons to determine operation and step counts are:
1. to predict the growth in run time as the instance characteristics increase.
2. to compare the time complexity of two programs that perform the same task.

2. ASYMPTOTIC NOTATION :

2.1. Big Oh Notation
Definition-1
Let p(n) and q(n) be two non-negative functions.
* p(n) is asymptotically bigger than q(n) (p(n) asymptotically dominates q(n)) iff

      lim   q(n)/p(n) = 0          (1)
      n->∞

* q(n) is asymptotically smaller than p(n) iff p(n) is asymptotically bigger than q(n).
* p(n) and q(n) are asymptotically equal iff neither is asymptotically bigger than the other.

Slide 5:

Example:

      lim   (10n + 7)/(3n^2 + 2n + 6) = lim (10/n + 7/n^2)/(3 + 2/n + 6/n^2) = 0/3 = 0
      n->∞                              n->∞

So 3n^2 + 2n + 6 is asymptotically bigger than 10n + 7, and 10n + 7 is asymptotically smaller than 3n^2 + 2n + 6.
A similar derivation shows that 8n^4 + 9n^2 is asymptotically bigger than 100n^3 - 3, and that 2n^2 + 3n is asymptotically bigger than 83n.
12n + 6 is asymptotically equal to 6n + 2.
The following table gives the terms that occur frequently in a step-count analysis.
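As a numeric sanity check (a small Python sketch, not part of the original slides), the ratio (10n + 7)/(3n^2 + 2n + 6) can be evaluated for growing n to watch it approach 0:

```python
# Evaluate the ratio (10n + 7) / (3n^2 + 2n + 6) for growing n;
# it tends to 0, so 3n^2 + 2n + 6 asymptotically dominates 10n + 7.
def ratio(n):
    return (10 * n + 7) / (3 * n**2 + 2 * n + 6)

for n in (10, 100, 1000, 10000):
    print(n, ratio(n))
```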

Slide 6:

All terms in the above table have coefficient 1, but in an actual analysis the coefficients of these terms may have different values.
Using equation (1), the terms in the above table are ordered as follows:
1 < log n < n < n log n < n^2 < n^3 < 2^n < n!
( < is to be read as "is asymptotically smaller than" )
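This ordering can be checked numerically (a Python sketch, assuming we simply evaluate each term at one moderately large n):

```python
import math

# Evaluate the common step-count terms at n = 20 (kept small so that
# 2^n and n! stay printable); each term is smaller than the next.
n = 20
terms = [1, math.log2(n), n, n * math.log2(n), n**2, n**3, 2**n, math.factorial(n)]
print(terms)
```

Note that this only illustrates the ordering at one point; the asymptotic claim is about the limit of the ratios as n grows.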
Asymptotic notation
It describes the behavior of the time or space complexity for large instance characteristics.
Asymptotic notation can be developed with reference to step counts, space complexity, or operation counts.
The notation f(n) = O(g(n)) is read as "f(n) is big oh of g(n)".
- It means f(n) is asymptotically smaller than or equal to g(n).
- Therefore, in an asymptotic sense, g(n) is an upper bound for f(n).

Slide 7:

Example:
10n + 7 = O(3n^2 + 2n + 6); 100n^3 - 3 = O(8n^4 + 9n^2)
12n + 6 = O(6n + 2); 6n + 2 = O(12n + 6)
Definition-2
Let t(m,n) and u(m,n) be two terms. t(m,n) is asymptotically bigger than u(m,n) (equivalently, u(m,n) is asymptotically smaller than t(m,n)) iff

      lim    u(m,n)/t(m,n) = 0    or, equivalently,    lim    t(m,n)/u(m,n) = ∞
      m->∞                                             m->∞
      n->∞                                             n->∞

2.2. Omega Notation (Ω ) and Theta Notation (Θ) :

The big oh notation is the most frequently used asymptotic notation.
The ‘omega’ and ‘theta’ notations are sometimes used to describe the asymptotic complexity of a program.
The notation f(n) = Ω(g(n))
- is read as "f(n) is omega of g(n)".
- It means that f(n) is asymptotically bigger than or equal to g(n).
In an asymptotic sense, g(n) is a lower bound for f(n).
The notation f(n) = Θ(g(n))
- is read as "f(n) is theta of g(n)".
- It means that f(n) is asymptotically equal to g(n).
Example
10n + 7 = Ω(n), because 10n + 7 is asymptotically equal to n.
100n^3 - 3 = Ω(n^3); 12n + 6 = Ω(n); 8n^4 + 9n^2 = Ω(n^3)
but 8n^4 + 9n^2 ≠ Ω(n^5)

Slide 9:

Interpretation of O(g(n)), Ω(g(n)) and Θ(g(n)) as the following sets:
1. O(g(n)) = { f(n) | f(n) = O(g(n))}
2. Ω(g(n)) = {f(n) | f(n) = Ω(g(n))}
3. Θ (g(n)) = {f(n) | f(n) = Θ(g(n))}
Under this interpretation
O(g1(n)) = O(g2(n)) and
Θ(g1(n)) = Θ(g2(n)) are meaningful.

3. Asymptotic Mathematics (optional) :

3.1. Big Oh Notation (O)
Definition
f(n) = O(g(n)) iff positive constants c and n0 exist such that f(n) <= c g(n) for all n, n >= n0.
Big oh notation describes an upper bound on the asymptotic growth rate of the function f.
The definition states that the function f is at most c times the function g, except possibly when n is smaller than n0.
Here g is an upper bound on the value of f for all large n (i.e., n >= n0).
The following figure illustrates what it means for a function g(n) to upper bound another function f(n).
f(n) may be less than, equal to, or greater than c g(n) for several values of n.
There must exist a value m of n beyond which f(n) is never greater than c g(n).

Slide 11:

The n0 in the definition of big oh can be any integer >= m.
Figure-1: g(n) is an upper bound on f(n) (the plot shows c g(n) and f(n) against n; beyond n = m, f(n) stays below c g(n)).

Slide 12:

Example: (Linear function)
Consider f(n) = 3n + 2. For n >= 2, 3n + 2 <= 3n + n = 4n, so 3n + 2 = O(n) with c = 4 and n0 = 2.
Checking at n = 2: 3*2 + 2 = 8 and 4*2 = 8, so 3n + 2 <= 4n holds (with equality).
Thus f(n) is bounded from above by a linear function.
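The witnesses c = 4 and n0 = 2 can be checked numerically over a range of n (a small Python sketch, not part of the original slides):

```python
# Verify the big-oh witnesses for f(n) = 3n + 2 = O(n):
# with c = 4 and n0 = 2, f(n) <= c*n must hold for every n >= n0.
def f(n):
    return 3 * n + 2

c, n0 = 4, 2
assert all(f(n) <= c * n for n in range(n0, 10000))
print("3n + 2 <= 4n holds for all n in [2, 10000)")
```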
3.2. Omega Notation (Ω)
Definition:
f(n) = Ω(g(n)) iff positive constants c and n0 exist such that f(n) >= c g(n) for all n, n >= n0.
* The omega notation is the lower-bound counterpart of the big oh notation; it permits us to bound the asymptotic growth rate of f from below.

Slide 13:

Theorem
If f(n) = a_m n^m + ... + a_1 n + a_0 and a_m > 0, then f(n) = Ω(n^m).
Example
i) 3n + 2 = Ω(n)
ii) 10n^2 + 4n + 2 = Ω(n^2)
iii) 100n^4 + 3500n^2 + 82n + 8 = Ω(n^4)
The following figure illustrates what it means for a function g(n) to lower bound another function f(n).
f(n) may be less than, equal to, or greater than c g(n) for several values of n.
There must exist a value m of n beyond which f(n) is never less than c g(n).
The n0 in the definition of omega can be any integer >= m.

Slide 14:

Figure-2: g(n) is a lower bound on f(n) (the plot shows c g(n) and f(n) against n; beyond n = m, f(n) stays above c g(n)).

Slide 15:

3.3. Theta Notation (Θ)
Definition:
* f(n) = Θ(g(n)) iff positive constants c1, c2 and an n0 exist such that c1 g(n) <= f(n) <= c2 g(n) for all n, n >= n0.
* The theta notation is used when the function f can be bounded both from above and below by the same function g.
* f(n) = Θ(g(n)) means that f lies between c1 times the function g and c2 times the function g, except possibly when n is smaller than n0.
c1 and c2 are positive constants.
g is both a lower and an upper bound on the value of f.

Slide 16:

Another representation of f(n) = Θ(g(n)):
f(n) is both Ω(g(n)) and O(g(n)).
The following figure illustrates what it means for a function g(n) to be both an upper and a lower bound on another function f(n).
There must exist a value m of n beyond which f(n) lies between c1 g(n) and c2 g(n).
The n0 in the definition of theta can be any integer >= m.

Slide 17:

Figure-3: g(n) is both a lower and an upper bound on f(n) (the plot shows c1 g(n), f(n) and c2 g(n) against n; beyond n = m, f(n) lies between c1 g(n) and c2 g(n)).

Slide 18:

Theorem
If f(n) = a_m n^m + ... + a_1 n + a_0 and a_m > 0, then f(n) = Θ(n^m).
Example
i) 3n + 2 = Θ(n)
ii) 10n^2 + 4n + 2 = Θ(n^2)
iii) 100n^4 + 3500n^2 + 82n + 8 = Θ(n^4)
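The two-sided bound can be checked numerically for example ii) (a Python sketch; the witnesses c1 = 10, c2 = 16, n0 = 1 are illustrative choices, not from the slides):

```python
# Verify theta witnesses for f(n) = 10n^2 + 4n + 2 = Theta(n^2):
# c1*n^2 <= f(n) <= c2*n^2 must hold for all n >= n0.
# Lower bound: f(n) >= 10n^2 always; upper bound: 4n + 2 <= 6n^2 for n >= 1.
def f(n):
    return 10 * n**2 + 4 * n + 2

c1, c2, n0 = 10, 16, 1
assert all(c1 * n**2 <= f(n) <= c2 * n**2 for n in range(n0, 5000))
print("10n^2 <= f(n) <= 16n^2 holds for all n in [1, 5000)")
```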

3.4. Little Oh Notation (o) :

The little oh notation describes a strict upper bound on the asymptotic growth rate of the function f.
f(n) is little oh of g(n) iff f(n) is asymptotically smaller than g(n).
Definition:
f(n) = o(g(n)) (read as "f of n is little oh of g of n")
iff f(n) = O(g(n)) and f(n) ≠ Ω(g(n))
Example:
i) 3n + 2 = o(n^2), since 3n + 2 = O(n^2) and 3n + 2 ≠ Ω(n^2)
ii) 10n^2 + 4n + 2 = o(n^3) but not o(n^2)
Little oh notation is used in step-count analysis;
e.g., a step count of 3n + o(n) is dominated by the 3n term.
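The strict bound in example i) corresponds to the ratio (3n + 2)/n^2 tending to 0 (a Python sketch, not part of the original slides):

```python
# For f(n) = o(g(n)), the ratio f(n)/g(n) must tend to 0 as n grows:
# here f(n) = 3n + 2 and g(n) = n^2.
def ratio(n):
    return (3 * n + 2) / n**2

values = [ratio(n) for n in (10, 100, 1000, 10000)]
print(values)  # strictly decreasing toward 0
```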

3.5. Properties :

The following theorem is useful in computations that involve asymptotic notation.
Theorem:
For every real x and every real ε > 0:
1. An n0 exists such that (log n)^x < (log n)^(x+ε) for every n, n >= n0.
2. An n0 exists such that (log n)^x < n^ε for every n, n >= n0.
3. An n0 exists such that n^x < n^(x+ε) for every n, n >= n0.
4. For every real y, an n0 exists such that n^x (log n)^y < n^(x+ε) for every n, n >= n0.
5. An n0 exists such that n^x < 2^n for every n, n >= n0.
* The following are some useful identities involving the big oh, omega and theta notations.

Slide 21:

ASYMPTOTIC IDENTITIES (table; the notation in each identity can be any one of O, Ω, Θ)

4. Complexity Analysis Examples :

Determining asymptotic complexity is quite easy and does not require an exact step count.
First determine the asymptotic complexity of each statement (or group of statements) in the program,
then add up these complexities.
The following determines the asymptotic complexity of several methods without performing an exact step-count analysis.
This uses the following rule:
when f1(n) = Θ(g1(n)) and f2(n) = Θ(g2(n)), then
f1(n) + f2(n) = Θ(max{g1(n), g2(n)})
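The 'sum' method the table refers to is not reproduced in these slides; a typical array-summing loop (a Python sketch of the usual version), annotated with per-statement complexities, would look like:

```python
# A typical 'sum' program; each statement's asymptotic complexity is noted.
# Adding the complexities with the max rule gives t_sum(n) = Theta(n).
def array_sum(a, n):
    the_sum = 0              # Theta(1)
    for i in range(n):       # loop executes n times
        the_sum += a[i]      # Theta(1) per iteration -> Theta(n) in total
    return the_sum           # Theta(1)
    # Theta(1) + Theta(n) + Theta(1) = Theta(max{1, n, 1}) = Theta(n)

print(array_sum([1, 2, 3, 4], 4))
```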

Slide 23:

t_sum(n) = Θ(max{g_i(n)}) = Θ(n). Table-1: Asymptotic complexity of the 'sum' program.

5. Practical Complexities :

Asymptotic complexity describes the behavior of the time or space complexity for large instance characteristics.
Common asymptotic functions are
1 (constant), log n, n (linear),
n log n, n^2, n^3,
2^n (exponential), n!
where n usually refers to the number of data instances.
Common asymptotic functions :


Slide 26:

To see how the various functions grow with n, study the following figures 2.23 and 2.24:
- These figures show that 2^n grows very rapidly with n.
- If a program needs 2^n steps for execution, then when n = 40 the number of steps needed is approximately 1.1*10^12; on a computer performing 1,000,000,000 steps/sec this requires 18.3 minutes.
- If n = 50, the same program runs for 13 days on this computer.
- If n = 60, about 36.6 years are required to execute the program.
- If n = 100, about 4*10^13 years are needed.

Slide 27:

Example:
If a program needs n^10 steps, then on a 1,000,000,000 steps/sec computer:
when n = 10, 10 seconds are needed;
when n = 100, 3171 years are needed;
when n = 1000, 3.17*10^13 years are needed.
If a program needs n^3 steps, then on the same computer:
when n = 1000, 1 second is needed;
when n = 10,000, 16.67 minutes are needed;
when n = 100,000, 11.57 days are needed.
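These running times follow directly from dividing the step count by the machine speed (a Python sketch reproducing the arithmetic; the 10^9 steps/sec machine is the one the slides assume):

```python
# Time for a 10^9 steps/sec computer to execute n^10 and n^3 steps.
SPEED = 10**9  # steps per second

def seconds(steps):
    return steps / SPEED

# n^10 at n = 10 -> 10 seconds
print(seconds(10**10))
# n^3 at n = 10,000 -> 1000 seconds, i.e. about 16.67 minutes
print(seconds((10**4)**3) / 60)
# n^3 at n = 100,000 -> about 11.57 days
print(seconds((10**5)**3) / 86400)
```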

Slide 29:

The following figure gives the time that a 1,000,000,000 instructions/sec computer needs to execute a program of complexity f(n) instructions.
