
CHAPTER NO : 4 INTRODUCTION TO ALGORITHMS

DEFINE RECURRENCE RELATION :

A recurrence relation for the sequence {an} is an equation that expresses an in terms of one or more of the previous terms of the sequence, namely a0, a1, …, an−1, for all integers n with n ≥ n0, where n0 is a nonnegative integer. A sequence is called a solution of a recurrence relation if its terms satisfy the recurrence relation.

In other words, a recurrence relation is like a recursively defined sequence, but without specifying any initial values (initial conditions). Therefore, the same recurrence relation can have (and usually has) multiple solutions. If both the initial conditions and the recurrence relation are specified, then the sequence is uniquely determined.

Consider the recurrence relation an = 2an−1 − an−2 for n = 2, 3, 4, …. Is the sequence {an} with an = 3n a solution of this recurrence relation? For n ≥ 2 we see that 2an−1 − an−2 = 2(3(n − 1)) − 3(n − 2) = 3n = an. Therefore, {an} with an = 3n is a solution of the recurrence relation. Is the sequence {an} with an = 5 a solution of the same recurrence relation? For n ≥ 2 we see that 2an−1 − an−2 = 2·5 − 5 = 5 = an.

Therefore, {an} with an=5 is also a solution of the recurrence relation.
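Both claims can be checked mechanically. The following Python sketch (my own illustration, not part of the original text) tests whether a candidate sequence satisfies an = 2an−1 − an−2 for a range of n:

```python
# Check whether a candidate sequence a(n) satisfies a_n = 2*a_{n-1} - a_{n-2}.
def satisfies_recurrence(a, n_max):
    return all(a(n) == 2 * a(n - 1) - a(n - 2) for n in range(2, n_max + 1))

print(satisfies_recurrence(lambda n: 3 * n, 100))  # a_n = 3n is a solution
print(satisfies_recurrence(lambda n: 5, 100))      # the constant a_n = 5 is too
print(satisfies_recurrence(lambda n: n * n, 100))  # a_n = n^2 is not
```

The first two print True and the last prints False, matching the calculations above and illustrating that one recurrence admits many solutions.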

There are four methods for solving recurrence relations:

1. TREE METHOD
2. ITERATION METHOD
3. SUBSTITUTION METHOD
4. MASTER METHOD

Define Tree method :

Another common pattern of computation is called tree recursion. As an example, consider computing the sequence of Fibonacci numbers, in which each number is the sum of the preceding two:

0,1,1,2,3,5,8,13,21,...
In general, the Fibonacci numbers can be defined by the rule

Fib(n) = 0 if n = 0,
Fib(n) = 1 if n = 1,
Fib(n) = Fib(n − 1) + Fib(n − 2) otherwise.

We can immediately translate this definition into a recursive procedure for computing Fibonacci numbers:

(define (fib n)
  (cond ((= n 0) 0)
        ((= n 1) 1)
        (else (+ (fib (- n 1)) (fib (- n 2))))))
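The same procedure in Python, instrumented with a call counter (a sketch of my own, for illustration), makes the tree-recursive blow-up visible: each call spawns two more, so the number of calls grows exponentially in n.

```python
calls = 0  # counts how many times fib is invoked

def fib(n):
    """Tree-recursive Fibonacci, mirroring the Scheme definition above."""
    global calls
    calls += 1
    if n == 0:
        return 0
    if n == 1:
        return 1
    return fib(n - 1) + fib(n - 2)

print(fib(10))   # 55
print(calls)     # 177 calls just to compute fib(10)
```

The call count itself satisfies a recurrence, calls(n) = 1 + calls(n − 1) + calls(n − 2), which is why the tree of calls, and hence the running time, grows exponentially.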

As an example of the tree method applied to a recurrence, consider

T(n) = 2T(n/2) + n^2.

The recursion tree for this recurrence has the root costing n^2, two children each costing (n/2)^2, four grandchildren each costing (n/4)^2, and so on: at depth i there are 2^i nodes, each costing (n/2^i)^2.

In this case, it is straightforward to sum across each row of the tree to obtain the total work done at a given level: level i contributes 2^i · (n/2^i)^2 = n^2/2^i. Summing over all levels gives n^2 · (1 + 1/2 + 1/4 + …) ≤ 2n^2, so T(n) = Θ(n^2).
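A small Python check (my own sketch, assuming T(1) = 1 and n a power of two) confirms that the geometric decay of the level sums keeps the total below twice the root's cost:

```python
def T(n):
    # T(n) = 2*T(n/2) + n^2, with T(1) = 1; n is assumed a power of two.
    if n == 1:
        return 1
    return 2 * T(n // 2) + n * n

for n in [2 ** k for k in range(1, 11)]:
    assert T(n) <= 2 * n * n   # total work is at most twice the root's n^2
print(T(1024) / (1024 * 1024))  # ratio approaches 2 as n grows
```

Unrolling the recursion exactly gives T(n) = 2n^2 − n for these base-case choices, which is consistent with the Θ(n^2) bound from the tree.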

Define Iteration :

Iteration means the act of repeating a process, usually with the aim of approaching a desired goal, target, or result. Each repetition of the process is also called an "iteration," and the results of one iteration are used as the starting point for the next.

As an example, consider the recurrence T(n) = T((7/8)n) + 2n. Repeatedly substituting the recurrence into itself gives:

T(n) = T((7/8)^1 · n) + 2 · (7/8)^0 · n
     = T((7/8)^2 · n) + 2 · (7/8)^1 · n + 2 · (7/8)^0 · n
     = T((7/8)^3 · n) + 2 · (7/8)^2 · n + 2 · (7/8)^1 · n + 2 · (7/8)^0 · n
     ...
     = T((7/8)^k · n) + 2n · Σ_{j=0}^{k−1} (7/8)^j

Now let k grow until (7/8)^k · n reaches the base case. Since the geometric series Σ_{j≥0} (7/8)^j converges to 1/(1 − 7/8) = 8, the accumulated work is at most 2n · 8 = 16n, so T(n) = O(n).
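This unrolling can be checked numerically with a short Python sketch (my own illustration: it treats n as a real number and stops when the argument drops below 1, an assumed base case):

```python
def iterate(n):
    """Unroll T(n) = T((7/8)*n) + 2*n until the argument drops below 1."""
    total = 0.0
    while n >= 1:
        total += 2 * n     # work contributed at this level of the unrolling
        n *= 7 / 8         # shrink the subproblem by the factor 7/8
    return total

for n in [100, 1000, 10000]:
    print(iterate(n) / n)  # the ratio stays below 16, the geometric-series bound
```

The printed ratios approach but never exceed 16, matching the closed-form bound 2n · Σ (7/8)^j ≤ 16n from the geometric series.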

Using Substitution Method :

The substitution method is a condensed way of proving an asymptotic bound on a recurrence by induction. In the substitution method, instead of trying to find an exact closed-form solution, we only try to find a closed-form bound on the recurrence. This is often much easier than finding a full closed-form solution, as there is much greater leeway in dealing with constants. The substitution method is a powerful approach that is able to prove upper bounds for almost all recurrences. However, its power is not always needed; for certain types of recurrences, the master method (see below) can be used to derive a tight bound with less work. In those cases, it is better to simply use the master method, and to save the substitution method for recurrences that actually need its full power.

Consider the following recurrence relation, which shows up fairly frequently for some types of algorithms:

T(1) = 1
T(n) = 2T(n − 1) + c1

By expanding this out a bit (using the "iteration method"), we can guess that this will be O(2^n). To use the substitution method to prove this bound, we now need to guess a closed-form upper bound based on this asymptotic bound. We will guess an upper bound of k·2^n − b, where b is some constant. We include the b in anticipation of having to deal with the constant c1 that appears in the recurrence relation, and because it does no harm. In the process of proving this bound by induction, we will generate a set of constraints on k and b, and if b turns out to be unnecessary, we will be able to set it to whatever we want at the end. Our property, then, is T(n) ≤ k·2^n − b, for some two constants k and b. Note that this property logically implies that T(n) is O(2^n), which can be verified with reference to the definition of O.

Base case: n = 1. T(1) = 1 ≤ k·2^1 − b = 2k − b. This is true as long as k ≥ (b + 1)/2.

Inductive case: We assume our property is true for n − 1. We now want to show that it is true for n.

T(n) = 2T(n − 1) + c1
     ≤ 2(k·2^(n−1) − b) + c1   (by IH)
     = k·2^n − 2b + c1
     ≤ k·2^n − b

This is true as long as b ≥ c1.

So we end up with two constraints that need to be satisfied for this proof to work, and we can satisfy them simply by letting b = c1 and k = (b + 1)/2, which is always possible, as the definition of O allows us to choose any constant. Therefore, we have proved that our property is true, and so T(n) is O(2^n). The biggest thing worth noting about this proof is the importance of adding additional terms to the upper bound we assume. In almost all cases in which the recurrence has constants or lower-order terms, it will be necessary to have additional terms in the upper bound to "cancel out" the constants or lower-order terms. Without the right additional terms, the inductive case of the proof will get stuck in the middle, or generate an impossible constraint; this is a signal to go back to your upper bound and determine what else needs to be added to it that will allow the proof to proceed without causing the bound to change in asymptotic terms.
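To see the constants concretely, here is a Python check (my own sketch; it assumes c1 = 3 for illustration, so b = 3 and k = 2) confirming that T(n) ≤ k·2^n − b for a range of n:

```python
C1 = 3                      # assumed additive constant in the recurrence
B = C1                      # choosing b = c1 satisfies the constraint b >= c1
K = (B + 1) / 2             # choosing k = (b + 1)/2 satisfies k >= (b + 1)/2

def T(n):
    """T(1) = 1, T(n) = 2*T(n-1) + c1, computed directly."""
    return 1 if n == 1 else 2 * T(n - 1) + C1

for n in range(1, 20):
    assert T(n) <= K * 2 ** n - B   # the bound proved by substitution
print("bound holds for n = 1..19")
```

For these choices the bound is in fact tight: unrolling gives T(n) = 2·2^n − 3 exactly, so the inequality holds with equality, which is why no smaller k would work.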

Using Master Method :

The master method is a cookbook method for solving recurrences. Although it cannot solve all recurrences, it is nevertheless very handy for dealing with many recurrences seen in practice. Suppose you have a recurrence of the form

T(n) = aT(n/b) + f(n),

where a and b are arbitrary constants and f is some function of n. This recurrence would arise in the analysis of a recursive algorithm that for large inputs of size n breaks the input up into a subproblems each of size n/b, recursively solves the subproblems, then recombines the results. The work to split the problem into subproblems and recombine the results is f(n). We can visualize this as a recurrence tree, where the nodes in the tree have a branching factor of a. The top node has work f(n) associated with it, the next level has work f(n/b) associated with each of a nodes, the next level has work f(n/b^2) associated with each of a^2 nodes, and so on. At the leaves are the base cases, corresponding to some 1 ≤ n < b. The tree has log_b n levels, so the total number of leaves is a^(log_b n) = n^(log_b a).

The total time taken is just the sum of the time taken at each level. The time taken at the i-th level is a^i · f(n/b^i), and the total time is the sum of this quantity as i ranges from 0 to log_b n − 1, plus the time taken at the leaves, which is constant for each leaf times the number of leaves, or O(n^(log_b a)):

T(n) = Σ_{0 ≤ i < log_b n} a^i · f(n/b^i) + O(n^(log_b a)).

What this sum looks like depends on how the asymptotic growth of f(n) compares to the asymptotic growth of the number of leaves. There are three cases:

Case 1: f(n) is O(n^(log_b a − ε)) for some ε > 0. Since the leaves grow faster than f, asymptotically all of the work is done at the leaves, so T(n) is Θ(n^(log_b a)).

Case 2: f(n) is Θ(n^(log_b a)). The leaves grow at the same rate as f, so the same order of work is done at every level of the tree. The tree has O(log n) levels, times the work done on one level, yielding T(n) is Θ(n^(log_b a) · log n).

Case 3: f(n) is Ω(n^(log_b a + ε)) for some ε > 0. In this case f grows faster than the number of leaves, which means that asymptotically the total amount of work is dominated by the work done at the root node. For the upper bound, we also need an extra smoothness (regularity) condition on f in this case, namely that a·f(n/b) ≤ c·f(n) for some constant c < 1 and large n. In this case T(n) is Θ(f(n)).

As mentioned, the master method does not always apply. For example, recurrences in which the subproblem sizes are unequal, such as T(n) = T(n/3) + T(2n/3) + n, are not covered by the master method. Let's look at a few examples where the master method does apply.

Example 1:

Say you have derived the recurrence relation T(n) = 8T(n/2) + c·n^2, where c is some positive constant. We see that this has the appropriate form for applying the master method, and that a = 8, b = 2, and f(n) = c·n^2. c·n^2 is O(n^(log_2 8 − ε)) = O(n^(3 − ε)) for any ε ≤ 1, so this falls into case 1. Therefore, T(n) is Θ(n^3).

Example 2:

Say you have derived the recurrence relation T(n) = T(n/2) + c·n, where c is some positive constant. We see that this has the appropriate form for applying the master method, and that a = 1, b = 2, and f(n) = c·n. Then f(n) is Ω(n^(log_2 1 + ε)) = Ω(n^(0 + ε)) for any ε ≤ 1, so this falls into case 3. And a·f(n/b) = c·n/2 = (1/2)·f(n), so the regularity condition holds with constant 1/2; therefore T(n) is Θ(n).

Example 3:

Say you have derived the recurrence relation T(n) = 8T(n/4) + c·n^(3/2), where c is some positive constant. We see that this has the appropriate form for applying the master method, and that a = 8, b = 4, and f(n) = c·n^(3/2). c·n^(3/2) is Θ(n^(log_4 8)) = Θ(n^(3/2)), so this falls into case 2. Therefore, T(n) is Θ(n^(3/2) · log n).
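When the driving function is a polynomial f(n) = c·n^d, the three cases reduce to comparing d with the critical exponent log_b a, and polynomial drivers automatically satisfy the case-3 regularity condition. The following Python sketch (function name and output format are my own, not from the text) classifies such recurrences:

```python
import math

def master(a, b, d):
    """Classify T(n) = a*T(n/b) + Theta(n^d) by the three master-method cases."""
    crit = math.log2(a) / math.log2(b)       # the critical exponent log_b(a)
    if d < crit:
        return f"Theta(n^{crit:g})"          # case 1: the leaves dominate
    if d == crit:
        return f"Theta(n^{crit:g} log n)"    # case 2: every level contributes equally
    return f"Theta(n^{d:g})"                 # case 3: the root dominates

print(master(8, 2, 2))    # Example 1: Theta(n^3)
print(master(1, 2, 1))    # Example 2: Theta(n^1)
print(master(8, 4, 1.5))  # Example 3: Theta(n^1.5 log n)
```

This covers only polynomial f(n); recurrences with non-polynomial driving functions, or ones that fall in the gaps between the cases, still require the general statement above.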
