
  • Subject: Data Structure with C

    Topic: Recursion In this chapter we present the concepts of recursion through the following steps as milestones.

    The presentation starts with the definition of recursion, how recursion works, the design of recursive algorithms with an appropriate methodology, and the limitations of recursion, and ends by summarizing the overall concepts.

    There are mainly two approaches to repetition: iteration and recursion. Recursion is a repetitive process in which an algorithm calls itself. By and large, recursion is organized in such a way that a function or subroutine calls itself. Figure 1 and Figure 2 portray typical pictorial examples of recursive calling.

    Figure 1: Recursive call

    In a similar way, a process can be characterized as iterative whenever its definition involves only the algorithm's parameter(s) and not the algorithm itself, whereas in a recursive algorithm the algorithm appears within its own definition.

    In other words, recursion is a programming technique in which a method can call itself to solve a problem.

    It can also be defined in the following way: a recursive definition is one which uses the word or concept being defined in the definition itself. In some situations, a recursive definition can be an appropriate or elegant way to express a concept.

  • Figure 2: recursive calling

    Recursive Algorithm

    A recursive algorithm is a description of a way to solve a problem that refers to itself; for example, to show everything in a top folder, show everything in the folder and then in all of its subfolders. This is quite similar to the concept of looking up a word in a dictionary using alphabetical order.
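
    To make the folder example concrete, the following is a minimal C sketch (assuming a POSIX system with <dirent.h>; the function name list_all and the starting path are illustrative, not from the original material) that recursively prints everything in a folder and all its subfolders.

    #include <stdio.h>
    #include <string.h>
    #include <dirent.h>
    #include <sys/stat.h>

    /* Recursively print every entry under 'path'. */
    void list_all(const char *path)
    {
        DIR *dir = opendir(path);
        if (dir == NULL)
            return;                      /* base case: not a readable directory */

        struct dirent *entry;
        while ((entry = readdir(dir)) != NULL) {
            /* skip the current and parent directory entries */
            if (strcmp(entry->d_name, ".") == 0 || strcmp(entry->d_name, "..") == 0)
                continue;

            char child[1024];
            snprintf(child, sizeof child, "%s/%s", path, entry->d_name);
            printf("%s\n", child);

            struct stat st;
            if (stat(child, &st) == 0 && S_ISDIR(st.st_mode))
                list_all(child);         /* recursive call on the subfolder */
        }
        closedir(dir);
    }

    int main(void)
    {
        list_all(".");                   /* start from the current folder */
        return 0;
    }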

    Value of Recursion Recursion is a well-suited alternative in the following situations:

    1. Recursion can be used to replace a loop.
    2. A recursive procedure is often mathematically more elegant than one using loops.
    3. Sometimes procedures that would be tricky to write with a loop are straightforward to write using recursion.
    4. Recursively defined data structures, like lists, are very well suited to processing by recursive procedures and functions.

    Recursion against iteration In most situations, every recursive solution has a corresponding iterative solution. For example, N! (the factorial of N) can be calculated with a loop. Recursion has the overhead of multiple method invocations. However, for some problems recursive solutions are often simpler and more elegant than iterative solutions. You must be able to determine when recursion is appropriate: a problem may usually be solved either way, and both methods have advantages. Iterative algorithms may be more efficient for the following reasons: they do not require additional function calls, so they run faster and use less memory. Recursion leads to higher overhead because of the time needed to perform function calls and the memory for activation records (the call stack). On the other hand, a recursive algorithm may be simpler and easier to understand, debug, and maintain. It suits recursive data structures such as trees and graphs, and also problems that naturally have a backtracking search space.
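
    As a minimal sketch of the iterative alternative mentioned above, the following C function computes N! with a loop; the function name factorial_iter is illustrative. A recursive version of the same algorithm is given later in this chapter.

    #include <stdio.h>

    /* Iterative factorial: multiplies 1 * 2 * ... * n in a loop,
       with no extra function calls or call-stack growth. */
    unsigned long factorial_iter(unsigned int n)
    {
        unsigned long result = 1;
        for (unsigned int i = 2; i <= n; i++)
            result *= i;
        return result;
    }

    int main(void)
    {
        printf("5! = %lu\n", factorial_iter(5));   /* prints 5! = 120 */
        return 0;
    }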

    Content of a Recursive Method

  • Base case(s): Values of the input variables for which we perform no recursive calls are called base cases (there should be at least one base case). Every possible chain of recursive calls must eventually reach a base case.

    Recursive calls: These are calls to the current method. Each recursive call should be defined so that it makes progress towards a base case. The recursive solution for a given problem involves a two-way journey: first we decompose the problem from the top to the bottom, and then we solve it from the bottom to the top.

    How Does Recursion Work? During the execution of a recursive function, each call sets up a new instance of all the parameters and the local variables. As always, when the method completes, control returns to the method that invoked it (which might be another invocation of the same method). Example: pow(4, 3) = 4 * pow(4, 2) = 4 * 4 * pow(4, 1) = 4 * 4 * 4 * pow(4, 0) = 4 * 4 * 4 * 1 = 64
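
    A minimal C sketch of the pow function used in this example (named my_pow here to avoid clashing with the standard library pow; it assumes a non-negative integer exponent):

    #include <stdio.h>

    /* Computes x raised to the power y for a non-negative integer y. */
    long my_pow(int x, unsigned int y)
    {
        if (y == 0)
            return 1;                 /* base case: x^0 = 1 */
        return x * my_pow(x, y - 1);  /* recursive call: x^y = x * x^(y-1) */
    }

    int main(void)
    {
        printf("pow(4, 3) = %ld\n", my_pow(4, 3));   /* prints 64 */
        return 0;
    }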

    How does recursion work at run time? It is quite interesting, and we need to understand what happens when a function is called. Whenever a function is called, a block of memory is allocated to it at run time; this kind of structure is called the stack. The block of memory contains the following information: a) the function's local variables, b) local copies of the function's call-by-value parameters, c) pointers to its call-by-reference parameters, and d) a return address, in other words where in the program the function was called from. When the function finishes, the program continues to execute from that point.

    Activation Record The compiler automatically creates an activation record for every function call. For example, the Java compiler allocates memory for such a record to store information about each running method, such as the return address ("RA"), argument values, and local variable values. Java stacks up the records as methods are called, and a method's activation record exists until it returns. These records help us trace the behavior of a recursive method; figure 3 depicts the activation records.

  • x = [4] y = [0] | pow(4, 0) | RA = [pow(4,1)]
    x = [4] y = [1] | pow(4, 1) | RA = [pow(4,2)]
    x = [4] y = [2] | pow(4, 2) | RA = [pow(4,3)]
    x = [4] y = [3] | pow(4, 3) | RA = [main]
    main

    Figure 3: activation record
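
    To observe these activation records in practice, the following hedged C sketch adds a depth parameter to the earlier my_pow function (the depth argument and the trace messages are illustrative additions, not part of the original material) and prints a line when each call starts and when it returns:

    #include <stdio.h>

    /* Same recursion as my_pow, but printing when each activation
       record is created and when it is popped off the call stack. */
    long traced_pow(int x, unsigned int y, int depth)
    {
        printf("%*scall   pow(%d, %u)\n", depth * 2, "", x, y);
        long result = (y == 0) ? 1 : x * traced_pow(x, y - 1, depth + 1);
        printf("%*sreturn %ld from pow(%d, %u)\n", depth * 2, "", result, x, y);
        return result;
    }

    int main(void)
    {
        traced_pow(4, 3, 0);   /* shows four nested calls, then four returns */
        return 0;
    }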

    Recursion A function that is defined in terms of itself is called self-referential, or recursive. Recursive functions are designed in 3 steps

    Step 1. Identify a base case: an instance of the problem whose solution is trivial. Ex: The factorial function has two base cases: if n = 0, n! = 1; if n = 1, n! = 1.

    Step 2. Identify an induction step: a means of solving a non-trivial instance of the problem using one or more smaller instances of the problem. Ex: In the factorial problem, we solve the big problem using a smaller version of the problem: n! = (n-1)! * n. Step 3: Form an algorithm from the base case and the induction step. Figures 4 and 5 elaborate the recursive function calls for the factorial computation.

    Algorithm to compute factorial
    Factorial(N)
    1. Receive N
    2. if N > 1 return Factorial(N-1) * N
       else return 1
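
    A minimal C translation of this factorial algorithm might look as follows (a sketch; the unsigned long return type is an assumption made to hold larger results):

    #include <stdio.h>

    /* Recursive factorial following the algorithm above:
       base case N <= 1 returns 1, general case returns Factorial(N-1) * N. */
    unsigned long factorial(unsigned int n)
    {
        if (n > 1)
            return factorial(n - 1) * n;   /* general case */
        return 1;                          /* base case */
    }

    int main(void)
    {
        printf("6! = %lu\n", factorial(6));   /* prints 6! = 720 */
        return 0;
    }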

  • Figure 4: Recursive call for Factorial

    Figure 5: recursive call instances

  • Indirect Recursion A method invoking itself is considered to be direct recursion. A method could also invoke another method, which invokes another, and so on, until eventually the original method is invoked again. Ex: method m1 could invoke m2, which invokes m3, which in turn invokes m1 again, as illustrated in figure 6.

    Figure 6: method of invoking

    It requires quite the same attention as direct recursion, and it is often more difficult to trace and debug the program.
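
    A minimal C sketch of the m1 -> m2 -> m3 -> m1 pattern described above (the countdown parameter n and the base case are illustrative assumptions added so that the chain terminates):

    #include <stdio.h>

    void m2(int n);   /* forward declarations so the functions can call each other */
    void m3(int n);

    void m1(int n)
    {
        if (n <= 0)            /* base case: stop the chain */
            return;
        printf("m1(%d)\n", n);
        m2(n);                 /* m1 invokes m2 */
    }

    void m2(int n)
    {
        printf("m2(%d)\n", n);
        m3(n);                 /* m2 invokes m3 */
    }

    void m3(int n)
    {
        printf("m3(%d)\n", n);
        m1(n - 1);             /* m3 invokes m1 again: indirect recursion */
    }

    int main(void)
    {
        m1(2);
        return 0;
    }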

    Designing of Recursive Algorithm Each call of a recursive algorithm either solves one part of the problem or it reduces the size of the problem. The general part of the solution is the recursive call. At each recursive call, the size of the problem is reduced. The statement that solves the problem is known as the base case. Every recursive algorithm must have a base case. The rest of the algorithm is known as the general case. The general case mainly contains the logic needed to reduce the size of the problem.

    Once the base case has been reached, the solution begins. We now know one part of the answer and can return that part to the next, more general statement. This allows us to solve the next general case. As we solve each general case in turn, we are able to solve the next-higher general case until we finally solve the most general case, the original problem. The rules for designing a recursive algorithm are:

    First, determine the base case. Then determine the general case. Finally, combine the base case and the general case into an algorithm.

    Now that we have learnt how to design a recursive algorithm, let us, in order to understand the concepts further, discuss some problems that are recursive in nature, such as the computation of the Fibonacci series and the Tower of Hanoi.

  • Example 1: Fibonacci number series.

    One of the most relevant examples to employ the recursive concept is the computation of the Fibonacci series. In the Fibonacci series, each next number is equal to the sum of the previous two numbers. The classical Fibonacci series is 0, 1, 1, 2, 3, 5, 8, 13, ... A series of n numbers can be generated using the recursive formula given in figure 7, and the problem is demonstrated in figure 8. Figure 9 and figure 10 illustrate the recursive calling and the recursion tree respectively.

    Figure 7: Fibonacci series recursive formula

    Figure 8: Fibonacci series demonstration

    Fibonacci series algorithm
    Fibonacci(n)
    if (n = 0) then Result = 0
    else if (n = 1) then Result = 1
    else Result = Fibonacci(n-1) + Fibonacci(n-2)
    Return Result

    The recursive formula for the series:

    Fibonacci(n) = 0,  if n = 0
    Fibonacci(n) = 1,  if n = 1
    Fibonacci(n) = Fibonacci(n-1) + Fibonacci(n-2),  otherwise
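
    A minimal C sketch of this Fibonacci algorithm (the int return type is an assumption; it overflows for large n):

    #include <stdio.h>

    /* Recursive Fibonacci following the formula above. */
    int fibonacci(int n)
    {
        if (n == 0)
            return 0;                                  /* base case */
        if (n == 1)
            return 1;                                  /* base case */
        return fibonacci(n - 1) + fibonacci(n - 2);    /* general case */
    }

    int main(void)
    {
        for (int i = 0; i < 8; i++)
            printf("%d ", fibonacci(i));   /* prints 0 1 1 2 3 5 8 13 */
        printf("\n");
        return 0;
    }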

  • Figure 9: Fibonacci function recursive calling

    Figure 10: Fibonacci function recursive tree

    Analysis The Fibonacci recursive algorithm exhibits exponential behaviour, which is explored using figure 11.

  • Figure 11: analysis of Fibonacci call
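
    To see this exponential behaviour empirically, the following hedged C sketch counts how many calls the recursive Fibonacci function makes for each n (the global counter call_count is an illustrative addition, not from the original material):

    #include <stdio.h>

    static long call_count = 0;   /* counts every invocation of fib */

    long fib(int n)
    {
        call_count++;
        if (n <= 1)
            return n;
        return fib(n - 1) + fib(n - 2);
    }

    int main(void)
    {
        /* The number of calls grows exponentially with n,
           illustrating the cost of the plain recursive version. */
        for (int n = 10; n <= 30; n += 5) {
            call_count = 0;
            long value = fib(n);
            printf("fib(%d) = %ld, calls = %ld\n", n, value, call_count);
        }
        return 0;
    }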

    Example 2: Towers of Hanoi Problem Move a stack of disks between pegs. Only the top disk in a stack can be moved, and a disk may only be placed on top of a larger disk.

    Figure 12: Tower of Hanoi

  • Algorithm
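
    The algorithm itself is given in the source as a figure; the following is a minimal C sketch of the standard recursive Tower of Hanoi solution (the peg names and the hanoi function signature are illustrative assumptions):

    #include <stdio.h>

    /* Move n disks from peg 'from' to peg 'to', using peg 'via' as a spare. */
    void hanoi(int n, char from, char to, char via)
    {
        if (n == 0)
            return;                       /* base case: nothing to move */
        hanoi(n - 1, from, via, to);      /* move n-1 disks out of the way */
        printf("move disk %d from %c to %c\n", n, from, to);
        hanoi(n - 1, via, to, from);      /* move them onto the largest disk */
    }

    int main(void)
    {
        hanoi(3, 'A', 'C', 'B');   /* 3 disks take 2^3 - 1 = 7 moves */
        return 0;
    }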

    Figure 13: Tower of Hanoi Recursive tree

  • Figure 14: Disk moving to destination in Tower of Hanoi

    In the subsequent section, we make an attempt to discuss some typical recursive algorithms, followed by solutions to such problems.

    Recursive Algorithm_1 Fun1(x) If (x

  • Newton's Method SquareRoot(num, ans, tol) If |ans^2 - num| < tol return ans else return SquareRoot(num, (ans^2 + num)/(2*ans), tol)

    Exercises: a) SquareRoot(9, 3, 0.01)? = 3 b) SquareRoot(4, 4, 0.01)? = SquareRoot(4, (16+4)/(2*4), 0.01) = SquareRoot(4, 2.5, 0.01) = SquareRoot(4, 2.05, 0.01) = ... ≈ 2
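
    A minimal C sketch of this recursive Newton's-method square root (using double and fabs from <math.h>; the function name square_root is illustrative):

    #include <stdio.h>
    #include <math.h>   /* may need -lm when linking on some systems */

    /* Recursive Newton's method: refine 'ans' until ans*ans is within 'tol' of num. */
    double square_root(double num, double ans, double tol)
    {
        if (fabs(ans * ans - num) < tol)
            return ans;                                              /* base case: close enough */
        return square_root(num, (ans * ans + num) / (2 * ans), tol); /* improved guess */
    }

    int main(void)
    {
        printf("sqrt(9) ~ %f\n", square_root(9.0, 3.0, 0.01));   /* prints about 3 */
        printf("sqrt(4) ~ %f\n", square_root(4.0, 4.0, 0.01));   /* prints about 2 */
        return 0;
    }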

    Greatest Common Divisor(x,y) gcd(x,y) if (y==0) return(x) else If (x
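
    The pseudocode above is cut off in the source; as a hedged sketch, the following C function implements the standard Euclidean GCD with the same base case (y == 0), using the remainder operator for the recursive step (an assumption, since the original else-branch is missing):

    #include <stdio.h>

    /* Euclid's algorithm: gcd(x, y) = x when y is 0, otherwise gcd(y, x mod y). */
    int gcd(int x, int y)
    {
        if (y == 0)
            return x;            /* base case, as in the pseudocode above */
        return gcd(y, x % y);    /* recursive step (assumed form) */
    }

    int main(void)
    {
        printf("gcd(48, 36) = %d\n", gcd(48, 36));   /* prints 12 */
        return 0;
    }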

  • Ackermann's Function Ack(M, N) if (M==0) return(N+1) else if (N==0) return(Ack(M-1,1)) else return(Ack(M-1,Ack(M,N-1)))

    Exercises: a) Ack(2,3)? = Ack(1,Ack(2,2)) = Ack(1,Ack(1,Ack(2,1))) = Ack(1,Ack(1,Ack(1,Ack(2,0)))) = Ack(1,Ack(1,Ack(1,Ack(1,1)))) = Ack(1,Ack(1,Ack(1,Ack(0,Ack(1,0))))) = Ack(1,Ack(1,Ack(1,Ack(0,Ack(0,1))))) = Ack(1,Ack(1,Ack(1,Ack(0,2)))) = Ack(1,Ack(1,Ack(1,3))) = Ack(1,Ack(1,Ack(0,Ack(1,2)))) = Ack(1,Ack(1,Ack(0,Ack(0,Ack(1,1))))) = Ack(1,Ack(1,Ack(0,Ack(0,Ack(0,Ack(1,0)))))) = Ack(1,Ack(1,Ack(0,Ack(0,Ack(0,Ack(0,1)))))) = Ack(1,Ack(1,Ack(0,Ack(0,Ack(0,2))))) = Ack(1,Ack(1,Ack(0,Ack(0,3)))) = Ack(1,Ack(1,Ack(0,4))) = Ack(1,Ack(1,5)) = Ack(1,Ack(0,Ack(1,4))) = Ack(1,Ack(0,Ack(0,Ack(1,3)))) = Ack(1,Ack(0,Ack(0,Ack(0,Ack(1,2))))) = Ack(1,Ack(0,Ack(0,Ack(0,Ack(0,Ack(1,1)))))) = Ack(1,Ack(0,Ack(0,Ack(0,Ack(0,Ack(0,Ack(1,0))))))) = Ack(1,Ack(0,Ack(0,Ack(0,Ack(0,Ack(0,Ack(0,1))))))) = Ack(1,Ack(0,Ack(0,Ack(0,Ack(0,Ack(0,2)))))) = Ack(1,Ack(0,Ack(0,Ack(0,Ack(0,3))))) = Ack(1,Ack(0,Ack(0,Ack(0,4)))) = Ack(1,Ack(0,Ack(0,5))) = Ack(1,Ack(0,6)) = Ack(1,7) =Ack(0,Ack(1,6)) =Ack(0,Ack(0,Ack(1,5))) =Ack(0,Ack(0,Ack(0,Ack(1,4)))) =Ack(0,Ack(0,Ack(0,Ack(0,Ack(1,3))))) =Ack(0,Ack(0,Ack(0,Ack(0,Ack(0,Ack(1,2)))))) =Ack(0,Ack(0,Ack(0,Ack(0,Ack(0,Ack(0,Ack(1,1))))))) =Ack(0,Ack(0,Ack(0,Ack(0,Ack(0,Ack(0,Ack(0,Ack(1,0)))))))) =Ack(0,Ack(0,Ack(0,Ack(0,Ack(0,Ack(0,Ack(0,Ack(0,1)))))))) =Ack(0,Ack(0,Ack(0,Ack(0,Ack(0,Ack(0,Ack(0,2))))))) =Ack(0,Ack(0,Ack(0,Ack(0,Ack(0,Ack(0,3)))))) =Ack(0,Ack(0,Ack(0,Ack(0,Ack(0,4))))) =Ack(0,Ack(0,Ack(0,Ack(0,5))))

  • =Ack(0,Ack(0,Ack(0,6))) =Ack(0,Ack(0,7)) =Ack(0,8) =9

    More Exercise Problems: b) Ack(2,5)? = 13

    c) Ack(0,3) =4

    d) Ack(3,0) =5
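
    A minimal C sketch of the Ack function used in these exercises (note that it recurses very deeply for M >= 3 and larger N):

    #include <stdio.h>

    /* Ackermann's function as defined above. */
    int ack(int m, int n)
    {
        if (m == 0)
            return n + 1;
        if (n == 0)
            return ack(m - 1, 1);
        return ack(m - 1, ack(m, n - 1));
    }

    int main(void)
    {
        printf("Ack(2, 3) = %d\n", ack(2, 3));   /* prints 9 */
        printf("Ack(2, 5) = %d\n", ack(2, 5));   /* prints 13 */
        printf("Ack(3, 0) = %d\n", ack(3, 0));   /* prints 5 */
        return 0;
    }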

    Limitations of Recursion Recursion should not be used if the answer to any of the following questions is no: Is the algorithm or data structure naturally suited to recursion (a tree is the first choice)? Is the recursive solution shorter and more understandable? Does the recursive solution run within acceptable time and space limits? As a general rule, recursive algorithms can be used effectively only when their efficiency is logarithmic.

    Summary The trick with recursion is to ensure that each recursive call gets closer to a base case. In most of the examples we have looked at, the base case is the empty list, and the list gets shorter with each successive call. Recursion can always be used instead of a loop (this is a mathematical fact). In declarative programming languages, like Prolog, there are no loops; there is only recursion. Recursion is elegant and sometimes very handy, but it is marginally less efficient than a loop because of the overhead associated with maintaining the stack.