Fibonacci

General Analysis & Formula

To illustrate the steps above, we will apply them to one of the easiest problems out there: the Fibonacci problem. Our goal is to find the n-th Fibonacci number, where each number is the sum of the two previous ones.

Let's write this out as a formula:

$Fib(n) = Fib(n - 2) + Fib(n - 1)$

Bad Code

If we translate this formula directly into code, without thinking about efficiency, we get something like this (JavaScript code):

function fib(n) {
    if (n <= 1) return n;           // base cases: Fib(0) = 0, Fib(1) = 1
    return fib(n - 2) + fib(n - 1); // recomputes the same subproblems over and over
}

console.log(fib(25));

Now why is this bad? Well, let's start with an analysis of the problem. The run time follows the same recursive structure: the time for Fib(n) is the sum of the time needed for Fib(n - 2) and for Fib(n - 1), plus the O(1) time needed to add the two results together.

$T(n) = T(n - 1) + T(n - 2) + O(1)$
$T(n) \geq 2T(n - 2)$
$T(n) = \Omega(2^{n/2})$
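
To see where this exponential lower bound comes from, we can unroll the recurrence a few times (a rough sketch that ignores the constant work per call):

$T(n) \geq 2T(n - 2) \geq 4T(n - 4) \geq 8T(n - 6) \geq \dots \geq 2^k \, T(n - 2k)$

After roughly k = n/2 steps we hit the base cases, which only take constant time, so $T(n)$ is at least on the order of $2^{n/2}$, which is the Ω(2^{n/2}) bound above.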

Which is bad! Let's improve this by using our Dynamic Programming approach.

With Dynamic Programming

So every time we need to calculate a subproblem, we first check whether we already computed it; if not, we calculate it and store the result. This technique is the top-down approach, which uses "memoization".

Memoization (Top-Down)

var memo = {};  // table of already computed Fibonacci numbers

function fib(n) {
    if (memo[n]) return memo[n];   // reuse a previously computed result
    if (n <= 1) return n;          // base cases: Fib(0) = 0, Fib(1) = 1
    var f = fib(n - 2) + fib(n - 1);
    memo[n] = f;                   // store the result before returning it
    return f;
}

console.log(fib(25));

We can also work bottom-up, which is the iterative way, also called "tabulation". It calculates all the values from the first one up to the last one (hence bottom-up). The advantage of this method is that it does not require a recursion stack!

Tabulation (Bottom-Up)

function fib(n) {
    var fib = [];    // table that will hold Fib(0) .. Fib(n)
    fib[0] = 0;
    fib[1] = 1;
    for (var i = 2; i <= n; i++) {
        fib[i] = fib[i - 1] + fib[i - 2];   // each entry only needs the two previous ones
    }

    return fib[n];
}

console.log(fib(25));

Now, what is our gain here? Every value is computed only once: whenever we need a value again, we find it in our memo table (or, in the tabulated version, it is already filled in), which is a Θ(1) lookup.

This means we do Θ(1) work for each of the n values, resulting in a total run time of Θ(n), which is way better than our previous implementation.
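
To make the difference concrete, here is a small counting experiment (an illustrative sketch; fibNaive and fibMemo are just the two versions from above with a call counter added):

var calls = 0;

function fibNaive(n) {
    calls++;
    if (n <= 1) return n;
    return fibNaive(n - 2) + fibNaive(n - 1);
}

fibNaive(25);
console.log(calls);   // 242785 calls: the same subproblems are solved over and over

var memo = {};
calls = 0;

function fibMemo(n) {
    calls++;
    if (memo[n] !== undefined) return memo[n];
    if (n <= 1) return n;
    memo[n] = fibMemo(n - 2) + fibMemo(n - 1);
    return memo[n];
}

fibMemo(25);
console.log(calls);   // 49 calls: each value is computed only once

The naive version repeats an exponential amount of work, while the memoized version touches each value only a constant number of times, which matches the Θ(n) analysis above.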
