Unit 1

Disc 1
LJ 1

Algorithm Techniques

  • Algorithm: a well-defined sequence of steps used to solve a well-defined problem in finite time
  • Running time: the number of instructions an algorithm executes when run on a particular instance
    • We use Big-Oh to find the upper bound (worst case)
  • Types of Algorithm Structures:
    • Recursive
    • Iterative
  • Types of Algorithm Solution:
    • A good solution
    • Best solution(s)

Algorithms Strategies

Recursion: Divide & Conquer

  • Reapply algorithm to subproblem
function factorial(N) { if (N <= 1) return 1; else return N * factorial(N - 1); }

Backtracking: Exhaustive / Brute-force Search approach

Try all possible solutions and pick the desired one
  • Note: It is not for optimization (dynamic programming is for that)
  • It is performed in a depth-first search (DFS) manner
  • Steps:
    • Start by trying all possible states and build a State Space Tree to represent them following the DFS order
    • Apply the Bounding Function to omit the invalid states (based on some condition)
    • Return the group of valid states
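The steps above can be sketched with a small hypothetical example: finding every subset of positive integers that sums to a target. The DFS explores the state space tree (include/exclude each element), and the bounding function prunes branches whose partial sum already reaches or exceeds the target. The function name and instance are illustrative, not from the source.

```javascript
// Backtracking sketch: enumerate subsets of `nums` (assumed positive
// integers) whose elements sum exactly to `target`.
function subsetSums(nums, target) {
  const results = [];
  function dfs(index, chosen, sum) {
    if (sum === target) results.push([...chosen]); // valid state found
    // Bounding function: prune when no elements remain or the partial
    // sum can no longer shrink back to the target.
    if (index === nums.length || sum >= target) return;
    // Branch 1: include nums[index]
    chosen.push(nums[index]);
    dfs(index + 1, chosen, sum + nums[index]);
    chosen.pop();
    // Branch 2: exclude nums[index]
    dfs(index + 1, chosen, sum);
  }
  dfs(0, [], 0);
  return results; // the group of valid states
}
```

For example, `subsetSums([3, 5, 6, 7], 15)` explores the tree depth-first and keeps only the subset `[3, 5, 7]`.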

Branch & Bound

  • It is performed in a breadth-first search (BFS) manner
  • Branch and bound does not decrease the complexity of a problem, it is simply a reasonable way to organize a search for an optimal solution.
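A minimal sketch of the idea, assuming a 0/1 knapsack instance (the function name and the crude optimistic bound are illustrative): nodes are expanded in BFS order via a FIFO queue, and a branch is discarded when its bound cannot beat the best feasible value found so far.

```javascript
// Branch & bound sketch for 0/1 knapsack, explored breadth-first.
function knapsackBB(values, weights, capacity) {
  const n = values.length;
  // Optimistic bound: current value plus ALL remaining item values
  // (ignores capacity, so it never underestimates the true best).
  const bound = (i, value) => {
    let b = value;
    for (let j = i; j < n; j++) b += values[j];
    return b;
  };
  let best = 0;
  const queue = [{ i: 0, value: 0, weight: 0 }]; // FIFO queue => BFS order
  while (queue.length > 0) {
    const node = queue.shift();
    best = Math.max(best, node.value); // every queued node is feasible
    if (node.i === n) continue;        // leaf: nothing left to branch on
    if (bound(node.i, node.value) <= best) continue; // prune this branch
    // Branch 1: take item i (only if it still fits)
    if (node.weight + weights[node.i] <= capacity) {
      queue.push({
        i: node.i + 1,
        value: node.value + values[node.i],
        weight: node.weight + weights[node.i],
      });
    }
    // Branch 2: skip item i
    queue.push({ i: node.i + 1, value: node.value, weight: node.weight });
  }
  return best;
}
```

Note how this matches the bullet above: the search still enumerates the same state space in the worst case; the bound merely organizes it so hopeless branches are cut early.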

Asymptotic Analysis

Big-Oh Notation (Upper Bounds)

Intuitive Definition

  • Let n be the size of the program's input
  • Let T(n) be a function of n, e.g. the running time
  • Let f(n) be another function, preferably a simple one
Then T(n) is O(f(n)) if and only if T(n) ≤ c · f(n) whenever n is big enough, for some large constant c
📌
How big is BIG and how large is LARGE? Big and large enough to make T(n) fit under c · f(n)
Example
We have an algorithm whose running time is some function T(n). Let's try to pick a simple f(n) and a constant c.
We can see that there is always a vertical line (in this example n = 1000) beyond which T(n) is less than c · f(n)

Formal Definition

O(f(n)) is the set of all functions T(n) that satisfy: there exist positive constants c and n₀ such that, for all n ≥ n₀, T(n) ≤ c · f(n)
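A quick numeric check of the definition, with illustrative values of my choosing: T(n) = 3n² + 4n + 1 is O(n²), witnessed by c = 4 and n₀ = 5, since 3n² + 4n + 1 ≤ 4n² exactly when n² ≥ 4n + 1, which holds for all n ≥ 5.

```javascript
// Verify T(n) <= c * f(n) for all n >= n0 (checked numerically up to 10000).
const T = (n) => 3 * n * n + 4 * n + 1; // running-time function
const f = (n) => n * n;                 // the simple comparison function
const c = 4, n0 = 5;                    // witnesses for the definition

let holds = true;
for (let n = n0; n <= 10000; n++) {
  if (T(n) > c * f(n)) holds = false;
}
// At n = 4 the inequality fails (T(4) = 65 > 64), which is why n0 = 5.
```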
Example 1
Proof: choose a constant c and set the vertical line n₀ so that T(n) ≤ c · f(n) for all n ≥ n₀
⚠️
Big-Oh notation doesn't care about (most) constant factors. 👆 This does not apply to constants in exponents, e.g. 2ⁿ and 3ⁿ (3ⁿ is not O(2ⁿ))
Example 2
Proof: choose suitable constants c and n₀
📌
Saying that the running time is O(f(n)) doesn't mean it's slow! Big-Oh is an upper bound, so based on it we can say that an algorithm is fast but NOT that an algorithm is slow.
Example 3
Proof: choose suitable constants c and n₀
📌
Big-Oh is usually used to indicate the dominating (fastest-growing) term

Table of Important Big-Oh Sets

Smallest to largest (each set is a subset of the next, e.g. O(1) ⊆ O(log n) ⊆ … ⊆ O(n!)):

| Function | Common Name |
| --- | --- |
| O(1) | Constant |
| O(log n) | Logarithmic |
| O(log² n) | Log-squared |
| O(√n) | Root-n |
| O(n) | Linear |
| O(n log n) | n log n |
| O(n²) | Quadratic |
| O(n³) | Cubic |
| O(n⁴) | Quartic |
| O(2ⁿ) | Exponential |
| O(eⁿ) | Exponential |
| O(n!) | Factorial |
  • O(n log n) or faster time (smaller functions) is considered efficient
  • Exponential or slower time is considered useless
    • Of course, in daily computations on large inputs, even quadratic time is bad
  • Big-Oh is used to indicate the growth, not number of operations, therefore it is used to indicate:
    • Worst-case running time
    • Best-case running time
    • Memory use

Recursion

  • A recurrence relation is an equation which is defined in terms of itself
  • Why use it:
    • Many natural functions are easily expressed as recurrences
    • It is often easy to find a recurrence as the solution of a counting problem. Solving the recurrence can be done for many special cases although it is somewhat of an art.
  • A recurrence relation consists of:
    • Base case (a.k.a. initial or boundary condition) that terminates the recursion
    • General condition that breaks the problem into smaller pieces
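For example, the Fibonacci numbers written as a recurrence, with both parts labeled:

```javascript
// F(n) = F(n-1) + F(n-2), with F(0) = 0 and F(1) = 1.
function fib(n) {
  if (n === 0) return 0; // base case (terminates the recursion)
  if (n === 1) return 1; // base case
  return fib(n - 1) + fib(n - 2); // general condition (smaller pieces)
}
```

Without the base cases the recursion would never terminate; without the general condition there would be no recursion at all.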

Divide and Conquer (Recursion)

  • The time complexity is modeled by recurrence relations
  • The general steps of this paradigm:
      1. Divide into smaller subproblems
      2. Conquer via recursive calls
      3. Combine solutions of subproblems into one for the original problem: cleanup work and merge results (e.g. Merge Sort)
📌
The divide-and-conquer technique can be applied to those problems that exhibit the independence principle:
Each problem instance can be divided into a series of smaller problem instances which are independent of each other.
  • Merge Sort algorithm is the simplest example that uses this strategy
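A sketch of Merge Sort with the three steps of the paradigm marked:

```javascript
// Merge Sort: divide the array in half, conquer each half recursively,
// then combine by merging the two sorted halves.
function mergeSort(arr) {
  if (arr.length <= 1) return arr;            // base case
  const mid = Math.floor(arr.length / 2);     // divide
  const left = mergeSort(arr.slice(0, mid));  // conquer
  const right = mergeSort(arr.slice(mid));    // conquer
  const merged = [];                          // combine (merge step)
  let i = 0, j = 0;
  while (i < left.length && j < right.length) {
    merged.push(left[i] <= right[j] ? left[i++] : right[j++]);
  }
  return merged.concat(left.slice(i), right.slice(j));
}
```

The two halves are independent of each other, which is exactly the independence principle the note above describes.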

Problem: Counting Inversions in an Array

  • Given an array A with N elements, count the pairs (i, j) that meet the following criteria: i < j and A[i] > A[j]
  • The largest possible number of inversions in an array of N elements is N(N − 1)/2 (a reverse-sorted array)
  • Solution #1: Brute-force
    • By using two loops (the 2nd loop is nested)
    • It takes O(n²) time
  • Solution #2: Divide & Conquer
    • A modified Merge Sort counts inversions while merging, in O(n log n) time (see Stanford lecture)
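Both solutions can be sketched as follows. The divide-and-conquer version follows the standard modified-merge-sort approach: whenever an element from the right half is merged before remaining left-half elements, each of those left-half elements forms one inversion with it. Function names are mine, not from the lecture.

```javascript
// Solution #1: brute force, O(n^2) — check every pair (i, j) with i < j.
function countInversionsBrute(a) {
  let count = 0;
  for (let i = 0; i < a.length; i++)
    for (let j = i + 1; j < a.length; j++)
      if (a[i] > a[j]) count++;
  return count;
}

// Solution #2: divide & conquer, O(n log n) — merge sort that also
// counts the "split" inversions discovered during each merge.
function countInversions(a) {
  if (a.length <= 1) return { sorted: a, inversions: 0 };
  const mid = Math.floor(a.length / 2);
  const L = countInversions(a.slice(0, mid));  // left half
  const R = countInversions(a.slice(mid));     // right half
  const merged = [];
  let i = 0, j = 0;
  let inv = L.inversions + R.inversions;
  while (i < L.sorted.length && j < R.sorted.length) {
    if (L.sorted[i] <= R.sorted[j]) {
      merged.push(L.sorted[i++]);
    } else {
      merged.push(R.sorted[j++]);
      inv += L.sorted.length - i; // split inversions with all remaining left elements
    }
  }
  return {
    sorted: merged.concat(L.sorted.slice(i), R.sorted.slice(j)),
    inversions: inv,
  };
}
```

On [2, 4, 1, 3, 5] both versions count the same 3 inversions: (2, 1), (4, 1), and (4, 3).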

Resources

Complexity Analysis

  • Chapter 3: Algorithm Analysis in A Practical Introduction to Data Structures and Algorithm Analysis by Clifford A. Shaffer.
  • Chapter 14 Sections 14.1 through 14.2 Analysis Techniques in A Practical Introduction to Data Structures and Algorithm Analysis by Clifford A. Shaffer.

Sequences and Recurrence relations


Brute Force Algorithms and Variations