4. Algorithm Analysis Part 3

4.1. Time Complexity Examples (1)

  • Example: a = b;

    • This assignment takes constant time, so it is \(\Theta(1)\).

  • Example:

sum = 0;
for (i=1; i<=n; i++) {
   sum += n;
}

  • The loop body executes exactly n times, so this fragment is \(\Theta(n)\).

4.2. Time Complexity Examples (2)

  • Example:

sum = 0;
for (j=1; j<=n; j++) {     // First for loop
  for (i=1; i<=j; i++) {   //   is a double loop
    sum++;
  }
}
for (k=0; k<n; k++) {      // Second for loop
   A[k] = k;
}

  • The double loop executes sum++ a total of \(\sum_{j=1}^{n} j = n(n+1)/2\) times, and the second loop adds only \(n\) more steps, so the fragment is \(\Theta(n^2)\).
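
The count above can be checked directly. The sketch below (class and method names are mine, not from the text) runs the double loop and compares the result against the closed form \(n(n+1)/2\):

```java
public class LoopCounts {
    // Run the double loop above and return how many times sum++ executes.
    public static long doubleLoopCount(int n) {
        long sum = 0;
        for (int j = 1; j <= n; j++) {      // First for loop
            for (int i = 1; i <= j; i++) {  //   is a double loop
                sum++;
            }
        }
        return sum;                         // Equals n(n+1)/2
    }

    public static void main(String[] args) {
        int n = 100;
        // 1 + 2 + ... + n = n(n+1)/2, so the double loop alone is Theta(n^2);
        // the second loop's Theta(n) work is dominated by it.
        System.out.println(doubleLoopCount(n) == (long) n * (n + 1) / 2);
    }
}
```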

4.3. Time Complexity Examples (3)

  • Example: Compare these two code fragments:

sum1 = 0;
for (i=1; i<=n; i++) {     // First double loop
  for (j=1; j<=n; j++) {   //   do n times
    sum1++;
  }
}

sum2 = 0;
for (i=1; i<=n; i++) {     // Second double loop
  for (j=1; j<=i; j++) {   //   do i times
    sum2++;
  }
}

  • Both fragments are \(\Theta(n^2)\): the first increments sum1 exactly \(n^2\) times, while the second increments sum2 only \(n(n+1)/2\) times. Same growth rate, smaller constant.
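
To see the constant-factor difference concretely, this sketch (names are mine) counts each fragment's increments:

```java
public class CompareLoops {
    // Number of sum1++ executions in the first fragment: n * n.
    public static long squareCount(int n) {
        long sum1 = 0;
        for (int i = 1; i <= n; i++)
            for (int j = 1; j <= n; j++)
                sum1++;
        return sum1;
    }

    // Number of sum2++ executions in the second fragment: n(n+1)/2,
    // roughly half of squareCount for large n.
    public static long triangleCount(int n) {
        long sum2 = 0;
        for (int i = 1; i <= n; i++)
            for (int j = 1; j <= i; j++)
                sum2++;
        return sum2;
    }

    public static void main(String[] args) {
        int n = 1000;
        // Both are Theta(n^2); the ratio approaches 1/2 as n grows.
        System.out.println(squareCount(n) + " vs " + triangleCount(n));
    }
}
```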

4.4. Time Complexity Examples (4)

  • Not all double loops are \(\Theta(n^2)\).

sum1 = 0;
for (k=1; k<=n; k*=2) {    // Do log n times
  for (j=1; j<=n; j++) {   // Do n times
    sum1++;
  }
}

sum2 = 0;
for (k=1; k<=n; k*=2) {    // Do log n times
  for (j=1; j<=k; j++) {   // Do k times
    sum2++;
  }
}

  • The first fragment is \(\Theta(n \log n)\). The second is only \(\Theta(n)\), since the inner loop does \(1 + 2 + 4 + \cdots + n < 2n\) total work.
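
A quick empirical check (a sketch; the names are mine) confirms the two growth rates. With n a power of two, the first fragment does \(n(\log_2 n + 1)\) increments and the second only \(2n - 1\):

```java
public class LogLoops {
    // sum1++ runs n times for each of the (log2 n) + 1 values of k.
    public static long nLogN(int n) {
        long sum1 = 0;
        for (int k = 1; k <= n; k *= 2)     // Do log n times
            for (int j = 1; j <= n; j++)    // Do n times
                sum1++;
        return sum1;
    }

    // sum2++ runs 1 + 2 + 4 + ... + n = 2n - 1 times (n a power of two):
    // a geometric series, so the total is linear in n.
    public static long geometric(int n) {
        long sum2 = 0;
        for (int k = 1; k <= n; k *= 2)     // Do log n times
            for (int j = 1; j <= k; j++)    // Do k times
                sum2++;
        return sum2;
    }
}
```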

4.5. Binary Search

  • How many elements are examined in the worst case?

// Return the position of an element in sorted array A with value K.
// If K is not in A, return A.length.
public static int binarySearch(int[] A, int K) {
  int low = 0;
  int high = A.length - 1;
  while(low <= high) {                  // Stop when low and high meet
    int mid = low + (high - low) / 2;   // Overflow-safe middle of subarray
    if( A[mid] < K) low = mid + 1;      // In right half
    else if(A[mid] > K) high = mid - 1; // In left half
    else return mid;                    // Found it
  }
  return A.length;                      // Search value not in A
}
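
Each iteration of the while loop examines one element and then halves the subarray, so in the worst case at most \(\lfloor \log_2 n \rfloor + 1\) elements are examined. The instrumented variant below (a sketch; the counting and helper are my additions) makes that countable:

```java
public class BinarySearchProbes {
    // Same search as binarySearch above, but returns the number of
    // elements examined instead of the position found.
    public static int probes(int[] A, int K) {
        int low = 0;
        int high = A.length - 1;
        int examined = 0;
        while (low <= high) {
            int mid = low + (high - low) / 2;  // Overflow-safe midpoint
            examined++;                        // One more element looked at
            if (A[mid] < K) low = mid + 1;     // In right half
            else if (A[mid] > K) high = mid - 1; // In left half
            else break;                        // Found it
        }
        return examined;
    }

    // Helper: sorted array {0, 1, ..., n-1} for experiments.
    public static int[] sortedRange(int n) {
        int[] a = new int[n];
        for (int i = 0; i < n; i++) a[i] = i;
        return a;
    }
}
```

For n = 16, a search for an absent large key examines \(\lfloor \log_2 16 \rfloor + 1 = 5\) elements.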

4.6. Other Control Statements

  • while loop: Analyze like a for loop.

  • if statement: Take the greater complexity of the then/else clauses.

  • switch statement: Take the complexity of the most expensive case.

  • Subroutine call: The complexity of the subroutine itself.
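
For example, an if statement whose clauses differ in cost is analyzed by taking the more expensive clause as the worst case. The sketch below (names are mine) counts the basic steps in each branch:

```java
public class BranchCost {
    // Steps executed by a fragment whose then-clause is Theta(n)
    // and whose else-clause is Theta(1).
    public static long steps(int n, boolean condition) {
        long steps = 0;
        if (condition) {
            for (int i = 0; i < n; i++)  // Theta(n) then-clause
                steps++;
        } else {
            steps++;                     // Theta(1) else-clause
        }
        return steps;
    }
}
```

In the worst case the then-clause runs, so the statement as a whole is analyzed as \(\Theta(n)\).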

4.7. Analyzing Problems

  • Upper bound: The cost of the best algorithm known for the problem.

  • Lower bound: A bound that holds for every possible algorithm solving the problem.

4.8. Analyzing Problems: Example

  • May or may not be able to obtain matching upper and lower bounds.

  • Example of imperfect knowledge: Sorting

    1. Cost of I/O: \(\Omega(n)\).

    2. Bubble or insertion sort: \(O(n^2)\).

    3. A better sort (Quicksort, Mergesort, Heapsort, etc.): \(O(n \log n)\).

    4. We prove later that sorting is in \(\Omega(n \log n)\).

4.9. Space/Time Tradeoff Principle

  • One can often reduce time if one is willing to sacrifice space, or vice versa.

  • Encoding or packing information

    • Boolean flags: Need less space, but take time to unpack

  • Table lookup

    • Factorials: Store a table of values for lookup instead of computing

  • Disk-based Space/Time Tradeoff Principle: The smaller you make the disk storage requirements, the faster your program will run.
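
The factorial table-lookup idea above can be sketched as follows (class and method names are mine): spend a little space on a precomputed table so that each query costs \(\Theta(1)\) instead of \(\Theta(n)\) multiplications.

```java
public class FactorialTable {
    // Space/time tradeoff: 21 longs of extra space buy constant-time
    // factorial queries. (20! is the largest factorial that fits in a long.)
    private static final long[] TABLE = new long[21];
    static {
        TABLE[0] = 1;
        for (int i = 1; i <= 20; i++)
            TABLE[i] = TABLE[i - 1] * i;    // Fill table once, up front
    }

    public static long factorial(int n) {
        return TABLE[n];                    // Constant-time lookup
    }
}
```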

4.10. Multiple Parameters

  • Compute the rank ordering for all C pixel values in a picture of P pixels.

for (i=0; i<C; i++) {      // Initialize count
   count[i] = 0;
}
for (i=0; i<P; i++) {      // Look at all of the pixels
   count[value(i)]++;      // Increment a pixel value count
}
sort(count);               // Sort pixel value counts
  • If we use P as the measure, then the time is \(\Theta(P \log P)\). Unrealistically high.

  • If we use C as the measure, then the time is \(\Theta(C \log C)\). Unrealistically low.

  • More accurate is \(\Theta(P + C \log C)\).
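
The fragment above leaves value(i) and sort abstract. A runnable version (a sketch with my names, using an int[] of pixel values in place of value(i)) looks like this:

```java
import java.util.Arrays;

public class PixelRank {
    // Count how often each of the C possible values occurs among the
    // P pixels, then sort the counts: Theta(P) + Theta(C log C) total.
    public static int[] sortedValueCounts(int[] pixels, int C) {
        int[] count = new int[C];       // Initialize count (all zeros)
        for (int v : pixels) {          // Look at all of the pixels
            count[v]++;                 // Increment a pixel value count
        }
        Arrays.sort(count);             // Sort pixel value counts
        return count;
    }
}
```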

4.11. Space Complexity

  • Space complexity can also be analyzed with asymptotic complexity analysis.

  • Time complexity is primarily determined by the algorithm.

  • Space complexity is primarily determined by the data structure.