Time and Space Complexity

Time complexity

Time complexity is a way to measure how the execution time of an algorithm grows as the input size (n) increases. It helps us understand how efficient an algorithm is and predict its performance for different input sizes.

Big O notation is used to express time complexity. It describes the upper bound of the growth rate of the algorithm's running time. Here are examples of common time complexities with code explanations:

1. O(1) (Constant Time):

  • The algorithm takes the same amount of time regardless of the input size.

int firstElement = array[0]; // Accessing the first element of an array
boolean isEmpty = list.isEmpty(); // Checking if a list is empty

2. O(log n) (Logarithmic Time):

  • The algorithm's running time grows logarithmically with the input size.

  • Often seen in algorithms that divide the problem into smaller subproblems repeatedly.

int index = Arrays.binarySearch(array, target); // Binary search
int result = tree.floor(key); // Finding the largest element less than or equal to key in a TreeSet (a balanced binary search tree)
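The library calls above hide where the logarithm comes from. A minimal loop makes it visible: the working value is halved on every iteration, so the loop runs about log2(n) times (the class and method names here are illustrative, not part of any library).

```java
// Illustrative O(log n) loop: the value is halved each iteration,
// so the loop body runs about log2(n) times.
public class HalvingDemo {
    public static int halvingSteps(int n) {
        int steps = 0;
        while (n > 1) {
            n /= 2;   // halve the remaining problem size
            steps++;
        }
        return steps;
    }

    public static void main(String[] args) {
        System.out.println(halvingSteps(1024)); // 10, since 2^10 = 1024
    }
}
```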

3. O(n) (Linear Time):

  • The algorithm's running time grows directly proportional to the input size.

  • Common in simple loops that iterate through the entire input.

for (int i = 0; i < n; i++) {
    // some operation on each element
}

4. O(n log n) (Log-linear Time):

  • The algorithm's running time grows slightly faster than linear time, but slower than quadratic time.

  • Often found in efficient sorting algorithms like merge sort and quicksort.

Arrays.sort(array); // O(n log n) on average: dual-pivot quicksort for primitive arrays, TimSort (a merge sort variant) for object arrays
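As a sketch of where the n log n comes from, here is a minimal merge sort (class name illustrative): the recursion is about log n levels deep, and each level does O(n) work merging.

```java
// Illustrative O(n log n) merge sort: ~log n recursion levels,
// O(n) merging work per level.
public class MergeSortDemo {
    // Sorts a[lo..hi) in place.
    public static void mergeSort(int[] a, int lo, int hi) {
        if (hi - lo <= 1) return;       // 0 or 1 elements: already sorted
        int mid = (lo + hi) / 2;
        mergeSort(a, lo, mid);
        mergeSort(a, mid, hi);
        merge(a, lo, mid, hi);
    }

    // Merges the sorted halves a[lo..mid) and a[mid..hi).
    private static void merge(int[] a, int lo, int mid, int hi) {
        int[] tmp = new int[hi - lo];
        int i = lo, j = mid, k = 0;
        while (i < mid && j < hi) tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
        while (i < mid) tmp[k++] = a[i++];
        while (j < hi)  tmp[k++] = a[j++];
        System.arraycopy(tmp, 0, a, lo, tmp.length);
    }

    public static void main(String[] args) {
        int[] a = {5, 2, 9, 1, 5, 6};
        mergeSort(a, 0, a.length);
        System.out.println(java.util.Arrays.toString(a)); // [1, 2, 5, 5, 6, 9]
    }
}
```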

5. O(n^2) (Quadratic Time):

  • The algorithm's running time grows as the square of the input size.

  • Nested loops or algorithms that involve pairwise comparisons often have quadratic time complexity.

for (int i = 0; i < n; i++) {
    for (int j = i + 1; j < n; j++) {
        // some operation involving elements at indices i and j
    }
}
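To make the placeholder loop concrete, here is a hypothetical pairwise task, counting the pairs that sum to a target, which performs roughly n^2/2 comparisons (class and method names are illustrative):

```java
// Illustrative O(n^2) pairwise comparison: examine every pair (i, j) with i < j.
public class PairCountDemo {
    public static int countPairsWithSum(int[] a, int target) {
        int count = 0;
        for (int i = 0; i < a.length; i++) {
            for (int j = i + 1; j < a.length; j++) { // ~n^2/2 iterations total
                if (a[i] + a[j] == target) count++;
            }
        }
        return count;
    }

    public static void main(String[] args) {
        // pairs summing to 5: (1, 4) and (2, 3)
        System.out.println(countPairsWithSum(new int[]{1, 2, 3, 4}, 5)); // 2
    }
}
```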

6. O(n^3) (Cubic Time):

  • The algorithm's running time grows as the cube of the input size.

for (int i = 0; i < n; i++) {
    for (int j = 0; j < n; j++) {
        for (int k = 0; k < n; k++) {
            // some operation involving elements at indices i, j, and k
        }
    }
}

7. O(2^n) (Exponential Time):

  • The algorithm's running time grows exponentially with the input size.

  • Often found in brute-force algorithms that try all possible combinations.

void generateSubsets(int[] arr) {
    int n = arr.length;
    for (int i = 0; i < (1 << n); i++) { // 1 << n represents 2^n subsets
        // generate a subset based on the bits of i
    }
}
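The sketch above can be filled in to a runnable version; this one returns all subsets as lists (the class name and return type are illustrative choices, not implied by the original):

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative O(2^n) subset enumeration: each of the 2^n bit patterns
// of the loop counter selects one subset of the input.
public class SubsetDemo {
    public static List<List<Integer>> generateSubsets(int[] arr) {
        int n = arr.length;
        List<List<Integer>> subsets = new ArrayList<>();
        for (int mask = 0; mask < (1 << n); mask++) { // 1 << n == 2^n
            List<Integer> subset = new ArrayList<>();
            for (int bit = 0; bit < n; bit++) {
                // bit b of mask decides whether arr[b] is in this subset
                if ((mask & (1 << bit)) != 0) subset.add(arr[bit]);
            }
            subsets.add(subset);
        }
        return subsets;
    }

    public static void main(String[] args) {
        // {1, 2} has 2^2 = 4 subsets: {}, {1}, {2}, {1, 2}
        System.out.println(generateSubsets(new int[]{1, 2}).size()); // 4
    }
}
```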

8. O(n!) (Factorial Time):

  • The algorithm's running time grows as the factorial of the input size.

  • Extremely slow for even moderately sized inputs.

void generatePermutations(int[] arr, int n) {
    if (n == 1) {
        // base case: one element left, a complete permutation has been formed
    } else {
        for (int i = 0; i < n; i++) {
            // generate permutations by swapping elements and recursing on n - 1
        }
    }
}
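A runnable version of the swapping idea, here simply counting the permutations it visits to keep the sketch short (class and method names are illustrative):

```java
// Illustrative O(n!) permutation generation by recursive swapping:
// each position is filled with every remaining element in turn.
public class PermutationDemo {
    public static int countPermutations(int[] arr) {
        return permute(arr, 0);
    }

    private static int permute(int[] arr, int k) {
        if (k == arr.length) return 1;    // one complete permutation reached
        int count = 0;
        for (int i = k; i < arr.length; i++) {
            swap(arr, k, i);              // fix position k, recurse on the rest
            count += permute(arr, k + 1);
            swap(arr, k, i);              // undo the swap (backtrack)
        }
        return count;
    }

    private static void swap(int[] a, int i, int j) {
        int t = a[i]; a[i] = a[j]; a[j] = t;
    }

    public static void main(String[] args) {
        System.out.println(countPermutations(new int[]{1, 2, 3})); // 3! = 6
    }
}
```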

Remember:

  • Lower time complexities (like O(1), O(log n), O(n)) generally indicate more efficient algorithms.

  • Higher time complexities (like O(n^2), O(2^n), O(n!)) can become impractical for large inputs.

  • Understanding time complexity is crucial for choosing the right algorithms and optimizing code for performance.

Space complexity

Space complexity quantifies the amount of memory an algorithm requires to run as a function of the input size. It is usually measured as auxiliary space: the additional memory the algorithm needs beyond the input itself.

Here are some common space complexities with examples:

1. O(1) (Constant Space):

  • The algorithm uses the same amount of additional memory no matter the input size.

int sum = 0;
for (int i = 0; i < n; i++) {
  sum += array[i]; // only two extra variables (sum and i), regardless of n
}

2. O(log n) (Logarithmic Space):

  • The additional memory usage grows logarithmically with the input size.

  • Often seen in recursive algorithms that divide the problem into smaller subproblems.

int binarySearch(int[] array, int key, int low, int high) {
  // each recursive call halves the range, so the call stack is O(log n) deep
  if (low > high) return -1;
  int mid = low + (high - low) / 2;
  // ... compare array[mid] with key and recurse on one half
}
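For contrast, the same search written iteratively needs only O(1) extra space, since no recursion stack builds up (class name illustrative):

```java
// Iterative binary search: same O(log n) time as the recursive version,
// but O(1) extra space because no call stack accumulates.
public class IterativeSearchDemo {
    public static int binarySearch(int[] array, int key) {
        int low = 0, high = array.length - 1;
        while (low <= high) {
            int mid = low + (high - low) / 2; // avoids overflow of (low + high)
            if (array[mid] == key) return mid;
            if (array[mid] < key) low = mid + 1;
            else high = mid - 1;
        }
        return -1; // not found
    }

    public static void main(String[] args) {
        System.out.println(binarySearch(new int[]{2, 4, 6, 8}, 6)); // 2
    }
}
```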

3. O(n) (Linear Space):

  • The additional memory usage grows directly proportional to the input size.

  • Common in algorithms that require temporary storage for each element of the input.

int[] copyArray(int[] arr) {
  int[] copy = new int[arr.length]; // additional array of size n needed
  for (int i = 0; i < arr.length; i++) {
    copy[i] = arr[i];
  }
  return copy;
}

4. O(n log n) (Log-linear Space):

  • The additional memory usage grows as n log n, faster than linear space but slower than quadratic space.

  • Arises, for example, in a merge sort that allocates a new temporary array at every level of recursion (implementations that reuse a single shared buffer need only O(n)).

void mergeSort(int[] array, int low, int high) {
  int mid = (low + high) / 2;
  // recursive calls on each half, each level allocating
  // a fresh temporary array for merging
  merge(array, low, mid, high);
}
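To make the allocation pattern explicit, here is a hypothetical merge sort that copies subarrays at every level: roughly n elements are allocated per level across about log n levels, which is where the O(n log n) figure comes from (class name illustrative):

```java
import java.util.Arrays;

// Illustrative merge sort that allocates fresh subarrays at every recursion
// level: ~n elements copied per level across ~log n levels, so total
// allocation is O(n log n). Reusing one shared buffer brings this to O(n).
public class AllocatingMergeSortDemo {
    public static int[] mergeSort(int[] a) {
        if (a.length <= 1) return a;
        int mid = a.length / 2;
        int[] left  = mergeSort(Arrays.copyOfRange(a, 0, mid));      // new array per call
        int[] right = mergeSort(Arrays.copyOfRange(a, mid, a.length));
        return merge(left, right);                                   // another new array
    }

    private static int[] merge(int[] l, int[] r) {
        int[] out = new int[l.length + r.length];
        int i = 0, j = 0, k = 0;
        while (i < l.length && j < r.length) out[k++] = (l[i] <= r[j]) ? l[i++] : r[j++];
        while (i < l.length) out[k++] = l[i++];
        while (j < r.length) out[k++] = r[j++];
        return out;
    }

    public static void main(String[] args) {
        System.out.println(Arrays.toString(mergeSort(new int[]{4, 1, 3, 2}))); // [1, 2, 3, 4]
    }
}
```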

5. O(n^2) (Quadratic Space):

  • The additional memory usage grows as the square of the input size.

  • Occurs in algorithms that require storing intermediate results for all pairwise comparisons between input elements.

int[][] allPairsShortestPaths(int[][] graph) {
  int n = graph.length;
  int[][] distances = new int[n][n]; // n x n matrix: O(n^2) extra space
  for (int i = 0; i < n; i++) {
    for (int j = 0; j < n; j++) {
      // ... calculate and store shortest path distances in distances[i][j]
    }
  }
  return distances;
}

Remember:

  • Lower space complexities are generally preferable, especially for memory-constrained environments.

  • Choosing algorithms with appropriate space complexity can be crucial for efficient program execution.

  • Analyzing both time and space complexity provides a comprehensive understanding of an algorithm's resource requirements.