Time Complexity

Analysis of algorithms is a crucial part of computer science. A problem can often be solved by many different algorithms, and the challenge is to identify the most efficient one.

So how do we recognize the most efficient algorithm among a set of candidates? This is where the concepts of space and time complexity come in. They act as a measuring scale: we compare algorithms by their space complexity (the amount of memory they use) and their time complexity (the number of operations they perform, or the time they take to run).

The total amount of the computer's memory used by an algorithm when it is executed is the space complexity of that algorithm (definition via freeCodeCamp). We won't discuss space complexity in this article, to keep the focus on time complexity.

What is Time Complexity?#

Time complexity is the number of operations an algorithm performs to complete its task. Typically, the most efficient algorithm is the one with the smallest time complexity: it performs the fewest operations while still reaching the correct solution.

Calculating Time Complexity#

Time complexity is commonly calculated and expressed in Big O notation. It is also usually expressed with the worst-case scenario in mind, so it serves as an upper bound: you can be sure the algorithm finishes within the calculated worst-case time.
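To make "worst case" concrete, consider a linear search: if the target happens to be the first element, it finishes in one step, but Big O describes what happens when the target is last or missing. A minimal sketch in Python:

```python
def linear_search(items, target):
    """Return the index of target in items, or -1 if it is absent."""
    for i, value in enumerate(items):
        if value == target:
            return i      # best case: target is the first element, 1 step
    return -1             # worst case: target is last or absent, N steps
```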

A program that loops once through an array of size N has O(N) time complexity. A program that loops through a nested for loop, each level of size N, has O(N²) time complexity; both are sketched below. To learn more in-depth about how to properly calculate more advanced time complexities, check out this article.
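Here is a minimal illustration of those two cases (the function names are our own, for illustration only):

```python
def sum_array(arr):
    """O(N): a single loop over an array of size N."""
    total = 0
    for x in arr:          # runs N times
        total += x
    return total

def all_pairs(arr):
    """O(N²): a nested loop, each level of size N."""
    pairs = []
    for a in arr:          # N iterations
        for b in arr:      # N iterations per outer iteration -> N * N total
            pairs.append((a, b))
    return pairs
```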

The larger the expression inside the parentheses, the longer the algorithm takes to run. For example, an O(N²) algorithm is expected to take longer than an O(N) algorithm, which in turn takes longer than an O(log(N)) algorithm, and so forth. However, this may not always be the case: Big O hides constant factors, so if you graph the actual running times, an algorithm from a "slower" complexity class can be faster over a certain interval of small inputs before the asymptotically faster algorithm eventually overtakes it. Generally speaking, though, since these algorithms often deal with large data, the former assertion is sufficient by itself.
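As a toy example with hypothetical cost models: suppose a linear algorithm does 100 units of work per element, while a quadratic one does 1 unit per pair. The constant 100 and the sample sizes below are made up purely for illustration:

```python
def cost_linear(n):
    return 100 * n     # O(N), but with a large constant factor

def cost_quadratic(n):
    return n * n       # O(N²), but with a small constant factor

for n in (10, 50, 100, 200):
    faster = "quadratic" if cost_quadratic(n) < cost_linear(n) else "linear"
    print(f"n = {n}: {faster} wins")
# The quadratic algorithm wins while n < 100; past that point,
# the linear algorithm overtakes it and stays ahead.
```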

Here is a list of common time complexities, their names, and examples, sorted in ascending order of complexity:

  • O(1) Constant Time - Accessing an element of an array by its index
  • O(log n) Logarithmic Time - Binary search in a sorted array (compare the element at the halfway point of the array against the target value, cutting the search area in half each iteration; see the sketch after this list)
  • O(n) Linear Time - Linear search in an array (checking each element one by one until the target is found); adding all elements of an array together
  • O(n log n) Quasilinear Time - Efficient sorts such as Merge Sort and (on average) Quick Sort
  • O(n²) Quadratic Time - Simpler sorts such as Bubble Sort and Insertion Sort; getting every possible unique pair of elements from an array
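As a sketch of the logarithmic entry above, here is a textbook binary search (it assumes the input list is already sorted):

```python
def binary_search(sorted_arr, target):
    """O(log n): each iteration discards half of the remaining search area."""
    lo, hi = 0, len(sorted_arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2           # element at the halfway point
        if sorted_arr[mid] == target:
            return mid
        elif sorted_arr[mid] < target:
            lo = mid + 1               # target must be in the right half
        else:
            hi = mid - 1               # target must be in the left half
    return -1                          # target not present
```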

Image credit: Woltmann, Sven. "Time Complexity Graph for Big O Notation." happycoders.eu, www.happycoders.eu/algorithms/big-o-notation-time-complexity/.

To learn more about time complexity in general, check out this article.

Created By: WHS Comp Sci Club Officers

CC-BY-NC-SA 4.0 | WHS CSC 2021