Time complexity in data structures

Written by Sonal Moga

What are Asymptotic Notations?
The Time complexity of Algorithms
Space Complexity of an Algorithm

The Time complexity of Algorithms

As we know, analysis of algorithms is required to find the most efficient algorithm for a given task. Two factors help us determine efficiency: the time complexity and the space complexity of the algorithm. Here we will discuss the first factor, time complexity.

Time complexity measures the amount of time an algorithm takes to run as a function of the length of its input. Five common time complexities, each illustrated with a short sketch after the list below, are:

  • Constant time complexity O(1)
  • Linear time complexity O(n)
  • Logarithmic time complexity O(log n)
  • Quadratic time complexity O(n²)
  • Exponential time complexity O(2ⁿ)
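
To make these classes concrete, here is a minimal Python sketch (our own illustration, not from the original article) with one tiny function per class:

def constant(items):
    # O(1): one step, regardless of how long the list is
    return items[0]

def linear(items, target):
    # O(n): may have to look at every element once
    for x in items:
        if x == target:
            return True
    return False

def logarithmic(n):
    # O(log n): the problem is halved at every step
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps

def quadratic(items):
    # O(n²): a nested loop touches every pair of elements
    return [(a, b) for a in items for b in items]

def exponential(n):
    # O(2ⁿ): naive Fibonacci branches into two recursive calls each time
    if n <= 1:
        return n
    return exponential(n - 1) + exponential(n - 2)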

To get more comfortable with the term “time complexity”, let us understand it through a simple search problem, using two very basic algorithms: linear search and binary search.

For both searches, let us take the same mock problem: we will search for an element, say ‘9’, in a given array.

Array= {1, 2, 3, 4, 5, 6, 7, 8, 9}

The linear search starts at the first element and compares every element with the target, which in this case is ‘9’. The algorithm performs 9 comparisons before the target is found and true is returned. This can be considered the worst-case scenario for the algorithm, since it runs for the longest possible time before finding the target.
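
A minimal linear-search sketch in Python (the function name and return convention are our own assumptions, not from the article):

def linear_search(arr, target):
    # Compare every element with the target: O(n) comparisons in the worst case.
    for index, value in enumerate(arr):
        if value == target:
            return index
    return -1

# Worst case from the example: the target 9 is the last element,
# so all 9 elements are compared before it is found.
print(linear_search([1, 2, 3, 4, 5, 6, 7, 8, 9], 9))  # prints 8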

Now let’s search for the same element using binary search.

Here, the target is first compared with the middle element of a sorted array. The search then continues in the left half of the list if the target is less than the middle element, or in the right half if the target is greater than the middle element.

So, in this case, only about 3 comparisons (roughly log₂ 9) are needed until the target element ‘9’ is found. This is also the worst case for binary search, since the search keeps narrowing the range until only the last remaining element matches the target.
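
A minimal binary-search sketch in Python (again our own illustration; it assumes the input array is sorted):

def binary_search(arr, target):
    # Repeatedly halve the search range: O(log n) comparisons in the worst case.
    low, high = 0, len(arr) - 1
    while low <= high:
        mid = (low + high) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            low = mid + 1   # target can only be in the right half
        else:
            high = mid - 1  # target can only be in the left half
    return -1

print(binary_search([1, 2, 3, 4, 5, 6, 7, 8, 9], 9))  # prints 8 after only a few halvings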


We can conclude from this example that binary search has logarithmic time complexity: the number of operations ≈ log₂(9) ≈ 3.

For an array of size n, binary search takes O(log n) time.

In the end, we can see that the number of operations is greatly reduced when we switch the search algorithm. The difference may not seem big here, but when there is a huge amount of data to process in real time, the time complexity of an algorithm plays a very important role.
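
To see how large that gap becomes, here is a small worked comparison (the array size of one million is an illustrative assumption, not from the article):

import math

n = 1_000_000
print("Linear search, worst case:", n, "comparisons")                        # 1,000,000
print("Binary search, worst case:", math.ceil(math.log2(n)), "comparisons")  # about 20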
