Today, I'm going to be talking about Big O Notation, which is a useful way to assess algorithms!
It allows you to identify problems or inefficiencies that may be present, and to see how your algorithm might respond to datasets of different sizes.
Before proceeding, I think it may be useful for you to have an understanding of the terms linear, polynomial, exponential, logarithmic, and permutations.
What is It?
Big O Notation describes the performance of an algorithm, in terms of its 'Time Complexity' and 'Space Complexity'.
Time Complexity - How long it takes to run the algorithm.
Space Complexity - How much memory an algorithm takes up.
Algorithms tend to have 3 cases:
Best Case Scenario - the input that lets the algorithm finish fastest.
Worst Case Scenario - the input that makes the algorithm do the most work.
Average Case Scenario - the performance you'd expect over typical inputs.
Typically when we talk about Big O, we talk about the Worst Case Scenario, because we prepare for the worst.
Higher-order terms 'dominate' lower ones. E.g. if an algorithm has statements where one is O(1) and the other is O(n), then the overall complexity of the algorithm is O(n).
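To make the idea concrete, here's a minimal sketch (the function name `sum_list` is my own, just for illustration) where a couple of O(1) statements sit next to an O(n) loop, and the loop dominates:

```python
def sum_list(numbers):
    total = 0             # O(1): a single assignment
    for x in numbers:     # O(n): runs once per element
        total += x        # O(1) work, repeated n times
    return total          # O(1)

# O(1) + O(n) + O(1) simplifies to O(n): the linear term dominates.
```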
Big O Ignores constants:
E.g. O(5) simplifies to O(1)
O(4n) simplifies to O(n)
O(20 log n) simplifies to O(log n) and so on ...
Different Measures and What They Mean:
O(1) - Constant time. No matter the input size, the algorithm will run in the same amount of time.
E.g. length = len(a)
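As a quick sketch (the wrapper function `get_length` is my own naming), `len()` on a Python list is constant time because lists store their length, so nothing needs to be traversed:

```python
def get_length(a):
    # len() is O(1) for Python lists: the length is stored on the
    # list object, so no traversal is needed regardless of size.
    return len(a)
```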
O(n) - Linear time. Time increases in direct proportion to the size of the dataset.
E.g. the Linear Search Algorithm has a time complexity of O(n): searching a list of 10000 elements can take up to 100 times longer than searching a list of 100. The worst case, when the target is last or not present at all, is O(n).
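Here's one way the Linear Search Algorithm might look in Python (a minimal sketch; the name `linear_search` is my own):

```python
def linear_search(items, target):
    """Return the index of target in items, or -1 if absent."""
    for index, value in enumerate(items):
        if value == target:  # best case: target is the first element, O(1)
            return index
    return -1                # worst case: all n elements were checked, O(n)
```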
O(n^2) - Polynomial time. Time increases in proportion to the square of the size of the dataset.
E.g. programs with 2 nested loops, such as looping through an n x n 2D array.
twoDArray = [[1, 2, 3, 4], [5, 6, 7, 8], [9, 10, 11, 12], [13, 14, 15, 16]]  # 2D array
for row in twoDArray:     # Outer loop: one pass per row.
    for number in row:    # Inner loop: access the individual numbers.
        print(number)     # Output the individual numbers.
O(2^n) - Exponential time. The time taken to execute will double with every additional item added to the dataset.
E.g. recursive calculation of Fibonacci numbers.
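A naive recursive Fibonacci might be sketched like this (the name `fib` is my own): each call spawns two more calls, so the call tree roughly doubles with every extra term, giving O(2^n):

```python
def fib(n):
    # Each call branches into two further calls, so the number of
    # calls roughly doubles for every additional term: O(2^n).
    if n <= 1:
        return n
    return fib(n - 1) + fib(n - 2)
```

Try timing `fib(30)` against `fib(35)` to feel the exponential blow-up for yourself.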
O(log n) - Logarithmic growth. Time grows very slowly as the size of the dataset increases.
E.g. Binary Search, which works well even with very large data sets.
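Here's a sketch of Binary Search (the name `binary_search` is my own; it assumes the input list is already sorted). The search range halves on every pass, which is what gives the O(log n) behaviour:

```python
def binary_search(sorted_items, target):
    """Return the index of target in a sorted list, or -1 if absent."""
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2          # Halve the search range each pass.
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            low = mid + 1                # Discard the lower half.
        else:
            high = mid - 1               # Discard the upper half.
    return -1
```

Doubling the dataset only adds one extra halving step, which is why it scales so well.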
Here's a link to the Big O Cheatsheet which is a very useful resource for all things Big O!!
If you'd like a video about Big O then I'd highly recommend this!
As always, thank you for making it this far, please leave comments, they help me so much.
Be sure to like and share also, let's try and get 100 views within a week, like we have done previously.
Follow CompAndCode on all platforms!
Thank You Nibble Byte!!!