Constant Time: 1
No matter how many elements, operation time will always be the same
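A minimal sketch of a constant-time operation (the function name is just illustrative): indexing into a list costs the same regardless of how many elements it holds.

```python
def get_first(items):
    # Indexing is O(1): the cost does not depend on len(items).
    return items[0]

small = [5, 2, 9]
large = list(range(1_000_000))
print(get_first(small))  # 5
print(get_first(large))  # 0 -- same cost despite far more elements
```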
Logarithmic Time: log(n)
If doubling the number of elements doesn't double the work, the runtime is likely logarithmic. Searching a sorted collection is typically log(n)
Linear Time: n
Iterating through all elements in a collection of data. If you see a loop from 0 to array.length, this is linear runtime
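A sketch of linear time, assuming a simple membership check: one pass over the collection, so the work grows in direct proportion to the number of elements.

```python
def contains(items, target):
    # One loop from 0 to len(items): O(n) work.
    for item in items:
        if item == target:
            return True
    return False

print(contains([4, 8, 15, 16], 15))  # True
print(contains([4, 8, 15, 16], 23))  # False
```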
Quasilinear Time: n*log(n)
Almost linear: the work grows only slightly faster than n. Typical of efficient sorting operations
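A sketch of why sorting is n*log(n), using merge sort as the example: splitting in half repeatedly gives log(n) levels, and the merge work at each level touches all n elements.

```python
def merge_sort(items):
    # log(n) levels of halving; each level does O(n) merge work.
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([3, 1, 4, 1, 5, 9, 2, 6]))  # [1, 1, 2, 3, 4, 5, 6, 9]
```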
Quadratic Time: n^2
Every element in a collection has to be compared to every other element. 'The handshake problem'
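The handshake problem sketched in code: every element is paired with every other, so n people produce n*(n-1)/2 handshakes, which is O(n^2).

```python
def all_handshakes(people):
    # Nested loops over the same collection: O(n^2) comparisons.
    pairs = []
    for i in range(len(people)):
        for j in range(i + 1, len(people)):
            pairs.append((people[i], people[j]))
    return pairs

# 4 people -> 4 * 3 / 2 = 6 handshakes
print(len(all_handshakes(["a", "b", "c", "d"])))  # 6
```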
Exponential Time: 2^n
If you add a single element to a collection, the processing power required doubles.
Constant Time
O(1)
Logarithmic Time
O(log(n))
Linear Time
O(n)
Quasilinear Time
O(n*log(n))
Quadratic Time
O(n^2)
Exponential Time
O(2^n)
Iterating with a simple for loop through a single collection
Probably O(n), linear time
Iterating through half a collection
Still O(n). Constant factors like 1/2 are dropped in Big O notation
Iterating through two different collections with separate for loops
O(n + m)
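A sketch of the O(n + m) case, assuming two separate loops run one after the other: the costs add rather than multiply.

```python
def combined_total(xs, ys):
    # Sequential loops over different collections: n + m steps.
    total = 0
    for x in xs:  # n iterations
        total += x
    for y in ys:  # m iterations
        total += y
    return total

print(combined_total([1, 2, 3], [10, 20]))  # 36
```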
Two nested for loops iterating over the same collection
O(n^2), quadratic
Two nested for loops iterating over different collections
O(n*m), quadratic-like: if the collections are the same size it reduces to O(n^2)
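A sketch of the O(n*m) case: nested loops over two different collections, so every element of one is paired with every element of the other.

```python
def cross_pairs(xs, ys):
    # Inner loop runs m times for each of the n outer iterations: n * m total.
    return [(x, y) for x in xs for y in ys]

# 2 * 3 = 6 pairs
print(len(cross_pairs([1, 2], ["a", "b", "c"])))  # 6
```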
Sorting
O(n*log(n)), quasilinear. Repeatedly cutting the collection in half gives the log(n) part, but there's a linear amount of work (e.g. merging) at each level, which contributes the extra factor of n
Searching a sorted array?
O(log(n)), logarithmic
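Why searching a sorted array is O(log(n)), sketched as a binary search: each comparison discards half of the remaining range, so at most about log2(n) steps are needed.

```python
def binary_search(sorted_items, target):
    # Each iteration halves the search range: O(log(n)) comparisons.
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1  # not found

print(binary_search([1, 3, 5, 7, 9], 5))  # 2
print(binary_search([1, 3, 5, 7, 9], 4))  # -1
```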