Quicksort time complexity:
Worst-case: \(O(n^2)\)
Best-case: \(O(n \log n)\)
Average-case: \(O(n \log n)\)
Pick a random element (the pivot) and partition the array around it:
[all elements less than the pivot] pivot [all elements greater than the pivot]
Then sort each side recursively.
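A minimal quicksort sketch in Python (out-of-place for readability; a production version would typically partition in place). The \(O(n^2)\) worst case arises when the partitions are maximally unbalanced on every call; choosing the pivot at random makes that very unlikely:

```python
import random

def quicksort(arr):
    # Base case: arrays of length 0 or 1 are already sorted.
    if len(arr) <= 1:
        return arr
    # A random pivot makes the O(n^2) worst case unlikely in practice.
    pivot = random.choice(arr)
    less = [x for x in arr if x < pivot]
    equal = [x for x in arr if x == pivot]
    greater = [x for x in arr if x > pivot]
    # Recursively sort both partitions and stitch them back together.
    return quicksort(less) + equal + quicksort(greater)
```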
Merge sort time complexity:
Worst-case: \(O(n \log n)\)
Best-case: \(O(n \log n)\)
Average-case: \(O(n \log n)\)
Merge sort divides the array in half, sorts each of those halves, and then merges them back together.
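A sketch in the same style. Each level of recursion does \(O(n)\) merging work and there are \(O(\log n)\) levels, which is where the \(O(n \log n)\) bound comes from regardless of input order:

```python
def merge_sort(arr):
    if len(arr) <= 1:
        return arr
    # Divide: split the array in half and sort each half recursively.
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # Conquer: merge the two sorted halves in linear time.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```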
Selection sort time complexity:
Worst-case: \(O(n^2)\)
Best-case: \(O(n^2)\)
Average-case: \(O(n^2)\)
Find the smallest element and swap it into the first position. Then find the second smallest and swap it into the second position, and so on.
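An in-place sketch. The scan for the minimum always walks the entire unsorted suffix, which is why even the best case is \(O(n^2)\):

```python
def selection_sort(arr):
    for i in range(len(arr)):
        # Find the smallest element in the unsorted suffix arr[i:].
        min_idx = i
        for j in range(i + 1, len(arr)):
            if arr[j] < arr[min_idx]:
                min_idx = j
        # Swap it into position i, growing the sorted prefix by one.
        arr[i], arr[min_idx] = arr[min_idx], arr[i]
    return arr
```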
Bubble sort time complexity:
Worst-case: \(O(n^2)\)
Best-case: \(O(n)\)
Average-case: \(O(n^2)\)
To sort in ascending order, compare the first two elements and swap them if the first is greater than the second.
Then move on to the next pair, and so on, repeating passes over the array until no swaps are needed.
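A sketch with the early-exit check; a pass that makes no swaps means the array is already sorted, which is what gives the \(O(n)\) best case on sorted input:

```python
def bubble_sort(arr):
    n = len(arr)
    for i in range(n - 1):
        swapped = False
        # Compare adjacent pairs; larger elements "bubble" toward the end.
        for j in range(n - 1 - i):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        # No swaps on this pass means the array is sorted: stop early.
        if not swapped:
            break
    return arr
```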
In computer science, amortized analysis is a method of analyzing algorithms that considers the entire sequence of operations of the program. It allows a worst-case bound to be established for the performance of an algorithm irrespective of the inputs by looking at all of the operations. At the heart of the method is the idea that while certain operations may be extremely costly in resources, they cannot occur at a high enough frequency to weigh down the entire program, because the less costly operations will far outnumber the costly ones in the long run, "paying back" the program over a number of iterations. It is particularly useful because it guarantees worst-case performance rather than making assumptions about the state of the program. [wikipedia]
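The classic example (not from the quoted text) is a growable array: a single append can cost \(O(n)\) when the buffer must be copied, but doubling the capacity means the copies happen only at sizes 1, 2, 4, 8, …, so over \(n\) appends the total copy work is \(O(n)\) and each append is \(O(1)\) amortized. A toy sketch, purely illustrative (Python lists already do this internally):

```python
class DynamicArray:
    """Growable array with capacity doubling."""

    def __init__(self):
        self._capacity = 1
        self._size = 0
        self._data = [None] * self._capacity

    def append(self, value):
        if self._size == self._capacity:
            # Costly operation: allocate a buffer twice as large and
            # copy everything over. Doubling means this occurs only
            # O(log n) times across n appends, so the copies sum to
            # O(n) in total -- O(1) amortized per append.
            self._capacity *= 2
            new_data = [None] * self._capacity
            new_data[:self._size] = self._data
            self._data = new_data
        self._data[self._size] = value
        self._size += 1
```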