# Introduction

The whole point of a sorting algorithm is to order the elements of a list. Commonly this is done numerically or lexicographically (alphabetically).

# Types of Sorts

## Adaptive/Non-Adaptive

Adaptive Sorts change their behaviour depending on how pre-sorted the list already is (i.e. a nearly sorted list is faster to sort than a shuffled one).

Non-Adaptive Sorts have the same sequence of operations independent of the order of the data.
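
To see adaptivity concretely, here is a sketch (names mine) of insertion sort, a classic adaptive sort, instrumented to count comparisons. Pre-sorted input needs only n-1 comparisons, while reversed input needs the full n(n-1)/2:

```python
def insertion_sort_comparisons(a):
    """Sort a copy of `a` and return the number of comparisons made."""
    a = list(a)
    comparisons = 0
    for i in range(1, len(a)):
        j = i
        while j > 0:
            comparisons += 1
            if a[j - 1] > a[j]:
                a[j - 1], a[j] = a[j], a[j - 1]  # swap out-of-order neighbours
                j -= 1
            else:
                break  # prefix already sorted: stop early (this is the adaptive part)
    return comparisons

print(insertion_sort_comparisons(range(10)))         # already sorted: 9 comparisons
print(insertion_sort_comparisons(range(9, -1, -1)))  # reversed: 45 comparisons
```

A non-adaptive sort (e.g. selection sort) would do the same number of comparisons in both cases.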

## Stable/Non-Stable

Stable Sorts preserve the relative order of duplicate elements (as in b a(1) a(2) -> a(1) a(2) b)

Non-Stable Sorts **may** reorder the duplicates (as in a(2) a(1) b)
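
A quick demonstration using Python's built-in `sorted` (Timsort, which is guaranteed stable). The two pairs share the key `'a'`, and the sort by letter keeps them in their original order:

```python
# The second tuple element tags the original order of the duplicates.
items = [("b", 1), ("a", 1), ("a", 2)]

# Sort by the letter only; a stable sort must keep ('a', 1) before ('a', 2).
result = sorted(items, key=lambda pair: pair[0])
print(result)  # [('a', 1), ('a', 2), ('b', 1)]
```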

## In-Place/Non-In-Place

In-Place Sorts use minimal extra data storage; they sort within the given data structure rather than creating a new one.

Non-In-Place Sorts use extra data storage beyond a few temporaries for swaps; the auxiliary memory is typically O(n) or more, which can get out of hand for large inputs.
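
Python exposes both flavours directly: `list.sort()` reorders the existing list, while `sorted()` allocates and returns a new list, leaving the original untouched:

```python
data = [3, 1, 2]

fresh = sorted(data)   # non-in-place: returns a new list, data is unchanged
print(data, fresh)     # [3, 1, 2] [1, 2, 3]

data.sort()            # in-place: data itself is reordered
print(data)            # [1, 2, 3]
```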

# Different Sorting Algorithms

## Bubble Sort
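
A minimal Python sketch (names mine): repeatedly swap adjacent out-of-order pairs; each pass bubbles the largest remaining element to the end, and a pass with no swaps means the list is sorted.

```python
def bubble_sort(a):
    """Sort `a` in place by repeatedly swapping adjacent out-of-order pairs."""
    for end in range(len(a) - 1, 0, -1):
        swapped = False
        for i in range(end):
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
                swapped = True
        if not swapped:  # no swaps this pass: already sorted (makes it adaptive)
            break
    return a
```

Stable and in-place; worst case O(n^2) comparisons, best case O(n) thanks to the early exit.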

## Selection Sort
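
A minimal sketch (names mine): repeatedly find the minimum of the unsorted suffix and swap it into place. It does the same comparisons regardless of input order, so it is non-adaptive.

```python
def selection_sort(a):
    """Sort `a` in place by selecting the minimum of the unsorted suffix."""
    for i in range(len(a) - 1):
        smallest = i
        for j in range(i + 1, len(a)):
            if a[j] < a[smallest]:
                smallest = j
        a[i], a[smallest] = a[smallest], a[i]  # long-range swap: not stable
    return a
```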

## Insertion Sort
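
A minimal sketch (names mine): grow a sorted prefix one element at a time, shifting larger elements right to open a slot for each new element.

```python
def insertion_sort(a):
    """Sort `a` in place by inserting each element into the sorted prefix."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]  # shift larger elements one slot right
            j -= 1
        a[j + 1] = key
    return a
```

Stable, in-place, and adaptive: nearly sorted input costs close to O(n).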

## Shell Sort
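
A minimal sketch (names mine) using the simple halving gap sequence: run insertion sort over elements `gap` apart, shrinking the gap until a final ordinary insertion-sort pass at gap 1.

```python
def shell_sort(a):
    """Sort `a` in place: gapped insertion sorts with a halving gap sequence."""
    gap = len(a) // 2
    while gap > 0:
        for i in range(gap, len(a)):
            key = a[i]
            j = i - gap
            while j >= 0 and a[j] > key:
                a[j + gap] = a[j]  # shift within this gap's subsequence
                j -= gap
            a[j + gap] = key
        gap //= 2
    return a
```

The long-range moves in early passes make it faster than plain insertion sort, but they also make it non-stable.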

## Merge Sort
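
A minimal sketch (names mine): split in half, sort each half recursively, then merge the two sorted halves. O(n lg n) in all cases, but non-in-place (the merge needs O(n) extra storage).

```python
def merge_sort(a):
    """Return a sorted copy of `a` via recursive halving and merging."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:  # <= keeps equal elements in order (stable)
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]  # append whichever half remains
```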

## Quick Sort
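
A minimal sketch (names mine) of the non-in-place formulation for clarity: partition around a pivot, then recursively sort the two sides. (Real implementations usually partition in place.)

```python
def quick_sort(a):
    """Return a sorted copy of `a`: partition around a pivot, recurse on each side."""
    if len(a) <= 1:
        return a
    pivot = a[len(a) // 2]
    smaller = [x for x in a if x < pivot]
    equal = [x for x in a if x == pivot]
    larger = [x for x in a if x > pivot]
    return quick_sort(smaller) + equal + quick_sort(larger)
```

Average case O(n lg n); a consistently bad pivot (e.g. always the minimum) degrades it to O(n^2).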

## Heap Sort
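
A minimal sketch (names mine): build a max-heap over the array, then repeatedly swap the root (the maximum) to the end of the unsorted region and re-sift. In-place and O(n lg n) worst case, but not stable.

```python
def heap_sort(a):
    """Sort `a` in place: heapify, then repeatedly extract the max."""
    def sift_down(root, end):
        while 2 * root + 1 <= end:
            child = 2 * root + 1
            if child + 1 <= end and a[child] < a[child + 1]:
                child += 1  # pick the larger of the two children
            if a[root] < a[child]:
                a[root], a[child] = a[child], a[root]
                root = child
            else:
                return

    n = len(a)
    for start in range(n // 2 - 1, -1, -1):  # heapify: sift down each internal node
        sift_down(start, n - 1)
    for end in range(n - 1, 0, -1):          # move the max to the end, shrink the heap
        a[0], a[end] = a[end], a[0]
        sift_down(0, end - 1)
    return a
```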

## Key Index Sort
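
A minimal sketch (names mine) of key-indexed counting for small integer keys in `0..radix-1`: count each key, turn the counts into starting positions via prefix sums, then distribute. It makes no comparisons at all, so the comparison lower bound below does not apply to it.

```python
def key_index_sort(a, radix):
    """Return `a` sorted by key-indexed counting; keys must be in range(radix)."""
    count = [0] * (radix + 1)
    for key in a:
        count[key + 1] += 1          # tally each key (offset by one)
    for r in range(radix):
        count[r + 1] += count[r]     # prefix sums: start index for each key
    out = [None] * len(a)
    for key in a:
        out[count[key]] = key        # distribute, left to right (stable)
        count[key] += 1
    return out
```

Runs in O(n + radix) time with O(n + radix) extra space, so it is non-in-place.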

# Lower Bound on Complexity (for comparison-based sorts)

Consider a comparison-based sorting algorithm as a binary decision tree: each node is a comparison, and each of the two branches is an outcome (go left or right).

Each leaf is a different ordering of the n elements in the array, and every one of the n! possible orderings must be reachable, so the tree has at least n! leaves.

Let the height of this tree be h. A binary tree of height h has at most 2^{h} leaves, so 2^{h} >= n!, which gives h >= lg(n!).

Hence the sort must make at least lg(n!) comparisons in the worst case.

$ln(n!)$ and $lg(n!)$ differ only by a constant factor, and by Stirling's Approximation $lg(n!) = n\,lg(n) - n\,lg(e) + O(lg\,n)$, which grows as $n\,lg(n)$ once the lower-order terms are dropped.

Hence the lower bound is said to be $\Omega(n\,lg(n))$
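
The Stirling estimate above can be sanity-checked numerically (an illustrative check, not from the original notes): `math.lgamma(n + 1)` gives $ln(n!)$ exactly, and dividing by $ln(2)$ converts it to $lg(n!)$ for comparison against $n\,lg(n) - n\,lg(e)$.

```python
import math

for n in (10, 100, 1000):
    exact = math.lgamma(n + 1) / math.log(2)           # lg(n!) via log-gamma
    approx = n * math.log2(n) - n * math.log2(math.e)  # Stirling-style estimate
    print(n, round(exact, 1), round(approx, 1))
```

The ratio of the two tends to 1 as n grows, which is exactly the "lower-order terms don't matter" claim.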

# Sorting Videos

Merge Sort vs Quick Sort Battle - Jaz and Chris