Sorting-Algorithms-Blender
==========================

Sorting algorithms visualized using the Blender Python API.

Table of contents
=================

* [Introduction](#introduction)
  * [Description](#description)
  * [Getting Started](#getting-started)
  * [Possible Updates](#possible-updates)
* [Sorting Algorithms](#sorting-algorithms)
  * [Bubble Sort](#bubble-sort)
  * [Insertion Sort](#insertion-sort)
  * [Selection Sort](#selection-sort)
  * [Shell Sort](#shell-sort)
  * [Merge Sort](#merge-sort)
  * [Quick Sort](#quick-sort)
* [Big O](#big-o)
  * [What is Big O Notation?](#what-is-big-o-notation)
  * [Time Complexity Notations](#time-complexity-notations)
  * [Table of Sorting Algorithms](#table-of-sorting-algorithms)

Introduction
============

## Description

Running one of the scripts in this project generates primitive meshes in Blender, which are animated to visualize various sorting algorithms.
The three folders (sort_color, sort_combined, sort_scale) contain three different types of visualization.

## Getting Started
  1. Download, install and start Blender.
  2. Open the .py file in the Text Editor.
  3. Click the play button to run the script.
## Possible Updates

Below is a list of features that could be implemented in the future. Contributions with ideas from this list, or your own, are welcome.

Sorting Algorithms
==================

## Bubble Sort

Bubble sort is one of the most straightforward sorting algorithms: it makes multiple passes through a list, comparing adjacent items and swapping those that are out of order.

In essence, each item “bubbles” up to the location where it belongs.
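Stripped of the Blender animation code, the passes described above can be sketched in plain Python (the function name is illustrative, not taken from the repo's scripts):

```python
def bubble_sort(items):
    """Repeatedly pass through the list, swapping adjacent out-of-order pairs."""
    n = len(items)
    for i in range(n - 1):
        swapped = False
        # After pass i, the largest i+1 items have bubbled to the end.
        for j in range(n - 1 - i):
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
                swapped = True
        if not swapped:
            break  # no swaps in a full pass: the list is already sorted
    return items
```

The `swapped` flag gives the Ω(n) best case from the table below: an already-sorted list needs only one pass.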

| bubble_sort_scale.py | bubble_sort_color.py |
| ------------- |:-------------:|
| ![BubbleSort2](https://user-images.githubusercontent.com/78089013/174035707-e1475bd4-a3c6-4e74-ba9f-30b57335adfe.gif) | ![BubbleColor2](https://user-images.githubusercontent.com/78089013/174149862-2ed3c492-0987-4194-834f-fc5276299bcc.gif) |

## Insertion Sort

Like bubble sort, the insertion sort algorithm is straightforward to implement and understand.

It splits the given array into a sorted and an unsorted part; values from the unsorted part are picked one by one and placed at the correct position in the sorted part.
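As a minimal plain-Python sketch of that idea (without the Blender animation code; the function name is illustrative):

```python
def insertion_sort(items):
    """Grow a sorted prefix; insert each new value at its correct position."""
    for i in range(1, len(items)):
        value = items[i]   # next value from the unsorted part
        j = i - 1
        while j >= 0 and items[j] > value:
            items[j + 1] = items[j]  # shift larger sorted elements right
            j -= 1
        items[j + 1] = value  # drop the value into its slot
    return items
```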

| insertion_sort_scale.py | insertion_sort_color.py |
| ------------- |:-------------:|
| ![InsertionSort2](https://user-images.githubusercontent.com/78089013/174035509-714265d2-4d27-4d77-b809-997f4e233feb.gif) | ![InsertionColor](https://user-images.githubusercontent.com/78089013/174154736-ada0e27f-88d0-4707-ba99-14ed967cce21.gif) |

## Selection Sort

The selection sort algorithm sorts an array by repeatedly finding the minimum element (considering ascending order) in the unsorted part and putting it at the beginning.
The algorithm maintains two subarrays in a given array.

In every iteration of selection sort, the minimum element (considering ascending order) from the unsorted subarray is picked and moved to the sorted subarray.
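The two-subarray scheme can be sketched in plain Python as follows (animation code omitted; the function name is illustrative):

```python
def selection_sort(items):
    """items[:i] is the sorted subarray, items[i:] the unsorted one."""
    n = len(items)
    for i in range(n - 1):
        # Find the minimum element of the unsorted subarray.
        min_index = i
        for j in range(i + 1, n):
            if items[j] < items[min_index]:
                min_index = j
        # Swap it to the end of the sorted subarray.
        items[i], items[min_index] = items[min_index], items[i]
    return items
```

Note that the scan for the minimum always runs over the whole unsorted part, which is why even the best case in the table below is Ω(n^2).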

| selection_sort_scale.py | selection_sort_color.py |
| ------------- |:-------------:|
| ![SelectionSort2](https://user-images.githubusercontent.com/78089013/174033035-b6b9527a-3d12-4844-b066-50b4cb9d11ef.gif) | ![SelectionSort2](https://user-images.githubusercontent.com/78089013/174156159-605f5121-06c3-4314-a22c-5f7919bb9c44.gif) |

## Shell Sort

The shell sort algorithm extends insertion sort and is very efficient at sorting widely unsorted arrays.
The array is divided into sub-arrays, and insertion sort is applied to each of them.

The technique works by comparing and sorting pairs of elements that are far apart from each other, then progressively reducing the gap between them.
The gap is known as the interval, and it can be calculated with the help of Knuth's formula.
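A plain-Python sketch of this gapped insertion sort, using Knuth's formula (h = 3h + 1, giving intervals 1, 4, 13, 40, ...) to generate the gaps; the function name is illustrative:

```python
def shell_sort(items):
    n = len(items)
    # Knuth's formula: build the largest interval below n // 3.
    gap = 1
    while gap < n // 3:
        gap = 3 * gap + 1
    while gap >= 1:
        # Insertion sort on elements that are `gap` apart.
        for i in range(gap, n):
            value = items[i]
            j = i
            while j >= gap and items[j - gap] > value:
                items[j] = items[j - gap]
                j -= gap
            items[j] = value
        gap //= 3  # reduce the interval for the next round
    return items
```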

| shell_sort_scale.py | shell_sort_color.py |
| ------------- |:-------------:|
| ![ShellSort2](https://user-images.githubusercontent.com/78089013/174032744-6d968c18-8fdb-4268-937f-55e910f8c4d5.gif) | ![ShellColor](https://user-images.githubusercontent.com/78089013/174157836-a4571ad7-0fd1-4237-9fb2-dc2d730a64c7.gif) |

## Merge Sort

Merge sort uses the divide-and-conquer approach to sort the elements. It is one of the most popular and efficient sorting algorithms.
It divides the given list into two equal halves, calls itself for the two halves and then merges the two sorted halves.
We have to define the merge() function to perform the merging. The sub-lists are divided again and again into halves until the list cannot be divided further.
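A minimal sketch of such a merge() function, together with the recursive splitting, in plain Python (animation code omitted):

```python
def merge(left, right):
    """Merge two already-sorted lists into one sorted list."""
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])   # one of these two is non-empty at most
    merged.extend(right[j:])
    return merged

def merge_sort(items):
    if len(items) <= 1:
        return items  # a list of 0 or 1 elements cannot be divided further
    mid = len(items) // 2
    return merge(merge_sort(items[:mid]), merge_sort(items[mid:]))
```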
Then we combine pairs of one-element lists into two-element lists, sorting them in the process.
The sorted two-element lists are merged into four-element lists, and so on, until we get the fully sorted list.

| merge_sort_scale.py | merge_sort_color.py |
| ------------- |:-------------:|
| ![MergeSort2](https://user-images.githubusercontent.com/78089013/174032376-9b9768aa-5891-468e-a7a9-1d494374c5a4.gif) | ![MergeColor2](https://user-images.githubusercontent.com/78089013/174161064-3fff2b70-90db-425c-acab-0d87040ec205.gif) |

## Quick Sort

Like merge sort, quicksort is a divide-and-conquer algorithm. It picks an element as the pivot and partitions the given array around it. There are many versions of quicksort that pick the pivot in different ways.

The key process in quicksort is partition(): given an array and an element x of the array as the pivot, put x at its correct position in the sorted array, with all smaller elements before x and all greater elements after it.
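The partition step can be sketched in plain Python, here using the Lomuto scheme with the last element as the pivot (one common choice among the many pivot strategies; the function names are illustrative):

```python
def partition(items, low, high):
    """Place the pivot at its final sorted index and return that index."""
    pivot = items[high]  # last element as pivot (Lomuto scheme)
    i = low - 1          # boundary of the "smaller than pivot" region
    for j in range(low, high):
        if items[j] < pivot:
            i += 1
            items[i], items[j] = items[j], items[i]
    # Move the pivot between the smaller and greater regions.
    items[i + 1], items[high] = items[high], items[i + 1]
    return i + 1

def quick_sort(items, low=0, high=None):
    if high is None:
        high = len(items) - 1
    if low < high:
        p = partition(items, low, high)
        quick_sort(items, low, p - 1)   # sort the left part
        quick_sort(items, p + 1, high)  # sort the right part
    return items
```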
All this should be done in linear time.

| quick_sort_scale.py | quick_sort_color.py |
| ------------- |:-------------:|
| ![QuickSort2](https://user-images.githubusercontent.com/78089013/174031317-2c261df1-6786-42e5-be08-1a7f5fccbca6.gif) | ![QuickColor](https://user-images.githubusercontent.com/78089013/174161905-a3a2d1bd-0064-4e23-92b3-50da150c6f0c.gif) |

Big O
=====

### What is Big O Notation?

Big O notation is the language we use to talk about how long an algorithm takes to run.
It's how we compare the efficiency of different approaches to a problem.
With Big O notation we express the runtime in terms of **how quickly it grows relative to the input, as the input gets arbitrarily large.**

### Time Complexity Notations

These are the asymptotic notations used for describing the time complexity of the sorting algorithms:

* **Ω (Big Omega)**: best-case running time.
* **Θ (Big Theta)**: average-case running time.
* **O (Big O)**: worst-case running time.

### Table of Sorting Algorithms
| Algorithm | Time: Best Case | Time: Average Case | Time: Worst Case | Space: Worst Case |
| --------- |:---------------:|:------------------:|:----------------:|:-----------------:|
| Quick Sort | Ω(n log(n)) | Θ(n log(n)) | O(n^2) | O(log(n)) |
| Merge Sort | Ω(n log(n)) | Θ(n log(n)) | O(n log(n)) | O(n) |
| Tim Sort | Ω(n) | Θ(n log(n)) | O(n log(n)) | O(n) |
| Heap Sort | Ω(n log(n)) | Θ(n log(n)) | O(n log(n)) | O(1) |
| Bubble Sort | Ω(n) | Θ(n^2) | O(n^2) | O(1) |
| Insertion Sort | Ω(n) | Θ(n^2) | O(n^2) | O(1) |
| Selection Sort | Ω(n^2) | Θ(n^2) | O(n^2) | O(1) |
| Tree Sort | Ω(n log(n)) | Θ(n log(n)) | O(n^2) | O(n) |
| Shell Sort | Ω(n log(n)) | Θ(n(log(n))^2) | O(n(log(n))^2) | O(1) |
| Bucket Sort | Ω(n+k) | Θ(n+k) | O(n^2) | O(n) |
| Radix Sort | Ω(nk) | Θ(nk) | O(nk) | O(n+k) |
| Counting Sort | Ω(n+k) | Θ(n+k) | O(n+k) | O(k) |
| Cube Sort | Ω(n) | Θ(n log(n)) | O(n log(n)) | O(n) |