Seven Common Comparison-Based Sorting Algorithms (Detailed Explanation in Python)

Keywords: Python


Seven common comparison-based sorting algorithms

Bubble sort, insertion sort, selection sort, quick sort, Shell sort, merge sort, and heap sort, explained and implemented in Python.
Algorithm stability: an algorithm is stable if elements with equal values keep their original relative order after sorting.
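Stability is easiest to see with records that share a key. The following illustration (not part of the original article) uses Python's built-in sorted(), which is guaranteed stable:

```python
# Records with equal keys; the letters reveal the original order.
data = [(2, 'a'), (1, 'b'), (2, 'c'), (1, 'd')]

# sorted() is stable: the two records with key 2 keep their original
# relative order ('a' before 'c'), and likewise for key 1.
result = sorted(data, key=lambda pair: pair[0])
print(result)  # [(1, 'b'), (1, 'd'), (2, 'a'), (2, 'c')]
```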

1. Bubble sort

Time complexity: O(n²)
Stable

Main idea:

In each pass, compare adjacent numbers from the beginning: the smaller number "floats" forward and the larger number "sinks" backward.

python implementation:

def bubble_sort(arr):
	for i in range(0,len(arr)):
		#After each pass, the largest remaining value has "sunk" to the end
		for j in range(1,len(arr)-i):
			#Compare adjacent elements in the unsorted part, moving the larger one backward
			if arr[j]<arr[j-1]:
				arr[j-1],arr[j]=arr[j],arr[j-1]
	return arr

2. Insertion sort

Time complexity: O(n²)
Stable

Main idea: insert each element into its proper position in the already-sorted prefix (comparing elements from back to front within the sorted part).

python implementation:

def insert_sort(arr):
	for i in range(1,len(arr)):
		#i is the index of the element to insert; compare it with the sorted prefix arr[:i] from back to front
		pre_index=i-1
		current=arr[i]
		#Shift every value larger than current one step backward until the insertion point is found
		while pre_index>=0 and arr[pre_index]>current:
			arr[pre_index+1]=arr[pre_index]
			pre_index-=1
		arr[pre_index+1]=current
	return arr

3. Selection sort

Time complexity: O(n²)
Unstable

Main idea: in each pass, select the smallest remaining element and swap it to the front of the unsorted part.

python implementation:

def select_sort(arr):
	for i in range(0,len(arr)):
		#Find the index of the minimum value in the unsorted part and swap it with the current arr[i]
		min_index=i
		for j in range(i+1,len(arr)):
			if arr[j]<arr[min_index]:
				min_index=j
		if min_index!=i:
			arr[i],arr[min_index]=arr[min_index],arr[i]
	return arr
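The long-distance swap is exactly what makes selection sort unstable. A small demonstration (my own variant, not from the article) sorts pairs by their first field only, so equal keys stay distinguishable:

```python
# Hypothetical key-based variant of the selection sort above, used only
# to make the instability visible.
def select_sort_by_key(arr, key):
    for i in range(len(arr)):
        min_index = i
        for j in range(i + 1, len(arr)):
            if key(arr[j]) < key(arr[min_index]):
                min_index = j
        if min_index != i:
            arr[i], arr[min_index] = arr[min_index], arr[i]
    return arr

pairs = [(2, 'a'), (2, 'b'), (1, 'c')]
select_sort_by_key(pairs, key=lambda p: p[0])
# The first pass swaps (2, 'a') past (2, 'b'), reversing their order:
print(pairs)  # [(1, 'c'), (2, 'b'), (2, 'a')]
```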
			

4. Quick sort

Time complexity: O(n log n)
Unstable

Main idea: choose a pivot element (usually the first or last one), partition the sequence into a subsequence of elements less than or equal to the pivot and a subsequence of elements greater than the pivot, then quick-sort each subsequence recursively.

python implementation:

def quick_sort(arr):
	#Recursive implementation: the termination condition comes first
	if len(arr)<2:
		return arr
	#Select arr[0] as the pivot and compare arr[1:] against it
	left,right=[],[]
	for i in range(1,len(arr)):
		if arr[i]<=arr[0]:
			left.append(arr[i])
		else:
			right.append(arr[i])
	#After each partition the pivot's final position is fixed; recursively quick-sort the two subsequences
	return quick_sort(left)+[arr[0]]+quick_sort(right)
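The version above allocates new left/right lists on every call. A common in-place alternative (a sketch of my own, not from the article) is the Lomuto partition scheme with the last element as pivot:

```python
# In-place quick sort using Lomuto partitioning (illustrative sketch).
def quick_sort_inplace(arr, low=0, high=None):
    if high is None:
        high = len(arr) - 1
    if low < high:
        pivot = arr[high]
        i = low  # boundary of the "<= pivot" region
        for j in range(low, high):
            if arr[j] <= pivot:
                arr[i], arr[j] = arr[j], arr[i]
                i += 1
        arr[i], arr[high] = arr[high], arr[i]  # put the pivot in place
        quick_sort_inplace(arr, low, i - 1)
        quick_sort_inplace(arr, i + 1, high)
    return arr

print(quick_sort_inplace([3, 6, 1, 5, 2, 4]))  # [1, 2, 3, 4, 5, 6]
```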

5. Merge sort

Time complexity: O(n log n)
Stable

Main idea: first merge pairs of single elements into ordered runs of two, then merge those into ordered runs of four, and so on.

python implementation:

Recursive method: decompose the large problem into smaller ones, solve the small problems first (by the function calling itself), then combine them to solve the large problem.

def recursion_merge_sort(arr):
	#Split into two halves and sort each half recursively
	mid=len(arr)//2
	arr1=arr[:mid]
	arr2=arr[mid:]
	if len(arr1)>1:
		arr1=recursion_merge_sort(arr1)
	if len(arr2)>1:
		arr2=recursion_merge_sort(arr2)
	result=[]
	#Merge the two sorted subsequences
	while arr1 and arr2:
		if arr1[0]<arr2[0]:
			result.append(arr1.pop(0))
		else:
			result.append(arr2.pop(0))
	if arr1:
		result=result+arr1
	if arr2:
		result=result+arr2
	return result

Non-recursive method: solve the small subproblems first, write the results back into the array in place (iteration), and merge the subproblems step by step with a doubling run length.

def non_recursion_merge_sort(arr):
	#Split the unsorted sequence into runs of length i, doubling i each round
	i=1
	while i<len(arr):
		low=0
		while low<len(arr):
			mid=low+i
			high=min(mid+i,len(arr))
			if mid<high:
				left,right=arr[low:mid],arr[mid:high]
				#Merge each pair of adjacent runs into one ordered run
				result=[]
				while left and right:
					if left[0]<right[0]:
						result.append(left.pop(0))
					else:
						result.append(right.pop(0))
				if left:
					result+=left
				if right:
					result+=right
				#Write the merged result back in place
				arr[low:high]=result
			low+=2*i
		i*=2
	return arr
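Both versions merge by popping from the front of a list, and list.pop(0) is O(n) in Python. An index-based merge (a sketch with names of my own choosing, not from the article) avoids that cost and keeps the sort stable:

```python
# Index-based merge: walk both runs with cursors instead of pop(0).
def merge(left, right):
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:  # <= keeps equal elements in order (stable)
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])   # at most one of these two is non-empty
    result.extend(right[j:])
    return result

def merge_sort(arr):
    if len(arr) < 2:
        return arr
    mid = len(arr) // 2
    return merge(merge_sort(arr[:mid]), merge_sort(arr[mid:]))

print(merge_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```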

6. Shell sort

Time complexity: about O(n^1.3)
Unstable

Main idea: group the sequence by a certain increment (the gap between grouped elements), apply insertion sort within each group, then gradually shrink the increment.

python implementation:

def shell_sort(arr):
	gap=len(arr)//2
	while gap>0:
		#Equivalent to insertion sort with stride gap
		for i in range(gap,len(arr)):
			current=arr[i]
			pre_index=i-gap
			#Shift larger elements backward by gap steps until the insertion point is found
			while pre_index>=0 and arr[pre_index]>current:
				arr[pre_index+gap]=arr[pre_index]
				pre_index-=gap
			arr[pre_index+gap]=current
		#Shrink the gap gradually (a gap of 1 is plain insertion sort)
		gap//=2
	return arr
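Halving the gap is only one choice of increment sequence. A variant sketch (mine, not from the article) uses Knuth's sequence 1, 4, 13, 40, ..., which tends to perform better in practice:

```python
# Shell sort with Knuth's gap sequence h = 3*h + 1 (illustrative sketch).
def shell_sort_knuth(arr):
    gap = 1
    while gap < len(arr) // 3:
        gap = gap * 3 + 1  # 1, 4, 13, 40, ...
    while gap > 0:
        for i in range(gap, len(arr)):
            current = arr[i]
            pre = i - gap
            while pre >= 0 and arr[pre] > current:
                arr[pre + gap] = arr[pre]
                pre -= gap
            arr[pre + gap] = current
        gap //= 3  # step back down the sequence: ... 13, 4, 1
    return arr

print(shell_sort_knuth([9, 8, 3, 7, 5, 6, 4, 1]))  # [1, 3, 4, 5, 6, 7, 8, 9]
```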

7. Heap sort

Time complexity: O(n log n)
Unstable

Main idea: for ascending order, build a max-heap (a complete binary tree in which every node's value is greater than or equal to the values in its left and right subtrees); in each pass, swap the heap top with the last element of the heap, shrink the heap by one, re-adjust the max-heap, and repeat.

python implementation:

def heap_sort(arr):
	#Sift down the subtree rooted at start (compare the root with its children and move the larger child up); end is the index of the last element still in the heap
	def adjust_heap(start,end):
		root=start
		#If a node's index is i, its left child is at 2*i+1 and its right child at 2*i+2
		while True:
			child=2*root+1
			#No left child: root is a leaf
			if child>end:
				break
			#Pick the larger of the two children
			if child+1<=end and arr[child+1]>arr[child]:
				child+=1
			if arr[root]<arr[child]:
				arr[root],arr[child]=arr[child],arr[root]
				root=child
			#The larger value is already on top; stop adjusting
			else:
				break
	#Build the max-heap
	#Index of the last non-leaf node in the complete binary tree
	last_no_leave=len(arr)//2-1
	while last_no_leave>=0:
		#Sift down every subtree that has children, from the bottom up
		adjust_heap(last_no_leave,len(arr)-1)
		last_no_leave-=1
	#Swap the heap top with the last element, re-adjust the heap, and repeat
	end=len(arr)-1
	while end>0:
		arr[0],arr[end]=arr[end],arr[0]
		#Exclude the last element and restore the max-heap on the rest
		adjust_heap(0,end-1)
		end-=1
	return arr
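For comparison, Python's standard library ships a binary heap in the heapq module (a min-heap rather than a max-heap); the same idea yields an ascending sort. This is an illustration of the data structure, not part of the original article:

```python
import heapq

def heapq_sort(arr):
    # heapq maintains a min-heap, so repeated pops come out ascending.
    heap = list(arr)       # copy so the input is not modified
    heapq.heapify(heap)    # O(n) heap construction
    return [heapq.heappop(heap) for _ in range(len(heap))]

print(heapq_sort([5, 1, 4, 2, 3]))  # [1, 2, 3, 4, 5]
```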

Summary: among the comparison sorts

Three are stable: bubble sort, insertion sort, and merge sort. Four are unstable: selection sort, quick sort, Shell sort, and heap sort.
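Any of the seven implementations can be sanity-checked against Python's built-in sorted() on random inputs. A sketch of such a checker follows; bubble_sort is redefined inside the snippet only so that it is self-contained:

```python
import random

def check_sort(sort_fn, trials=100):
    """Compare sort_fn against sorted() on random lists (with duplicates)."""
    for _ in range(trials):
        data = [random.randint(0, 50) for _ in range(random.randint(0, 30))]
        assert sort_fn(list(data)) == sorted(data)
    return True

# Same bubble sort as in section 1, repeated here for self-containment.
def bubble_sort(arr):
    for i in range(len(arr)):
        for j in range(1, len(arr) - i):
            if arr[j] < arr[j - 1]:
                arr[j - 1], arr[j] = arr[j], arr[j - 1]
    return arr

print(check_sort(bubble_sort))  # True
```

The same call, e.g. check_sort(heap_sort), works for each of the other implementations in this article.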

Posted by hawkenterprises on Fri, 19 Jun 2020 22:03:29 -0700