- Python Data Structures and Algorithms
- Benjamin Baka
Composing complexity classes
Normally, we need to find the total running time of a number of basic operations. It turns out that we can combine the complexity classes of simple operations to find the complexity class of more complex, combined operations. The goal is to analyze the combined statements in a function or method to understand the total time complexity of executing several operations. The simplest way to combine two complexity classes is to add them. This occurs when we have two sequential operations. For example, consider the two operations of inserting an element into a list and then sorting that list. We can see that inserting an item occurs in O(n) time and sorting is O(n log n) time. We can write the total time complexity as O(n + n log n), that is, we bring the two functions inside the O(...). We are only interested in the highest order term, so this leaves us with just O(n log n).
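As a quick illustration, here is a minimal sketch (the list contents are arbitrary) of the two sequential operations described above:
data = [9, 3, 7, 1, 5]
data.insert(0, 4)   # inserting at the front is O(n): every existing element shifts
data.sort()         # Python's built-in sort runs in O(n log n)
# Total cost: O(n + n log n) = O(n log n), since the n log n term dominates
print(data)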
If we repeat an operation, for example, in a while loop, then we multiply the complexity class by the number of times the operation is carried out. If an operation with time complexity O(f(n)) is repeated O(n) times then we multiply the two complexities:
O(f(n)) * O(n) = O(n f(n)).
For example, suppose the function f(...) has a time complexity of O(n²) and it is executed n times in a for loop, as follows:
for i in range(n):
    f(...)
The time complexity of this loop then becomes O(n²) * O(n) = O(n * n²) = O(n³). Here we are simply multiplying the time complexity of the operation by the number of times this operation executes. The running time of a loop is at most the running time of the statements inside the loop multiplied by the number of iterations. A single nested loop, that is, one loop nested inside another loop, will run in n² time, assuming both loops run n times. For example:
for i in range(0, n):
    for j in range(0, n):
        # statements
Each statement is a constant, c, executed n * n times, so we can express the running time as c * n * n = cn² = O(n²).
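As a quick check, the following sketch (with an illustrative counter added) tallies how many times the inner statement runs for a small n:
n = 4
count = 0
for i in range(0, n):
    for j in range(0, n):
        count += 1   # stands in for the constant-time statement
print(count)   # prints 16, that is, n * n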
For consecutive statements, including those within nested loops, we add the time complexities of the statements, each multiplied by the number of times it is executed. For example:
n = 500   # c0
# executes n times
for i in range(0, n):
    print(i)   # c1
# outer loop executes n times
for i in range(0, n):
    # inner loop executes n times
    for j in range(0, n):
        print(j)   # c2
This can be written as c0 + c1n + c2n² = O(n²).
We can define (base 2) logarithmic complexity as reducing the size of the problem by half, in constant time. For example, consider the following snippet:
i = 1
while i <= n:
    i = i * 2
    print(i)
Notice that i doubles on each iteration. If we run this with n = 10, we see that it prints out four numbers: 2, 4, 8, and 16. If we double n, it prints out five numbers. With each subsequent doubling of n, the number of iterations increases by only 1. If we assume k iterations, we can write this as follows:
2^k = n, that is, k = log(n) (to base 2)
From this we can conclude that the total time = O(log(n)).
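To see this growth concretely, here is a small sketch (the helper function name is illustrative) that counts the iterations of the doubling loop for a few values of n:
def doubling_iterations(n):
    i = 1
    count = 0
    while i <= n:
        i = i * 2
        count += 1
    return count

for n in (10, 20, 40, 80):
    print(n, doubling_iterations(n))   # prints 4, 5, 6, 7 iterations, roughly log2(n)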
Although Big O is the most commonly used notation in asymptotic analysis, there are two other related notations that should be briefly mentioned: Omega notation and Theta notation.