
Composing complexity classes

Normally, we need to find the total running time of a number of basic operations. It turns out that we can combine the complexity classes of simple operations to find the complexity class of more complex, combined operations. The goal is to analyze the combined statements in a function or method to understand the total time complexity of executing several operations. The simplest way to combine two complexity classes is to add them. This occurs when we have two sequential operations. For example, consider the two operations of inserting an element into a list and then sorting that list. Inserting an item takes O(n) time and sorting takes O(n log n) time. We can write the total time complexity as O(n + n log n); that is, we bring the two functions inside the O(...). Since we are only interested in the highest-order term, this leaves us with just O(n log n).
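As a minimal sketch (the list contents and sizes are my own illustration), the two sequential operations might look like this in Python, where the sort dominates the overall cost:

```python
import random

# Build a small list of random integers.
data = [random.randint(0, 100) for _ in range(10)]

# Inserting at the front of a Python list shifts every element: O(n).
data.insert(0, 42)

# Python's built-in sort (Timsort) runs in O(n log n).
data.sort()

# Sequential cost: O(n + n log n), which simplifies to O(n log n).
```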

If we repeat an operation, for example, in a loop, then we multiply the complexity class by the number of times the operation is carried out. If an operation with time complexity O(f(n)) is repeated O(n) times, then we multiply the two complexities:

O(f(n)) * O(n) = O(n f(n))

For example, suppose the function f(...) has a time complexity of O(n²) and it is executed n times in a for loop as follows:

    for i in range(n):
        f(...)

The time complexity of this loop then becomes O(n²) * O(n) = O(n * n²) = O(n³). Here we are simply multiplying the time complexity of the operation by the number of times this operation executes. The running time of a loop is at most the running time of the statements inside the loop multiplied by the number of iterations. A single nested loop, that is, one loop nested inside another loop, will run in O(n²) time assuming both loops run n times. For example:

    for i in range(0, n):
        for j in range(0, n):
            # statements

Each statement is a constant, c, executed n * n times, so we can express the running time as c * n * n = cn² = O(n²).
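To see the n * n count concretely, here is a small sketch (my own illustration) that counts how many times the constant-time statement executes:

```python
n = 10
count = 0
for i in range(0, n):
    for j in range(0, n):
        count += 1  # the constant-time statement inside the nested loop

print(count)  # n * n = 100 executions
```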

For consecutive statements within nested loops, we add the time complexities of each statement and multiply by the number of times each statement is executed. For example:

    n = 500                      # c0
    for i in range(0, n):        # executes n times
        print(i)                 # c1
    for i in range(0, n):        # executes n times
        for j in range(0, n):    # executes n times for each i
            print(j)             # c2, executes n * n times in total

This can be written as c₀ + c₁n + c₂n² = O(n²).
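As a quick check (my own sketch), we can count each constant-time operation in the snippet above instead of printing, and see that the n² term dominates:

```python
n = 500
ops = 1  # c0: the single assignment executes once

for i in range(0, n):
    ops += 1  # c1: executes n times

for i in range(0, n):
    for j in range(0, n):
        ops += 1  # c2: executes n * n times

print(ops)  # 1 + n + n**2 = 250501
```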

We can define (base 2) logarithmic complexity as reducing the size of the problem by half in constant time. For example, consider the following snippet:

    i = 1
    while i <= n:
        i = i * 2
        print(i)

Notice that i doubles on each iteration. If we run this with n = 10, we see that it prints out four numbers: 2, 4, 8, and 16. If we double n, it prints out five numbers. With each subsequent doubling of n, the number of iterations increases by only 1. If we assume the loop runs for k iterations, we can write this as follows:

After k iterations, i = 2^k, and the loop terminates when 2^k > n, that is, after k = log₂(n) iterations. From this we can conclude that the total time = O(log n).
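We can confirm this empirically with a small sketch (the helper name is my own): the iteration count tracks floor(log₂(n)) + 1, so doubling n adds only one extra iteration.

```python
import math

def count_iterations(n):
    """Count how many times the doubling loop runs for a given n."""
    i, count = 1, 0
    while i <= n:
        i = i * 2
        count += 1
    return count

# Logarithmic growth: each doubling of n adds just one iteration.
for n in (10, 20, 40, 1000):
    assert count_iterations(n) == math.floor(math.log2(n)) + 1
```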

Although Big O is the most widely used notation in asymptotic analysis, there are two other related notations that should be briefly mentioned: Omega notation and Theta notation.
