
Composing complexity classes

Normally, we need to find the total running time of a number of basic operations. It turns out that we can combine the complexity classes of simple operations to find the complexity class of more complex, combined operations. The goal is to analyze the combined statements in a function or method to understand the total time complexity of executing several operations. The simplest way to combine two complexity classes is to add them. This occurs when we have two sequential operations. For example, consider the two operations of inserting an element into a list and then sorting that list. We can see that inserting an item occurs in O(n) time and sorting is O(n log n) time. We can write the total time complexity as O(n + n log n), that is, we bring the two functions inside the O(...). We are only interested in the highest-order term, so this leaves us with just O(n log n).
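
As a minimal sketch of two sequential operations (the random data and the insert position are arbitrary choices for illustration), the following snippet inserts an item into a list and then sorts it; the insert runs in O(n) and the sort in O(n log n), so the combined cost is O(n + n log n) = O(n log n):

    import random

    data = [random.randint(0, 1000) for _ in range(1000)]   # a list of n items

    data.insert(0, 42)   # inserting at the front shifts every element: O(n)
    data.sort()          # Python's built-in sort runs in O(n log n)
    # total: O(n + n log n) = O(n log n), since the highest-order term dominates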

If we repeat an operation, for example, in a while loop, then we multiply the complexity class by the number of times the operation is carried out. If an operation with time complexity O(f(n)) is repeated O(n) times then we multiply the two complexities:

O(f(n)) * O(n) = O(n f(n)).

For example, suppose the function f(...) has a time complexity of O(n²) and it is executed n times in a for loop as follows:

    for i in range(n):
        f(...)

The time complexity of this loop then becomes O(n²) * O(n) = O(n * n²) = O(n³). Here we are simply multiplying the time complexity of the operation by the number of times this operation executes. The running time of a loop is at most the running time of the statements inside the loop multiplied by the number of iterations. A single nested loop, that is, one loop nested inside another loop, will run in O(n²) time, assuming both loops run n times. For example:

    for i in range(0,n):
        for j in range(0,n):
            #statements

Each statement takes a constant time, c, and is executed n * n times, so we can express the running time as c * n * n = cn² = O(n²).

For consecutive statements within nested loops, we add the time complexities of each statement and multiply by the number of times the statement is executed. For example:

    n = 500    #c0
    #executes n times
    for i in range(0,n):
        print(i)    #c1
    #executes n times
    for i in range(0,n):
        #executes n times
        for j in range(0,n):
            print(j)    #c2

This can be written as c0 + c1n + c2n² = O(n²).
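
To make this counting concrete, here is a sketch (the counter and the choice of n = 500 are purely illustrative) that replaces each constant-time print with a tally; the single loop contributes n counts and the nested loop contributes n * n counts:

    n = 500
    count = 0

    for i in range(0,n):        # this body executes n times
        count += 1              # stands in for the c1 statement
    for i in range(0,n):
        for j in range(0,n):    # this body executes n * n times
            count += 1          # stands in for the c2 statement

    print(count)                # 500 + 500 * 500 = 250500, dominated by the n * n term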

We can define (base 2) logarithmic complexity as reducing the size of the problem by half in constant time. For example, consider the following snippet:

    i = 1
    while i <= n:
        i = i * 2
        print(i)

Notice that i doubles on each iteration. If we run this with n = 10, we see that it prints out four numbers: 2, 4, 8, and 16. If we double n, it prints out five numbers. With each subsequent doubling of n, the number of iterations increases by only 1. If we assume the loop runs for k iterations, we can write this as follows: after k iterations, i = 2^k, and the loop stops once 2^k > n, that is, once k exceeds log2(n).

From this we can conclude that the total time = O(log n).
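
As a quick check (the helper name iterations and the sample values of n are just for illustration), the following sketch counts how many times the loop body runs for successively doubled values of n; each doubling of n adds only one more iteration:

    def iterations(n):
        # count how many times the doubling loop body executes for a given n
        i, k = 1, 0
        while i <= n:
            i = i * 2
            k += 1
        return k

    for n in (10, 20, 40, 80, 160):
        print(n, iterations(n))    # 4, 5, 6, 7, and 8 iterations respectively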

Although Big O is the most commonly used notation in asymptotic analysis, there are two other related notations that should be briefly mentioned: Omega notation and Theta notation.
