1.8 Comparing Algorithms

Let us suppose that we perform an analysis of the (worst case) time complexity of algorithms A and B to obtain functions f_A(n) and f_B(n). How do we compare the two functions to arrive at a decision on which of A or B to implement? Figure 1.3 illustrates two of the ways that f_A(n) and f_B(n) might compare with each other. In the first case, the decision would seem to be clear cut in favour of A because f_A(n) is lower (has smaller values) than f_B(n) for every value of n. However, it is less obvious in the second case because it depends on which values of the input size n we care most about.

Figure 1.3: Two possible ways in which complexity functions might compare with each other.

In general, we shall make a simplifying assumption that there is, in principle, no upper bound on the size of the input, and that we care how an algorithm performs on large values of n. Furthermore, we do not want our deliberations to be clouded by differences in algorithm implementation. There are many factors that affect the actual resource usage when an algorithm is implemented and executed on a computer. The processor speed alone, for example, can change the amount of time a program takes to run on a particular input size. However, we want to compare algorithms in a manner that is independent of the technology used to implement them. In other words, when we say today that algorithm A has a "lower time complexity than" algorithm B, we want that statement to still be true 10 years from now, despite the inevitable progress in technology. That is one reason that it is preferable to count the number of instructions that are being executed, rather than attempt to accumulate the actual latencies of each operation. But more importantly, it means that we want a way of desensitizing our analysis method to the absolute differences in the values of the functions representing resource usage.
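The second situation in Figure 1.3, where the better algorithm depends on the input size, is easy to reproduce numerically. In the sketch below the two cost functions are invented for illustration (they are not from the text): one has a large constant factor but slow growth, the other a small constant but fast growth, and we locate the input size where the slow-growing one takes over.

```python
# Hypothetical worst-case instruction counts for two algorithms.
# These particular formulas are illustrative only, not from the text.
def f_A(n):
    return 100 * n      # large constant factor, linear growth

def f_B(n):
    return n * n        # small constant factor, quadratic growth

# For small n, B is cheaper; past the crossover point, A wins forever.
crossover = next(n for n in range(1, 10**6) if f_A(n) < f_B(n))
print(crossover)  # 101: the first n where 100n < n^2
```

For any input size beyond the crossover, A is the better choice, which is why the analysis concentrates on large n.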
To explain this desensitization further, suppose the difference between the time complexity functions for two algorithms, computed at a certain input size, is 100. Is that a large value? Is it sufficiently large for us to conclude that one algorithm is better than another, at least on an input of the size considered? The answer depends on the technology of the implementation. If that 100 represents a number of computer operations, then in today's world a single operation takes less than 1 nanosecond (a billionth of a second), so 100 of them occur faster than you can blink your eyes. Most people would consider so small a difference in time insufficient to conclude that one of the algorithms was better; rather, it might be more useful to think of them as being equivalent, at least for the input size considered. The challenge, then, is how does one define this "desensitization" of analysis so that we can still compare algorithms, but without relying on the absolute values being output from the complexity functions? It turns out that mathematics has already provided us with a way to carry out this type of comparison of functions. It is called Asymptotic Analysis because it compares functions based on their behaviour on very large input values.

1.8.1 Asymptotic Analysis

Let f and g be functions that could arise from a (time) complexity analysis of an algorithm. That means that both f and g will map natural numbers (non-negative whole numbers), representing input sizes, to real numbers. (According to the analyses we have done so far, these functions would also output whole numbers, but we allow for functions that arise from analyses in measurable units such as length of time.) Let n denote the input to each of these functions (so n represents the size of the input to the algorithms whose analyses produced f and g). We shall concentrate on the relative values of f and g for large values of n. In particular, we examine how f changes as n increases, and compare that growth of f with the growth of g.
We shall consider them to be significantly different only if the ratio of their corresponding values on increasing values of n tends either to infinity or to zero. That leads to a trichotomy for the tendency of f(n)/g(n) as n increases without bound: the ratio will tend to 0, to infinity, or to some finite non-zero number. The first result implies that f grows significantly slower than g, the second implies that f grows significantly faster than g, and the third implies that they have equivalent growth rates.

Let v = lim_{n→∞} f(n)/g(n), that is, f(n)/g(n) → v as n → ∞. If v = 0 then f(n) grows much slower than g(n). Conversely, if v = ∞ then f(n) grows much faster than g(n). On the other hand, if v is neither zero nor infinity, then f(n) and g(n) grow at comparable rates. If v is close to 1, then f(n) and g(n) are approximately equivalent for large values of n. We shall take this notion of equivalence one step further though, and treat f(n) and g(n) as equivalent even if v is very large, so long as it is finite. The evaluation of v to zero, finite, or infinite is so important that special notations have been invented to represent the various situations. We shall focus on the three most common.

• Definition 1-6 f(n) = O(g(n)): If v is not infinite, then we say that f(n) = O(g(n)), pronounced "f (of n) is big-O of g (of n)", to mean that f(n) grows no faster than g(n).

• Definition 1-7 f(n) = Ω(g(n)): If v is not zero, then we say that f(n) = Ω(g(n)), pronounced "f (of n) is big-omega of g (of n)", to mean that f(n) grows no slower than g(n).

• Definition 1-8 f(n) = Θ(g(n)): If v is neither zero nor infinite, then we say that f(n) = Θ(g(n)), pronounced "f (of n) is theta of g (of n)", to mean that f(n) grows "approximately", that is, within a constant factor, at the same rate as g(n). Note that this case is the intersection of the preceding two cases. In other words, if f(n) = Θ(g(n)), then both f(n) = O(g(n)) and f(n) = Ω(g(n)).

We can regard O, Ω, Θ as relations on functions that are analogous to the relations ≤, ≥, = on numbers. Note that even if v is a non-zero fraction less than 1, we still say that f grows no faster than g. Technically, what we mean is that f grows no faster than some constant multiplied by g. This is consistent with our treating all finite values of v as if f and g have equivalent growth rates.
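The limiting ratio f(n)/g(n) can be estimated numerically by sampling it at increasingly large n. A small sketch (the sample functions are chosen purely for illustration):

```python
def ratio_trend(f, g, ns=(10, 10**3, 10**6, 10**9)):
    """Sample f(n)/g(n) at increasingly large n to guess its limit."""
    return [f(n) / g(n) for n in ns]

# n grows strictly slower than n^2: the ratio tends to 0.
print(ratio_trend(lambda n: n, lambda n: n * n))
# 3n^2 + n and n^2 grow at comparable rates: the ratio tends to 3.
print(ratio_trend(lambda n: 3 * n * n + n, lambda n: n * n))
```

The first trend heads towards 0, signalling O but not Θ; the second settles near the finite non-zero value 3, signalling Θ.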
When we write f = Θ(g) we sometimes say that "f(n) grows on the order of g(n)" or that "f(n) has order of growth g(n)". The order of growth of a function is the rate at which it increases, up to a constant, with increasing input size.

Alternative Definitions

Many texts use the following definitions of O, Ω and Θ, which are equivalent to the ones given above. In this course, you may use whichever you are more comfortable with.

• Definition 1-9 f(n) = O(g(n)): There are constants n0 and c1 such that for all n ≥ n0, f(n) ≤ c1·g(n).

• Definition 1-10 f(n) = Ω(g(n)): There are constants n0 and c2 such that for all n ≥ n0, f(n) ≥ c2·g(n).

• Definition 1-11 f(n) = Θ(g(n)): f(n) = O(g(n)) and f(n) = Ω(g(n)). That is, there are constants n0, c1 and c2 such that for all n ≥ n0: f(n) ≤ c1·g(n) and f(n) ≥ c2·g(n).

Figure 1.4 illustrates how the graphs of the functions f and g might compare with each other in each of the cases when f = O(g), f = Ω(g) and f = Θ(g).

Figure 1.4: Illustration of f = O(g), f = Ω(g) and f = Θ(g).

A Word on Notation

There is a notational anomaly that should be discussed. Technically speaking, O(g(n)) actually refers to a set of functions: each function in that set has an order of growth that is upper bounded by that of g(n). For example, O(n^2) actually refers to the infinite set of functions that do not grow asymptotically faster than n^2; examples of functions in this set include n, 2n + 1, √n, n^1.5, 1, 10000, n lg n, and lg n. Likewise, Ω(g(n)) also refers to a set of functions, but now it is those functions whose order of growth is lower bounded by g(n) (i.e. those that grow at least as fast as g). It follows that Θ(g(n)) is the set that is the intersection of O(g(n)) and Ω(g(n)). When we write "f(n) = O(g(n))", we actually mean that f(n) ∈ O(g(n)). However, this notational anomaly is now deeply entrenched in the literature and habits of computer scientists, so we have to learn to live with it.

Another point to note regarding notation is that we sometimes use Θ(f) to refer to a particular function whose order of growth is equivalent to that of f. This use is sometimes referred to as an anonymous function.
We use anonymous functions in contexts where the only important feature of the function is its order of growth. Both O and Ω can be used in place of Θ in this context; the difference in meaning is in the bounds on the order of growth of the anonymous function. For example, O(n), as an anonymous function, refers to some function whose order of growth does not exceed that of n, so it could have order of growth n, or lg n, or even 1, or any other function in the set O(n). You can often detect the intended use of the notation O(f) from the context in which it is used: since one interpretation is as a set and the other is as an individual member of that set, there is usually little confusion about which meaning is intended. Still, for persons new to the concept, it can be somewhat confusing to follow at first.

1.8.2 Properties of Asymptotic Bounds

A number of properties of asymptotic bounds will be useful in applying them to real problems. We state these properties first, and then give a number of examples to show how they are used in practice. In the following, f and g are functions that map the natural numbers into the positive reals.

1. Θ(f)·Θ(g) = Θ(f·g). The order of growth of the product of two (anonymous) functions is the product of their individual orders of growth.

2. Θ(f) + Θ(g) = Θ(f + g) = Θ(max(f, g)), where max(f, g) refers to the function that has the higher order of growth. Adding a function to another of lower order of growth does not change its order of growth. This property is often generalised to the sum of an arbitrary (but constant) number of functions. The resulting order of growth is still that of the individual function with the highest order of growth. Be careful though; it does not apply if the number of functions being summed is somehow dependent on the input size. For example, if we add any constant number of functions, each of order n, we get a sum that is still of order n. However, if we added n such functions, we would get a sum of order n^2.

3. c·Θ(f) = Θ(c·f) = Θ(f), for any constant c.
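The caveat in property 2, that summing a number of functions which itself grows with n changes the order of growth, can be seen numerically. This sketch (the choice of 5 copies is arbitrary) compares a constant number of order-n terms with n of them:

```python
def sum_constant_copies(n, c=5):
    """Sum a fixed number c of functions, each of order n: still order n."""
    return sum(n for _ in range(c))   # equals c*n

def sum_n_copies(n):
    """Sum n functions, each of order n: the result is of order n^2."""
    return sum(n for _ in range(n))   # equals n*n

n = 1000
print(sum_constant_copies(n) / n)  # stays at the constant 5
print(sum_n_copies(n) / n)         # grows like n itself
```

Dividing by n exposes the difference: the first ratio is a constant regardless of n, while the second keeps growing, so the second sum cannot be Θ(n).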
Constant multipliers of a function do not change its order of growth.

Each property can be proved by making deductions from the definitions of Θ, but for this course, we shall take it on faith that these properties are indeed correct. You should also note that these properties hold for O and Ω; that is, replacing Θ everywhere with either one of these symbols still yields true statements.

Some Example Uses

The best way to appreciate the value of these properties is to see them in use. Their most common purpose is to simplify order of growth expressions. For example, suppose that after performing a time complexity analysis, we get that T(n), the time complexity for some algorithm, is given by

T(n) = Θ(2n^2 + 2n + 50).

The shape of this function is not immediately obvious, so to the uninitiated it does not immediately convey the order of growth of T(n). We can apply rule 2 (the generalised version) to simplify the expression for T(n) to a simpler one, but first we must convince ourselves that the term 2n^2 has the highest order of growth. It is a fact that in any polynomial function, the highest order term dominates the order of growth. You need only compute the limit of the ratio of any of the terms to the highest to see that all other terms vanish in comparison to the largest as the input size increases without bound. See Lemma 1-4 for a formal treatment of this fact. So we can reduce the expression for T(n) to T(n) = Θ(2n^2), but we can go even further. Property 3 tells us that this can be reduced to simply T(n) = Θ(n^2). In general, the order of growth of any polynomial is the same as that of its highest order term.

Lemma 1-4. If f and g are both polynomial functions, the one with the higher degree has the higher order of growth.

Proof: Let the degree of f be p and that of g be q. Without loss of generality, let g be the polynomial with the higher degree, that is, q > p.
We may write f(n) = Σ_{i=0}^{p} a_i·n^i and g(n) = Σ_{i=0}^{q} b_i·n^i, where a_0, ..., a_p are the coefficients of f and b_0, ..., b_q are the coefficients of g. Now consider the ratio f(n)/g(n). Dividing every term of both the numerator and the denominator by n^p gives

f(n)/g(n) = (a_0·n^(-p) + a_1·n^(1-p) + ... + a_p) / (b_0·n^(-p) + b_1·n^(1-p) + ... + b_q·n^(q-p)).

We pause here to make some observations about the values in the numerator and denominator: we are trying to compute the limiting value of f(n)/g(n) as n → ∞. We see that for a very large value of n, all terms except the last of the numerator become vanishingly small (a large number raised to a negative exponent yields a very small number). Therefore, the numerator approaches the value of a_p. In the denominator, there is at least one term with a positive exponent of n, since q > p; therefore for large values of n the denominator's value is very large. Since the numerator approaches a constant, but the denominator increases without bound, it follows that their ratio vanishes as n increases without bound. That is, the limit of the ratio f(n)/g(n) is 0 as n → ∞. It follows that f(n) has lower order of growth than g(n). ∎

Example 1-1 All of the following equivalences of polynomial orders of growth follow from the preceding discussion:

• Θ(n^2 + 10000n + 100000) = Θ(n^2)
• Θ(10000) = Θ(1)
• Θ(10^7·n) = Θ(n)
• Θ(n(n+1)) = Θ(n^2 + n) = Θ(n^2)
• Θ(1 + 2 + ... + n) = Θ(n^2) (see the approximation of polynomial sums given in the Mathematical Background section for the justification)

Example 1-2 Exponential functions grow much faster than any polynomial, and in general, summing an exponential with a polynomial yields an exponentially growing function, regardless of the degree of the polynomial. The base of an exponential matters: the larger the base, the larger the order of growth.

• Θ(2^n + n^4) = Θ(2^n)
• Θ(4^n) ≠ Θ(2^n): 4^n = Ω(2^n), but 4^n ≠ O(2^n). (Proof of this is left as an exercise to the reader.)

Example 1-3 Remember that exponentials are to polynomials what polynomials are to logarithms. So summing a (non-constant) polynomial with any power of a logarithm of a polynomial yields the polynomial.

• Θ(n + lg n) = Θ(n)
• Θ(n + lg^10 n) = Θ(n)
• Θ(n + lg^2 n) = Θ(n)

Example 1-4 Be wary of exponentials with logarithmic powers, or logarithms of exponential functions. Remember that the logarithm is the inverse function of the exponential, so when these functions are combined they eliminate each other, and you must be careful to pay attention to the function that is yielded as a result.

• Θ(2^(lg n)) = Θ(n)
• Θ(lg(2^n) + n) = Θ(n + n) = Θ(n)
• Θ(2^(2 lg n) + n) = Θ(n^2 + n) = Θ(n^2)

Property 1 (the product property) sometimes comes in handy when we want to compare complicated functions that have been composed as products of other functions. For example, the function n lg n is not a polynomial, but how does it compare with them? It is larger than n and smaller than n^2, but does it grow in such a way as to have a different order of growth from either? We can use the product property as follows: n lg n = n × lg n. Now lg n = O(n) but n ≠ O(lg n). So n × lg n = n × O(n) = O(n^2), but n^2 ≠ O(n lg n). Also, lg n = Ω(1), but 1 ≠ Ω(lg n). So n × lg n = n × Ω(1) = Ω(n), but n ≠ Ω(n lg n). The upshot is that the function n lg n really falls in between the orders of growth of the polynomials n and n^2.

1.9 The Master Method

This section discusses a method for solving recurrence relations without drawing recursion trees. It only works for a restricted subset of recurrence relations, but when it is applicable, it reduces the task of solving recurrence relations to the application of one of three options. We discuss this powerful technique in the following sections.

1.9.1 A Generic Recursion Tree

In this section we show how to use a recursion tree to solve a general recurrence relation of the form T(n) = aT(n/b) + f(n). We begin, as usual, by constructing a recursion tree to keep track of the current input size and the amount of overhead at each node (see Figure 1.6). The objective is to sum the overheads over the entire tree. The expression we obtain (which will be in terms of n) is the (closed form) expression of T(n). The approach we use is to calculate the total work done per level of the tree. Clearly the overhead at the root node is f(n).
Now if we calculate the work done at the level just below the root, we see that there are a nodes, each of which incurs an overhead cost of f(n/b). So the total amount of work at that level is a·f(n/b). Repeating this reasoning for the next level, we see that the total amount of work for that level is a^2·f(n/b^2). In general, at depth i the total work done is a^i·f(n/b^i). Note though, that the overhead for the leaf nodes is constant, Θ(1), so the total amount of work at the leaf level (the deepest depth) is on the order of the number of leaves. In order to complete this computation, we need two bits of information: how deep is the tree, and how many leaves are at that depth?

Figure 1.5: Scaled sketch of common functions.

Figure 1.6: The recursion tree for solving the recurrence T(n) = aT(n/b) + f(n).

Actually, the two facts are linked, because each level has a times more nodes than the previous level. If we define the root level to be depth 0, then we can say that at depth i we have a^i nodes. So if we can calculate d, the (maximum) depth of the tree, then the number of leaves is a^d. So, really, the only non-intuitive fact is: what is d, the depth of the tree? Let us consider what would make the tree stop branching, for that is how we know we are at depth d. If the value of n becomes so small that we have a direct answer for T(n), then we do not produce any children for that node in the recursion tree. So that is the determinant of a leaf node. Now consider how the value of the input to T is changing at each level. At the root node, it is n; at the level below, it is n/b; so in general at level i it is n/b^i. So at level d it would be n/b^d. In other words, at the leaf level the value of the input to T is n/b^d. From the original definition of the problem, we know that we are at a leaf level if the input to T is at most 1. So we can write n/b^d ≤ 1, a relation that allows us to relate d (our unknown) to the b and n that we started with.
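The relation n/b^d ≤ 1 can also be checked by direct simulation: repeatedly divide the input size by b until it is at most 1, counting the levels as we go. A small sketch:

```python
import math

def recursion_depth(n, b):
    """Count how many divisions by b it takes for n to reach 1 or below."""
    d = 0
    while n > 1:
        n /= b
        d += 1
    return d

# The count agrees with the logarithm, e.g. 1024 halves to 1 in 10 steps.
print(recursion_depth(1024, 2))      # 10
print(recursion_depth(81, 3))        # 4
print(math.ceil(math.log(1024, 2)))  # 10
```

The number that comes out is exactly the smallest d satisfying n/b^d ≤ 1, which is what the algebra that follows solves for in closed form.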
Solving for d:

n/b^d ≤ 1
⇒ n ≤ b^d
⇒ lg n ≤ d·lg b
⇒ d ≥ lg n / lg b = log_b n.

We conclude that for a given initial input size n, the maximum depth of the recursion tree, when the input size reduces by a factor of b at each level, is (the closest integer to) log_b n. The analysis of the number of nodes at a level tells us that the number of leaves is therefore a^(log_b n). Now we are ready to resume our computation of T(n). The total work at the leaf level is:

Number of leaves × Θ(1) = Θ(Number of leaves) = Θ(a^(log_b n)) = Θ(n^(log_b a)),

using the identity a^(log_b n) = n^(log_b a). The last level of internal nodes (i.e., with overhead related to f(n)) is at level d - 1. To compute the total overhead incurred by all of these internal nodes, we compute the sum of the total work at each level. Recall that the total work done at level i is a^i·f(n/b^i).

Work of internal nodes = Σ_{i=0}^{d-1} a^i·f(n/b^i).

Now, T(n) is given by summing these two expressions:

T(n) = Σ_{i=0}^{d-1} a^i·f(n/b^i) + Θ(n^(log_b a)).   (1.1)

Unfortunately, we are unable to resolve this expression any further without more information on f(n). It turns out that for a relatively large collection of functions that f(n) could be, we can actually reduce T(n) to something more tractable than Equation 1.1. The method by which we do this is called the Master Method (based on a theorem sometimes called the (Little) Master Theorem), which we discuss next.

1.9.2 The Master Theorem

When faced with a recurrence relation of the type analysed in the previous section, the Master Method is the preferred way to solve it if f(n) falls into one of three categories (the three categories do not cover all the possibilities). The recursion tree will always yield correct answers, but for most problems actually encountered in analysing algorithms, the Master Method provides a quicker way to solve the recurrences that arise.

(Little) Master Theorem: Given the recurrence

T(n) = aT(n/b) + f(n),

where a, b are constants and f(n) is an arbitrary function, let E = log_b a (E is called the critical exponent). Then:

Case 1: If for some ε > 0, f(n) = O(n^(E-ε)), then T(n) = Θ(n^E).
Case 2: If f(n) = Θ(n^E), then T(n) = Θ(f(n)·lg n).

Case 3: If for some δ > 0, f(n) = Ω(n^(E+δ)), but for some ε > 0, f(n) = O(n^(E+ε)), then T(n) = Θ(f(n)).

The cases of the Master Theorem can be directly related to the preceding analysis of recursion trees. Case 1 corresponds to the situation when the work done at the leaves overwhelms that done at the internal nodes, so the leaf work Θ(n^E) dominates the total. Case 2 corresponds to the situation when the work done at the root node is comparable to the amount of work that the leaves do. One way to see this is to recognise that f(n) = Θ(n^E) means that the amount of work done at the root node (i.e. f(n)) is of the same order of growth as that done at the leaves (i.e. Θ(n^E), from the discussion in Case 1). Recall that the amount of work done at depth i is a^i·f(n/b^i). Since f(n) = Θ(n^E), we can substitute for f(n) to re-express the amount of work done at depth i:

a^i·f(n/b^i) = a^i·Θ((n/b^i)^E) = Θ(a^i·n^E / b^(iE)) = Θ(a^i·n^E / a^i) = Θ(n^E),

using b^(iE) = (b^(log_b a))^i = a^i. That is, the amount of work done at each level is the same, and furthermore, it is the same as that done at the leaf level; summed over the Θ(lg n) levels, this gives T(n) = Θ(f(n)·lg n).

Case 3 covers the situation when the amount of work done at the root node is more than that done at the leaves. However, not all such functions fall into Case 3, because if the decrease in total work per level is too slow, then the sum of the work over all levels will yield a function that has higher order of growth than f(n). If the total work per level decreases rapidly enough, then the root node's overhead dominates, and so T(n) = Θ(f(n)).

1.9.3 Examples

Here are a few worked examples of recurrence relations that are solved using the Master Method. In each case, assume that T(n) = 1 for n ≤ 1.

Example 1-5 T(n) = 4T(n/2) + n.

Now, the overhead function is f(n) = n, and the critical exponent is E = log_2 4 = 2. Comparing n with n^E, it is clear that f(n) = n = O(n^2) = O(n^E). However, this by itself is not sufficient to conclude which case applies. We need to show that f(n) is strictly smaller than n^E, so we look for an ε > 0 that allows f(n) = O(n^(E-ε)). For definiteness, let ε = 0.5; then E - ε = 2 - 0.5 = 1.5, and it is clear that f(n) = n = O(n^1.5) = O(n^(E-ε)), which puts us in Case 1 of the Master Theorem. So we conclude that T(n) = Θ(n^E) = Θ(n^2).
It is reassuring to see that we obtain the same order of growth as we did when we solved this problem using a recursion tree earlier.

Example 1-6 T(n) = 3T(n/2) + n^2.

The critical exponent is E = log_2 3. We don't need to actually calculate the numeric value; it is enough to observe that 1 < E < 2, since lg is an increasing function and 3 falls between 2 = 2^1 and 4 = 2^2. Now we compare the overhead function, f(n) = n^2, with n^E = n^(lg 3). Since 2 > lg 3, it is clear that f(n) = Ω(n^E); furthermore, if we let δ = (2 - lg 3)/2, then it is clear that δ > 0 and that lg 3 + δ < 2, so f(n) = Ω(n^(E+δ)). We seem to be on track to proving that Case 3 holds, but we need to also find an ε > 0 satisfying the condition specified. That is, we need to be able to add an ε to E to get a value larger than 2 (the exponent of f(n)). Let ε = 1; then ε > 0 and E + ε = 1 + lg 3 > 2, so f(n) = n^2 = O(n^(E+ε)). We have satisfied the criteria for Case 3, so we conclude that T(n) = Θ(f(n)) = Θ(n^2).

Example 1-7 T(n) = 2T(n/2) + n.

The critical exponent is E = log_2 2 = 1. The overhead function is f(n) = n. Comparing it with n^E = n^1 = n, we see that f(n) = Θ(n) = Θ(n^E), so we have established that Case 2 holds. So we conclude that T(n) = Θ(f(n)·lg n) = Θ(n lg n).

Master Theorem: Practice Problems and Solutions

Master Theorem

The Master Theorem applies to recurrences of the following form:

T(n) = aT(n/b) + f(n)

where a ≥ 1 and b > 1 are constants and f(n) is an asymptotically positive function. There are 3 cases:

1. If f(n) = O(n^(log_b a - ε)) for some constant ε > 0, then T(n) = Θ(n^(log_b a)).
2. If f(n) = Θ(n^(log_b a) log^k n) with k ≥ 0, then T(n) = Θ(n^(log_b a) log^(k+1) n).
3. If f(n) = Ω(n^(log_b a + ε)) with ε > 0, and f(n) satisfies the regularity condition, then T(n) = Θ(f(n)).
   Regularity condition: a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently large n.

Practice Problems

For each of the following recurrences, give an expression for the runtime T(n) if the recurrence can be solved with the Master Theorem. Otherwise, indicate that the Master Theorem does not apply.

1. T(n) = 3T(n/2) + n^2
2. T(n) = 4T(n/2) + n^2
3. T(n) = T(n/2) + 2^n
4. T(n) = 2^n T(n/2) + n^n
5. T(n) = 16T(n/4) + n
6. T(n) = 2T(n/2) + n log n
7. T(n) = 2T(n/2) + n / log n
8. T(n) = 2T(n/4) + n^0.51
9. T(n) = 0.5T(n/2) + 1/n
10. T(n) = 16T(n/4) + n!
11. T(n) = √2·T(n/2) + log n
12. T(n) = 3T(n/2) + n
13. T(n) = 3T(n/3) + √n
14. T(n) = 4T(n/2) + cn
15. T(n) = 3T(n/4) + n log n
16. T(n) = 3T(n/3) + n/2
17. T(n) = 6T(n/3) + n^2 log n
18. T(n) = 4T(n/2) + n / log n
19. T(n) = 64T(n/8) - n^2 log n
20. T(n) = 7T(n/3) + n^2
21. T(n) = 4T(n/2) + log n
22. T(n) = T(n/2) + n(2 - cos n)

Solutions

1. T(n) = 3T(n/2) + n^2 → T(n) = Θ(n^2) (Case 3)
2. T(n) = 4T(n/2) + n^2 → T(n) = Θ(n^2 log n) (Case 2)
3. T(n) = T(n/2) + 2^n → T(n) = Θ(2^n) (Case 3)
4. T(n) = 2^n T(n/2) + n^n → Does not apply (a is not constant)
5. T(n) = 16T(n/4) + n → T(n) = Θ(n^2) (Case 1)
6. T(n) = 2T(n/2) + n log n → T(n) = Θ(n log^2 n) (Case 2)
7. T(n) = 2T(n/2) + n / log n → Does not apply (non-polynomial difference between f(n) and n^(log_b a))
8. T(n) = 2T(n/4) + n^0.51 → T(n) = Θ(n^0.51) (Case 3)
9. T(n) = 0.5T(n/2) + 1/n → Does not apply (a < 1)
10. T(n) = 16T(n/4) + n! → T(n) = Θ(n!) (Case 3)
11. T(n) = √2·T(n/2) + log n → T(n) = Θ(√n) (Case 1)
12. T(n) = 3T(n/2) + n → T(n) = Θ(n^(lg 3)) (Case 1)
13. T(n) = 3T(n/3) + √n → T(n) = Θ(n) (Case 1)
14. T(n) = 4T(n/2) + cn → T(n) = Θ(n^2) (Case 1)
15. T(n) = 3T(n/4) + n log n → T(n) = Θ(n log n) (Case 3)
16. T(n) = 3T(n/3) + n/2 → T(n) = Θ(n log n) (Case 2)
17. T(n) = 6T(n/3) + n^2 log n → T(n) = Θ(n^2 log n) (Case 3)
18. T(n) = 4T(n/2) + n / log n → T(n) = Θ(n^2) (Case 1)
19. T(n) = 64T(n/8) - n^2 log n → Does not apply (f(n) is not positive)
20. T(n) = 7T(n/3) + n^2 → T(n) = Θ(n^2) (Case 3)
21. T(n) = 4T(n/2) + log n → T(n) = Θ(n^2) (Case 1)
22. T(n) = T(n/2) + n(2 - cos n) → Does not apply. We are in Case 3, but the regularity condition is violated. (Consider n = 2πk, where k is odd and arbitrarily large. For any such choice of n, you can show that c ≥ 3/2, thereby violating the regularity condition.)
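For recurrences whose overhead is a pure polynomial, the three-case test is mechanical enough to sketch in code. The helper below (the name and interface are mine, not from the text) classifies T(n) = aT(n/b) + n^k by comparing k with the critical exponent log_b a; it deliberately does not attempt the log-factor or irregular cases from the practice problems.

```python
import math

def master_case(a, b, k):
    """Classify T(n) = a*T(n/b) + n**k by the Master Theorem.

    Only handles polynomial overheads f(n) = n**k, where comparing
    the exponent k with the critical exponent E = log_b(a) is exact.
    Returns a string describing Theta(T(n)).
    """
    E = math.log(a, b)                  # critical exponent log_b a
    if math.isclose(k, E):
        return f"Theta(n^{E:g} log n)"  # Case 2: f matches n^E
    if k < E:
        return f"Theta(n^{E:g})"        # Case 1: leaf work dominates
    return f"Theta(n^{k:g})"            # Case 3: root work dominates
                                        # (regularity holds for n**k)

print(master_case(4, 2, 1))   # 4T(n/2) + n
print(master_case(2, 2, 1))   # 2T(n/2) + n, the merge-sort shape
print(master_case(16, 4, 3))  # 16T(n/4) + n^3
```

For k > log_b a the regularity condition holds automatically, since a·(n/b)^k = (a/b^k)·n^k with a/b^k < 1, which is why Case 3 needs no extra check here.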
