From Wikipedia: a plot of the running time of the Euclidean algorithm to find gcd(x,y). Stare at the plot for a while; it has some fascinating fine structure.
What can be said about the average number of iterations needed to run the Euclidean algorithm on (x, y), where x and y are, say, selected u. a. r.¹ from {1, 2, ..., N}? It clearly can't grow any faster than logarithmically, since the Euclidean algorithm on input (x, y) takes at most log_φ max(x, y) + O(1) steps, where φ is the golden ratio; the worst case is achieved when the inputs are consecutive Fibonacci numbers. It turns out that the mean number of iterations on inputs of size at most N is in fact C log N for a constant C (with natural logarithms, C = 12 ln 2/π² ≈ 0.843); this is apparently a classical (1969) result, and a special case of a more general result of Viviane Baladi and Brigitte Vallée in their paper Euclidean algorithms are Gaussian (arXiv:math/0307062v4). So the Fibonacci numbers, which are the worst possible inputs to this algorithm, are only a constant factor slower than the average. Baladi and Vallée also show that the number of iterations is asymptotically Gaussian, with variance also of order log N.
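A quick way to see both claims is to count the division steps directly. Here's a minimal Python sketch (mine, not from the post): it checks the Fibonacci worst case against the log_φ bound, then Monte Carlo estimates the average over pairs drawn u. a. r. from {1, ..., N} and compares it with the classical (12 ln 2/π²) ln N prediction. The agreement is only asymptotic, so expect the empirical mean to match the prediction up to lower-order terms.

```python
import math
import random

def euclid_steps(x, y):
    """Number of division (modulus) steps the Euclidean algorithm takes on (x, y)."""
    steps = 0
    while y:
        x, y = y, x % y
        steps += 1
    return steps

# Worst case: consecutive Fibonacci numbers. The step count for
# (F(k+1), F(k)) grows like log_phi F(k+1), matching the bound above.
a, b = 1, 1
for _ in range(20):
    a, b = b, a + b  # b is now the larger of two consecutive Fibonacci numbers
phi = (1 + math.sqrt(5)) / 2
print(euclid_steps(b, a), math.log(b, phi))  # the two numbers should be close

# Average case: Monte Carlo estimate over pairs drawn uniformly at random
# from {1, ..., N}, against the classical (12 ln 2 / pi^2) ln N prediction.
N, trials = 10**6, 10**5
mean = sum(euclid_steps(random.randint(1, N), random.randint(1, N))
           for _ in range(trials)) / trials
print(mean, 12 * math.log(2) / math.pi**2 * math.log(N))
```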
1 "u. a. r." stands for "uniformly at random". This seems to be a semi-standard abbreviation.