#36 Ibix, Science Advisor
I have python 3 installed, but typing python invokes python2. I should probably change the behaviour...
Ibix said:
> I understood python loops to be slow and list comprehensions to be better.

The difference is not so much loops vs. comprehensions but how much code written in Python (as opposed to just built-in functions or operations, which are really C code running inside the interpreter) is executed on each iteration.
PeterDonis said:
> However, @Baluncore's algorithm exits early if it spots a failure of monotonicity, whereas yours will traverse the entire list each time, even if a failure in monotonicity occurs early in the traversal. That's probably why yours is slower.

It actually seems to be random noise. I set up a version that runs the two algorithms a thousand times on range(1000) and range(1000)[::-1] and the average is pretty close for monotonic cases. Looks like my two-or-three-runs-and-eyeball-it approach wasn't entirely reliable from a statistical point of view. Baluncore's approach is, as you note, way faster for non-monotonic cases since it exits early.
Ibix said:
> It actually seems to be quite a lot slower than Baluncore's approach as well

Bear in mind that you are doing FOUR list traversals (for...in, zip and two all's) instead of just one.
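By contrast, a single explicit loop visits each adjacent pair once and can bail out as soon as both directions are ruled out. This is a hypothetical sketch of that idea (not Baluncore's actual code, which isn't quoted in this excerpt):

```python
def is_monotonic_one_pass(lst):
    # Track whether the sequence could still be non-decreasing / non-increasing.
    rising = falling = True
    for a, b in zip(lst, lst[1:]):
        if a < b:
            falling = False
        elif a > b:
            rising = False
        if not (rising or falling):
            return False  # early exit: neither direction is possible any more
    return True
```

The early `return` is exactly the advantage discussed below: a comprehension or `all(...)` over pre-built intermediate lists cannot skip the remaining work this cheaply.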
Ibix said:
> which surprises me since I understood python loops to be slow and list comprehensions to be better

This is a common belief, partly based on misunderstanding and partly based on early implementations of Python which were really slow.
PeterDonis said:
> The difference is not so much loops vs. comprehensions but how much code written in Python (as opposed to just built-in functions or operations, which are really C code running inside the interpreter) is executed on each iteration.

Exactly this. So for instance finding the dot product of two vectors represented as numpy.arrays is implemented entirely in C and runs much quicker than iterating over plain lists, but when, as in this case, you are executing the same Python code to compare elements of a list, it doesn't matter whether that list is traversed via a list comprehension, an explicit for loop or functools.reduce. And where it is possible that the iteration might terminate early, the explicit for loop will always win because it is the only one you can break out of.

pbuk said:
> This is a common belief [...] partly based on early implementations of Python which were really slow.

That'll be it. I learned Python originally from an O'Reilly book well over fifteen years ago, so it's probably 20+ year old info now. How time flies. I guess I need to update more than my install.
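To illustrate the C-vs-Python point, here is a small timing sketch of my own (not code from the thread, and the vector length is an arbitrary choice) comparing a pure-Python dot-product loop with numpy's compiled one:

```python
import timeit
import numpy as np

n = 100_000
xs = list(range(n))
ys = list(range(n))
ax, ay = np.array(xs), np.array(ys)

def dot_loop():
    # Every iteration executes interpreted Python bytecode.
    total = 0
    for a, b in zip(xs, ys):
        total += a * b
    return total

def dot_numpy():
    # The whole traversal runs in compiled C inside numpy.
    return ax @ ay

print("loop :", timeit.timeit(dot_loop, number=10))
print("numpy:", timeit.timeit(dot_numpy, number=10))
```

Both compute the same value; on a typical CPython install the numpy version is faster by a couple of orders of magnitude, because none of the per-element work touches the interpreter.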
```python
from itertools import accumulate

def is_monotonic(lst):
    def all_eq(acc):
        return all(x == y for x, y in zip(lst, accumulate(lst, acc)))
    return all_eq(max) or all_eq(min)
```
accumulate is like reduce except it returns a lazy iterator of all the intermediate values (so a consumer such as all can short-circuit without computing the rest).
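A quick demonstration of what accumulate produces, which is why the comparison above works: a list equals its running maxima exactly when it is non-decreasing.

```python
from itertools import accumulate

lst = [3, 1, 4, 1, 5]
print(list(accumulate(lst, max)))  # running maxima: [3, 3, 4, 4, 5]
print(list(accumulate(lst, min)))  # running minima: [3, 1, 1, 1, 1]

# A list equals its running maxima iff it is non-decreasing:
nondec = [1, 2, 2, 3]
print(list(accumulate(nondec, max)) == nondec)  # True
```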