
How lazy is Haskell when dealing with complicated program flow?

I was wondering how smart/lazy Haskell is. Can I always be sure that Haskell will only do what is necessary to generate a certain output?

No.

Haskell specifies a denotational semantics for its core lambda-like calculus, and you can rely on that semantics. Additionally, there is a metatheory proof that a particular reduction order -- known colloquially as "lazy evaluation" -- realizes that semantics; and so many people use that as their mental model of how Haskell programs behave.
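For instance (a tiny illustration of my own, using undefined as a stand-in for a computation that never finishes): an expression whose value does not depend on a diverging subterm still has a perfectly good value under that semantics, and lazy evaluation finds it.

    -- fst never demands the second component of the pair, so the
    -- diverging value is never evaluated.
    main :: IO ()
    main = print (fst (42, undefined :: Int))   -- prints 42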

There are two broad categories of ways that a Haskell program may end up evaluating more than necessary:

  • The Haskell implementation may choose to evaluate more. GHC uses lazy evaluation in most places, but I believe it will use other evaluation orders for efficiency in some cases. You could also look at the Eager Haskell project, which is attempting to use another implementation strategy; or, in principle, an implementation would be within its rights to choose to speculatively fork some computations to another thread (and then throw away the results if they weren't needed).

  • The denotational semantics specified may demand more evaluation than "necessary". For example, one that occasionally trips up beginners:

      primes :: [Int]
      primes = 2 : filter prime [3,5..]

      prime :: Int -> Bool
      prime x = and [x `mod` p /= 0 | p <- primes, p < x]

    When checking whether 3 should be in the list primes, it is in fact not necessary to check any of the elements of primes past 2, because the sequence is strictly monotonically increasing. But Haskell is not (and does not try to be) smart enough to notice that; it will go straight on trying to generate the rest of primes in order to check them, and end up in an infinite loop instead of producing the list of primes. (A variant that bounds the search explicitly is sketched after this list.)

    An even smaller example: you could think that x && False is always False, but x will typically be evaluated anyway, because the semantics says the whole expression should be an infinite loop if x is one. (Contrast False && x, which typically does not result in evaluating x; this asymmetry also shows up in the sketch after this list.)
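Both points can be checked directly. Here is a minimal sketch of my own (primes' and prime' are illustrative names, not part of the original example): bounding the comprehension with p * p <= x gives the program an explicit stopping condition instead of hoping the compiler notices that primes is increasing, and the (&&) asymmetry is visible in GHCi.

    -- A bounded variant of the example above: takeWhile tells the program
    -- where to stop enumerating primes'.
    primes' :: [Int]
    primes' = 2 : filter prime' [3,5..]

    prime' :: Int -> Bool
    prime' x = and [x `mod` p /= 0 | p <- takeWhile (\p -> p * p <= x) primes']

    -- The (&&) asymmetry, as seen in GHCi:
    --   False && undefined   -- False; the right argument is never forced
    --   undefined && False   -- diverges (throws), because (&&) pattern
    --                        -- matches on its left argument first
    main :: IO ()
    main = print (take 10 primes')   -- [2,3,5,7,11,13,17,19,23,29]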

That said, when you say "complex structure", one thing that comes to mind is: does Haskell do the laziness thing even with custom data types that I define? That is, do complex structures like hash maps and balanced trees and kd-trees and so forth get treated lazily? The answer there is yes; there is nothing fundamentally special about any of the types in the Prelude except IO. Lists, booleans, Maybes, and so forth are lazy not because the compiler knows special things about them, but simply as a consequence of the reduction rules Haskell specifies and the way those types are declared.
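As a concrete (hypothetical) illustration, a user-defined infinite tree behaves just like an infinite list: only the parts that are actually demanded ever get built.

    -- Tree, depths, and leftmost are illustrative names, not library code.
    data Tree a = Node a (Tree a) (Tree a)

    -- An infinite tree in which every node is labelled with its depth.
    depths :: Int -> Tree Int
    depths n = Node n (depths (n + 1)) (depths (n + 1))

    -- Walk k steps down the left spine; only those nodes are ever constructed.
    leftmost :: Int -> Tree a -> [a]
    leftmost 0 _            = []
    leftmost k (Node x l _) = x : leftmost (k - 1) l

    main :: IO ()
    main = print (leftmost 5 (depths 0))   -- [0,1,2,3,4]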

Of course, there are ways to opt out of laziness. Some of the structures you will see on Hackage do that in various ways; but don't worry, it will usually be stated in their documentation.
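For reference, the usual language-level mechanisms look roughly like this (a minimal sketch of my own, not tied to any particular Hackage package): strict constructor fields force their contents when the value is built, and bang patterns force a value at each step.

    {-# LANGUAGE BangPatterns #-}

    -- Strict fields: both components are evaluated when a P is constructed.
    data P = P !Int !Int

    -- The bang pattern forces the accumulator on every step, so no chain
    -- of unevaluated thunks builds up.
    sumStrict :: [Int] -> Int
    sumStrict = go 0
      where
        go !acc []       = acc
        go !acc (x : xs) = go (acc + x) xs

    main :: IO ()
    main = print (sumStrict [1 .. 1000000])   -- 500000500000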
