
Is this Haskell code equivalent to this Python code?

I'm trying to port a Python program to Haskell, and I'm fairly new to NumPy (which the Python program uses), so I'm wondering why this code isn't equivalent. Here's my Haskell code:

data NNetwork = NNetwork { nlayers :: Int
                         , sizes   :: [Int]
                         , biases  :: [[Float]]
                         , weights :: [[Float]] }
                deriving (Show, Ord, Eq)

buildNetwork :: [Int] -> NNetwork
buildNetwork sizes = NNetwork { nlayers = length sizes
                              , sizes   = sizes
                              , biases  = map (\y -> replicate y (sig . toFloat . rands $ y)) sizes
                              , weights = map (\y -> replicate y (toFloat $ rands y)) sizes }

feedforward :: NNetwork -> Float -> [[Float]]
feedforward net a = map (equation a) (zip (weights net) (biases net))

toFloat x = fromIntegral x :: Float

sig :: Float -> Float
sig a = 1 / (1 + exp (-a))

rands :: Int -> Int
rands x = (7 * x) `mod` 11

equation :: Float -> ([Float], [Float]) -> [Float]
equation a (w, b) = map sig $ zipWith (+) (dot w (rep w a)) b
  where dot = zipWith (*)
        rep a b = replicate (length a) b

And the original Python code:

import numpy as np

class Network(object):

    def __init__(self, sizes):
        self.num_layers = len(sizes)
        self.sizes = sizes
        self.biases = [np.random.randn(y, 1) for y in sizes[1:]]
        self.weights = [np.random.randn(y, x)
                        for x, y in zip(sizes[:-1], sizes[1:])]

    def feedforward(self, a):
        """Return the output of the network if "a" is input."""
        for b, w in zip(self.biases, self.weights):
            a = sigmoid(np.dot(w, a)+b)
        return a

def sigmoid(z):
    return 1.0/(1.0+np.exp(-z))

I'm trying to port a very simple neural network program from Python to Haskell, because I enjoy Haskell much more. I'm also worried that I'm doing something wrong, because the Haskell code is far more verbose.

Thanks!

First of all, note that the Python version lacks the equivalent of deriving (Show, Eq, Ord): try implementing the corresponding __magic__ methods (__repr__, __eq__, __lt__, and so on) and see how many lines of code that adds. Without them, ==, <=, and >, as well as print on a Network instance, make little to no sense.
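For a sense of scale, this is roughly what that one deriving clause spares you from writing by hand on the Haskell side (a sketch: GHC's generated instances behave like these, though its Show output is formatted slightly differently):

-- These instances replace the deriving clause rather than supplement it.
instance Eq NNetwork where
  a == b = (nlayers a, sizes a, biases a, weights a)
        == (nlayers b, sizes b, biases b, weights b)

instance Ord NNetwork where
  compare a b = compare (nlayers a, sizes a, biases a, weights a)
                        (nlayers b, sizes b, biases b, weights b)

instance Show NNetwork where
  show n = "NNetwork {nlayers = " ++ show (nlayers n)
        ++ ", sizes = "   ++ show (sizes n)
        ++ ", biases = "  ++ show (biases n)
        ++ ", weights = " ++ show (weights n) ++ "}"

The Python side would similarly need __repr__, __eq__, and the ordering methods (or functools.total_ordering) to match.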

The verbosity mainly comes from the type signatures. You can also move rands into a where block under buildNetwork, and get rid of toFloat entirely by replacing every call to it with a plain fromIntegral, no type annotation needed (the field types of NNetwork already pin the result down to Float). Plus perhaps some other tiny refactorings.

In general, you can expect some things to be somewhat more verbose even in languages that are, on the whole, much more concise. As your neural network program grows into a more substantial code base, I'm sure Haskell will come out less verbose than Python, setting aside that Python's neural network libraries may be more mature than their (possibly non-existent) Haskell counterparts. With those changes applied:

data NNetwork = NNetwork { nlayers :: Int
                         , sizes   :: [Int]
                         , biases  :: [[Float]]
                         , weights :: [[Float]] }
                deriving (Show, Ord, Eq)

buildNetwork sizes =
  NNetwork { nlayers = length sizes
           , sizes   = sizes
           , biases  = map (\y -> replicate y (sig . fromIntegral . rands $ y)) sizes
           , weights = map (\y -> replicate y (fromIntegral . rands $ y)) sizes }
  where rands x = (7 * x) `mod` 11

feedforward net a = map (equation a) (zip (weights net) (biases net))

sig a = 1 / (1 + exp (-a))

equation a (w, b) = map sig $ zipWith (+) (dot w rep) b
  where dot = zipWith (*)
        rep = replicate (length w) a
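A minimal usage sketch, not from the original post (the layer sizes [2, 3] and the input 0.5 are arbitrary):

main :: IO ()
main = do
  let net = buildNetwork [2, 3]   -- tiny toy network
  print (nlayers net)             -- prints 2
  print (feedforward net 0.5)     -- one list of activations per layer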

You can also do some micro-refactorings in buildNetwork to remove minor duplication, but that would just shorten the lines, and would possibly make the code less readable to a domain expert:

buildNetwork sizes =
  NNetwork { nlayers = length sizes
           , sizes   = sizes
           , biases  = nameMe sig
           , weights = nameMe id }
  where nameMe fn = map (\y -> replicate y (fn (y' y))) sizes
        y' y      = fromIntegral (7 * y `mod` 11)
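If you want to check that such a refactoring preserves behaviour, the derived Eq instance makes that cheap. A sketch, assuming the refactored version is temporarily renamed to the hypothetical buildNetwork' so both definitions can coexist:

-- buildNetwork' is a hypothetical name for the refactored version above.
refactorPreservesBehaviour :: Bool
refactorPreservesBehaviour =
  buildNetwork [2, 3, 4] == buildNetwork' [2, 3, 4]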
