
Wrapper for Keras Model in Spark

I have a Keras neural network, and I want to deploy this model using a wrapper in the Spark environment. So I tried the following tutorial here

import tensorflow as tf
import keras
from keras.models import Sequential
from keras.layers import Input, Dense, Conv1D, Conv2D, MaxPooling2D, Dropout,Flatten
from keras import backend as K
from keras.models import Model
import numpy as np
import matplotlib.pyplot as plt


from keras.datasets import mnist
(X_train, y_train), (X_test, y_test) = mnist.load_data()


# Expect to see a numpy n-dimensional array of shape (60000, 28, 28)

type(X_train), X_train.shape


# This time, however, we flatten each of our 28 x 28 images into a vector of length 784

X_train = X_train.reshape(-1, 784)
X_test = X_test.reshape(-1, 784)

# Expect to see numpy n-dimensional arrays of shape (60000, 784) for the training data and (10000, 784) for the test data
type(X_train), X_train.shape, X_test.shape


# We also use sklearn's MinMaxScaler for normalization

from sklearn.preprocessing import MinMaxScaler
def scaleData(data):
    # normalize features
    scaler = MinMaxScaler(feature_range=(0, 1))
    return scaler.fit_transform(data)

X_train = scaleData(X_train)
X_test = scaleData(X_test)
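
As a quick sanity check (my addition, not part of the tutorial), the scaled arrays should now only contain values in the requested range:

# Every pixel value should fall inside the feature_range set above
print(X_train.min(), X_train.max())  # expect: 0.0 1.0
print(X_test.min(), X_test.max())    # expect: 0.0 1.0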


# We define the same Keras model as earlier

input_shape = (1,28,28) if K.image_data_format() == 'channels_first' else (28,28, 1)
keras_model = Sequential()
keras_model.add(Conv2D(32, kernel_size=(5, 5), activation='relu', input_shape=input_shape, padding='same'))
keras_model.add(MaxPooling2D(pool_size=(2, 2)))
keras_model.add(Conv2D(64, (5, 5), activation='relu', padding='same'))
keras_model.add(MaxPooling2D(pool_size=(2, 2)))
keras_model.add(Flatten())
keras_model.add(Dense(512, activation='relu'))
keras_model.add(Dropout(0.5))
keras_model.add(Dense(10, activation='softmax'))
keras_model.summary()
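# If the model is wired as above ('same' padding, two 2x2 poolings: 28 -> 14 -> 7),
# summary() should report 1,663,370 trainable parameters.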


# Import the Keras to DML wrapper and define some basic variables

import math  # needed for the max_iter calculation below
from systemml.mllearn import Keras2DML
epochs = 5
batch_size = 100
samples = 60000
max_iter = int(epochs * math.ceil(samples / batch_size))
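# With the values above: max_iter = int(5 * ceil(60000 / 100)) = 5 * 600 = 3000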

# Now create a SystemML model by calling Keras2DML and feeding it your Spark session, the Keras model, its
# input shape, and the predefined variables. We also ask for the training results to be displayed every 10 iterations.

sysml_model = Keras2DML(spark, keras_model, input_shape=(1,28,28), weights='weights_dir', batch_size=batch_size, max_iter=max_iter, test_interval=0, display=10)

# Initiate training. More Spark workers and a better machine configuration mean faster training!

sysml_model.fit(X_train, y_train)

# Test your model's performance on the held-out test set, and iterate if required
sysml_model.score(X_test, y_test)

At the line from systemml.mllearn import Keras2DML, the error I get is:

Traceback (most recent call last):
  File "d:/SparkJarDirectory/./NNSpark.py", line 58, in <module>
    from systemml.mllearn import Keras2DML
  File "C:\Users\xyz\AppData\Local\Continuum\anaconda3\lib\site-packages\systemml\mllearn\__init__.py", line 45, in <module>
    from .estimators import *
  File "C:\Users\xyz\AppData\Local\Continuum\anaconda3\lib\site-packages\systemml\mllearn\estimators.py", line 917
    def __init__(self, sparkSession, keras_model, input_shape, transferUsingDF=False, load_keras_weights=True, weights=None, labels=None, batch_size=64, max_iter=2000, test_iter=10, test_interval=500, display=100, lr_policy="step", weight_decay=5e-4, regularization_type="L2"):
    ^
SyntaxError: import * only allowed at module level
2019-03-12 20:25:48 INFO ShutdownHookManager:54 - Shutdown hook called
2019-03-12 20:25:48 INFO ShutdownHookManager:54 - Deleting directory C:\Users\xyz\AppData\Local\Temp\spark-2e1736f8-1798-42da-a157-cdf0ade1bf36

From my understanding, there is an issue with the library I am using, whose mllearn/__init__.py contains

from .estimators import *

__all__ = estimators.__all__

Since the package runs from .estimators import * at import time, the SyntaxError in estimators.py surfaces as soon as Keras2DML is imported. I am not sure why the wrapper does not work or what fix is required. Any help is appreciated.

I think systemml version 1.2.0 is missing some fixes for python 3.5 (https://github.com/apache/systemml/commit/9e7ee19a45102f7cbb37507da25b1ba0641868fd), so you need to install systemml from source (for my setup, which is different from yours, that was a git clone followed by "cd src/main/python; sudo python3.4 setup.py install").
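
Spelled out, the steps the answer describes look roughly like this (the clone URL and the python binary name are assumptions here; adjust both to your environment):

git clone https://github.com/apache/systemml.git
cd systemml/src/main/python
sudo python3 setup.py install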
