I am following the Databricks deep learning pipelines tutorial on Community Edition:

https://docs.databricks.com/applications/deep-learning/deep-learning-pipelines.html#install-deep-learning-pipelines

Attached libraries:

spark-deep-learning (sparkdl), tensorflow, keras, h5py, tensorframes


When I run this cell:

from pyspark.ml.classification import LogisticRegression
from pyspark.ml import Pipeline
from sparkdl import DeepImageFeaturizer

featurizer = DeepImageFeaturizer(inputCol="image", outputCol="features", modelName="InceptionV3")
lr = LogisticRegression(maxIter=20, regParam=0.05, elasticNetParam=0.3, labelCol="label")
p = Pipeline(stages=[featurizer, lr])

p_model = p.fit(train_df)


I get this error:

IllegalArgumentException: u'NodeDef mentions attr \'dilations\' not in Op<name=Conv2D; signature=input:T, filter:T -> output:T; attr=T:type,allowed=[DT_HALF, DT_FLOAT]; attr=strides:list(int); attr=use_cudnn_on_gpu:bool,default=true; attr=padding:string,allowed=["SAME", "VALID"]; attr=data_format:string,default="NHWC",allowed=["NHWC", "NCHW"]>; NodeDef: given/conv2d_95/convolution = Conv2D[T=DT_FLOAT, data_format="NHWC", dilations=[1, 1, 1, 1], padding="VALID", strides=[1, 2, 2, 1], use_cudnn_on_gpu=true](given/sub_1, given/conv2d_95/kernel/read). (Check whether your GraphDef-interpreting binary is up to date with your GraphDef-generating binary.).'


I'm stuck, please help me :)

Best answer

Solution: downgrade your TensorFlow to 1.4.0.
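For context (my understanding, not stated in the original answer): TensorFlow 1.5 added the `dilations` attribute to the `Conv2D` op, so a GraphDef exported with TF >= 1.5 is rejected by the older TF 1.4.x runtime that this version of sparkdl/tensorframes was built against. Downgrading the cluster's `tensorflow` library to 1.4.0 makes the exported InceptionV3 graph compatible again. A minimal sketch of that version check (the cutoff version 1.5.0 is my assumption based on the TF changelog; the helper names are hypothetical):

```python
def version_tuple(version):
    """Parse a 'major.minor.patch' version string into a comparable int tuple."""
    return tuple(int(part) for part in version.split(".")[:3])

def graph_compatible(tf_version):
    """True if graphs exported by this TF version omit the Conv2D 'dilations'
    attr and can therefore be read by a TF 1.4.x runtime (assumption: the
    attr first appears in TF 1.5.0)."""
    return version_tuple(tf_version) < version_tuple("1.5.0")

print(graph_compatible("1.4.0"))  # True: safe with this sparkdl build
print(graph_compatible("1.6.0"))  # False: exports graphs the older runtime rejects
```

In practice this means attaching `tensorflow==1.4.0` as the cluster library instead of a newer release before re-running the pipeline cell.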

A similar question about apache-spark - "Check whether your GraphDef-interpreting binary is up to date with your GraphDef-generating binary." - can be found on Stack Overflow: https://stackoverflow.com/questions/48723686/
