
TypeError: ArrayType(DoubleType,true) can not accept object u'..'

I cannot create a DataFrame because of the coordinates field. This field does not fit the schema type ArrayType(DoubleType()).

from pyspark.sql.types import StructType, StructField, StringType, ArrayType, DoubleType

my_schema = StructType(
        [
            StructField('alarm_id', StringType()),
            StructField('coordinates', ArrayType(DoubleType()))
        ])

df = spark.createDataFrame(rows, my_schema) 

I get this error:

TypeError: ArrayType(DoubleType,true) can not accept object u'[[[1.7594273000000102, 41.82814869999999], [1.7594281999999908, 41.828104700000004]]]' in type <type 'unicode'>

Is there any workaround?

It seems that your data is of type string.

You can use the ast module from the standard library to convert it into a list.

import ast

rows = '[[[1.7594273000000102, 41.82814869999999], [1.7594281999999908, 41.828104700000004]]]'

rows_li = ast.literal_eval(rows)

More on literal_eval in the Python documentation.
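Putting the pieces together, here is a minimal, hedged sketch of the workaround. The row data below is hypothetical, shaped like the string in the error message. Note one further caveat: once parsed, the value is a triply nested list, so a schema of ArrayType(DoubleType()) still will not match; the schema would need to be ArrayType(ArrayType(ArrayType(DoubleType()))), or the coordinates must be flattened first.

```python
import ast

# Hypothetical input row, shaped like the question's data: the
# 'coordinates' field arrives as a string, not a list of doubles.
raw_row = ('alarm_1',
           '[[[1.7594273000000102, 41.82814869999999], '
           '[1.7594281999999908, 41.828104700000004]]]')

# Parse the string into real Python lists before building the DataFrame.
coords = ast.literal_eval(raw_row[1])

# The parsed value is nested three levels deep:
# list -> list of points -> [lon, lat] pairs.
print(coords[0][0])  # [1.7594273000000102, 41.82814869999999]

# To fit the original ArrayType(DoubleType()) schema, flatten the
# nesting into a single list of doubles:
flat = [value
        for ring in coords
        for point in ring
        for value in point]
print(flat)

# With pyspark available (assumed, not shown running here), the
# DataFrame could then be built as in the question:
# df = spark.createDataFrame([(raw_row[0], flat)], my_schema)
```

Alternatively, keep the nesting and change the schema to ArrayType(ArrayType(ArrayType(DoubleType()))) so it matches the parsed structure directly.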
