TypeError: ArrayType(DoubleType,true) can not accept object u'..'
I cannot create a DataFrame because of the coordinates field. This field does not fit the schema type ArrayType(DoubleType()).
from pyspark.sql.types import StructType, StructField, StringType, ArrayType, DoubleType

my_schema = StructType(
    [
        StructField('alarm_id', StringType()),
        StructField('coordinates', ArrayType(DoubleType()))
    ])
df = spark.createDataFrame(rows, my_schema)
I get this error:
TypeError: ArrayType(DoubleType,true) can not accept object u'[[[1.7594273000000102, 41.82814869999999], [1.7594281999999908, 41.828104700000004]]]' in type <type 'unicode'>
Is there any workaround?
It seems that your data is a string. You can use the ast module from the standard library to turn it into a list.
import ast
rows = '[[[1.7594273000000102, 41.82814869999999], [1.7594281999999908, 41.828104700000004]]]'
rows_li = ast.literal_eval(rows)
More on literal_eval
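Putting the answer together, a minimal sketch (with made-up alarm IDs and flat coordinate strings, purely for illustration) could parse each string before building the DataFrame:

```python
import ast

# Hypothetical sample rows where the 'coordinates' value arrived as a
# string rather than a list (the IDs and values here are assumptions).
raw_rows = [
    ('alarm_1', '[1.7594273000000102, 41.82814869999999]'),
    ('alarm_2', '[1.7594281999999908, 41.828104700000004]'),
]

# Parse each coordinates string into a real Python list of floats so it
# matches ArrayType(DoubleType()).
parsed_rows = [(alarm_id, ast.literal_eval(coords))
               for alarm_id, coords in raw_rows]

# parsed_rows can now be passed to Spark:
# df = spark.createDataFrame(parsed_rows, my_schema)
```

Note that the value in the original error message is nested three levels deep (`[[[lon, lat], ...]]`); after parsing, such a value would need a correspondingly nested schema, e.g. ArrayType(ArrayType(ArrayType(DoubleType()))), rather than a flat ArrayType(DoubleType()).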