This article explains how to deal with the pySpark DataFrame error "assert isinstance(dataType, DataType), 'dataType should be DataType'". It should be a useful reference for anyone hitting the same problem; follow along below.
Problem Description
I want to generate my DataFrame schema dynamically, but I get the following error:
assert isinstance(dataType, DataType), "dataType should be DataType"
AssertionError: dataType should be DataType
Code:
filteredSchema = []
for line in correctSchema:
    fieldName = line.split(',')
    if fieldName[1] == "decimal":
        filteredSchema.append([fieldName[0], "DecimalType()"])
    elif fieldName[1] == "string":
        filteredSchema.append([fieldName[0], "StringType()"])
    elif fieldName[1] == "integer":
        filteredSchema.append([fieldName[0], "IntegerType()"])
    elif fieldName[1] == "date":
        filteredSchema.append([fieldName[0], "DateType()"])

sample1 = [(line[0], line[1], True) for line in filteredSchema]
print sample1

fields = [StructField(line[0], line[1], True) for line in filteredSchema]
If I use this instead:
fields = [StructField(line[0], StringType(), True) for line in filteredSchema]
it works fine, but the print-out of sample1 is:
[('record_id', 'StringType()', True), ('offer_id', 'DecimalType()', True), ('decision_id', 'DecimalType()', True), ('offer_type_cd', 'DecimalType()', True), ('promo_id', 'DecimalType()', True), ('pymt_method_type_cd', 'DecimalType()', True), ('cs_result_id', 'DecimalType()', True), ('cs_result_usage_type_cd', 'DecimalType()', True), ('rate_index_type_cd', 'DecimalType()', True), ('sub_product_id', 'DecimalType()', True), ('campaign_id', 'DecimalType()', True), ('market_cell_id', 'DecimalType()', True), ('assigned_offer_id', 'StringType()', True), ('accepted_offer_flag', 'StringType()', True), ('current_offer_flag', 'StringType()', True), ('offer_good_until_date', 'StringType()', True), ('rescindable_days', 'DecimalType()', True), ('rescinded_date', 'StringType()', True), ('amount', 'DecimalType()', True), ('max_amount', 'DecimalType()', True), ('amount_financed', 'DecimalType()', True), ('down_pymt', 'DecimalType()', True), ('rate', 'DecimalType()', True), ('term_mm', 'DecimalType()', True), ('origination_fee_amount', 'DecimalType()', True), ('origination_fee_rate', 'DecimalType()', True), ('finance_charge', 'DecimalType()', True), ('nbr_of_pymts', 'DecimalType()', True), ('pymt', 'DecimalType()', True), ('total_pymts', 'DecimalType()', True), ('first_pymt_date', 'StringType()', True), ('contract_date', 'StringType()', True), ('acct_nbr', 'StringType()', True), ('acct_nbr_assigned_dttm', 'StringType()', True), ('acct_expiration_dttm', 'StringType()', True), ('offer_desc', 'StringType()', True), ('min_rate', 'DecimalType()', True), ('max_rate', 'DecimalType()', True), ('min_amount', 'DecimalType()', True), ('annual_fee_amount', 'DecimalType()', True), ('annual_fee_waived_mm', 'DecimalType()', True), ('late_fee_percent', 'DecimalType()', True), ('late_fee_min_amount', 'DecimalType()', True), ('offer_sales_script', 'StringType()', True), ('offer_order', 'DecimalType()', True), ('presentable_flag', 'StringType()', True), ('index_rate', 'DecimalType()', True), ('insrt_dttm', 'StringType()', True), ('insrt_usr_id', 'StringType()', True), ('chng_dttm', 'StringType()', True), ('chng_usr_id', 'StringType()', True), ('actv_flag', 'StringType()', True), ('correlation_id', 'StringType()', True), ('offer_status_type_cd', 'StringType()', True), ('presentation_instrument_nbr', 'StringType()', True)]
I've tried just the DataType without (), and I've also used FloatType() instead of DecimalType.
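As the assertion in the traceback suggests, StructField checks that its second argument is a DataType instance, so a string such as "DecimalType()" (however it is spelled) can never pass. A minimal check, assuming a standard PySpark installation:

from pyspark.sql.types import DataType, StringType

# A real type instance satisfies the isinstance check that StructField performs
print isinstance(StringType(), DataType)      # True
# The string stored in filteredSchema does not, which raises the AssertionError
print isinstance("StringType()", DataType)    # False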
Recommended Answer
Figured it out: these need to be without quotes:
    if fieldName[1] == "decimal":
        filteredSchema.append([fieldName[0], DecimalType()])
    elif fieldName[1] == "string":
        filteredSchema.append([fieldName[0], StringType()])
    elif fieldName[1] == "integer":
        filteredSchema.append([fieldName[0], IntegerType()])
    elif fieldName[1] == "date":
        filteredSchema.append([fieldName[0], DateType()])
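For reference, the same idea can be written more compactly by mapping the type labels in correctSchema to DataType instances with a dictionary and building the StructType directly. A minimal sketch, assuming correctSchema holds "name,type" strings as in the question (the TYPE_MAP name and the sqlContext/rdd usage at the end are illustrative, not from the original post):

from pyspark.sql.types import (StructType, StructField, DecimalType,
                               StringType, IntegerType, DateType)

# Illustrative lookup table from the labels used in correctSchema to DataType instances
TYPE_MAP = {
    "decimal": DecimalType(),
    "string": StringType(),
    "integer": IntegerType(),
    "date": DateType(),
}

fields = []
for line in correctSchema:
    name, typeName = line.split(',')[:2]
    fields.append(StructField(name, TYPE_MAP[typeName], True))

schema = StructType(fields)
# df = sqlContext.createDataFrame(rdd, schema)   # hypothetical usage with an existing RDD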
That wraps up this article on the pySpark DataFrame error "assert isinstance(dataType, DataType), 'dataType should be DataType'". We hope the recommended answer above is helpful; thank you for your support!