There is a problem with my JSON string output. I'm working with a tab-delimited CSV file that looks like this:

date        time        loc_id  country name    sub1_id sub2_id type
2014-09-11  00:00:01    179     US      acmnj   269     382     ico
2014-09-11  00:00:01    179     US      acmnj   269     382     ico
2014-09-11  00:00:01    179     GB      acmnj   269     382     ico
2014-09-11  00:00:01    179     US      acmnj   269     382     ico
2014-09-11  00:00:02    179     GB      acmnj   269     383     ico
2014-09-11  00:00:02    179     JP      acmnj   269     383     ico

Here is the code:
import pandas as pd

df = pd.read_csv('log.csv', sep='\t', encoding='utf-16')
count = df.groupby(['country', 'name', 'sub1_id', 'sub2_id', 'type']).size()
print(count.order(na_position='last', ascending=False).to_frame().to_json(orient='index'))

Here is the output (first few lines):
{"["US","acmnj",269,383,"ico"]":{"0":76174},"["US","acmnj",269,382,"ico"]":{"0":73609},"["IT","acmnj",269,383,"ico"]":{"0":54211},"["IT","acmnj",269,382,"ico"]":{"0":52398},"["GB","acmnj",269,383,"ico"]":{"0":41346},"["GB","acmnj",269,382,"ico"]":{"0":40140},"["US","acmnj",269,405,"ico"]":{"0":39482},"["US","acmnj",269,400,"ico"]":{"0":39303},"["US","popcdd",178,365,"ico"]":{"0":33168},"["IT","acmnj",269,400,"ico"]":{"0":33026},"["IT","acmnj",269,405,"ico"]":{"0":32824},"["IT","achrfb141",141,42,"ico"]":{"0":26986},"["GB","acmnj",269,405,"ico"]":{"0":25895},"["IN","acmnj",269,383,"ico"]":{"0":25647},"["GB","acmnj",269,400,"ico"]":{"0":25488...
I want to load this output in PHP, but when I try to decode it I get null. I ran the string through a JSON validator and it is indeed invalid. I also tried leaving out the orient parameter, but the resulting JSON was invalid as well.

Best Answer

This looks like a pandas issue; I was able to reproduce your error.
DataFrame.to_json can take several different orient arguments: "split", "records", "index", "columns", and "values".
In your case "split", "records", and "values" appear to work, but "index" and "columns" do not: with the grouped MultiIndex, each row key is rendered as a string like ["US","acmnj",269,383,"ico"] whose inner double quotes are not escaped, so the output is not valid JSON.
You can test this quickly in Python with the json module:

import json
import pandas as pd

df = pd.read_csv('log.csv', sep='\t', encoding='utf-16')
count = df.groupby(['country', 'name', 'sub1_id', 'sub2_id', 'type']).size()
# Series.order() is the old (pre-0.17) pandas API; newer versions use sort_values()
f = count.order(ascending=False).to_frame()
json.loads(f.to_json(orient='index'))    # This failed for me
json.loads(f.to_json(orient='records'))  # This worked
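
If you need JSON that a validator (or PHP's json_decode) will accept, a common workaround is to flatten the MultiIndex before exporting and use orient='records'. The sketch below is only an illustration, not part of the original answer; the column name 'count' is chosen here for readability, and sort_values is the newer replacement for order:

import pandas as pd

df = pd.read_csv('log.csv', sep='\t', encoding='utf-16')
count = df.groupby(['country', 'name', 'sub1_id', 'sub2_id', 'type']).size()

# Turn the MultiIndex levels back into ordinary columns and name the
# size column 'count' (an illustrative name, not from the original post).
flat = count.reset_index(name='count').sort_values('count', ascending=False)

# orient='records' produces a plain JSON array of objects, so no tuple keys
# with unescaped quotes appear in the output.
print(flat.to_json(orient='records'))

Each record then looks like {"country":"US","name":"acmnj","sub1_id":269,"sub2_id":383,"type":"ico","count":76174}, which PHP can decode directly.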

Related question on Stack Overflow: python - Python Pandas to_json() invalid format: https://stackoverflow.com/questions/25810587/
