Problem description
After reading Streaming with Tweepy and going through this example, I tried to write a tweepy app to crawl live stream data with the tweepy API and save it to a .csv file. When I run my code, it returns an empty csv file ('OutputStreaming.csv') containing only the column names ['Date', 'Text', 'Location', 'Number_Follower', 'User_Name', 'Friends_count', 'Hash_Tag'], not the streamed tweets. I also tried doing it the way shown in this Stack Overflow question (stackoverflow.com/questions/28130124/how-to-save-a-tweepy-twitter-stream-to-a-file), but I get the same output with my code:
def on_status(self, status):
    with open('OutputStreaming.csv', 'w') as f:
        f.write(['Author,Date,Text')
        writer = csv.writer(f)
        writer.writerow([status.created_at.strftime("%Y-%m-%d %H:%M:%S")status.text.encode,
                         status.location,
                         status.Number_of_follwers,
                         status.author.screen_name,
                         status.friends_count])
I got stuck. I can't figure out where the problem with the code is; my code looks like this:
import tweepy
from tweepy.streaming import StreamListener
from tweepy import OAuthHandler
from tweepy import Stream
import json  # data

# Variables that contain the user credentials to access the Twitter API
access_token = "***"
access_token_secret = "***"
consumer_key = "***"
consumer_key_secret = "***"

auth = tweepy.OAuthHandler(consumer_key, consumer_key_secret)
auth.set_access_token(access_token, access_token_secret)
# setup api
api = tweepy.API(auth)

class CustomStreamListener(tweepy.StreamListener):
    def on_data(self, data):
        if data:
            tweet_json = json.loads(data)
            if tweet_json:
                if not tweet_json['text'].strip().startswith('RT '):
                    Created = data.created_at.strftime("%Y-%m-%d-%H:%M:%S")
                    Text = data.text.encode('utf8')
                    Location = data.location('utf8')
                    Follower = data.Number_of_follwers('utf8')
                    Name = data.author.screen_name('utf8')
                    Friend = data.friends_count('utf8')
                    with open('OutputStreaming.csv', 'a') as f:
                        writer = csv.writer(f)
                        writer.writerow([Created, Text, Loaction,
                                         Follower, Name, Friend,
                                         status.entities.get('hashtags')])
                    Time.sleep(10)
        return True

    def on_error(self, status_code):
        if status_code == 420:
            return False
        else:
            print >> sys.stderr, 'Encountered error with status code:', \
                status_code

    def on_timeout(self):
        print >> sys.stderr, 'Timeout...'
        return True

# Writing csv titles
with open('OutputStreaming.csv', 'a') as f:
    writer = csv.writer(f)
    writer.writerow(['Date', 'Text', 'Location', 'Number_Follower',
                     'User_Name', 'Friends_count', 'Hash_Tag'])

if __name__ == '__main__':
    l = CustomStreamListener()
    streamingAPI = tweepy.streaming.Stream(api.auth, l)
    streamingAPI.filter(track=['#Yoga', '#Meditation'])
Recommended answer
Here is working code. (The original code never writes any tweet rows because it crashes inside `on_data`: `csv`, `time`, and `sys` are never imported; the `data` argument there is a raw JSON string, not a Status object, so attributes like `data.created_at` and `data.Number_of_follwers` don't exist; and `Loaction`, `status`, and `Time` are undefined names. Only the header-writing block at module level ever succeeds, which is why the file contains just the column names.)
#!/usr/bin/python3
# coding=utf-8
import tweepy

SEP = ';'

# Open the output file once and write the header row.
csv_file = open('OutputStreaming.csv', 'a')
csv_file.write('Date' + SEP + 'Text' + SEP + 'Location' + SEP +
               'Number_Follower' + SEP + 'User_Name' + SEP + 'Friends_count\n')

class MyStreamListener(tweepy.StreamListener):
    def on_status(self, status):
        Created = status.created_at.strftime("%Y-%m-%d-%H:%M:%S")
        # Strip newlines and the separator so each tweet stays on one line.
        Text = status.text.replace('\n', ' ').replace('\r', '').replace(SEP, ' ')
        Location = ''
        if status.coordinates is not None:
            lon = status.coordinates['coordinates'][0]
            lat = status.coordinates['coordinates'][1]
            Location = str(lat) + ',' + str(lon)  # coordinates are floats
        Follower = str(status.user.followers_count)
        Name = status.user.screen_name
        Friend = str(status.user.friends_count)
        csv_file.write(Created + SEP + Text + SEP + Location + SEP +
                       Follower + SEP + Name + SEP + Friend + '\n')

    def on_error(self, status_code):
        print(status_code)

consumer_key = '***'
consumer_secret = '***'
access_token = '***'
access_token_secret = '***'

# stream
auth = tweepy.OAuthHandler(consumer_key, consumer_secret)
auth.set_access_token(access_token, access_token_secret)
myStream = tweepy.Stream(auth, MyStreamListener())
myStream.filter(track=['#Yoga', '#Meditation'])
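If you would rather not sanitize the tweet text by hand, Python's standard `csv` module can do the quoting for you: fields containing the separator, quote characters, or newlines are escaped automatically. A minimal, self-contained sketch of that approach (the row values are made-up stand-ins for the status attributes used above, written to an in-memory buffer for illustration):

```python
# Sketch: let csv.writer handle escaping instead of stripping characters manually.
# The values below are hypothetical stand-ins for status.created_at, status.text, etc.
import csv
import io

def write_tweet_row(out, created, text, location, followers, name, friends):
    """Append one tweet as a CSV row; csv.writer quotes awkward fields for us."""
    writer = csv.writer(out, delimiter=';')
    writer.writerow([created, text, location, followers, name, friends])

buf = io.StringIO()  # in a real listener this would be the open output file
write_tweet_row(buf, '2017-05-01-10:00:00',
                'a tweet; with the separator\nand a newline',
                '48.85,2.35', 120, 'some_user', 80)
print(buf.getvalue())
```

Because the text field contains the `;` separator and a newline, `csv.writer` wraps it in quotes, and `csv.reader` will reassemble the row intact, so no information is lost from the tweet body.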
That concludes this article on Tweepy: crawling live stream tweets and saving them to a .csv file. We hope the recommended answer helps.