I have 2 DataFrames: trips_df (1,048,568 rows) and weather_df (2,654 rows).
I am trying to attach the total precipitation for each trip as a new column. I do this by looking up each trip's start_timestamp and end_timestamp from trips_df against the datetimes in weather_df, summing precipitation_amount over that interval, and then appending the value back to trips_df under a new column.
The code used to do this:
import datetime as dt

def sum_precipitation(datetime1, datetime2, weather_data):
    # Round the trip start down and the trip end up to the nearest hour
    time1_rd = datetime1.replace(minute=0, second=0)
    time2_ru = datetime2.replace(minute=0, second=0) + dt.timedelta(hours=1)
    if time1_rd in set(weather_data['start_precipitation_datetime']):
        start_idx = weather_data.start_precipitation_datetime[
            weather_data.start_precipitation_datetime == time1_rd].index[0]
        if time2_ru in set(weather_data['end_precipitation_datetime']):
            end_idx = weather_data.end_precipitation_datetime[
                weather_data.end_precipitation_datetime == time2_ru].index[0]
            # Column 7 holds precipitation_amount
            precipitation_sum = weather_data.iloc[start_idx:end_idx + 1, 7].sum()
        else:
            precipitation_sum = 0
    else:
        precipitation_sum = 0
    return round(precipitation_sum, 3)
def join_weather_to_trips(trips_data, weather_data):
    trips_weather_df = trips_data.copy()
    fn = lambda row: sum_precipitation(row.start_timestamp, row.end_timestamp, weather_data)
    col = trips_data.apply(fn, axis=1)
    trips_weather_df = trips_weather_df.assign(total_precipitation=col.values)
    return trips_weather_df

trip_weather_df = join_weather_to_trips(trips_df, weather_df)
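One visible cost in the function above is that set(weather_data[...]) and the boolean-mask index lookups are rebuilt for every single row. As a minimal sketch (keeping the same per-row structure, and assuming hourly weather rows with unique start/end datetimes; make_lookups and sum_precipitation_fast are hypothetical names), the lookups can be precomputed once:

```python
import datetime as dt

def make_lookups(weather_data):
    # Map each hourly datetime to its positional index, computed once
    start_idx = {t: i for i, t in enumerate(weather_data['start_precipitation_datetime'])}
    end_idx = {t: i for i, t in enumerate(weather_data['end_precipitation_datetime'])}
    amounts = weather_data['precipitation_amount'].to_numpy()
    return start_idx, end_idx, amounts

def sum_precipitation_fast(datetime1, datetime2, start_idx, end_idx, amounts):
    # Same rounding as the original: trip start down, trip end up, to the hour
    t1 = datetime1.replace(minute=0, second=0)
    t2 = datetime2.replace(minute=0, second=0) + dt.timedelta(hours=1)
    if t1 in start_idx and t2 in end_idx:
        # Dict lookups are O(1) per row instead of rebuilding sets each call
        return round(float(amounts[start_idx[t1]:end_idx[t2] + 1].sum()), 3)
    return 0
```

With the lookups built once, each row costs two dict probes and one NumPy slice sum rather than two full column scans.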
I ran the code on a subset of 65 rows, and it took about 1.3 s (CPU times: user 1.27 s, sys: 8.77 ms, total: 1.28 s, Wall time: 1.28 s). Extrapolating that performance to my full data would take (1.3 * 1048568) / 65 = 20971.36 seconds, or 5.8 hours. Can someone more experienced tell me whether I am doing this correctly, where the code could be sped up, or whether there is an alternative (e.g. a faster implementation)?
Best answer
This may not be the fastest, but you can try:
trips_df['precipitation_amount'] = 0

for s, e, p in zip(weather_df['start_precipitation_datetime'],
                   weather_df['end_precipitation_datetime'],
                   weather_df.precipitation_amount):
    # Mark trips whose start or end falls inside this weather interval
    masks = trips_df.start_timestamp.between(s, e) | trips_df.end_timestamp.between(s, e)
    trips_df.loc[masks, 'precipitation_amount'] += p
On my computer, this took 10 seconds for 1 million trips and 260 weather rows, so the real data should take about 100 s.
Update: I did try 1M trips and 2600 weather rows:
Wall time: 1min 36s
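To check the answer's logic on toy data (column names as in the question; all values here are made up), the loop can be exercised like this:

```python
import pandas as pd

trips_df = pd.DataFrame({
    'start_timestamp': pd.to_datetime(['2019-01-01 00:30', '2019-01-01 02:15']),
    'end_timestamp':   pd.to_datetime(['2019-01-01 01:10', '2019-01-01 02:45']),
})
weather_df = pd.DataFrame({
    'start_precipitation_datetime': pd.to_datetime(
        ['2019-01-01 00:00', '2019-01-01 01:00', '2019-01-01 02:00']),
    'end_precipitation_datetime': pd.to_datetime(
        ['2019-01-01 01:00', '2019-01-01 02:00', '2019-01-01 03:00']),
    'precipitation_amount': [0.1, 0.2, 0.4],
})

# Initialized to 0.0 (not 0) so the column is float from the start
trips_df['precipitation_amount'] = 0.0
for s, e, p in zip(weather_df['start_precipitation_datetime'],
                   weather_df['end_precipitation_datetime'],
                   weather_df.precipitation_amount):
    # .between() is inclusive on both ends by default
    masks = trips_df.start_timestamp.between(s, e) | trips_df.end_timestamp.between(s, e)
    trips_df.loc[masks, 'precipitation_amount'] += p
```

The first trip (00:30 to 01:10) overlaps the first two intervals and gets 0.1 + 0.2; the second trip sits inside the third interval and gets 0.4.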
Note: you may want to subtract one minute from weather_df['end_precipitation_datetime'] to avoid double counting in case a trip starts exactly on the hour.
About python - Can I make this code more efficient? Currently it takes about 6 hours to run for roughly 1 million rows. A similar question on Stack Overflow: https://stackoverflow.com/questions/56191458/
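The one-minute adjustment the answer mentions can be sketched like this (same column names as above; the sample timestamps are illustrative):

```python
import pandas as pd

weather_df = pd.DataFrame({
    'start_precipitation_datetime': pd.to_datetime(['2019-01-01 00:00', '2019-01-01 01:00']),
    'end_precipitation_datetime':   pd.to_datetime(['2019-01-01 01:00', '2019-01-01 02:00']),
    'precipitation_amount': [0.1, 0.2],
})

# Shift each interval's end back one minute so a trip starting exactly on
# the hour matches only the interval that begins at that hour, not both.
weather_df['end_precipitation_datetime'] -= pd.Timedelta(minutes=1)
```

After the shift the intervals are [00:00, 00:59] and [01:00, 01:59], so a 01:00 trip start falls in exactly one of them.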