Problem description
I'm using a Jupyter notebook on AWS EMR to run PySpark, and I'm having trouble importing modules from another file. I tried a couple of methods I found on Stack Overflow, but none of them worked. More specifically, I tried the following (here I have a notebook named "include.ipynb" in the same directory as the notebook that runs the import statements):
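(The snippets the asker tried were not preserved in this copy of the post. Purely as a hedged illustration, the two approaches most commonly suggested on Stack Overflow for this situation look like the sketch below; the function name `greet` is a made-up placeholder.)

```python
# Approach 1: execute the other notebook inside this kernel with the %run magic.
%run ./include.ipynb
greet()  # hypothetical function defined in include.ipynb

# Approach 2: import the notebook as a module via the nbimporter package.
import nbimporter  # requires: pip install nbimporter
import include     # loads include.ipynb as a Python module
include.greet()
```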
Both of these methods worked in a Jupyter notebook on my local computer. Why aren't they working on AWS?
Recommended answer
You have to install these packages on the EMR cluster explicitly, using pip or conda. Your local machine already has them installed, which is why the imports work there.
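As a minimal sketch (assuming EMR release 5.26 or later, where the PySpark kernel supports notebook-scoped libraries), a package can be installed from inside the notebook itself; `nbimporter` here is just an example package name:

```python
# Notebook-scoped install on EMR (PySpark kernel, EMR 5.26+).
# sc is the SparkContext that the PySpark kernel creates automatically.
sc.install_pypi_package("nbimporter")  # example package; replace as needed
sc.list_packages()                     # confirm the package is now visible
```

For a cluster-wide install instead, the usual alternative is to SSH to the master node (or use a bootstrap action) and run something like `sudo python3 -m pip install nbimporter` on each node.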