
python - How do I import my own dependency modules when using spark-submit?

The import in my Python code:

<code>from spark_learning.utils.default_utils import setDefaultEncoding,initSparkContext,ensureOffset</code>

The submit command:

<code>bin/spark-submit --jars /home/jabo/software/spark-1.5.2-bin-hadoop2.6/lib/spark-streaming-kafka-assembly_2.10-1.5.2.jar \
/home/jabo/spark-by-python/spark_learning/third_day/streaming_kafka_avg.py \
--py-files /home/jabo/spark-by-python/spark_learning/utils/default_utils.py</code>

The official documentation says:

<code>For Python applications, simply pass a .py file in the place of <application-jar> instead of a JAR, and add Python .zip, .egg or .py files to the search path with --py-files.</code>

But it fails with an error saying the imported module cannot be found:

<code>Traceback (most recent call last):
  File "/home/jabo/spark-by-python/spark_learning/third_day/streaming_kafka_avg.py", line 10, in <module>
    import spark_learning.utils.default_utils
ImportError: No module named spark_learning.utils.default_utils</code>

How can I solve this?

怪我咯 · asked 2864 days ago · 1016

Replies (1)

  • PHP中文网 · 2017-04-17 17:25:34

    You can try putting the --py-files argument before the script you want to run! We just ran into this same problem and that is how we solved it!
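    Applying that suggestion to the command from the question, the corrected invocation would look roughly like this (a sketch only; paths are the ones from the question, and all options must come before the application script, since spark-submit treats everything after the script as arguments to the script itself):

    ```shell
    # All spark-submit options (--jars, --py-files, ...) must precede the
    # application .py file; anything after it is passed to the script.
    bin/spark-submit \
      --jars /home/jabo/software/spark-1.5.2-bin-hadoop2.6/lib/spark-streaming-kafka-assembly_2.10-1.5.2.jar \
      --py-files /home/jabo/spark-by-python/spark_learning/utils/default_utils.py \
      /home/jabo/spark-by-python/spark_learning/third_day/streaming_kafka_avg.py
    ```

    One caveat: shipping only default_utils.py puts it on the search path as a top-level module named default_utils, not as the package path spark_learning.utils.default_utils that the script imports. Since --py-files also accepts .zip files (per the documentation quoted above), zipping the whole spark_learning package directory and passing that zip to --py-files is a common way to keep the package-style import working.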
