Rom*_*man 5 configuration python-3.x google-cloud-platform pyspark google-cloud-dataproc
I want to run a PySpark job on Google Cloud Platform Dataproc, but I can't figure out how to set up PySpark to run Python 3 by default instead of 2.7.
The best I could find is adding these initialization commands.
However, when I then SSH into the cluster,
(a) the python command is still python2, and
(b) my job fails because of Python 2 incompatibilities.
I have tried uninstalling Python 2 and adding alias python='python3' in my init.sh script, but alas, no success; the alias doesn't seem to stick.
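For reference, that alias attempt would have looked roughly like the following (a reconstruction of what the description implies, not the exact script). Bash aliases are only expanded in interactive shells, which is why neither Spark nor the job runner ever picks it up:

#!/bin/bash
# attempted approach from the question: drop Python 2 and point python at python3
apt-get -y remove python2.7
echo "alias python='python3'" >> /etc/bash.bashrc   # only read by interactive shells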
I create the cluster like this:
# gce_cluster_config, master_config and worker_config are dicts built elsewhere;
# GoogleCredentials comes from oauth2client.client, build from googleapiclient.discovery
cluster_config = {
    "projectId": self.project_id,
    "clusterName": cluster_name,
    "config": {
        "gceClusterConfig": gce_cluster_config,
        "masterConfig": master_config,
        "workerConfig": worker_config,
        # initializationActions is a flat list of actions, not a list of lists
        "initializationActions": [
            {
                "executableFile": executable_file_uri,
                "executionTimeout": execution_timeout,
            }
        ],
    }
}
credentials = GoogleCredentials.get_application_default()
api = build('dataproc', 'v1', credentials=credentials)
response = api.projects().regions().clusters().create(
    projectId=self.project_id,
    region=self.region,
    body=cluster_config,
).execute()
My executable_file_uri points to Google Storage; init.sh:
apt-get -y update
apt-get install -y python-dev
wget -O /root/get-pip.py https://bootstrap.pypa.io/get-pip.py
python /root/get-pip.py
apt-get install -y python-pip
pip install --upgrade pip
pip install --upgrade six
pip install --upgrade gcloud
pip install --upgrade requests
pip install numpy
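The init action has to be staged in Cloud Storage so Dataproc can fetch it when the nodes boot; a minimal sketch of that step, assuming a hypothetical bucket path:

# upload the script and pass the resulting gs:// URI as executable_file_uri above
gsutil cp init.sh gs://my-bucket/dataproc/init.sh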
小智 5
I found an answer here, so that my initialization script now looks like this:
#!/bin/bash
# Install tools
apt-get -y install python3 python-dev build-essential python3-pip
easy_install3 -U pip
# Install requirements
pip3 install --upgrade google-cloud==0.27.0
pip3 install --upgrade google-api-python-client==1.6.2
pip3 install --upgrade pytz==2013.7
# Setup python3 for Dataproc
echo "export PYSPARK_PYTHON=python3" | tee -a /etc/profile.d/spark_config.sh /etc/*bashrc /usr/lib/spark/conf/spark-env.sh
echo "export PYTHONHASHSEED=0" | tee -a /etc/profile.d/spark_config.sh /etc/*bashrc /usr/lib/spark/conf/spark-env.sh
echo "spark.executorEnv.PYTHONHASHSEED=0" >> /etc/spark/conf/spark-defaults.conf
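To confirm the exports actually took effect, one option is to SSH into the master node and check what PYSPARK_PYTHON resolves to (a sketch; the cluster name and zone below are placeholders). PYTHONHASHSEED is pinned to 0 because Python 3 randomizes string hashing per process, and Spark needs the hash to be consistent across the driver and executors.

# hypothetical check of the environment written by the init action
gcloud compute ssh my-cluster-m --zone=us-central1-a \
  --command='source /etc/profile.d/spark_config.sh && echo $PYSPARK_PYTHON && $PYSPARK_PYTHON --version'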