Tags: linux, amazon-web-services, apache-spark
I'm using Linux 18.04 and I want to run a Spark cluster on EC2.
I set the environment variables with the export command:
export AWS_ACCESS_KEY_ID=MyAccesskey
export AWS_SECRET_ACCESS_KEY=Mysecretkey
But when I run the command to launch the Spark cluster, I get:
ERROR: The environment variable AWS_ACCESS_KEY_ID must be set
I'm including all the commands I used, in case I made a mistake somewhere:
sudo mv ~/Downloads/keypair.pem /usr/local/spark/keypair.pem
sudo mv ~/Downloads/credentials.csv /usr/local/spark/credentials.csv
# Make sure the .pem file is readable by the current user.
chmod 400 "keypair.pem"
# Go into the spark directory and set the environment variables with the credentials information
cd spark
export AWS_ACCESS_KEY_ID=ACCESS_KEY_ID
export AWS_SECRET_ACCESS_KEY=SECRET_KEY
# To install Spark 2.0 on the cluster:
sudo spark-ec2/spark-ec2 -k keypair --identity-file=keypair.pem --region=us-west-2 --zone=us-west-2a --copy-aws-credentials --instance-type t2.micro --worker-instances 1 launch project-launch
I'm new to this, so any help is greatly appreciated.
Answered by 小智 (5 votes):
You can also retrieve the AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY values with the aws configure get subcommand:
AWS_ACCESS_KEY_ID=$(aws configure get aws_access_key_id)
AWS_SECRET_ACCESS_KEY=$(aws configure get aws_secret_access_key)
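The reason the exported variables seem to vanish is that sudo resets the environment by default, so the child process never sees them. A minimal sketch of the effect, using env -i to simulate sudo's cleaned environment (the variable name and sample value are taken from the post):

```shell
export AWS_ACCESS_KEY_ID=MyAccesskey
# env -i starts the child with an empty environment, much like sudo's
# default env_reset: the exported variable does not survive.
env -i sh -c 'echo "inherited: [$AWS_ACCESS_KEY_ID]"'
# prints: inherited: []
# A variable passed on the command line itself does survive:
env -i AWS_ACCESS_KEY_ID="$AWS_ACCESS_KEY_ID" sh -c 'echo "passed: [$AWS_ACCESS_KEY_ID]"'
# prints: passed: [MyAccesskey]
```

This is why the fix below puts the assignments on the sudo command line rather than relying on export.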
Since sudo resets the environment by default, pass the variables on the command line itself:
sudo AWS_ACCESS_KEY_ID=$(aws configure get aws_access_key_id) AWS_SECRET_ACCESS_KEY=$(aws configure get aws_secret_access_key) spark-ec2/spark-ec2 -k keypair --identity-file=keypair.pem --region=us-west-2 --zone=us-west-2a --copy-aws-credentials --instance-type t2.micro --worker-instances 1 launch project-launch
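Since you already downloaded credentials.csv, another option is to read the keys straight out of that file. A hedged sketch, assuming the common two-column console format with an "Access key ID,Secret access key" header and one data row (the sample values here are made up):

```shell
# Create a sample file in the assumed format (your real credentials.csv
# may have extra columns; adjust the cut field numbers if so).
cat > credentials.csv <<'EOF'
Access key ID,Secret access key
AKIAEXAMPLEKEY,exampleSecretValue
EOF

# tail -n 1 skips the header row; cut splits on the comma.
AWS_ACCESS_KEY_ID=$(tail -n 1 credentials.csv | cut -d, -f1)
AWS_SECRET_ACCESS_KEY=$(tail -n 1 credentials.csv | cut -d, -f2)
echo "$AWS_ACCESS_KEY_ID"
# prints: AKIAEXAMPLEKEY
```

These assignments can then be prefixed to the sudo spark-ec2 command the same way as the aws configure get version above.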