I am trying to deploy a Lambda function on my AWS account that zips files uploaded to the bucket "lambda-bucket-in" into the bucket "lambda-bucket-out". The example I am trying to follow is this one, specifically the second approach (the first approach works fine).

My Pulumi code is as follows:
import * as aws from "@pulumi/aws";
import * as pulumi from "@pulumi/pulumi";

const tpsReports = new aws.s3.Bucket("lambda-bucket-in");
const tpsZips = new aws.s3.Bucket("lambda-bucket-out");

// First, create some IAM machinery:
const zipFuncRole = new aws.iam.Role("zipTpsReportsFuncRole", {
    assumeRolePolicy: {
        Version: "2012-10-17",
        Statement: [{
            Action: "sts:AssumeRole",
            Principal: {
                Service: "lambda.amazonaws.com",
            },
            Effect: "Allow",
            Sid: "",
        }],
    },
});
new aws.iam.RolePolicyAttachment("zipTpsReportsFuncRoleAttach", {
    role: zipFuncRole,
    policyArn: aws.iam.ManagedPolicy.AWSLambdaExecute,
});

// Next, create the Lambda function itself:
const …
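For reference, the example's continuation (the part cut off above) looks roughly like the sketch below. This is only a sketch based on Pulumi's CallbackFunction / onObjectCreated pattern; the function name and the callback body are illustrative, not my exact code. It continues the file above, so it reuses zipFuncRole and tpsReports:

// Sketch (illustrative): the Lambda is a CallbackFunction whose code
// Pulumi packages and uploads automatically.
const zipFunc = new aws.lambda.CallbackFunction("zipTpsReportsFunc", {
    callback: async (e: aws.s3.BucketEvent) => {
        // For each uploaded object: download it from tpsReports, zip it,
        // and upload the archive to tpsZips (details elided here).
    },
    role: zipFuncRole,
});

// Finally, subscribe the function to object-created events on the input bucket:
tpsReports.onObjectCreated("zipTpsReports", zipFunc);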
I have Windows 10, and I followed this guide to install Spark and get it working on my OS, as long as I use the Jupyter Notebook tool. I used this command to instantiate the master and import the packages needed for my work:
pyspark --packages graphframes:graphframes:0.8.1-spark3.0-s_2.12 --master local[2]
However, I later noticed that no workers were being instantiated per the guide above, and my tasks were really slow. Therefore, taking inspiration from this, and since I could not find any other way to connect the workers to the cluster manager (due to the fact that it is run by Docker), I tried to set everything up manually with the following command:
bin\spark-class org.apache.spark.deploy.master.Master
The master was instantiated correctly, so I proceeded with the next command:
bin\spark-class org.apache.spark.deploy.worker.Worker spark://<master_ip>:<port> --host <IP_ADDR>
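For completeness, once the worker registered, my plan was to attach my pyspark session to this standalone master instead of local[2], along the lines of the command below (7077 is the standalone default port; the actual URL is whatever the master log prints):

pyspark --master spark://<master_ip>:7077 --packages graphframes:graphframes:0.8.1-spark3.0-s_2.12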
It returned the following error:
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
21/04/01 14:14:21 INFO Master: Started daemon with process name: 8168@DESKTOP-A7EPMQG
21/04/01 14:14:21 ERROR SparkUncaughtExceptionHandler: Uncaught exception in thread Thread[main,5,main]
java.lang.ExceptionInInitializerError
at org.apache.spark.unsafe.array.ByteArrayMethods.<clinit>(ByteArrayMethods.java:54)
at org.apache.spark.internal.config.package$.<init>(package.scala:1006)
at org.apache.spark.internal.config.package$.<clinit>(package.scala)
at org.apache.spark.deploy.master.MasterArguments.<init>(MasterArguments.scala:57)
at org.apache.spark.deploy.master.Master$.main(Master.scala:1123)
at org.apache.spark.deploy.master.Master.main(Master.scala)
Caused by: java.lang.reflect.InaccessibleObjectException: Unable to make private java.nio.DirectByteBuffer(long,int) accessible: module java.base does …