I am trying to add a jar to the Hive classpath using the add command below.
Command: hive> add jar myjar.jar
But every time I log in to Hive I have to add myjar.jar again with the add command. Is there any way to add it to the Hive classpath permanently?
Regards, Mohammed Niaz
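(Not part of the original question.) A commonly used way to avoid re-adding the jar in every session is to put the ADD JAR statement in a .hiverc file, which the Hive CLI runs at startup, or to register the jar via hive.aux.jars.path in hive-site.xml. A minimal .hiverc sketch, assuming a hypothetical jar location of /opt/hive/aux/myjar.jar:

-- contents of ~/.hiverc (hypothetical path); executed automatically when the Hive CLI starts
ADD JAR /opt/hive/aux/myjar.jar;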
I get an error when executing the following command.
oozie job -oozie http://localhost:11000/oozie -config coordinator.properties -run
Error: E0505 : E0505: App definition [hdfs://localhost:8020/tmp/oozie-app/coordinator/] does not exist
Any suggestions?
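(For illustration only, not from the original post.) E0505 means Oozie could not find a coordinator.xml under the directory named by oozie.coord.application.path. A minimal coordinator.properties sketch, assuming the coordinator definition has been uploaded to HDFS under /tmp/oozie-app/coordinator/:

nameNode=hdfs://localhost:8020
# Must point at an HDFS directory that actually contains coordinator.xml
oozie.coord.application.path=${nameNode}/tmp/oozie-app/coordinator

Uploading the definition first (for example with hdfs dfs -put coordinator.xml /tmp/oozie-app/coordinator/) and then re-running the oozie job command with this property set is the usual way to clear this error.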
The Dataflow job fails with the following exception when I pass the staging, temp, and output GCS bucket locations as parameters.
Java code:
final String[] used = Arrays.copyOf(args, args.length + 1);
used[used.length - 1] = "--project=OVERWRITTEN";   // append an extra --project argument
final T options = PipelineOptionsFactory.fromArgs(used).withValidation().as(clazz);
options.setProject(PROJECT_ID);                    // GCP project id
options.setStagingLocation("gs://abc/staging/");   // staging bucket
options.setTempLocation("gs://abc/temp");          // temp bucket
options.setRunner(DataflowRunner.class);
options.setGcpTempLocation("gs://abc");            // GCP temp location
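The snippet above references T, clazz and PROJECT_ID without declaring them. A minimal self-contained sketch of the same pattern, assuming a hypothetical options interface that extends DataflowPipelineOptions (names and the project id are placeholders, not from the original post):

import java.util.Arrays;

import org.apache.beam.runners.dataflow.DataflowRunner;
import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class OptionsSketch {

    // Hypothetical stand-in for the undeclared type parameter T / the clazz argument.
    public interface ValidationOptions extends DataflowPipelineOptions {
    }

    public static ValidationOptions buildOptions(String[] args) {
        // Append one extra argument, as in the original snippet.
        final String[] used = Arrays.copyOf(args, args.length + 1);
        used[used.length - 1] = "--project=OVERWRITTEN";

        final ValidationOptions options =
                PipelineOptionsFactory.fromArgs(used).withValidation().as(ValidationOptions.class);
        options.setProject("my-gcp-project");            // assumed project id
        options.setStagingLocation("gs://abc/staging/"); // staging bucket from the post
        options.setTempLocation("gs://abc/temp");        // temp bucket from the post
        options.setRunner(DataflowRunner.class);
        options.setGcpTempLocation("gs://abc");          // GCP temp location from the post
        return options;
    }
}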
Error:
INFO: Staging pipeline description to gs://ups-heat-dev-tmp/mniazstaging_ingest_validation/staging/
May 10, 2018 11:56:35 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <42088 bytes, hash E7urYrjAOjwy6_5H-UoUxA> to gs://ups-heat-dev-tmp/mniazstaging_ingest_validation/staging/pipeline-E7urYrjAOjwy6_5H-UoUxA.pb
Dataflow SDK version: 2.4.0
May 10, 2018 11:56:38 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Printed job specification to gs://ups-heat-dev-tmp/mniazstaging_ingest_validation/templates/DataValidationPipeline
May 10, 2018 11:56:40 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Template successfully created. …