I installed apache-spark via Homebrew:
brew install apache-spark
Then I ran:
spark-shell
and it printed the following warnings:
WARNING: An illegal reflective access operation has occurred
WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/usr/local/Cellar/apache-spark/3.2.0/libexec/jars/spark-unsafe_2.12-3.2.0.jar) to constructor java.nio.DirectByteBuffer(long,int)
WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
WARNING: All illegal access operations will be denied in a future release
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
22/01/10 18:21:03 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Spark context Web UI available at http://ip-192-168-1-176.ec2.internal:4040
Spark context available as 'sc' (master = local[*], app id = local-1641860464071).
Spark session available as 'spark'.
Welcome to
____ __
/ __/__ ___ _____/ /__
_\ \/ _ \/ _ `/ __/ '_/
/___/ .__/\_,_/_/ /_/\_\ version 3.2.0
/_/
Using Scala version 2.12.15 (OpenJDK 64-Bit Server VM, Java 11.0.12)
Type in expressions to have them evaluated.
Type :help for more information.
Here is my java --version output:
openjdk 11.0.12 2021-07-20
OpenJDK Runtime Environment Homebrew (build 11.0.12+0)
OpenJDK 64-Bit Server VM Homebrew (build 11.0.12+0, mixed mode)
How can I resolve these warnings?
Answer (小智, 7 votes):
Two issues need to be addressed to eliminate these warnings:
Java 11 used together with Spark 3.2 produces these warnings. The first step is to switch to Java 8 until the issue is resolved.
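A minimal sketch of switching to Java 8 on macOS with Homebrew. The cask name is an assumption (check `brew search temurin` or `brew search openjdk` for what is available on your system):

```shell
# Install a Java 8 distribution via Homebrew (cask name is an assumption;
# verify with `brew search temurin` first)
brew install --cask temurin@8

# Point JAVA_HOME at the Java 8 install for the current shell session;
# /usr/libexec/java_home is the standard macOS JVM locator
export JAVA_HOME="$(/usr/libexec/java_home -v 1.8)"

# Confirm the switch took effect before relaunching spark-shell
java -version
```

To make the change permanent, put the `export JAVA_HOME=...` line in your shell profile.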
Even with Java 8, I got the same warnings when using brew. Instead of brew, download Spark directly from the Apache site at https://www.apache.org/dyn/closer.lua/spark/spark-3.2.0/spark-3.2.0-bin-hadoop3.2.tgz and extract the archive. To set your PATH variable so that you can type "spark-shell", open your .bashrc, .profile, or .zshrc (whichever you use), add "export SPARK_HOME=path-to-spark-folder", and then add $SPARK_HOME/bin to your PATH variable.
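The setup described above could look like this in your shell profile; the extraction location under $HOME is an assumption, so substitute wherever you actually unpacked the tarball:

```shell
# Extract the tarball downloaded from the Apache link above
tar -xzf spark-3.2.0-bin-hadoop3.2.tgz -C "$HOME"

# In ~/.zshrc (or ~/.bashrc / ~/.profile): point SPARK_HOME at the
# extracted folder and put its bin/ directory on PATH so that
# `spark-shell` resolves to this build rather than the brew one
export SPARK_HOME="$HOME/spark-3.2.0-bin-hadoop3.2"
export PATH="$SPARK_HOME/bin:$PATH"
```

After reloading the profile (e.g. `source ~/.zshrc`), running `spark-shell` should start the downloaded build.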