I need to be able to run Spark on my local machine to access azure wasb and adl urls, but I can't get it to work. I have a stripped-down example here:
Maven pom.xml (brand-new pom, only the dependencies are set up):
<dependencies>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.3.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.3.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>2.8.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-azure-datalake</artifactId>
        <version>3.1.0</version>
    </dependency>
    <dependency>
        <groupId>com.microsoft.azure</groupId>
        <artifactId>azure-storage</artifactId>
        <version>6.0.0</version>
    </dependency>
    <dependency>
        <groupId>com.microsoft.azure</groupId>
        <artifactId>azure-data-lake-store-sdk</artifactId>
        <version>2.2.3</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-azure</artifactId>
        <version>3.1.0</version>
    </dependency>
    <dependency>
        <groupId>com.microsoft.azure</groupId>
        <artifactId>azure-storage</artifactId>
        <version>7.0.0</version>
    </dependency>
</dependencies>
Java code (it doesn't need to be Java - it could be Scala):
import org.apache.spark.SparkConf;
import org.apache.spark.SparkContext;
import org.apache.spark.sql.SparkSession;
public class App {
    public static void main(String[] args) {
        SparkConf config = new SparkConf();
        config.setMaster("local");
        config.setAppName("app");
        SparkSession spark = new SparkSession(new SparkContext(config));
        spark.read().parquet("wasb://container@host/path");
        spark.read().parquet("adl://host/path");
    }
}
No matter what I try, I get:
Exception in thread "main" java.io.IOException: No FileSystem for scheme: wasb
The same happens for adl. Every piece of documentation I can find either just says to add the azure-storage dependency, which I have already done, or says to use HDInsight.
Any ideas?
I figured this out and decided to post a working project, since that is what I was looking for all along. It is hosted here:
The crux of it, though, is as @Shankar Koirala suggested:
For WASB, set the property that allows the url scheme to be recognized:
config.set("spark.hadoop.fs.wasb.impl", "org.apache.hadoop.fs.azure.NativeAzureFileSystem");
Then set the property that authorizes access to the account. You need one of these for each account you need to access. These are generated through the Azure portal under the Access Keys section of the Storage Account blade.
config.set("fs.azure.account.key.[storage-account-name].blob.core.windows.net", "[access-key]");
Now for adl, assign the fs scheme just as with WASB:
config.set("spark.hadoop.fs.adl.impl", "org.apache.hadoop.fs.adl.AdlFileSystem");
// I don't know why this would be needed, but I saw it
// on an otherwise very helpful page . . .
config.set("spark.fs.AbstractFileSystem.adl.impl", "org.apache.hadoop.fs.adl.Adl");
. . . and finally, set the client access keys in these properties, again for each different account you need to access:
config.set("fs.adl.oauth2.access.token.provider.type", "ClientCredential");
/* Client ID is generally the application ID from the azure portal app registrations*/
config.set("fs.adl.oauth2.client.id", "[client-id]");
/*The client secret is the key generated through the portal*/
config.set("fs.adl.oauth2.credential", "[client-secret]");
/*This is the OAUTH 2.0 TOKEN ENDPOINT under the ENDPOINTS section of the app registrations under Azure Active Directory*/
config.set("fs.adl.oauth2.refresh.url", "[oauth-2.0-token-endpoint]");
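Putting the fragments above together, here is a minimal end-to-end sketch in the shape of the question's single class. It is only a sketch: the bracketed account names, keys, container, and paths are placeholders (not values from the original post), and the full wasb/adl host forms are the standard Azure ones.
import org.apache.spark.SparkConf;
import org.apache.spark.sql.SparkSession;

public class AzureApp {
    public static void main(String[] args) {
        SparkConf config = new SparkConf();
        config.setMaster("local");
        config.setAppName("app");

        // WASB: register the file system implementation for the wasb:// scheme
        // and the access key for the storage account.
        config.set("spark.hadoop.fs.wasb.impl", "org.apache.hadoop.fs.azure.NativeAzureFileSystem");
        config.set("fs.azure.account.key.[storage-account-name].blob.core.windows.net", "[access-key]");

        // ADL: register the file system implementation for the adl:// scheme
        // and the service principal (app registration) credentials.
        config.set("spark.hadoop.fs.adl.impl", "org.apache.hadoop.fs.adl.AdlFileSystem");
        config.set("spark.fs.AbstractFileSystem.adl.impl", "org.apache.hadoop.fs.adl.Adl");
        config.set("fs.adl.oauth2.access.token.provider.type", "ClientCredential");
        config.set("fs.adl.oauth2.client.id", "[client-id]");
        config.set("fs.adl.oauth2.credential", "[client-secret]");
        config.set("fs.adl.oauth2.refresh.url", "[oauth-2.0-token-endpoint]");

        SparkSession spark = SparkSession.builder().config(config).getOrCreate();

        // The bracketed parts are placeholders for your own container, accounts, and paths.
        spark.read().parquet("wasb://[container]@[storage-account-name].blob.core.windows.net/[path]");
        spark.read().parquet("adl://[data-lake-account-name].azuredatalakestore.net/[path]");

        spark.stop();
    }
}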
I hope this is helpful, and I wish I could give credit to Shankar for the answer, but I also wanted to get the exact details out there.
I'm not sure about adl, as I haven't tested it, but for wasb you need to define the file system to be used in the underlying Hadoop configuration.
Since you are using Spark 2.3, you can use the SparkSession builder to create an entry point as:
val spark = SparkSession.builder().appName("read from azure storage").master("local[*]").getOrCreate()
Now define the file system:
spark.sparkContext.hadoopConfiguration.set("fs.azure", "org.apache.hadoop.fs.azure.NativeAzureFileSystem")
spark.sparkContext.hadoopConfiguration.set("fs.azure.account.key.yourAccount.blob.core.windows.net", "yourKey ")
Now read the parquet file as:
val baseDir = "wasb[s]://BlobStorageContainer@yourUser.blob.core.windows.net/"
val dfParquet = spark.read.parquet(baseDir + "pathToParquetFile")
Hope this helps!
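Since the question itself is in Java, here is a sketch of the same approach from Java, applying the settings through the SparkContext's hadoopConfiguration; yourAccount, yourKey, the container name, and the parquet path are placeholders.
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class WasbReadExample {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("read from azure storage")
                .master("local[*]")
                .getOrCreate();

        // Define the wasb file system and the account key in the underlying Hadoop configuration.
        spark.sparkContext().hadoopConfiguration()
                .set("fs.azure", "org.apache.hadoop.fs.azure.NativeAzureFileSystem");
        spark.sparkContext().hadoopConfiguration()
                .set("fs.azure.account.key.yourAccount.blob.core.windows.net", "yourKey");

        // Use wasbs:// instead of wasb:// to connect over https.
        String baseDir = "wasb://BlobStorageContainer@yourAccount.blob.core.windows.net/";
        Dataset<Row> dfParquet = spark.read().parquet(baseDir + "pathToParquetFile");
        dfParquet.show();
    }
}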