
TypeError: 'JavaPackage' object is not callable

When I call the Spark SQL API hiveContext.sql() with the code below:

from pyspark import SparkConf, SparkContext
from pyspark.sql import SQLContext, HiveContext

conf = SparkConf().setAppName("spark_sql")

sc = SparkContext(conf=conf)
hc = HiveContext(sc)

#rdd = sc.textFile("test.txt")
sqlContext = SQLContext(sc)
res = hc.sql("use teg_uee_app")   # the error below is raised on this call
#for each in res.collect():
#    print(each[0])
sc.stop()

I get the following error:

enFile "spark_sql.py", line 23, in <module>
res = hc.sql("use teg_uee_app")
File "/spark/python/pyspark/sql/context.py", line 580, in sql
    return DataFrame(self._ssql_ctx.sql(sqlQuery), self)
File "/spark/python/pyspark/sql/context.py", line 683, in _ssql_ctx
    self._scala_HiveContext = self._get_hive_ctx()
File "/spark/python/pyspark/sql/context.py", line 692, in _get_hive_ctx
return self._jvm.HiveContext(self._jsc.sc())
  TypeError: 'JavaPackage' object is …
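For context on the last frame: self._jvm.HiveContext is resolved through the py4j gateway, which hands back a JavaClass when the corresponding class can be loaded on the JVM side and a bare JavaPackage when it cannot, and calling a JavaPackage produces exactly this TypeError. Below is a minimal diagnostic sketch, not a fix, assuming a Spark 1.x install where the gateway is reachable through the internal sc._jvm handle; the app name hive_ctx_check is just a placeholder:

from pyspark import SparkConf, SparkContext

conf = SparkConf().setAppName("hive_ctx_check")  # placeholder name
sc = SparkContext(conf=conf)

# sc._jvm is PySpark's internal py4j gateway view. py4j returns a JavaClass
# if org.apache.spark.sql.hive.HiveContext is loadable on the JVM classpath,
# and a bare JavaPackage if it is not (e.g. a Spark build without Hive
# support), which matches the "'JavaPackage' object is not callable" error.
print(type(sc._jvm.org.apache.spark.sql.hive.HiveContext))

sc.stop()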

apache-spark-sql pyspark

8 votes · 1 answer · 10k views
