Posts by som*_*rti

FAILED: Error in metadata: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

I shut down my HDFS client while the HDFS and Hive instances were running. Now, when I go back into Hive, I cannot run any of my DDL tasks, such as "show tables" or "describe tablename". It gives me the following error:

ERROR exec.Task (SessionState.java:printError(401)) - FAILED: Error in metadata: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient

Can anyone suggest what I need to do to get my metastore_db instantiated again without recreating the tables? Otherwise I will have to repeat the work of creating the entire database/schema all over again.
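For the common case where Hive is using its default embedded Derby metastore, the usual cause of this error after an unclean shutdown is a stale Derby lock file; removing the locks (not the database) brings the metastore back with all tables intact. A minimal sketch, simulated here on a scratch directory — the real metastore_db lives in whatever working directory Hive was started from, unless javax.jdo.option.ConnectionURL points elsewhere:

```shell
# Simulated metastore_db with the stale Derby lock files an unclean
# shutdown can leave behind (db.lck / dbex.lck are the actual file names).
mkdir -p metastore_db
touch metastore_db/db.lck metastore_db/dbex.lck

# The recovery: remove only the lock files; schema and data stay intact.
rm -f metastore_db/*.lck

ls metastore_db   # locks gone, directory (and its contents) preserved
```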

hadoop hive hql hdfs

Score: 3 · Answers: 2 · Views: 30k

pngfix.c:2151: undefined reference to `inflateReset2'

My platform:

CentOS 6.x, Matplotlib 1.3.1, NumPy 1.8.0, SciPy 0.14.0.dev-bb608ba

I am trying to install libpng-1.6.6 in order to display .png files, but it fails when I try to make it, giving the error below.

Note: I have already successfully installed zlib (as well as freetype2), which should be what the error is pointing at.

pngfix.o: In function `zlib_reset':
/usr/lib/hue/libpng-1.6.6/contrib/tools/pngfix.c:2151: undefined reference to `inflateReset2'
collect2: ld returned 1 exit status
make[1]: *** [pngfix] Error 1
make[1]: Leaving directory `/usr/lib/hue/libpng-1.6.6'
make: *** [all] Error 2

See the link to my original thread, matplotlib-pyplot-does-not-show-output-no-error.

I checked line 2151 of pngfix.c. It is in the zlib_reset function, which has to do with rc settings. Does this point towards changing some matplotlibrc settings?

   2136 zlib_reset(struct zlib *zlib, int window_bits)
   2137    /* Reinitializes a zlib with a different window_bits */
   2138 {
   2139    assert(zlib->state >= 0); /* initialized by zlib_init */
   2140
   2141    zlib->z.next_in = Z_NULL;
   2142    zlib->z.avail_in = 0;
   2143    zlib->z.next_out = Z_NULL; …
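For reference, inflateReset2() only exists from zlib 1.2.4 onwards, while CentOS 6 ships zlib 1.2.3, so this link error usually means pngfix is being linked against the old system zlib rather than a newly installed one; it has nothing to do with matplotlibrc. A hedged build sketch, assuming a newer zlib was installed under /usr/local:

```shell
# Assumption: a zlib >= 1.2.4 is installed under /usr/local.
# Point the libpng build at it instead of the CentOS 6 system zlib (1.2.3),
# which predates inflateReset2 (added in zlib 1.2.4).
cd libpng-1.6.6
./configure CPPFLAGS="-I/usr/local/include" \
            LDFLAGS="-L/usr/local/lib -Wl,-rpath,/usr/local/lib"
make
```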

png matplotlib libpng

Score: 3 · Answers: 2 · Views: 5697

How to call a method of a defined class from a template class method

How should I call a method of an already-defined class from within a template class method? Here is my scenario:

  1. The template class

    template <class T>
    class TC {
        void myTemplateMethod() {
            T.myMethod();  //can I call like this ?
        }
    }; 
    
  2. The defined class

    class tdef {
        void myMethod() { 
            //does something
        }
    };
    
  3. Main

    int main()  {
        TC<tdef> tobj;
        tobj.myTemplateMethod(); //can I call tdef.myMethod() like this?
    }
    

Please note that I have debugged code like this, and found that tdef.myMethod() does not work when called this way. Also, is there any chance of some exception going unhandled when calling tdef.myMethod() from the template class method?

-Somnath

c++ templates

Score: 2 · Answers: 1 · Views: 145

Java HashMap containsKey returns false although the key exists

I am using a HashMap data structure to store a SqMatrix (square matrix), where the keys are of type MatrixIndex (containing row and col) and the values are of type Integer.

But `if (mat.containsKey(key))` comes out false, even though the corresponding key is present in the HashMap.

Main code:

public static void main(String[] args) {

    Random generator = new Random();
    int val = 0;
    Types.MatrixIndex key, key1;
    int matSz = (int) Math.floor(Math.sqrt(10));
    Types.SqMatrix mat = new Types().new SqMatrix(matSz); //matSz*matSz elements
    //HashMap<Types.MatrixIndex,Integer> hMap= new HashMap<Types.MatrixIndex,Integer>(10);
    for (int r=0; r<matSz; r++) {
        for (int c=0; c<matSz; c++) {
            if (r<c) {
                val = generator.nextInt(2) > 0? -1 : val;
                key =(new Types()).new MatrixIndex(r, c);
                key1 = (new Types()).new MatrixIndex(c, r);
                mat.put(key, val);
                mat.put(key1, val);
                generator.setSeed(System.currentTimeMillis());
            }
        }
    } …
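The usual cause of containsKey returning false for a key that "is there" is that the key class does not override equals() and hashCode(), so two MatrixIndex objects holding the same (row, col) are treated as different keys. A minimal sketch with a standalone MatrixIndex — an assumption about what Types.MatrixIndex looks like, since its source is not shown:

```java
// containsKey() relies on hashCode() and equals(); without overriding both,
// two MatrixIndex objects holding the same (row, col) are distinct keys.
import java.util.HashMap;
import java.util.Map;

public class MatrixIndexDemo {
    static final class MatrixIndex {
        final int row, col;
        MatrixIndex(int row, int col) { this.row = row; this.col = col; }

        @Override public boolean equals(Object o) {
            if (!(o instanceof MatrixIndex)) return false;
            MatrixIndex m = (MatrixIndex) o;
            return row == m.row && col == m.col;
        }
        @Override public int hashCode() { return 31 * row + col; }
    }

    public static void main(String[] args) {
        Map<MatrixIndex, Integer> mat = new HashMap<>();
        mat.put(new MatrixIndex(1, 2), 7);
        // True only because equals()/hashCode() are overridden:
        System.out.println(mat.containsKey(new MatrixIndex(1, 2)));  // prints true
    }
}
```

With the two overrides removed, the same lookup prints false, which matches the symptom in the question.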

java

Score: 2 · Answers: 1 · Views: 4804

Why does start-all.sh from root result in "failed to launch org.apache.spark.deploy.master.Master: JAVA_HOME is not set"?

I am trying to execute a Spark application built through Scala IDE against the standalone Spark service running on the Cloudera QuickStart VM 5.3.0.

My cloudera account's JAVA_HOME is /usr/java/default.

However, I am facing the following error when I execute the start-all.sh command as the cloudera user:

[cloudera@localhost sbin]$ pwd
/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin
[cloudera@localhost sbin]$ ./start-all.sh
chown: changing ownership of `/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs': Operation not permitted
starting org.apache.spark.deploy.master.Master, logging to /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs/spark-cloudera-org.apache.spark.deploy.master.Master-1-localhost.localdomain.out
/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/spark-daemon.sh: line 151: /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs/spark-cloudera-org.apache.spark.deploy.master.Master-1-localhost.localdomain.out: Permission denied
failed to launch org.apache.spark.deploy.master.Master:
tail: cannot open `/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs/spark-cloudera-org.apache.spark.deploy.master.Master-1-localhost.localdomain.out' for reading: No such file or directory
full log in /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/../logs/spark-cloudera-org.apache.spark.deploy.master.Master-1-localhost.localdomain.out
cloudera@localhost's password: 
localhost: chown: changing ownership of `/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/logs': Operation not permitted
localhost: starting org.apache.spark.deploy.worker.Worker, logging to /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/logs/spark-cloudera-org.apache.spark.deploy.worker.Worker-1-localhost.localdomain.out
localhost: /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/sbin/spark-daemon.sh: line 151: /opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark/logs/spark-cloudera-org.apache.spark.deploy.worker.Worker-1-localhost.localdomain.out: Permission denied
localhost: …
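Two separate problems are visible in this log: the logs directory is owned by another user (the chown / Permission denied lines), and the daemon scripts read JAVA_HOME from Spark's own spark-env.sh rather than the calling user's shell profile. A hedged sketch of the usual remedy — the paths below are assumptions for this parcel layout, not a verified fix:

```shell
# Assumptions: SPARK_HOME points at the CDH parcel's spark directory and
# the cloudera user should own the daemon logs.
export SPARK_HOME=/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark

# Make the logs directory writable by the user starting the daemons:
sudo chown -R cloudera:cloudera "$SPARK_HOME/logs"

# Persist JAVA_HOME where spark-daemon.sh will actually see it:
echo 'export JAVA_HOME=/usr/java/default' >> "$SPARK_HOME/conf/spark-env.sh"

"$SPARK_HOME/sbin/start-all.sh"
```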

java scala cloudera apache-spark

Score: 2 · Answers: 1 · Views: 4739

SyntaxError: invalid syntax

I am using Python 2.6.6 and am a newbie. I am getting

  File "./factorizer.py", line 35
    return {n: factorize_naive(n) for n in nums}
                                    ^
SyntaxError: invalid syntax

at the for statement in the return. It is supposed to return a dictionary with each number as a key and its list of prime factors as the value. Where am I going wrong?
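Dict-comprehension syntax (`{key: value for ...}`) was only added in Python 2.7, so on 2.6 that line is a SyntaxError regardless of the surrounding code; the 2.6-compatible spelling passes a generator of (key, value) pairs to dict(). A sketch with a minimal stand-in for factorize_naive, since the real function is not shown in the question:

```python
# {n: f(n) for n in nums} needs Python 2.7+; dict((n, f(n)) for n in nums)
# is the equivalent that also works on 2.6.

def factorize_naive(n):
    """Minimal stand-in: prime factors of n by trial division."""
    factors, d = [], 2
    while n > 1:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    return factors

def factorize_list(nums):
    # return {n: factorize_naive(n) for n in nums}   # SyntaxError on 2.6
    return dict((n, factorize_naive(n)) for n in nums)

print(factorize_list([12, 7]))  # {12: [2, 2, 3], 7: [7]} (key order may vary)
```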

python python-2.6

Score: 1 · Answers: 1 · Views: 115

java.io.IOException: No FileSystem for scheme: hdfs

I am using the Cloudera QuickStart VM CDH 5.3.0 (in terms of parcels) with Spark 1.2.0 ($SPARK_HOME=/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/spark), and I am submitting my Spark application using the command

./bin/spark-submit --class <Spark_App_Main_Class_Name> --master spark://localhost.localdomain:7077 --deploy-mode client --executor-memory 4G ../apps/<Spark_App_Target_Jar_Name>.jar

Spark_App_Main_Class_Name.scala

import org.apache.spark.SparkContext
import org.apache.spark.SparkConf
import org.apache.spark.mllib.util.MLUtils


object Spark_App_Main_Class_Name {

    def main(args: Array[String]) {
        val hConf = new SparkConf()
            .set("fs.hdfs.impl", classOf[org.apache.hadoop.hdfs.DistributedFileSystem].getName)
            .set("fs.file.impl", classOf[org.apache.hadoop.fs.LocalFileSystem].getName)
        val sc = new SparkContext(hConf)
        val data = MLUtils.loadLibSVMFile(sc, "hdfs://localhost.localdomain:8020/analytics/data/mllib/sample_libsvm_data.txt")
        ...
    }

}

But I am getting a ClassNotFoundException for org.apache.hadoop.hdfs.DistributedFileSystem when submitting the application in client mode:

[cloudera@localhost bin]$ ./spark-submit --class Spark_App_Main_Class_Name --master spark://localhost.localdomain:7077 --deploy-mode client --executor-memory 4G ../apps/Spark_App_Target_Jar_Name.jar
15/11/30 09:46:34 INFO SparkContext: Spark configuration:
spark.app.name=Spark_App_Main_Class_Name
spark.driver.extraLibraryPath=/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/hadoop/lib/native
spark.eventLog.dir=hdfs://localhost.localdomain:8020/user/spark/applicationHistory
spark.eventLog.enabled=true
spark.executor.extraLibraryPath=/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/hadoop/lib/native …
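One common reading of this failure on CDH parcels is that the HDFS client classes are simply not on the driver's classpath, so DistributedFileSystem cannot be loaded no matter what fs.hdfs.impl is set to in code. A hedged sketch of submitting with the Hadoop client jars added explicitly — the parcel path and the use of a classpath wildcard are assumptions, not a verified layout:

```shell
# Assumption: the CDH 5.3 parcel keeps the Hadoop client jars here.
HADOOP_LIB=/opt/cloudera/parcels/CDH-5.3.0-1.cdh5.3.0.p0.30/lib/hadoop

./bin/spark-submit \
  --class Spark_App_Main_Class_Name \
  --master spark://localhost.localdomain:7077 \
  --deploy-mode client \
  --executor-memory 4G \
  --driver-class-path "$HADOOP_LIB/client/*" \
  ../apps/Spark_App_Target_Jar_Name.jar
```

With the jars on the classpath, the two `.set("fs.*.impl", ...)` lines in the SparkConf should no longer be needed.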

hadoop scala hdfs apache-spark apache-spark-mllib

Score: 1 · Answers: 1 · Views: 20k

How to extend a generic class with a bounded type parameter

I am trying to extend a generic class that has a bounded type parameter with another generic class whose type parameter conforms to the super class's type parameter.

The super generic class with an upper-bounded type parameter:

public abstract class Cage<T extends Animal> { 
    protected Set<T> cage = new HashSet<T>();
    public abstract void add(T animal);
    public void showAnimals() {
        System.out.println(cage);
    }
}

I want to create a subclass of the generic class with a specific bounded type, for example Lion.

I tried the code below, but I get the errors "The type parameter Lion is hiding the type Lion" and "Syntax error on token "extends", expected".

For the add() method in the LionCage class, I get the error "The method add(Lion) of type LionCage must override or implement a supertype method".

What I mean by the LionCage class is Cage<Lion extends Animal>:

public class LionCage<Lion extends Animal> extends Cage<T extends Animal> {
    @Override
    public void add(Lion l) {
        cage.add(l);

    }
}

My Animal class and its subclasses Lion, Rat, etc. are defined in Animal.java:

public class Animal {
    public String toString() {
        return getClass().getSimpleName();
    }   

}

class Rat extends Animal {}
class Lion …
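For contrast with the question's code, a bound like `extends Animal` belongs where a type parameter is declared, not where it is used; pinning the subclass to one concrete type means extending `Cage<Lion>` with no type parameter on LionCage at all. A compilable sketch using the question's own classes:

```java
// The bound goes on the declaration (class Cage<T extends Animal>); a
// subclass fixing T to Lion just writes the concrete type argument.
import java.util.HashSet;
import java.util.Set;

class Animal {
    @Override public String toString() { return getClass().getSimpleName(); }
}
class Lion extends Animal {}

abstract class Cage<T extends Animal> {
    protected Set<T> cage = new HashSet<T>();
    public abstract void add(T animal);
    public void showAnimals() { System.out.println(cage); }
}

// Non-generic subclass pinned to Lion -- no "<Lion extends Animal>" here:
class LionCage extends Cage<Lion> {
    @Override public void add(Lion l) { cage.add(l); }
}

public class CageDemo {
    public static void main(String[] args) {
        LionCage lc = new LionCage();
        lc.add(new Lion());
        lc.showAnimals();  // prints [Lion]
    }
}
```

Writing `class LionCage<Lion extends Animal>` instead declares a new type *parameter* named Lion that shadows the Lion class, which is exactly the "hiding" error the compiler reports.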

java generics collections types

Score: 1 · Answers: 1 · Views: 96