Posted by kri*_*ang

Solr 4.1 DataImportHandler ClassNotFoundException

I have been trying to set up the DataImportHandler (Solr 4.1) by following the tutorial, and I have tried the solutions suggested in earlier posts, such as configuring DIH in multicore Solr and adding the dataimport jar to the classpath, but the error persists. Is there any way to fix this?

Here is the full exception stack trace:

SEVERE: Unable to create core: collection1
org.apache.solr.common.SolrException: RequestHandler init failure
at org.apache.solr.core.SolrCore.<init>(SolrCore.java:794)
at org.apache.solr.core.SolrCore.<init>(SolrCore.java:607)
at org.apache.solr.core.CoreContainer.createFromLocal(CoreContainer.java:1003)
at org.apache.solr.core.CoreContainer.create(CoreContainer.java:1033)
at org.apache.solr.core.CoreContainer$3.call(CoreContainer.java:629)
at org.apache.solr.core.CoreContainer$3.call(CoreContainer.java:624)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:439)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
at java.util.concurrent.FutureTask.run(FutureTask.java:138)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
at java.lang.Thread.run(Thread.java:680)
Caused by: org.apache.solr.common.SolrException: RequestHandler init failure
at org.apache.solr.core.RequestHandlers.initHandlersFromConfig(RequestHandlers.java:168)
at org.apache.solr.core.SolrCore.<init>(SolrCore.java:731)
... 13 more
Caused by: org.apache.solr.common.SolrException: Error loading class 'org.apache.solr.handler.dataimport.DataImportHandler'
at org.apache.solr.core.SolrResourceLoader.findClass(SolrResourceLoader.java:438)
at org.apache.solr.core.SolrCore.createInstance(SolrCore.java:507)
at org.apache.solr.core.SolrCore.createRequestHandler(SolrCore.java:581)
at org.apache.solr.core.RequestHandlers.initHandlersFromConfig(RequestHandlers.java:154)
... 14 more
Caused by: java.lang.ClassNotFoundException: …
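A common cause of this error is that the DIH jars are never picked up by the core's resource loader. A minimal sketch of the usual fix, assuming the stock Solr 4.1 distribution layout (the `dir` path is illustrative and must match where the jars actually live relative to the core's instanceDir), is to add `<lib>` directives to `solrconfig.xml`:

```xml
<!-- In the core's solrconfig.xml (e.g. collection1/conf/solrconfig.xml).
     dir is resolved relative to the core's instanceDir; adjust for your layout.
     The regex matches solr-dataimporthandler-4.1.0.jar and the -extras jar. -->
<lib dir="../../dist/" regex="solr-dataimporthandler-.*\.jar" />
```

After adding the directive, restart Solr and check the startup log to confirm the jars are listed as loaded before the core initializes.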

solr dataimporthandler

11 votes · 1 answer · 30k views

Saving garbage collection logs to ${yarn.nodemanager.log-dirs}/application_${appid}/container_${contid} for mappers and reducers on Hadoop YARN

I am trying to log garbage collection metrics for my mappers and reducers, but I cannot get the logs to end up under the path:
${yarn.nodemanager.log-dirs}/application_${appid}/container_${contid}

Below are the relevant properties from my mapred-site.xml:

<property>
  <name>mapreduce.map.java.opts</name>
  <value>-Xloggc:${yarn.nodemanager.log-dirs}/application_${appid}/container_${contid}/gc-@taskid@.log -verbose:gc -XX:+PrintGC -XX:+PrintGCDetails -XX:+PrintGCDateStamps -XX:+PrintCommandLineFlags</value>
</property>
<property>
  <name>mapreduce.reduce.java.opts</name>
  <value>-Xloggc:${yarn.nodemanager.log-dirs}/application_${appid}/container_${contid}/gc-@taskid@.log -verbose:gc -XX:+PrintGC -XX:+PrintGCDetails -XX:+PrintGCDateStamps -XX:+PrintCommandLineFlags</value>
</property>

However, despite the configuration above, the logs do not appear in the expected location. Any insight into this problem would be greatly appreciated.
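One likely explanation: `${yarn.nodemanager.log-dirs}` is a NodeManager-side setting and is not variable-expanded inside task JVM options, so the `-Xloggc` path is passed through literally. YARN does, however, substitute the literal token `<LOG_DIR>` in container launch commands with that container's own log directory. A hedged sketch of the alternative (note the token must be XML-escaped inside the property value):

```xml
<!-- Sketch: YARN replaces <LOG_DIR> with the container's log directory
     at launch time, so the GC log lands next to stdout/stderr/syslog. -->
<property>
  <name>mapreduce.map.java.opts</name>
  <value>-Xloggc:&lt;LOG_DIR&gt;/gc-@taskid@.log -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCDateStamps</value>
</property>
<property>
  <name>mapreduce.reduce.java.opts</name>
  <value>-Xloggc:&lt;LOG_DIR&gt;/gc-@taskid@.log -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCDateStamps</value>
</property>
```

With this in place the GC log for each task attempt should appear under the same container log directory that holds its syslog, i.e. the `application_*/container_*` path the question asks about.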

java garbage-collection hadoop mapreduce hadoop-yarn

9 votes · 1 answer · 389 views

使用scopt OptionParser和Spark时的NoClassDefFoundError

I am using Apache Spark 1.2.1 with Scala 2.10.4. I am trying to get the MovieLensALS example to work, but I run into an error with the scopt library, which the code requires. Any help would be appreciated. My build.sbt is as follows:

name := "Movie Recommender System"

version := "1.0"

scalaVersion := "2.10.4"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.1"

libraryDependencies += "org.apache.spark" %% "spark-graphx" % "1.2.1"

libraryDependencies += "org.apache.spark"  % "spark-mllib_2.10" % "1.2.1"

libraryDependencies += "com.github.scopt" %% "scopt" % "3.2.0"

resolvers += Resolver.sonatypeRepo("public")

The error I get is as follows:

   Exception in thread "main" java.lang.NoClassDefFoundError: scopt/OptionParser
    at MovieLensALS.main(MovieLensALS.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)
    at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:358)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

    Caused by: java.lang.ClassNotFoundException: scopt.OptionParser
    at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
    at java.security.AccessController.doPrivileged(Native …
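The usual cause here: `sbt package` builds a jar containing only your own classes, so scopt is absent from the classpath when `spark-submit` runs the driver. One common remedy (a sketch, assuming the sbt-assembly plugin; the plugin version is illustrative) is to build a fat jar that bundles the dependencies:

```scala
// project/plugins.sbt — add the sbt-assembly plugin (version illustrative)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.11.2")
```

Then run `sbt assembly` and pass the resulting `...-assembly-1.0.jar` to `spark-submit` instead of the plain packaged jar. Alternatively, keep the thin jar and hand the scopt jar to `spark-submit` via its `--jars` option so it is shipped to the driver and executors.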

scala noclassdeffounderror apache-spark scopt

5 votes · 1 answer · 3521 views

Output of the following C program

What should the output of this C program be?

#include<stdio.h>
int main(){
  int x,y,z;
  x=y=z=1;
  z = ++x || ++y && ++z;
  printf("x=%d y=%d z=%d\n",x,y,z);
  return 0;
}

The given output is: x=2 y=1 z=1
I understand the value of x, but I don't see why the values of y and z are not incremented.

c

3 votes · 1 answer · 1140 views

Column sum in Torch

How do I sum over the columns of a tensor in Torch? I have a 128×1024 tensor and I want to get a 1×1024 tensor by summing over all the rows.

For example, given a:

1 2 3
4 5 6

I want b to be:

5 7 9
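In (Lua) Torch, `torch.sum` takes a dimension argument; summing along dimension 1 collapses the rows, leaving one row of per-column sums. A minimal sketch for the example above:

```lua
a = torch.Tensor{{1, 2, 3}, {4, 5, 6}}
b = torch.sum(a, 1)   -- collapse dimension 1 (the rows); b is a 1x3 tensor: 5 7 9
-- equivalently: b = a:sum(1)
-- for the question's 128x1024 tensor, torch.sum(x, 1) yields a 1x1024 tensor
```

The result keeps the summed dimension with size 1, which matches the requested 1×1024 shape directly.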

torch

3 votes · 1 answer · 845 views