Posts by csc*_*can

Enabling Thrift in a Cassandra Docker Container

I'm trying to start a Docker image running Cassandra. I need to use Thrift to communicate with Cassandra, but it appears to be disabled by default. The Cassandra log shows:

INFO  21:10:35 Not starting RPC server as requested. 
  Use JMX (StorageService->startRPCServer()) or nodetool (enablethrift) to start it

My question is: how do I enable Thrift when starting this Cassandra container?

I have tried setting various environment variables, to no avail:

docker run --name cs1 -d -e "start_rpc=true" cassandra
docker run --name cs1 -d -e "CASSANDRA_START_RPC=true" cassandra
docker run --name cs1 -d -e "enablethrift=true" cassandra
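The log message itself points at one workaround: even if the image (in the version assumed here) ignores these environment variables, Thrift can be switched on after startup with nodetool inside the running container. A hedged sketch, assuming the container is named cs1 as in the commands above:

```shell
# Enable the Thrift RPC server inside the already-running container
docker exec cs1 nodetool enablethrift

# Confirm the Thrift server is now up
docker exec cs1 nodetool statusthrift
```

This has to be re-run if the container is recreated, which is why baking it into the startup (or using an image/env var that supports it) is preferable when available.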

thrift cassandra docker

12 votes · 2 answers · 10k views

Avoiding Double Encoding of URL Query Parameters with Spring's RestTemplate

I'm trying to use Spring's RestTemplate::getForObject to request a URL that has URL query parameters.

I have tried:

  • Using a String
  • Creating a URI with URI::new
  • Creating a URI with URI::create
  • Building a URI with UriComponentsBuilder

Whichever one I use, encoding the URL query parameter with URLEncoder::encode double-encodes it, while skipping that call leaves the parameter unencoded.

How can I send this request without double-encoding the URL? Here is the method:

try {
    UriComponentsBuilder builder = UriComponentsBuilder.fromHttpUrl(detectUrl)
            .queryParam("url", URLEncoder.encode(url, "UTF-8"))
            .queryParam("api_key", "KEY")
            .queryParam("api_secret", "SECRET");
    URI uri = builder.build().toUri();
    JSONObject jsonObject = restTemplate.getForObject(uri, JSONObject.class);
    return jsonObject.getJSONArray("face").length() > 0;
} catch (JSONException | UnsupportedEncodingException e) {
    e.printStackTrace();
}

Here is an example:

Without URLEncoder:

http://www.example.com/query?url=http://query.param/example&api_key=KEY&api_secret=SECRET

With URLEncoder:

http://www.example.com/query?url=http%253A%252F%252Fquery.param%252Fexample&api_key=KEY&api_secret=SECRET

':' should be encoded as %3A and '/' as %2F. That does happen, but the '%' itself is then encoded again as %25.
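The %253A pattern is the signature of encoding an already percent-encoded string a second time, which is easy to reproduce with URLEncoder alone. A minimal sketch (plain JDK, no Spring; the class name is just for illustration):

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class DoubleEncodingDemo {
    public static void main(String[] args) {
        String url = "http://query.param/example";

        // First pass: ':' becomes %3A, '/' becomes %2F
        String once = URLEncoder.encode(url, StandardCharsets.UTF_8);
        System.out.println(once);  // http%3A%2F%2Fquery.param%2Fexample

        // Second pass: the '%' introduced above is itself encoded as %25
        String twice = URLEncoder.encode(once, StandardCharsets.UTF_8);
        System.out.println(twice); // http%253A%252F%252Fquery.param%252Fexample
    }
}
```

In the method above, the builder/template performs its own encoding pass when the URI is built, so the manual URLEncoder::encode call is the second, redundant pass; the usual way out is to pass the raw value and let only one component (e.g. builder.build().encode().toUri()) do the encoding.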

java encoding spring

9 votes · 1 answer · 20k views

Making Tomcat Ignore Servlets in WEB-INF/lib

I've created a web application that needs to interact with a graph database (I'm using Titan). After adding Titan's dependencies, deploying the WAR to Tomcat fails with the following error:

SEVERE: A child container failed during start
java.util.concurrent.ExecutionException: org.apache.catalina.LifecycleException: Failed to start component [StandardEngine[Tomcat].StandardHost[localhost].StandardContext[]]
    at java.util.concurrent.FutureTask.report(FutureTask.java:122)
    at java.util.concurrent.FutureTask.get(FutureTask.java:188)
    at org.apache.catalina.core.ContainerBase.startInternal(ContainerBase.java:1123)
    at org.apache.catalina.core.StandardHost.startInternal(StandardHost.java:800)
    at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:150)
    at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1559)
    at org.apache.catalina.core.ContainerBase$StartChild.call(ContainerBase.java:1549)
    at java.util.concurrent.FutureTask.run(FutureTask.java:262)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:744)
Caused by: org.apache.catalina.LifecycleException: Failed to start component [StandardEngine[Tomcat].StandardHost[localhost].StandardContext[]]
    at org.apache.catalina.util.LifecycleBase.start(LifecycleBase.java:154)
    ... 6 more
Caused by: java.lang.LinkageError: loader constraint violation: loader (instance of org/apache/catalina/loader/WebappClassLoader) previously initiated loading for a different type with name "javax/servlet/ServletContext"
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:800)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
    at …
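A loader constraint violation on javax/servlet/ServletContext usually means a copy of the Servlet API ended up in WEB-INF/lib alongside the one Tomcat already provides. Assuming the API is being pulled in transitively (e.g. by Titan or another dependency), a sketch of the usual Maven fix is to declare it with provided scope so it is compiled against but not packaged into the WAR:

```xml
<dependency>
    <groupId>javax.servlet</groupId>
    <artifactId>javax.servlet-api</artifactId>
    <version>3.1.0</version>
    <scope>provided</scope>
</dependency>
```

If the duplicate arrives transitively, an `<exclusion>` on the offending dependency achieves the same result; `mvn dependency:tree` shows which dependency is dragging it in.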

java spring tomcat servlets maven

8 votes · 1 answer · 8436 views

Adding an External Resource Folder to Spring Boot

I want to add a resource folder relative to the jar's location (in addition to the resources packaged inside the jar), for example:

/Directory
    Application.jar
    /resources
        test.txt

I have tried the following:

@Override
public void addResourceHandlers(final ResourceHandlerRegistry registry) {
    registry.addResourceHandler("/resources/**")
            .addResourceLocations("/resources/", "file:/resources/");
}

I have also tried:

.addResourceLocations("/resources/", "file:resources/");

Accessing http://localhost:8080/resources/test.txt with either setting results in a Whitelabel error page. How can I fix this?
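One likely culprit: a `file:` location without a leading slash is resolved relative to the process's working directory, which for a packaged jar is often not the jar's own directory. A sketch of an alternative, assuming the directory layout from the question and Spring Boot 1.x/2.x property names (renamed to `spring.web.resources.static-locations` in Boot 2.4+), is to declare the external folder as a static resource location with an absolute path:

```properties
# application.properties - serve files from the folder next to Application.jar
# (the /Directory path is the example layout above; adjust to the real location)
spring.resources.static-locations=classpath:/static/,file:/Directory/resources/
```

The same absolute `file:/...` form also works in the addResourceLocations call shown above.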

java spring spring-mvc spring-boot

8 votes · 1 answer · 20k views

Ansible: Removing a Host

I know it's possible to add a host with a task like the following:

- name: Add new instance to host group
  add_host:
    hostname: '{{ item.public_ip }}'
    groupname: "tag_Name_api_production"
  with_items: ec2.instances

But I can't seem to find a way to remove a host from the inventory. Is there a way to do this?

ansible ansible-playbook ansible-2.x

7 votes · 1 answer · 5562 views

Lambda PutObjectCommand Fails with "Resolved credential object is not valid"

I have a Lambda that tries to put an object into an S3 bucket.

The code that configures the S3 client is as follows:

const configuration: S3ClientConfig = {
  region: 'us-west-2',
};

if (process.env.DEVELOPMENT_MODE) {
  configuration.credentials = {
    accessKeyId: process.env.AWS_ACCESS_KEY!,
    secretAccessKey: process.env.AWS_SECRET_KEY!,
  }
}

export const s3 = new S3Client(configuration);

The code that uploads the file is as follows:

s3.send(new PutObjectCommand({
  Bucket: bucketName,
  Key: fileName,
  ContentType: contentType,
  Body: body,
}))

This works locally. The Lambda's role includes a policy that contains the following statement:

{
    "Action": [
        "s3:DeleteObject",
        "s3:PutObject"
    ],
    "Resource": [
        "arn:aws:s3:::BUCKET_NAME/*"
    ],
    "Effect": "Allow"
}

However, when I invoke this Lambda, it fails with the following stack trace:

Error: Resolved credential object is not valid
    at SignatureV4.validateResolvedCredentials (webpack://backend/../node_modules/@aws-sdk/signature-v4-multi-region/node_modules/@aws-sdk/signature-v4/dist-es/SignatureV4.js?:307:19)
    at SignatureV4.eval (webpack://backend/../node_modules/@aws-sdk/signature-v4-multi-region/node_modules/@aws-sdk/signature-v4/dist-es/SignatureV4.js?:50:30)
    at step (webpack://backend/../node_modules/tslib/tslib.es6.js?:130:23)
    at Object.eval …

amazon-s3 amazon-web-services amazon-iam

7 votes · 1 answer · 8406 views

Redirecting All Requests with a Path Prefix in CloudFront

I have a static site that makes requests to an API server. I host the static pages on S3, and I want to use CloudFront to redirect the API calls to the API server. The API calls can be distinguished by the api path prefix:

domain.com/index.html         s3/index.html
domain.com/js/index.js        s3/js/index.js
domain.com/api/request        api_server/api/request
domain.com/api/other/request  api_server/api/other/request

I currently have two origins set up: an S3 origin for the S3 bucket (which works) and a custom origin for my API server (which doesn't).

The custom origin is set up as follows:

Origin Domain Name: api_elb
Origin path: /

The behavior is set up as follows:

Precedence: 0
Path pattern: /api/*
Allowed HTTP Methods: GET, HEAD, OPTIONS, PUT, POST, PATCH, DELETE
Forward Headers: all
Forward Query Strings: yes

Here is the full response:

> GET /api/logout HTTP/1.1
> Host: a0000aaaaaaaaa.cloudfront.net
> User-Agent: curl/7.43.0
> Accept: */*
>
< HTTP/1.1 502 Bad Gateway
< Content-Type: text/html
< Content-Length: 587
< Connection: keep-alive
< Server: CloudFront
< Date: Mon, 06 Jun 2016 19:37:45 …

amazon-s3 amazon-cloudfront amazon-elb

6 votes · 1 answer · 5286 views

Ansible synchronize with a Wildcard

I'm trying to synchronize files using a wildcard:

- name: Install Services jar
  synchronize: src="{{repo}}/target/all-services-*.jar" dest=/opt/company

I'm doing this so that I don't have to update the playbook every time our version number is bumped. However, this throws a file-not-found error at runtime. Does Ansible support this? If so, how do I do it?

ansible ansible-playbook ansible-2.x

6 votes · 1 answer · 3844 views

AnalysisException: Queries with streaming sources must be executed with writeStream.start()

I'm getting an exception saying that I need to start a stream before using it. However, the stream is being started. What's wrong with this setup?

spark.readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", kafkaBootstrapServers)
  .option("subscribe", "inputTopic")
  .option("startingOffsets", "earliest")
  .load
  .selectExpr(deserializeKeyExpression, deserializeValueExpression)
  .select("value.*")
  .withColumn("date", to_timestamp(from_unixtime(col("date"))))
  .transform(model.transform)
  .select(col("id") as "key", func(col(config.probabilityCol)) as "value.prediction")
  .selectExpr(serializeKeyExpression, serializeValueExpression)
  .writeStream
  .outputMode("update")
  .format("kafka")
  .option("kafka.bootstrap.servers", kafkaBootstrapServers)
  .option("checkpointLocation", "checkpoint")
  .option("topic", "outputTopic")
  .start

Here is the exception:

Caused by: org.apache.spark.sql.AnalysisException: Queries with streaming sources must be executed with writeStream.start();;
kafka
    at org.apache.spark.sql.catalyst.analysis.UnsupportedOperationChecker$.org$apache$spark$sql$catalyst$analysis$UnsupportedOperationChecker$$throwError(UnsupportedOperationChecker.scala:374)
    at org.apache.spark.sql.catalyst.analysis.UnsupportedOperationChecker$$anonfun$checkForBatch$1.apply(UnsupportedOperationChecker.scala:37)
    at org.apache.spark.sql.catalyst.analysis.UnsupportedOperationChecker$$anonfun$checkForBatch$1.apply(UnsupportedOperationChecker.scala:35)
    at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:127)
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$foreachUp$1.apply(TreeNode.scala:126)
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$foreachUp$1.apply(TreeNode.scala:126)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:126)
    ...
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$foreachUp$1.apply(TreeNode.scala:126)
    at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$foreachUp$1.apply(TreeNode.scala:126)
    at scala.collection.immutable.List.foreach(List.scala:381)
    at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:126)
    at org.apache.spark.sql.catalyst.analysis.UnsupportedOperationChecker$.checkForBatch(UnsupportedOperationChecker.scala:35)
    at org.apache.spark.sql.execution.QueryExecution.assertSupported(QueryExecution.scala:51)
    at org.apache.spark.sql.execution.QueryExecution.withCachedData$lzycompute(QueryExecution.scala:62)
    at …

apache-spark spark-structured-streaming

6 votes · 1 answer · 1181 views

How to Ensure Immutability of a Generic Class

This example is in C#, but the question really applies to any OO language. I want to create a generic immutable class that implements IReadOnlyList. In addition, this class should have an underlying generic IList that cannot be modified. Initially, the class was written as follows:

public class Datum<T> : IReadOnlyList<T>
{
    private IList<T> objects;
    public int Count 
    { 
        get; 
        private set;
    }
    public T this[int i]
    {
        get
        {
            return objects[i];
        }
        private set
        {
            this.objects[i] = value;
        }
    }

    public Datum(IList<T> obj)
    {
        this.objects = obj;
        this.Count = obj.Count;
    }

    IEnumerator IEnumerable.GetEnumerator()
    {
        return this.GetEnumerator();
    }
    public IEnumerator<T> GetEnumerator()
    {
        return this.objects.GetEnumerator();
    }
}

However, this is not immutable. As you may have noticed, changing the initial IList 'obj' changes Datum's 'objects':

static void Main(string[] args)
{
    List<object> list = new List<object>();
    list.Add("one");
    Datum<object> datum = new Datum<object>(list);
    list[0] = …
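The same wrap-versus-copy distinction can be shown with plain Java collections: an unmodifiable view still reflects writes to the backing list, while a defensive copy taken at construction time does not. A minimal sketch of the principle (Java rather than C#, but the idea is identical; the class name is just for illustration):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class DefensiveCopyDemo {
    public static void main(String[] args) {
        List<String> backing = new ArrayList<>();
        backing.add("one");

        // A view only wraps the original list: writes to 'backing' shine through
        List<String> view = Collections.unmodifiableList(backing);

        // A defensive copy snapshots the elements at construction time
        List<String> copy = new ArrayList<>(backing);

        backing.set(0, "two");

        System.out.println(view.get(0)); // two  (view reflects the mutation)
        System.out.println(copy.get(0)); // one  (copy is unaffected)
    }
}
```

The equivalent fix for the Datum class above is to copy obj in the constructor (e.g. new List&lt;T&gt;(obj)) instead of storing the caller's reference.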

c# generics immutability

5 votes · 1 answer · 894 views