I need to write a query that matches any of the different jobs I have defined.
{job="traefik" OR job="cadvisor" OR job="prometheus"}
Is it possible to write a logical OR between label matchers?
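Prometheus label matchers have no literal OR keyword, but a regex matcher can express the same selection. A minimal sketch, assuming the job names from the question:

```promql
{job=~"traefik|cadvisor|prometheus"}
```

The `=~` operator matches the label value against a fully-anchored regular expression, so the alternation covers exactly those three jobs.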
Sometimes, when I run my playbook, it raises the following failure:
FAILED! => {"changed": false, "failed": true, "module_stderr": "", "module_stdout": "Traceback (most recent call last):\r\n File \"/root/.ansible/tmp/ansible-tmp-1457967885.72-104659711487416/apt_repository\",
line 3210, in <module>\r\n main()\r\n File \"/root/.ansible/tmp/ansible-tmp-1457967885.72-104659711487416/apt_repository\", line 469, in main\r\n cache.update()\r\n File \"/usr/lib/python2.7/dist-packages/apt/cache.py\", line 440,
in update\r\n raise FetchFailedException(e)\r\napt.cache.FetchFailedException: W:Imposible obtener http://security.ubuntu.com/ubuntu/dists/trusty-security/main/source/Sources La suma hash difiere\r\n,
W:Imposible obtener http://security.ubuntu.com/ubuntu/dists/trusty-security/main/binary-amd64/Packages La suma hash difiere\r\n, W:Imposible obtener http://security.ubuntu.com/ubuntu/dists/trusty-security/main/binary-i386/Packages La suma hash difiere\r\n,
E:Algunos archivos de índice fallaron al descargar. Se han ignorado, o se han utilizado unos antiguos en su lugar\r\n",
"msg": "MODULE FAILURE", …
I have installed Spark on a cluster where I also have Marathon; both use port 8080. How can I change the Spark UI default port?
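The Spark master web UI port can be moved off 8080 via `conf/spark-env.sh`, or with the `--webui-port` flag when starting the master. A sketch, where 8090 is just an example port:

```shell
# Option 1: conf/spark-env.sh on the master node
export SPARK_MASTER_WEBUI_PORT=8090

# Option 2: pass the port when starting the master
./sbin/start-master.sh --webui-port 8090
```

After restarting the master, the UI listens on the new port and Marathon can keep 8080.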
I have installed Spark on AWS. When I try to execute on AWS, the machine works but Spark does not, and when I check the sparkMaster log I see the following:
Spark Command: /usr/lib/jvm/java-8-oracle/jre/bin/java -cp /home/ubuntu/spark/conf/:/home/ubuntu/spark/jars/* -Xmx1g org.apache.spark$
========================================
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/09/12 09:40:18 INFO Master: Started daemon with process name: 5451@server1
16/09/12 09:40:18 INFO SignalUtils: Registered signal handler for TERM
16/09/12 09:40:18 INFO SignalUtils: Registered signal handler for HUP
16/09/12 09:40:18 INFO SignalUtils: Registered signal handler for INT
16/09/12 09:40:18 WARN MasterArguments: SPARK_MASTER_IP is deprecated, please use SPARK_MASTER_HOST
16/09/12 09:40:19 WARN NativeCodeLoader: Unable to load native-hadoop library …
I have configured the following Dockerfile, and it works fine:
FROM frolvlad/alpine-oraclejdk8:slim
VOLUME /tmp
ADD farr-api-0.1.0.jar app.jar
RUN sh -c 'touch /app.jar'
ENV JAVA_OPTS=""
ENTRYPOINT [ "sh", "-c", "java $JAVA_OPTS -Djava.security.egd=file:/dev/./urandom -jar /app.jar" ]
Now I want to run the same thing but with docker-compose, so I tried to use the same syntax. This is my docker-compose.yml:
jar:
  image: frolvlad/alpine-oraclejdk8:slim
  volumes:
    - /tmp
  add: "farr-api-0.1.0.jar" "app.jar"
  command: sh -c 'touch /app.jar'
  environment:
    JAVA_OPTS=""
  entrypoint: [ "sh", "-c", "java $JAVA_OPTS -Djava.security.egd=file:/dev/./urandom -jar /app.jar" ]
It throws the following failure:
ERROR: yaml.parser.ParserError: while parsing a block mapping
in "./docker-compose.yml", line 2, column 3
expected <block end>, but found '<scalar>'
in "./docker-compose.yml", line 5, column 29
I think it may be a syntax problem …
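It is indeed a syntax problem: compose has no `add:` key (`ADD` exists only in a Dockerfile), and `JAVA_OPTS=""` under `environment:` is not a valid YAML mapping entry. A hedged sketch of a working file, assuming the existing Dockerfile sits next to docker-compose.yml so the jar is added at build time:

```yaml
jar:
  build: .            # reuse the existing Dockerfile, which already ADDs the jar
  volumes:
    - /tmp
  environment:
    JAVA_OPTS: ""     # mapping form, instead of the bare JAVA_OPTS="" scalar
```

Because the image is built from the Dockerfile, its `ENTRYPOINT` carries over and does not need to be repeated in the compose file.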
I tried to insert 2 Ansible blocks into the same file, but Ansible replaces the first block with the second one.
If I insert the following 2 blocks:
- name: Setup java environment
  blockinfile:
    dest: /home/{{ user }}/.bashrc
    block: |
      #Java path#
      JAVA_HOME={{ java_home }}/
- name: Setup hadoop environment
  blockinfile:
    dest: /home/{{ user }}/.bashrc
    block: |
      #Hadooppath#
      HADOOP_HOME={{ hadoop_home }}/
Only the second block ends up in the file, because it replaces the first one.
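`blockinfile` identifies "its" block by the marker comments it writes around it, and both tasks above use the default marker, so the second task rewrites the first task's block. Giving each task a distinct `marker` keeps both blocks; a sketch using the same two tasks:

```yaml
- name: Setup java environment
  blockinfile:
    dest: /home/{{ user }}/.bashrc
    marker: "# {mark} ANSIBLE MANAGED BLOCK - java"
    block: |
      #Java path#
      JAVA_HOME={{ java_home }}/

- name: Setup hadoop environment
  blockinfile:
    dest: /home/{{ user }}/.bashrc
    marker: "# {mark} ANSIBLE MANAGED BLOCK - hadoop"
    block: |
      #Hadooppath#
      HADOOP_HOME={{ hadoop_home }}/
```

The `{mark}` placeholder expands to BEGIN/END, so each block gets its own begin/end comment pair in `.bashrc`.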
I am trying to configure basic authentication on an Nginx example, using Traefik as the Ingress controller.
I just created the secret "mypasswd" in Kubernetes secrets.
This is the Ingress I am using:
apiVersion: extensions/v1beta1
kind: Ingress
metadata:
  name: nginxingress
  annotations:
    ingress.kubernetes.io/auth-type: basic
    ingress.kubernetes.io/auth-realm: traefik
    ingress.kubernetes.io/auth-secret: mypasswd
spec:
  rules:
  - host: nginx.mycompany.com
    http:
      paths:
      - path: /
        backend:
          serviceName: nginxservice
          servicePort: 80
I checked the Traefik dashboard and it appears there; if I visit nginx.mycompany.com I can see the Nginx web page, but without basic authentication.
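One likely cause: the secret referenced by `ingress.kubernetes.io/auth-secret` must contain htpasswd-formatted credentials, not a raw password string, and it has to live in the same namespace as the Ingress. A hedged sketch of creating it that way (the user name `myuser` and the password are examples):

```shell
# generate an htpasswd file named "auth" with one user
htpasswd -cb auth myuser mypassword

# store it as the secret the Ingress annotation points at
kubectl create secret generic mypasswd --from-file=auth --namespace default
```

If the secret's contents are not valid htpasswd lines, Traefik tends to skip the auth middleware rather than fail loudly, which matches the page loading without a credentials prompt.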
This is my nginx deployment:
apiVersion: extensions/v1beta1
kind: Deployment
metadata:
  name: nginx-deployment
spec:
  replicas: 3
  template:
    metadata:
      labels:
        app: nginx
    spec:
      containers:
      - name: nginx
        image: nginx:1.7.9
        ports:
        - containerPort: 80
The Nginx service:
apiVersion: v1
kind: Service
metadata:
  labels:
    name: nginxservice
  name: nginxservice
spec:
  ports:
    # The port that …
I am trying to replicate the import of a Grafana dashboard into Grafana.
I am using the following module:
- name: Export dashboard
  grafana_dashboard:
    grafana_url: "http://{{ inventory_hostname }}:3000"
    grafana_user: "user"
    grafana_password: "password"
    org_id: "1"
    state: present
    slug: "node-exporter"
    overwrite: yes
    path: "/tmp/test/node_exporter.json"
I have node_exporter.json on both the local machine and the remote machine. But when I run the ansible playbook, it throws the following error:
fatal: [172.16.8.231]: FAILED! => {"changed": false, "msg": "error : Unable to create the new dashboard node-exporter-test : 404 - {'body': '{\"message\":\"Dashboard not found\",\"status\":\"not-found\"}', 'status': 404, 'content-length': '54', 'url': 'http://172.16.8.231:3000/api/dashboards/db', 'msg': 'HTTP Error 404: Not Found', 'connection': 'close', 'date': 'Wed, 10 Apr 2019 14:52:58 GMT', 'content-type': 'application/json'}."}
It throws that dashboard error …
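One way to narrow this down is to bypass the module and POST the JSON straight to Grafana's dashboard API, which is what the module calls under the hood; if the same 404 appears, the JSON file itself is the problem (for example, it was exported in the "share" wrapper format rather than as a bare dashboard object). A hedged sketch using the host and path from the question:

```shell
# wrap the dashboard JSON in the payload the API expects and POST it
curl -s -u user:password -H "Content-Type: application/json" \
  -X POST http://172.16.8.231:3000/api/dashboards/db \
  -d "{\"dashboard\": $(cat /tmp/test/node_exporter.json), \"overwrite\": true}"
```

A successful response returns the stored dashboard's slug, which is also the value the module's `slug` parameter has to match.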
I want to compress a manually defined string in Java into 7z, and then convert it to base64. I have found many examples that compress a file to 7z and then save the result to a new file.
I just tried the following code, which correctly takes a file and compresses it:
private static void addToArchiveCompression(SevenZOutputFile out, File file, String dir) throws IOException {
    String name = dir + File.separator + file.getName();
    if (file.isFile()){
        SevenZArchiveEntry entry = out.createArchiveEntry(file, name);
        out.putArchiveEntry(entry);
        FileInputStream in = new FileInputStream(file);
        byte[] b = new byte[1024];
        int count = 0;
        while ((count = in.read(b)) > 0) {
            out.write(b, 0, count);
        }
        out.closeArchiveEntry();
    } else if (file.isDirectory()) {
        File[] children = file.listFiles();
        if (children != null){
            for (File child : children){
                addToArchiveCompression(out, child, name); …
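To skip the filesystem entirely, Commons Compress (1.13+) lets `SevenZOutputFile` write to a `SeekableInMemoryByteChannel`, and an archive entry can be built by hand instead of from a `File`. A hedged sketch under that assumption (the class name, entry name `data.txt`, and sample string are mine):

```java
import java.nio.charset.StandardCharsets;
import java.util.Arrays;
import java.util.Base64;
import org.apache.commons.compress.archivers.sevenz.SevenZArchiveEntry;
import org.apache.commons.compress.archivers.sevenz.SevenZOutputFile;
import org.apache.commons.compress.utils.SeekableInMemoryByteChannel;

public class StringTo7z {
    // Compress a string into a 7z archive held in memory, then base64-encode it.
    static String compressToBase64(String text) throws Exception {
        SeekableInMemoryByteChannel channel = new SeekableInMemoryByteChannel();
        try (SevenZOutputFile out = new SevenZOutputFile(channel)) {
            SevenZArchiveEntry entry = new SevenZArchiveEntry();
            entry.setName("data.txt");  // name of the entry inside the archive (arbitrary)
            out.putArchiveEntry(entry);
            out.write(text.getBytes(StandardCharsets.UTF_8));
            out.closeArchiveEntry();
        }
        // the channel's backing array may be larger than what was written, so trim to size
        byte[] archive = Arrays.copyOf(channel.array(), (int) channel.size());
        return Base64.getEncoder().encodeToString(archive);
    }

    public static void main(String[] args) throws Exception {
        System.out.println(compressToBase64("hello 7z"));
    }
}
```

Decoding the base64 and writing the bytes to a `.7z` file should yield an archive any 7z tool can open, which is a quick way to sanity-check the round trip.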
I create an instance using the default CentOS 7 AMI. That AMI automatically creates a volume and attaches it to the instance. Is it possible to read that volume ID with terraform? I create the instance with the following code:
resource "aws_instance" "DCOS-master3" {
ami = "${var.aws_centos_ami}"
availability_zone = "eu-west-1b"
instance_type = "t2.medium"
key_name = "${var.aws_key_name}"
security_groups = ["${aws_security_group.bastion.id}"]
associate_public_ip_address = true
private_ip = "10.0.0.13"
source_dest_check = false
subnet_id = "${aws_subnet.eu-west-1b-public.id}"
tags {
Name = "master3"
}
}
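Newer versions of the AWS provider export the root volume's ID as an attribute on `aws_instance`; a hedged sketch assuming your provider version includes that export:

```hcl
# Exposes the ID of the root EBS volume the AMI created and attached.
# root_block_device.0.volume_id is an exported attribute on aws_instance
# in recent AWS provider versions; older versions may not have it.
output "master3_root_volume_id" {
  value = "${aws_instance.DCOS-master3.root_block_device.0.volume_id}"
}
```

If your provider predates that attribute, an `aws_ebs_volume` data source filtered by the instance's attachment is the usual fallback.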