I don't want to hard-code a default @author template; instead I'd like Eclipse to use the user's real name taken from the account information (on Linux, though a Windows solution is also welcome). Entering it somewhere in the Eclipse configuration would also be acceptable, but alas I can't find the right place.
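Eclipse code templates fill @author from the `${user}` template variable, which resolves to the JVM's `user.name` system property. One way to override it (an assumption about the setup, not something stated in the question) is to pass a different value in eclipse.ini, after the `-vmargs` line:

```ini
-vmargs
-Duser.name=Jane Doe
```

"Jane Doe" is a placeholder; on Linux the real name could be read from the GECOS field of /etc/passwd, e.g. with `getent passwd $USER | cut -d: -f5`.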
I installed the ec2 api tools following the Amazon guide. I set the access ID and secret key as environment variables.
Here is my profile:
export AWS_ACCESS_KEY=XXXXX
export AWS_SECRET_KEY=XXXXXX
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64/jre
export EC2_HOME=/usr/local/ec2/ec2-api-tools-1.7.1.0
export PATH=$PATH:$EC2_HOME/bin
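Since the AuthFailure below boils down to AWS rejecting the credentials, a first sanity check (with placeholder values, not real keys) is to confirm the variables are actually exported to child processes such as the ec2 tools. Note that shell assignments must have no spaces around `=`:

```shell
# Export with no spaces around '='; values below are placeholders.
export AWS_ACCESS_KEY=XXXXX
export AWS_SECRET_KEY=XXXXXX

# A child process (like ec2-describe-regions) sees only exported variables:
env | grep -E '^AWS_(ACCESS|SECRET)_KEY='
```

If the variables print correctly, the usual remaining suspects for Client.AuthFailure are inactive or rotated keys and system clock skew, since each request is signed with a timestamp.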
Everything looks configured as it should, but I can't connect to AWS.
Here is the output of the command ec2-describe-regions in verbose mode:
Client.AuthFailure: AWS was not able to validate the provided access credentials
ubuntu@ip:~$ ec2dre -v
Setting User-Agent to [ec2-api-tools 1.7.1.0]
2014-07-14 19:10:34,898 [main] DEBUG org.apache.http.wire - >> "POST / HTTP/1.1[\r][\n]"
2014-07-14 19:10:34,912 [main] DEBUG org.apache.http.wire - >> "Host: ec2.amazonaws.com[\r][\n]"
2014-07-14 19:10:34,912 [main] DEBUG org.apache.http.wire - >> "X-Amz-Date: 20140714T191033Z[\r][\n]"
2014-07-14 19:10:34,913 [main] DEBUG org.apache.http.wire - >> "Authorization: AWS4-HMAC-SHA256 Credential=AKIAIT64V5MH2HHF5QZQ/20140714/us-east-1/ec2/aws4_request, …

By default, all Logger output that is visible while the application is running is silenced when the application is under test.
How can I force debug, info, etc. messages to show up in the specs2 report?
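Play's Logger is backed by Logback, and Logback picks up a logback-test.xml on the classpath in preference to logback.xml. So one common approach (an assumption about the setup, not something from the question) is to drop a test-only configuration into the test resources directory that raises the level and writes to the console:

```xml
<!-- logback-test.xml on the test classpath; preferred over logback.xml -->
<configuration>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} %-5level %logger{36} - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="DEBUG">
    <appender-ref ref="STDOUT"/>
  </root>
</configuration>
```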
I want the Play! framework dist command to add some folders and files to the final zip file. They are required for the application to work.
Is there some magic project/Build.scala configuration that makes this possible? I couldn't find it in the Play! documentation.
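In recent Play 2.x versions the dist task is backed by sbt-native-packager, so extra files can be appended to the generated zip through `mappings in Universal`. A sketch, assuming that setup and a hypothetical `extras` folder in the project root:

```scala
// Sketch for project/Build.scala; "extras" is a hypothetical directory name.
import com.typesafe.sbt.SbtNativePackager.Universal

// add to the project's settings:
mappings in Universal ++= {
  val extrasDir = baseDirectory.value / "extras"
  // ship every file under extras/ at the same relative path inside the zip
  (extrasDir ** "*").get.filter(_.isFile).map { f =>
    f -> ("extras/" + extrasDir.toURI.relativize(f.toURI).getPath)
  }
}
```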
I have some configuration problem that I can't see. I followed the instructions in the latest twirl README, but according to the compiler the html package is undefined.
I have included the sbt-twirl plugin in the project/plugins.sbt file:
addSbtPlugin("com.typesafe.sbt" % "sbt-twirl" % "1.0.3")
In project/Build.scala I enabled the plugin:
lazy val root = Project(id = "proj", base = file("."))
.enablePlugins(play.twirl.sbt.SbtTwirl)
.settings(rootSettings: _*)
I placed page.scala.html in the src/main/twirl subdirectory (either directly or under a com/somethin/cool path).
Now if I try to import the html package (or com.somethin.cool.html), the compiler complains that it is undefined. I can compile the templates with the 'twirlCompile' command and they are generated correctly in the target subdirectory, but they are invisible to the main project.
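When sbt-twirl runs outside a full Play project, the templates only become visible to the main sources if the twirl source directory is wired into the compileTemplates task scope. A sketch of what that could look like, assuming sbt-twirl 1.0.x (where the keys live in play.twirl.sbt.Import):

```scala
// project/Build.scala sketch, assuming sbt-twirl 1.0.x
import play.twirl.sbt.SbtTwirl
import play.twirl.sbt.Import.TwirlKeys

lazy val root = Project(id = "proj", base = file("."))
  .enablePlugins(SbtTwirl)
  .settings(rootSettings: _*)
  .settings(
    // point the template compiler at src/main/twirl explicitly, in case
    // rootSettings replaced the plugin's defaults
    sourceDirectories in (Compile, TwirlKeys.compileTemplates) :=
      Seq(baseDirectory.value / "src" / "main" / "twirl")
  )
```

It is also worth checking that rootSettings does not overwrite `sourceGenerators in Compile`, which is where the plugin registers the generated template sources.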
I'm trying to set up a build command and I'm getting these errors. I've tried adjusting the security configuration:
Access control:
Authorization: matrix-based.
User group: joshis1 - everything checked.
After running the build I get the following errors. In the build script I just want to copy a file.
Started by user shreyas joshi
Building in workspace /var/lib/jenkins/workspace/Tungsten-Build
[Tungsten-Build] $ /bin/sh /tmp/hudson1841543545003586844.sh
November26
November26
sudo: sorry, you must have a tty to run sudo
sudo: sorry, you must have a tty to run sudo
sudo: sorry, you must have a tty to run sudo
sudo: sorry, you must have a tty to run sudo
sudo: sorry, you must have a tty to run sudo
sudo: sorry, you must have …

I have a specs2 test that uses a FakeApplication and an embedded mongodb database.
def inMemoryMongoDatabase(name: String = "default"): Map[String, String] = {
val dbname: String = "play-test-" + scala.util.Random.nextInt
Map(
("mongodb." + name + ".db" -> dbname),
("mongodb." + name + ".port" -> EmbeddedMongoTestPort.toString))
}
override def around[T <% Result](t: => T) = {
running(FakeApplication(additionalConfiguration = inMemoryMongoDatabase(), additionalPlugins = Seq("se.radley.plugin.salat.SalatPlugin"))) {
t // execute t inside a http session
}
}
FakeApplication uses the default application.conf from the conf directory, plus the additional configuration for the test database that is created for each test.
This worked until we set up a mongodb replica set. Now application.conf contains the configuration for this replica set:
mongodb.default.replicaset {
host1.host = "localhost"
host1.port = 27017
host2.host = "localhost"
host2.port = 27018
host3.host = "localhost"
host3.port …

I want to run a bash script as root, but delayed. How can this be achieved?
sudo "sleep 3600; command", or
sudo (sleep 3600; command)
Neither works.
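Both attempts fail for the same reason: sudo executes a single program and does not feed its argument through a shell, so the `sleep 3600; command` snippet needs an explicit shell around it. A sketch (`command` is a placeholder; the last line demonstrates the same pattern without sudo and with a short delay so it can run anywhere):

```shell
# Run the whole snippet as root by wrapping it in a shell:
#   sudo sh -c 'sleep 3600; command'
# Or keep only the privileged part under sudo (may re-prompt after the delay):
#   sleep 3600 && sudo command
# The same wrapping pattern, demonstrated without sudo and a 1-second delay:
sh -c 'sleep 1; echo "delayed command ran"'
# → delayed command ran
```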
I'm trying to write some simple data into HBase (0.96.0-hadoop2) using Spark 1.0, but I keep running into serialization problems. Here is the relevant code:
import org.apache.hadoop.hbase.client._
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.hbase.util.Bytes
import org.apache.spark.rdd.NewHadoopRDD
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.mapred.JobConf
import org.apache.spark.SparkContext
import java.util.Properties
import java.io.FileInputStream
import org.apache.hadoop.hbase.client.Put
object PutRawDataIntoHbase{
def main(args: Array[String]): Unit = {
var propFileName = "hbaseConfig.properties"
if(args.size > 0){
propFileName = args(0)
}
/** Load properties here **/
val theData = sc.textFile(prop.getProperty("hbase.input.filename"))
.map(l => l.split("\t"))
.map(a => Array("%010d".format(a(9).toInt)+ "-" + a(0) , a(1)))
val tableName = prop.getProperty("hbase.table.name")
val hbaseConf = HBaseConfiguration.create()
hbaseConf.set("hbase.rootdir", prop.getProperty("hbase.rootdir"))
hbaseConf.addResource(prop.getProperty("hbase.site.xml"))
val myTable = new HTable(hbaseConf, tableName)
theData.foreach(a=>{
var …

I just started playing with the Play! framework and stumbled upon the following problem: when running the application in test mode, modules added as dependencies fail to compile. So far the faulty pattern has repeated 2 out of 2 times: a referenced class belonging to the failing test suite cannot be resolved to a type.
My dependencies.yml file looks like this:
require:
- play 1.2
- secure
- crud
- play -> cobertura 2.1
- play -> paginate head
- play -> messages 1.0
- play -> i18ntools 1.0.1
# - play -> scaffold head
As you can see, I have already disabled the scaffold module, but the next one in line is paginate. The error that appears after the first request is as follows:
Compilation error
The file {module:paginate-head}/test/play/modules/paginate/MappedPaginatorTest.java could not be compiled. Error raised is : MockModel cannot be resolved to a type
In {module:paginate-head}/test/play/modules/paginate/MappedPaginatorTest.java (around line 16)
12:
13: public class MappedPaginatorTest {
14: @Test
15: public void testPaginateByKey() {
16: …

scala ×4
specs2 ×2
amazon-ec2 ×1
apache-spark ×1
bash ×1
deployment ×1
eclipse ×1
hbase ×1
jenkins ×1
linux ×1
mongodb ×1
sbt ×1
spray ×1
sudo ×1
twirl ×1