I have the following Java model class in AppEngine:
public class Xyz ... {
@Persistent
private Set<Long> uvw;
}
When saving an Xyz object with an empty uvw set from Java, I get a "null" field (as listed in the App Engine datastore viewer). When I then try to load the same object from Python (via remote_api), defined by the following Python model class:
class Xyz(db.Model):
uvw = db.ListProperty(int)
I get a "BadValueError: Property uvw is required".
When saving another object of the same class from Python with an empty uvw list, the datastore viewer shows the field as "missing".
Apparently empty-list storage is handled differently between Java and Python, which leads to "incompatible" objects.
So my question: is there a way to make the two behave consistently, or any other suggestion on how to handle empty list fields in both languages?
Thanks for your answers!
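One Java-side workaround (a minimal sketch; the JDO annotations are omitted here, and whether this fully matches the Python side still needs testing) is to make sure the field is never null, so Java always persists an empty collection rather than a missing value:

```java
import java.util.HashSet;
import java.util.Set;

public class Xyz {
    // Initialize eagerly so the field is never persisted as null.
    private Set<Long> uvw = new HashSet<>();

    public Set<Long> getUvw() {
        // Defensive: repair a null loaded from older entities that
        // were already saved with a null field.
        if (uvw == null) {
            uvw = new HashSet<>();
        }
        return uvw;
    }

    public void setUvw(Set<Long> uvw) {
        // Never let callers install a null collection.
        this.uvw = (uvw == null) ? new HashSet<>() : uvw;
    }
}
```

Whether the JDO enhancer preserves the eager initialization on load is worth verifying; the null check in the getter covers the case where it does not.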
Google just released JDO 3.0 (using DataNucleus 2.0) for Google App Engine, and I would like to use it for its convenient support of unowned relationships. I have been trying for a few days, but I cannot figure out how to use it with the Google Eclipse plugin. I found this page https://developers.google.com/appengine/docs/java/datastore/jdo/overview-dn2, but my project folder has no build.xml file. I tried creating a separate project and moving all my code over, but the new project has neither JDO 3.0 nor a build.xml file.
Any help would be much appreciated.
When starting my application, I see this warning for each of my classes:
WARN [DataNucleus.MetaData] - Class com.mycomp.MyClass was specified in persistence-unit myPersistenceUnit but not annotated, so ignoring
The application starts correctly, so there is no immediate problem, but I would like to know where this warning comes from and how to avoid it.
My persistence.xml looks like:
<persistence-unit name="myPersistenceUnit">
<provider>org.datanucleus.api.jpa.PersistenceProviderImpl</provider>
<properties>
<property name="datanucleus.ConnectionURL" value="appengine" />
<property name="datanucleus.NontransactionalRead" value="true" />
<property name="datanucleus.NontransactionalWrite" value="true" />
<property name="datanucleus.appengine.datastoreEnableXGTransactions" value="true" />
<property name="datanucleus.jpa.addClassTransformer" value="false" />
</properties>
</persistence-unit>
I am running my application on Google App Engine using Spring.
I cannot find where the warning originates; something seems to be telling my application to run some check over all classes.
PS: I am defining my entityManagerFactory as follows:
@Bean
public LocalContainerEntityManagerFactoryBean entityManagerFactory() {
LocalContainerEntityManagerFactoryBean entityManagerFactory = new LocalContainerEntityManagerFactoryBean();
entityManagerFactory.setPersistenceUnitName("myPersistenceUnit");
entityManagerFactory.setPersistenceUnitPostProcessors(new ClasspathScanningPersistenceUnitPostProcessor("com.mycomp.domain"));
return entityManagerFactory;
}
Any help appreciated.
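One way to silence the warning (assuming the cause is that every class under com.mycomp.domain is being registered with the persistence unit, annotated or not) is to list the entity classes explicitly in persistence.xml and exclude unlisted ones; SomeEntity below is a placeholder:

```xml
<persistence-unit name="myPersistenceUnit">
    <provider>org.datanucleus.api.jpa.PersistenceProviderImpl</provider>
    <!-- Register only real entities; everything else is ignored cleanly. -->
    <class>com.mycomp.domain.SomeEntity</class>
    <exclude-unlisted-classes>true</exclude-unlisted-classes>
    <!-- existing <properties> block unchanged -->
</persistence-unit>
```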
Is there a Gradle plugin for running the DataNucleus Enhancer? As far as I can see from the documentation, you can only run it from Maven or Ant: http://www.datanucleus.org/products/datanucleus/jpa/enhancer.html
I am trying to get the unit tests of my GAE 1.9.0 project running with Maven 3.2.1. The datastore-related tests fail:
java.util.ServiceConfigurationError:
com.google.appengine.tools.development.LocalRpcService:
Provider com.google.appengine.api.datastore.dev.LocalDatastoreV4Service
could not be instantiated: java.lang.NoClassDefFoundError:
com/google/apphosting/datastore/DatastoreV4$LookupRequestOrBuilder
I understand this means I am missing a dependency, but I cannot find documentation on how to configure the pom.xml correctly for datastore tests. The tests crash when I call
helper.setup()
on the helper:
public final LocalServiceTestHelper helper = new LocalServiceTestHelper(
new LocalDatastoreServiceTestConfig(),
new LocalTaskQueueTestConfig(),
new LocalBlobstoreServiceTestConfig(),
new LocalUserServiceTestConfig());
Reference:
https://developers.google.com/appengine/docs/java/tools/maven#junit_dependencies_optional
Test dependencies in my pom.xml:
<dependency>
<groupId>com.google.appengine</groupId>
<artifactId>appengine-testing</artifactId>
<version>${appengine.target.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.google.appengine</groupId>
<artifactId>appengine-api-stubs</artifactId>
<version>${appengine.target.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>com.google.appengine</groupId>
<artifactId>appengine-api-labs</artifactId>
<version>${appengine.target.version}</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.springframework</groupId>
<artifactId>spring-test</artifactId>
<version>3.2.1.RELEASE</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>junit</groupId>
<artifactId>junit</artifactId>
<version>4.11</version>
<scope>test</scope>
</dependency>
<dependency>
<groupId>spring</groupId>
<artifactId>spring-mock</artifactId>
<version>1.0.2</version>
<scope>test</scope>
</dependency>
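One thing worth checking (an assumption based on the missing com/google/apphosting/datastore class: the local datastore stub may need the full SDK API jar on the test classpath) is adding the SDK artifact itself in test scope:

```xml
<!-- Hypothetical fix: the local datastore stub may need the full SDK
     API jar in addition to appengine-testing and appengine-api-stubs. -->
<dependency>
    <groupId>com.google.appengine</groupId>
    <artifactId>appengine-api-1.0-sdk</artifactId>
    <version>${appengine.target.version}</version>
    <scope>test</scope>
</dependency>
```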
The rest of the stack trace:
at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:252)
at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:141)
at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:112)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native …

I have an application deployed in Google App Engine. When I fetch an entity by id immediately after updating it, I get inconsistent data. I am using JDO 3.0 to access the App Engine datastore.
I have an Employee entity:
@PersistenceCapable(detachable = "true")
public class Employee implements Serializable {
/**
*
*/
private static final long serialVersionUID = -8319851654750418424L;
@PrimaryKey
@Persistent(valueStrategy = IdGeneratorStrategy.IDENTITY, defaultFetchGroup = "true")
@Extension(vendorName = "datanucleus", key = "gae.encoded-pk", value = "true")
private String id;
@Persistent(defaultFetchGroup = "true")
private String name;
@Persistent(defaultFetchGroup = "true")
private String designation;
@Persistent(defaultFetchGroup = "true")
private Date dateOfJoin;
@Persistent(defaultFetchGroup = "true")
private String email;
@Persistent(defaultFetchGroup = "true")
private Integer age;
@Persistent(defaultFetchGroup = "true")
private Double salary;
@Persistent(defaultFetchGroup = "true") …
I am using DataNucleus as the JPA implementation to store my classes in my web application. I use a set of converters which all have toDTO() and fromDTO() methods.
My problem is that I want to avoid sending the whole database over the wire:
Is there a way to explicitly load only certain fields, leaving the other fields of the loaded class NULL? I have tried the DataNucleus documentation without luck.
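DataNucleus fetch groups are the feature to look for in the docs for loading only selected fields; independent of that, the converter layer itself can do the trimming. A stdlib-only sketch of the idea (Xyz, XyzDTO, and the field names are hypothetical):

```java
public class PartialDtoDemo {
    // Hypothetical entity with a cheap field and an expensive one.
    static class Xyz {
        String name = "n1";
        byte[] largeBlob = new byte[1024];
    }

    // DTO carrying only the fields the client asked for; the rest stay null.
    static class XyzDTO {
        String name;
        byte[] largeBlob; // left null in the summary view
    }

    // Converter with an explicit switch for the expensive field.
    static XyzDTO toDTO(Xyz entity, boolean includeBlob) {
        XyzDTO dto = new XyzDTO();
        dto.name = entity.name;
        if (includeBlob) {
            dto.largeBlob = entity.largeBlob;
        }
        return dto;
    }

    public static void main(String[] args) {
        XyzDTO summary = toDTO(new Xyz(), false);
        System.out.println(summary.name + ", blob=" + summary.largeBlob); // prints n1, blob=null
    }
}
```

The same switch can be driven by a fetch-group name or a set of requested field names instead of a boolean.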
Why does the following code cause org.datanucleus.exceptions.NucleusUserException: Object Manager has been closed? The exception seems to be thrown at query.getResultList().
public final void removeUserTokens(final String username) {
final Query query = entityManager.createQuery(
"SELECT p FROM PersistentLogin p WHERE username = :username");
query.setParameter("username", username);
for (Object token : query.getResultList()) {
entityManager.remove(token);
}
}
The exception:
org.datanucleus.exceptions.NucleusUserException: Object Manager has been closed
at org.datanucleus.ObjectManagerImpl.assertIsOpen(ObjectManagerImpl.java:3876)
at org.datanucleus.ObjectManagerImpl.getFetchPlan(ObjectManagerImpl.java:376)
at org.datanucleus.store.query.Query.getFetchPlan(Query.java:497)
at org.datanucleus.store.appengine.query.DatastoreQuery$6.apply(DatastoreQuery.java:611)
at org.datanucleus.store.appengine.query.DatastoreQuery$6.apply(DatastoreQuery.java:610)
at org.datanucleus.store.appengine.query.LazyResult.resolveNext(LazyResult.java:94)
at org.datanucleus.store.appengine.query.LazyResult$LazyAbstractListIterator.computeNext(LazyResult.java:215)
at org.datanucleus.store.appengine.query.AbstractIterator.tryToComputeNext(AbstractIterator.java:132)
at org.datanucleus.store.appengine.query.AbstractIterator.hasNext(AbstractIterator.java:127)
at org.datanucleus.store.appengine.query.LazyResult$AbstractListIterator.hasNext(LazyResult.java:169)
at com.mystuff.service.auth.PersistentTokenRepositoryImpl.removeUserTokens(PersistentTokenRepositoryImpl.java:90)
Run Code Online (Sandbox Code Playgroud)
Edit: I increased the DataNucleus log level; this is what I see:
FINE: Object Manager "org.datanucleus.ObjectManagerImpl@5d8d3d6c" opened for datastore "org.datanucleus.store.appengine.DatastoreManager@2447e380"
Feb 25, 2010 7:21:38 AM org.datanucleus.ObjectManagerImpl initialiseLevel1Cache
FINE: …

We use JDO in one of our projects. It has been running for quite some time, and naturally we need to change the model a bit.
What are the best practices for migrating fields of entity classes in JDO?
enum MyEnum {
REGULAR,
MYOLDTYPE // Delete this
}
@PersistenceCapable
public class Entity {
@Persistent
MyEnum myEnumType;
@Persistent
String myString; // Rename this
}
If I delete the enum value and loading existing entities from the database then throws an exception, how do I migrate them?
If I want to rename myString to myNewString, how do I rename the column to the new name?
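For the removed enum constant, one load-time pattern (a sketch; how you hook it in depends on your persistence setup, e.g. storing the enum as a String and converting manually) is to map retired names onto surviving values before Enum.valueOf():

```java
public class EnumMigration {
    enum MyEnum { REGULAR } // MYOLDTYPE has been deleted

    // Hypothetical load-time mapping: translate stored names that no
    // longer exist in the enum onto a surviving value.
    static MyEnum fromStored(String name) {
        if ("MYOLDTYPE".equals(name)) {
            return MyEnum.REGULAR;
        }
        return MyEnum.valueOf(name);
    }

    public static void main(String[] args) {
        System.out.println(fromStored("MYOLDTYPE")); // prints REGULAR
        System.out.println(fromStored("REGULAR"));   // prints REGULAR
    }
}
```

For the rename, JDO's @Column(name = "myString") annotation on the new myNewString field is one way to keep the stored column name while renaming the Java field; verify against the DataNucleus schema documentation for your datastore.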
I am building an Apache Spark application that uses Spark's Hive support. So far everything works: I have been running the tests and the whole application in IntelliJ IDEA, and all the tests together with Maven.
Now I want to run the whole application from bash against a local single-node cluster. I am using maven-shade-plugin to build a single executable JAR.
The application crashes when it tries to create a new HiveContext from the SparkContext. The thrown exception tells me Hive cannot create the metastore because of some problem with DataNucleus and its plugin system. I have tried several answers on how to make the DataNucleus plugin system work with shading, but no luck. For example: Datanucleus, JDO and executable jar - how to do it?
What is the best way to assemble an executable JAR of an application that uses Hive and run it from bash? Perhaps some setting for DataNucleus and its plugin system?
pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
<modelVersion>4.0.0</modelVersion>
<groupId>test</groupId>
<artifactId>hive-test</artifactId>
<version>1.0.0</version>
<dependencies>
<dependency>
<groupId>org.scala-lang</groupId>
<artifactId>scala-library</artifactId>
<version>2.11.7</version>
</dependency>
<!-- spark -->
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.11</artifactId>
<version>1.6.0</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive_2.11</artifactId>
<version>1.6.0</version>
</dependency>
</dependencies>
<properties>
<!-- To be specified in child pom: <main.class></main.class> -->
<final.jar.name>${project.artifactId}-${project.version}</final.jar.name>
<main.class>com.test.HiveTest</main.class>
</properties>
<build>
<plugins>
<!-- the Maven compiler plugin will compile Java source files -->
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<version>3.3</version>
<configuration>
<source>${java.version}</source>
<target>${java.version}</target>
</configuration>
</plugin> …
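One approach worth trying (a sketch, assuming the metastore failure comes from the DataNucleus plugin.xml/manifest metadata being clobbered when several of its jars are merged into one) is to keep the DataNucleus jars out of the shaded artifact and put the original jars on the classpath at launch:

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <configuration>
        <artifactSet>
            <!-- Leave DataNucleus unshaded: its OSGi/plugin.xml plugin
                 metadata breaks when its jars are merged into one. -->
            <excludes>
                <exclude>org.datanucleus:*</exclude>
            </excludes>
        </artifactSet>
    </configuration>
</plugin>
```

Then launch with the original jars alongside, e.g. java -cp "hive-test-1.0.0.jar:datanucleus-core.jar:datanucleus-api-jdo.jar:datanucleus-rdbms.jar" com.test.HiveTest, or hand them to spark-submit via --jars; the jar names here are illustrative.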
Tags: datanucleus ×10 · java ×6 · jpa ×3 · jdo ×2 · spring ×2 · apache-spark ×1 · dto ×1 · gradle ×1 · lazy-loading ×1 · maven ×1 · python ×1 · schema ×1 · spark-hive ×1 · unit-testing ×1