Nic*_*lai · 5 · hibernate, hibernate-5.x
While migrating from Hibernate 4 to 5, I ran into the deprecation and eventual removal of the constructor SchemaExport(Configuration). What is a good replacement in Hibernate 5?
During our tests we create a SchemaExport instance from a Configuration that is set up with some properties and the mapping resources:
// somewhere else `Properties` are filled and passed as a parameter
Properties properties = new Properties();
properties.put("hibernate.dialect", "org.hibernate.dialect.HSQLDialect");
properties.put("hibernate.connection.driver_class", "org.hsqldb.jdbc.JDBCDriver");
// more properties ...

Configuration configuration = new Configuration();
configuration.setProperties(properties);

// parameter `String... mappingResources`
for (final String mapping : mappingResources) {
    configuration.addResource(mapping);
}

// this doesn't compile on Hibernate 5
SchemaExport schemaExport = new SchemaExport(configuration);
Since that constructor was removed, the last line no longer compiles on Hibernate 5.

The constructor SchemaExport(MetadataImplementor) is not deprecated, but I have a hard time finding a good way to create a MetadataImplementor instance. I found some candidates, but they all look fishy to me.

The only concrete implementations I could find in Hibernate are MetadataImpl and InFlightMetadataCollectorImpl, but both live in org.hibernate.boot.internal, so I assume I am not supposed to use them. Besides, MetadataImpl has a huge constructor for which I would have to provide every little detail, and InFlightMetadataCollectorImpl requires a MetadataBuildingOptions, which has the same problem as MetadataImplementor itself (the implementations are internal and hard to construct).

Alternatively, MetadataBuilderImpl looks like it could be a convenient way to construct a MetadataImplementor, but it is internal as well.

Either way, I could not find out how to set the Properties (or their entries) on a MetadataImplementor (or on a MetadataBuilderImpl, for that matter).

Is a MetadataImplementor really required to create a SchemaExport? If so, how do I get one from the supported API, and how do I set the Properties on it?

In the end we want to run the export with execute, but that signature has changed here as well. I see it now takes a ServiceRegistry, so maybe that is the way out?

Argh, I just noticed that in 5.2 (which I would like to use) SchemaExport no longer even takes a MetadataImplementor; only the parameterless constructor remains. What now?
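One detail on carrying the old Properties over: java.util.Properties extends Hashtable and therefore already implements Map, so a builder method that accepts a raw settings Map (Hibernate 5's StandardServiceRegistryBuilder.applySettings is one such method) can consume it, ideally via a defensive copy. A minimal stdlib sketch of that bridge (the class and method names below are made up for illustration, not Hibernate API):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Properties;

public class SettingsBridge {

    // Properties extends Hashtable<Object, Object>, so it already is a Map;
    // copying it into a HashMap keeps later mutations of the Properties
    // from leaking into the settings handed to the registry builder.
    @SuppressWarnings({"rawtypes", "unchecked"})
    static Map<String, Object> asSettingsMap(Properties properties) {
        return new HashMap<String, Object>((Map) properties);
    }

    public static void main(String[] args) {
        Properties properties = new Properties();
        properties.put("hibernate.dialect", "org.hibernate.dialect.HSQLDialect");
        properties.put("hibernate.connection.driver_class", "org.hsqldb.jdbc.JDBCDriver");

        Map<String, Object> settings = asSettingsMap(properties);
        System.out.println(settings.get("hibernate.dialect"));
        // prints org.hibernate.dialect.HSQLDialect
    }
}
```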
In Hibernate itself, we have this base test class:
public class BaseNonConfigCoreFunctionalTestCase extends BaseUnitTestCase {
public static final String VALIDATE_DATA_CLEANUP = "hibernate.test.validateDataCleanup";
private StandardServiceRegistry serviceRegistry;
private MetadataImplementor metadata;
private SessionFactoryImplementor sessionFactory;
private Session session;
protected Dialect getDialect() {
if ( serviceRegistry != null ) {
return serviceRegistry.getService( JdbcEnvironment.class ).getDialect();
}
else {
return BaseCoreFunctionalTestCase.getDialect();
}
}
protected StandardServiceRegistry serviceRegistry() {
return serviceRegistry;
}
protected MetadataImplementor metadata() {
return metadata;
}
protected SessionFactoryImplementor sessionFactory() {
return sessionFactory;
}
protected Session openSession() throws HibernateException {
session = sessionFactory().openSession();
return session;
}
protected Session openSession(Interceptor interceptor) throws HibernateException {
session = sessionFactory().withOptions().interceptor( interceptor ).openSession();
return session;
}
protected Session getSession() {
return session;
}
protected void rebuildSessionFactory() {
releaseResources();
buildResources();
}
protected void cleanupCache() {
if ( sessionFactory != null ) {
sessionFactory.getCache().evictAllRegions();
}
}
// ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
// JUNIT hooks
@BeforeClassOnce
@SuppressWarnings( {"UnusedDeclaration"})
protected void startUp() {
buildResources();
}
protected void buildResources() {
final StandardServiceRegistryBuilder ssrb = constructStandardServiceRegistryBuilder();
serviceRegistry = ssrb.build();
afterStandardServiceRegistryBuilt( serviceRegistry );
final MetadataSources metadataSources = new MetadataSources( serviceRegistry );
applyMetadataSources( metadataSources );
afterMetadataSourcesApplied( metadataSources );
final MetadataBuilder metadataBuilder = metadataSources.getMetadataBuilder();
initialize( metadataBuilder );
configureMetadataBuilder( metadataBuilder );
metadata = (MetadataImplementor) metadataBuilder.build();
applyCacheSettings( metadata );
afterMetadataBuilt( metadata );
final SessionFactoryBuilder sfb = metadata.getSessionFactoryBuilder();
initialize( sfb, metadata );
configureSessionFactoryBuilder( sfb );
sessionFactory = (SessionFactoryImplementor) sfb.build();
afterSessionFactoryBuilt( sessionFactory );
}
protected final StandardServiceRegistryBuilder constructStandardServiceRegistryBuilder() {
final BootstrapServiceRegistryBuilder bsrb = new BootstrapServiceRegistryBuilder();
bsrb.applyClassLoader( getClass().getClassLoader() );
// by default we do not share the BootstrapServiceRegistry nor the StandardServiceRegistry,
// so we want the BootstrapServiceRegistry to be automatically closed when the
// StandardServiceRegistry is closed.
bsrb.enableAutoClose();
configureBootstrapServiceRegistryBuilder( bsrb );
final BootstrapServiceRegistry bsr = bsrb.build();
afterBootstrapServiceRegistryBuilt( bsr );
final Map settings = new HashMap();
addSettings( settings );
final StandardServiceRegistryBuilder ssrb = new StandardServiceRegistryBuilder( bsr );
initialize( ssrb );
ssrb.applySettings( settings );
configureStandardServiceRegistryBuilder( ssrb );
return ssrb;
}
protected void addSettings(Map settings) {
}
/**
* Apply any desired config to the BootstrapServiceRegistryBuilder to be incorporated
* into the built BootstrapServiceRegistry
*
* @param bsrb The BootstrapServiceRegistryBuilder
*/
@SuppressWarnings({"SpellCheckingInspection", "UnusedParameters"})
protected void configureBootstrapServiceRegistryBuilder(BootstrapServiceRegistryBuilder bsrb) {
}
/**
* Hook to allow tests to use the BootstrapServiceRegistry if they wish
*
* @param bsr The BootstrapServiceRegistry
*/
@SuppressWarnings("UnusedParameters")
protected void afterBootstrapServiceRegistryBuilt(BootstrapServiceRegistry bsr) {
}
@SuppressWarnings("SpellCheckingInspection")
private void initialize(StandardServiceRegistryBuilder ssrb) {
final Dialect dialect = BaseCoreFunctionalTestCase.getDialect();
ssrb.applySetting( AvailableSettings.CACHE_REGION_FACTORY, CachingRegionFactory.class.getName() );
ssrb.applySetting( AvailableSettings.USE_NEW_ID_GENERATOR_MAPPINGS, "true" );
if ( createSchema() ) {
ssrb.applySetting( AvailableSettings.HBM2DDL_AUTO, "create-drop" );
final String secondSchemaName = createSecondSchema();
if ( StringHelper.isNotEmpty( secondSchemaName ) ) {
if ( !H2Dialect.class.isInstance( dialect ) ) {
// while it may be true that only H2 supports creation of a second schema via
// URL (no idea whether that is accurate), every db should support creation of schemas
// via DDL which SchemaExport can create for us. See how this is used and
// whether that usage could not just leverage that capability
throw new UnsupportedOperationException( "Only H2 dialect supports creation of second schema." );
}
Helper.createH2Schema( secondSchemaName, ssrb.getSettings() );
}
}
ssrb.applySetting( AvailableSettings.DIALECT, dialect.getClass().getName() );
}
protected boolean createSchema() {
return true;
}
protected String createSecondSchema() {
// poorly named, yes, but to keep migration easy for existing BaseCoreFunctionalTestCase
// impls I kept the same name from there
return null;
}
/**
* Apply any desired config to the StandardServiceRegistryBuilder to be incorporated
* into the built StandardServiceRegistry
*
* @param ssrb The StandardServiceRegistryBuilder
*/
@SuppressWarnings({"SpellCheckingInspection", "UnusedParameters"})
protected void configureStandardServiceRegistryBuilder(StandardServiceRegistryBuilder ssrb) {
}
/**
* Hook to allow tests to use the StandardServiceRegistry if they wish
*
* @param ssr The StandardServiceRegistry
*/
@SuppressWarnings("UnusedParameters")
protected void afterStandardServiceRegistryBuilt(StandardServiceRegistry ssr) {
}
protected void applyMetadataSources(MetadataSources metadataSources) {
for ( String mapping : getMappings() ) {
metadataSources.addResource( getBaseForMappings() + mapping );
}
for ( Class annotatedClass : getAnnotatedClasses() ) {
metadataSources.addAnnotatedClass( annotatedClass );
}
for ( String annotatedPackage : getAnnotatedPackages() ) {
metadataSources.addPackage( annotatedPackage );
}
for ( String ormXmlFile : getXmlFiles() ) {
metadataSources.addInputStream( Thread.currentThread().getContextClassLoader().getResourceAsStream( ormXmlFile ) );
}
}
protected static final String[] NO_MAPPINGS = new String[0];
protected String[] getMappings() {
return NO_MAPPINGS;
}
protected String getBaseForMappings() {
return "org/hibernate/test/";
}
protected static final Class[] NO_CLASSES = new Class[0];
protected Class[] getAnnotatedClasses() {
return NO_CLASSES;
}
protected String[] getAnnotatedPackages() {
return NO_MAPPINGS;
}
protected String[] getXmlFiles() {
return NO_MAPPINGS;
}
protected void afterMetadataSourcesApplied(MetadataSources metadataSources) {
}
protected void initialize(MetadataBuilder metadataBuilder) {
metadataBuilder.enableNewIdentifierGeneratorSupport( true );
metadataBuilder.applyImplicitNamingStrategy( ImplicitNamingStrategyLegacyJpaImpl.INSTANCE );
}
protected void configureMetadataBuilder(MetadataBuilder metadataBuilder) {
}
protected boolean overrideCacheStrategy() {
return true;
}
protected String getCacheConcurrencyStrategy() {
return null;
}
protected final void applyCacheSettings(Metadata metadata) {
if ( !overrideCacheStrategy() ) {
return;
}
if ( getCacheConcurrencyStrategy() == null ) {
return;
}
for ( PersistentClass entityBinding : metadata.getEntityBindings() ) {
if ( entityBinding.isInherited() ) {
continue;
}
boolean hasLob = false;
final Iterator props = entityBinding.getPropertyClosureIterator();
while ( props.hasNext() ) {
final Property prop = (Property) props.next();
if ( prop.getValue().isSimpleValue() ) {
if ( isLob( ( (SimpleValue) prop.getValue() ).getTypeName() ) ) {
hasLob = true;
break;
}
}
}
if ( !hasLob ) {
( ( RootClass) entityBinding ).setCacheConcurrencyStrategy( getCacheConcurrencyStrategy() );
}
}
for ( Collection collectionBinding : metadata.getCollectionBindings() ) {
boolean isLob = false;
if ( collectionBinding.getElement().isSimpleValue() ) {
isLob = isLob( ( (SimpleValue) collectionBinding.getElement() ).getTypeName() );
}
if ( !isLob ) {
collectionBinding.setCacheConcurrencyStrategy( getCacheConcurrencyStrategy() );
}
}
}
private boolean isLob(String typeName) {
return "blob".equals( typeName )
|| "clob".equals( typeName )
|| "nclob".equals( typeName )
|| Blob.class.getName().equals( typeName )
|| Clob.class.getName().equals( typeName )
|| NClob.class.getName().equals( typeName )
|| BlobType.class.getName().equals( typeName )
|| ClobType.class.getName().equals( typeName )
|| NClobType.class.getName().equals( typeName );
}
protected void afterMetadataBuilt(Metadata metadata) {
}
private void initialize(SessionFactoryBuilder sfb, Metadata metadata) {
// todo : this is where we need to apply cache settings to be like BaseCoreFunctionalTestCase
// it reads the class/collection mappings and creates corresponding
// CacheRegionDescription references.
//
// Ultimately I want those to go on MetadataBuilder, and in fact MetadataBuilder
// already defines the needed method. But for the pattern used by the
// tests we need this as part of SessionFactoryBuilder
}
protected void configureSessionFactoryBuilder(SessionFactoryBuilder sfb) {
}
protected void afterSessionFactoryBuilt(SessionFactoryImplementor sessionFactory) {
}
@AfterClassOnce
@SuppressWarnings( {"UnusedDeclaration"})
protected void shutDown() {
releaseResources();
}
protected void releaseResources() {
if ( sessionFactory != null ) {
try {
sessionFactory.close();
}
catch (Exception e) {
System.err.println( "Unable to release SessionFactory : " + e.getMessage() );
e.printStackTrace();
}
}
sessionFactory = null;
if ( serviceRegistry != null ) {
try {
StandardServiceRegistryBuilder.destroy( serviceRegistry );
}
catch (Exception e) {
System.err.println( "Unable to release StandardServiceRegistry : " + e.getMessage() );
e.printStackTrace();
}
}
serviceRegistry = null;
}
@OnFailure
@OnExpectedFailure
@SuppressWarnings( {"UnusedDeclaration"})
public void onFailure() {
if ( rebuildSessionFactoryOnError() ) {
rebuildSessionFactory();
}
}
protected boolean rebuildSessionFactoryOnError() {
return true;
}
@Before
public final void beforeTest() throws Exception {
prepareTest();
}
protected void prepareTest() throws Exception {
}
@After
public final void afterTest() throws Exception {
completeStrayTransaction();
if ( isCleanupTestDataRequired() ) {
cleanupTestData();
}
cleanupTest();
cleanupSession();
assertAllDataRemoved();
}
private void completeStrayTransaction() {
if ( session == null ) {
// nothing to do
return;
}
if ( ( (SessionImplementor) session ).isClosed() ) {
// nothing to do
return;
}
if ( !session.isConnected() ) {
// nothing to do
return;
}
final TransactionCoordinator.TransactionDriver tdc =
( (SessionImplementor) session ).getTransactionCoordinator().getTransactionDriverControl();
if ( tdc.getStatus().canRollback() ) {
session.getTransaction().rollback();
}
}
protected boolean isCleanupTestDataRequired() {
return false;
}
protected void cleanupTestData() throws Exception {
doInHibernate(this::sessionFactory, s -> {
s.createQuery("delete from java.lang.Object").executeUpdate();
});
}
private void cleanupSession() {
if ( session != null && ! ( (SessionImplementor) session ).isClosed() ) {
session.close();
}
session = null;
}
public class RollbackWork implements Work {
public void execute(Connection connection) throws SQLException {
connection.rollback();
}
}
protected void cleanupTest() throws Exception {
}
@SuppressWarnings( {"UnnecessaryBoxing", "UnnecessaryUnboxing"})
protected void assertAllDataRemoved() {
if ( !createSchema() ) {
return; // no tables were created...
}
if ( !Boolean.getBoolean( VALIDATE_DATA_CLEANUP ) ) {
return;
}
Session tmpSession = sessionFactory.openSession();
try {
List list = tmpSession.createQuery( "select o from java.lang.Object o" ).list();
Map<String,Integer> items = new HashMap<String,Integer>();
if ( !list.isEmpty() ) {
for ( Object element : list ) {
Integer l = items.get( tmpSession.getEntityName( element ) );
if ( l == null ) {
l = 0;
}
l = l + 1 ;
items.put( tmpSession.getEntityName( element ), l );
System.out.println( "Data left: " + element );
}
fail( "Data is left in the database: " + items.toString() );
}
}
finally {
try {
tmpSession.close();
}
catch( Throwable t ) {
// intentionally empty
}
}
}
public void inSession(Consumer<SessionImplementor> action) {
log.trace( "#inSession(action)" );
inSession( sessionFactory(), action );
}
public void inTransaction(Consumer<SessionImplementor> action) {
log.trace( "#inTransaction(action)" );
inTransaction( sessionFactory(), action );
}
public void inSession(SessionFactoryImplementor sfi, Consumer<SessionImplementor> action) {
log.trace( "##inSession(SF,action)" );
try (SessionImplementor session = (SessionImplementor) sfi.openSession()) {
log.trace( "Session opened, calling action" );
action.accept( session );
log.trace( "called action" );
}
finally {
log.trace( "Session close - auto-close lock" );
}
}
public void inTransaction(SessionFactoryImplementor factory, Consumer<SessionImplementor> action) {
log.trace( "#inTransaction(factory, action)");
try (SessionImplementor session = (SessionImplementor) factory.openSession()) {
log.trace( "Session opened, calling action" );
inTransaction( session, action );
log.trace( "called action" );
}
finally {
log.trace( "Session close - auto-close lock" );
}
}
public void inTransaction(SessionImplementor session, Consumer<SessionImplementor> action) {
log.trace( "inTransaction(session,action)" );
final Transaction txn = session.beginTransaction();
log.trace( "Started transaction" );
try {
log.trace( "Calling action in txn" );
action.accept( session );
log.trace( "Called action - in txn" );
log.trace( "Committing transaction" );
txn.commit();
log.trace( "Committed transaction" );
}
catch (Exception e) {
log.tracef(
"Error calling action: %s (%s) - rolling back",
e.getClass().getName(),
e.getMessage()
);
try {
txn.rollback();
}
catch (Exception ignore) {
log.trace( "Was unable to roll back transaction" );
// really nothing else we can do here - the attempt to
// rollback already failed and there is nothing else
// to clean up.
}
throw e;
}
}
}
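The inTransaction(session, action) helper at the bottom of that class is a general commit-or-rollback template. The same shape can be expressed with plain java.util.function types; the Session/Transaction stand-ins below are hypothetical, kept only to show the control flow:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

public class TxTemplateDemo {

    // minimal stand-in for a transaction, just enough to record the flow
    static class FakeTx {
        final List<String> log = new ArrayList<>();
        void begin()    { log.add("begin"); }
        void commit()   { log.add("commit"); }
        void rollback() { log.add("rollback"); }
    }

    // same structure as the inTransaction helper above:
    // begin, run the action, commit; on failure roll back and rethrow
    static List<String> inTransaction(Consumer<FakeTx> action) {
        FakeTx tx = new FakeTx();
        tx.begin();
        try {
            action.accept(tx);
            tx.commit();
        }
        catch (RuntimeException e) {
            try {
                tx.rollback();
            }
            catch (RuntimeException ignore) {
                // the rollback attempt itself failed; nothing left to clean up
            }
            throw e;
        }
        return tx.log;
    }

    public static void main(String[] args) {
        System.out.println(inTransaction(tx -> tx.log.add("work")));
        // prints [begin, work, commit]
    }
}
```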
It bootstraps the Metadata and the ServiceRegistry. With those in place, we can call SchemaExport like this:
new SchemaExport().create( EnumSet.of( TargetType.DATABASE ), metadata() );
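The EnumSet argument selects where the generated DDL goes; in Hibernate 5.2 the real enum is org.hibernate.tool.schema.TargetType, with values such as DATABASE, SCRIPT and STDOUT. How such set-based target selection behaves can be sketched with a stand-in enum (the names below are illustrative, not Hibernate API):

```java
import java.util.EnumSet;

public class TargetSelectionDemo {

    // stand-in for org.hibernate.tool.schema.TargetType
    enum Target { DATABASE, SCRIPT, STDOUT }

    // a tool like SchemaExport can simply test membership
    // to decide which output channels to write to
    static boolean writesTo(EnumSet<Target> targets, Target channel) {
        return targets.contains(channel);
    }

    public static void main(String[] args) {
        EnumSet<Target> targets = EnumSet.of(Target.DATABASE, Target.SCRIPT);
        System.out.println(writesTo(targets, Target.DATABASE)); // true
        System.out.println(writesTo(targets, Target.STDOUT));   // false
    }
}
```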
小智 · 5
Piecing things together from the other answers, I had to do the following:
StandardServiceRegistry standardRegistry = new StandardServiceRegistryBuilder()
.applySetting("hibernate.hbm2ddl.auto", "create")
.applySetting("hibernate.dialect", "org.hibernate.dialect.MySQLDialect")
.applySetting("hibernate.id.new_generator_mappings", "true")
.build();
MetadataSources sources = new MetadataSources(standardRegistry);
managedClassNames.forEach(sources::addAnnotatedClass);
MetadataImplementor metadata = (MetadataImplementor) sources
.getMetadataBuilder()
.build();
SchemaExport export = new SchemaExport(metadata);
Hope that helps.