nkonev / r2dbc-migrate
R2DBC database migration library
Home Page: https://nkonev.name/post/136
License: Apache License 2.0
I'm using your Spring Boot starter 1.0.4 with Spring Boot 2.3.0.RELEASE and Java 11. The app was working with the same SQL in a schema.sql file that ran on startup, so the DB connection is OK.
The server starts up, then seems to hang and never initialises WebFlux to listen on 8080.
This is what I see in the logs
2020-05-26 01:14:24.208 INFO 17797 --- [ restartedMain] trationDelegate$BeanPostProcessorChecker : Bean 'spring.r2dbc-org.springframework.boot.autoconfigure.r2dbc.R2dbcProperties' of type [org.springframework.boot.autoconfigure.r2dbc.R2dbcProperties] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
2020-05-26 01:14:24.221 INFO 17797 --- [ restartedMain] trationDelegate$BeanPostProcessorChecker : Bean 'r2dbc.migrate-name.nkonev.r2dbc.migrate.autoconfigure.R2dbcMigrateAutoConfiguration$SpringBootR2dbcMigrateProperties' of type [name.nkonev.r2dbc.migrate.autoconfigure.R2dbcMigrateAutoConfiguration$SpringBootR2dbcMigrateProperties] is not eligible for getting processed by all BeanPostProcessors (for example: not eligible for auto-proxying)
2020-05-26 01:14:24.229 INFO 17797 --- [ restartedMain] n.n.r2dbc.migrate.core.R2dbcMigrate : Configured with MigrateProperties{connectionMaxRetries=500, resourcesPath='classpath:/db/migration/*.sql', chunkSize=1000, dialect=null, validationQuery='select '1' as result', validationQueryExpectedResultValue='1', validationQueryTimeout=PT5S, validationRetryDelay=PT1S, acquireLockRetryDelay=PT1S, acquireLockMaxRetries=100, fileCharset=UTF-8}
2020-05-26 01:14:24.817 INFO 17797 --- [tor-tcp-epoll-1] n.n.r2dbc.migrate.core.R2dbcMigrate : Comparing expected value '1' with provided result '1'
2020-05-26 01:14:24.825 INFO 17797 --- [tor-tcp-epoll-1] n.n.r2dbc.migrate.core.R2dbcMigrate : Successfully got result '1' of test query
I am currently using the spring-starter dependency:
name.nkonev.r2dbc-migrate:r2dbc-migrate-spring-boot-starter:2.7.6
And, following the Spring example, I added migrations under:
src/main/resources/db/migration
src/test/resources/test/migration
and specified the same paths in the application.yml as:
r2dbc:
  migrate:
    resources-paths:
      - classpath:/db/migration/*.sql
      - classpath:/test/migration/*.sql
But when running the tests, the test migrations are not run.
Is there a way to trigger these via a Gradle task, or does some configuration need to be added for this?
I want to apply the initial schema using spring.sql.init in application.yml, and then use r2dbc.migrate to do the migration. However, the default behavior is that the R2dbcMigrateAutoConfiguration.R2dbcMigrateBlockingInvoker bean is initialized first and SqlR2dbcScriptDatabaseInitializer is initialized later. The call order is reversed, and the application crashes.
spring:
  sql:
    init:
      enabled: true
      mode: always
      schemaLocations: classpath:/db/schema.sql
r2dbc:
  migrate:
    resourcesPath: classpath:/db/migration/*.sql
# schema.sql
-- users
CREATE TABLE IF NOT EXISTS users (
    id BIGSERIAL PRIMARY KEY,
    email VARCHAR(191) UNIQUE NOT NULL,
    name VARCHAR(191),
    hash_salt VARCHAR(500) NOT NULL,
    roles VARCHAR(16)[]
);
# V2__alter_deleted_at_field_on_users,nontransactional.sql
ALTER TABLE users ADD COLUMN deleted_at TIMESTAMP NULL DEFAULT NULL;
Caused by: io.r2dbc.postgresql.ExceptionFactory$PostgresqlBadGrammarException: [42P01] relation "users" does not exist
When initializing R2dbcMigrateBlockingInvoker, I tell it that it depends on SqlR2dbcScriptDatabaseInitializer (I don't know if this is the best approach), so that SqlR2dbcScriptDatabaseInitializer is called first and R2dbcMigrateBlockingInvoker is called later. I also tried using @DependsOn or @AutoConfigureAfter, but neither worked well.
@SpringBootApplication(exclude = [R2dbcMigrateAutoConfiguration::class])
class Application

@Configuration(proxyBeanMethods = false)
@EnableConfigurationProperties(SpringBootR2dbcMigrateProperties::class)
class DatabaseConfiguration {

    @Bean(name = ["ar2dbcMigrate"], initMethod = "migrate")
    fun r2dbcMigrate(
        connectionFactory: ConnectionFactory,
        properties: SpringBootR2dbcMigrateProperties,
        @Autowired(required = false) maybeUserDialect: SqlQueries?,
        @Autowired(required = false) initializer: SqlR2dbcScriptDatabaseInitializer? // <--- THIS
    ): R2dbcMigrateAutoConfiguration.R2dbcMigrateBlockingInvoker {
        return R2dbcMigrateAutoConfiguration.R2dbcMigrateBlockingInvoker(
            connectionFactory,
            properties,
            maybeUserDialect
        )
    }
}
I would like r2dbc.migrate and spring.sql.init to be usable together, with SqlR2dbcScriptDatabaseInitializer called first and R2dbcMigrateBlockingInvoker called afterwards.
Thank you for your awesome project.
It looks like there is an incorrect constant type supplied to the function name.nkonev.r2dbc.migrate.core.R2dbcMigrate.switchIfEmpty().
Here is the offending code:
private static Mono<Void> acquireOrWaitForLock(Connection connection, SqlQueries sqlQueries, R2dbcMigrateProperties properties) {
    Mono<Long> lockUpdated = Mono.from(connection.createStatement(sqlQueries.tryAcquireLock()).execute())
        .flatMap(o -> Mono.from(o.getRowsUpdated()))
        .switchIfEmpty(Mono.just(0L))
        .flatMap(aLong -> {
            if (Integer.valueOf(0).equals(aLong)) {
                return Mono.error(new RuntimeException("Equals zero"));
            } else {
                return Mono.just(aLong);
            }
        })
        .doOnSuccess(integer -> {
            LOGGER.info(ROWS_UPDATED, "Acquiring lock", integer);
        });
    Mono<Long> waitForLock = lockUpdated.retryWhen(reactor.util.retry.Retry
        .fixedDelay(properties.getAcquireLockMaxRetries(), properties.getAcquireLockRetryDelay())
        .doAfterRetry(retrySignal -> LOGGER.warn("Waiting for lock")));
    return transactionalWrapUnchecked(connection, true, waitForLock);
}
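Side note, for illustration only (this is not the library's fix): in Java, `Integer.equals` never matches a boxed `Long` and vice versa, so a mixed boxed comparison like the one above fails silently, while unboxing to a primitive `long` is safe regardless of which boxed type the driver emits:

```java
public class BoxedComparison {
    public static void main(String[] args) {
        Long rowsUpdated = 0L; // e.g. a row count boxed as Long
        // Integer.equals(Object) returns false for any non-Integer argument,
        // so this check can never match a Long:
        System.out.println(Integer.valueOf(0).equals(rowsUpdated)); // false
        // Unboxing to a primitive long compares values, not boxed types:
        System.out.println(rowsUpdated.longValue() == 0L); // true
    }
}
```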
Here is the stack trace
Caused by: java.lang.ClassCastException: class java.lang.Integer cannot be cast to class java.lang.Long (java.lang.Integer and java.lang.Long are in module java.base of loader 'bootstrap')
at reactor.core.publisher.MonoReduceSeed$ReduceSeedSubscriber.onNext(MonoReduceSeed.java:124)
at reactor.core.publisher.FluxSwitchIfEmpty$SwitchIfEmptySubscriber.onNext(FluxSwitchIfEmpty.java:74)
at reactor.core.publisher.FluxFlatMap$FlatMapMain.tryEmit(FluxFlatMap.java:543)
at reactor.core.publisher.FluxFlatMap$FlatMapInner.onNext(FluxFlatMap.java:984)
at reactor.core.publisher.FluxHandleFuseable$HandleFuseableSubscriber.onNext(FluxHandleFuseable.java:191)
at reactor.core.publisher.Operators$MonoSubscriber.complete(Operators.java:1839)
at reactor.core.publisher.MonoCollectList$MonoCollectListSubscriber.onComplete(MonoCollectList.java:129)
at reactor.core.publisher.FluxHandleFuseable$HandleFuseableSubscriber.onComplete(FluxHandleFuseable.java:236)
at reactor.core.publisher.FluxWindowPredicate$WindowFlux.checkTerminated(FluxWindowPredicate.java:766)
at reactor.core.publisher.FluxWindowPredicate$WindowFlux.drainRegular(FluxWindowPredicate.java:660)
at reactor.core.publisher.FluxWindowPredicate$WindowFlux.drain(FluxWindowPredicate.java:746)
at reactor.core.publisher.FluxWindowPredicate$WindowFlux.onComplete(FluxWindowPredicate.java:812)
at reactor.core.publisher.FluxWindowPredicate$WindowPredicateMain.onNext(FluxWindowPredicate.java:241)
at io.r2dbc.postgresql.util.FluxDiscardOnCancel$FluxDiscardOnCancelSubscriber.onNext(FluxDiscardOnCancel.java:91)
at reactor.core.publisher.FluxContextWrite$ContextWriteSubscriber.onNext(FluxContextWrite.java:107)
at reactor.core.publisher.FluxCreate$BufferAsyncSink.drain(FluxCreate.java:814)
at reactor.core.publisher.FluxCreate$BufferAsyncSink.next(FluxCreate.java:739)
at reactor.core.publisher.FluxCreate$SerializedFluxSink.next(FluxCreate.java:161)
at io.r2dbc.postgresql.client.ReactorNettyClient$Conversation.emit(ReactorNettyClient.java:671)
at io.r2dbc.postgresql.client.ReactorNettyClient$BackendMessageSubscriber.emit(ReactorNettyClient.java:923)
at io.r2dbc.postgresql.client.ReactorNettyClient$BackendMessageSubscriber.onNext(ReactorNettyClient.java:797)
at io.r2dbc.postgresql.client.ReactorNettyClient$BackendMessageSubscriber.onNext(ReactorNettyClient.java:703)
at reactor.core.publisher.FluxHandle$HandleSubscriber.onNext(FluxHandle.java:126)
at reactor.core.publisher.FluxPeekFuseable$PeekConditionalSubscriber.onNext(FluxPeekFuseable.java:854)
at reactor.core.publisher.FluxMap$MapConditionalSubscriber.onNext(FluxMap.java:224)
at reactor.core.publisher.FluxMap$MapConditionalSubscriber.onNext(FluxMap.java:224)
at reactor.netty.channel.FluxReceive.drainReceiver(FluxReceive.java:294)
at reactor.netty.channel.FluxReceive.onInboundNext(FluxReceive.java:403)
at reactor.netty.channel.ChannelOperations.onInboundNext(ChannelOperations.java:404)
at reactor.netty.channel.ChannelOperationsHandler.channelRead(ChannelOperationsHandler.java:113)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:346)
at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:333)
at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:454)
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:290)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412)
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:440)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:788)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562)
at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997)
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74)
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
at java.base/java.lang.Thread.run(Thread.java:1589)
Suppressed: java.lang.Exception: #block terminated with an error
at reactor.core.publisher.BlockingSingleSubscriber.blockingGet(BlockingSingleSubscriber.java:99)
at reactor.core.publisher.Mono.block(Mono.java:1742)
at name.nkonev.r2dbc.migrate.autoconfigure.R2dbcMigrateAutoConfiguration$R2dbcMigrateBlockingInvoker.migrate(R2dbcMigrateAutoConfiguration.java:80)
at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:104)
at java.base/java.lang.reflect.Method.invoke(Method.java:578)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeCustomInitMethod(AbstractAutowireCapableBeanFactory.java:1930)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.invokeInitMethods(AbstractAutowireCapableBeanFactory.java:1872)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1800)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:620)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:542)
at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:335)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:234)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:333)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:208)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:322)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:208)
at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:276)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1391)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1311)
at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:887)
at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:791)
at org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:229)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1372)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1222)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:582)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:542)
at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:335)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:234)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:333)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:208)
at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:410)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateUsingFactoryMethod(AbstractAutowireCapableBeanFactory.java:1352)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1195)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:582)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:542)
at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:335)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:234)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:333)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:208)
at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:330)
at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveValueIfNecessary(BeanDefinitionValueResolver.java:113)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.applyPropertyValues(AbstractAutowireCapableBeanFactory.java:1707)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.populateBean(AbstractAutowireCapableBeanFactory.java:1452)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:619)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:542)
at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:335)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:234)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:333)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:208)
at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:276)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1391)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1311)
at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:887)
at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:791)
at org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:229)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1372)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1222)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:582)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:542)
at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:335)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:234)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:333)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:208)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:955)
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:918)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:583)
at org.springframework.boot.web.reactive.context.ReactiveWebServerApplicationContext.refresh(ReactiveWebServerApplicationContext.java:66)
at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:731)
at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:408)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:307)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1303)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1292)
at com.modios.taxsolution.AdminServiceAppKt.main(AdminServiceApp.kt:14)
From #22:
If we are specifying a non-default schema and the database is empty, the migration fails because the schema does not exist. It would be great to have r2dbc-migrate create the schema specified via r2dbc.migrate.migrations-schema if it does not exist.
I observed this behavior when using Postgres in a Spring Boot app. I understand that I can probably use a SqlR2dbcScriptDatabaseInitializer, but I think this feature should be baked into the library instead.
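As a possible interim workaround (a sketch only; I am assuming the "premigration" filename convention that the library's logs refer to, e.g. "Found 0 premigration sql scripts", together with the `,nontransactional` suffix seen in other issues here), a premigration script could create the schema before anything else runs:

```sql
-- db/migration/V0__create_schemas,premigration.sql  (hypothetical filename)
CREATE SCHEMA IF NOT EXISTS my_schema;
```

Whether a premigration script runs before the internal migrations tables are created would need to be verified against the library's documentation.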
Create a command-line tool to execute the migrate tasks.
Disclaimer: I am not 100% certain this is caused by r2dbc-migrate, but I think the Spring Boot upgrade to 2.7.0 broke some functionality that r2dbc-migrate was relying on.
With spring boot 2.6.6 and r2dbc-migrate 1.8.5 I can successfully migrate H2 and MariaDB databases.
With spring boot 2.7.0 and r2dbc-migrate 1.8.5 I get the following:
2022-05-23 14:34:44,884 INFO [main] name.nkonev.r2dbc.migrate.autoconfigure.R2dbcMigrateAutoConfiguration$R2dbcMigrateBlockingInvoker: Starting R2DBC migration
2022-05-23 14:34:44,886 INFO [main] reactor.util.Loggers$Slf4JLogger: Configured with R2dbcMigrateProperties{enable=true, connectionMaxRetries=500, resourcesPaths=[classpath:/db/migration/*.sql], chunkSize=1000, dialect=null, validationQuery='select '42' as result', validationQueryExpectedResultValue='42', validationQueryTimeout=PT5S, validationRetryDelay=PT1S, acquireLockRetryDelay=PT1S, acquireLockMaxRetries=100, fileCharset=UTF-8, waitForDatabase=true, migrationsSchema='null', migrationsTable='migrations', migrationsLockTable='migrations_lock'}
2022-05-23 14:34:44,932 INFO [main] reactor.util.Loggers$Slf4JLogger: Creating new test connection
2022-05-23 14:34:45,266 INFO [reactor-tcp-epoll-2] reactor.util.Loggers$Slf4JLogger: Comparing expected value '42' with provided result '42'
2022-05-23 14:34:45,268 INFO [reactor-tcp-epoll-2] reactor.util.Loggers$Slf4JLogger: Closing test connection
2022-05-23 14:34:45,279 INFO [reactor-tcp-epoll-2] reactor.util.Loggers$Slf4JLogger: Successfully got result '42' of test query
2022-05-23 14:34:45,317 INFO [reactor-tcp-epoll-2] reactor.util.Loggers$Slf4JLogger: By 'Making internal tables' 0 rows updated
2022-05-23 14:34:45,319 INFO [reactor-tcp-epoll-2] reactor.util.Loggers$Slf4JLogger: By 'Acquiring lock' 1 rows updated
2022-05-23 14:34:45,331 ERROR [reactor-tcp-epoll-2] reactor.util.Loggers$Slf4JLogger: Got error during migration, will release lock
io.netty.util.IllegalReferenceCountException: refCnt: 0
at io.netty.buffer.AbstractByteBuf.ensureAccessible(AbstractByteBuf.java:1454)
at io.netty.buffer.AbstractByteBuf.checkReadableBytes0(AbstractByteBuf.java:1440)
at io.netty.buffer.AbstractByteBuf.readByte(AbstractByteBuf.java:730)
at io.netty.buffer.AbstractByteBuf.readUnsignedByte(AbstractByteBuf.java:744)
at org.mariadb.r2dbc.codec.TextRowDecoder.setPosition(TextRowDecoder.java:78)
at org.mariadb.r2dbc.codec.TextRowDecoder.get(TextRowDecoder.java:19)
at org.mariadb.r2dbc.MariadbReadable.get(MariadbReadable.java:33)
at org.mariadb.r2dbc.MariadbReadable.get(MariadbReadable.java:70)
at name.nkonev.r2dbc.migrate.core.R2dbcMigrate.lambda$getResultSafely$25(R2dbcMigrate.java:294)
at org.mariadb.r2dbc.MariadbResult.lambda$map$1(MariadbResult.java:121)
...after which application startup fails.
Could this be related to r2dbc-migrate?
Thanks for creating this project and being part of the Spring community.
Spring Boot's reference documentation contains some guidelines on naming third-party starters. We prefer to reserve the spring-boot-starter- prefix for Spring Boot's own starters, and for third-party starters to use a -spring-boot-starter suffix instead.
Could you please rename your starter to r2dbc-migrate-spring-boot-starter?
Hi Nikita,
Another feature request.
In Flyway, the version part can be: "Version: version with dots or underscores separating as many parts as you like (not for repeatable migrations)".
I hope this tool can support that and stay consistent with the Flyway naming convention as well.
Thanks for taking a look at it.
Hello,
First of all, your lib is amazing -> Wonderful job.
I would like to suggest adding support for H2 first, because it is focused on tests/TDD, before PostgreSQL, Microsoft SQL Server, and MySQL (although those are useful for tests as well).
Thanks
r2dbc-migrate is throwing an error saying that schema "service_one" does not exist.
The build dependencies are
The application.yml properties are as below:
r2dbc:
  migrate:
    migrations-schema: service_one
    dialect: postgresql
    enable: true
I have 2 migration files in the src/main/resources/db/migration directory named V0__create_schema.sql and V1__create_user.sql
V0__create_schema.sql contains just the SQL statement CREATE SCHEMA IF NOT EXISTS "service_one";
V1__create_user.sql contains a create table statement as follows:
CREATE TABLE service_one."user" (
    id text NOT NULL,
    employee_id int4 NOT NULL,
    job_profile text NULL,
    CONSTRAINT user_pk PRIMARY KEY (id)
);
The service is written in Kotlin and uses Java 17 OpenJDK.
The base database integration test class is as follows:
@ActiveProfiles("dbintegrationtest")
@SpringBootTest(
    webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT,
    classes = [ReactiveKotlinServiceOneApplication::class],
)
@Testcontainers
class SpringBootDatabaseTest {
    companion object : KLogging() {
        @Container
        @ServiceConnection
        @JvmStatic
        private val postgreSQLContainer =
            PostgreSQLContainer<Nothing>(DockerImageName.parse("postgres:12"))
                .withReuse(true) as PostgreSQLContainer<Nothing>
    }
}
The concrete UserRepositoryImplTest is as follows:
class UserRepositoryImplTest : SpringBootDatabaseTest() {

    companion object : KLogging()

    @Autowired
    private lateinit var userRepositoryImpl: UserRepositoryImpl

    @Autowired
    private lateinit var databaseClient: DatabaseClient

    private val userId = "user-1"

    @BeforeEach
    fun `clean up`() {
        val numberOfRowsDeleted =
            databaseClient.sql("delete from ${DatabaseConfig.SCHEMA}.user")
                .fetch().rowsUpdated().block()
        logger.info { "before test, the number of rows deleted in 'user' table: $numberOfRowsDeleted" }
    }

    @Test
    fun `get a non-existent user should return empty Mono`() {
        StepVerifier.create(userRepositoryImpl.getById("non-existent-user-id"))
            .expectComplete()
            .verify()
    }

    @Test
    fun `insert and get user`() {
        val userToBeSaved = User(id = userId, employeeId = 12345)
        val userMatchPredicate: (t: User) -> Boolean = { user ->
            user.id == userToBeSaved.id && user.employeeId == userToBeSaved.employeeId
        }
        StepVerifier.create(userRepositoryImpl.updateOrCreate(userToBeSaved))
            .expectNextMatches(userMatchPredicate)
            .expectComplete()
            .verify()
        StepVerifier.create(userRepositoryImpl.getById(userToBeSaved.id))
            .expectNextMatches(userMatchPredicate)
            .expectComplete()
            .verify()
    }
}
Hi @nkonev,
Thank you so much for your project. It's much appreciated.
Is it possible to create GitHub releases alongside release notes?
Release notes documenting what has been added/changed may help those using the project anticipate any follow-up work needed on their end.
For example, I'm not sure what changed between 2.9.4 and 2.10.0, and whether that's going to impact our project.
Thanks.
Hi! I'm thrilled to use this library in my project, but unfortunately, it does not support migration versions with the long type.
I'm using the 'Flyway Migration Creation' IntelliJ IDE plugin, which creates files as follows:
V20240309222903__create_new_example_table.sql
V20240309222927__insert_initial_example_data.sql
When I run the application, it throws a NumberFormatException because it cannot parse this long value as an int in name.nkonev.r2dbc.migrate.core.FilenameParser#getVersion. Could you please take a look at it?
Thanks in advance!
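For reference, a minimal sketch of the idea (this `getVersion` is an illustrative stand-in, not the library's actual FilenameParser): parsing the part between `V` and `__` with `Long.parseLong` keeps 14-digit timestamp versions in range, where `Integer.parseInt` throws `NumberFormatException`:

```java
public class VersionParsing {
    // Illustrative stand-in; assumes the V<version>__<description>.sql filename layout.
    static long getVersion(String filename) {
        String stem = filename.substring(1);                    // drop the leading 'V'
        String versionPart = stem.substring(0, stem.indexOf("__"));
        // Long.parseLong accepts 14-digit timestamp-style versions,
        // which overflow int and make Integer.parseInt throw.
        return Long.parseLong(versionPart);
    }

    public static void main(String[] args) {
        System.out.println(getVersion("V20240309222903__create_new_example_table.sql")); // 20240309222903
        System.out.println(getVersion("V1__init.sql")); // 1
    }
}
```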
Recently I found your library, and it seems it will help with migrating my database in an R2DBC environment. But I found that I cannot use it, because your library is compiled with a higher version of Java:
<java.version>11</java.version>
Error message looks like this:
Caused by: java.lang.UnsupportedClassVersionError: name/nkonev/r2dbc/migrate/autoconfigure/R2dbcMigrateAutoConfiguration$R2dbcConnectionFactoryDependsOnBeanFactoryPostProcessor has been compiled by a more recent version of the Java Runtime (class file version 55.0), this version of the Java Runtime only recognizes class file versions up to 52.0
at java.lang.ClassLoader.defineClass1(Native Method) ~[na:1.8.0_242]
at java.lang.ClassLoader.defineClass(ClassLoader.java:757) ~[na:1.8.0_242]
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142) ~[na:1.8.0_242]
at java.net.URLClassLoader.defineClass(URLClassLoader.java:468) ~[na:1.8.0_242]
at java.net.URLClassLoader.access$100(URLClassLoader.java:74) ~[na:1.8.0_242]
at java.net.URLClassLoader$1.run(URLClassLoader.java:369) ~[na:1.8.0_242]
at java.net.URLClassLoader$1.run(URLClassLoader.java:363) ~[na:1.8.0_242]
at java.security.AccessController.doPrivileged(Native Method) ~[na:1.8.0_242]
at java.net.URLClassLoader.findClass(URLClassLoader.java:362) ~[na:1.8.0_242]
at java.lang.ClassLoader.loadClass(ClassLoader.java:419) ~[na:1.8.0_242]
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352) ~[na:1.8.0_242]
at java.lang.ClassLoader.loadClass(ClassLoader.java:352) ~[na:1.8.0_242]
at java.lang.Class.forName0(Native Method) ~[na:1.8.0_242]
at java.lang.Class.forName(Class.java:348) ~[na:1.8.0_242]
at org.springframework.util.ClassUtils.forName(ClassUtils.java:285) ~[spring-core-5.2.6.RELEASE.jar:5.2.6.RELEASE]
at org.springframework.beans.factory.support.AbstractBeanDefinition.resolveBeanClass(AbstractBeanDefinition.java:469) ~[spring-beans-5.2.6.RELEASE.jar:5.2.6.RELEASE]
at org.springframework.beans.factory.support.AbstractBeanFactory.doResolveBeanClass(AbstractBeanFactory.java:1545) ~[spring-beans-5.2.6.RELEASE.jar:5.2.6.RELEASE]
at org.springframework.beans.factory.support.AbstractBeanFactory.resolveBeanClass(AbstractBeanFactory.java:1472) ~[spring-beans-5.2.6.RELEASE.jar:5.2.6.RELEASE]
... 16 common frames omitted
I am still using Java 8. Would it be possible to compile the library with target compatibility for Java 8?
Thanks.
If any changes are made to an already migrated SQL script file and it is run again, there are no logs (neither an error nor any info) about this.
It seems the library has the following logic: for a file named like V1__xxxxxx.sql, during migration it parses the version (1 here) and checks in the migrations table whether a record with id=1 is present. If it is there, it does nothing; otherwise it executes the script.
Thus even if we rename the file after the __ (double underscore) while keeping the version the same, or change its content, the change is simply ignored.
This behaviour is quite different from Flyway, where a checksum is computed for every script.
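To make the contrast concrete, here is a rough sketch of the Flyway-style idea (illustrative only, not library code): store a checksum of each script's content alongside its version, and compare it on later runs to detect edits to an already-applied migration. Flyway itself uses a CRC32-based checksum; a simplified version:

```java
import java.nio.charset.StandardCharsets;
import java.util.zip.CRC32;

public class ScriptChecksum {
    // CRC32 over the raw script text; stored next to the version,
    // a mismatch on a later run would flag a modified migration.
    static long checksum(String scriptContent) {
        CRC32 crc = new CRC32();
        crc.update(scriptContent.getBytes(StandardCharsets.UTF_8));
        return crc.getValue();
    }

    public static void main(String[] args) {
        long original = checksum("CREATE TABLE users (id BIGSERIAL PRIMARY KEY);");
        long edited = checksum("CREATE TABLE users (id BIGSERIAL PRIMARY KEY, name TEXT);");
        // An edited script yields a different checksum, so the change is detectable:
        System.out.println(original != edited); // true
    }
}
```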
Another probably missing feature:
For a multi-tenant microservice, if we choose the schema-per-tenant approach (i.e. same database server, different schemas for different tenants), then using this library, how are we supposed to run the migration scripts (creating the tables etc.) for every schema during application startup? If I am not wrong, the ConnectionFactory requires specifying a schema.
Follow the Spring Boot pattern: set classpath:/db/migration/*.sql as the default resource path in the R2dbc Migrate Spring Boot starter. Then just adding the dependency, with no extra config, makes it work.
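For comparison, this is the sort of explicit configuration that users currently have to spell out (property names as used in the configuration examples elsewhere in these issues) and that the proposed default would make unnecessary:

```yaml
# Today's hand-written config; the proposal makes this the out-of-the-box default.
r2dbc:
  migrate:
    resources-paths:
      - classpath:/db/migration/*.sql
```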
I was using R2dbc Migrate 1.8.5 with Spring Boot 2.7.x, and it worked well.
These days we are upgrading to Spring Boot 3.0, and R2dbc Migrate did not start at all.
Our project's new stack:
I tried updating R2dbc Migrate to 2.7.8 and got the following exceptions.
It seems R2dbc Migrate does not create the table migrations_lock before running the migrate command.
The reproduce example: https://github.com/hantsy/spring-r2dbc-sample/tree/master/r2dbc-migrate (Spring Boot 3.0.0, Java 17, R2dbc Postgres)
2022-12-02T11:49:04.079+08:00 INFO 12860 --- [ main] n.n.r.m.a.R2dbcMigrateAutoConfiguration : Starting R2DBC migration
2022-12-02T11:49:04.086+08:00 INFO 12860 --- [ main] n.n.r2dbc.migrate.core.R2dbcMigrate : Configured with R2dbcMigrateProperties{enable=true, connectionMaxRetries=500, resourcesPaths=[classpath:/db/migration/*.sql], chunkSize=1000, dialect=null, validationQuery='select '42' as result', validationQueryExpectedResultValue='42', validationQueryTimeout=PT5S, validationRetryDelay=PT1S, acquireLockRetryDelay=PT1S, acquireLockMaxRetries=100, fileCharset=UTF-8, waitForDatabase=true, migrationsSchema='null', migrationsTable='migrations', migrationsLockTable='migrations_lock'}
2022-12-02T11:49:04.117+08:00 INFO 12860 --- [ main] n.n.r2dbc.migrate.core.R2dbcMigrate : Found 21 sql scripts, see details below
2022-12-02T11:49:04.118+08:00 INFO 12860 --- [ main] n.n.r2dbc.migrate.core.R2dbcMigrate : Found 0 premigration sql scripts
2022-12-02T11:49:04.121+08:00 INFO 12860 --- [ main] n.n.r2dbc.migrate.core.R2dbcMigrate : Found 21 migration sql scripts
2022-12-02T11:49:04.199+08:00 INFO 12860 --- [ main] n.n.r2dbc.migrate.core.R2dbcMigrate : Creating new test connection
2022-12-02T11:49:05.202+08:00 INFO 12860 --- [actor-tcp-nio-1] n.n.r2dbc.migrate.core.R2dbcMigrate : Comparing expected value '42' with provided result '42'
2022-12-02T11:49:05.203+08:00 INFO 12860 --- [actor-tcp-nio-1] n.n.r2dbc.migrate.core.R2dbcMigrate : Closing test connection
2022-12-02T11:49:05.214+08:00 INFO 12860 --- [actor-tcp-nio-1] n.n.r2dbc.migrate.core.R2dbcMigrate : Successfully got result '42' of test query
2022-12-02T11:49:05.524+08:00 ERROR 12860 --- [actor-tcp-nio-1] n.n.r2dbc.migrate.core.R2dbcMigrate : Got error during migration, will release lock
java.lang.ClassCastException: class java.lang.Long cannot be cast to class java.lang.Integer (java.lang.Long and java.lang.Integer are in module java.base of loader 'bootstrap')
at reactor.core.publisher.MonoReduceSeed$ReduceSeedSubscriber.onNext(MonoReduceSeed.java:116) ~[reactor-core-3.5.0.jar!/:3.5.0]
at reactor.core.publisher.FluxSwitchIfEmpty$SwitchIfEmptySubscriber.onNext(FluxSwitchIfEmpty.java:74) ~[reactor-core-3.5.0.jar!/:3.5.0]
at reactor.core.publisher.FluxFlatMap$FlatMapMain.tryEmit(FluxFlatMap.java:543) ~[reactor-core-3.5.0.jar!/:3.5.0]
at reactor.core.publisher.FluxFlatMap$FlatMapInner.onNext(FluxFlatMap.java:984) ~[reactor-core-3.5.0.jar!/:3.5.0]
at reactor.core.publisher.FluxHandleFuseable$HandleFuseableSubscriber.onNext(FluxHandleFuseable.java:193) ~[reactor-core-3.5.0.jar!/:3.5.0]
at reactor.core.publisher.Operators$BaseFluxToMonoOperator.completePossiblyEmpty(Operators.java:2034) ~[reactor-core-3.5.0.jar!/:3.5.0]
at reactor.core.publisher.MonoCollectList$MonoCollectListSubscriber.onComplete(MonoCollectList.java:118) ~[reactor-core-3.5.0.jar!/:3.5.0]
at reactor.core.publisher.FluxHandleFuseable$HandleFuseableSubscriber.onComplete(FluxHandleFuseable.java:238) ~[reactor-core-3.5.0.jar!/:3.5.0]
at reactor.core.publisher.FluxWindowPredicate$WindowFlux.checkTerminated(FluxWindowPredicate.java:768) ~[reactor-core-3.5.0.jar!/:3.5.0]
at reactor.core.publisher.FluxWindowPredicate$WindowFlux.drainRegular(FluxWindowPredicate.java:662) ~[reactor-core-3.5.0.jar!/:3.5.0]
at reactor.core.publisher.FluxWindowPredicate$WindowFlux.drain(FluxWindowPredicate.java:748) ~[reactor-core-3.5.0.jar!/:3.5.0]
at reactor.core.publisher.FluxWindowPredicate$WindowFlux.onComplete(FluxWindowPredicate.java:814) ~[reactor-core-3.5.0.jar!/:3.5.0]
at reactor.core.publisher.FluxWindowPredicate$WindowPredicateMain.onNext(FluxWindowPredicate.java:243) ~[reactor-core-3.5.0.jar!/:3.5.0]
at io.r2dbc.postgresql.util.FluxDiscardOnCancel$FluxDiscardOnCancelSubscriber.onNext(FluxDiscardOnCancel.java:91) ~[r2dbc-postgresql-1.0.0.RELEASE.jar!/:1.0.0.RELEASE]
at reactor.core.publisher.FluxContextWrite$ContextWriteSubscriber.onNext(FluxContextWrite.java:107) ~[reactor-core-3.5.0.jar!/:3.5.0]
at reactor.core.publisher.FluxCreate$BufferAsyncSink.drain(FluxCreate.java:814) ~[reactor-core-3.5.0.jar!/:3.5.0]
at reactor.core.publisher.FluxCreate$BufferAsyncSink.next(FluxCreate.java:739) ~[reactor-core-3.5.0.jar!/:3.5.0]
at reactor.core.publisher.FluxCreate$SerializedFluxSink.next(FluxCreate.java:161) ~[reactor-core-3.5.0.jar!/:3.5.0]
at io.r2dbc.postgresql.client.ReactorNettyClient$Conversation.emit(ReactorNettyClient.java:687) ~[r2dbc-postgresql-1.0.0.RELEASE.jar!/:1.0.0.RELEASE]
at io.r2dbc.postgresql.client.ReactorNettyClient$BackendMessageSubscriber.emit(ReactorNettyClient.java:939) ~[r2dbc-postgresql-1.0.0.RELEASE.jar!/:1.0.0.RELEASE]
at io.r2dbc.postgresql.client.ReactorNettyClient$BackendMessageSubscriber.onNext(ReactorNettyClient.java:813) ~[r2dbc-postgresql-1.0.0.RELEASE.jar!/:1.0.0.RELEASE]
at io.r2dbc.postgresql.client.ReactorNettyClient$BackendMessageSubscriber.onNext(ReactorNettyClient.java:719) ~[r2dbc-postgresql-1.0.0.RELEASE.jar!/:1.0.0.RELEASE]
at reactor.core.publisher.FluxHandle$HandleSubscriber.onNext(FluxHandle.java:128) ~[reactor-core-3.5.0.jar!/:3.5.0]
at reactor.core.publisher.FluxPeekFuseable$PeekConditionalSubscriber.onNext(FluxPeekFuseable.java:854) ~[reactor-core-3.5.0.jar!/:3.5.0]
at reactor.core.publisher.FluxMap$MapConditionalSubscriber.onNext(FluxMap.java:224) ~[reactor-core-3.5.0.jar!/:3.5.0]
at reactor.core.publisher.FluxMap$MapConditionalSubscriber.onNext(FluxMap.java:224) ~[reactor-core-3.5.0.jar!/:3.5.0]
at reactor.netty.channel.FluxReceive.drainReceiver(FluxReceive.java:292) ~[reactor-netty-core-1.1.0.jar!/:1.1.0]
at reactor.netty.channel.FluxReceive.onInboundNext(FluxReceive.java:401) ~[reactor-netty-core-1.1.0.jar!/:1.1.0]
at reactor.netty.channel.ChannelOperations.onInboundNext(ChannelOperations.java:411) ~[reactor-netty-core-1.1.0.jar!/:1.1.0]
at reactor.netty.channel.ChannelOperationsHandler.channelRead(ChannelOperationsHandler.java:113) ~[reactor-netty-core-1.1.0.jar!/:1.1.0]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444) ~[netty-transport-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420) ~[netty-transport-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412) ~[netty-transport-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:346) ~[netty-codec-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:333) ~[netty-codec-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:454) ~[netty-codec-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:290) ~[netty-codec-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444) ~[netty-transport-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420) ~[netty-transport-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412) ~[netty-transport-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410) ~[netty-transport-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:440) ~[netty-transport-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420) ~[netty-transport-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919) ~[netty-transport-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166) ~[netty-transport-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:788) ~[netty-transport-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724) ~[netty-transport-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650) ~[netty-transport-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562) ~[netty-transport-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997) ~[netty-common-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) ~[netty-common-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) ~[netty-common-4.1.85.Final.jar!/:4.1.85.Final]
at java.base/java.lang.Thread.run(Thread.java:833) ~[na:na]
Caused by: org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'org.springframework.boot.autoconfigure.data.r2dbc.R2dbcDataAutoConfiguration': Unsatisfied dependency expressed through constructor parameter 0: Error creating bean with name 'r2dbcMigrate' defined in class path resource [name/nkonev/r2dbc/migrate/autoconfigure/R2dbcMigrateAutoConfiguration.class]: relation "migrations_lock" does not exist
at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:793) ~[spring-beans-6.0.2.jar!/:6.0.2]
at org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:242) ~[spring-beans-6.0.2.jar!/:6.0.2]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1344) ~[spring-beans-6.0.2.jar!/:6.0.2]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1188) ~[spring-beans-6.0.2.jar!/:6.0.2]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:561) ~[spring-beans-6.0.2.jar!/:6.0.2]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:521) ~[spring-beans-6.0.2.jar!/:6.0.2]
at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:326) ~[spring-beans-6.0.2.jar!/:6.0.2]
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:234) ~[spring-beans-6.0.2.jar!/:6.0.2]
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:324) ~[spring-beans-6.0.2.jar!/:6.0.2]
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:200) ~[spring-beans-6.0.2.jar!/:6.0.2]
at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:412) ~[spring-beans-6.0.2.jar!/:6.0.2]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateUsingFactoryMethod(AbstractAutowireCapableBeanFactory.java:1324) ~[spring-beans-6.0.2.jar!/:6.0.2]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1161) ~[spring-beans-6.0.2.jar!/:6.0.2]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:561) ~[spring-beans-6.0.2.jar!/:6.0.2]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:521) ~[spring-beans-6.0.2.jar!/:6.0.2]
at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:326) ~[spring-beans-6.0.2.jar!/:6.0.2]
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:234) ~[spring-beans-6.0.2.jar!/:6.0.2]
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:324) ~[spring-beans-6.0.2.jar!/:6.0.2]
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:200) ~[spring-beans-6.0.2.jar!/:6.0.2]
at org.springframework.beans.factory.support.BeanDefinitionValueResolver.resolveReference(BeanDefinitionValueResolver.java:365) ~[spring-beans-6.0.2.jar!/:6.0.2]
... 41 common frames omitted
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'r2dbcMigrate' defined in class path resource [name/nkonev/r2dbc/migrate/autoconfigure/R2dbcMigrateAutoConfiguration.class]: relation "migrations_lock" does not exist
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.initializeBean(AbstractAutowireCapableBeanFactory.java:1751) ~[spring-beans-6.0.2.jar!/:6.0.2]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:599) ~[spring-beans-6.0.2.jar!/:6.0.2]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:521) ~[spring-beans-6.0.2.jar!/:6.0.2]
at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:326) ~[spring-beans-6.0.2.jar!/:6.0.2]
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:234) ~[spring-beans-6.0.2.jar!/:6.0.2]
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:324) ~[spring-beans-6.0.2.jar!/:6.0.2]
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:200) ~[spring-beans-6.0.2.jar!/:6.0.2]
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:313) ~[spring-beans-6.0.2.jar!/:6.0.2]
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:200) ~[spring-beans-6.0.2.jar!/:6.0.2]
at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:254) ~[spring-beans-6.0.2.jar!/:6.0.2]
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1405) ~[spring-beans-6.0.2.jar!/:6.0.2]
at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1325) ~[spring-beans-6.0.2.jar!/:6.0.2]
at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:880) ~[spring-beans-6.0.2.jar!/:6.0.2]
at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:784) ~[spring-beans-6.0.2.jar!/:6.0.2]
... 60 common frames omitted
Caused by: io.r2dbc.postgresql.ExceptionFactory$PostgresqlBadGrammarException: relation "migrations_lock" does not exist
at io.r2dbc.postgresql.ExceptionFactory.createException(ExceptionFactory.java:96) ~[r2dbc-postgresql-1.0.0.RELEASE.jar!/:1.0.0.RELEASE]
at io.r2dbc.postgresql.ExceptionFactory.createException(ExceptionFactory.java:65) ~[r2dbc-postgresql-1.0.0.RELEASE.jar!/:1.0.0.RELEASE]
at io.r2dbc.postgresql.ExceptionFactory.handleErrorResponse(ExceptionFactory.java:132) ~[r2dbc-postgresql-1.0.0.RELEASE.jar!/:1.0.0.RELEASE]
at io.r2dbc.postgresql.PostgresqlResult.lambda$getRowsUpdated$0(PostgresqlResult.java:70) ~[r2dbc-postgresql-1.0.0.RELEASE.jar!/:1.0.0.RELEASE]
at reactor.core.publisher.FluxHandleFuseable$HandleFuseableSubscriber.onNext(FluxHandleFuseable.java:178) ~[reactor-core-3.5.0.jar!/:3.5.0]
at reactor.core.publisher.FluxWindowPredicate$WindowFlux.drainRegular(FluxWindowPredicate.java:670) ~[reactor-core-3.5.0.jar!/:3.5.0]
at reactor.core.publisher.FluxWindowPredicate$WindowFlux.drain(FluxWindowPredicate.java:748) ~[reactor-core-3.5.0.jar!/:3.5.0]
at reactor.core.publisher.FluxWindowPredicate$WindowFlux.onNext(FluxWindowPredicate.java:790) ~[reactor-core-3.5.0.jar!/:3.5.0]
at reactor.core.publisher.FluxWindowPredicate$WindowPredicateMain.onNext(FluxWindowPredicate.java:241) ~[reactor-core-3.5.0.jar!/:3.5.0]
at io.r2dbc.postgresql.util.FluxDiscardOnCancel$FluxDiscardOnCancelSubscriber.onNext(FluxDiscardOnCancel.java:91) ~[r2dbc-postgresql-1.0.0.RELEASE.jar!/:1.0.0.RELEASE]
at reactor.core.publisher.FluxContextWrite$ContextWriteSubscriber.onNext(FluxContextWrite.java:107) ~[reactor-core-3.5.0.jar!/:3.5.0]
at reactor.core.publisher.FluxCreate$BufferAsyncSink.drain(FluxCreate.java:814) ~[reactor-core-3.5.0.jar!/:3.5.0]
at reactor.core.publisher.FluxCreate$BufferAsyncSink.next(FluxCreate.java:739) ~[reactor-core-3.5.0.jar!/:3.5.0]
at reactor.core.publisher.FluxCreate$SerializedFluxSink.next(FluxCreate.java:161) ~[reactor-core-3.5.0.jar!/:3.5.0]
at io.r2dbc.postgresql.client.ReactorNettyClient$Conversation.emit(ReactorNettyClient.java:687) ~[r2dbc-postgresql-1.0.0.RELEASE.jar!/:1.0.0.RELEASE]
at io.r2dbc.postgresql.client.ReactorNettyClient$BackendMessageSubscriber.emit(ReactorNettyClient.java:939) ~[r2dbc-postgresql-1.0.0.RELEASE.jar!/:1.0.0.RELEASE]
at io.r2dbc.postgresql.client.ReactorNettyClient$BackendMessageSubscriber.onNext(ReactorNettyClient.java:813) ~[r2dbc-postgresql-1.0.0.RELEASE.jar!/:1.0.0.RELEASE]
at io.r2dbc.postgresql.client.ReactorNettyClient$BackendMessageSubscriber.onNext(ReactorNettyClient.java:719) ~[r2dbc-postgresql-1.0.0.RELEASE.jar!/:1.0.0.RELEASE]
at reactor.core.publisher.FluxHandle$HandleSubscriber.onNext(FluxHandle.java:128) ~[reactor-core-3.5.0.jar!/:3.5.0]
at reactor.core.publisher.FluxPeekFuseable$PeekConditionalSubscriber.onNext(FluxPeekFuseable.java:854) ~[reactor-core-3.5.0.jar!/:3.5.0]
at reactor.core.publisher.FluxMap$MapConditionalSubscriber.onNext(FluxMap.java:224) ~[reactor-core-3.5.0.jar!/:3.5.0]
at reactor.core.publisher.FluxMap$MapConditionalSubscriber.onNext(FluxMap.java:224) ~[reactor-core-3.5.0.jar!/:3.5.0]
at reactor.netty.channel.FluxReceive.drainReceiver(FluxReceive.java:292) ~[reactor-netty-core-1.1.0.jar!/:1.1.0]
at reactor.netty.channel.FluxReceive.onInboundNext(FluxReceive.java:401) ~[reactor-netty-core-1.1.0.jar!/:1.1.0]
at reactor.netty.channel.ChannelOperations.onInboundNext(ChannelOperations.java:411) ~[reactor-netty-core-1.1.0.jar!/:1.1.0]
at reactor.netty.channel.ChannelOperationsHandler.channelRead(ChannelOperationsHandler.java:113) ~[reactor-netty-core-1.1.0.jar!/:1.1.0]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444) ~[netty-transport-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420) ~[netty-transport-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412) ~[netty-transport-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:346) ~[netty-codec-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:318) ~[netty-codec-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:444) ~[netty-transport-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420) ~[netty-transport-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:412) ~[netty-transport-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410) ~[netty-transport-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:440) ~[netty-transport-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:420) ~[netty-transport-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919) ~[netty-transport-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166) ~[netty-transport-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:788) ~[netty-transport-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:724) ~[netty-transport-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:650) ~[netty-transport-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:562) ~[netty-transport-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:997) ~[netty-common-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) ~[netty-common-4.1.85.Final.jar!/:4.1.85.Final]
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) ~[netty-common-4.1.85.Final.jar!/:4.1.85.Final]
at java.base/java.lang.Thread.run(Thread.java:833) ~[na:na]
Suppressed: java.lang.Exception: #block terminated with an error
at reactor.core.publisher.BlockingSingleSubscriber.blockingGet(BlockingSingleSubscriber.java:99) ~[reactor-core-3.5.0.jar!/:3.5.0]
at reactor.core.publisher.Mono.block(Mono.java:1710) ~[reactor-core-3.5.0.jar!/:3.5.0]
at name.nkonev.r2dbc.migrate.autoconfigure.R2dbcMigrateAutoConfiguration$R2dbcMigrateBlockingInvoker.migrate(R2dbcMigrateAutoConfiguration.java:80) ~[r2dbc-migrate-spring-boot-starter-2.7.8.jar!/:na]
The GitHub project related to this issue: https://github.com/PauloPortfolio/springwebflux-tdd-devdojo
The special character "$" is not accepted during the migration process.
For example, this INSERT statement is completely normal and is processed by r2dbc-migrate without problems BECAUSE the password is a plain-text one (DEVDOJO):
INSERT INTO userspasswords
(nameuser, username, password, authorities)
VALUES ('demetria',
'demetria',
'devdojo',
'ROLE_USER');
HOWEVER, when I change the 'password' field from the plain-text 'DEVDOJO' to the encrypted password ({bcrypt}$2a$10$lBm1Qy45bR/fNT5i5OUBseNPcONtZs10earjLZ773qq.byhK/yKmS), the special characters, especially "$", are not accepted. For instance:
INSERT INTO userspasswords
(nameuser, username, password, authorities)
VALUES ('demetria',
'demetria',
'{bcrypt}$2a$10$lBm1Qy45bR/fNT5i5OUBseNPcONtZs10earjLZ773qq.byhK/yKmS',
'ROLE_USER');
Then this final error happens:
By 'Writing metadata version 3' 1 rows updated
web-api_1 | 2020-08-04 20:54:43.971 INFO 7 --- [tor-tcp-epoll-1] n.n.r2dbc.migrate.core.R2dbcMigrate : Applying MigrationInfo{version=4, description='insert in table userpasswords', splitByLine=false, transactional=true}
web-api_1 | 2020-08-04 20:54:43.977 INFO 7 --- [tor-tcp-epoll-1] n.n.r2dbc.migrate.core.R2dbcMigrate : By 'Releasing lock after error' 1 rows updated
web-api_1 | 2020-08-04 20:54:43.980 WARN 7 --- [ main] onfigReactiveWebServerApplicationContext : Exception encountered during context initialization - cancelling refresh attempt: org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'animeController': Unsatisfied dependency expressed through field 'service'; nested exception is org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'animeService': Unsatisfied dependency expressed through field 'repo'; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'animeRepository' defined in academy.devdojo.webflux.repository.AnimeRepository defined in @EnableR2dbcRepositories declared on R2dbcRepositoriesAutoConfigureRegistrar.EnableR2dbcRepositoriesConfiguration: Cannot resolve reference to bean 'r2dbcDatabaseClient' while setting bean property 'databaseClient'; nested exception is org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'org.springframework.boot.autoconfigure.data.r2dbc.R2dbcDataAutoConfiguration': Unsatisfied dependency expressed through constructor parameter 0; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'r2dbcMigrate' defined in class path resource [name/nkonev/r2dbc/migrate/autoconfigure/R2dbcMigrateAutoConfiguration.class]: Invocation of init method failed; nested exception is java.lang.IllegalArgumentException: Statement 'INSERT INTO userspasswords (nameuser, username, password, authorities)
web-api_1 | VALUES ('paulo',
web-api_1 | 'paulo',
web-api_1 | '{bcrypt}$2a$10$lBm1Qy45bR/fNT5i5OUBseNPcONtZs10earjLZ773qq.byhK/yKmS',
web-api_1 | 'ROLE_ADMIN,ROLE_USER');
web-api_1 |
web-api_1 | INSERT INTO userspasswords (nameuser, username, password, authorities)
web-api_1 | VALUES ('demetria',
web-api_1 | 'demetria',
web-api_1 | 'devdojo',
web-api_1 | 'ROLE_USER');' cannot be created. This is often due to the presence of both multiple statements and parameters at the same time.
web-api_1 | 2020-08-04 20:54:43.991 INFO 7 --- [ main] ConditionEvaluationReportLoggingListener :
web-api_1 |
web-api_1 | Error starting ApplicationContext. To display the conditions report re-run your application with 'debug' enabled.
web-api_1 | 2020-08-04 20:54:43.996 ERROR 7 --- [ main] o.s.boot.SpringApplication : Application run failed
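The error message above points at the combination of multiple statements with what looks like parameters: a naive scanner that treats `$<digits>` as a PostgreSQL positional parameter marker will misread a bcrypt hash, since `$2a$10$...` contains `$2` and `$10`. A minimal illustration (my own sketch, not the library's actual parser):

```java
import java.util.regex.Pattern;

// Sketch: why "$2a$10$..." trips a parameter detector. PostgreSQL uses
// $1, $2, ... as positional parameter placeholders, and the bcrypt prefix
// happens to contain exactly that shape.
public class DollarParamSketch {
    private static final Pattern POSITIONAL = Pattern.compile("\\$\\d+");

    static boolean containsPositionalMarker(String sql) {
        return POSITIONAL.matcher(sql).find();
    }
}
```

Given the wording of the error ("both multiple statements and parameters at the same time"), keeping one statement per migration file may avoid half of that condition; this is a guess based on the message, not a documented workaround.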
Run and test the migration without applying it to the database
from #22
I'm running R2DBC Migrate with only version 1 of my migration scripts. The PostgreSQL database is empty. During startup, here are my error logs:
2020-08-18 11:36:58.807+0200 INFO 26064 --- [ restartedMain] n.n.r.m.a.R2dbcMigrateAutoConfiguration : Starting R2DBC migration
2020-08-18 11:36:58.809+0200 INFO 26064 --- [ restartedMain] n.n.r.m.c.R2dbcMigrate : Configured with R2dbcMigrateProperties{connectionMaxRetries=500, resourcesPaths=[classpath:/db/migration/*.sql], chunkSize=1000, dialect=POSTGRESQL, validationQuery='select '42' as result', validationQueryExpectedResultValue='42', validationQueryTimeout=PT5S, validationRetryDelay=PT1S, acquireLockRetryDelay=PT1S, acquireLockMaxRetries=100, fileCharset=UTF-8, waitForDatabase=true, migrationsSchema='null', migrationsTable='migrations', migrationsLockTable='migrations_lock'}
2020-08-18 11:36:58.904+0200 INFO 26064 --- [ restartedMain] n.n.r.m.c.R2dbcMigrate : Creating new test connection
2020-08-18 11:36:59.557+0200 INFO 26064 --- [actor-tcp-nio-1] n.n.r.m.c.R2dbcMigrate : Comparing expected value '42' with provided result '42'
2020-08-18 11:36:59.559+0200 INFO 26064 --- [actor-tcp-nio-1] n.n.r.m.c.R2dbcMigrate : Closing test connection
2020-08-18 11:36:59.569+0200 INFO 26064 --- [actor-tcp-nio-1] n.n.r.m.c.R2dbcMigrate : Successfully got result '42' of test query
2020-08-18 11:36:59.661+0200 INFO 26064 --- [actor-tcp-nio-1] n.n.r.m.c.R2dbcMigrate : By 'Making internal tables' 1 rows updated
2020-08-18 11:36:59.685+0200 INFO 26064 --- [actor-tcp-nio-1] n.n.r.m.c.R2dbcMigrate : By 'Acquiring lock' 1 rows updated
2020-08-18 11:36:59.691+0200 INFO 26064 --- [actor-tcp-nio-1] n.n.r.m.c.R2dbcMigrate : Database version is 0
2020-08-18 11:36:59.701+0200 INFO 26064 --- [actor-tcp-nio-1] n.n.r.m.c.R2dbcMigrate : Applying MigrationInfo{version=1, description='Initial database', splitByLine=false, transactional=true}
2020-08-18 11:36:59.881+0200 INFO 26064 --- [actor-tcp-nio-1] n.n.r.m.c.R2dbcMigrate : By 'MigrationInfo{version=1, description='Initial database', splitByLine=false, transactional=true}' 1 rows updated
2020-08-18 11:36:59.907+0200 WARN 26064 --- [actor-tcp-nio-1] i.r.p.c.ReactorNettyClient : Error: SEVERITY_LOCALIZED=ERROR, SEVERITY_NON_LOCALIZED=ERROR, CODE=42P01, MESSAGE=relation "migrations" does not exist, POSITION=13, FILE=d:\pginstaller_12.auto\postgres.windows-x64\src\backend\parser\parse_relation.c, LINE=1194, ROUTINE=parserOpenTable
2020-08-18 11:36:59.918+0200 ERROR 26064 --- [actor-tcp-nio-1] n.n.r.m.c.R2dbcMigrate : Got error
io.r2dbc.postgresql.ExceptionFactory$PostgresqlBadGrammarException: relation "migrations" does not exist
at io.r2dbc.postgresql.ExceptionFactory.createException(ExceptionFactory.java:85) ~[r2dbc-postgresql-0.8.4.RELEASE.jar:0.8.4.RELEASE]
However, the migrations and migrations_lock tables do get created.
Hi Nikita,
Again here.
I would like support for migrating multiple schemas: https://flywaydb.org/documentation/faq.html#multiple-schemas.
I am more than happy to implement some of the features I am requesting now.
Running migrations on H2 with MODE=PostgreSQL causes the following exception.
io.r2dbc.spi.R2dbcBadGrammarException: Duplicate column name "?column?"; SQL statement:
insert into "migrations_lock"(id, locked) select * from (select 1, false) x where not exists(select * from "migrations_lock") [42121-214]
I am not sure whether this is an issue in r2dbc-migrate or in r2dbc-h2. I don't understand why it occurs.
name.nkonev.r2dbc-migrate:r2dbc-migrate-spring-boot-starter:2.7.6
org.springframework.boot:spring-boot-starter-parent:2.7.1
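A plausible explanation (unverified): H2 names both unaliased columns of `select 1, false` as `?column?`, so the derived table `x` ends up with a duplicate column name. Aliasing the columns in the quoted statement would sidestep that, e.g.:

```sql
-- Equivalent statement with explicit column aliases for the derived table:
insert into "migrations_lock"(id, locked)
select * from (select 1 as id, false as locked) x
where not exists(select * from "migrations_lock")
```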
Hi Nikita,
Thanks for making this handy tool.
One feature I would like to have in this tool is support for Flyway-style repeatable migrations.
Currently, when I have an R__xxx.sql file, I am getting
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'r2dbcMigrate' defined in class path resource [name/nkonev/r2dbc/migrate/autoconfigure/R2dbcMigrateAutoConfiguration.class]: Invocation of init method failed; nested exception is java.lang.NumberFormatException: For input string: "R"
I am using the latest release.
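The exception is easy to reproduce in isolation: `Integer.parseInt("R")` throws exactly this `NumberFormatException`, which suggests the file-name prefix is parsed as a plain integer with no branch for repeatable migrations. A minimal sketch of that failure mode (my own code, not the library's):

```java
// Minimal reproduction of the reported failure mode: parsing the
// non-numeric prefix of "R__xxx.sql" as an integer.
public class RepeatablePrefixSketch {
    static Integer tryParse(String prefix) {
        try {
            return Integer.parseInt(prefix);
        } catch (NumberFormatException e) {
            return null; // "R" lands here, mirroring the reported exception
        }
    }
}
```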
Spring Boot 3 is at the RC phase. Unfortunately, R2DBC Migrate is broken because Spring Framework 6 uses the R2DBC 1.0 SPI, among other changes. I've tried updating the dependency locally, and it seems to work with the supported drivers. Maybe release a milestone artifact based on Spring Boot 3 for testing, to provide valuable feedback to the Spring team?