
lc-spring-data-r2dbc's Introduction

lc-spring-data-r2dbc


The goal of this library is to provide basic ORM features not covered by Spring Data R2DBC (now part of the Spring Data Relational project).

Features

  • Lazy loading
  • Linked entities (1 to 1, 1 to n, n to 1, n to n)
  • Select statement with joins
  • Save (insert/update) with cascade
  • Delete with cascade
  • Composite Id
  • Sequence
  • Insert multiple rows in a single INSERT request (except for MySql)
  • Schema generation, with indexes, foreign key constraints, sequences
  • Array columns (only with Postgresql)

Features are detailed with examples in the wiki section.

Supported databases

  • H2
  • Postgres
  • MySql

Dependencies version

Dependency            groupId                     artifactId                      version
Spring Boot           org.springframework.boot   spring-boot-starter-data-r2dbc  2.6.7
Apache Commons Lang   org.apache.commons          commons-lang3                   3.12.0
Javassist             org.javassist               javassist                       3.28.0-GA
H2 driver             io.r2dbc                    r2dbc-h2                        0.9.1.RELEASE
MySql driver          dev.miku                    r2dbc-mysql                     0.8.2.RELEASE
PostgreSQL driver     org.postgresql              r2dbc-postgresql                0.9.1.RELEASE

Configuration

Dependency configuration

Add the dependency to your project, depending on your database (you may add several if you are using multiple databases in your project):

H2

Maven

<dependency>
  <groupId>net.lecousin.reactive-data-relational</groupId>
  <artifactId>h2</artifactId>
  <version>0.10.2</version>
</dependency>

Gradle

implementation group: 'net.lecousin.reactive-data-relational', name: 'h2', version: '0.10.2'

Postgres

Maven

<dependency>
  <groupId>net.lecousin.reactive-data-relational</groupId>
  <artifactId>postgres</artifactId>
  <version>0.10.2</version>
</dependency>

Gradle

implementation group: 'net.lecousin.reactive-data-relational', name: 'postgres', version: '0.10.2'

MySql

Maven

<dependency>
  <groupId>net.lecousin.reactive-data-relational</groupId>
  <artifactId>mysql</artifactId>
  <version>0.10.2</version>
</dependency>

Gradle

implementation group: 'net.lecousin.reactive-data-relational', name: 'mysql', version: '0.10.2'

Spring Boot configuration

In your Spring Boot application class, you need to:

  • add @EnableR2dbcRepositories(repositoryFactoryBeanClass = LcR2dbcRepositoryFactoryBean.class)
  • launch the initializer LcReactiveDataRelationalInitializer.init(), which adds functionalities to your entity classes before your application starts. This step MUST be done before Spring starts, to ensure no entity class has been loaded in the JVM yet.
  • configure your database

Example:

@SpringBootApplication
@EnableR2dbcRepositories(repositoryFactoryBeanClass = LcR2dbcRepositoryFactoryBean.class)
@Import(PostgresConfiguration.class) // here you can change depending on the database you are using
public class MyApp {

	public static void main(String[] args) {
		LcReactiveDataRelationalInitializer.init();
		SpringApplication.run(MyApp.class, args);
	}

}

The @Import annotation is used when you have a single database connection and your connection is configured through application properties (application.properties or application.yml). Depending on your database, you can use one of these configuration classes:

  • net.lecousin.reactive.data.relational.h2.H2Configuration
  • net.lecousin.reactive.data.relational.mysql.MySqlConfiguration
  • net.lecousin.reactive.data.relational.postgres.PostgresConfiguration

Finally, configure how to connect to the database using Spring R2DBC normal configuration, here is an example of application.yml file:

spring:
  r2dbc:
    username: sa
    url: r2dbc:h2:mem:///testdb;DB_CLOSE_DELAY=-1;

Using custom connection factory

If you want to configure the connection programmatically instead of through application properties, do not use the @Import annotation as in the previous example; instead, extend the configuration class and provide your own ConnectionFactory:

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

import io.r2dbc.h2.H2ConnectionConfiguration;
import io.r2dbc.h2.H2ConnectionFactory;
import net.lecousin.reactive.data.relational.h2.H2Configuration;

@Configuration
public class H2TestConfiguration extends H2Configuration {

	@Override
	@Bean
	public H2ConnectionFactory connectionFactory() {
		return new H2ConnectionFactory(
			H2ConnectionConfiguration.builder()
			.url("mem:testdb;DB_CLOSE_DELAY=-1;")
			.username("sa")
			.build()
		);
	}
	
}

Multiple databases

If you need to connect to multiple databases, the configuration is different. You need to create a @Configuration class for each database connection, extending the class LcR2dbcEntityOperationsBuilder. Instead of declaring @EnableR2dbcRepositories directly on your application class, you declare it on each configuration class.

Here is an example of such a configuration class:

@Configuration
@EnableR2dbcRepositories(repositoryFactoryBeanClass = LcR2dbcRepositoryFactoryBean.class, basePackages = "com.example.book.dao.repository", entityOperationsRef = "bookOperations")
public class BookConfig extends LcR2dbcEntityOperationsBuilder {

	/** Connection factory. */
	@Bean
	@Qualifier("bookDatabaseConnectionFactory")
	public ConnectionFactory bookDatabaseConnectionFactory(@Value("${database.book}") String databaseUrl) {
		return ConnectionFactories.get(databaseUrl);
	}
	
	/** Entity operations bean. */
	@Bean
	@Qualifier("bookOperations")
	public LcR2dbcEntityTemplate bookOperations(@Qualifier("bookDatabaseConnectionFactory") ConnectionFactory connectionFactory) {
		return buildEntityOperations(connectionFactory);
	}

}

In such a configuration class, you need to:

  • define a bean creating a ConnectionFactory (here we get a URL from the application configuration, but you can create it another way)
  • define a LcR2dbcEntityTemplate bean taking the connection factory as argument
  • add the annotation @EnableR2dbcRepositories with the packages containing the repositories that will use this database, and the attribute entityOperationsRef set to the qualifier of the LcR2dbcEntityTemplate bean

A complete example illustrating a Spring Boot application connecting to different databases is available in the repository lc-spring-data-r2dbc-sample.
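
With such a configuration, a repository located in the configured package is declared as usual; for example (the Book entity and repository below are only illustrative, not part of the library):

package com.example.book.dao.repository;

public interface BookRepository extends LcR2dbcRepository<Book, Long> {
}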

Application startup time

By default, when calling LcReactiveDataRelationalInitializer.init(), all classes present in the class path are analyzed to find entity classes. This can take some time, especially if you have many libraries in your class path.

This behavior requires no additional configuration, which makes it easier during development. However, for production, if it is important for your application to start up faster (for example a microservice), you can declare the list of entity classes in a YAML resource file lc-reactive-data-relational.yaml (similar to the persistence.xml file for JPA). The classes are declared under the name entities; each level can declare a package, and leaf names are classes. For example:

entities:
  - net.lecousin.reactive.data.relational.test:
    - simplemodel:
      - BooleanTypes
      - CharacterTypes
    - onetoonemodel:
      - MyEntity1
      - MySubEntity1
    - onetomanymodel:
      - RootEntity
      - SubEntity
      - SubEntity2
      - SubEntity3

The presence of this file disables class path analysis. Note that you may have several lc-reactive-data-relational.yaml resource files in your class path (for example if several modules provide entities); all of them will be processed.

JUnit 5

For your tests, using JUnit 5, you can use the annotation @DataR2dbcTest provided by Spring, and add the annotation @EnableR2dbcRepositories(repositoryFactoryBeanClass = LcR2dbcRepositoryFactoryBean.class).

In order to make sure the initializer is launched before any test class is loaded, add the following Maven dependency, which will automatically call LcReactiveDataRelationalInitializer.init() during JUnit startup:

<dependency>
  <groupId>net.lecousin.reactive-data-relational</groupId>
  <artifactId>test-junit-5</artifactId>
  <version>0.10.2</version>
  <scope>test</scope>
</dependency>
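
As an illustration, a test class could then look like the following minimal sketch (MyEntityRepository and the assertion are hypothetical, and reactor-test is assumed for StepVerifier):

import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.data.r2dbc.DataR2dbcTest;
import org.springframework.data.r2dbc.repository.config.EnableR2dbcRepositories;

import net.lecousin.reactive.data.relational.repository.LcR2dbcRepositoryFactoryBean;
import reactor.test.StepVerifier;

@DataR2dbcTest
@EnableR2dbcRepositories(repositoryFactoryBeanClass = LcR2dbcRepositoryFactoryBean.class)
class MyEntityRepositoryTest {

	@Autowired
	private MyEntityRepository repository; // hypothetical repository under test

	@Test
	void findAllOnEmptyDatabaseCompletesWithoutElements() {
		// The repository behaves like any Spring Data R2DBC repository in tests.
		StepVerifier.create(repository.findAll()).verifyComplete();
	}
}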

Usage

A more complete guide is available in the wiki section, here is an overview.

Entity class

Your entity classes are declared using the Spring Data R2DBC annotations @Table and @Column:

@Table
public class MyEntity {
	@Column
	private String myText;
}

In addition, the following annotations are supported:

  • org.springframework.data.annotation.Id indicates a primary key
  • net.lecousin.reactive.data.relational.annotations.CompositeId indicates properties to use as a unique key (cannot be used together with @Id)
  • net.lecousin.reactive.data.relational.annotations.GeneratedValue indicates a value that should be generated by the database (auto increment, using a sequence, or generating a random UUID)
  • org.springframework.data.annotation.Version is used for optimistic lock
  • net.lecousin.reactive.data.relational.annotations.ForeignKey indicates a link with a foreign key stored in the table. The cascade behavior can be specified using the attributes optional, onForeignDeleted, and cascadeDelete. The type of the attribute in the class will be the linked entity class (not the foreign key itself). A foreign key cannot be used on a collection type.
  • net.lecousin.reactive.data.relational.annotations.ForeignTable is the other side of the foreign key. It does not store anything in the table but indicates the link to another class to use with joins or lazy loading. A foreign table can be used on a collection type for a one to many link.
  • net.lecousin.reactive.data.relational.annotations.JoinTable can be used for a many to many (n-n) relationship when no additional field is required on the join table. The join table will be automatically created with the 2 foreign keys. This allows joining directly between the 2 tables of a many to many relationship in a transparent manner (a sketch is given after the example below).
  • org.springframework.data.annotation.CreatedDate and org.springframework.data.annotation.LastModifiedDate can be used to automatically store the creation date and the modification date respectively. They can be used with a column of type Long, Instant, LocalDate, LocalTime or LocalDateTime. OffsetTime and ZonedDateTime can also be used, except with MySql which does not support columns with timezone information.
  • net.lecousin.reactive.data.relational.annotations.ColumnDefinition allows specifying constraints for schema generation.

Additional methods may be declared in an Entity class to handle lazy loading, documented in the dedicated section.

Example:

@Table
public class MyEntity {
	@Id @GeneratedValue
	private Long id;
	
	@ColumnDefinition(max = 100, nullable = true)
	private String someOptionalText;
	
	@ForeignKey(optional = false)
	private MyOtherEntity linkedEntity;
	
	[...]
}
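
As an additional sketch of the link annotations described above (a hypothetical model, not taken from the library):

@Table
public class RootEntity {
	@Id @GeneratedValue
	private Long id;

	/* One to many: the foreign key is stored by SubEntity.root, nothing is stored in this table. */
	@ForeignTable(joinKey = "root")
	private List<SubEntity> subEntities;

	/* Many to many: the join table with the 2 foreign keys is generated automatically. */
	@JoinTable(joinProperty = "roots")
	private Set<OtherEntity> others;
}

@Table
public class SubEntity {
	@Id @GeneratedValue
	private Long id;

	/* Many to one: the foreign key column is stored in this table. */
	@ForeignKey(optional = false)
	private RootEntity root;
}

@Table
public class OtherEntity {
	@Id @GeneratedValue
	private Long id;

	@JoinTable(joinProperty = "others")
	private Set<RootEntity> roots;
}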

Important note for Java > 11

If you use a Java version greater than 11, the JVM does not allow modifying classes in another package, so this library won't be able to enhance your classes to provide all the functionalities. As a workaround for now, you must explicitly allow it by placing an empty interface named AllowEnhancer in every package containing entities.

You can just create it like this:

public interface AllowEnhancer {
}

Spring Repository

You can use Spring repositories as usual. The save and delete methods (such as save, saveAll, delete, deleteAll) are automatically performed with cascade. The select methods (findXXX) do not perform any join by default but allow lazy loading; joins can be done using SelectQuery. This is described in the 2 following sections.
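
For example, with the MyEntity class from the example above, a repository and a cascading save could look like this (a sketch; the repository name and setter names are assumed, not part of the library):

public interface MyEntityRepository extends LcR2dbcRepository<MyEntity, Long> {
}

// Somewhere in a service: saving cascades to the linked MyOtherEntity as well.
MyEntity entity = new MyEntity();
entity.setSomeOptionalText("some text");     // setter assumed for the field someOptionalText
entity.setLinkedEntity(new MyOtherEntity()); // setter assumed for the field linkedEntity
Mono<MyEntity> saved = repository.save(entity);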

Lazy loading

Spring Repository methods to find entities (find...) do not perform any join. If your class has links to other classes, they won't be loaded; however, this library supports lazy loading, which loads the linked entities on demand.

Lazy loading is done by declaring additional methods on your entity class, with a default body. The body will be automatically replaced with the correct code.

Here is an example of an entity enabling lazy loading:

@Table
public class MyEntity {

	@Id @GeneratedValue
	private Long id;

	@ForeignKey(optional = false)
	private MyOtherEntity other;

	@ForeignTable(joinKey = "myEntity")
	private List<JoinEntity> links;

	/* The usual getters and setters. */

	public MyOtherEntity getOther() {
		return other;
	}
	
	public void setOther(MyOtherEntity other) {
		this.other = other;
	}
	
	public List<JoinEntity> getLinks() {
		return links;
	}

	public void setLinks(List<JoinEntity> links) {
		this.links = links;
	}
	
	/* Lazy loading methods. */

	public Mono<MyOtherEntity> lazyGetOther() {
		return null; // you can just return null, the correct code will be automatically generated.
	}

	public Flux<JoinEntity> lazyGetLinks() {
		return null; // you can just return null, the correct code will be automatically generated.
	}
	
	/* Lazy loading of this MyEntity (this class). */
	
	/** @return true if this MyEntity is loaded from database, false if only the @Id is available. */
	public boolean entityLoaded() {
		return false; // you can just return false, the correct code will be automatically generated.
	}
	
	/** Ensure this MyEntity is loaded from the database before using it. */
	public Mono<MyEntity> loadEntity() {
		return null; // you can just return null, the correct code will be automatically generated.
	}
	
}

As illustrated by this example, you can define the following methods to handle lazy loading on an entity class:

  • methods public Mono<T> lazyGetXXX() { return null; } to get a loaded entity on a @ForeignKey or @ForeignTable field XXX.
  • For collections, the same method can be declared with a Flux: public Flux<T> lazyGetXXX() { return null; } to get entities from a collection with @ForeignTable
  • a method public boolean entityLoaded() { return false; } to know if the entity instance is fully loaded or not.
  • a method public Mono<T> loadEntity() { return null; } where T is your entity class, to load the instance from the database. If the entity is already loaded, Mono.just(this) is returned (no redundant database query on each call).

The body of those methods will be automatically generated during enhancement (when calling LcReactiveDataRelationalInitializer.init() at startup); that's why you can just return null or false in your code.

Using the methods lazyGetXXX or loadEntity ensures the entity or attributes are loaded: if they are already loaded, nothing is done, otherwise a database request is executed. That's why a Mono or Flux is returned, so you can perform your actions in a non-blocking way even when a database request needs to be executed.
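
For example, with the MyEntity class above (the repository and id variables are assumed):

// Ensure the entity is fully loaded, then load its links on demand, without blocking.
Flux<JoinEntity> links = repository.findById(id)
	.flatMap(MyEntity::loadEntity)        // no query if the entity is already loaded
	.flatMapMany(MyEntity::lazyGetLinks); // single query to fetch the links if needed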

In case an entity or attribute is not loaded and you are not using those methods, here is the behavior you can expect (illustrated by the sketch after this list):

  • On a @ForeignKey attribute, an instance will be created with the foreign key as its id, and all other attributes left unset. That means your attribute is not null (using the classical getter), but all its attributes are null except its primary key, which is pre-filled. In other words, you can get the foreign key value without loading the full entity.
  • On a @ForeignTable attribute, the attribute will be null. The reason is that the database table of your entity does not store any information about this link (unlike when having a foreign key).
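
To illustrate, still with the MyEntity class from the lazy loading example above (the repository, id and getId() are assumed):

MyEntity entity = repository.findById(id).block(); // no join performed by findById

MyOtherEntity other = entity.getOther();    // @ForeignKey: instance with only its id pre-filled
Long otherId = other.getId();               // the foreign key value, without loading the full entity
List<JoinEntity> links = entity.getLinks(); // @ForeignTable: null, nothing is stored in this table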

Note that all those methods are completely optional. If you define some of them, the enhancer will generate the corresponding code; otherwise the method will simply not be available.

Select with joins

Lazy loading is often not a good solution in terms of performance, and you may want to load linked entities in a single database request using joins.

In JPA, entity graphs are used to specify the attributes to fetch automatically. This library does not provide a similar way to indicate which links need to be loaded, but you can use the class SelectQuery to perform more complex searches and load linked entities.

The SelectQuery can be used anywhere, including in your Spring Data repositories as default methods (note that default methods in Spring Data repositories are not supported with Kotlin).

For example:

public interface RootEntityRepository extends LcR2dbcRepository<RootEntity, Long> {

	/** Search RootEntity having the given value, and fetch sub entities 'list'. */
	default Flux<RootEntity> findByValue(String value) {
		return SelectQuery
		.from(RootEntity.class, "entity")                      // SELECT entity FROM RootEntity AS entity
		.where(Criteria.property("entity", "value").is(value)) // WHERE entity.value = :value
		.join("entity", "list", "sub")                         // JOIN entity.list AS sub
		.execute(getLcClient());
	}
	
	/** Search RootEntity with a sub-entity having the given value, and fetch sub entities. */
	default Flux<RootEntity> findBySubValue(String value) {
		return SelectQuery
		.from(RootEntity.class, "entity")                      // SELECT entity FROM RootEntity AS entity
		.join("entity", "list", "sub")                         // JOIN entity.list AS sub
		.where(Criteria.property("sub", "subValue").is(value)) // WHERE sub.subValue = :value
		.execute(getLcClient());
	}
	
	/** Count RootEntity with a sub-entity having the given value. */
	default Mono<Long> countBySubValue(String value) {
		return SelectQuery
		.from(RootEntity.class, "entity")                      // SELECT entity FROM RootEntity AS entity
		.join("entity", "list", "sub")                         // JOIN entity.list AS sub
		.where(Criteria.property("sub", "subValue").is(value)) // WHERE sub.subValue = :value
		.executeCount(getLcClient());
	}

	/** Search RootEntity having the same value as one of its sub-entities. */	
	default Flux<RootEntity> havingSubValueEqualsToValue() {
		return SelectQuery
		.from(RootEntity.class, "entity")                      // SELECT entity FROM RootEntity AS entity
		.join("entity", "list", "sub")                         // JOIN entity.list AS sub
		.where(Criteria.property("sub", "subValue").is("entity", "value")) // WHERE sub.subValue = entity.value
		.execute(getLcClient());
	}
	
	/** Get all RootEntity and fetch links 'list', 'list2' and 'list3'. */
	default Flux<RootEntity> findAllFull() {
		return getLcClient().execute(
			SelectQuery.from(RootEntity.class, "root")         // SELECT root FROM RootEntity as root
			.join("root", "list", "sub1")                      // JOIN root.list as sub1
			.join("root", "list2", "sub2")                     // JOIN root.list2 as sub2
			.join("root", "list3", "sub3")                     // JOIN root.list3 as sub3
		);
	}

}

Note the getLcClient() method needed to execute requests; it is automatically available if your repository extends LcR2dbcRepository. If you don't need it, your repository can just extend Spring's base R2dbcRepository interface.

SelectQuery can be used anywhere, not only in a repository. In that case you will need the client instance, which you can inject in your class:

	@Autowired
	private LcReactiveDataRelationalClient lcClient;

In case you have multiple database connections, you can inject the entity template configured for a specific database connection, then get the client using template.getLcClient():

	@Autowired
	@Qualifier("bookOperations")
	private LcR2dbcEntityTemplate template;
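
As a minimal sketch of using SelectQuery in a service (the BookService, Book entity, property and alias names are only illustrative):

@Service
public class BookService {

	@Autowired
	@Qualifier("bookOperations")
	private LcR2dbcEntityTemplate template;

	public Flux<Book> findByTitle(String title) {
		return SelectQuery
			.from(Book.class, "book")                            // SELECT book FROM Book AS book
			.where(Criteria.property("book", "title").is(title)) // WHERE book.title = :title
			.execute(template.getLcClient());
	}
}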

Resources

License

lc-spring-data-r2dbc is Open Source software released under the Apache 2.0 license.

Contributing

Any contribution is welcome!

Do not hesitate to start a discussion about new features you would like to contribute to, or to open an issue if you encounter a bug.

Contributions to improve the documentation are also welcome.

lc-spring-data-r2dbc's People

Contributors

iamceph, lecousin


lc-spring-data-r2dbc's Issues

Could not create query for public abstract net.lecousin.reactive.data.relational.LcReactiveDataRelationalClient net.lecousin.reactive.data.relational.repository.LcR2dbcRepository.getLcClient()

I got this error when building my application using lc-spring-data-r2dbc:

Could not create query for public abstract net.lecousin.reactive.data.relational.LcReactiveDataRelationalClient net.lecousin.reactive.data.relational.repository.LcR2dbcRepository.getLcClient(); Reason: Failed to create query for method public abstract net.lecousin.reactive.data.relational.LcReactiveDataRelationalClient net.lecousin.reactive.data.relational.repository.LcR2dbcRepository.getLcClient(); No property 'getLcClient' found for type 'LcReactiveDataRelationalClient'

My repository is configured as follows:

public interface DscUserRepository extends LcR2dbcRepository<DscUser, UUID> {

}

Could you please check the problem? Many thanks!

SelectExecution#executeWithoutPreSelect does not pass the reactive context to the fromDb variable, which is why transactions do not work.

To reproduce, just create a table with 1 record and run the following transactional method:

  1. Delete everything
  2. Select everything using SelectQuery
  3. We see that the count is not 0.
    @Transactional
    public Mono<Void> test(){
        return entityRepository.deleteAll()
                .thenMany(SelectQuery.from(Entity.class, "e").execute(lcClient))
                .count()
                .map(cnt -> cnt) //bug = is not empty
                .then();
    }

SelectExecution sources:

	private Flux<T> executeWithoutPreSelect() {
		SelectMapping mapping = buildSelectMapping();

                // FIXME: there is no context for transactions
		Flux<Map<String, Object>> fromDb = buildFinalSql(mapping, query.where, true, hasJoinMany()).execute().fetch().all();
		return Flux.create((Consumer<FluxSink<T>>)sink -> {
			RowHandler handler = new RowHandler(mapping, sink);
			fromDb.doOnComplete(handler::handleEnd).subscribe(handler::handleRow, sink::error);
		});
	}

Joined property is not populated when using SelectQuery.join

Hi,
I have those two (simplified) entities with a one-to-one relationship:

@Table("project")
data class ProjectEntity(
    @Id
    val id: ProjectId,
    var name: String,
    @ForeignKey(onForeignDeleted = ForeignKey.OnForeignDeleted.SET_TO_NULL, cascadeDelete = true)
    var address: AddressEntity
)

@Table("address")
data class AddressEntity(
    @Id
    val id: AddressId,
    var street: String = "",
    var city: String = ""
)

And a repository:

@Repository
interface ProjectRepository: LcR2dbcRepository<ProjectEntity, ProjectId> {
    override fun findById(id: ProjectId): Mono<ProjectEntity> {
        return SelectQuery.from(ProjectEntity::class.java, "project")
            .join("project", "address", "address")
            .where(Criteria.property("project", "id").`is`(id))
            .execute(lcClient)
            .next()
    }
}

When I try to use this findById function, I find myself with an empty AddressEntity in the address field. The only well-assigned field is the id (the others take their default values).

I did a little quick digging, and found out that it might be caused by the cache within the LcEntityReader.
When the project is first loaded during the execution of the query, it creates a first empty instance of the address and adds it to the cache. Then, when the fillLinkedEntities function is called for the joins, it finds the cached address and doesn't bother to read the query result to populate it.

Did I miss something in my setup or is it a bug?

PS: I had to add default values to the fields of AddressEntity because of the empty instance created during the query execution (I would get exceptions because of the null values). Is there any way to avoid that?

PPS: I saw in your wiki that default functions aren't supported in Kotlin for the Spring repositories. Actually, as I did, it is possible to add custom functions to a repository by using the compiler arg -Xjvm-default (and the annotation @JvmDefault for older Kotlin versions).

Generating default UUID on PostgreSQL 13/14

Hi @lecousin,
Currently, the implemented PostgreSQL dialect supports the default creation of UUID v4 using the UUID_GENERATE_V4() function provided by an extension. PostgreSQL as of version 13 supports a native function for UUID generation, gen_random_uuid() (https://www.postgresql.org/docs/current/functions-uuid.html), so it is no longer required to install that extension.

I would ask you to support the native function in the currently used dialect, or to create a new dialect for the latest versions of the database.

Incorrect handling of collections in EntityState

The persisted values for collections in the EntityState class are copied to a LinkedList in the savePersistedValue() method. As a result, when an entity defines a collection of type Set tagged with the @ColumnDefinition(updateable = false) annotation, the SaveProcessor method that restores values is executed. As this is done via the reflection mechanism, the application gives an error about not being able to assign a collection of type Set to a collection of type LinkedList.

Many To Many does not return correct entity on lazy join

Hi,
I am trying to use the @JoinTable annotation with lazyGet methods.
I see that lazyGet returns an instance of the current class.
For example, from this repository's AbstractTestManyToManyModel, Entity3::lazyGetLinks should return a Flux of the linked entities, but instead it returns a Flux of the current class.
I saw it while debugging the code as well.
So how can lazy loading be used with a Flux?

If I use LcR2dbcRepository, I get the following error

springboot: 2.5.6, lecousin-r2dbc: 0.4.0

Field lcClient in net.lecousin.reactive.data.relational.repository.LcR2dbcRepositoryFactoryBean required a bean of type 'net.lecousin.reactive.data.relational.LcReactiveDataRelationalClient' that could not be found.

The injection point has the following annotations:
	- @org.springframework.beans.factory.annotation.Autowired(required=true)


Action:

Consider defining a bean of type 'net.lecousin.reactive.data.relational.LcReactiveDataRelationalClient' in your configuration.
  • repository
public interface TestEntityRepository extends LcR2dbcRepository<TestEntity, Long> {

    default Flux<TestEntity> findAllAndJoin(){
        return SelectQuery.from(TestEntity.class, "test1")
                .join("test2", "testId", "test2")
                .execute(getLcClient());
    }
}
  • application.yml
spring:
  r2dbc:
    password: root
    username: root
    url: r2dbc:mysql://localhost:3306/bin-springreactive
  • lc-reactive-data-relational.yaml
entities:
  - org.xiaowu.springdatar2dbc.entity:
      - onetomany:
          - TestEntity
      - manytoone:
          - TestEntity2
  • Application
@EnableR2dbcRepositories(repositoryFactoryBeanClass = LcR2dbcRepositoryFactoryBean.class,
        basePackages = {"org.xiaowu.springdatar2dbc.mapper"})
public class SpringDataR2dbcApplication {

    public static void main(String[] args) {
        LcReactiveDataRelationalInitializer.init();
        SpringApplication.run(SpringDataR2dbcApplication.class, args);
    }
}

Entity Unknown Error

net.lecousin.reactive.data.relational.model.ModelAccessException: Unknown entity type

I get that error even though I already annotated the entity class with @Table(value="table_name") and the entity class properties with @Column(value="column_name") so they are recognized as an entity. Why isn't the entity recognized by lc-spring-data-r2dbc? Can anyone help me solve this problem? Thanks in advance!

Below is the entity.

package com.sg.atlas.infrastructure.model.entity;

import lombok.AllArgsConstructor;
import lombok.Builder;
import lombok.Data;
import lombok.NoArgsConstructor;
import org.springframework.data.annotation.Id;
import org.springframework.data.relational.core.mapping.Column;
import org.springframework.data.relational.core.mapping.Table;

import java.sql.Timestamp;

@Data
@AllArgsConstructor
@NoArgsConstructor
@Builder
@Table(value = "account")
public class Account {
    @Id
    private Long id;
    @Column
    private String name;
    @Column
    private String email;
    @Column
    private String password;
    @Column(value = "created_at")
    private Timestamp createdAt;
    @Column(value = "updated_at")
    private Timestamp updatedAt;
}

Below is the complete stack trace.

net.lecousin.reactive.data.relational.model.ModelAccessException: Unknown entity type: com.sg.atlas.infrastructure.model.entity.Account
	at net.lecousin.reactive.data.relational.LcReactiveDataRelationalClient.getRequiredEntity(LcReactiveDataRelationalClient.java:121) ~[core-0.10.0.jar:na]
	Suppressed: reactor.core.publisher.FluxOnAssembly$OnAssemblyException: 
Error has been observed at the following site(s):
	*__checkpoint ⇢ org.springframework.boot.actuate.metrics.web.reactive.server.MetricsWebFilter [DefaultWebFilterChain]
	*__checkpoint ⇢ HTTP GET "/accounts/1" [ExceptionHandlingWebHandler]
Original Stack Trace:
		at net.lecousin.reactive.data.relational.LcReactiveDataRelationalClient.getRequiredEntity(LcReactiveDataRelationalClient.java:121) ~[core-0.10.0.jar:na]
		at net.lecousin.reactive.data.relational.repository.LcR2dbcRepositoryImpl.findById(LcR2dbcRepositoryImpl.java:73) ~[core-0.10.0.jar:na]
		at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:104) ~[na:na]
		at java.base/java.lang.reflect.Method.invoke(Method.java:577) ~[na:na]
		at org.springframework.data.repository.core.support.RepositoryMethodInvoker$RepositoryFragmentMethodInvoker.lambda$new$0(RepositoryMethodInvoker.java:289) ~[spring-data-commons-2.6.4.jar:2.6.4]
		at org.springframework.data.repository.core.support.RepositoryMethodInvoker.doInvoke(RepositoryMethodInvoker.java:137) ~[spring-data-commons-2.6.4.jar:2.6.4]
		at org.springframework.data.repository.core.support.RepositoryMethodInvoker.invoke(RepositoryMethodInvoker.java:121) ~[spring-data-commons-2.6.4.jar:2.6.4]
		at org.springframework.data.repository.core.support.RepositoryComposition$RepositoryFragments.invoke(RepositoryComposition.java:529) ~[spring-data-commons-2.6.4.jar:2.6.4]
		at org.springframework.data.repository.core.support.RepositoryComposition.invoke(RepositoryComposition.java:285) ~[spring-data-commons-2.6.4.jar:2.6.4]
		at org.springframework.data.repository.core.support.RepositoryFactorySupport$ImplementationMethodExecutionInterceptor.invoke(RepositoryFactorySupport.java:639) ~[spring-data-commons-2.6.4.jar:2.6.4]
		at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186) ~[spring-aop-5.3.19.jar:5.3.19]
		at org.springframework.data.repository.core.support.QueryExecutorMethodInterceptor.doInvoke(QueryExecutorMethodInterceptor.java:163) ~[spring-data-commons-2.6.4.jar:2.6.4]
		at org.springframework.data.repository.core.support.QueryExecutorMethodInterceptor.invoke(QueryExecutorMethodInterceptor.java:138) ~[spring-data-commons-2.6.4.jar:2.6.4]
		at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186) ~[spring-aop-5.3.19.jar:5.3.19]
		at org.springframework.aop.interceptor.ExposeInvocationInterceptor.invoke(ExposeInvocationInterceptor.java:97) ~[spring-aop-5.3.19.jar:5.3.19]
		at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186) ~[spring-aop-5.3.19.jar:5.3.19]
		at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:215) ~[spring-aop-5.3.19.jar:5.3.19]
		at jdk.proxy3/jdk.proxy3.$Proxy94.findById(Unknown Source) ~[na:na]
		at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:104) ~[na:na]
		at java.base/java.lang.reflect.Method.invoke(Method.java:577) ~[na:na]
		at org.springframework.aop.support.AopUtils.invokeJoinpointUsingReflection(AopUtils.java:344) ~[spring-aop-5.3.19.jar:5.3.19]
		at org.springframework.aop.framework.ReflectiveMethodInvocation.invokeJoinpoint(ReflectiveMethodInvocation.java:198) ~[spring-aop-5.3.19.jar:5.3.19]
		at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163) ~[spring-aop-5.3.19.jar:5.3.19]
		at org.springframework.dao.support.PersistenceExceptionTranslationInterceptor.invoke(PersistenceExceptionTranslationInterceptor.java:137) ~[spring-tx-5.3.19.jar:5.3.19]
		at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:186) ~[spring-aop-5.3.19.jar:5.3.19]
		at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:215) ~[spring-aop-5.3.19.jar:5.3.19]
		at jdk.proxy3/jdk.proxy3.$Proxy94.findById(Unknown Source) ~[na:na]
		at com.sg.atlas.core.service.impl.AccountServiceImpl.findOneById(AccountServiceImpl.java:32) ~[main/:na]
		at com.sg.atlas.infrastructure.controller.AccountController.findOneById(AccountController.java:28) ~[main/:na]
		at java.base/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:104) ~[na:na]
		at java.base/java.lang.reflect.Method.invoke(Method.java:577) ~[na:na]
		at org.springframework.web.reactive.result.method.InvocableHandlerMethod.lambda$invoke$0(InvocableHandlerMethod.java:144) ~[spring-webflux-5.3.19.jar:5.3.19]
		at reactor.core.publisher.MonoFlatMap$FlatMapMain.onNext(MonoFlatMap.java:125) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.Operators$MonoSubscriber.complete(Operators.java:1816) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoZip$ZipCoordinator.signal(MonoZip.java:251) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoZip$ZipInner.onNext(MonoZip.java:336) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoPeekTerminal$MonoTerminalPeekSubscriber.onNext(MonoPeekTerminal.java:180) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxDefaultIfEmpty$DefaultIfEmptySubscriber.onNext(FluxDefaultIfEmpty.java:101) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxSwitchIfEmpty$SwitchIfEmptySubscriber.onNext(FluxSwitchIfEmpty.java:74) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.Operators$ScalarSubscription.request(Operators.java:2398) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.Operators$MultiSubscriptionSubscriber.set(Operators.java:2194) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.Operators$MultiSubscriptionSubscriber.onSubscribe(Operators.java:2068) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxFlatMap.trySubscribeScalarMap(FluxFlatMap.java:192) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoFlatMap.subscribeOrReturn(MonoFlatMap.java:53) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.Mono.subscribe(Mono.java:4385) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoZip.subscribe(MonoZip.java:128) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.InternalMonoOperator.subscribe(InternalMonoOperator.java:64) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoDefer.subscribe(MonoDefer.java:52) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoIgnoreThen$ThenIgnoreMain.subscribeNext(MonoIgnoreThen.java:240) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoIgnoreThen$ThenIgnoreMain.onComplete(MonoIgnoreThen.java:203) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoFlatMap$FlatMapMain.onComplete(MonoFlatMap.java:181) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.Operators.complete(Operators.java:137) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoZip.subscribe(MonoZip.java:120) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.Mono.subscribe(Mono.java:4400) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoIgnoreThen$ThenIgnoreMain.subscribeNext(MonoIgnoreThen.java:263) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoIgnoreThen.subscribe(MonoIgnoreThen.java:51) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.InternalMonoOperator.subscribe(InternalMonoOperator.java:64) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoFlatMap$FlatMapMain.onNext(MonoFlatMap.java:157) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxSwitchIfEmpty$SwitchIfEmptySubscriber.onNext(FluxSwitchIfEmpty.java:74) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoNext$NextSubscriber.onNext(MonoNext.java:82) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxConcatMap$ConcatMapImmediate.innerNext(FluxConcatMap.java:282) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxConcatMap$ConcatMapInner.onNext(FluxConcatMap.java:863) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxMapFuseable$MapFuseableSubscriber.onNext(FluxMapFuseable.java:127) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoPeekTerminal$MonoTerminalPeekSubscriber.onNext(MonoPeekTerminal.java:180) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.Operators$ScalarSubscription.request(Operators.java:2398) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoPeekTerminal$MonoTerminalPeekSubscriber.request(MonoPeekTerminal.java:139) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxMapFuseable$MapFuseableSubscriber.request(FluxMapFuseable.java:169) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.Operators$MultiSubscriptionSubscriber.set(Operators.java:2194) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.Operators$MultiSubscriptionSubscriber.onSubscribe(Operators.java:2068) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxMapFuseable$MapFuseableSubscriber.onSubscribe(FluxMapFuseable.java:96) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoPeekTerminal$MonoTerminalPeekSubscriber.onSubscribe(MonoPeekTerminal.java:152) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoJust.subscribe(MonoJust.java:55) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.Mono.subscribe(Mono.java:4400) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxConcatMap$ConcatMapImmediate.drain(FluxConcatMap.java:451) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxConcatMap$ConcatMapImmediate.onSubscribe(FluxConcatMap.java:219) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxIterable.subscribe(FluxIterable.java:165) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxIterable.subscribe(FluxIterable.java:87) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.InternalMonoOperator.subscribe(InternalMonoOperator.java:64) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoDefer.subscribe(MonoDefer.java:52) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.InternalMonoOperator.subscribe(InternalMonoOperator.java:64) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoDefer.subscribe(MonoDefer.java:52) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.InternalMonoOperator.subscribe(InternalMonoOperator.java:64) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoDefer.subscribe(MonoDefer.java:52) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.Mono.subscribe(Mono.java:4400) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoIgnoreThen$ThenIgnoreMain.subscribeNext(MonoIgnoreThen.java:263) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoIgnoreThen.subscribe(MonoIgnoreThen.java:51) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.InternalMonoOperator.subscribe(InternalMonoOperator.java:64) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoDeferContextual.subscribe(MonoDeferContextual.java:55) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.netty.http.server.HttpServer$HttpServerHandle.onStateChange(HttpServer.java:967) ~[reactor-netty-http-1.0.18.jar:1.0.18]
		at reactor.netty.ReactorNetty$CompositeConnectionObserver.onStateChange(ReactorNetty.java:677) ~[reactor-netty-core-1.0.18.jar:1.0.18]
		at reactor.netty.transport.ServerTransport$ChildObserver.onStateChange(ServerTransport.java:478) ~[reactor-netty-core-1.0.18.jar:1.0.18]
		at reactor.netty.http.server.HttpServerOperations.onInboundNext(HttpServerOperations.java:570) ~[reactor-netty-http-1.0.18.jar:1.0.18]
		at reactor.netty.channel.ChannelOperationsHandler.channelRead(ChannelOperationsHandler.java:93) ~[reactor-netty-core-1.0.18.jar:1.0.18]
		at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
		at reactor.netty.http.server.HttpTrafficHandler.channelRead(HttpTrafficHandler.java:222) ~[reactor-netty-http-1.0.18.jar:1.0.18]
		at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.channel.CombinedChannelDuplexHandler$DelegatingChannelHandlerContext.fireChannelRead(CombinedChannelDuplexHandler.java:436) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:327) ~[netty-codec-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:299) ~[netty-codec-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.channel.CombinedChannelDuplexHandler.channelRead(CombinedChannelDuplexHandler.java:251) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:722) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:658) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:584) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:496) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986) ~[netty-common-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) ~[netty-common-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) ~[netty-common-4.1.76.Final.jar:4.1.76.Final]
		at java.base/java.lang.Thread.run(Thread.java:833) ~[na:an]

Required property not found

Hello! We're working on an application that has somewhat nested data, but we're having trouble figuring out how to handle the joins via SelectQuery.

Given the following two (Kotlin) tables:

@Table("cards")
data class Card(
    @Id
    val id: String,

    val deck: String,
    val ordinal: Short,

    @Version
    val version: Int = 0,

    @ForeignTable(joinKey = "id", optional = true)
    val data: Set<CardData>
)
@Table("card_data")
data class CardData(
    @ForeignKey
    val card: String,

    val column: String,
    val src: List<String>,
    val version: Int
)

And the following repository:

interface CardRepository : LcR2dbcRepository<Card, String> {
    fun findAllByDeck(deck: String): Flux<Card> =
        SelectQuery.from(Card::class.java, "card")
            .where(Criteria.property("card", "deck").`is`(deck))
            .join("card", "data", "card_data")
            .execute(lcClient)
}

The generated query: SELECT cards.id, cards.deck, cards.ordinal, cards.version FROM cards WHERE cards.deck = $1

Returns this error:

java.lang.IllegalStateException: Required property data not found for class com.benkyo.decks.data.Card!
    at org.springframework.data.mapping.PersistentEntity.getRequiredPersistentProperty(PersistentEntity.java:161) ~[spring-data-commons-2.5.4.jar:2.5.4]

We're stumped at this point - we feel like we've followed everything in the README and the wiki, and nothing we've tried is working.

Invocation of init method failed; nested exception is java.lang.StackOverflowError

@lecousin Hi! I have an application with entities which have OneToOne and ManyToMany relations.
I annotated them as described in the docs, for example:

@ForeignKey(onForeignDeleted = ForeignKey.OnForeignDeleted.SET_TO_NULL)
private ClientEntity client;

@ForeignKey(onForeignDeleted = ForeignKey.OnForeignDeleted.SET_TO_NULL)
private TenantEntity tenant;

@JoinTable(joinProperty = "imageEntities", columnName = "object_id", tableName = "rel_object_image")
private Set objectEntities = new HashSet<>();

And the many-to-many relation in the other entity:

@JoinTable(joinProperty = "objectEntities", columnName = "image_id", tableName = "rel_object_image")
private Set imageEntities = new HashSet<>();

I also have the lazy init methods.

On the main class I have
@EnableR2dbcRepositories(repositoryFactoryBeanClass = LcR2dbcRepositoryFactoryBean.class, basePackages = "some.package.entity")
and LcReactiveDataRelationalInitializer.init(); in the main method.
I also have a lc-reactive-data-relational.yaml file listing the entities, which the app scans at startup.

And I get this error when I start the application:

2023-10-18T16:56:51.117 [main] INFO  n.l.r.d.relational.enhance.Enhancer.enhanceClasses - Enhancing 5 entity classe(s)
2023-10-18T16:56:51.155 [main] INFO  n.l.r.d.relational.enhance.Enhancer.createJoinTable - Create join table class ru.mts.iot.core.repoimgsrv.entity.JoinEntity_ImageEntity_ObjectEntity with table name rel_object_image
2023-10-18T16:56:51.227 [main] INFO  n.l.r.d.relational.enhance.Enhancer.enhanceClasses - 6 class(es) enhanced

Unsatisfied dependency expressed through constructor parameter 0; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'imageRepository' defined in ru.mts.iot.core.repoimgsrv.repository.ImageRepository defined in @EnableR2dbcRepositories declared on R2dbcRepositoriesAutoConfigureRegistrar.EnableR2dbcRepositoriesConfiguration: Invocation of init method failed; nested exception is java.lang.StackOverflowError
	at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:800)
	at org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:229)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1372)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1222)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:582)
	at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:542)
	at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:335)
	at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:234)
	at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:333)
	at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:208)
	at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:955)
	at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:918)
	at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:583)
	at org.springframework.boot.web.reactive.context.ReactiveWebServerApplicationContext.refresh(ReactiveWebServerApplicationContext.java:66)
	at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:731)
	at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:408)
	at org.springframework.boot.SpringApplication.run(SpringApplication.java:307)
	at org.springframework.boot.SpringApplication.run(SpringApplication.java:1303)
	at org.springframework.boot.SpringApplication.run(SpringApplication.java:1292)

I think the cause is the relations. Can you help me please? :)

Circular dependency on latest Spring version (2.6.x)

Heyo, I got a circular dependency exception while starting Spring.
I am using the latest Spring and Postgres, running this configuration with application.yaml.

Here is the error: https://paste.gg/p/anonymous/77b4004078a04a7cad6a346ed0b7951a

How can I solve this? Thank you.

How to convert enums?

Hi, firstly I'd like to say thank you for this library, it is very useful.

I'm trying to register custom converters, but my bean conflicts with an existing bean already registered by the lc-spring-data-r2dbc library.

Kotlin code

val converterList: MutableList<Converter<*, *>> = ArrayList()
converterList.add(MyConverter())

return R2dbcCustomConversions(storeConversions, converterList)

Error

The bean 'r2dbcCustomConversions', defined in net.lecousin.reactive.data.relational.mysql.MySqlConfiguration, could not be registered. A bean with that name has already been defined in class path resource [com/capitalise/entities/configs/R2dbcConfig.class] and overriding is disabled.

Any suggestion on how I can register a converter?

I need this in order to convert enums; however, perhaps there is another solution to load and convert enum values.

Thank you.

Can't update entity to database by using repository

Why doesn't the repository's save(entity) method behave like an update when the entity has relational properties? An entity without relational properties can be updated normally using the repository's save(entity) method.

Below is the entity with relational properties.

package com.sg.atlas.infrastructure.model.entity;

import com.fasterxml.jackson.annotation.JsonIdentityInfo;
import com.fasterxml.jackson.annotation.ObjectIdGenerators;
import lombok.*;
import net.lecousin.reactive.data.relational.annotations.ForeignKey;
import net.lecousin.reactive.data.relational.annotations.ForeignTable;
import net.lecousin.reactive.data.relational.annotations.JoinTable;
import org.springframework.data.annotation.Id;
import org.springframework.data.relational.core.mapping.Column;
import org.springframework.data.relational.core.mapping.Table;

import java.sql.Timestamp;
import java.util.Set;

@Data
@AllArgsConstructor
@NoArgsConstructor
@Builder
@Table(value = "document")
public class Document {
    @Id
    private Long id;
    @Column
    private String name;
    @Column
    private String description;
    @ForeignKey
    @Column(value = "account_id")
    private Account account;
    @ForeignKey
    @Column(value = "type_id")
    private DocumentType type;
    @Column(value = "created_at")
    private Timestamp createdAt;
    @Column(value = "updated_at")
    private Timestamp updatedAt;
}

Below is the complete stacktrace.


org.springframework.dao.DataIntegrityViolationException: executeMany; SQL [INSERT INTO "document_type" ("created_at", name, description, id, "updated_at") VALUES ($1, $2, $3, $4, $5)]; duplicate key value violates unique constraint "document_type_pkey"; nested exception is io.r2dbc.postgresql.ExceptionFactory$PostgresqlDataIntegrityViolationException: [23505] duplicate key value violates unique constraint "document_type_pkey"
	at org.springframework.r2dbc.connection.ConnectionFactoryUtils.convertR2dbcException(ConnectionFactoryUtils.java:229) ~[spring-r2dbc-5.3.19.jar:5.3.19]
	Suppressed: reactor.core.publisher.FluxOnAssembly$OnAssemblyException: 
Error has been observed at the following site(s):
	*__checkpoint ⇢ Handler com.sg.atlas.infrastructure.controller.DocumentController#updateOneById(Long, Document) [DispatcherHandler]
	*__checkpoint ⇢ org.springframework.boot.actuate.metrics.web.reactive.server.MetricsWebFilter [DefaultWebFilterChain]
	*__checkpoint ⇢ HTTP PUT "/documents/1" [ExceptionHandlingWebHandler]
Original Stack Trace:
		at org.springframework.r2dbc.connection.ConnectionFactoryUtils.convertR2dbcException(ConnectionFactoryUtils.java:229) ~[spring-r2dbc-5.3.19.jar:5.3.19]
		at org.springframework.r2dbc.core.DefaultDatabaseClient.lambda$inConnectionMany$8(DefaultDatabaseClient.java:147) ~[spring-r2dbc-5.3.19.jar:5.3.19]
		at reactor.core.publisher.Flux.lambda$onErrorMap$29(Flux.java:6946) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.Flux.lambda$onErrorResume$30(Flux.java:6999) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxOnErrorResume$ResumeSubscriber.onError(FluxOnErrorResume.java:94) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxUsingWhen$UsingWhenSubscriber.deferredError(FluxUsingWhen.java:398) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxUsingWhen$RollbackInner.onComplete(FluxUsingWhen.java:475) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.Operators$MultiSubscriptionSubscriber.onComplete(Operators.java:2058) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoIgnoreThen$ThenIgnoreMain.onComplete(MonoIgnoreThen.java:209) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoIgnoreThen$ThenIgnoreMain.subscribeNext(MonoIgnoreThen.java:238) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoIgnoreThen$ThenIgnoreMain.onComplete(MonoIgnoreThen.java:203) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxPeek$PeekSubscriber.onComplete(FluxPeek.java:260) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoIgnoreThen$ThenIgnoreMain.onComplete(MonoIgnoreThen.java:209) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.Operators.complete(Operators.java:137) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.netty.FutureMono.doSubscribe(FutureMono.java:122) ~[reactor-netty-core-1.0.18.jar:1.0.18]
		at reactor.netty.FutureMono$ImmediateFutureMono.subscribe(FutureMono.java:83) ~[reactor-netty-core-1.0.18.jar:1.0.18]
		at reactor.core.publisher.MonoIgnoreThen$ThenIgnoreMain.subscribeNext(MonoIgnoreThen.java:240) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoIgnoreThen$ThenIgnoreMain.onComplete(MonoIgnoreThen.java:203) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoPeekTerminal$MonoTerminalPeekSubscriber.onComplete(MonoPeekTerminal.java:299) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoIgnoreElements$IgnoreElementsSubscriber.onComplete(MonoIgnoreElements.java:89) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxConcatMap$ConcatMapImmediate.drain(FluxConcatMap.java:368) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxConcatMap$ConcatMapImmediate.onComplete(FluxConcatMap.java:276) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxPeekFuseable$PeekFuseableSubscriber.onComplete(FluxPeekFuseable.java:277) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.Operators$ScalarSubscription.request(Operators.java:2400) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxPeekFuseable$PeekFuseableSubscriber.request(FluxPeekFuseable.java:144) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxConcatMap$ConcatMapImmediate.onSubscribe(FluxConcatMap.java:236) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxPeekFuseable$PeekFuseableSubscriber.onSubscribe(FluxPeekFuseable.java:178) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxJust.subscribe(FluxJust.java:68) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.Mono.subscribe(Mono.java:4400) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoIgnoreThen$ThenIgnoreMain.subscribeNext(MonoIgnoreThen.java:263) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoIgnoreThen.subscribe(MonoIgnoreThen.java:51) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoDefer.subscribe(MonoDefer.java:52) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.Mono.subscribe(Mono.java:4400) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoIgnoreThen$ThenIgnoreMain.subscribeNext(MonoIgnoreThen.java:263) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoIgnoreThen.subscribe(MonoIgnoreThen.java:51) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.Mono.subscribe(Mono.java:4400) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxOnErrorResume$ResumeSubscriber.onError(FluxOnErrorResume.java:103) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoIgnoreElements$IgnoreElementsSubscriber.onError(MonoIgnoreElements.java:84) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxMap$MapSubscriber.onError(FluxMap.java:132) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxFilter$FilterSubscriber.onError(FluxFilter.java:157) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxFilter$FilterConditionalSubscriber.onError(FluxFilter.java:291) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxMap$MapConditionalSubscriber.onError(FluxMap.java:259) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.Operators.error(Operators.java:198) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoError.subscribe(MonoError.java:53) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoDeferContextual.subscribe(MonoDeferContextual.java:55) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.InternalMonoOperator.subscribe(InternalMonoOperator.java:64) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoDefer.subscribe(MonoDefer.java:52) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.Mono.subscribe(Mono.java:4400) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxUsingWhen$UsingWhenSubscriber.onError(FluxUsingWhen.java:364) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxFlatMap$FlatMapMain.checkTerminated(FluxFlatMap.java:842) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxFlatMap$FlatMapMain.drainLoop(FluxFlatMap.java:608) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxFlatMap$FlatMapMain.drain(FluxFlatMap.java:588) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxFlatMap$FlatMapMain.innerError(FluxFlatMap.java:863) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxFlatMap$FlatMapInner.onError(FluxFlatMap.java:990) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxHandle$HandleSubscriber.onError(FluxHandle.java:210) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.MonoFlatMapMany$FlatMapManyInner.onError(MonoFlatMapMany.java:255) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxHandleFuseable$HandleFuseableSubscriber.onNext(FluxHandleFuseable.java:198) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxFilterFuseable$FilterFuseableConditionalSubscriber.onNext(FluxFilterFuseable.java:337) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxContextWrite$ContextWriteSubscriber.onNext(FluxContextWrite.java:107) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxPeekFuseable$PeekConditionalSubscriber.onNext(FluxPeekFuseable.java:854) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxPeekFuseable$PeekConditionalSubscriber.onNext(FluxPeekFuseable.java:854) ~[reactor-core-3.4.17.jar:3.4.17]
		at io.r2dbc.postgresql.util.FluxDiscardOnCancel$FluxDiscardOnCancelSubscriber.onNext(FluxDiscardOnCancel.java:91) ~[r2dbc-postgresql-0.9.1.RELEASE.jar:0.9.1.RELEASE]
		at reactor.core.publisher.FluxDoFinally$DoFinallySubscriber.onNext(FluxDoFinally.java:130) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxHandle$HandleSubscriber.onNext(FluxHandle.java:126) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxCreate$BufferAsyncSink.drain(FluxCreate.java:814) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxCreate$BufferAsyncSink.next(FluxCreate.java:739) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxCreate$SerializedFluxSink.next(FluxCreate.java:161) ~[reactor-core-3.4.17.jar:3.4.17]
		at io.r2dbc.postgresql.client.ReactorNettyClient$Conversation.emit(ReactorNettyClient.java:635) ~[r2dbc-postgresql-0.9.1.RELEASE.jar:0.9.1.RELEASE]
		at io.r2dbc.postgresql.client.ReactorNettyClient$BackendMessageSubscriber.emit(ReactorNettyClient.java:887) ~[r2dbc-postgresql-0.9.1.RELEASE.jar:0.9.1.RELEASE]
		at io.r2dbc.postgresql.client.ReactorNettyClient$BackendMessageSubscriber.onNext(ReactorNettyClient.java:761) ~[r2dbc-postgresql-0.9.1.RELEASE.jar:0.9.1.RELEASE]
		at io.r2dbc.postgresql.client.ReactorNettyClient$BackendMessageSubscriber.onNext(ReactorNettyClient.java:667) ~[r2dbc-postgresql-0.9.1.RELEASE.jar:0.9.1.RELEASE]
		at reactor.core.publisher.FluxHandle$HandleSubscriber.onNext(FluxHandle.java:126) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxPeekFuseable$PeekConditionalSubscriber.onNext(FluxPeekFuseable.java:854) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxMap$MapConditionalSubscriber.onNext(FluxMap.java:220) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.core.publisher.FluxMap$MapConditionalSubscriber.onNext(FluxMap.java:220) ~[reactor-core-3.4.17.jar:3.4.17]
		at reactor.netty.channel.FluxReceive.drainReceiver(FluxReceive.java:279) ~[reactor-netty-core-1.0.18.jar:1.0.18]
		at reactor.netty.channel.FluxReceive.onInboundNext(FluxReceive.java:388) ~[reactor-netty-core-1.0.18.jar:1.0.18]
		at reactor.netty.channel.ChannelOperations.onInboundNext(ChannelOperations.java:404) ~[reactor-netty-core-1.0.18.jar:1.0.18]
		at reactor.netty.channel.ChannelOperationsHandler.channelRead(ChannelOperationsHandler.java:93) ~[reactor-netty-core-1.0.18.jar:1.0.18]
		at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:327) ~[netty-codec-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:314) ~[netty-codec-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:435) ~[netty-codec-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:279) ~[netty-codec-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:722) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:658) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:584) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:496) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986) ~[netty-common-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) ~[netty-common-4.1.76.Final.jar:4.1.76.Final]
		at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) ~[netty-common-4.1.76.Final.jar:4.1.76.Final]
		at java.base/java.lang.Thread.run(Thread.java:833) ~[na:na]
Caused by: io.r2dbc.postgresql.ExceptionFactory$PostgresqlDataIntegrityViolationException: duplicate key value violates unique constraint "document_type_pkey"
	at io.r2dbc.postgresql.ExceptionFactory.createException(ExceptionFactory.java:102) ~[r2dbc-postgresql-0.9.1.RELEASE.jar:0.9.1.RELEASE]
	at io.r2dbc.postgresql.ExceptionFactory.createException(ExceptionFactory.java:65) ~[r2dbc-postgresql-0.9.1.RELEASE.jar:0.9.1.RELEASE]
	at io.r2dbc.postgresql.ExceptionFactory.handleErrorResponse(ExceptionFactory.java:132) ~[r2dbc-postgresql-0.9.1.RELEASE.jar:0.9.1.RELEASE]
	at reactor.core.publisher.FluxHandleFuseable$HandleFuseableSubscriber.onNext(FluxHandleFuseable.java:176) ~[reactor-core-3.4.17.jar:3.4.17]
	at reactor.core.publisher.FluxFilterFuseable$FilterFuseableConditionalSubscriber.onNext(FluxFilterFuseable.java:337) ~[reactor-core-3.4.17.jar:3.4.17]
	at reactor.core.publisher.FluxContextWrite$ContextWriteSubscriber.onNext(FluxContextWrite.java:107) ~[reactor-core-3.4.17.jar:3.4.17]
	at reactor.core.publisher.FluxPeekFuseable$PeekConditionalSubscriber.onNext(FluxPeekFuseable.java:854) ~[reactor-core-3.4.17.jar:3.4.17]
	at reactor.core.publisher.FluxPeekFuseable$PeekConditionalSubscriber.onNext(FluxPeekFuseable.java:854) ~[reactor-core-3.4.17.jar:3.4.17]
	at io.r2dbc.postgresql.util.FluxDiscardOnCancel$FluxDiscardOnCancelSubscriber.onNext(FluxDiscardOnCancel.java:91) ~[r2dbc-postgresql-0.9.1.RELEASE.jar:0.9.1.RELEASE]
	at reactor.core.publisher.FluxDoFinally$DoFinallySubscriber.onNext(FluxDoFinally.java:130) ~[reactor-core-3.4.17.jar:3.4.17]
	at reactor.core.publisher.FluxHandle$HandleSubscriber.onNext(FluxHandle.java:126) ~[reactor-core-3.4.17.jar:3.4.17]
	at reactor.core.publisher.FluxCreate$BufferAsyncSink.drain(FluxCreate.java:814) ~[reactor-core-3.4.17.jar:3.4.17]
	at reactor.core.publisher.FluxCreate$BufferAsyncSink.next(FluxCreate.java:739) ~[reactor-core-3.4.17.jar:3.4.17]
	at reactor.core.publisher.FluxCreate$SerializedFluxSink.next(FluxCreate.java:161) ~[reactor-core-3.4.17.jar:3.4.17]
	at io.r2dbc.postgresql.client.ReactorNettyClient$Conversation.emit(ReactorNettyClient.java:635) ~[r2dbc-postgresql-0.9.1.RELEASE.jar:0.9.1.RELEASE]
	at io.r2dbc.postgresql.client.ReactorNettyClient$BackendMessageSubscriber.emit(ReactorNettyClient.java:887) ~[r2dbc-postgresql-0.9.1.RELEASE.jar:0.9.1.RELEASE]
	at io.r2dbc.postgresql.client.ReactorNettyClient$BackendMessageSubscriber.onNext(ReactorNettyClient.java:761) ~[r2dbc-postgresql-0.9.1.RELEASE.jar:0.9.1.RELEASE]
	at io.r2dbc.postgresql.client.ReactorNettyClient$BackendMessageSubscriber.onNext(ReactorNettyClient.java:667) ~[r2dbc-postgresql-0.9.1.RELEASE.jar:0.9.1.RELEASE]
	at reactor.core.publisher.FluxHandle$HandleSubscriber.onNext(FluxHandle.java:126) ~[reactor-core-3.4.17.jar:3.4.17]
	at reactor.core.publisher.FluxPeekFuseable$PeekConditionalSubscriber.onNext(FluxPeekFuseable.java:854) ~[reactor-core-3.4.17.jar:3.4.17]
	at reactor.core.publisher.FluxMap$MapConditionalSubscriber.onNext(FluxMap.java:220) ~[reactor-core-3.4.17.jar:3.4.17]
	at reactor.core.publisher.FluxMap$MapConditionalSubscriber.onNext(FluxMap.java:220) ~[reactor-core-3.4.17.jar:3.4.17]
	at reactor.netty.channel.FluxReceive.drainReceiver(FluxReceive.java:279) ~[reactor-netty-core-1.0.18.jar:1.0.18]
	at reactor.netty.channel.FluxReceive.onInboundNext(FluxReceive.java:388) ~[reactor-netty-core-1.0.18.jar:1.0.18]
	at reactor.netty.channel.ChannelOperations.onInboundNext(ChannelOperations.java:404) ~[reactor-netty-core-1.0.18.jar:1.0.18]
	at reactor.netty.channel.ChannelOperationsHandler.channelRead(ChannelOperationsHandler.java:93) ~[reactor-netty-core-1.0.18.jar:1.0.18]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:327) ~[netty-codec-4.1.76.Final.jar:4.1.76.Final]
	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:314) ~[netty-codec-4.1.76.Final.jar:4.1.76.Final]
	at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:435) ~[netty-codec-4.1.76.Final.jar:4.1.76.Final]
	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:279) ~[netty-codec-4.1.76.Final.jar:4.1.76.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:722) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:658) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:584) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:496) ~[netty-transport-4.1.76.Final.jar:4.1.76.Final]
	at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:986) ~[netty-common-4.1.76.Final.jar:4.1.76.Final]
	at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) ~[netty-common-4.1.76.Final.jar:4.1.76.Final]
	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) ~[netty-common-4.1.76.Final.jar:4.1.76.Final]
	at java.base/java.lang.Thread.run(Thread.java:833) ~[na:na]
	```

Exception when lazy loading

I am getting an exception when trying to lazy-load a joined table. I'm using Postgres. The app is written in Kotlin, but I haven't seen any indication yet that the error is Kotlin-related.

This entity is successfully loaded from the repository:
[screenshot of the entity class not reproduced]

Joined entity:
[screenshot of the joined entity class not reproduced]
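
Since the screenshots are not reproduced, here is a rough Kotlin sketch of what the two entities might look like. Everything in it is an assumption pieced together from the report (the foreign-key annotation, the lazyGetShipment accessor, and the column names in the generated SQL below), not the reporter's actual code:

```kotlin
import java.time.Instant
import net.lecousin.reactive.data.relational.annotations.ForeignKey
import net.lecousin.reactive.data.relational.annotations.ForeignTable
import net.lecousin.reactive.data.relational.annotations.GeneratedValue
import org.springframework.data.annotation.Id
import org.springframework.data.relational.core.mapping.Table
import reactor.core.publisher.Mono

@Table
class ShipmentEntity {
    @Id @GeneratedValue
    var id: Long? = null

    // Assumed reverse side of the relation: history rows point back to this shipment.
    @ForeignTable(joinKey = "shipment")
    var history: MutableList<ShipmentHistoryEntity> = mutableListOf()
}

@Table
class ShipmentHistoryEntity {
    @Id @GeneratedValue
    var id: Long? = null

    var status: String? = null
    var updatedAt: Instant? = null

    // The report says this field is a foreign key to the shipment table,
    // so the generated column should be shipment_id.
    @ForeignKey
    var shipment: ShipmentEntity? = null

    // Lazy accessor mentioned in the report; the library replaces its body through
    // class enhancement at startup, so this declaration is only indicative.
    fun lazyGetShipment(): Mono<ShipmentEntity>? = null
}
```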

When calling lazyGetShipment:

java.util.NoSuchElementException: null
	at kotlinx.coroutines.reactor.MonoKt.awaitSingle(Mono.kt:79) ~[kotlinx-coroutines-reactor-1.5.2.jar:na]
	at com.shipi.shipi_shipment.service.ShipmentService.getStoreShipments$suspendImpl(ShipmentService.kt:77) ~[main/:na]
	at com.shipi.shipi_shipment.service.ShipmentService$getStoreShipments$1.invokeSuspend(ShipmentService.kt) ~[main/:na]
	at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33) ~[kotlin-stdlib-1.6.10.jar:1.6.10-release-923(1.6.10)]
	at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:106) ~[kotlinx-coroutines-core-jvm-1.5.2.jar:na]
	at kotlinx.coroutines.scheduling.CoroutineScheduler.runSafely(CoroutineScheduler.kt:571) ~[kotlinx-coroutines-core-jvm-1.5.2.jar:na]
	at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.executeTask(CoroutineScheduler.kt:750) ~[kotlinx-coroutines-core-jvm-1.5.2.jar:na]
	at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.runWorker(CoroutineScheduler.kt:678) ~[kotlinx-coroutines-core-jvm-1.5.2.jar:na]
	at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.run(CoroutineScheduler.kt:665) ~[kotlinx-coroutines-core-jvm-1.5.2.jar:na]

When calling loadEntity() in ShipmentHistoryEntity:

2022-03-29 02:55:56.189  WARN 8712 --- [actor-tcp-nio-1] i.r.p.client.ReactorNettyClient          : Error: SEVERITY_LOCALIZED=ERROR, SEVERITY_NON_LOCALIZED=ERROR, CODE=42703, MESSAGE=column entity.shipment does not exist, HINT=Perhaps you meant to reference the column "entity.shipment_id"., POSITION=80, FILE=parse_relation.c, LINE=3359, ROUTINE=errorMissingColumn
2022-03-29 02:55:56.199  WARN 8712 --- [atcher-worker-1] n.g.e.SimpleDataFetcherExceptionHandler  : Exception while fetching data (/storeShipments) : executeMany; bad SQL grammar [SELECT entity.id AS f0000, entity.status AS f0001, entity.updated_at AS f0002, entity.shipment AS f0003 FROM shipment_history entity WHERE entity.id = $1 LIMIT 1 OFFSET 0]; nested exception is io.r2dbc.postgresql.ExceptionFactory$PostgresqlBadGrammarException: [42703] column entity.shipment does not exist

org.springframework.r2dbc.BadSqlGrammarException: executeMany; bad SQL grammar [SELECT entity.id AS f0000, entity.status AS f0001, entity.updated_at AS f0002, entity.shipment AS f0003 FROM shipment_history entity WHERE entity.id = $1 LIMIT 1 OFFSET 0]; nested exception is io.r2dbc.postgresql.ExceptionFactory$PostgresqlBadGrammarException: [42703] column entity.shipment does not exist
	at org.springframework.r2dbc.connection.ConnectionFactoryUtils.convertR2dbcException(ConnectionFactoryUtils.java:235) ~[spring-r2dbc-5.3.14.jar:5.3.14]
	Suppressed: reactor.core.publisher.FluxOnAssembly$OnAssemblyException: 
Error has been observed at the following site(s):
	*__checkpoint ⇢ SELECT FROM ShipmentHistoryEntity AS entity WHERE entity.id EQUALS 0 LIMIT 0,1
Original Stack Trace:
		at org.springframework.r2dbc.connection.ConnectionFactoryUtils.convertR2dbcException(ConnectionFactoryUtils.java:235) ~[spring-r2dbc-5.3.14.jar:5.3.14]

The second exception is interesting to me. It indicates that the library is attempting to fetch a shipment column from the shipment_history table, even though shipment is annotated as a foreign key to the shipment table. Is there anything I'm doing wrong?

PreferredConstructor.Parameter is unknown for LcEntityReader

Dependencies

implementation 'net.lecousin.reactive-data-relational:postgres:0.10.2'
implementation 'org.springframework.data:spring-data-commons:2.7.0'

Stack trace

java.lang.NoClassDefFoundError: org/springframework/data/mapping/PreferredConstructor$Parameter
	at net.lecousin.reactive.data.relational.mapping.LcEntityReader.getOrCreateInstance(LcEntityReader.java:233) ~[core-0.10.2.jar:na]
	at net.lecousin.reactive.data.relational.mapping.LcEntityReader.getOrCreateInstance(LcEntityReader.java:220) ~[core-0.10.2.jar:na]
	at net.lecousin.reactive.data.relational.mapping.LcEntityReader.read(LcEntityReader.java:110) ~[core-0.10.2.jar:na]
	at net.lecousin.reactive.data.relational.mapping.LcEntityReader.read(LcEntityReader.java:103) ~[core-0.10.2.jar:na]
	at net.lecousin.reactive.data.relational.mapping.LcReactiveDataAccessStrategy.lambda$getRowMapper$0(LcReactiveDataAccessStrategy.java:42) ~[core-0.10.2.jar:na]
	at io.r2dbc.postgresql.PostgresqlResult.lambda$map$2(PostgresqlResult.java:123) ~[r2dbc-postgresql-0.9.1.RELEASE.jar:0.9.1.RELEASE]
	at reactor.core.publisher.FluxHandle$HandleSubscriber.onNext(FluxHandle.java:110) ~[reactor-core-3.4.18.jar:3.4.18]
	at reactor.core.publisher.MonoFlatMapMany$FlatMapManyInner.onNext(MonoFlatMapMany.java:250) ~[reactor-core-3.4.18.jar:3.4.18]
	at reactor.core.publisher.FluxHandleFuseable$HandleFuseableSubscriber.onNext(FluxHandleFuseable.java:191) ~[reactor-core-3.4.18.jar:3.4.18]
	at reactor.core.publisher.FluxFilterFuseable$FilterFuseableConditionalSubscriber.onNext(FluxFilterFuseable.java:337) ~[reactor-core-3.4.18.jar:3.4.18]
	at reactor.core.publisher.FluxContextWrite$ContextWriteSubscriber.onNext(FluxContextWrite.java:107) ~[reactor-core-3.4.18.jar:3.4.18]
	at reactor.core.publisher.FluxPeekFuseable$PeekConditionalSubscriber.onNext(FluxPeekFuseable.java:854) ~[reactor-core-3.4.18.jar:3.4.18]
	at reactor.core.publisher.FluxPeekFuseable$PeekConditionalSubscriber.onNext(FluxPeekFuseable.java:854) ~[reactor-core-3.4.18.jar:3.4.18]
	at io.r2dbc.postgresql.util.FluxDiscardOnCancel$FluxDiscardOnCancelSubscriber.onNext(FluxDiscardOnCancel.java:91) ~[r2dbc-postgresql-0.9.1.RELEASE.jar:0.9.1.RELEASE]
	at reactor.core.publisher.FluxDoFinally$DoFinallySubscriber.onNext(FluxDoFinally.java:130) ~[reactor-core-3.4.18.jar:3.4.18]
	at reactor.core.publisher.FluxHandle$HandleSubscriber.onNext(FluxHandle.java:126) ~[reactor-core-3.4.18.jar:3.4.18]
	at reactor.core.publisher.FluxCreate$BufferAsyncSink.drain(FluxCreate.java:814) ~[reactor-core-3.4.18.jar:3.4.18]
	at reactor.core.publisher.FluxCreate$BufferAsyncSink.next(FluxCreate.java:739) ~[reactor-core-3.4.18.jar:3.4.18]
	at reactor.core.publisher.FluxCreate$SerializedFluxSink.next(FluxCreate.java:161) ~[reactor-core-3.4.18.jar:3.4.18]
	at io.r2dbc.postgresql.client.ReactorNettyClient$Conversation.emit(ReactorNettyClient.java:635) ~[r2dbc-postgresql-0.9.1.RELEASE.jar:0.9.1.RELEASE]
	at io.r2dbc.postgresql.client.ReactorNettyClient$BackendMessageSubscriber.emit(ReactorNettyClient.java:887) ~[r2dbc-postgresql-0.9.1.RELEASE.jar:0.9.1.RELEASE]
	at io.r2dbc.postgresql.client.ReactorNettyClient$BackendMessageSubscriber.onNext(ReactorNettyClient.java:761) ~[r2dbc-postgresql-0.9.1.RELEASE.jar:0.9.1.RELEASE]
	at io.r2dbc.postgresql.client.ReactorNettyClient$BackendMessageSubscriber.onNext(ReactorNettyClient.java:667) ~[r2dbc-postgresql-0.9.1.RELEASE.jar:0.9.1.RELEASE]
	at reactor.core.publisher.FluxHandle$HandleSubscriber.onNext(FluxHandle.java:126) ~[reactor-core-3.4.18.jar:3.4.18]
	at reactor.core.publisher.FluxPeekFuseable$PeekConditionalSubscriber.onNext(FluxPeekFuseable.java:854) ~[reactor-core-3.4.18.jar:3.4.18]
	at reactor.core.publisher.FluxMap$MapConditionalSubscriber.onNext(FluxMap.java:224) ~[reactor-core-3.4.18.jar:3.4.18]
	at reactor.core.publisher.FluxMap$MapConditionalSubscriber.onNext(FluxMap.java:224) ~[reactor-core-3.4.18.jar:3.4.18]
	at reactor.netty.channel.FluxReceive.drainReceiver(FluxReceive.java:279) ~[reactor-netty-core-1.0.19.jar:1.0.19]
	at reactor.netty.channel.FluxReceive.onInboundNext(FluxReceive.java:388) ~[reactor-netty-core-1.0.19.jar:1.0.19]
	at reactor.netty.channel.ChannelOperations.onInboundNext(ChannelOperations.java:404) ~[reactor-netty-core-1.0.19.jar:1.0.19]
	at reactor.netty.channel.ChannelOperationsHandler.channelRead(ChannelOperationsHandler.java:93) ~[reactor-netty-core-1.0.19.jar:1.0.19]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) ~[netty-transport-4.1.77.Final.jar:4.1.77.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) ~[netty-transport-4.1.77.Final.jar:4.1.77.Final]
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) ~[netty-transport-4.1.77.Final.jar:4.1.77.Final]
	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:327) ~[netty-codec-4.1.77.Final.jar:4.1.77.Final]
	at io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:314) ~[netty-codec-4.1.77.Final.jar:4.1.77.Final]
	at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:435) ~[netty-codec-4.1.77.Final.jar:4.1.77.Final]
	at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:279) ~[netty-codec-4.1.77.Final.jar:4.1.77.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) ~[netty-transport-4.1.77.Final.jar:4.1.77.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) ~[netty-transport-4.1.77.Final.jar:4.1.77.Final]
	at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:357) ~[netty-transport-4.1.77.Final.jar:4.1.77.Final]
	at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1410) ~[netty-transport-4.1.77.Final.jar:4.1.77.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:379) ~[netty-transport-4.1.77.Final.jar:4.1.77.Final]
	at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:365) ~[netty-transport-4.1.77.Final.jar:4.1.77.Final]
	at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:919) ~[netty-transport-4.1.77.Final.jar:4.1.77.Final]
	at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:166) ~[netty-transport-4.1.77.Final.jar:4.1.77.Final]
	at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:722) ~[netty-transport-4.1.77.Final.jar:4.1.77.Final]
	at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:658) ~[netty-transport-4.1.77.Final.jar:4.1.77.Final]
	at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:584) ~[netty-transport-4.1.77.Final.jar:4.1.77.Final]
	at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:496) ~[netty-transport-4.1.77.Final.jar:4.1.77.Final]
	at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:995) ~[netty-common-4.1.77.Final.jar:4.1.77.Final]
	at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) ~[netty-common-4.1.77.Final.jar:4.1.77.Final]
	at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) ~[netty-common-4.1.77.Final.jar:4.1.77.Final]
	at java.base/java.lang.Thread.run(Thread.java:833) ~[na:na]
Caused by: java.lang.ClassNotFoundException: org.springframework.data.mapping.PreferredConstructor$Parameter
	at java.base/jdk.internal.loader.BuiltinClassLoader.loadClass(BuiltinClassLoader.java:641) ~[na:na]
	at java.base/jdk.internal.loader.ClassLoaders$AppClassLoader.loadClass(ClassLoaders.java:188) ~[na:na]
	at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:520) ~[na:na]
	... 54 common frames omitted

The inner class org.springframework.data.mapping.PreferredConstructor.Parameter was migrated to its own top-level class, org.springframework.data.mapping.Parameter.
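
If this version mismatch is the cause, one possible workaround (an assumption, not an official fix) is to keep spring-data-commons on a 2.6.x release, which still ships PreferredConstructor$Parameter, instead of forcing 2.7.0 next to a library version compiled against the older class:

```gradle
// Hypothetical Gradle snippet: align spring-data-commons with a 2.6.x release
// that still contains org.springframework.data.mapping.PreferredConstructor$Parameter.
implementation 'net.lecousin.reactive-data-relational:postgres:0.10.2'
implementation 'org.springframework.data:spring-data-commons:2.6.4' // 2.6.4 is only an example version
```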

Repository.save not fully updating postgres database

Hello again, here is my situation:

  • I create an entity, and save it to the database. This works as expected
  • I update the entity, including some data in a joined foreign table
  • I call repository.save again, and the data in the foreign table is inserted, but the entity is not updated

The repository has not been customized:

@Repository
interface ShipmentRepository : LcR2dbcRepository<ShipmentEntity, Long>

This is the initial save, which works as expected.

        var shipment = ShipmentEntity()
        shipment.createdBy = auth.id!!
        shipment.name = request.name
        shipment.fragile = request.fragile
        shipment.weight = request.weight
        shipment.length = request.length
        shipment.width = request.width
        shipment.height = request.height
        shipment.storeId = request.storeId.value.toLong()

        val shipmentHistory = arrayListOf(ShipmentHistoryEntity())
        shipmentHistory[0].shipment = shipment
        shipment.history = shipmentHistory

        shipment = shipmentRepository.save(shipment).awaitSingle()

For debugging purposes, these are the corresponding SQL statements run by R2DBC (they look good to me):

2022-04-02 04:09:18.307 DEBUG 11312 --- [actor-tcp-nio-1] o.s.r2dbc.core.DefaultDatabaseClient     : Executing SQL statement [INSERT INTO shipment (id, name, created_by, customer_id, delivery_image, delivery_image_ext, store_id, fragile, weight, width, length, height, reserved_at, store_location, customer_location) VALUES (NEXTVAL('shipment_id_seq'), $1, $2, $3, NULL, NULL, $4, $5, $6, $7, $8, $9, NULL, NULL, NULL)]
2022-04-02 04:09:18.604 DEBUG 11312 --- [actor-tcp-nio-1] o.s.r2dbc.core.DefaultDatabaseClient     : Executing SQL statement [INSERT INTO shipment_history (id, status, updated_at, driver_id, transport_id, shipment_id) VALUES (NEXTVAL('shipment_history_id_seq'), $1, $2, NULL, NULL, $3)]

Then some updates are made to the shipment entity:

            shipment.customerId = data.createShipmentUserInfo.toLong()

            val storeLocation = addressData.createShipmentAddressInfo.storeLocation
            val customerLocation = addressData.createShipmentAddressInfo.customerLocation
            shipment.storeLocation =
                geometryFactory.createPoint(Coordinate(storeLocation.longitude, storeLocation.latitude))
            shipment.customerLocation =
                geometryFactory.createPoint(Coordinate(customerLocation.longitude, customerLocation.latitude))
            val newHistory = ShipmentHistoryEntity()
            newHistory.status = ShipmentStatus.REQUESTED
            newHistory.shipment = shipment
            shipment.history.add(newHistory)

            shipmentRepository.save(shipment).awaitSingle()

I understand that the geometry point updates may be a special case; however, customerId is a simple bigint/long and it is not updated either. Here is the corresponding SQL:

2022-04-02 04:09:32.218 DEBUG 11312 --- [actor-tcp-nio-2] o.s.r2dbc.core.DefaultDatabaseClient     : Executing SQL statement [INSERT INTO shipment_history (id, status, updated_at, driver_id, transport_id, shipment_id) VALUES (NEXTVAL('shipment_history_id_seq'), $1, $2, NULL, NULL, $3)]

As you can see, the new history element is inserted, but nothing in shipment is updated, even though customerId, customerLocation, and storeLocation have all changed. Note that I updated the R2DBC driver to org.postgresql:r2dbc-postgresql:0.9.1.RELEASE to support the PostGIS extension (for the geometry type). Am I doing anything wrong here?

Redundant updates of collection used as an array column

The library supports mapping a collection to an array column. However, when updating an already existing entity, unnecessary statements are executed: even if the collection stored as an array column has not changed, an UPDATE is issued anyway. The problem is in the needsUpdate() method of the SaveProcessor class: the reference comparison between the persisted and the current collection is never true, so the column is always considered changed. A check with Objects.deepEquals() alone would be sufficient.
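
To illustrate the reported behaviour, here is a small self-contained Kotlin snippet (not the library's own code, just a demonstration of the comparison involved): two logically identical arrays are never reference-equal, so a reference check always reports a change, while Objects.deepEquals() recognises the unchanged content.

```kotlin
import java.util.Objects

fun main() {
    // The value persisted earlier vs. the value read from the entity before saving again.
    val persisted = arrayOf("red", "green", "blue")
    val current = arrayOf("red", "green", "blue")

    // Reference comparison: false for two distinct instances,
    // so the array column would be flagged as changed on every save.
    println(persisted === current)                  // false

    // Deep equality: recognises that the content is unchanged.
    println(Objects.deepEquals(persisted, current)) // true
}
```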
