mikro-orm / mikro-orm

TypeScript ORM for Node.js based on Data Mapper, Unit of Work and Identity Map patterns. Supports MongoDB, MySQL, MariaDB, MS SQL Server, PostgreSQL and SQLite/libSQL databases.

Home Page: https://mikro-orm.io

License: MIT License


mikro-orm's Introduction

MikroORM

TypeScript ORM for Node.js based on Data Mapper, Unit of Work and Identity Map patterns. Supports MongoDB, MySQL, MariaDB, MS SQL Server, PostgreSQL and SQLite (including libSQL) databases.

Heavily inspired by Doctrine and Hibernate.


πŸ€” Unit of What?

You might be asking: What the hell is Unit of Work and why should I care about it?

Unit of Work maintains a list of objects (entities) affected by a business transaction and coordinates the writing out of changes. (Martin Fowler)

Identity Map ensures that each object (entity) gets loaded only once by keeping every loaded object in a map. Looks up objects using the map when referring to them. (Martin Fowler)

So what benefits does it bring to us?

Implicit Transactions

The first and most important implication of having a Unit of Work is that it allows handling transactions automatically.

When you call em.flush(), all computed changes are executed inside a database transaction (if supported by the given driver). This means that you can control the boundaries of transactions simply by calling em.persistLater(), and once all your changes are ready, calling flush() will run them inside a transaction.

You can also control the transaction boundaries manually via em.transactional(cb).
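For illustration, a minimal sketch of the transactional API (the User entity and the email value are just example data):

await em.transactional(async em => {
  // the callback receives a forked EntityManager bound to the transaction
  const user = await em.findOneOrFail(User, 1);
  user.email = 'new@example.com';
  // changes are flushed and committed when the callback resolves,
  // and rolled back if it throws
});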

const user = await em.findOneOrFail(User, 1);
user.email = '[email protected]';
const car = new Car();
user.cars.add(car);

// thanks to bi-directional cascading we only need to persist user entity
// flushing will create a transaction, insert new car and update user with new email
// as user entity is managed, calling flush() is enough
await em.flush();

ChangeSet based persistence

MikroORM allows you to implement your domain/business logic directly in the entities. To keep your entities always valid, you can use constructors to mark required properties. Let's define the User entity used in the previous example:

@Entity()
export class User {

  @PrimaryKey()
  id!: number;

  @Property()
  name!: string;

  @Property()
  title?: string;

  @OneToOne(() => Address)
  address?: Address;

  @ManyToMany(() => Car)
  cars = new Collection<Car>(this);

  constructor(name: string) {
    this.name = name;
  }

}

Now, to create a new instance of the User entity, we are forced to provide the name:

const user = new User('John Doe'); // name is required to create new user instance
user.address = new Address('10 Downing Street'); // address is optional

Once your entities are loaded, you can perform a number of synchronous changes on them, then call em.flush(). This will trigger change set computation. Only entities (and properties) that were changed will generate database queries; if there are no changes, no transaction will be started.

const user = await em.findOneOrFail(User, 1, {
  populate: ['cars', 'address.city'],
});
user.title = 'Mr.';
user.address.street = '10 Downing Street'; // address is a 1:1 relation (Address entity)
user.cars.getItems().forEach(car => car.forSale = true); // cars is a m:n collection of Car entities
const car = new Car('VW');
user.cars.add(car);

// now we can flush all changes done to managed entities
await em.flush();

For the example above, em.flush() will execute the following queries:

begin;
update "user" set "title" = 'Mr.' where "id" = 1;
update "user_address" set "street" = '10 Downing Street' where "id" = 123;
update "car"
  set "for_sale" = case
    when ("id" = 1) then true
    when ("id" = 2) then true
    when ("id" = 3) then true
    else "for_sale" end
  where "id" in (1, 2, 3)
insert into "car" ("brand", "owner") values ('VW', 1);
commit;

Identity Map

Thanks to Identity Map, you will always have only one instance of given entity in one context. This allows for some optimizations (skipping loading of already loaded entities), as well as comparison by identity (ent1 === ent2).
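As a quick illustration (a sketch reusing the User entity from the examples above):

const u1 = await em.findOneOrFail(User, 1);
const u2 = await em.findOneOrFail(User, 1); // served from the identity map, no extra query
console.log(u1 === u2); // true, same instance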

πŸ“– Documentation

MikroORM documentation, included in this repo in the root directory, is built with Docusaurus and publicly hosted on GitHub Pages at https://mikro-orm.io.

There is also an auto-generated CHANGELOG.md file based on commit messages (via semantic-release).

✨ Core Features

πŸ“¦ Example Integrations

You can find example integrations for some popular frameworks in the mikro-orm-examples repository:

TypeScript Examples

JavaScript Examples

πŸš€ Quick Start

First install the module via yarn or npm and do not forget to install the database driver as well:

Since v4, you should install the driver package, but not the db connector itself, e.g. install @mikro-orm/sqlite, but not sqlite3 as that is already included in the driver package.

yarn add @mikro-orm/core @mikro-orm/mongodb       # for mongo
yarn add @mikro-orm/core @mikro-orm/mysql         # for mysql/mariadb
yarn add @mikro-orm/core @mikro-orm/mariadb       # for mysql/mariadb
yarn add @mikro-orm/core @mikro-orm/postgresql    # for postgresql
yarn add @mikro-orm/core @mikro-orm/mssql         # for mssql
yarn add @mikro-orm/core @mikro-orm/sqlite        # for sqlite
yarn add @mikro-orm/core @mikro-orm/better-sqlite # for better-sqlite
yarn add @mikro-orm/core @mikro-orm/libsql        # for libsql

or

npm i -s @mikro-orm/core @mikro-orm/mongodb       # for mongo
npm i -s @mikro-orm/core @mikro-orm/mysql         # for mysql/mariadb
npm i -s @mikro-orm/core @mikro-orm/mariadb       # for mysql/mariadb
npm i -s @mikro-orm/core @mikro-orm/postgresql    # for postgresql
npm i -s @mikro-orm/core @mikro-orm/mssql         # for mssql
npm i -s @mikro-orm/core @mikro-orm/sqlite        # for sqlite
npm i -s @mikro-orm/core @mikro-orm/better-sqlite # for better-sqlite
npm i -s @mikro-orm/core @mikro-orm/libsql        # for libsql

Next, if you want to use decorators for your entity definition, you will need to enable support for decorators as well as esModuleInterop in tsconfig.json via:

"experimentalDecorators": true,
"emitDecoratorMetadata": true,
"esModuleInterop": true,

Alternatively, you can use EntitySchema.
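Here is a minimal sketch of the EntitySchema approach, which needs no decorator support; the Book shape is purely illustrative:

import { EntitySchema } from '@mikro-orm/core';

export interface IBook {
  id: number;
  title: string;
}

export const Book = new EntitySchema<IBook>({
  name: 'Book',
  properties: {
    id: { type: 'number', primary: true },
    title: { type: 'string' },
  },
});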

Then call MikroORM.init as part of bootstrapping your app:

import type { PostgreSqlDriver } from '@mikro-orm/postgresql'; // or any other SQL driver package

const orm = await MikroORM.init<PostgreSqlDriver>({
  entities: ['./dist/entities'], // path to your JS entities (dist), relative to `baseDir`
  dbName: 'my-db-name',
  type: 'postgresql',
});
console.log(orm.em); // access EntityManager via `em` property

To access driver specific methods like em.createQueryBuilder(), we need to specify the driver type when calling MikroORM.init(). Alternatively, we can cast orm.em to the EntityManager exported from the driver package:

import { EntityManager } from '@mikro-orm/postgresql';
const em = orm.em as EntityManager;
const qb = em.createQueryBuilder(...);

There are more ways to configure your entities, take a look at installation page.

Read more about all the possible configuration options in Advanced Configuration section.

Then you will need to fork the entity manager for each request, so their identity maps will not collide. To do so, use the RequestContext helper:

const app = express();

app.use((req, res, next) => {
  RequestContext.create(orm.em, next);
});

You should register this middleware as the last one, just before your request handlers and before any of your custom middleware that uses the ORM. There might be issues when you register it before request-processing middleware like queryParser or bodyParser, so definitely register the context after them.

More info about RequestContext is described here.

Now you can start defining your entities (in one of the entities folders). This is how a simple entity can look in the mongo driver:

./entities/MongoBook.ts

@Entity()
export class MongoBook {

  @PrimaryKey()
  _id!: ObjectId;

  @SerializedPrimaryKey()
  id!: string;

  @Property()
  title: string;

  @ManyToOne(() => Author)
  author: Author;

  @ManyToMany(() => BookTag)
  tags = new Collection<BookTag>(this);

  constructor(title: string, author: Author) {
    this.title = title;
    this.author = author;
  }

}

For SQL drivers, you can use an id: number PK:

./entities/SqlBook.ts

@Entity()
export class SqlBook {

  @PrimaryKey()
  id!: number;

}

Or if you want to use UUID primary keys:

./entities/UuidBook.ts

import { v4 } from 'uuid';

@Entity()
export class UuidBook {

  @PrimaryKey()
  uuid = v4();

}

More information can be found in the defining entities section of the docs.

When you have your entities defined, you can start using the ORM, either via the EntityManager or via EntityRepositories.

To save entity state to the database, you need to persist it. Persist takes care of deciding whether to use insert or update, and computes the appropriate change set. Entity references that are not persisted yet (do not have an identifier) will be cascade-persisted automatically.

// use constructors in your entities for required parameters
const author = new Author('Jon Snow', '[email protected]');
author.born = new Date();

const publisher = new Publisher('7K publisher');

const book1 = new Book('My Life on The Wall, part 1', author);
book1.publisher = publisher;
const book2 = new Book('My Life on The Wall, part 2', author);
book2.publisher = publisher;
const book3 = new Book('My Life on The Wall, part 3', author);
book3.publisher = publisher;

// just persist books, author and publisher will be automatically cascade persisted
await em.persistAndFlush([book1, book2, book3]);

To fetch entities from database you can use find() and findOne() of EntityManager:

const authors = await em.find(Author, {}, { populate: ['books'] });

for (const author of authors) {
  console.log(author); // instance of Author entity
  console.log(author.name); // Jon Snow

  for (const book of author.books) { // iterating books collection
    console.log(book); // instance of Book entity
    console.log(book.title); // My Life on The Wall, part 1/2/3
  }
}

A more convenient way of fetching entities from the database is to use an EntityRepository, which carries the entity name, so you do not have to pass it to every find() and findOne() call:

const booksRepository = em.getRepository(Book);

const books = await booksRepository.find({ author: '...' }, { 
  populate: ['author'],
  limit: 1,
  offset: 2,
  orderBy: { title: QueryOrder.DESC },
});

console.log(books); // Loaded<Book, 'author'>[]

Take a look at the docs about working with the EntityManager or using the EntityRepository instead.

🀝 Contributing

Contributions, issues and feature requests are welcome. Please read CONTRIBUTING.md for details on the process for submitting pull requests to us.

Authors

πŸ‘€ Martin AdΓ‘mek

See also the list of contributors who participated in this project.

Show Your Support

Please ⭐️ this repository if this project helped you!

πŸ“ License

Copyright Β© 2018 Martin AdΓ‘mek.

This project is licensed under the MIT License - see the LICENSE file for details.


mikro-orm's Issues

ValidationError: Employee entity is missing @PrimaryKey()

I have three entities: Employee, which extends User, which extends StandardEntity. I'm getting the error ValidationError: Employee entity is missing @PrimaryKey() when initializing mikro-orm. It seems that mikro-orm only goes to the first extends to look for the @PrimaryKey. It works fine with other entities that directly extend StandardEntity, but not when inheritance is nested in multiple levels.

standard-entity.ts:

import { IEntity, PrimaryKey, Property } from 'mikro-orm'
import { ObjectId } from 'mongodb'
import { Field, ID } from 'type-graphql'

export abstract class StandardEntity {
  @PrimaryKey()
  readonly _id: ObjectId

  @Field(type => ID)
  id: string

  @Property()
  @Field()
  createdAt: Date

  @Property({ onUpdate: () => new Date() })
  @Field()
  updatedAt: Date

  constructor() {
    const id = new ObjectId()
    this._id = id
    this.id = id.toHexString()

    const now = new Date()
    this.createdAt = now
    this.updatedAt = now
  }
}

export interface StandardEntity extends IEntity<string> {}

user.ts:

import { IsEmail } from 'class-validator'
import { Property } from 'mikro-orm'
import { Field, InterfaceType } from 'type-graphql'

import { Language, StandardEntity } from '@/shared'
import { UserType } from '@/user'

@InterfaceType({
  resolveType: value => (value.type === 'customer' ? 'Customer' : 'Employee'),
})
export abstract class User extends StandardEntity {
  @Property()
  @Field()
  blocked: boolean = false

  @Property()
  @Field()
  displayName: string

  @Property()
  @Field()
  fullName: string

  @Property()
  @Field()
  @IsEmail()
  email: string

  @Property()
  @Field({ nullable: true })
  phoneNumber?: string

  @Property()
  @Field({ nullable: true })
  photoUrl?: string

  @Property()
  @Field(type => Language)
  lang: Language

  @Property()
  @Field({ nullable: true })
  identityNumber?: string

  @Property()
  @Field(type => UserType)
  type: UserType

  constructor(
    displayName: string,
    fullName: string,
    email: string,
    phoneNumber: string | undefined,
    photoUrl: string | undefined,
    lang: Language,
    identityNumber: string | undefined,
    type: UserType,
  ) {
    super()
    this.displayName = displayName
    this.fullName = fullName
    this.email = email
    this.phoneNumber = phoneNumber
    this.photoUrl = photoUrl
    this.lang = lang
    this.identityNumber = identityNumber
    this.type = type
  }
}

employee.ts:

import { Entity, Property } from 'mikro-orm'
import { Field } from 'type-graphql'

import { Language } from '@/shared'
import { EmployeeRole, UserType } from '@/user'

import { User } from './user'


@Entity({ collection: 'users' })
export class Employee extends User {
  @Property()
  @Field(type => EmployeeRole)
  role: EmployeeRole

  constructor(
    displayName: string,
    fullName: string,
    email: string,
    phoneNumber: string | undefined,
    photoUrl: string | undefined,
    lang: Language,
    identityNumber: string | undefined,
    role: EmployeeRole,
  ) {
    super(
      displayName,
      fullName,
      email,
      phoneNumber,
      photoUrl,
      lang,
      identityNumber,
      UserType.EMPLOYEE,
    )

    this.role = role
  }
}

Do not require manual persisting of already known entities

Before the actual change set computation, iterate through the identity map and persist all entities found there. By doing this, calling persistLater() will not be needed for entities that are loaded from the database (and are therefore part of the identity map).
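In practice, this would mean that loaded (managed) entities get flushed without any explicit persist call, along these lines:

const user = await em.findOneOrFail(User, 1); // loaded => managed, part of the identity map
user.name = 'new name';
await em.flush(); // the change is detected without any persist()/persistLater() call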

Disable auto flushing by default

Currently there is an autoFlush option with its default value set to true. This means that em.persist() by default flushes the EM automatically (it behaves like em.persistAndFlush()). This differs from other ORMs (Doctrine, Hibernate, etc.) and can cause a lot of confusion, as each persist call will run its query inside a small transaction.

Set autoFlush option to false by default and change em.persist() method to be synchronous by default. Keep autoFlush in place to ease migration to v3.

BREAKING CHANGE: em.flush() call is now required to run database queries. If you were using persistLater()/persistAndFlush(), then no action is required. You can now use persist() instead of persistLater().
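A sketch of the before/after usage implied by this change:

// before (autoFlush: true): each persist() call flushed immediately
await em.persist(user);

// after (autoFlush: false): persist() is synchronous, flush() is explicit
em.persist(user);
await em.flush();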

MongoDB: useUnifiedTopology

I'm getting the following error message when MongoDB connection is initialized:

(node:18769) DeprecationWarning: current Server Discovery and Monitoring engine is deprecated, and will be removed in a future version. To use the new Server Discover and Monitoring engine, pass option { useUnifiedTopology: true } to the MongoClient constructor.

As the error says, it's just a matter of adding useUnifiedTopology: true to this line: https://github.com/mikro-orm/mikro-orm/blob/master/lib/connections/MongoConnection.ts#L35
Not sure if it could cause any side-effect though.
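For reference, a sketch of passing the option to the driver directly (the connection URL is illustrative):

import { MongoClient } from 'mongodb';

const client = new MongoClient('mongodb://localhost:27017', {
  useNewUrlParser: true,
  useUnifiedTopology: true, // opts into the new Server Discovery and Monitoring engine
});
await client.connect();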

Dependency Version
node 8.13.0
mikro-orm 2.7.7
mongodb 3.3.2

Error when handling the @PrimaryKey decorator

TypeError: Cannot read property '1' of null
    at Function.lookupPathFromDecorator (/Users/chao.yang/Work/stuff-content/node_modules/mikro-orm/dist/utils/Utils.js:177:78)
    at /Users/chao.yang/Work/stuff-content/node_modules/mikro-orm/dist/decorators/PrimaryKey.js:12:23
    at DecorateProperty (/Users/chao.yang/Work/stuff-content/node_modules/reflect-metadata/Reflect.js:553:33)

OS: Mac OSX
Node: 10.x.x
Framework: NestJS
DB: Postgres

Add @OneToOne decorator

Functionally pretty much the same as @ManyToOne, but it can have stricter validation. It is also handy for schema generation, as one can choose the owning side (as opposed to @ManyToOne), plus we can add a UNIQUE index on the 1:1 column.

Another difference will be support for orphan removal (#36), which cannot be used with @ManyToOne.
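A sketch of how the proposed decorator might be used once implemented (the entities and the orphanRemoval flag are illustrative, the latter depending on #36):

@Entity()
export class User {

  // owning side of the 1:1 relation; the FK column gets a UNIQUE index
  @OneToOne({ entity: () => Address, orphanRemoval: true })
  address!: Address;

}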

How to use it with MongoDB Atlas?

I am trying to use mikro-orm with MongoDB Atlas, but I am getting a permissions error.

{ MongoError: user is not allowed to do action [find] on [debug.business]
    at Connection.<anonymous> (/home/alejandro/Proyectos/localeco/backend/node_modules/mongodb-core/lib/connection/pool.js:443:61)
    at Connection.emit (events.js:189:13)
    at processMessage (/home/alejandro/Proyectos/localeco/backend/node_modules/mongodb-core/lib/connection/connection.js:364:10)
    at TLSSocket.<anonymous> (/home/alejandro/Proyectos/localeco/backend/node_modules/mongodb-core/lib/connection/connection.js:533:15)
    at TLSSocket.emit (events.js:189:13)
    at addChunk (_stream_readable.js:284:12)
    at readableAddChunk (_stream_readable.js:265:11)
    at TLSSocket.Readable.push (_stream_readable.js:220:10)
    at TLSWrap.onStreamRead [as onread] (internal/stream_base_commons.js:94:17)
  ok: 0,
  errmsg:
   'user is not allowed to do action [find] on [debug.business]',
  code: 8000,
  codeName: 'AtlasError',
  name: 'MongoError',
  [Symbol(mongoErrorContextSymbol)]: {} }

I think I have followed the documentation correctly. Here is the relevant code.

index.ts

import { Collection, MongoClient, ObjectId } from "mongodb";
import "reflect-metadata";
import { container } from "tsyringe";
import OrmClient from "./ormClient";
import WebServer from "./webServer";

async function bootstrap(): Promise<void> {
    const ormClient: OrmClient = container.resolve(OrmClient);
    const webServer: WebServer = container.resolve(WebServer);

    await ormClient.initialize();
    await webServer.initialize();

    await test();
}

async function test(): Promise<void> {
    const mongo: MongoClient = new MongoClient(process.env.DATABASE_URL as string, {
        auth: {
            user: process.env.DATABASE_USER as string,
            password: process.env.DATABASE_PASSWORD as string,
        },
        useNewUrlParser: true,
    });

    await mongo.connect();

    const collection: Collection = mongo.db(process.env.DATABASE_NAME as string).collection("business");

// tslint:disable-next-line: no-console
    console.log(await collection.findOne({ _id: ObjectId.createFromHexString("5c9d155106dd7215cee1da4c")}));
}

export default bootstrap();

ormClient.ts

import { EntityManager, MikroORM } from "mikro-orm";
import { singleton } from "tsyringe";

@singleton()
export default class OrmClient {
    private mikroOrm!: MikroORM;

    public async initialize(): Promise<void> {
        this.mikroOrm = await MikroORM.init({
            clientUrl: process.env.DATABASE_URL as string,
            user: process.env.DATABASE_USER as string,
            password: process.env.DATABASE_PASSWORD as string,
            dbName: process.env.DATABASE_NAME as string,
            entitiesDirs: ["build/entities"],
            entitiesDirsTs: ["source/entities"],
        });
    }

    public get em(): EntityManager {
        return this.mikroOrm.em;
    }
}

entities/business.ts

import { Entity, IEntity, PrimaryKey, Property } from "mikro-orm";
import { ObjectId } from "mongodb";

@Entity()
export class Business {
    @PrimaryKey()
    public _id!: ObjectId;

    @Property()
    public name!: string;

    @Property()
    public description!: string;
}

export interface Business extends IEntity<string> { }

webServer.ts

import express, { Application } from "express";
import { createServer, Server } from "http";
import { injectable } from "tsyringe";
import GraphqlApi from "./middlewares/graphqlApi";
import OrmContext from "./middlewares/ormContext";

@injectable()
export default class WebServer {
    private application: Application;
    private server: Server;

    public constructor() {
        this.application = express();
        this.server = createServer(this.application);
    }

    public async initialize(): Promise<void> {
        const ormContext: OrmContext = new OrmContext();
        const graphqlApi: GraphqlApi = new GraphqlApi();

        ormContext.apply(this.application);
        graphqlApi.apply(this.application);

        await this.listen();
    }

    ...

middlewares/ormContext.ts

import { Application } from "express";
import { RequestContext } from "mikro-orm";
import { autoInjectable, inject } from "tsyringe";
import OrmClient from "../ormClient";

@autoInjectable()
export default class OrmContext {
    private ormClient: OrmClient;

    public constructor(@inject(OrmClient) ormClient?: OrmClient) {
        this.ormClient = ormClient as OrmClient;
    }

    public apply(application: Application): void {
        application.use((...[, , next]) => {
// tslint:disable-next-line: no-console
            console.log("PASO");
            RequestContext.create(this.ormClient.em, next);
        });
    }
}

providers/businessStore.ts

import { DataSource } from "apollo-datasource";
import { autoInjectable, inject } from "tsyringe";
import { Business } from "../entities/business";
import OrmClient from "../ormClient";
import { GraphqlContext } from "../types/common";

@autoInjectable()
export default class BusinessStore extends DataSource<GraphqlContext> {
    private ormClient: OrmClient;

    public constructor(@inject(OrmClient) ormClient?: OrmClient) {
        super();

        this.ormClient = ormClient as OrmClient;
    }

    public async getId(id: string): Promise<any> {
// tslint:disable-next-line: no-console
        console.log(id);

        try {
            return await this.ormClient.em.findOne(Business, "5c9d155106dd7215cee1da4c");
        } catch (error) {
// tslint:disable-next-line: no-console
            console.log(error);
            return null;
        }
    }
}

And the output of start the application and do one request:

{ _id: 5c9d155106dd7215cee1da4c,
  name: 'Prueba',
  description: 'Prueba de creaccion' }
PASO
5c9d155106dd7215cee1da4c
{ MongoError: user is not allowed to do action [find] on [debug.business]
    at Connection.<anonymous> (/home/alejandro/Proyectos/localeco/backend/node_modules/mongodb-core/lib/connection/pool.js:443:61)
    at Connection.emit (events.js:189:13)
    at processMessage (/home/alejandro/Proyectos/localeco/backend/node_modules/mongodb-core/lib/connection/connection.js:364:10)
    at TLSSocket.<anonymous> (/home/alejandro/Proyectos/localeco/backend/node_modules/mongodb-core/lib/connection/connection.js:533:15)
    at TLSSocket.emit (events.js:189:13)
    at addChunk (_stream_readable.js:284:12)
    at readableAddChunk (_stream_readable.js:265:11)
    at TLSSocket.Readable.push (_stream_readable.js:220:10)
    at TLSWrap.onStreamRead [as onread] (internal/stream_base_commons.js:94:17)
  ok: 0,
  errmsg:
   'user is not allowed to do action [find] on [debug.business]',
  code: 8000,
  codeName: 'AtlasError',
  name: 'MongoError',
  [Symbol(mongoErrorContextSymbol)]: {} }

As you can see, using the MongoDB native driver with the same credentials, everything works well.

Support multiple conditions in JOINs

Currently the QB only allows joining automatically based on a FK. Add support for additional conditions, so one can make the join match only a subset of rows.

const res = await qb.select('*').leftJoin('other', 'o', { foo: 'bar' }).execute();
// this will trigger query similar to `SELECT * FROM table t LEFT JOIN other AS o ON (o.id = t.other_id AND foo = 'bar')`

Complex join conditions are also supported, including smart query operators.

qb.select(['a.*', 'b.*'])
  .leftJoin('a.books', 'b', {
    'b.foo:gte': '123',
    'b.baz': { $gt: 1, $lte: 10 },
    '$or': [
      { 'b.foo': null, 'b.baz': 0, 'b.bar:ne': 1 },
      { 'b.bar': /321.*/ },
      { $and: [
        { 'json_contains(`a`.`meta`, ?)': [{ 'b.foo': 'bar' }] },
        { 'json_contains(`a`.`meta`, ?) = ?': [{ 'b.foo': 'bar' }, false] },
        { 'lower(b.bar)': '321' },
      ] },
    ],
  });

Add support for `Cascade.ALL`

Currently Cascade.PERSIST is the default value, but when you want to have both persist and remove cascading, you need to specify them both via [Cascade.PERSIST, Cascade.REMOVE].

Cascade.ALL would behave like those two flags.
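A sketch of what the shorthand would look like on a relation (entity names are illustrative):

// instead of { cascade: [Cascade.PERSIST, Cascade.REMOVE] }
@ManyToOne({ entity: () => Author, cascade: [Cascade.ALL] })
author!: Author;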

Support smart search conditions

Currently, when you want to search with an operator other than equals, you need to wrap the condition in another object like this:

const cond = { date: { $gt: new Date() } };

Add support for smart key operators:

const cond = { 'date:>': new Date() };

List of possible operators:

> < <= >= ! :in :nin :gt(e) :lt(e) :ne :not
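For illustration, a couple of the proposed smart key operators next to their wrapped-object equivalents:

const cond1 = { 'age:gte': 18 };           // same as { age: { $gte: 18 } }
const cond2 = { 'status:in': ['a', 'b'] }; // same as { status: { $in: ['a', 'b'] } }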

Support virtual property getters

You can have a virtual property like fullName that is computed based on firstName and lastName:

@Property({ name: 'fullName' })
getFullName() {
  return `${this.firstName} ${this.lastName}`;
}

When defined like this, the property fullName will be available as part of the serialized response (the output of entity.toJSON()).

Is there any locking mechanism?

Is there any locking mechanism? Is there any roadmap to implement it? This is a really great project; I hope many users will use it.

Implement transactions in mongo driver

MongoDB 4 supports transactions, so we should add support for that.

Maybe via a new driver, keeping the current MongoDriver for MongoDB 3, or via some configuration option that would be disabled by default (or automatically enabled via feature detection/version comparison).

Add onInit lifecycle hook

Fired after the entity instance is created; a good place to fill your virtual/shadow properties:

@Entity()
export class Author {

  @Property()
  firstName: string;

  @Property()
  lastName: string;

  @Property({ persist: false })
  fullName: string;

  @OnInit()
  onInit() {
    this.fullName = `${this.firstName} ${this.lastName}`;
  }
  
}

Automatically map raw results to entities when setting collection items

Currently, when a user loads results via the QB, they cannot be directly assigned to a collection via Collection.set (it works, but you will end up with plain objects assigned instead of entities, which can result in unwanted behaviour like a lack of proper serialization, i.e. toJSON won't be called on collection items).

const res = await repo.createQueryBuilder().select('*').execute(); // we have array of objects as result here
const users = res.map(data => repo.map(data)); // this is currently needed but won't be after this improvement
entity.users.set(users); // set the result to collection

logo contribution

Hello @B4nan. I contribute to open source projects in my free time. Would you like me to design a logo for this repo? Actually, I have an idea; I can show you if you want. I will wait for feedback. Best regards.

Partially update a MongoDb embedded document

I'm not sure if this is a bug or if it is expected behavior, because I can't find documentation about how to work with embedded documents.

entity/locationModel.ts

import { Entity, IEntity, ManyToOne, PrimaryKey, Property } from "mikro-orm";
import { ObjectId } from "mongodb";
import LocationData from "../repositories/locationData";
import { GeometryEntity } from "../types/embedded";
import { BusinessModel } from "./businessModel";

@Entity({ collection: "location", customRepository: () => LocationData })
export class LocationModel {
    @PrimaryKey({ type: "ObjectId" })
    public _id!: ObjectId;

    @Property({ type: "GeometryEntity" })
    public geometry: GeometryEntity;

    @ManyToOne({ type: "BusinessModel", entity: () => BusinessModel })
    public business!: BusinessModel;

    public constructor(geometry: GeometryEntity) {
        this.geometry = geometry;
    }
}

export interface LocationModel extends IEntity<string> { }

types/embedded.ts

export interface GeometryEntity {
    type: string;
    coordinates: number[];
}

repositories/locationData.ts

import { EntityRepository } from "mikro-orm";
import { LocationModel } from "../entities/locationModel";
import { CreateLocation, EditLocation } from "../types/inputs/location";

export default class LocationData extends EntityRepository<LocationModel> {
    ...

    public async update(input: EditLocation): Promise<LocationModel | null> {
        const locationModel: LocationModel | null = await this.findOne(input.id);

// tslint:disable-next-line: no-console
        console.log(input);

// tslint:disable-next-line: no-console
        console.log(locationModel);

        if (locationModel) {
            locationModel.assign(input);

// tslint:disable-next-line: no-console
            console.log(locationModel);

            try {
                await this.persist(locationModel);
            } catch (error) {
// tslint:disable-next-line: no-console
                console.log(error);
            }
        }

        return locationModel;
    }
}

Output result:

[query-logger] db.getCollection("location").find({"_id":"5c9d218960ccc21b905f8069"}).limit(1).toArray(); [took 72 ms]
[Object: null prototype] {
  id: '5c9d218960ccc21b905f8069',
  geometry: [Object: null prototype] { coordinates: [ 25, 25 ] } }
LocationModel {
  _id: 5c9d218960ccc21b905f8069,
  geometry: { type: 'Point', coordinates: [ 50, 50 ] },
  business:
   BusinessModel {
     _id: 5c9d155106dd7215cee1da4c,
     locations:
      Collection {
        owner: [Circular],
        items: [],
        initialized: false,
        dirty: false,
        _populated: false },
     __initialized: false } }
LocationModel {
  _id: 5c9d218960ccc21b905f8069,
  geometry: [Object: null prototype] { coordinates: [ 25, 25 ] },
  business:
   BusinessModel {
     _id: 5c9d155106dd7215cee1da4c,
     locations:
      Collection {
        owner: [Circular],
        items: [],
        initialized: false,
        dirty: false,
        _populated: false },
     __initialized: false } }
{ MongoError: Can't extract geo keys: { _id: ObjectId('5c9d218960ccc21b905f8069'), business: ObjectId('5c9d155106dd7215cee1da4c'), geometry: { coordinates: [ 25, 25 ] } }  unknown GeoJSON type: { coordinates: [ 25, 25 ] }
    at Function.create (/home/alejandro/Proyectos/localeco/backend/node_modules/mongodb-core/lib/error.js:43:12)
    at toError (/home/alejandro/Proyectos/localeco/backend/node_modules/mongodb/lib/utils.js:149:22)
    at coll.s.topology.update (/home/alejandro/Proyectos/localeco/backend/node_modules/mongodb/lib/operations/collection_ops.js:1465:39)
    at handler (/home/alejandro/Proyectos/localeco/backend/node_modules/mongodb-core/lib/topologies/replset.js:1155:22)
    at /home/alejandro/Proyectos/localeco/backend/node_modules/mongodb-core/lib/connection/pool.js:397:18
    at process._tickCallback (internal/process/next_tick.js:61:11)
  driver: true,
  name: 'MongoError',
  index: 0,
  code: 16755,
  errmsg:
   'Can\'t extract geo keys: { _id: ObjectId(\'5c9d218960ccc21b905f8069\'), business: ObjectId(\'5c9d155106dd7215cee1da4c\'), geometry: { coordinates: [ 25, 25 ] } }  unknown GeoJSON type: { coordinates: [ 25, 25 ] }',
  [Symbol(mongoErrorContextSymbol)]: {} }

It seems that the embedded document is overwritten instead of being merged, so my best workaround was:

    ...
    const locationModel: LocationModel | null = await this.findOne(input.id);

    if (locationModel) {
        const { geometry, ...location } = input;
        locationModel.assign(location);
        Object.assign(locationModel.geometry, geometry);
        await this.persist(locationModel);
    }
    ...

I think there should be a better approach. Is there any guidance on how to work with MongoDB embedded documents?

Add `Reference<T>` wrapper to allow better type checking of m:1 and 1:1 references

Hi,

I noticed that the documentation suggests defining one-to-one and many-to-one references this way:

@Entity()
export class User {
  @PrimaryKey()
  id!: number;

  @Property()
  username: string;

  @OneToOne({ mappedBy: 'credentials' })
  credentials: Credentials;

  constructor(props: Omit<User, 'id'>) {
    this.username = props.username;
    this.credentials = props.credentials;
  }
}

but when I load a user with the entity manager, the credentials property can be a Reference. It causes problems with typing and strict null checking.

user.credentials.some_property can just be null at runtime, but it can't be null from the perspective of my domain model and type system. This problem can be solved if the credentials property is not of the Credentials type directly, but some proxy generalized with the Credentials type. For example:

@Entity()
export class User {
  @PrimaryKey()
  id!: number;

  @Property()
  username: string;

  @OneToOne({ mappedBy: 'credentials' })
  credentials: Reference<Credentials>;

  constructor(props: Omit<User, 'id'>) {
    this.username = props.username;
    this.credentials = new Reference(props.credentials);
  }
}

So if I call user.credentials, it returns a Reference, and to access the credentials I can do something like:

await user.credentials.load()

If the reference is already populated, it will just skip the queries. Is this possible already?

Default naming strategy on @ManyToMany() generated tables when @Entity({ collection: "name" }) is defined

I don't know if this behavior changes in the next version 3.0, but currently the default naming strategy does not seem to respect custom entity names, at least on many-to-many relations, when these are defined using @Entity({ collection: "name" }).

Example:

@Entity({ collection: "artists" })
export class ArtistModel {
    @PrimaryKey({ type: "number" })
    public id!: number;

    @ManyToMany({ type: "StudioModel", entity: () => StudioModel, mappedBy: "artists" })
    public studios: Collection<StudioModel>;

    public constructor() {
        this.studios = new Collection(this);
    }
}

export interface ArtistModel extends IEntity<number> { }

@Entity({ collection: "studios" })
export class StudioModel {
    @PrimaryKey({ type: "number" })
    public id!: number;

    @ManyToMany({ type: "ArtistModel", entity: () => ArtistModel, inversedBy: "studios" })
    public artists: Collection<ArtistModel>;

    public constructor() {
        this.artists = new Collection(this);
    }
}

export interface StudioModel extends IEntity<number> { }

The schema generator creates this:

DROP TABLE IF EXISTS `artists`;

CREATE TABLE `artists` (
  `id` int(11) unsigned NOT NULL AUTO_INCREMENT,
  PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;

DROP TABLE IF EXISTS `studios`;

CREATE TABLE `studios` (
  `id` int(11) unsigned NOT NULL AUTO_INCREMENT,
  PRIMARY KEY (`id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;

DROP TABLE IF EXISTS `studio_model_to_artist_model`;

CREATE TABLE `studio_model_to_artist_model` (
  `id` int(11) unsigned NOT NULL AUTO_INCREMENT,
  `studio_model_id` int(11) unsigned NOT NULL,
  `artist_model_id` int(11) unsigned NOT NULL,
  PRIMARY KEY (`id`),
  KEY `studio_model_id` (`studio_model_id`),
  KEY `artist_model_id` (`artist_model_id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;

ALTER TABLE `studio_model_to_artist_model`
  ADD CONSTRAINT `studio_model_to_artist_model_ibfk_1` FOREIGN KEY (`studio_model_id`) REFERENCES `studios` (`id`) ON DELETE CASCADE ON UPDATE CASCADE,
  ADD CONSTRAINT `studio_model_to_artist_model_ibfk_2` FOREIGN KEY (`artist_model_id`) REFERENCES `artists` (`id`) ON DELETE CASCADE ON UPDATE CASCADE;

I think it should create:

DROP TABLE IF EXISTS `studios_to_artists`;

CREATE TABLE `studios_to_artists` (
  `id` int(11) unsigned NOT NULL AUTO_INCREMENT,
  `studios_id` int(11) unsigned NOT NULL,
  `artists_id` int(11) unsigned NOT NULL,
  PRIMARY KEY (`id`),
  KEY `studios_id` (`studios_id`),
  KEY `artists_id` (`artists_id`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8;

ALTER TABLE `studios_to_artists`
  ADD CONSTRAINT `studios_to_artists_ibfk_1` FOREIGN KEY (`studios_id`) REFERENCES `studios` (`id`) ON DELETE CASCADE ON UPDATE CASCADE,
  ADD CONSTRAINT `studios_to_artists_ibfk_2` FOREIGN KEY (`artists_id`) REFERENCES `artists` (`id`) ON DELETE CASCADE ON UPDATE CASCADE;

Assign values generated or calculated by a query to an entity

I'm not sure if this feature is outside the scope of an ORM. Feel free to close the issue directly if so.

I have an entity:

@Entity({ collection: "location" })
export class LocationModel implements EntityLocation {
    @PrimaryKey()
    public _id!: ObjectId;

    @Property()
    public geometry!: EntityGeometry;

    @ManyToOne({ entity: () => BusinessModel })
    public business!: BusinessModel;

    public distance?: number;
}

export interface LocationModel extends IEntity<string> { }

And a query (a MongoDB aggregation) that creates a new calculated value (distance):

public async findNear(coordinates: number[]): Promise<LocationModel[]> {
    const entities: EntityLocation[] = await this.locationRepository.aggregate([
        {
            $geoNear: {
                distanceField: "distance",
                distanceMultiplier: 0.001,
                near: { coordinates, type: "Point" },
                spherical: true,
            },
        },
    ]);

    return entities.map((entity) => this.locationRepository.create(entity));
}

When converting the query result to an entity, the value distance is ignored and I have to handle it manually. Would it be possible to implement something to handle it automatically, something like:

@Entity({ collection: "location" })
export class LocationModel implements EntityLocation {
    @PrimaryKey()
    public _id!: ObjectId;

    @Property()
    public geometry!: EntityGeometry;

    @ManyToOne({ entity: () => BusinessModel })
    public business!: BusinessModel;

    @Property({ persist: false })
    public distance?: number;
}

export interface LocationModel extends IEntity<string> { }

Integrate knex

Integrate knex into QueryBuilder and Connection. This will allow an easier connection pooling implementation and will come in handy when building better schema management support (like computing differential updates).

The knex instance will be used as an SQL client in the newly created AbstractSqlConnection. It will allow executing plain SQL statements (strings) as well as knex query builder instances. The current QueryBuilder class will use this knex client under the hood to build the query. Users will be able to get this configured knex instance and build custom queries with knex directly.

It should allow us to simplify the schema generator and remove some parts of the Platform implementations.

This will probably bring some breaking changes. Follow up to #56.
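A sketch of the kind of direct knex access this enables (assuming the configured instance is exposed on the SQL EntityManager as getKnex()):

// get the configured knex instance and build a custom query with it
const knex = em.getKnex();
const res = await knex.select('*').from('book').where('id', 1);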

Suggestion for debug in version 3.0

I think that, taking advantage of the next major version, it could be interesting to replace the current debug configuration:

logger: console.log.bind(console),
debug: true,

and substitute it with the debug module, which would permit filtering the output easily:

DEBUG=mikro-orm:query

PS: This one is more opinionated, but there is now a MySQL connector that seems more official, mariadb (developed by MariaDB, and the benchmarks look good).

Cross-database joins and cross-schema joins

I'm pretty new to mikro-orm and I don't really know whether or not mikro-orm supports cross-database joins and cross-schema joins. In TypeORM, you can do something like this:

course_repo.createQueryBuilder('course')
  .select('course.name')
  .addSelect('student.name')
  .innerJoin("someDatabase.student", 'student', 'student.id = course.student_id')
  .getRawMany()

Implement `QueryBuilder.leftJoin()`

orm.em.createQueryBuilder(Publisher, 'p')
  .select(['p.*', 'b.*', 'a.*'])
  .leftJoin('p.books', 'b')
  .leftJoin('b.author', 'a')
  .where({ 'p.name': 'test 123', 'b.name': /3$/ });

Add support for Map collections or composite keys.

I have a table definition which has a composite primary key and was mapped in Java as a CollectionOfElements.

Example:
Supplier: { id, name, created_at, updated_at }
SupplierConfiguration: {supplier_id, property_key, property_value}

The composite key is supplier_id and property_key, and in Hibernate this is managed as a Map<String, String> using the annotations below.

@CollectionOfElements(fetch = FetchType.EAGER)
@JoinTable(name = "supplier_configuration", joinColumns = @JoinColumn(name = "supplier_id"))
@MapKey(columns = @Column(name = "property_key"))
@Column(name = "property_value")
public Map<String, String> getConfiguration() {
  return configuration;
}

Support for entities globs instead of entitiesDirs

Internally, mikro-orm seems to be doing a lot of work to resolve and validate the entity directories, only to then turn them into a glob for addExistingSourceFiles.

It seems like we could cut a lot of that out by just going all in on globs, and if we have something without magic (["entities"]), just throw a "/*" on the end when reading the config.

A use case for me is with Nest.js. I'd like to just include any file in my workspace that ends in .entity.ts, but right now I'm kinda required to shove them into an entities subdirectory in the respective module's folder and use a glob:

{ entitiesDir: "./**/entities/"}
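With full glob support, the config could then look something like this (the path is illustrative):

{ entitiesDirs: ["./src/**/*.entity.ts"] }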

TypeError: Cannot read property 'getInstanceProperties' of undefined

Try to create entities in files with same name but in different folders, like:

  • category/entities.ts
  • product/entities.ts
  • user/entities.ts

and you'll get the following error when initializing:

UnhandledPromiseRejectionWarning: TypeError: Cannot read property 'getInstanceProperties' of undefined

Investigating a bit, I found that TypeScriptMetadataProvider creates a map of source files keyed by filename, causing some kind of conflict for the above-mentioned case where multiple files have the same name but live in different folders.

Support cascade merging detached entity

When you clear the identity map and want to merge one of the detached entities so you can work with it, the cascade merge mechanism does not correctly merge all of its associations.

Lifecycle hooks not run if defined on base entity

I was trying to set up my base entity to have @BeforeCreate and @BeforeUpdate hooks on it to handle setting some common fields (id, created, createdBy, updated, updatedBy, etc.), but they seem to be ignored. They work fine if I put them on one of my main entities, just not on the base one I extend from.

Is this possible or is there a better way to define this functionality once and reuse it?
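For context, a minimal sketch of the setup being described (names and fields are illustrative):

export abstract class BaseEntity {

  @PrimaryKey()
  id!: number;

  @Property()
  created?: Date;

  @BeforeCreate()
  setCreated() {
    this.created = new Date(); // expected to run for subclasses as well
  }

}

@Entity()
export class Book extends BaseEntity {

  @Property()
  title!: string;

}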
