arsenyyankovsky / typeorm-aurora-data-api-driver
A bridge between TypeORM and Aurora Data API
License: MIT License
I tried my best to find documentation regarding schema migrations, but couldn't find any information about it.
Does this lib support the schema migrations provided by TypeORM? Is there any documentation?
Hi,
Glad to see typeorm has been released with the PR for this.
I'm running into an issue that I'm not sure how to overcome. I installed this driver in my project and got a test Lambda up and running. The lambda produces this error:
2019-09-16T16:14:02.831Z 67f1cd3a-b3b5-4ca8-abe1-ad8fcfa50d52 { Error: Cannot find module '/var/task/node_modules/typeorm-aurora-data-api-driver'
at Function.Module._resolveFilename (module.js:547:15)
at Function.Module._load (module.js:474:25)
at Module.require (module.js:596:17)
at require (internal/module.js:11:18)
at Function.PlatformTools.load (/var/task/node_modules/typeorm/platform/PlatformTools.js:112:24)
at AuroraDataApiDriver.loadDependencies (/var/task/node_modules/typeorm/driver/aurora-data-api/AuroraDataApiDriver.js:664:60)
at new AuroraDataApiDriver (/var/task/node_modules/typeorm/driver/aurora-data-api/AuroraDataApiDriver.js:237:14)
at DriverFactory.create (/var/task/node_modules/typeorm/driver/DriverFactory.js:56:24)
at new Connection (/var/task/node_modules/typeorm/connection/Connection.js:52:59)
at ConnectionManager.create (/var/task/node_modules/typeorm/connection/ConnectionManager.js:56:26) code: 'MODULE_NOT_FOUND' }
After inspecting my node_modules directory I can see typeorm-aurora-data-api-driver in there, but with no code inside? I would have expected a bin folder or index.js or something. Is this correct?
While researching TypeORM with AWS' Data API, I ran into the below error (simple serverless Lambda with TypeScript). typeorm-aurora-data-api-driver is not present in the zipped Lambda sources.
I am not sure whether it is related to a similar closed issue or a configuration error on my side.
{
"errorType": "Error",
"errorMessage": "Cannot find module '/var/task/node_modules/typeorm-aurora-data-api-driver'\nRequire stack:\n- /var/task/node_modules/typeorm/platform/PlatformTools.js\n- /var/task/node_modules/typeorm/driver/sqlserver/SqlServerDriver.js\n- /var/task/node_modules/typeorm/migration/MigrationExecutor.js\n- /var/task/node_modules/typeorm/connection/Connection.js\n- /var/task/node_modules/typeorm/connection/ConnectionManager.js\n- /var/task/node_modules/typeorm/index.js\n- /var/task/src/functions/hello/handler.js\n- /var/runtime/UserFunction.js\n- /var/runtime/index.js",
"code": "MODULE_NOT_FOUND",
"requireStack": [
"/var/task/node_modules/typeorm/platform/PlatformTools.js",
"/var/task/node_modules/typeorm/driver/sqlserver/SqlServerDriver.js",
"/var/task/node_modules/typeorm/migration/MigrationExecutor.js",
"/var/task/node_modules/typeorm/connection/Connection.js",
"/var/task/node_modules/typeorm/connection/ConnectionManager.js",
"/var/task/node_modules/typeorm/index.js",
"/var/task/src/functions/hello/handler.js",
"/var/runtime/UserFunction.js",
"/var/runtime/index.js"
],
"stack": [
"Error: Cannot find module '/var/task/node_modules/typeorm-aurora-data-api-driver'",
"Require stack:",
"- /var/task/node_modules/typeorm/platform/PlatformTools.js",
"- /var/task/node_modules/typeorm/driver/sqlserver/SqlServerDriver.js",
"- /var/task/node_modules/typeorm/migration/MigrationExecutor.js",
"- /var/task/node_modules/typeorm/connection/Connection.js",
"- /var/task/node_modules/typeorm/connection/ConnectionManager.js",
"- /var/task/node_modules/typeorm/index.js",
"- /var/task/src/functions/hello/handler.js",
"- /var/runtime/UserFunction.js",
"- /var/runtime/index.js",
" at Function.Module._resolveFilename (internal/modules/cjs/loader.js:831:15)",
" at Function.Module._load (internal/modules/cjs/loader.js:687:27)",
" at Module.require (internal/modules/cjs/loader.js:903:19)",
" at require (internal/modules/cjs/helpers.js:74:18)",
" at Function.PlatformTools.load (/var/task/src/platform/PlatformTools.ts:130:20)",
" at AuroraDataApiPostgresDriver.loadDependencies (/var/task/src/driver/aurora-data-api-pg/AuroraDataApiPostgresDriver.ts:104:17)",
" at new AuroraDataApiPostgresDriver (/var/task/src/driver/aurora-data-api-pg/AuroraDataApiPostgresDriver.ts:56:14)",
" at DriverFactory.create (/var/task/src/driver/DriverFactory.ts:65:24)",
" at new Connection (/var/task/src/connection/Connection.ts:128:43)",
" at ConnectionManager.create (/var/task/src/connection/ConnectionManager.ts:64:28)"
]
}
This is the webpack config:
const path = require('path');
const slsw = require('serverless-webpack');
const nodeExternals = require('webpack-node-externals');
const TsconfigPathsPlugin = require('tsconfig-paths-webpack-plugin');
module.exports = {
context: __dirname,
mode: slsw.lib.webpack.isLocal ? 'development' : 'production',
entry: slsw.lib.entries,
devtool: slsw.lib.webpack.isLocal ? 'eval-cheap-module-source-map' : 'source-map',
resolve: {
extensions: ['.mjs', '.json', '.ts'],
symlinks: false,
cacheWithContext: false,
plugins: [
new TsconfigPathsPlugin({
configFile: './tsconfig.paths.json',
}),
],
},
output: {
libraryTarget: 'commonjs',
path: path.join(__dirname, '.webpack'),
filename: '[name].js',
},
optimization: {
concatenateModules: false,
},
target: 'node',
externals: [nodeExternals()],
module: {
rules: [
// all files with a `.ts` or `.tsx` extension will be handled by `ts-loader`
{
test: /\.(tsx?)$/,
loader: 'ts-loader',
exclude: [
[
path.resolve(__dirname, 'node_modules'),
path.resolve(__dirname, '.serverless'),
path.resolve(__dirname, '.webpack'),
],
],
options: {
transpileOnly: true,
experimentalWatchApi: true,
},
},
],
},
plugins: [],
};
I'd appreciate any hint.
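One possible cause, judging from the config above: webpack-node-externals keeps every node_modules package out of the bundle, while the Lambda package may not include node_modules at all. A hedged sketch (option names depend on the webpack-node-externals version) of allowlisting the driver so webpack bundles it into the artifact:

```typescript
// Sketch only, assuming serverless-webpack packaging. nodeExternals()
// excludes all of node_modules from the bundle; allowlisting the driver
// makes webpack inline it into the deployed artifact instead.
// The option is named `allowlist` in webpack-node-externals v2+
// (`whitelist` in v1).
const nodeExternals = require('webpack-node-externals');

module.exports = {
  // ...rest of the config above unchanged...
  externals: [nodeExternals({ allowlist: ['typeorm-aurora-data-api-driver'] })],
};
```

Alternatively, packaging node_modules alongside the bundle (e.g. via serverless-webpack's includeModules option) should also make the module resolvable at /var/task/node_modules.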
Thanks for making this available. Was so pleased I'd chosen an ORM when investigating using aurora serverless!
I see that the actual db client package is contained in another package (https://github.com/ArsenyYankovsky/typeorm-aurora-data-api-driver/blob/master/package.json#L104)
As this would be a potentially sensitive part of the project, I'd much prefer to get some visibility here. Is this possible, perhaps?
I am using
I am getting the error "ERROR: operator does not exist: uuid = character varying" when I try to find or save a record. If I downgrade the driver to "2.0.0" I don't get this error, but then I start getting errors about enum type casting. The error is "ERROR: column "mediaType" is of type medias_mediatype_enum but expression is of type character varying".
It worked fine in a Node and Express project but started causing issues in a Serverless project using TypeScript and webpack.
I have set the formatOptions in connection options as
formatOptions: {
// additional format options to pass to the Data API client
castParameters: true,
}
This is my Root Entity
import {
Column,
CreateDateColumn,
DeleteDateColumn,
ManyToOne,
PrimaryGeneratedColumn,
} from 'typeorm';
import { Brand, Nullable } from '../types';
import { EntityUsers } from './users.entity';
export abstract class RootEntity<X extends Brand<string, string>> {
@PrimaryGeneratedColumn('uuid')
id!: X;
@ManyToOne('EntityUsers')
createdBy!: EntityUsers;
@Column({ nullable: false })
createdById!: EntityUsers['id'];
@CreateDateColumn()
createdAt!: Date;
@DeleteDateColumn()
deletedDate!: Date;
}
export interface IRootHistory<
U extends Brand<string, string>,
T extends RootEntity<U>
> {
recordId: U;
}
Here is my actual entity:
import { Column, Entity, PrimaryGeneratedColumn, VersionColumn } from 'typeorm';
import { ONE_LINK_STAN_LENGTH } from '../constants/db-constants';
import { Brand } from '../types';
import { RootEntity } from './root-entity';
export type EntityOneLinkStanCountersId = Brand<
string,
'OneLinkStanCountersId'
>;
abstract class EntityOneLinkStanCountersBase {
@PrimaryGeneratedColumn('uuid')
id!: EntityOneLinkStanCountersId;
@Column({ type: 'date', nullable: false })
date!: Date;
@Column({ type: 'integer', width: ONE_LINK_STAN_LENGTH, nullable: false })
counter!: number;
}
@Entity('oneLinkStanCounters')
export class EntityOneLinkStanCounters extends EntityOneLinkStanCountersBase {
@VersionColumn()
version!: number;
}
This is how I am trying to get the data:
import { getRepository } from 'typeorm';
const StanRepository = getRepository(EntityOneLinkStanCounters);
const stan = await StanRepository.findOne({ id });
if (!stan) return null;
return stan;
First off, thank you for creating this! It's a massive help.
I was wondering if we need to do await connection.close()
as we normally would with socket connections. Or does it have no effect?
I've been trying to track down an answer to my issue all day. When running a simple query (looking up a user by id, for instance) I am getting this error:
I just looked at the AWS JS SDK docs and noticed there is no param type that directly correlates to uuid.
There is also this note in the AWS Data API general docs.
So my question is: can this really not handle querying with a uuid?
When my transaction times out, it needs to continue running.
Please let me know if this is the right place for general discussions.
I am curious about the reason behind requiring the region option when resourceArn has that information embedded already.
Is it required separately by some other component in TypeORM or the AWS SDK? If so, what are the implications of specifying a region different from the one embedded in resourceArn?
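For context, the region is indeed embedded in the ARN; a minimal sketch (a hypothetical helper, not part of the driver) of pulling it out:

```typescript
// An ARN has the shape arn:partition:service:region:account-id:resource,
// so the region is the fourth colon-separated field.
function regionFromArn(arn: string): string | undefined {
  const parts = arn.split(':');
  const region = parts[3];
  return region && region.length > 0 ? region : undefined;
}
```

Whether the driver could default region from resourceArn this way is a question for the maintainers; the underlying AWS SDK client still expects a region to be configured explicitly.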
I have several columns that use timestamptz; however, TypeORM (or the driver, or AWS, I really don't know) returns the value without the 'Z' that represents the correct timezone. The value is treated as a string type and not as a Date type (the normal behaviour), due to (I guess) parsing problems.
PLEASE SEE this issue from local-data-api for more information:
@koxudaxi and I are trying to investigate, but we need more help...
I've made a test repo to try the strange (error???) behaviour. You can find it here.
After hours of trial and error I've finally discovered another issue (sorry!) that is a bit dangerous, since it was not discovered even in the older versions (pre 0.6).
The value of timestamptz (timestamp with time zone) (see this) is trimmed after 3 digits (of the ms), removing the 'Z' of the timezone.
In JavaScript, trying to parse the value without the 'Z' results in a 'string' type rather than a 'Date' type. However, this causes undefined behaviour or parsing exceptions in other languages.
NB: This problem affects both working and non-working versions of docker compose.
I've updated the test repo with new testing procedures (not only for timestamptz): https://github.com/carlocorradini/local-data-api-issue
PS: timestamp works!
Any idea or suggestion will be super appreciated!!!
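As a client-side workaround sketch (assuming the API really does return a UTC wall-clock string with the 'Z' stripped, as described above), the designator can be re-attached before parsing:

```typescript
// Hypothetical helper: turn '2021-03-01 10:20:30.123' (UTC, but with no
// zone designator) back into a proper Date by restoring the 'T' and 'Z'.
function parseUtcTimestamp(raw: string): Date {
  const iso = raw.replace(' ', 'T');
  return new Date(iso.endsWith('Z') ? iso : iso + 'Z');
}
```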
Hi,
Since the upgrade from 0.2.26 to 0.2.32, we get an error when trying to insert a simple-json field.
See these logs:
INSERT INTO `survey_question_user_answer`(`id`, `position`, `questionAnswerId`, `questionId`, `userSubmissionId`) VALUES (DEFAULT, :param_0, DEFAULT, :param_1, :param_2) -- PARAMETERS: [{"name":"param_0","value":"{\"top\":18.149767019916705,\"left\":43.66928614606803}","cast":"JSON"},{"name":"param_1","value":34},{"name":"param_2","value":1490}]
BadRequestException: Database error code: 1064. Message: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'JSON), DEFAULT, 34, 1490)' at line 1
I think it's related to commit 2272929, where the value is transformed before being persisted.
Is there a workaround to roll back to the previous behavior?
Thanks a lot!
Package versions and database engine type (please complete the following information):
Describe the bug
When connecting to local data api it results in
HTTPEndPoint not enabled for arn:aws:rds:us-east-1:1234578123456789012:cluster:dummy
To Reproduce
Please provide the code which reproduces an issue or as much information as you can. Thank you.
{
type: 'aurora-data-api',
secretArn: ARN_VALUE,
resourceArn: RESOURCE_ARN_VALUE,
database: DB_NAME,
region: REGION,
//rest of configs
serviceConfigOptions: {
endpoint: 'http://127.0.0.1:8080',
credentials: {
accessKeyId: 'dummy',
secretAccessKey: 'dummy'
}
}
}
this was based on #34
I tried with and without the credentials object, and it is reproducible whether the Nest service is running as a container or on the host machine.
Depending on how we can get this setup in the data-api-client
library it would be nice to be able to enable this feature here as well.
You can find the docs for wrapping AWS clients here: https://www.npmjs.com/package/aws-xray-sdk-core#user-content-capture-all-outgoing-aws-requests
Does this driver support passing configuration options via env variables?
e.g. TYPEORM_SECRET_ARN
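I don't believe the driver reads such variables itself, but as a sketch (variable names such as TYPEORM_SECRET_ARN are hypothetical here, mirroring TypeORM's TYPEORM_* convention), the mapping can be wired up by hand before calling createConnection:

```typescript
// Build the driver's connection options from environment variables.
// The option shape matches the connection objects shown elsewhere
// in this thread; spread the result into createConnection().
function optionsFromEnv(env: Record<string, string | undefined>) {
  return {
    type: 'aurora-data-api' as const,
    database: env.TYPEORM_DATABASE,
    secretArn: env.TYPEORM_SECRET_ARN,
    resourceArn: env.TYPEORM_RESOURCE_ARN,
    region: env.TYPEORM_REGION,
  };
}
```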
I think aws-sdk should be added as a dependency and not a devDependency.
Deprecated as of 10.7.0. highlight(lang, code, ...args) has been deprecated.
Deprecated as of 10.7.0. Please use highlight(code, options) instead.
https://github.com/highlightjs/highlight.js/issues/2277
Package versions and database engine type:
Describe the bug
Hello, thank you for creating this amazing driver. It's very helpful.
I am referring to this issue for maintaining a connection between Aurora Serverless and Lambda.
I am using the Serverless Framework as an infrastructure-as-code tool. The issue I am facing is that the first API call works properly, but on the second API call I get the following error:
AlreadyHasActiveConnectionError: Cannot create a new connection named "default", because connection with such name already exist and it now has an active connection session.
at AlreadyHasActiveConnectionError.TypeORMError [as constructor]
To Reproduce
The following is my db.ts, where I have written all the logic for maintaining the connection, with reference to the above issue.
import "reflect-metadata";
import { Todo } from "src/entity/Todo";
import {
Connection,
ConnectionOptions,
createConnection,
EntityTarget,
} from "typeorm";
let cachedConnection: Connection;
const getConnection = async () => {
if (cachedConnection) {
return cachedConnection;
}
const secretArn: string = process.env.AWS_SECRET_ARN;
const resourceArn: string = process.env.AWS_RESOURCE_ARN;
const region: string = process.env.AWS_REGION;
const connectionOptions: ConnectionOptions = {
type: "aurora-data-api-pg",
database: "todoapp",
secretArn,
resourceArn,
region,
};
cachedConnection = await createConnection({
...connectionOptions,
entities: [Todo],
});
return cachedConnection;
};
export const getData = async (Model: EntityTarget<unknown>) => {
try {
const connection = await getConnection();
const modelRepository = connection.getRepository(Model);
const data = await modelRepository.find();
return data;
} catch (error) {
throw error;
}
};
export const addData = async (Model: EntityTarget<unknown>, data: unknown) => {
try {
const connection = await getConnection();
const modelRepository = connection.getRepository(Model);
await modelRepository.save(data);
} catch (error) {
throw error;
}
};
I am just calling these getData and addData methods from my Lambda functions as follows:
function1:
await addData(Todo, data);
function2:
const todos = await getData(Todo);
Note: I am using the serverless-offline plugin to run the code offline. (Not sure if this may be causing the issue.)
What I have seen is that cachedConnection becomes null on a second, different API call, though it shouldn't.
Can you please check what I am doing wrong and how I can resolve this issue?
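One pattern that may help (a sketch, not the library's API): cache the connection *promise* rather than the resolved connection, so two concurrent cold invocations can't both pass the `if (cachedConnection)` check and call createConnection twice. The generic shape:

```typescript
// Generic "create once, reuse forever" wrapper around an async factory.
// The promise is stored synchronously, so a second caller that arrives
// before the first connect finishes still receives the same promise
// instead of triggering a second createConnection.
function memoizeAsync<T>(factory: () => Promise<T>): () => Promise<T> {
  let cached: Promise<T> | undefined;
  return () => {
    if (!cached) cached = factory();
    return cached;
  };
}
```

In the db.ts above, getConnection could be `memoizeAsync(async () => createConnection({ ... }))`; with serverless-offline it is also worth checking that the module holding the cache isn't being re-evaluated on every request.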
Package versions and database engine type (please complete the following information):
Whenever I try to execute any TypeORM query I get this error from aws-sdk: "BadRequestException: FATAL: password authentication failed for user "masteruser"".
The code breaks at user = await userRepo.findOne({ email });
Hello,
First of all, thanks a lot for the library, I haven't seen anything like typeorm-aurora-data-api-driver with support for Aurora Serverless PostgreSQL. Thanks for working hard and implementing this solution.
I have a question regarding the Date type. I've tried to set up a model with type: Date but received the following error:
@Column({ type: 'date', nullable: true })
date: string;
"ERROR: column "date" is of type date but expression is of type character varying\n Hint: You will need to rewrite or cast the expression."
I tried to send a value in "2020-01-01" string format.
query: INSERT INTO "entitry"("id", "date") VALUES (:param_1, :param_2) -- PARAMETERS: [{"param_1":1,"param_2":"2020-10-17"}]
The same code works fine on postgresql started in a docker container. I also tried to create an entity manually in the AWS query editor, and it also worked with a 'YYYY-MM-DD' string. After this I tried to update an already existing entity using TypeORM, and if I provide the same date in 'YYYY-MM-DD' format it works (TypeORM checks which fields should be updated and excluded the 'date' field in that case). How does TypeORM know the field has the same value in the date column if I provide it in the wrong format/way?
I also looked at tests for typeorm-aurora-data-api-driver and haven't found any for date :)
Thanks a lot for your help.
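Not a fix, but a workaround sketch while the driver's parameter casting is unclear: pre-format the value as the 'YYYY-MM-DD' literal a Postgres date column accepts, using UTC accessors so the result doesn't shift with the machine's timezone:

```typescript
// Format a Date as the 'YYYY-MM-DD' string form a Postgres `date`
// column accepts, independent of the local timezone.
function toDateString(d: Date): string {
  const pad = (n: number) => String(n).padStart(2, '0');
  return `${d.getUTCFullYear()}-${pad(d.getUTCMonth() + 1)}-${pad(d.getUTCDate())}`;
}
```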
Nice lib! Getting an error like this:
DataTypeNotSupportedError: Data type "timestamptz" in "AuthToken.createdAt" is not supported by "aurora-data-api" database.
How come timestamptz is not supported?
Hi,
We're using this package to connect to our Aurora Serverless database cluster. For testing, we use a local Postgres database and everything works fine. The transformParameters in the PostgresQueryTransformer incorrectly assumes that our string (which is a valid UUID) should be cast to type uuid. The column we're querying is in fact a character varying (regular string), as it does not only store UUIDs but also other strings.
Because of this incorrect assumption, we are unable to query our database when the parameter we're using is formatted as a UUID.
Another issue is that we've got a column specified as @Column('json') content: SomeType;
which works fine with a regular Postgres connection, but we're prompted with the following error when trying to insert data: ERROR: column "content" is of type json but expression is of type character varying
TypeORM connections can be consumed as global scope objects like so:
// inside async context
await createConnection({
/* connection details */
});
const user = new User();
user.firstName = "Timber";
await user.save(); // error using this driver but works with native typeorm (see below)
When using this driver, the above code throws this error:
ConnectionNotFoundError: Connection "default" was not found.
While using native typeorm connection to postgresql it works.
This driver also works when calling the useConnection method on the entity like so:
// inside async context
const connection = await createConnection({
/* connection details */
});
Article.useConnection(connection);
const user = new User();
user.firstName = "Timber";
await user.save(); // works with this driver
But ideally, this shouldn't be required?
It would be awesome if it were possible to give an endpoint URL, so you could use this package to connect to a local-data-api in Docker like https://github.com/koxudaxi/local-data-api
We use docker-compose with that docker image for local development purposes but we haven't found a way to connect to it using TypeORM and this driver. Any help is appreciated!
Package versions and database engine type (please complete the following information):
Describe the bug
MySQL supports SET data type, which I also confirmed with raw DDL in AWS console that it works in Aurora Serverless.
But when I synchronize with a SET column, it throws the following error:
DataTypeNotSupportedError: Data type "set" in "testSet" is not supported by "aurora-data-api" database.
To Reproduce
Synchronize with the following entity,
enum TestSet {
  foo = "foo",
  bar = "bar",
}
@Entity()
class Test {
@Column({
type: "set",
enum: TestSet,
nullable: false,
})
testSet!: TestSet[];
}
TypeError: date.getUTCFullYear is not a function
at dateToDateTimeString (/opt/nodejs/node_modules/typeorm-aurora-data-api-driver/dist/typeorm-aurora-data-api-driver.umd.js:943:25)
at PostgresQueryTransformer.preparePersistentValue (/opt/nodejs/node_modules/typeorm-aurora-data-api-driver/dist/typeorm-aurora-data-api-driver.umd.js:1181:32)
at DataApiDriver.preparePersistentValue (/opt/nodejs/node_modules/typeorm-aurora-data-api-driver/dist/typeorm-aurora-data-api-driver.umd.js:1363:42)
at AuroraDataApiPostgresDriver.preparePersistentValue (/opt/nodejs/node_modules/app/node_modules/typeorm/driver/aurora-data-api-pg/AuroraDataApiPostgresDriver.js:72:28)
at /opt/nodejs/node_modules/app/node_modules/typeorm/query-builder/UpdateQueryBuilder.js:365:57
at Array.forEach (<anonymous>)
at /opt/nodejs/node_modules/app/node_modules/typeorm/query-builder/UpdateQueryBuilder.js:353:25
at Array.forEach (<anonymous>)
at UpdateQueryBuilder.createUpdateExpression (/opt/nodejs/node_modules/app/node_modules/typeorm/query-builder/UpdateQueryBuilder.js:347:85)
at UpdateQueryBuilder.getQuery (/opt/nodejs/node_modules/app/node_modules/typeorm/query-builder/UpdateQueryBuilder.js:45:21)
}
When trying to update:
@Column({ nullable: true, type: 'timestamp' })
public scheduledToSendAt?: Date
using
const announcement: IUpdateAnnouncement = {
...data,
updatedAt: new Date(),
}
const announcementRepo = connection.getRepository(Announcement)
await announcementRepo.save(announcement)
Package versions and database engine type (please complete the following information):
Describe the bug
When using an enum, MySQL results in syntax error.
To Reproduce
getRepository(Product).create({ petType: UserType.normal })
UPDATE `product` SET `type` = :param_0 WHERE `id` = :param_1 -- PARAMETERS: [{"name":"param_0","value":"normal","cast":"user_type_enum"},{"name":"param_1","value":"1"}]
[0] BadRequestException: Database error code: 1064. Message: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'user_type) WHERE `externalId` = '1'' at line 1
We quickly fixed the issue by commenting out the line https://github.com/ArsenyYankovsky/typeorm-aurora-data-api-driver/blob/master/src/query-transformer/mysql-query-transformer.ts#L55 :
cast: metadata.enumName || `${metadata.entityMetadata.tableName}_${metadata.databaseName.toLowerCase()}_enum`,
Potentially this would allow running nightly builds and pausing the db when needed.
After installing via npm and attempting to set the type according to the readme, TypeORM throws a TypeScript error with an unknown type aurora-data-api.
TS2322: Type '"aurora-data-api"' is not assignable to type
Any tips on adding the driver as an acceptable type to typeorm?
Hey, thanks for writing this. I'm sure it's going to be a hit with all serverless projects.
I'm trying to get it set up. I assume I need to npm install the typeorm branch with support for this driver as well? How do I do that?
I tried with
npm install --save typeorm@ArsenyYankovsky/typeorm#feature/aurora-data-api
but that fails with:
npm ERR! path <OMITTED>/node_modules/typeorm/cli.js
npm ERR! code ENOENT
npm ERR! errno -2
npm ERR! syscall chmod
npm ERR! enoent ENOENT: no such file or directory, chmod ' <OMITTED>/node_modules/typeorm/cli.js'
npm ERR! enoent This is related to npm not being able to find a file.
npm ERR! enoent
When an entity has uuid as a primary column
either:
@Entity()
export class Organization {
@PrimaryGeneratedColumn()
@Generated('uuid')
id!: string;
or:
@Entity()
export class ExampleEntity {
@PrimaryGeneratedColumn('uuid')
id!: string;
...
then the following:
const connection = await createConnection({
type: 'aurora-data-api-pg',
entities: [ExampleEntity],
...
});
fails with:
2020-06-26T19:49:40.638Z 16e2936c-2dce-40bb-bdc6-0c01b2745e06 ERROR Invoke Error {"errorType":"TypeError","errorMessage":"Cannot read property 'connect' of undefined","stack":["TypeError: Cannot read property 'connect' of undefined"," at /var/task/node_modules/typeorm/driver/postgres/PostgresDriver.js:285:42"," at new Promise (<anonymous>)"," at /var/task/node_modules/typeorm/driver/postgres/PostgresDriver.js:284:40"," at Array.map (<anonymous>)"," at AuroraDataApiPostgresDriver.<anonymous> (/var/task/node_modules/typeorm/driver/postgres/PostgresDriver.js:283:103)"," at step (/var/task/node_modules/tslib/tslib.js:141:27)"," at Object.next (/var/task/node_modules/tslib/tslib.js:122:57)"," at /var/task/node_modules/tslib/tslib.js:115:75"," at new Promise (<anonymous>)"," at Object.__awaiter (/var/task/node_modules/tslib/tslib.js:111:16)"]}
Removing the 'uuid' allows createConnection to succeed.
If I use:
const connection = await createConnection({
type: 'postgres',
entities: [ExampleEntity],
...
});
with 'uuid' then createConnection also succeeds.
Package versions and database engine type (please complete the following information):
Describe the bug
Code below makes an invalid request to DataApi
await Article.findOneOrFail({
where: {
id: "e279da6d-e0dc-4fce-9027-297ba928d4b3",
},
});
{ "message": ERROR: operator does not exist: uuid = character varying\n Hint: No operator matches the given name and argument type(s). You might need to add explicit type casts.\n Position: [...]",
"code": "BadRequestException",
[...]
}
Probably introduced by #77.
Hey,
I am fairly confident I have found a bug, as I have seen this issue crop up in two different scenarios now. When running against my local MySQL docker instance I have no issues, but when running against Aurora Serverless MySQL I have encountered the error:
"'param_1'' is an invalid type"
(well, param_X)
This is my log:
INFO query: SELECT `NewCompany`.`id` AS `NewCompany_id`, `NewCompany`.`name` AS `NewCompany_name`, `NewCompany`.`websiteUrl` AS `NewCompany_websiteUrl`, `NewCompany`.`numberOfEmployees` AS `NewCompany_numberOfEmployees`, `NewCompany`.`suggestedCategories` AS `NewCompany_suggestedCategories`, `NewCompany__existingCategories`.`updatedAt` AS `NewCompany__existingCategories_updatedAt`, `NewCompany__existingCategories`.`createdAt` AS `NewCompany__existingCategories_createdAt`, `NewCompany__existingCategories`.`version` AS `NewCompany__existingCategories_version`, `NewCompany__existingCategories`.`id` AS `NewCompany__existingCategories_id`, `NewCompany__existingCategories`.`name` AS `NewCompany__existingCategories_name`, `NewCompany__existingCategories`.`parentId` AS `NewCompany__existingCategories_parentId` FROM `new_company` `NewCompany` LEFT JOIN `new_company_existing_categories_company_category` `NewCompany_NewCompany__existingCategories` ON `NewCompany_NewCompany__existingCategories`.`newCompanyId`=`NewCompany`.`id` LEFT JOIN `company_category` `NewCompany__existingCategories` ON `NewCompany__existingCategories`.`id`=`NewCompany_NewCompany__existingCategories`.`companyCategoryId` WHERE (`NewCompany`.`id` IN (:param_0)) AND `NewCompany`.`id` IN (:param_1) -- PARAMETERS: [
{
"param_0": "ccca5f74-5dbd-485c-86a1-925622aeea61",
"param_1": [
"ccca5f74-5dbd-485c-86a1-925622aeea61"
]
}
I am going to have to try and find some time to reproduce an example for diagnosis. However, the original time I saw this error the cause was rather obvious: two parameters were being passed the wrong way around. It's worth noting that I am using this in conjunction with typeorm-transactional-cls-hooked.
I understand this isn't a very good issue report but I wanted to at least mention it in the event this is something you have come across yourself.
I am using this driver in production so I need to resolve this as soon as possible :)
Versions:
"typeorm": "^0.2.19",
"typeorm-aurora-data-api-driver": "^1.1.4",
"typeorm-transactional-cls-hooked": "^0.1.8"
Hi @ArsenyYankovsky, I know this is not an issue, but I've been searching for many hours, maybe you have an idea
So, I'm using:
I'm connecting to the db as described on GitHub:
const connection = await createConnection({
type: 'aurora-data-api-pg',
database: 'test-db',
secretArn: 'arn:aws:secretsmanager:eu-west-1:537011205135:secret:xxxxxx/xxxxxx/xxxxxx',
resourceArn: 'arn:aws:rds:eu-west-1:xxxxx:xxxxxx:xxxxxx',
region: 'eu-west-1'
})
And this is how I use it inside of my Lambda function
export const testConfiguration: APIGatewayProxyHandler = async (event, _context) => {
let response;
try {
const connectionOptions: ConnectionOptions = await getConnectionOptions();
const connection = await createConnection({
...connectionOptions,
entities,
});
const userRepository = connection.getRepository(User);
const users = await userRepository.find();
response = {
statusCode: 200,
body: JSON.stringify({ users }),
};
} catch (e) {
response = {
statusCode: 500,
body: JSON.stringify({ error: 'server side error' }),
};
}
return response;
};
When I execute it the first time it works just fine.
But the second and subsequent times I'm getting an error:
AlreadyHasActiveConnectionError: Cannot create a new connection named "default", because connection with such name already exist and it now has an active connection session.
So, what is the proper way to manage this connection?
Should it be somehow reused?
I've found some resolutions for simple RDS but the whole point of Aurora Serverless Data API is that you don't have to manage the connection
So I have a FK column with uuid type.
And when I'm trying to do:
entity.foreignId = "valid-uuid-formatted-string"
await repository.save(entity)
I'm getting this:
"code": "BadRequestException",
"message": "ERROR: column \"foreign_id\" is of type uuid but expression is of type character varying\n Hint: You will need to rewrite or cast the expression.
Trying to rewrite it using createQueryBuilder and casts at the moment.
I had a problem inserting values with the timestamp MySQL data type. The JavaScript Date object 2020-05-28T03:56:48.466Z was being transformed to "2020-05-28 05:56:48". (I am in the UTC+2 timezone. The server uses UTC; however, I encountered this issue during local development.)
Then I tried running the script with
$ TZ='UTC' node somescript.js
and 2020-05-28T03:56:48.466Z was correctly transformed to "2020-05-28 03:56:48".
I think the problem might be that aws-sdk does not communicate my system timezone to Aurora Data API and expects UTC.
Related source code: https://github.com/ArsenyYankovsky/data-api-client/blob/support-date/index.js#L189
const formatDate = (date) => {
const padNumber = (number) => number <= 9 ? '0' + number : number
const year = date.getFullYear()
const month = date.getMonth() + 1
const day = date.getDate()
const hour = date.getHours()
const minutes = date.getMinutes()
const seconds = date.getSeconds()
return year + '-' + padNumber(month) + '-' + padNumber(day) + ' ' + padNumber(hour) + ':' + padNumber(minutes) + ':' + padNumber(seconds)
}
Expected: timestamp string sent to Data API should be in UTC and not be related to system timezone.
Suggested fix: use UTC variants of Date methods like Date.prototype.getUTCHours()
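Applying the suggested fix to the snippet above would look roughly like this (a sketch of the proposed change, not the shipped implementation):

```typescript
// Same formatter as the linked data-api-client code, but using the UTC
// variants of the Date getters so the output is timezone-independent.
const formatDateUtc = (date: Date): string => {
  const pad = (n: number): string => (n <= 9 ? '0' + n : String(n));
  return (
    date.getUTCFullYear() + '-' + pad(date.getUTCMonth() + 1) + '-' + pad(date.getUTCDate()) +
    ' ' + pad(date.getUTCHours()) + ':' + pad(date.getUTCMinutes()) + ':' + pad(date.getUTCSeconds())
  );
};
```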
Thanks for this amazing library!
Hey everyone, first of all I want to thank you for this plugin that makes it possible to use Aurora Serverless with serverless Lambdas. Well done! I created a connection manager to handle connections and was wondering about some unusually long Lambda execution times. The reason seems to be the creation of the initial Data API connection, which takes more than 6 seconds. Because Lambdas are stateless, EACH Lambda that connects to the DB (which in a real-world application is a lot, since you have many REST endpoints doing work in a microservice setup) needs that initial "startup time", which isn't acceptable in production. So my question is whether anyone has a solution to this problem.
My connection manager looks like this:
// Database.ts
import {
  Connection,
  createConnection,
  getConnectionManager,
  ConnectionOptions,
} from 'typeorm'
import { User } from '../entities/User'

const connectionManager = getConnectionManager()

export async function getConnection(): Promise<Connection> {
  let connection = null
  if (connectionManager.has('default')) {
    console.log('Re-using connection')
    connection = await connectionManager.get('default')
  } else {
    console.log('Creating new connection') // This takes about 6 seconds
    const connectionOptions: ConnectionOptions =
      process.env.STAGE === 'local'
        ? {
            type: 'mysql',
            host: 'localhost',
            port: 3306,
            username: process.env.DB_USER,
            password: process.env.DB_PASSWORD,
            database: process.env.DB_NAME,
          }
        : {
            type: 'aurora-data-api',
            database: process.env.DB_NAME,
            secretArn: process.env.DB_SECRET_ARN,
            resourceArn: process.env.DB_RESOURCE_ARN,
            region: 'eu-central-1',
          }
    try {
      connection = await createConnection({
        ...connectionOptions,
        entities: [User],
        synchronize: false,
        logging: false,
      })
    } catch (error) {
      console.log(error)
    }
  }
  return connection
}
And then in my services I can do something like this:
import { getRepository } from 'typeorm'
import { getConnection } from './Database'
import { User } from '../entities/User'
[...]
await getConnection()
const repository = getRepository(User)
Really appreciate your help, thanks!
Hey, thank you for the work you put into this project, it looks awesome.
I was wondering if you have performed any benchmarks of this driver versus a default postgres connection w/ pooling, or if AWS has said anything about the data api performance.
Would be happy to assist if not!
continueAfterTimeout is supported by the AWS Aurora Data API as well as by https://github.com/jeremydaly/data-api-client.
I saw this issue (and PR), but it was closed without any workaround or implementation:
#55
I can see that the driver supports querying Aurora Serverless. I need to run TypeORM schema migrations against an Aurora Serverless PostgreSQL database using the AWS Data API from a CI server, e.g. GitLab.
Is there a way to do that with this driver?
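For reference, the TypeORM CLI drives migrations purely from a loadable connection config, so a starting point would be an ormconfig.json combining this driver's connection options with the standard migration keys (a hedged sketch with placeholder values; whether migrations fully work over the Data API is exactly the open question here):

```json
{
  "type": "aurora-data-api",
  "database": "mydb",
  "secretArn": "arn:aws:secretsmanager:eu-central-1:123456789012:secret:dummy",
  "resourceArn": "arn:aws:rds:eu-central-1:123456789012:cluster:dummy",
  "region": "eu-central-1",
  "migrations": ["migrations/*.js"],
  "cli": { "migrationsDir": "migrations" }
}
```

With a config like that in the project root, `npx typeorm migration:run` on the CI server would at least pick up the connection.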
Package versions and database engine type (please complete the following information):
Describe the bug
Column Transformers are not fired at all
To Reproduce
place a transformer on a column field
@Column({
  name: "example",
  transformer: {
    from: (v: string) => {
      return v.trim();
    },
    to: (v) => {
      return v;
    },
  },
})
public example!: string;
Cannot save null values to a MySQL database. Null query parameter values fail to map correctly in normalizeParams, resulting in TypeError: Cannot convert undefined or null to object.
TypeError: Cannot convert undefined or null to object
at Function.keys (<anonymous>)
at app/node_modules/typeorm-aurora-data-api-driver/dist/typeorm-aurora-data-api-driver.umd.js:440
at Array.reduce (<anonymous>)
at normalizeParams (app/node_modules/typeorm-aurora-data-api-driver/dist/typeorm-aurora-data-api-driver.umd.js:437)
at query (app/node_modules/typeorm-aurora-data-api-driver/dist/typeorm-aurora-data-api-driver.umd.js:688)
at Object.query (app/node_modules/typeorm-aurora-data-api-driver/dist/typeorm-aurora-data-api-driver.umd.js:908)
at DataApiDriver.<anonymous> (app/node_modules/typeorm-aurora-data-api-driver/dist/typeorm-aurora-data-api-driver.umd.js:1344)
at step (app/node_modules/typeorm-aurora-data-api-driver/dist/typeorm-aurora-data-api-driver.umd.js:81)
at Object.next (app/node_modules/typeorm-aurora-data-api-driver/dist/typeorm-aurora-data-api-driver.umd.js:62)
at app/node_modules/typeorm-aurora-data-api-driver/dist/typeorm-aurora-data-api-driver.umd.js:55
at new Promise (<anonymous>)
at __awaiter (app/node_modules/typeorm-aurora-data-api-driver/dist/typeorm-aurora-data-api-driver.umd.js:51)
at DataApiDriver.query (app/node_modules/typeorm-aurora-data-api-driver/dist/typeorm-aurora-data-api-driver.umd.js:1336)
at AuroraDataApiQueryRunner.<anonymous> (app/node_modules/typeorm/driver/aurora-data-api/AuroraDataApiQueryRunner.js:174)
at step (app/node_modules/typeorm/node_modules/tslib/tslib.js:141)
at Object.next (app/node_modules/typeorm/node_modules/tslib/tslib.js:122)
First, transformParameters(parameters?: any[]) at src/query-transformer/mysql-query-transformer.ts:127 maps all null (and undefined) parameters to null, and not to an object of the form { name: `param_${index}`, value: parameter }.
Then in normalizeParams in dist/typeorm-aurora-data-api.umd.js:
const normalizeParams = params => params.reduce((acc, p) =>
  Array.isArray(p) ? acc.concat([normalizeParams(p)])
    : (
      (Object.keys(p).length === 2 && p.name && p.value !== 'undefined') ||
      (Object.keys(p).length === 3 && p.name && p.value !== 'undefined' && p.cast)
    ) ? acc.concat(p)
      : acc.concat(splitParams(p))
  , []);
which takes the transformed parameters from above as params. It expects each element to be either an array or an object with two or three keys. When p is null, Object.keys(p) throws TypeError: Cannot convert undefined or null to object. Because the MySQL query transformer's transformParameters returns null for null parameter values, normalizeParams breaks.
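The failure is easy to reproduce in isolation, since Object.keys throws as soon as it is handed null (a minimal sketch of the failure mode, not the driver's actual code path):

```typescript
// Simulate the parameter list normalizeParams receives when a null
// value slips through transformParameters unchanged.
const params: any[] = [{ name: 'param_0', value: 1 }, null]

let message = ''
try {
  // normalizeParams inspects Object.keys(p) for every element,
  // which throws for null and undefined.
  params.forEach((p) => Object.keys(p))
} catch (e) {
  message = (e as Error).message
}

console.log(message) // → Cannot convert undefined or null to object
```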
At src/query-transformer/mysql-query-transformer.ts:127, make the following changes:
protected transformParameters(parameters?: any[]) {
  if (!parameters) {
    return parameters
  }
  const expandedParameters = this.expandArrayParameters(parameters)
  return expandedParameters.map((parameter, index) => {
    if (parameter === undefined) { // << remove parameter === null
      return parameter
    }
    if (typeof parameter === 'object' && parameter !== null && parameter.value) { // << add parameter !== null
      return ({
        name: `param_${index}`,
        ...parameter,
      })
    }
    return {
      name: `param_${index}`,
      value: parameter, // << value here will be null for null parameters (works in my testing)
    }
  })
}
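To sanity-check the patched mapping outside the driver, here is a standalone version of the same logic (expandArrayParameters is omitted for brevity, so this is a simplified sketch rather than the actual method):

```typescript
// Simplified stand-in for the patched transformParameters: null values
// are wrapped like any other scalar instead of being passed through.
const transformParameters = (parameters?: any[]) => {
  if (!parameters) return parameters
  return parameters.map((parameter, index) => {
    if (parameter === undefined) return parameter
    if (typeof parameter === 'object' && parameter !== null && parameter.value) {
      return { name: `param_${index}`, ...parameter }
    }
    // A null parameter lands here and becomes { name, value: null },
    // which normalizeParams can then handle like any other parameter.
    return { name: `param_${index}`, value: parameter }
  })
}

console.log(JSON.stringify(transformParameters(['a', null])))
// → [{"name":"param_0","value":"a"},{"name":"param_1","value":null}]
```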
I've started working on Postgres support for the driver. I managed to get basic queries to work, but most of the tests currently fail. The issue right now is the date string conversion; I've created a thread on the AWS forums here.
I will push the branch later.
I also found that the cold start is much better with Postgres (under 10 seconds) and it doesn't throw an error.
Package versions and database engine type (please complete the following information):
Describe the bug
Entities that have enum column types cause errors when querying the database with the enum column in the WHERE criteria. The error occurs because the enum column criteria value is not cast correctly.
To Reproduce
https://github.com/veho-technologies/typeorm-data-api-enum-issue
The above repo describes and demonstrates the issue.
Example error:
BadRequestException: ERROR: operator does not exist: example_status_enum = character varying
Hint: No operator matches the given name and argument type(s). You might need to add explicit type casts.
Position: 313
at Object.extractError (/Users/chris/projects/veho/seshat-debug/node_modules/aws-sdk/lib/protocol/json.js:52:27)
at Request.extractError (/Users/chris/projects/veho/seshat-debug/node_modules/aws-sdk/lib/protocol/rest_json.js:55:8)
at Request.callListeners (/Users/chris/projects/veho/seshat-debug/node_modules/aws-sdk/lib/sequential_executor.js:106:20)
at Request.emit (/Users/chris/projects/veho/seshat-debug/node_modules/aws-sdk/lib/sequential_executor.js:78:10)
at Request.emit (/Users/chris/projects/veho/seshat-debug/node_modules/aws-sdk/lib/request.js:688:14)
at Request.transition (/Users/chris/projects/veho/seshat-debug/node_modules/aws-sdk/lib/request.js:22:10)
at AcceptorStateMachine.runTo (/Users/chris/projects/veho/seshat-debug/node_modules/aws-sdk/lib/state_machine.js:14:12)
at /Users/chris/projects/veho/seshat-debug/node_modules/aws-sdk/lib/state_machine.js:26:10
at Request.<anonymous> (/Users/chris/projects/veho/seshat-debug/node_modules/aws-sdk/lib/request.js:38:9)
at Request.<anonymous> (/Users/chris/projects/veho/seshat-debug/node_modules/aws-sdk/lib/request.js:690:12) {
code: 'BadRequestException',
time: 2021-05-05T15:46:47.936Z,
requestId: 'xxXxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx',
statusCode: 400,
retryable: false,
retryDelay: 78.57561114326721
}
Firstly - thank you so much for your work on this.
Does this support synchronization and migrations? If not, what's your workflow for performing these, e.g. creating / dropping tables?
Hello, thanks for the driver ;)
I'm facing a parsing error when trying to query Aurora PostgreSQL Serverless:
ERROR: syntax error at or near "."
The query is:
SELECT DISTINCT `distinctAlias`.`User_id` as "ids_User_id" FROM (SELECT `User`.`id` AS `User_id`, `User`.`email` AS `User_email`....
If I replace the backticks with double quotes, the query seems to execute correctly in my test, but I'm not sure how to make it work generally. Any help would be welcome.
Thanks
Package versions and database engine type (please complete the following information):
Describe the bug
Hi, I'm having an issue with .find from getRepository: it always returns no results even though there are results to be shown. I know this is an issue with find, because as soon as I try raw SQL with a simple 'SELECT * FROM cards' I get the correct information.
To Reproduce
I have the following configuration:
Docker Compose:
version: '3.1'

services:
  local-data-api:
    image: koxudaxi/local-data-api
    restart: always
    environment:
      MYSQL_HOST: db
      MYSQL_PORT: 3306
      MYSQL_USER: root
      MYSQL_PASSWORD: example
      RESOURCE_ARN: 'arn:aws:rds:us-east-1:123456789012:cluster:dummy'
      SECRET_ARN: 'arn:aws:secretsmanager:us-east-1:123456789012:secret:dummy'
    ports:
      - "8080:80"
    networks:
      - database
  db:
    image: mysql:5.6
    command: --default-authentication-plugin=mysql_native_password
    restart: always
    environment:
      MYSQL_ROOT_PASSWORD: example
      MYSQL_DATABASE: test
    ports:
      - "3306:3306"
    networks:
      - database

networks:
  database:
    name: database
Then I have the following connection:
import { Connection, createConnection } from 'typeorm';
import { Card } from './entities/card'

export const getConnection = async (): Promise<Connection> => {
  const connection = await createConnection({
    type: 'aurora-data-api',
    database: 'test',
    secretArn: 'arn:aws:secretsmanager:us-east-1:123456789012:secret:dummy',
    resourceArn: 'arn:aws:rds:us-east-1:123456789012:cluster:dummy',
    region: 'eu-west-1',
    entities: [Card],
    serviceConfigOptions: {
      endpoint: 'http://local-data-api:80'
    }
  })
  return connection
}
This entity:
import { Column, Entity, PrimaryGeneratedColumn } from "typeorm";

@Entity({ name: 'cards' })
export class Card {
  @PrimaryGeneratedColumn()
  id: number;

  @Column()
  title: string;

  @Column()
  tablename: string;
}
and finally the code that has the problem is:
import { APIGatewayProxyResult, APIGatewayEvent } from 'aws-lambda';
import { Card } from '../database/entities/card';
import { getConnection } from '../database/db_connection'

export const lambdaHandler = async (req: APIGatewayEvent): Promise<APIGatewayProxyResult> => {
  try {
    let conn = await getConnection();
    let cardRepo = await conn.getRepository(Card);
    let res = await cardRepo.find();
    console.log('res', res); // --- returning [] every time!
    await conn.query('select * from cards'); // the strange part is that if I try with raw SQL like this it works
    return { "headers": { "Content-Type": "application/json", "Access-Control-Allow-Origin": "*" }, "body": JSON.stringify(res), statusCode: 200 }
  } catch (exception) {
    return { body: `{exception: ${exception.stack}}`, statusCode: 500 };
  }
};
The SQL statements I used to create the table were:
CREATE DATABASE test;

CREATE TABLE cards (
  id INT PRIMARY KEY AUTO_INCREMENT,
  tablename varchar(255),
  title varchar(255)
);
Any help is much appreciated, thank you!
I noticed this project depends on a fork of data-api-client instead of the original project, and the fork is now out of date. Any reason not to just depend on the original project?
It seems that enum values are not supported when using Postgres. Maybe even the Data API itself does not support enums in stringValue parameters (I just asked AWS about this).
Similar problem with jsonb type: jeremydaly/data-api-client#34
Workaround:
I managed to get enums working by wrapping the value in a function (which tells TypeORM to pass it as raw SQL, so it avoids parameters).
example:
status: () => `'${Status.PENDING}'`,
// instead of
status: Status.PENDING,
What also works is casting (mentioned in the linked issue):
UPDATE property SET status=:status::property_status_enum
(the enum name in the schema is property_status_enum, with passed query parameters {status: 'pending'})
Casting requires you to use raw queries which kind of goes against typeorm workflow. Could this casting be implemented in this driver or somewhere else? @ArsenyYankovsky