pgtyped's Introduction

Join the chat at https://gitter.im/pgtyped/community

PgTyped makes it possible to use raw SQL in TypeScript with guaranteed type-safety.
There is no need to map or translate your DB schema to TypeScript: PgTyped automatically generates types and interfaces for your SQL queries, using your running Postgres database as the source of type information.


Features:

  1. Automatically generates TS types for parameters/results of SQL queries of any complexity.
  2. Supports extracting and typing queries from both SQL and TS files.
  3. Generate query types as you write them, using watch mode.
  4. Useful parameter interpolation helpers for arrays and objects.
  5. No need to define your DB schema in TypeScript, your running DB is the live source of type data.
  6. Prevents SQL injections by not doing explicit parameter substitution. Instead, queries and parameters are sent separately to the DB driver, allowing parameter substitution to be safely done by the PostgreSQL server.
  7. Native ESM support. Runtime dependencies are also provided as CommonJS.

Documentation

Visit our documentation page at https://pgtyped.dev/

Getting started

  1. npm install -D @pgtyped/cli typescript (typescript is a required peer dependency for pgtyped)
  2. npm install @pgtyped/runtime (@pgtyped/runtime is the only required runtime dependency of pgtyped)
  3. Create a PgTyped config.json file.
  4. Run npx pgtyped -w -c config.json to start PgTyped in watch mode.

More info on getting started can be found in the Getting Started page. You can also refer to the example app for a preconfigured example.

Example

Let's save a query in books.sql:

/* @name FindBookById */
SELECT * FROM books WHERE id = :bookId;

PgTyped parses the SQL file, extracting all queries and generating strictly typed TS queries in books.queries.ts:

/** Types generated for queries found in "books.sql" */

//...

/** 'FindBookById' parameters type */
export interface IFindBookByIdParams {
  bookId: number | null;
}

/** 'FindBookById' return type */
export interface IFindBookByIdResult {
  id: number;
  rank: number | null;
  name: string | null;
  author_id: number | null;
}

/**
 * Query generated from SQL:
 * SELECT * FROM books WHERE id = :bookId
 */
export const findBookById = new PreparedQuery<
  IFindBookByIdParams,
  IFindBookByIdResult
>(...);

Query findBookById is now statically typed, with types inferred from the PostgreSQL schema.
This generated query can be imported and executed as follows:

import { Client } from 'pg';
import { findBookById } from './books.queries';

export const client = new Client({
  host: 'localhost',
  user: 'test',
  password: 'example',
  database: 'test',
});

async function main() {
  await client.connect();
  const books = await findBookById.run(
    {
      bookId: 5,
    },
    client,
  );
  console.log(`Book name: ${books[0].name}`);
  await client.end();
}

main();

Resources

  1. Configuring pgTyped
  2. Writing queries in SQL files
  3. Advanced queries and parameter expansions in SQL files
  4. Writing queries in TS files
  5. Advanced queries and parameter expansions in TS files

Project state:

This project is being actively developed and its APIs might change. All issue reports, feature requests and PRs appreciated.

License

MIT

Copyright (c) 2019-present, Adel Salakh

pgtyped's Issues

Reusing the same parameter in WHERE fails to parse

The problem

Using the same parameter in two places inside a WHERE clause fails to parse. For example:

/*
  @name FindAllQuestions1
*/
SELECT id FROM "Question" WHERE ( ( :category IS NULL ) OR ( "_categoryId" = :category ) );

Results in:

Error in query. Details: {
  errorCode: 'scanner_yyerror',
  hint: undefined,
  message: 'syntax error at or near ":"',
  position: '37'
}

However, if I keep the query almost the same but just remove one :category, it works:

/*
  @name FindAllQuestions2
*/
SELECT id FROM "Question" WHERE ( ( 123 IS NULL ) OR ( "_categoryId" = :category ) );

(No errors)

/*
  @name FindAllQuestions3
*/
SELECT id FROM "Question" WHERE ( ( :category IS NULL ) OR ( "_categoryId" = 'foo' ) );

(This fails with could not determine data type of parameter $1, which makes sense, and it parsed just fine.)

DB Logs

Looking at my database logs I can't help but notice that the first usage of the parameter was not replaced by $1:

db_1  | 2020-06-05 13:47:55.249 UTC [2572] ERROR:  syntax error at or near ":" at character 37
db_1  | 2020-06-05 13:47:55.249 UTC [2572] STATEMENT:  SELECT id FROM "Question" WHERE ( ( :category IS NULL ) OR ( "_categoryId" = $1 ) )

Support for node-postgres QueryConfig

Hey!
This package suite seems very useful and well done, so first of all thanks for the good work!

Since the package seems designed to be used in conjunction with node-postgres, I was wondering why the types are redefined instead of reusing the ones provided by pg.

For example, why don't the processQueryAST/processQueryString functions return the pg type QueryConfig or Submittable? https://github.com/brianc/node-postgres-docs/blob/master/content/api/2-client.mdx#clientqueryconfig-queryconfig-callback-err-error-result-result--void--void

This would allow for better extensibility and integration with the pg client, which is currently wrapped. One example that comes to mind: it is currently not easy to leverage pg's support for prepared statements (which can be enabled simply by adding a name to the QueryConfig).
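
For reference, a named QueryConfig in node-postgres looks roughly like this (the statement name and query text below are illustrative, not pgtyped output):

import { Client } from 'pg';

// Assumes an already connected pg Client; the statement name and text are illustrative.
async function findBook(client: Client, id: number) {
  // Passing a QueryConfig with a `name` lets pg use a server-side prepared statement.
  const result = await client.query({
    name: 'find-book-by-id',
    text: 'SELECT * FROM books WHERE id = $1',
    values: [id],
  });
  return result.rows;
}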

ending commas are not parsed correctly

when you have a table

CREATE TABLE untitled_table (
    id SERIAL PRIMARY KEY,
    a text,
    b text NOT NULL
);

a query like

/*
  @name UpdateA
*/
UPDATE public.untitled_table
SET
  a = :a,
  b = :b
WHERE id = :id;

results in a parse error: Parse error: extraneous input 'a,' expecting ID

you need to add whitespace between the parameter and the comma for it to parse:

UPDATE public.untitled_table
SET
  a = :a ,
  b = :b
WHERE id = :id;

How to Support Dynamic Condition?

let query = knex.select("*").from('users');
if (args.name !== undefined) {
  query = query.where('name', args.name);
}
if (args.age_gt !== undefined) {
  query = query.where('age', '>', args.age_gt);
}
if (args.age_lt !== undefined) {
  query = query.where('age', '<', args.age_lt);
}
const result = await query;

How can we express such logic in pgtyped?
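
One possible workaround (not a pgtyped feature) is to predefine each statically typed variant as its own query and pick between them at runtime. A rough sketch with hypothetical generated queries:

import { Client } from 'pg';
// Hypothetical queries, each written and typed separately by pgtyped:
import { findAllUsers, findUsersByName, findUsersOlderThan } from './users.queries';

async function findUsers(client: Client, args: { name?: string; age_gt?: number }) {
  if (args.name !== undefined) {
    return findUsersByName.run({ name: args.name }, client);
  }
  if (args.age_gt !== undefined) {
    return findUsersOlderThan.run({ age_gt: args.age_gt }, client);
  }
  return findAllUsers.run(undefined, client);
}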

Install as dev dependency

Our Docker images have only production node_modules, without dev ones.
When I try to use @pgtyped/cli as a dev dependency I get a runtime error: Error: Cannot find module 'chalk'. I tried to install @pgtyped/cli as a non-dev dependency and got: Error: Cannot find module 'typescript'.

SSL mode

Add a config option / env variable for SSL mode.

Alternatively, combined with the connection string option from #32, ?sslmode=require should be enough.
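
For illustration, assuming the connection-string option proposed in #32, the requirement could be expressed in one URL (the dbUrl name is hypothetical):

// Hypothetical: a single connection string carrying the sslmode option.
const dbUrl = 'postgres://user:password@localhost:5432/mydb?sslmode=require';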

Incompatible with other SQL comment directives

It seems pgtyped requires a strict SQL comment format, like:

/* @name IUsers */
select * from users;

But adding another comment breaks pgtyped:

/* bla-bla */
/* @name IUsers */
select * from users;
Parsed file:
1 |
2 > /* bla-bla */
3 > /* @name IUsers */
4 | select * from users;
5 |
Errors:
- (2:6) Parse error: token recognition error at: '-b'
- (2:3) Parse error: missing '@name' at 'bla'
- (2:8) Parse error: extraneous input 'la' expecting {'@param', '*/'}
- (3:0) Parse error: mismatched input '/*' expecting {ID, WORD, STRING}

SQL files may contain other comments and even directives from other tools, like https://github.com/xialvjun/ts-sql-plugin, for example:

/* ts-sql-plugin:ignore-cost */
/* @name IUsers */
select * from users;

PgTyped should simply ignore anything it cannot parse.

Should @pgtyped/query depend on Typescript?

I was producing a production build with yarn install --production and noticed @pgtyped/query pulls in Typescript https://github.com/adelsz/pgtyped/blob/master/packages/query/src/index.ts#L13.

This seems odd given query is the runtime dependency. If it really is needed then it should be added to the dependencies of @pgtyped/query; currently it's only available because it's in the devDependencies of the root project.

It looks like the generator is the only thing that uses parseTypescriptFile and parseSQLFile. Perhaps they should be refactored out of query?

[Bug] Array spread query broken in 0.8.0

A previously working query:

export const insertParamUpdate = sql<IInsertParamUpdateQuery>`
    INSERT INTO data.action_log (_chainid, _height, _txindex, _eventindex, _eventname, agreement_address, datetime_param, datetime)
    VALUES
    $$params(_chainid, _height, _txindex, _eventindex, _eventname, agreement_address, datetime_param, datetime);
`;

This query, where the columns are a subset of the table's columns, now throws the following error on generation:

Error in query. Details: {
  errorCode: 'scanner_yyerror',
  hint: undefined,
  message: 'syntax error at or near "("',
  position: '148'
}

I'll follow up with a test case unless it's obvious what has changed.

Wrong nullability of columns calculated from functions

A few examples:

select sum(column) * 12 as hour ...
select to_char(column, 'DD.MM.YYYY HH24:MI') as date ...

In these examples column is nullable, which makes hour and date nullable at runtime too.
But pgtyped generates a non-null type for the column.
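
For example, sum() returns NULL over zero rows (and over all-NULL input), so a safer generated result type for the first query above would be nullable (the interface name is illustrative):

/** Illustrative result type: `hour` can be NULL at runtime, so it should be nullable here. */
export interface IHourSumResult {
  hour: number | null;
}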

PG array types are unsupported

I've set up pgtyped without any problems, but I have a column in my table of type character varying(100)[] or varchar(100)[]. Upon running pgtyped CLI, I get the following error:

Error: Postgres type '_varchar' is not supported by mapping
    at TypeAllocator.use (/home/mark/Work/configurator-sdk/node_modules/@pgtyped/cli/lib/types.js:99:34)
    at /home/mark/Work/configurator-sdk/node_modules/@pgtyped/cli/lib/generator.js:54:36
    at Array.forEach (<anonymous>)
    at /home/mark/Work/configurator-sdk/node_modules/@pgtyped/cli/lib/generator.js:53:21
    at Generator.next (<anonymous>)
    at fulfilled (/home/mark/Work/configurator-sdk/node_modules/@pgtyped/cli/lib/generator.js:5:58)
    at processTicksAndRejections (internal/process/task_queues.js:94:5)

and the column gets a type of never in the resulting .ts file. It would be great to add support for this kind of type in the future!

Columns without attribute data should be nullable

Looks like Postgres doesn't provide pg_attribute information for aggregated/derived columns (ex. #120).
Example: select sum(column) * 12 as hour ...

We need to make sure we generate nullable types for all such columns, even if they're actually non-nullable (Postgres doesn't provide that information).

Treat "null" and "undefined" as interchangeable in query params interface

If I have a value that's of type string | undefined (for example) I need to coerce it to string | null before I can pass it as a query param.

const accessToken: string | undefined = urlParameters.accessToken;
await isAccessTokenValidForFeed.run({
    accessToken: accessToken || null,
    feedId
}, db);

It's true that SQL has very different null semantics than JS but this shouldn't matter for input params.

(Again, happy to put together a PR for this if it's a desirable feature.)
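
Until something like this lands, a small application-side helper can do the coercion in one place. A rough sketch (not part of pgtyped):

// Maps every undefined value in a params object to null before calling run().
function undefinedToNull<T extends Record<string, unknown>>(
  params: T,
): { [K in keyof T]: Exclude<T[K], undefined> | null } {
  const entries = Object.entries(params).map(([key, value]): [string, unknown] => [
    key,
    value ?? null,
  ]);
  return Object.fromEntries(entries) as { [K in keyof T]: Exclude<T[K], undefined> | null };
}

// Usage with the example above:
// await isAccessTokenValidForFeed.run(undefinedToNull({ accessToken, feedId }), db);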

Support enums

This seems like the most obvious/useful place to start with types that need to be pulled in from Postgres. Enum types and their values need to be read from the Postgres catalog and passed along as additional types from getTypes in the query package. These types can be merged in with the type mapping in #51 which can generate a corresponding typescript enum declaration as a TypeAlias and output that in the header of the generated file.

A similar approach can probably be used for composite types.
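
For illustration, such a generated declaration emitted in the header of a query file might look like this (the enum name and values are illustrative, borrowed from the shipping_status example in the type-overrides issue below):

/** Illustrative: a TypeScript enum generated from a Postgres enum type. */
export enum ShippingStatus {
  Shipped = 'Shipped',
  Pending = 'Pending',
}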

Support converting column names to camelCase

In our project we implicitly convert snake_case database result fields to camelCase at runtime.
Is there a way to represent this via pgtyped?

export interface IFindBookByIdResult {
  id: number;
  author_id: number | null; // <-- snake_case field
}

↓ ↓ ↓

export interface IFindBookByIdResult {
  id: number;
  authorId: number | null; // <-- camelCase field
}
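
As a stopgap, the conversion can be done outside of pgtyped with a small runtime mapper. A minimal sketch (not a pgtyped feature):

// Converts snake_case keys of a result row to camelCase at runtime.
function camelCaseRow<T extends Record<string, unknown>>(row: T): Record<string, unknown> {
  const entries = Object.entries(row).map(([key, value]): [string, unknown] => [
    key.replace(/_([a-z])/g, (_, letter: string) => letter.toUpperCase()),
    value,
  ]);
  return Object.fromEntries(entries);
}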

Support for additional Postgres types (json, jsonb, bytea, etc)

Thanks for the library :)

Looking at https://github.com/adelsz/pgtyped/blob/master/packages/cli/src/generator.ts#L20-L58 it seems like the generator does not support a few types that are used quite often. Namely json/jsonb and bytea. What would it take to add support? Depending on the complexity I might be able to find time to have a go at implementing.

For reference the complete list of PG types: https://www.postgresql.org/docs/current/datatype.html - full support would be challenging...

Perhaps it would be possible to fall back to a generated Buffer or string Typescript type for currently unsupported types. The Buffer fallback would work for bytea. json would be something like type Json = null | boolean | number | string | Json[] | { [key: string]: Json }

Support type overrides in config

Allow overriding the type mapping from the config by adding a types section that takes a map of Postgres type names to Type:

{
  "transforms": [ ... ],
  "types": {
    "int:": {
      "name": "WholeNumber",
      "from": "./my-pkg/whole-number"
    },
    "bytea": {
      "name": "Uint8Array"
    },
    "shipping_status": {
      "name": "ShippingStatus",
      "enumValues": ["Shipped", "Pending"]
    }
  },
}

This would cover the currently supported AliasedType, ImportedType, and EnumType definitions and apply them as overrides to the TypeMapping passed to TypeAllocator.

Some issues:

  • Relative import resolution - we probably need to accept imports rooted at srcDir in config and convert them to relative imports in code. There are probably some corner cases to consider when emitting generated code outside of the source tree (is that even allowed?)

[Idea] Generate runtime types with io-ts

I have been enjoying 'database-first domain modelling', and also io-ts (having come across it here) (thanks).

Thinking about higher-order types, custom types, and type overrides (#60) at the same time as defining io-ts schemas has made me wonder if pgtyped should generate runtime types (rather than static types) using io-ts. It would be strictly more powerful and not much more effort on the code generation side (it might actually be easier: https://github.com/gcanti/io-ts-codegen).

You can still get hold of types by using t.TypeOf<typeof IoTsSchema> so we can just put that everywhere we'd currently put our existing types.

This allows dependees not only to check their code for type agreement but also to parse inputs that are related to database-defined schema (as they surely must be). This could allow the DB schema to be the statically typed centre of a system. Inputs can be typed at run-time and compile-time from schema derived from the DB.

I think io-ts will be much better than the inchoate type system I added for enums for transporting type information around for composite types. And I can't entirely put my finger on why yet, but I think using runtime types at generation time and in generated code (as well as to carry the database type mapping) is going to unlock some powerful features.
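
For illustration, an io-ts codec generated for the FindBookById example from the README might look roughly like this (the codec name and shape are illustrative):

import * as t from 'io-ts';

// Illustrative io-ts codec mirroring the IFindBookByIdResult interface above.
const FindBookByIdResult = t.type({
  id: t.number,
  rank: t.union([t.number, t.null]),
  name: t.union([t.string, t.null]),
  author_id: t.union([t.number, t.null]),
});

// The static type is recovered via t.TypeOf, as mentioned above.
type FindBookByIdResultType = t.TypeOf<typeof FindBookByIdResult>;

// At runtime a row can be validated with decode():
// const decoded = FindBookByIdResult.decode(someRow);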

insert queries for nullable types generate non-nullable types

when you have a table

CREATE TABLE untitled_table (
    id SERIAL PRIMARY KEY,
    a text,
    b text NOT NULL
);

a query like this:

/*
  @name InsertItems
  @param item -> (a, b)
*/
INSERT INTO public.untitled_table(a, b)
	VALUES :item RETURNING id;

generates

export interface IInsertItemsParams {
  item: {
    a: string,
    b: string
  };
}

This does not allow a to be omitted or undefined; it should be:

export interface IInsertItemsParams {
  item: {
    a: string | undefined,
    b: string
  };
}

Add chalk to dependencies in @pgtyped/query

I've installed @pgtyped/query as the main dependency and @pgtyped/cli as the dev dependency and get an error:

Error: Cannot find module 'chalk'
Require stack:
- /app/node_modules/@pgtyped/query/lib/loader/sql/logger.js
- /app/node_modules/@pgtyped/query/lib/loader/sql/index.js
- /app/node_modules/@pgtyped/query/lib/preprocessor.js
- /app/node_modules/@pgtyped/query/lib/index.js
- /app/utils/graphql/context.js
- /app/index.js
    at Function.Module._resolveFilename (internal/modules/cjs/loader.js:1029:15)
    at Function.Module._load (internal/modules/cjs/loader.js:898:27)
    at Module.require (internal/modules/cjs/loader.js:1089:19)
    at require (internal/modules/cjs/helpers.js:73:18)
    at Object.<anonymous> (/app/node_modules/@pgtyped/query/lib/loader/sql/logger.js:6:33)
    at Module._compile (internal/modules/cjs/loader.js:1200:30)
    at Object.Module._extensions..js (internal/modules/cjs/loader.js:1220:10)
    at Module.load (internal/modules/cjs/loader.js:1049:32)
    at Function.Module._load (internal/modules/cjs/loader.js:937:14)
    at Module.require (internal/modules/cjs/loader.js:1089:19) {
  code: 'MODULE_NOT_FOUND',
  requireStack: [
    '/app/node_modules/@pgtyped/query/lib/loader/sql/logger.js',
    '/app/node_modules/@pgtyped/query/lib/loader/sql/index.js',
    '/app/node_modules/@pgtyped/query/lib/preprocessor.js',
    '/app/node_modules/@pgtyped/query/lib/index.js',
    '/app/utils/graphql/context.js',
    '/app/index.js'
  ]
}

Annotation ideas: @single and @maybe

@single: returns exactly one row. Not nullable
@maybe: returns zero or one rows. Nullable

I'm just starting to use pgtyped; maybe having an array everywhere is optimal in practice.
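
For illustration, the proposal would roughly change the effective signature of run() like this (a sketch of the idea, not current behaviour; the result type reuses the README example):

// default (no annotation): an array of rows
//   run(params, client): Promise<IFindBookByIdResult[]>
// with @single (exactly one row, not nullable):
//   run(params, client): Promise<IFindBookByIdResult>
// with @maybe (zero or one rows, nullable):
//   run(params, client): Promise<IFindBookByIdResult | null>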

select queries for non nullable types generate nullable types

possibly related to #130

when you have a table

CREATE TABLE untitled_table (
    id SERIAL PRIMARY KEY,
    a text,
    b text NOT NULL
);

a query like this:

/*
  @name GetB
*/
SELECT id FROM public.untitled_table
WHERE b = :b;

generates

export interface IGetBParams {
  b: string | null | void;
}

Here b is allowed to be null or void; it should probably be:

export interface IGetBParams {
  b: string;
}

not-null columns in insert queries

Hello. Thank you for this awesome project.

We have a table items with a field name, which is not nullable.
We also have this query:

INSERT INTO items (name)
VALUES (:name)

Generated type:

export interface ICreateItemParams {
  name: string | null | void;
}

Why does it have null and void types?

Usage with concatenated query

Would this library work with queries that are concatenated from parts? Any ideas on how to solve this?

Example:

let query = sql`
  SELECT *
  FROM foo
  WHERE bar = true
`

if (baz === true) {
  query += sql` AND baz = true`
}

Thanks!

Ability to configure DB port

Hi,

Is it possible to configure the DB port via config.json?

A DB connection string, as mentioned in issue #32, would also provide a way to configure the port.

Thanks!

PG JSON unions are unsupported

Update "Table" set params = "params" || :params where "id" = :id;

breaks PgTyped. This is valid PG syntax; for example, try running this in a REPL:

select '{"a" :"a", "b": "b"}'::jsonb || '{"a": "b"}'::jsonb as text

sql files not parsed when parameter type cannot be determined

when you have a table

CREATE TABLE untitled_table (
    id SERIAL PRIMARY KEY,
    a text,
    b text NOT NULL
);

the following query

/*
  @name UpdateA
*/
UPDATE public.untitled_table
SET
  a = (
    CASE WHEN (:a IS NOT NULL)
      THEN :a
      ELSE a
    END
  )
WHERE id = :id;

does not parse

Error in query. Details: {
  errorCode: 'scanner_yyerror',
  hint: undefined,
  message: 'syntax error at or near ":"',
  position: '57'
}

The same query in TypeScript files results in could not determine data type of parameter $....
In TypeScript files, casting the value via ::TEXT fixes the issue.

In SQL files, however, the equivalent cast

UPDATE public.untitled_table
SET
  a = (
    CASE WHEN (:a::TEXT IS NOT NULL)
      THEN :a
      ELSE a
    END
  )
WHERE id = :id;

still results in a syntax error

[IDEA] Support raw (unescaped) SQL to be passed to queries

I have a query written in SQL and I love that the types are automatically created. So far, so good :)

The problem:

However, my use case requires dynamically generating an ORDER BY … clause, and unfortunately that's not possible with parametrised queries.

Solution:

Therefore, I need to pass raw SQL to the query, to be concatenated with it as-is, without any escaping.

Proposed syntax:

Same as with usual parameters, but their names would be prefixed with unsafe_:

ORDER BY :unsafe_orderBy

Additionally, and also to make sure it's a non-breaking change, it could be indicated when the query is used:

import { unsafe } from 'pg-typed';
// …
await myQuery.run({ unsafe_orderBy: unsafe(`"MyColumn" ASC`) });

Risks:

When used incorrectly, this could lead to SQLi vulnerabilities.

Annotated parameter is missing nullable type

For the following SQL:

/*
   @name addTodo
   @param todo -> (description, done, duedate)
 */
INSERT INTO todo(description, done, duedate) VALUES :todo;

/* @name addTodoNoObjectParam */
INSERT INTO todo(description, done, duedate)
VALUES (:description, :done, :duedate);

I get different parameter types:

/** 'addTodo' parameters type */
export interface IAddTodoParams {
  todo: {
    description: string,
    done: boolean,
    duedate: Date
  };
}

/** 'addTodoNoObjectParam' parameters type */
export interface IAddTodoNoObjectParamParams {
  description: string | null | void;
  done: boolean | null | void;
  duedate: Date | null | void;
}

Is the duedate type for addTodo really supposed to be just Date?
(the column duedate in the database is a nullable timestamp)

Release 0.7.1

I see the tag exists with the enum changes. Did you mean to release it, or is it just pending?

Use emitFileName from config

Hi, thank you for this super useful tool!

I just can't get the emitFileName in my config working. From what I can tell, it's not actually being used:
https://github.com/adelsz/pgtyped/blob/master/packages/cli/src/index.ts#L146

This is my config. If you see an error on my part, I'd appreciate it if you could point it out.

  "transforms": [
    {
      "mode": "sql",
      "include": "notifications.sql",
      "emitFileName": "../../../types/src/api/sql/notifications.ts"
    }
  ],
  "srcDir": "./src/db",
  "db": {
    "dbName": "api_dev",
    "user": "postgres",
    "password": "postgres",
    "host": "localhost"
  }
}

Otherwise happy to put in a PR if you haven't got the time.
Thanks again!

Order seems to matter in insert into

Inserting a row with a subset of the columns fails if the order of my subset does not match the declared order of the table columns.

For example, if I insert a row with just four columns out of many, listed in the same order as they were declared in the table, the query works as expected:

const insertParamUpdate = sql<IInsertParamUpdateQuery>`
    insert into data.event_log
    (_chainid, _height, _txindex, storage_address  )
    values
    $$params(_chainid, _height, _txindex,  storage_address);
`;

But if I swap storage_address in the column list and the values object:

const insertParamUpdate = sql<IInsertParamUpdateQuery>`
    insert into data.event_log
    (_chainid, storage_address, _height, _txindex )
    values
    $$params(_chainid, storage_address, _height, _txindex);
`;

This fails with an invalid input for type bigint error. It makes it seem like storage_address is being provided as the _height value.

It's late now but a test case shouldn't be too hard if this is a general bug.

Infer field names for object pick

When you have a long list of columns, for example in an INSERT INTO query, you end up having to write the same long list twice: once in the parameter pick expansion and once in the query. It would be nice to have a parameter expansion that expands both the values and the keys of an object. Something like:

/*
  @name insertUsers
  @param users -> ((name, age)...)
*/
INSERT INTO users ::users VALUES :users RETURNING id;

Possible syntaxes for the keys: ::users, :keyof:users, :users.keys, :users:keys. Would any of these be ambiguous?

Error parsing query with quotes

I have a query that looks something like this.

/* @name FindThingById */
SELECT
    *
FROM
    database.prefix$environment."Thing"
WHERE
    id = :id;

Thing was created with quotes, so it is actually Thing and not thing.

If I attempt to run pgtyped with

npx pgtyped -w -c config.json

I receive a parsing error.

Parsed file:
1 | /* @name FindThingById */
2 | SELECT
3 |     *
4 | FROM
5 >     database.prefix$environment."Thing"
6 | WHERE
7 |     id = :id;
8 |
9 |
Errors:
- (5:32) Parse error: token recognition error at: '"'
- (5:38) Parse error: token recognition error at: '"'

I've tried numerous ways to escape the " character, but without success. How can I make this query work with quotes?

Feasibility of Generic Drivers for other databases (SQLite)

First, thank you for this project!

I saw that there are folks that desire this for MySQL (#50), and this is a related request. I am considering trying to hack in support for SQLite, and wanted to talk about feasibility.

We currently use Typescript to build and run queries against a SQLite database (https://github.com/sql-js/sql.js). I am of course curious how easy it would be to write a "driver" that I could substitute for the default PG driver you have implemented.

I took a look at the generator:

const typeData = await getTypes(queryData, queryName, connection);

At present, it looks like the getTypes function, as implemented here, is the one doing the talking to PG:

export async function getTypes(
  queryData: IInterpolatedQuery,
  name: string,
  queue: AsyncQueue,
): Promise<IQueryTypes | IParseError> {

Once getTypes returns its IQueryTypes, however, the generator appears to be unaware of PG, right?

I realize that there might be work involved in making the driver selectable, etc., and creating the appropriate connection instead of assuming it's an AsyncQueue, but that looks pretty doable.

const connection = new AsyncQueue();

Before I get cracking on this, would I be right in thinking that a new DB driver should focus on implementing getTypes for SQLite, and that the rest of the "knowledge" and assumptions of PG would be relatively simple to extract out of the generator?

Union types for enums

Slight buyer's remorse from me (shame I didn't think of this a bit earlier!): an issue with using Typescript enums for database enums is that the types we generate in different files are incompatible. That is, Typescript considers them to be different types when in fact they represent the same underlying type.

This ends up feeling quite clumsy if you are passing otherwise compatible objects around, because you end up having to convert enum types: payloadType as string as PayloadTypeFromQuery2.

We could instead use unions, so:

export const enum PayloadType {
  Message = 'Message',
  Dynamite = 'Dynamite',
}

becomes:

export type PayloadType = 'Message' | 'Dynamite';

This means that we would get type-safety on enums and type arguments referencing the same database enum would be compatible.

We would lose the runtime abilities of an enum by doing this, though. Some thoughts on options:

  1. Switch entirely to unions
  2. Generate string enums for each type but make the parameters/returns be of union type
  3. Attempt to 'globalise' enums across generation of different files sharing a enum so that the enum definition is hoisted to some location where it can be imported by all queries that use that enum

I am leaning towards 1 being a better default. We can also give an option to generate enums (but maybe we should favour simplicity and drop support for now). It maintains generated file locality and mostly 'just works'. If someone needs a runtime enum or prefers the enum syntax (which does feel clearer and more conventional) they can always create one.

2 is an attempt at the best of both worlds, but it feels a bit weird generating enums that we don't actually use directly; it won't be obvious that you can use them, and you'll lose editor/IDE support.

3 might be cleanest for a codebase, but it feels like scope creep, and without knowing/assuming a lot more about a project's structure it would be hard to do.

MySql support

I really like the approach of this project.
We're using MySql though; any chance this will support MySql? :)
