eturino / apollo-link-scalars
custom apollo link to allow to parse custom scalars
License: MIT License
I'm submitting a ...
[ ] bug report
[ ] feature request
[ ] question about the decisions made in the repository
[x] question about how to use this project
Summary
I am trying to use this approach to generate the schema for the withScalars
method from apollo-link-scalars. However, the GraphQLSchema generated with the utility methods from graphql is not compatible with the GraphQLSchema type expected by apollo-link-scalars. Here are the package versions I am using:
"apollo-link": "^1.2.14",
"graphql": "^15.1.0",
For the implementation I followed the documented approach to generate the schema:
const schema = buildClientSchema((introspectionResult as unknown) as IntrospectionQuery)
I notice that the graphql package version used by apollo-link-scalars is "graphql": "^14.5.8".
Update: I just found PR #115, which seems to fix this case. Do you have any plan to merge it? Thank you.
I'm submitting a ...
[x] bug report
[ ] feature request
[ ] question about the decisions made in the repository
[ ] question about how to use this project
Summary
Calling withScalars crashes the app completely. I get the following error:
ERROR Error: Cannot use GraphQLScalarType "Boolean" from another module or realm.
Ensure that there is only one instance of "graphql" in the node_modules
directory. If different versions of "graphql" are the dependencies of other
relied on modules, use "resolutions" to ensure only one version is installed.
import { withScalars } from 'apollo-link-scalars';
import config from '@/config';
import { ApolloClient, ApolloLink, HttpLink } from '@apollo/client';
import { RetryLink } from '@apollo/client/link/retry';
import cache from './cache';
import introspectionResult from "./../graphql.schema.json";
import { buildClientSchema, IntrospectionQuery } from "graphql";
const schema = buildClientSchema((introspectionResult as unknown) as IntrospectionQuery) as any;
const retryLink = new RetryLink();
const httpLink = new HttpLink({ uri: config.API_URL });
const client = new ApolloClient({
cache,
link: ApolloLink.from([
withScalars({ schema }),
retryLink,
httpLink
])
});
export default client;
I have tried to enforce a single graphql version with the following in my package.json:
"overrides": {
  "graphql": "^16.0.0"
},
"resolutions": {
  "graphql": "^16.0.0"
},
Is anybody else having this issue?
I'm submitting a ...
[ ] bug report
[ ] feature request
[ ] question about the decisions made in the repository
[x] question about how to use this project
Summary
The README does not currently clarify where responsibility lies for handling null values, when null values will or will not be passed to the various functions, etc. Some additional detail in the README would help in understanding exactly what needs to be handled within implementations of custom mappings and nullFunctions.
The one example of a custom mapping in the README seems to contradict itself: the implementation of serialize does not handle null, but the implementation of parseValue does handle null:
const typesMap = {
CustomScalar: {
serialize: (parsed: CustomScalar) => parsed.toString(),
parseValue: (raw: string | number | null): CustomScalar | null => {
return raw ? new CustomScalar(raw) : null;
}
}
};
In addition to the contradiction between the two methods on whether null needs to be handled, it is confusing that parseValue is defined to handle string and number input, while serialize clearly returns only string.
My confusion boils down to these two interrelated questions: what needs to be handled within a custom mapping versus a nullFunctions implementation? For example, what is the order in which they are called, and under which conditions? Are values always unconditionally passed through both the nullFunctions and the custom mapping functions in both directions (serialize and parse), or does apollo-link-scalars do some null/undefined detection along the way that bypasses certain steps and simplifies what needs to be handled within the custom functions? A detailed explanation of how values are processed in both directions would probably clarify this.
I'm submitting a ...
[ ] bug report
[ ] feature request
[ ] question about the decisions made in the repository
[x] question about how to use this project
Summary
When I try to use apollo-link-scalars as advised in the README, I get the following error on the first GraphQL query:
Uncaught (in promise) Error: Network error: operation.getContext is not a function
at new ApolloError (bundle.esm.js:76)
at bundle.esm.js:1469
My apollo client setup is following:
const cache = new InMemoryCache()
const schemaDocument = loader('../common/schema.graphql')
const link = ApolloLink.from([
withScalars({
schema: buildASTSchema(schemaDocument),
typesMap: { DateTime: GraphQLDateTime }
}),
new HttpLink()
])
const client = new ApolloClient({
cache,
link
})
The error disappears when apollo-link-scalars is removed.
I have an application created with Create React App.
Let me know if there is any other information I can provide; this is all I know right now.
Follow-up to #370: fieldB2 now gets passed into parseValue twice, which I think is not intended.
Changing the parseValue of Date inside the test to (typeof raw === 'string' ? new Date(raw) : null) reveals the issue.
treatSelection should have some way to make sure a field is only parsed once, or a new implementation is needed.
I'm submitting a ...
[ ] bug report
[ ] feature request
[x] question about the decisions made in the repository
[ ] question about how to use this project
Summary
I know that not every package follows semantic versioning, but going from v2.1.2 to v0.4.0 as the latest version on npm looks like a publishing mistake to me. Was this step intended, or what is the point of going down two major versions?
Kind regards
I'm submitting a ...
[x] bug report
[ ] feature request
[ ] question about the decisions made in the repository
[ ] question about how to use this project
Summary
While working on a project with Apollo Client 3.8, I've encountered a type incompatibility between the type of ApolloLink and the return type of withScalars. Specifically, the types of the fields split, left, and right in ApolloLink appear to be incorrect.
Other information (e.g. detailed explanation, stacktraces, related issues, suggestions how to fix, links for us to have context, eg. StackOverflow, personal fork, etc.)
I'm submitting a bug report
Summary
The library (0.3.0) does not appear to work well with the Metro bundler. The line import { FunctionsMap, isNone, mapIfArray } from ".."; is transpiled to const __1 = require(".."); and causes runtime errors because __1 evaluates to undefined.
I'm submitting a ...
[ ] bug report
[ ] feature request
[ ] question about the decisions made in the repository
[x] question about how to use this project
Summary
I'm trying to parse dates with this, but they still show up as strings even though the field is defined with the Date type and I am using a resolver (Date: DateTimeResolver):
import {
ApolloClient,
InMemoryCache,
HttpLink,
ApolloLink,
} from '@apollo/client'
import introspectionResult from 'shared/gql/generated.schema.json'
import { buildClientSchema, IntrospectionQuery } from 'graphql'
import { withScalars } from 'apollo-link-scalars'
import { DateTimeResolver } from 'graphql-scalars'
const schema = buildClientSchema(
(introspectionResult as unknown) as IntrospectionQuery
)
const httpLink = new HttpLink({
uri: 'http://localhost:4000',
credentials: 'include',
})
const typesMap = {
Date: DateTimeResolver,
}
const link = ApolloLink.from([
(withScalars({ schema, typesMap }) as unknown) as ApolloLink,
httpLink,
])
function createApolloClient() {
return new ApolloClient({
ssrMode: typeof window === 'undefined',
link,
cache: new InMemoryCache(),
// Necessary to pass the session cookie to the server on every request
credentials: 'include',
})
}
I'm using Apollo Client v3.
I'm submitting a ...
[ ] bug report
[x] feature request
[ ] question about the decisions made in the repository
[ ] question about how to use this project
Summary
I think it is very simple and backward compatible, right?
Other information (e.g. detailed explanation, stacktraces, related issues, suggestions how to fix, links for us to have context, eg. StackOverflow, personal fork, etc.)
I'm submitting a ...
[x] bug report
[ ] feature request
[ ] question about the decisions made in the repository
[ ] question about how to use this project
Summary
We recently started getting the following error when building the frontend code with webpack. It seems that somewhere in the code, instead of importing the specific package explicitly, the entire graphql-tools suite is being pulled in, which makes compilation fail:
web-webpack error ModuleNotFoundError: Module not found: Error: Can't resolve 'child_process' in '/home/jenkins/workspace/project/node_modules/@graphql-tools/git-loader'
Instead of https://github.com/eturino/apollo-link-scalars/blob/v0.1.10/src/index.ts#L1, we can change it to:
import { makeExecutableSchema } from "@graphql-tools/schema";
Version update from 0.1.6 to 0.1.11.
For objects, use the GraphQLObjectType directly.
I'm submitting a ...
[ ] bug report
[x] feature request
[ ] question about the decisions made in the repository
[x] question about how to use this project
Summary
It seems that when it comes to SSR (or maybe it's just Next.js), this library will not work, since the data does not go through the serializer when it is being serialized to serve to the frontend.
Is there suggested method to get this to possibly work, or if not, would it be feasible to implement? Thanks.
I'm submitting a ...
[ ] bug report
[x] feature request
[ ] question about the decisions made in the repository
[ ] question about how to use this project
Summary
Hi,
We're using this project to great success for transforming some scalars like date times etc. We're also using an implementation of the Maybe monad for more functional-style access to nullable properties (https://github.com/hojberg/seidr).
We'd love to be able to automatically translate the nullable types into Maybes, so e.g. a type that looks like:
type Name {
title: String
name: String!
suffix: String
}
When the server returns:
{
"title": "Dr",
"name": "Greg House",
"suffix": null
}
Gets transformed to:
{
  title: Just("Dr"),
  name: "Greg House",
  suffix: Nothing()
}
I appreciate that this isn't directly the purpose of this project, but given the set of tasks for schema traversal, parsing and serializing etc, I was wondering if you would have any interest in expanding the scope of the project to allow it.
The specific task would be to change the traversal so that whenever it comes across a nullable type, it runs a special serializer/parser as appropriate.
If you are interested, I'm more than happy to do the work on it. If not, I'll probably look at either forking this to allow it, or taking a lot of the structure to make a similar project that can work in tandem with this to do the transformation.
Cheers!
I'm submitting a ...
[ ] bug report
[ ] feature request
[ ] question about the decisions made in the repository
[x] question about how to use this project
Summary
What is the best way to use this link with a remote schema (from an introspection endpoint)?
As mentioned by @brabeji in #28, this link should not be responsible for raising GraphQLErrors when the response contains a null on a non-null field. There could be valid reasons for that, for example with some directives. In any case, this link should not be the judge of that.
We should remove those validations and their tests.
Hi all,
if you want to revive your custom scalars from the persisted cache (https://github.com/apollographql/apollo-cache-persist), where they are saved as stringified JSON, you can use JSON.parse with a reviver function.
I am using it as such:
import { DateTime } from 'luxon';
// Works for keys like createdAt, updatedAt and timestamp.
export const parseJsonWithDateTime = (
jsonString: string,
keyCheckFunctions: Array<(key: string) => boolean> = [
(key) => key === 'timestamp',
(key) => key.endsWith('At'),
],
) => {
return JSON.parse(jsonString, (key, value) => {
for (const keyCheckFunction of keyCheckFunctions) {
if (keyCheckFunction(key)) {
return DateTime.fromISO(value);
}
}
// Return the value unchanged for non-matching keys; returning
// undefined here would drop the property from the result.
return value;
});
};
To enable custom scalar type generation with the newest Apollo codegen, you can use the following setup for codegen.yml. I am not so sure about the two options passthroughCustomScalars: true and customScalarFormat: 'passthrough', but they don't break anything:
hooks:
  afterAllFileWrite:
    - prettier --write
overwrite: true
schema: 'graphql.schema.json'
documents: 'src/**/*.{tsx,ts}'
config:
  scalars:
    DateTime: DateTime
    JSONObject: JSONObject
generates:
  src/generated-graphql-types.tsx:
    plugins:
      - 'typescript'
      - 'typescript-operations'
      - 'typescript-react-apollo'
      - 'named-operations-object'
    config:
      namingConvention:
        default: 'no-change-case'
        enumValues: 'keep'
      passthroughCustomScalars: true
      customScalarFormat: 'passthrough'
      withComponent: false
      withHOC: true
  ./src/generated-graphql-fragment-matcher.ts:
    plugins:
      - fragment-matcher
    config:
      namingConvention:
        default: 'no-change-case'
        enumValues: 'keep'
      passthroughCustomScalars: true
      customScalarFormat: 'passthrough'
  ./graphql.schema.json:
    plugins:
      - 'introspection'
    config:
      passthroughCustomScalars: true
      customScalarFormat: 'passthrough'
      namingConvention:
        default: 'no-change-case'
        enumValues: 'keep'
Hope it helps someone.
I'm submitting a ...
[x] bug report
[ ] feature request
[ ] question about the decisions made in the repository
[ ] question about how to use this project
Summary
fragmentReducer() returns an incorrect result when there are multiple occurrences of the same field. The last occurrence of a field should overwrite the previous ones, but this is not the case. When this happens, the selections are wrong and some scalar fields may not be parsed (because they are incorrectly excluded from the selections).
Other information (e.g. detailed explanation, stacktraces, related issues, suggestions how to fix, links for us to have context, eg. StackOverflow, personal fork, etc.)
Here's an example:
query MyQuery {
someField {
...FragmentA,
subFieldB { # should overwrite subFieldB from FragmentA
...FragmentB
}
}
}
fragment FragmentA on A {
fieldA1
fieldA2
fieldA3
subFieldB {
fieldB2
}
}
fragment FragmentB on B {
fieldB1
fieldB2
fieldB3
}
I expect fragmentReducer() to resolve the query to:
query MyQuery {
someField {
fieldA1
fieldA2
fieldA3
subFieldB {
fieldB1
fieldB2
fieldB3
}
}
}
But instead it is:
query MyQuery {
someField {
fieldA1
fieldA2
fieldA3
subFieldB {
fieldB2
}
}
}
The result is that if fieldB1 or fieldB3 are custom scalars, they are ignored by parser.parseObjectWithSelections(data, rootType, rootSelections); and are not converted at all.
In this specific example, I believe the problem comes from uniqueNodes() in fragment-utils.ts. Before calling this function, all the fields are still there (subFieldB is there twice), but uniqueNodes() keeps the first occurrence instead of the last one.
I'm submitting a ...
[x] bug report
[ ] feature request
[ ] question about the decisions made in the repository
[ ] question about how to use this project
Summary
PR #115 broke the module by exporting a member of the devDependency graphql-tools.
I recommend against adding graphql-tools to the dependency list, for people like me who don't need it.
If you switch to ESLint, a rule could be added to avoid this in the future.
Other information (e.g. detailed explanation, stacktraces, related issues, suggestions how to fix, links for us to have context, eg. StackOverflow, personal fork, etc.)
./node_modules/apollo-link-scalars/build/module/index.js
Module not found: Error: Can't resolve 'graphql-tools' in 'node_modules/apollo-link-scalars/build/module'
#545 was closed too fast.
When a field is present multiple times in a document (because of overwriting fragments), its value is incorrectly parsed multiple times.
A workaround is to check manually in parseValue whether the provided value is raw or already parsed, but it would be better to avoid calling parseValue multiple times.
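The manual check described above could look like this for a Date scalar (a sketch of the workaround, not a fix for the underlying double-parse):

```typescript
// Defensive parseValue: if an overlapping fragment already caused the
// field to be parsed, return the parsed value untouched instead of
// parsing it a second time.
const parseValue = (raw: unknown): Date | null => {
  if (raw instanceof Date) return raw; // already parsed once
  if (typeof raw === "string") return new Date(raw);
  return null;
};
```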
I'm submitting a ...
[x] question about how to use this project
Summary
To avoid bundling a huge schema, can I generate a smaller schema consisting of only the fields I care about, matching the type information in my actual schema and passing just this small schema to the link?
Other information (e.g. detailed explanation, stacktraces, related issues, suggestions how to fix, links for us to have context, eg. StackOverflow, personal fork, etc.)
E.g. something like
const schema = gql`
type Event {
start_time: Timestamp,
end_time: Timestamp,
}
type Ticket {
start_time: Timestamp,
end_time: Timestamp,
}
type Order {
created: Timestamp,
}
scalar Timestamp
`
const typesMap = {
Timestamp: {
parseValue(time) {
return new Date(time * 1000);
},
serialize(date: Date) {
return Math.round(date.getTime() / 1000); // inverse of parseValue: ms back to seconds
}
}
};
export default withScalars({ schema, typesMap })
Event, Ticket, and Order have a bunch of other fields in my actual schema, which also has a bunch of other types. Would this cause any issues?
I generate this schema from my actual schema using a script.
TIA for the advice.
I'm submitting a ...
[x] bug report
[ ] feature request
[ ] question about the decisions made in the repository
[ ] question about how to use this project
Summary
I'm using the latest versions of both the link and the client, but this issue existed on the old version as well.
On seemingly random occasions the link won't parse a date. My app is very date-heavy, so this happens pretty often. Right now I'm experiencing it in a component that compares two dates: literally, one is parsed and the other is still a string.
I ruled out a cache issue, since it still occurs with fetchPolicy: 'no-cache'. With some logging enabled I can see that the date isn't even supplied to the parser. The type of the field is correct in the schema.
Apollo client v3.3.12
Apollo link scalars v2.1.1
My client setup:
import { ApolloClient, ApolloLink, InMemoryCache } from '@apollo/client'
import { withScalars } from 'apollo-link-scalars'
import { parseISO } from 'date-fns'
import { buildClientSchema, IntrospectionQuery } from 'graphql'
import result from './generated/graphql'
import introspectionResult from './generated/graphql.schema.json'
import { createHttpLink } from '@apollo/client/link/http'
export const makeApolloClient = (
bearerToken?: string,
): ApolloClient<unknown> => {
const schema = buildClientSchema(
(introspectionResult as unknown) as IntrospectionQuery,
)
const ScalarsLink = withScalars({
schema,
typesMap: {
DateTime: {
serialize: (parsed: Date) => parsed.toISOString(),
parseValue: (ISOstring: string | null): Date | null =>
ISOstring ? parseISO(ISOstring) : null,
},
},
})
const HttpLink = createHttpLink({
uri: '/graphql',
...(bearerToken && { headers: { Authorization: `Bearer ${bearerToken}` } }),
})
return new ApolloClient({
link: ApolloLink.from([ScalarsLink, HttpLink]),
cache: new InMemoryCache({
...result,
}),
})
}
As you can see, I removed all other third-party links; the issue still persists.
I'm submitting a ...
[x] bug report
[ ] feature request
[ ] question about the decisions made in the repository
[ ] question about how to use this project
Summary
I use apollo-link-scalars to serialize and parse Dates. For my specific case, I need to send the time zone with the date, so I use the serialize function to transform and send the date in this format: 2023-08-19T02:00:00+02:00.
I have a request with a nullable input variable of type Date. If I change this variable and then put it back to the first value, the request is re-launched even with the cache-first fetchPolicy (using the useQuery hook).
This behavior is not present if the serialize function only does a toISOString, so this code will work correctly:
const typesMap = {
  Date: {
    serialize: (parsed: unknown): string | null => {
      return parsed instanceof Date ? parsed.toISOString() : null;
    },
    parseValue(raw: unknown) {
      ...
    }
  }
};
while this one re-launches a query it has already fetched with fetchPolicy set to cache-first:
const typesMap = {
  Date: {
    serialize: (parsed: unknown): string | null => {
      return parsed instanceof Date ? toISOStringWithTimezone(parsed) : null;
    },
    parseValue(raw: unknown) {
      ...
    }
  }
};
My guess is that when Apollo compares old to new values, it uses the old values that were serialized with the custom serialize function, but serializes the new values using the default one (the toJSON function). Just a guess, though.
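For context, the toISOStringWithTimezone helper referenced above is not shown in the report; an implementation along these lines (my own sketch, assuming it formats local time with its UTC offset) produces the 2023-08-19T02:00:00+02:00 shape:

```typescript
// Formats a Date as local time with an explicit UTC offset,
// e.g. 2023-08-19T02:00:00+02:00 (hypothetical sketch of the helper).
function toISOStringWithTimezone(date: Date): string {
  const pad = (n: number) => String(n).padStart(2, "0");
  const offsetMin = -date.getTimezoneOffset(); // minutes east of UTC
  const sign = offsetMin >= 0 ? "+" : "-";
  const abs = Math.abs(offsetMin);
  return (
    `${date.getFullYear()}-${pad(date.getMonth() + 1)}-${pad(date.getDate())}` +
    `T${pad(date.getHours())}:${pad(date.getMinutes())}:${pad(date.getSeconds())}` +
    `${sign}${pad(Math.floor(abs / 60))}:${pad(abs % 60)}`
  );
}
```

Note that unlike Date.prototype.toISOString (and toJSON), this output depends on the runtime's time zone, which fits the guess that Apollo's default serialization and the custom one disagree when comparing values.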