
gql's People

Contributors

anibalsolon, artofhuman, bollwyvl, cdce8p, cito, connorbrinton, cybniv, ekampf, itolosa, jbrvjxsc, joricht, kingdarboja, leszekhanusz, luketaverne, ma-pony, mirkan1, mmmeeedddsss, monsieurv, mvanlonden, pawelrubin, phdesign, pkucmus, pvanderlinden, pzingg, sneko, stegben, syrusakbary, wallee94, wilbertom, willfrey

gql's Issues

Create separate documentation using Sphinx

We should create separate documentation using Sphinx, since the README is getting too large.

The documentation can then be made available online as GitHub Pages deployed with a GitHub Action, or via Read the Docs. RTD has the advantage that it can create versioned docs if we need that.

Is it possible to use an existing aiohttp session?

According to the documentation I should create a new session to make GraphQL requests using aiohttp. But imagine I have a provider that exposes both REST and GraphQL interfaces to its API: it would be nice to use a single session for both kinds of requests. Is that possible?

I feel like changing this would be enough; is that the case?
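For illustration, here is what the question is asking for, sketched with plain aiohttp rather than gql (the URLs and the query are placeholders): one ClientSession shared by a REST call and a GraphQL POST.

import asyncio
import aiohttp

async def main():
    async with aiohttp.ClientSession() as session:
        # REST request over the shared session
        async with session.get("https://example.com/api/users/1") as resp:
            user = await resp.json()

        # GraphQL request over the same session
        payload = {"query": "{ user(id: 1) { name } }"}
        async with session.post("https://example.com/graphql", json=payload) as resp:
            result = await resp.json()

        print(user, result)

asyncio.run(main())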

Connection ack error when keep alive message arrives first

Hi! Currently I'm using this package to test a Hasura project, but when the WebsocketsTransport waits for the ack message on initial connection, the Hasura server first responds with a keep-alive message. Because of this, the WebsocketsTransport raises a TransportProtocolError exception, given that it waits for a connection_ack, not a ka message.

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/sanchez/Projects/graphql_client.py", line 829, in test_subscription
    asyncio.run(self.something())
  File "/usr/lib/python3.8/asyncio/runners.py", line 43, in run
    return loop.run_until_complete(main)
  File "/usr/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "/home/sanchez/Projects/graphql_client.py", line 839, in something
    async with Client(
  File "/home/sanchez/Projects/venv/lib/python3.8/site-packages/gql/client.py", line 161, in __aenter__
    await self.transport.connect()
  File "/home/sanchez/Projects/venv/lib/python3.8/site-packages/gql/transport/websockets.py", line 508, in connect
    raise e
  File "/home/sanchez/Projects/venv/lib/python3.8/site-packages/gql/transport/websockets.py", line 503, in connect
    await self._send_init_message_and_wait_ack()
  File "/home/sanchez/Projects/venv/lib/python3.8/site-packages/gql/transport/websockets.py", line 193, in _send_init_message_and_wait_ack
    raise TransportProtocolError(
gql.transport.exceptions.TransportProtocolError: Websocket server did not return a connection ack

Here you can see the order from the messages sent by my Hasura server. In this case, I'm testing using the Hasura web console:

[Screenshot: Hasura web console message log showing the ka message arriving before connection_ack]

As you can see, the ka message comes first, so the transport rejects it and raises the above exception. Maybe the WebsocketsTransport needs to ignore ka messages while waiting for the ack message.
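A rough sketch of the suggested behaviour (not the actual gql code), assuming a websocket object whose async recv() returns JSON text: keep reading frames until connection_ack arrives and skip any ka frames along the way.

import json

async def wait_for_connection_ack(websocket):
    # Keep reading frames until the ack arrives, ignoring keep-alive messages.
    while True:
        answer = json.loads(await websocket.recv())
        answer_type = answer.get("type")
        if answer_type == "ka":
            continue  # keep-alive frame: ignore it while waiting for the ack
        if answer_type == "connection_ack":
            return
        # in gql this is where a TransportProtocolError would be raised
        raise Exception(f"Expected connection_ack or ka, got {answer_type!r}")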

cookies not available in v.0.1.0

Not sure how the project is being built, but when I tried to call RequestsHTTPTransport with cookies it failed. Though the code says this is passed through to HTTPTransport, which should accept it, inspect.getargspec(HTTPTransport.__init__) shows that cookies is not defined, so this does not work.

Still maintained?

Just checking: is this project still maintained? I see that Graphene itself allows one to query a GraphQL API, so I'm wondering if that's recommended as a client instead. Also, is there any documentation anywhere? (Nothing is linked from the README.)

When description is excluded from gql api unable to create Client

transport = RequestsHTTPTransport(gql_endpoint, use_json=True)
client = Client(retries=0, transport=transport, fetch_schema_from_transport=True)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/dgoodman/anaconda3/lib/python3.6/site-packages/gql/client.py", line 26, in __init__
    schema = build_client_schema(introspection)
  File "/home/dgoodman/anaconda3/lib/python3.6/site-packages/graphql/utils/build_client_schema.py", line 306, in build_client_schema
    if schema_introspection["directives"]
  File "/home/dgoodman/anaconda3/lib/python3.6/site-packages/graphql/utils/build_client_schema.py", line 305, in <listcomp>
    [build_directive(d) for d in schema_introspection["directives"]]
  File "/home/dgoodman/anaconda3/lib/python3.6/site-packages/graphql/utils/build_client_schema.py", line 280, in build_directive
    directive_introspection.get("args", {}), GraphQLArgument
  File "/home/dgoodman/anaconda3/lib/python3.6/site-packages/graphql/utils/build_client_schema.py", line 250, in build_input_value_def_map
    for f in input_value_introspection
  File "/home/dgoodman/anaconda3/lib/python3.6/site-packages/graphql/utils/build_client_schema.py", line 250, in <listcomp>
    for f in input_value_introspection
  File "/home/dgoodman/anaconda3/lib/python3.6/site-packages/graphql/utils/build_client_schema.py", line 256, in build_input_value
    description=input_value_introspection["description"],
KeyError: 'description'

The GraphQL API I am hitting excludes description altogether. Is this an issue worth addressing? If so, I can open a PR to attempt a fix.
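One possible client-side workaround, sketched here as a hypothetical helper: walk the introspection result and insert a null "description" wherever the server omitted it, before the result is passed to build_client_schema.

def add_missing_descriptions(node):
    # Recursively fill in "description": None so build_client_schema does not
    # raise KeyError on servers that omit the field entirely.
    if isinstance(node, dict):
        if "name" in node and "description" not in node:
            node["description"] = None
        for value in node.values():
            add_missing_descriptions(value)
    elif isinstance(node, list):
        for item in node:
            add_missing_descriptions(item)
    return node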

Error when importing gql.dsl

from gql import dsl
---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
<ipython-input-1-d63fa713ec57> in <module>()
----> 1 from gql import dsl

/Users/omer.katz/.virtualenvs/playground/lib/python3.5/site-packages/gql/dsl.py in <module>()
      6 from graphql.language import ast
      7 from graphql.language.printer import print_ast
----> 8 from graphql.type import (GraphQLField, GraphQLFieldDefinition, GraphQLList,
      9                           GraphQLNonNull, GraphQLEnumType)
     10

ImportError: cannot import name 'GraphQLFieldDefinition'

I tried upgrading to graphql-core's master, but it seems the class is no longer defined.

Invalid or incomplete introspection with async transport

Hi, I seem to be having issues migrating from a blocking implementation to async. The migrated code raises an exception.

Original:

from os import getenv
from gql import gql, Client
from gql.transport.requests import RequestsHTTPTransport

query = gql('''
    query {
        user(username:"Test") {
            username
        }
    }
''')


if __name__ == '__main__':

    transport = RequestsHTTPTransport(
        url='http://api.aidungeon.io/graphql/',
        headers={'X-Access-Token': getenv('AIDUNGEON_TOKEN')}
    )

    client = Client(
        transport=transport,
        fetch_schema_from_transport=True,
    )

    result = client.execute(query)
    print(result)

Output:

{'user': {'username': 'Test'}}

Migrated:

from os import getenv
from gql import gql, Client
from gql.transport.aiohttp import AIOHTTPTransport
import asyncio


query = gql('''
    query {
        user(username:"Test") {
            username
        }
    }
''')

async def main():

    transport = AIOHTTPTransport(
        url='http://api.aidungeon.io/graphql/',
        headers={'X-Access-Token': getenv('AIDUNGEON_TOKEN')}
    )

    client = Client(
        transport=transport,
        fetch_schema_from_transport=True
    )

    async with client as session:
        result = await session.execute(query)
        print(result)

asyncio.run(main())

Output:

TypeError: Invalid or incomplete introspection result. Ensure that you are passing the 'data' attribute of an introspection response and no 'errors' were returned alongside: None.

Not sure how to proceed. Thanks

Logging the GQL request when using variable_values.

I'm using version 2.0 and I'd like to add some logging to my app that logs the full GraphQL request JSON, complete with variables, in a way that I can paste into GraphQL Playground for debugging purposes.

Any tips?

gql_statement = gql(gql_request.get_gql_template())

result = client().execute(
    document=gql_statement,
    variable_values=request_params
)
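One way to get a Playground-ready payload, sketched under the assumption that the document was parsed with gql() and that graphql-core 2 is installed (on graphql-core 3 the import would be from graphql import print_ast):

import json
from graphql.language.printer import print_ast

def log_gql_request(document, variable_values=None):
    # Hypothetical helper: rebuild the JSON payload that would be sent, so it
    # can be pasted straight into GraphQL Playground.
    payload = {
        "query": print_ast(document),
        "variables": variable_values or {},
    }
    print(json.dumps(payload, indent=2))

log_gql_request(gql_statement, request_params)  # names from the snippet above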

Breaking change in graphql-core

There are some backward-compatibility breaks in graphql-core==3.0.0 that lead to the following import error:

from gql import Client
ImportError                               Traceback (most recent call last)
<ipython-input-1-2d34726b19bf> in <module>
----> 1 from gql import Client

~\AppData\Local\Continuum\miniconda3\lib\site-packages\gql\__init__.py in <module>
      1 from .gql import gql
----> 2 from .client import Client
      3
      4 __all__ = ['gql', 'Client']

~\AppData\Local\Continuum\miniconda3\lib\site-packages\gql\client.py in <module>
      1 import logging
      2
----> 3 from graphql import parse, introspection_query, build_ast_schema, build_client_schema
      4 from graphql.validation import validate
      5

ImportError: cannot import name 'introspection_query' from 'graphql' (C:\Users\nathan.demaria\AppData\Local\Continuum\miniconda3\lib\site-packages\graphql\__init__.py)

AFAIK the solution here is to add a ceiling of graphql-core<3.0.0 to the install_requires of this package, but I'm curious whether somebody familiar with the plan for the packages in this repo has better options or longer-term recommendations. I see repos for graphql-core-next and gql-next, and it looks like the graphql-core package is now published to PyPI from the -next repo, but gql is still published from this repo.
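The proposed ceiling would be a one-line change in setup.py; a sketch, with other arguments and requirements omitted:

from setuptools import setup

setup(
    name="gql",
    install_requires=[
        "graphql-core>=2.0,<3.0.0",  # avoid the graphql-core 3 API break described above
    ],
)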

Roadmap getting us from legacy to modern version

Python 2 has reached EOL and no further development will be done, as stated on the official Python release schedule page for 2.7:

Being the last of the 2.x series, 2.7 will receive bugfix support until 2020. Support officially stops January 1 2020, but the final release will occur after that date.

Planned future release dates:

2.7.18 code freeze January, 2020
2.7.18 release candidate early April, 2020
2.7.18 mid-April, 2020

Also, Python 3.5 support will stop on 2020-09-13, as seen at status-of-python-branches, so it would be great to drop support for Python 3.5 in order to push forward and use 3.6+ features in the code (like f-strings).

Query External Schema

Is there a way to pass in a schema.graphql or schema.json file and query against that?
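No transport is needed for that; a minimal sketch using graphql-core 3 directly (file name and query are placeholders) that loads a local SDL file and validates a query against it:

from graphql import build_schema, parse, validate

# Load a local schema.graphql (SDL) and validate a query against it, no network
# involved. With graphql-core 2 the equivalent is build_ast_schema(parse(sdl)).
with open("schema.graphql") as schema_file:
    schema = build_schema(schema_file.read())

document = parse("{ user { id } }")  # placeholder query
errors = validate(schema, document)
print(errors or "query is valid against the local schema")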

Allow errors and data

Sometimes the returned object contains both errors and data, for instance when executing a query such as:

query {
   fieldOk
   fieldFailed
}

If fieldFailed is nullable and its resolver fails, the return will be something like:

{"data": {"fieldOk": "value", "fieldFailed": null},
 "errors": [{"message": "some error message", "paths": ["fieldFailed"]}]
}

In that case throwing an exception is not the best thing to do. My suggestion is to just pass through the result dict and let the user handle it -- but since that breaks the API, you may want to consider a new execute() method.
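For illustration, this is how a caller could handle the passed-through result described above (the dict mirrors the example response):

# Sketch of the proposed pass-through behaviour from the caller's side.
result = {
    "data": {"fieldOk": "value", "fieldFailed": None},
    "errors": [{"message": "some error message", "paths": ["fieldFailed"]}],
}

for error in result.get("errors", []):
    print("GraphQL error:", error["message"])

partial_data = result.get("data") or {}
print(partial_data.get("fieldOk"))  # the usable part of the response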

Should pass kwargs into RequestsHTTPTransport requests method

It looks like RequestsHTTPTransport does not pass extra kwargs on to requests. It should be possible to pass kwargs through to requests, either when initializing the transport or when calling execute().

post_args = {
    'headers': self.headers,
    'auth': self.auth,
    'timeout': timeout or self.default_timeout,
    data_key: payload
}
request = requests.post(self.url, **post_args)
request.raise_for_status()

For example: post_args.update(self.kwargs)

After upgrading from v0.4.0 to v2.0.0 I get an exception

Hi,

I found out that I was on a very old version (v0.4.0) and upgraded to the latest stable release (v2.0.0); simple queries that used to work now don't, and I get an exception:

Exception: {'message': "You cannot access body after reading from request's data stream"}

Did anyone encounter this? What does this message mean?

Allow 400 status code if result contains valid graphql error message

The GraphQL server I am using returns an HTTP 400 response for invalid requests, so I get a 400 status but with proper GraphQL error data in the body.

Using GraphiQL or similar I get the GraphQL error presented, but with gql.client I only get

requests.exceptions.HTTPError: 400 Client Error: Bad Request for url: ...

due to

request.raise_for_status()

in https://github.com/graphql-python/gql/blob/master/gql/transport/requests.py#L39

These errors should be returned to the user instead of just raising a 400 exception.
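A sketch of the requested behaviour using requests directly (URL and payload are placeholders, and this is not the actual gql transport code): only raise for status when the body is not a well-formed GraphQL response.

import requests

url = "https://example.com/graphql"      # placeholder endpoint
payload = {"query": "{ fieldOk }"}       # placeholder query

response = requests.post(url, json=payload)
try:
    result = response.json()
except ValueError:
    result = None

if result is not None and ("errors" in result or "data" in result):
    print(result)                # surface the GraphQL errors to the caller
else:
    response.raise_for_status()  # genuinely broken HTTP response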

DSL does not allow querying with nested fields

I'm trying to construct the following query using the DSL:

query GetCustomerIdByEnrollmentSecret($enrollment_secret: String) {
	viewer {
    allCustomers(where: {enrollment_secret: {eq: $enrollment_secret}}) {
      edges {
        node {
          id
        }
      }
    }
  } 
}

The following code:

ds.Query.viewer.allCustomers

Produces the following error:

AttributeError: 'DSLField' object has no attribute 'allCustomers'

When I introspect the schema the field exists correctly:

In [157]: q=client.schema.get_query_type()

In [158]: q.fields
Out[158]: 
OrderedDict([('node',
              <graphql.type.definition.GraphQLField at 0x7f870d196458>),
             ('getRole',
              <graphql.type.definition.GraphQLField at 0x7f870d196048>),
             ('getCustomer',
              <graphql.type.definition.GraphQLField at 0x7f870d166818>),
             ('getScheduledQuery',
              <graphql.type.definition.GraphQLField at 0x7f870d1668b8>),
             ('getFile',
              <graphql.type.definition.GraphQLField at 0x7f870d166ea8>),
             ('getAgent',
              <graphql.type.definition.GraphQLField at 0x7f870d1669a8>),
             ('getUser',
              <graphql.type.definition.GraphQLField at 0x7f870d166e08>),
             ('viewer',
              <graphql.type.definition.GraphQLField at 0x7f870d105bd8>),
             ('checkHealth',
              <graphql.type.definition.GraphQLField at 0x7f870d105b38>),
             ('searchRequestLogs',
              <graphql.type.definition.GraphQLField at 0x7f870d1056d8>)])

In [159]: q.fields['viewer']
Out[159]: <graphql.type.definition.GraphQLField at 0x7f870d105bd8>

In [160]: v=q.fields['viewer']

In [161]: v.resolver
Out[161]: <function graphql.utils.build_client_schema.no_execution>

In [162]: v.type
Out[162]: <graphql.type.definition.GraphQLObjectType at 0x7f870d172308>

In [163]: v.type.fields
Out[163]: 
OrderedDict([('id', <graphql.type.definition.GraphQLField at 0x7f870cefd598>),
             ('allRoles',
              <graphql.type.definition.GraphQLField at 0x7f870cefd5e8>),
             ('allCustomers',
              <graphql.type.definition.GraphQLField at 0x7f870cefd638>),
             ('allScheduledQueries',
              <graphql.type.definition.GraphQLField at 0x7f870cefd688>),
             ('allFiles',
              <graphql.type.definition.GraphQLField at 0x7f870cefd6d8>),
             ('allAgents',
              <graphql.type.definition.GraphQLField at 0x7f870cefd728>),
             ('allUsers',
              <graphql.type.definition.GraphQLField at 0x7f870cefd778>),
             ('user',
              <graphql.type.definition.GraphQLField at 0x7f870cefd7c8>)])

What am I doing wrong here? Does the code support such a use case?

Document the "dsl" module/language

The somewhat obscure "dsl" module/language included in gql should be documented. It is currently unclear to me where it is coming from, whether it mimics an existing thing or was an invention of @syrusakbary. Does anybody know?

Note: I'm not talking about the "schema definition language" of GraphQL here, but about the dsl module in gql. I think the name "dsl" is also a bit misleading, because it usually means "domain-specific language", but this is actually a general-purpose language and not domain-specific in the usual sense of "application domain specific". As far as I understand, it is just an alternative way to express GraphQL documents in Python instead of JSON, similar to the "SQL Expression Language" in SQLAlchemy, which allows writing SQL queries in Python.

State of the project?

Please tell us the current state of the project in the README.

At the moment it looks dead.

Which GraphQL client library do you recommend?

compose queries like Apollo

I have some mutations/queries that I want to compose into a single query/mutation.

For example:

organization = """
mutation ( $organization: OrganizationInput!) {
  Organization: createOrfOrganization (details: $organization) {
    obj {
      id
    }
    status
    errors
  }
}
"""

taxonomy = """
mutation ($taxonomy: TaxonomyInput!) {
  Taxonomy: createOrfTaxonomy (details: $taxonomy) {
    obj {
      id
    }
    status
    errors
  }
}
"""

I would like to have a function that can take the code above and do something like this:

>>> compose(organization, taxonomy)
mutation ( $organization: OrganizationInput!, $taxonomy: TaxonomyInput!) {
  Organization: createOrfOrganization (details: $organization) {
    obj {
      id
    }
    status
    errors
  }
  Taxonomy: createOrfTaxonomy (details: $taxonomy) {
    obj {
      id
    }
    status
    errors
  }
}

This is an important feature that the Python client should support.
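A minimal sketch of such a compose() helper, assuming graphql-core 3: parse each single-operation document, merge the variable definitions and top-level selections into the first operation, and print the result back as a GraphQL string.

from graphql import OperationDefinitionNode, parse, print_ast

def compose(*operations: str) -> str:
    # Hypothetical helper: merge several single-operation documents into one
    # operation, combining their variable definitions and selection sets.
    documents = [parse(op) for op in operations]
    base = documents[0].definitions[0]
    assert isinstance(base, OperationDefinitionNode)
    for document in documents[1:]:
        extra = document.definitions[0]
        assert isinstance(extra, OperationDefinitionNode)
        base.variable_definitions = tuple(base.variable_definitions) + tuple(
            extra.variable_definitions
        )
        base.selection_set.selections = tuple(base.selection_set.selections) + tuple(
            extra.selection_set.selections
        )
    return print_ast(documents[0])

print(compose(organization, taxonomy))  # using the strings defined above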

No module named requests

I am trying to query my django-graphql API using gql:

from gql import gql, Client
from gql.transport.requests import RequestsHTTPTransport

query = gql("""
    query($email: String!){
        user(email: $email) {
            id
            username
            email
            isVerified
        }
    }""")
variables = {"email": "[email protected]"}
transport = RequestsHTTPTransport(url='http://localhost/graphql/')
client = Client(transport=transport)
response = client.execute(query, variables)

But I am getting this error:

File "...\lib\site-packages\gql\transport\requests.py", line 3, in
import requests
ModuleNotFoundError: No module named 'requests'

No documentation

I tried replicating the two tests against a publicly available endpoint, but they fail. I also tried lifting implementation ideas from the Apollo documentation into Python, but the differences were not intuitively discoverable. I am unable to proceed with any implementation.

Can't parse Shopify GraphQL schema

I'm trying to use it with the Shopify GraphQL Admin API, but it can't parse the schema.

It chokes on single-line and multiline descriptions that use triple quotes:

"""Marks an element of a GraphQL schema as having restricted access."""
"""
Represents the access scope permission that is applicable to a merchant's shop, such as `read_orders`.

"""

Doesn't like null default value:

directive @accessRestricted(
  reason: String = null
) on FIELD_DEFINITION | OBJECT

Doesn't like union types:

type AppPurchaseOneTime implements AppPurchase & Node {

I'm using version 0.5.0

schema.graphql.zip

Cookies Issue

If you use pip install gql for Python 2.7, there is no cookies arg in HTTPTransport.
Two files need to be updated:

  1. In requests.py:

post_args = {
    'headers': self.headers,
    'auth': self.auth,
    'cookies': self.cookies,  # add this line
    'timeout': timeout or self.default_timeout,
    data_key: payload
}

  2. In http.py:

def __init__(self, url, headers=None, cookies=None):
    self.url = url
    self.headers = headers
    self.cookies = cookies

question / feature - generating lists of items within gql statement.

I have a mutation on my Apollo Server which allows for 0-N items in the request. Given the following, do you have any advice on how to handle adding 0 to N items? The base mutation, minus the items in the items array, is getting correctly generated and received by my server.

an_item = {
    "prop": 123,
}

apply_mutation = gql('''
mutation apply($id: String!, $items: DiscountItemInput!) {
  applyDiscount(
    applyObject: {
      orderId: $id
      storeId: 2
      items: [
        # can I do variable substitution using a dict here ?
        {
        },
        .... 0 - N item
      ]
    }
  ) {
    ... list of fields to be returned
  }
}
''')

params = {
    "Id": 123
}

client.execute(apply_mutation, variable_values=params)
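One hedged suggestion, sketched below: instead of splicing the items into the query text, declare $items as a list input type and pass a Python list of dicts via variable_values. Type and field names are taken from the snippet above and may not match the real schema.

from gql import gql

apply_mutation = gql('''
mutation apply($id: String!, $items: [DiscountItemInput!]!) {
  applyDiscount(applyObject: {orderId: $id, storeId: 2, items: $items}) {
    status
  }
}
''')

params = {
    "id": "123",
    "items": [{"prop": 123}, {"prop": 456}],  # any number of items, including none
}

# client is the same Client instance used in the snippet above
result = client.execute(apply_mutation, variable_values=params)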

Tests are failing since upgrade of graphql-core from 2.3.1 to 2.3.2

$ pytest tests

...

ERROR tests/starwars/test_dsl.py - TypeError: __init__() got an unexpected keyword argument 'type'
ERROR tests/starwars/test_query.py - TypeError: __init__() got an unexpected keyword argument 'type'
ERROR tests/starwars/test_validation.py - TypeError: __init__() got an unexpected keyword argument 'type'

403 Client Error: Forbidden

Hello,

I'm trying to run example code from the repository and it returns:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "<stdin>", line 4, in client
  File "/usr/local/lib/python2.7/dist-packages/gql/client.py", line 23, in __init__
    introspection = transport.execute(parse(introspection_query)).data
  File "/usr/local/lib/python2.7/dist-packages/gql/transport/requests.py", line 38, in execute
    request.raise_for_status()
  File "/usr/local/lib/python2.7/dist-packages/requests/models.py", line 862, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 403 Client Error: Forbidden for url: http://swapi.graphene-python.org/graphql

After analysing the response, I saw that the error was due to a CSRF verification failure.

code:

from gql import Client, gql
from gql.transport.requests import RequestsHTTPTransport

query = gql('''
    {
      myFavoriteFilm: film(id:"RmlsbToz") {
        id
        title
        episodeId
        characters(first:5) {
          edges {
            node {
              name
            }
          }
        }
      }
    }
  ''')

client = Client(
    transport=RequestsHTTPTransport(url='http://swapi.graphene-python.org/graphql')
  )

client.execute(query)

Published?

pip install gql
Collecting gql
  Could not find a version that satisfies the requirement gql (from versions: )
No matching distribution found for gql

I would love this! Just started a really important company project using Apollo redux on the frontend. Graphene seems like a great win, but would also love to be able to use gql!

Race condition when connecting with the same websockets transport twice at the same time

Hi @leszekhanusz,

First of all, thanks again for implementing the subscription part, that's great!

I still have some issues setting up multiple subscriptions and I'm not sure how to solve this. Consider your example: https://github.com/graphql-python/gql/blame/master/README.md#L318-L342

Can you confirm that, for subscriptions, asyncio.create_task(...) immediately starts running the coroutine concurrently, and that the await taskX calls are there to keep the program blocked until each task finishes?

On my side even with your example I get this random error (it's not immediate, sometimes after 1 second, sometimes 10...):

RuntimeError: cannot call recv while another coroutine is already waiting for the next message

The message is pretty explicit, but I don't understand how to bypass this 😢

If you have any idea 👍

Thank you,

EDIT: that's weird because sometimes, without modifying the code, the process can run for more than 5 minutes without hitting this error...

EDIT2: Note that sometimes I also get this error about the subscribe(...) method

    async for r in self.ws_client.subscribe(subscriptions['scanProbesRequested']):
TypeError: 'async for' requires an object with __aiter__ method, got generator

EDIT3: If I use a different way of doing async (with the same library):

try:
    loop = asyncio.get_event_loop()
    task3 = loop.create_task(execute_subscription1())
    task4 = loop.create_task(execute_subscription2())
    loop.run_forever()

it works without any error. That's really strange...
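For reference, a minimal sketch of the pattern under discussion, assuming the async client API in which a connected session exposes subscribe() as an async generator; both subscriptions share one connected session and are driven concurrently:

import asyncio

async def run_subscriptions(client, subscription1, subscription2):
    # Both subscriptions run over the same connected session.
    async with client as session:

        async def consume(subscription):
            async for result in session.subscribe(subscription):
                print(result)

        await asyncio.gather(consume(subscription1), consume(subscription2))

# asyncio.run(run_subscriptions(client, sub1, sub2))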

ModuleNotFoundError with the latest 0.5.0 version and the documentation example

Reproduction:

$ virtualenv env
$ . .\env\Scripts\activate
$ pip install aiohttp gql
$ python
>>> from gql.transport.aiohttp import AIOHTTPTransport
Traceback (most recent call last):
  File ".\test_gql.py", line 1, in <module>
    from gql.transport.aiohttp import AIOHTTPTransport
ModuleNotFoundError: No module named 'gql.transport.aiohttp'

Windows 10 1909
Python version 3.8.x
gql version 0.5.0 (latest available version in PyPI)

Skip https certificate verification

Hi,

It seems it's not possible to skip HTTPS certificate verification without hardcoding it in the lib.

Skipping works fine if you add 'verify': False to post_args in RequestsHTTPTransport.execute().

It seems it would be ideal to expose HTTPS certificate verification as an additional method parameter.
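A standalone illustration of what is being asked for, using requests directly (URL and query are placeholders): the same POST the transport makes, but with certificate verification disabled.

import requests

response = requests.post(
    "https://self-signed.example.com/graphql",  # placeholder endpoint
    json={"query": "{ fieldOk }"},              # placeholder query
    verify=False,                               # skip HTTPS certificate verification
)
print(response.json())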

query with multiple params failed with Invalid AST node.

When I have one value in params I'm able to run the query. However, when I provide multiple params the query fails with Invalid AST node. Is this an existing issue, or am I doing something wrong?

query6 = ("""query getFirstOrders($numOrders: Int, $numLineItems: Int)
{
        orders(first:$numOrders) {
            pageInfo {
                hasNextPage
            }
            edges {
                node {
                    id
                    lineItems (first:$numLineItems) {
                        edges {
                            node {
                                id
                            }
                        }
                    }
                }
            }
        }
}

"""
)
params = {
    "numOrders": 20,
    "numLineItems": 5
}

client.execute(query6, params)
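One thing worth checking, based only on the snippet above (a guess, not a confirmed diagnosis): client.execute() expects a parsed document, so the string should be wrapped with gql() and the variables passed as variable_values.

from gql import gql

document = gql(query6)  # query6 and params as defined above
result = client.execute(document, variable_values=params)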

Client.execute should take optional operation_name

A document may contain more than one query or mutation. In this case the operations must be named, and operationName should say which one to execute.

Not that useful in real practice, but sometimes while developing it's handy to keep feeding the same file and just change which operation to use.
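A sketch of what the requested API could look like; the operation_name parameter shown here is the proposed addition, not necessarily supported by the installed version, and the field names are placeholders.

from gql import gql

document = gql('''
    query GetUser { user { id } }
    query GetOrders { orders { id } }
''')

# client is an already-configured gql Client
result = client.execute(document, operation_name="GetOrders")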
