Comments (12)

pquentin commented on June 6, 2024

Thank you for your perseverance getting to the bottom of this!

I'm wondering if your configuration for coerce is right; it looks like it should be set per-field: https://www.elastic.co/guide/en/elasticsearch/reference/current/coerce.html. And if you want to make sure you don't make mistakes like typos, consider using dynamic: strict.
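For illustration, a mapping along those lines (field names borrowed from this thread; everything else is just a sketch, not the exact mapping from the original post) could look like:

```python
# Illustrative mapping: coerce disabled on a specific field, plus
# dynamic: strict so documents containing unmapped fields (e.g. typos)
# are rejected instead of silently extending the mapping.
mapping = {
    "mappings": {
        "dynamic": "strict",
        "properties": {
            "name": {"type": "text"},
            "price": {"type": "double", "coerce": False},
            "reviews": {"type": "text"},
            "createdAt": {"type": "date", "format": "epoch_second"},
        },
    }
}

# With elasticsearch-py this could then be applied as:
# es.indices.create(index="products-index", mappings=mapping["mappings"])
```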

Elasticsearch has to achieve a delicate balance between warning users about invalid data, to avoid problems like this down the line, and trying its best not to reject data that could still be useful. It also has to care about backwards compatibility.

Closing since it's not actionable anymore. Thank you for this productive thread where we all learned something :)

from elasticsearch-py.

pquentin commented on June 6, 2024

Wow, that's surprising. If I can reproduce this next week and you don't beat me to it, I will open an Elasticsearch bug.

pquentin commented on June 6, 2024

I can't reproduce. This code works fine for me.

I had to adjust the curl command, though. As in https://www.elastic.co/guide/en/elasticsearch/reference/current/docs-bulk.html#bulk-curl, I created a "requests" file:

{ "index": { "_index": "products-index" } }
{ "name": "Bulk Item 666", "price": 666, "reviews": ["Good"], "createdAt": 1697213529 }
{ "index": { "_index": "products-index" } }
{ "name": "Bulk Item 123", "price": 123, "reviews": ["Good"], "createdAt": 1697213529 }

And then issued:

curl --request POST \
  --url http://localhost:9200/_bulk \
  --header 'content-type: application/x-ndjson' \
  --header 'user-agent: vscode-restclient' \
  --data-binary "@requests"

But I suspect this is just "Copy request as cURL" not working properly in vscode-restclient.

usersina commented on June 6, 2024

@pquentin I don't think it's an Elasticsearch problem but an elasticsearch-py one. There is a problem serializing the data after it has been successfully queried from Elasticsearch.

Querying with curl works as intended. This, on the other hand, raises a JSONDecodeError:

es.search(
    index="products-index",
    query={"match_all": {}},
)

pquentin commented on June 6, 2024

To be clear, from my point of view there's no problem in Elasticsearch and no problem in elasticsearch-py. The reason I initially assumed Elasticsearch was at fault is that it could not be elasticsearch-py. Indeed, the traceback you showed starts with:

SerializationError: Unable to deserialize as JSON: b'{

The b prefix here says that this is the binary data elasticsearch-py has received. Decoding it is the first step; there's no modification before that. Could something be modifying the body in your network somehow? It's pretty unlikely that Elasticsearch would return invalid JSON, and I haven't been able to reproduce it.
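To illustrate why decoding fails: a // comment inside the _source is exactly the kind of thing the stdlib json decoder rejects. A minimal stand-in for the decode step (with a shortened body):

```python
import json

# Shortened stand-in for the raw bytes elastic-transport received:
# a // comment inside _source is not valid JSON.
raw = b'{"_source": {\n    "createdAt": 1697145247 // epoch_second\n}}'

try:
    json.loads(raw)
except json.JSONDecodeError as exc:
    print("decode failed:", exc.msg)  # prints: decode failed: Expecting ',' delimiter
```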

Also, do you mind sharing the full traceback instead of just the exception?

usersina commented on June 6, 2024

Definitely, here's the full stack trace.

Traceback (most recent call last):
  File "/home/sina/.pyenv/versions/ai-service/lib/python3.11/site-packages/elastic_transport/_serializer.py", line 93, in loads
    return self.json_loads(data)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/home/sina/.pyenv/versions/ai-service/lib/python3.11/site-packages/elastic_transport/_serializer.py", line 89, in json_loads
    return json.loads(data)
           ^^^^^^^^^^^^^^^^
  File "/home/sina/.pyenv/versions/3.11.5/lib/python3.11/json/__init__.py", line 346, in loads
    return _default_decoder.decode(s)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/sina/.pyenv/versions/3.11.5/lib/python3.11/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/sina/.pyenv/versions/3.11.5/lib/python3.11/json/decoder.py", line 353, in raw_decode
    obj, end = self.scan_once(s, idx)
               ^^^^^^^^^^^^^^^^^^^^^^
json.decoder.JSONDecodeError: Expecting ',' delimiter: line 5 column 29 (char 326)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/sina/Documents/github/sinaapps/datavacuum-old/ai-service/src/test.py", line 6, in <module>
    es.search(
  File "/home/sina/.pyenv/versions/ai-service/lib/python3.11/site-packages/elasticsearch/_sync/client/utils.py", line 414, in wrapped
    return api(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^
  File "/home/sina/.pyenv/versions/ai-service/lib/python3.11/site-packages/elasticsearch/_sync/client/__init__.py", line 3924, in search
    return self.perform_request(  # type: ignore[return-value]
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/sina/.pyenv/versions/ai-service/lib/python3.11/site-packages/elasticsearch/_sync/client/_base.py", line 285, in perform_request
    meta, resp_body = self.transport.perform_request(
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/sina/.pyenv/versions/ai-service/lib/python3.11/site-packages/elastic_transport/_transport.py", line 347, in perform_request
    data = self.serializers.loads(raw_data, meta.mimetype)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/sina/.pyenv/versions/ai-service/lib/python3.11/site-packages/elastic_transport/_serializer.py", line 196, in loads
    return self.get_serializer(mimetype).loads(data)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/sina/.pyenv/versions/ai-service/lib/python3.11/site-packages/elastic_transport/_serializer.py", line 95, in loads
    raise SerializationError(
elastic_transport.SerializationError: Unable to deserialize as JSON: b'{"took":25,"timed_out":false,"_shards":{"total":1,"successful":1,"skipped":0,"failed":0},"hits":{"total":{"value":4,"relation":"eq"},"max_score":1.0,"hits":[{"_index":"products-index","_id":"1","_score":1.0,"_source":{\n    "name": "Samsung S22 Ultra",\n    "price": 975.99,\n    "reviews": ["Good"],\n    "createdAt": 1697145247 // epoch_second\n}},{"_index":"products-index","_id":"pRG9JYsB0QTitqW3X0qT","_score":1.0,"_source":{\n    "name": "iPhone 14 Pro",\n    "price": 1099,\n    "reviews": ["Good"],\n    "createdAt":  1697145249 // epoch_second\n}},{"_index":"products-index","_id":"phG9JYsB0QTitqW3ZEq5","_score":1.0,"_source":{ "name": "Bulk Item 666", "price": 666, "reviews": ["Good"], "createdAt": 1697145250 }},{"_index":"products-index","_id":"pxG9JYsB0QTitqW3ZEq5","_score":1.0,"_source":{ "name": "Bulk Item 123", "price": 123, "reviews": ["Good"], "createdAt": 1697145250 }}]}}'

That said, I tried to dig a bit and noticed that the response.data returned by urlopen in elastic-transport already includes the // epoch_second, which should not be there at that stage.

(Screenshot: response.data in the debugger, already containing the // epoch_second comment.)

The data is returned correctly with a curl request (no // epoch_second in the output):

curl -X GET "localhost:9200/products-index/_search?pretty" -H 'Content-Type: application/json' -d'
{
    "query": {
        "match_all": {}
    }
}
' | less

pquentin commented on June 6, 2024

This is super weird. Since I can't reproduce it myself, I'll need more help from you to debug this. Since curl works, the focus will be on understanding what is different between curl and elasticsearch-py.

Let's start by looking at the headers.

It would also be helpful to get a packet capture for each, using tcpdump/Wireshark, but we can do that as a second step.

usersina commented on June 6, 2024

Sure! I'll also be explicit about what I do:

1. Debugging elastic_transport

  • Given the following
import elastic_transport
from elasticsearch import Elasticsearch

es = Elasticsearch(["http://localhost:9200"])

elastic_transport.debug_logging()

es.search(
    index="products-index",
    query={"match_all": {}},
    typed_keys=True,
)
  • Here's the output
[2023-11-02T10:53:22] > POST /products-index/_search?typed_keys=true HTTP/1.1
> Accept: application/vnd.elasticsearch+json; compatible-with=8
> Connection: keep-alive
> Content-Type: application/vnd.elasticsearch+json; compatible-with=8
> User-Agent: elasticsearch-py/8.10.0 (Python/3.11.5; elastic-transport/8.4.1)
> X-Elastic-Client-Meta: es=8.10.0,py=3.11.5,t=8.4.1,ur=1.26.17
> {"query":{"match_all":{}}}
< HTTP/1.1 200 OK
< Transfer-Encoding: chunked
< X-Elastic-Product: Elasticsearch
< Content-Type: application/vnd.elasticsearch+json;compatible-with=8
< {"took":2,"timed_out":false,"_shards":{"total":1,"successful":1,"skipped":0,"failed":0},"hits":{"total":{"value":4,"relation":"eq"},"max_score":1.0,"hits":[{"_index":"products-index","_id":"1","_score":1.0,"_source":{
    "name": "Samsung S22 Ultra",
    "price": 975.99,
    "reviews": ["Good"],
    "createdAt": 1697145247 // epoch_second
}},{"_index":"products-index","_id":"pRG9JYsB0QTitqW3X0qT","_score":1.0,"_source":{
    "name": "iPhone 14 Pro",
    "price": 1099,
    "reviews": ["Good"],
    "createdAt":  1697145249 // epoch_second
}},{"_index":"products-index","_id":"phG9JYsB0QTitqW3ZEq5","_score":1.0,"_source":{ "name": "Bulk Item 666", "price": 666, "reviews": ["Good"], "createdAt": 1697145250 }},{"_index":"products-index","_id":"pxG9JYsB0QTitqW3ZEq5","_score":1.0,"_source":{ "name": "Bulk Item 123", "price": 123, "reviews": ["Good"], "createdAt": 1697145250 }}]}}
[2023-11-02T10:53:22] POST http://localhost:9200/products-index/_search?typed_keys=true [status:200 duration:0.014s]
[2023-11-02T10:53:22] POST http://localhost:9200/products-index/_search?typed_keys=true [status:N/A duration:0.014s]
Traceback (most recent call last):
  File "/home/sina/.pyenv/versions/ai-service/lib/python3.11/site-packages/elastic_transport/_serializer.py", line 93, in loads
    return self.json_loads(data)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/home/sina/.pyenv/versions/ai-service/lib/python3.11/site-packages/elastic_transport/_serializer.py", line 89, in json_loads
    return json.loads(data)
           ^^^^^^^^^^^^^^^^
  File "/home/sina/.pyenv/versions/3.11.5/lib/python3.11/json/__init__.py", line 346, in loads
    return _default_decoder.decode(s)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/sina/.pyenv/versions/3.11.5/lib/python3.11/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/sina/.pyenv/versions/3.11.5/lib/python3.11/json/decoder.py", line 353, in raw_decode
    obj, end = self.scan_once(s, idx)
               ^^^^^^^^^^^^^^^^^^^^^^
json.decoder.JSONDecodeError: Expecting ',' delimiter: line 5 column 29 (char 325)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/sina/Documents/github/sinaapps/datavacuum-old/ai-service/src/test.py", line 8, in <module>
    es.search(
  File "/home/sina/.pyenv/versions/ai-service/lib/python3.11/site-packages/elasticsearch/_sync/client/utils.py", line 414, in wrapped
    return api(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^
  File "/home/sina/.pyenv/versions/ai-service/lib/python3.11/site-packages/elasticsearch/_sync/client/__init__.py", line 3924, in search
    return self.perform_request(  # type: ignore[return-value]
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/sina/.pyenv/versions/ai-service/lib/python3.11/site-packages/elasticsearch/_sync/client/_base.py", line 285, in perform_request
    meta, resp_body = self.transport.perform_request(
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/sina/.pyenv/versions/ai-service/lib/python3.11/site-packages/elastic_transport/_transport.py", line 347, in perform_request
    data = self.serializers.loads(raw_data, meta.mimetype)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/sina/.pyenv/versions/ai-service/lib/python3.11/site-packages/elastic_transport/_serializer.py", line 196, in loads
    return self.get_serializer(mimetype).loads(data)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/sina/.pyenv/versions/ai-service/lib/python3.11/site-packages/elastic_transport/_serializer.py", line 95, in loads
    raise SerializationError(
elastic_transport.SerializationError: Unable to deserialize as JSON: b'{"took":2,"timed_out":false,"_shards":{"total":1,"successful":1,"skipped":0,"failed":0},"hits":{"total":{"value":4,"relation":"eq"},"max_score":1.0,"hits":[{"_index":"products-index","_id":"1","_score":1.0,"_source":{\n    "name": "Samsung S22 Ultra",\n    "price": 975.99,\n    "reviews": ["Good"],\n    "createdAt": 1697145247 // epoch_second\n}},{"_index":"products-index","_id":"pRG9JYsB0QTitqW3X0qT","_score":1.0,"_source":{\n    "name": "iPhone 14 Pro",\n    "price": 1099,\n    "reviews": ["Good"],\n    "createdAt":  1697145249 // epoch_second\n}},{"_index":"products-index","_id":"phG9JYsB0QTitqW3ZEq5","_score":1.0,"_source":{ "name": "Bulk Item 666", "price": 666, "reviews": ["Good"], "createdAt": 1697145250 }},{"_index":"products-index","_id":"pxG9JYsB0QTitqW3ZEq5","_score":1.0,"_source":{ "name": "Bulk Item 123", "price": 123, "reviews": ["Good"], "createdAt": 1697145250 }}]}}'

2. Verbose curl

curl -v -X GET "localhost:9200/products-index/_search?pretty" -H 'Content-Type: application/json' -d'
{
    "query": {
        "match_all": {}
    }
}
'
*   Trying [::1]:9200...
* Connected to localhost (::1) port 9200
> GET /products-index/_search?pretty HTTP/1.1
> Host: localhost:9200
> User-Agent: curl/8.4.0
> Accept: */*
> Content-Type: application/json
> Content-Length: 50
>
< HTTP/1.1 200 OK
< X-elastic-product: Elasticsearch
< content-type: application/json
< Transfer-Encoding: chunked
<
{
  "took" : 2,
  "timed_out" : false,
  "_shards" : {
    "total" : 1,
    "successful" : 1,
    "skipped" : 0,
    "failed" : 0
  },
  "hits" : {
    "total" : {
      "value" : 4,
      "relation" : "eq"
    },
    "max_score" : 1.0,
    "hits" : [
      {
        "_index" : "products-index",
        "_id" : "1",
        "_score" : 1.0,
        "_source" : {
          "name" : "Samsung S22 Ultra",
          "price" : 975.99,
          "reviews" : [
            "Good"
          ],
          "createdAt" : 1697145247
        }
      },
      {
        "_index" : "products-index",
        "_id" : "pRG9JYsB0QTitqW3X0qT",
        "_score" : 1.0,
        "_source" : {
          "name" : "iPhone 14 Pro",
          "price" : 1099,
          "reviews" : [
            "Good"
          ],
          "createdAt" : 1697145249
        }
      },
      {
        "_index" : "products-index",
        "_id" : "phG9JYsB0QTitqW3ZEq5",
        "_score" : 1.0,
        "_source" : {
          "name" : "Bulk Item 666",
          "price" : 666,
          "reviews" : [
            "Good"
          ],
          "createdAt" : 1697145250
        }
      },
      {
        "_index" : "products-index",
        "_id" : "pxG9JYsB0QTitqW3ZEq5",
        "_score" : 1.0,
        "_source" : {
          "name" : "Bulk Item 123",
          "price" : 123,
          "reviews" : [
            "Good"
          ],
          "createdAt" : 1697145250
        }
      }
    ]
  }
}
* Connection #0 to host localhost left intact

Now, the interesting part: after removing the pretty request param from the curl command, I can already reproduce the problem.

curl -v -X GET "localhost:9200/products-index/_search" -H 'Content-Type: application/json' -d'
{
    "query": {
        "match_all": {}
    }
}
'
*   Trying [::1]:9200...
* Connected to localhost (::1) port 9200
> GET /products-index/_search HTTP/1.1
> Host: localhost:9200
> User-Agent: curl/8.4.0
> Accept: */*
> Content-Type: application/json
> Content-Length: 50
>
< HTTP/1.1 200 OK
< X-elastic-product: Elasticsearch
< content-type: application/json
< Transfer-Encoding: chunked
<
{"took":1,"timed_out":false,"_shards":{"total":1,"successful":1,"skipped":0,"failed":0},"hits":{"total":{"value":4,"relation":"eq"},"max_score":1.0,"hits":[{"_index":"products-index","_id":"1","_score":1.0,"_source":{
    "name": "Samsung S22 Ultra",
    "price": 975.99,
    "reviews": ["Good"],
    "createdAt": 1697145247 // epoch_second
}},{"_index":"products-index","_id":"pRG9JYsB0QTitqW3X0qT","_score":1.0,"_source":{
    "name": "iPhone 14 Pro",
    "price": 1099,
    "reviews": ["Good"],
    "createdAt":  1697145249 // epoch_second
* Connection #0 to host localhost left intact
}},{"_index":"products-index","_id":"phG9JYsB0QTitqW3ZEq5","_score":1.0,"_source":{ "name": "Bulk Item 666", "price": 666, "reviews": ["Good"], "createdAt": 1697145250 }},{"_index":"products-index","_id":"pxG9JYsB0QTitqW3ZEq5","_score":1.0,"_source":{ "name": "Bulk Item 123", "price": 123, "reviews": ["Good"], "createdAt": 1697145250 }}]}}

pquentin commented on June 6, 2024

Interesting. Thanks for being more explicit.

  • Can you reproduce with your own reproducer commands in your original post, starting from a clean cluster? I see you're using different data now. It's also good to have confirmation that you can reproduce with curl.
  • Have you tried passing pretty=True to the client?
  • Also, I don't think typed_keys is relevant here as it only applies to aggregations.

usersina commented on June 6, 2024

It's the same data actually (mostly, I might've added one entry or two). I also tried with a clean cluster by re-running the steps to reproduce above and I still have the same problem with curl.

That said, passing the pretty=True flag actually solves the problem!

from elasticsearch import Elasticsearch

es = Elasticsearch(["http://localhost:9200"])
data = es.search(
    index="products-index",
    query={"match_all": {}},
    pretty=True,
)
print(data)

Like you mentioned, typed_keys is irrelevant.

One other thing I tried is using 8.10.4 instead of 8.10.2, and the problem is gone there too (curl works as intended).

pquentin commented on June 6, 2024

I still have the same problem with curl.

Do you mean that, using the reproducer above, you still get // epoch_second with curl? But the steps above use the Elasticsearch client as a last step. I suppose the best proof would be to reproduce using a Dockerfile; after all, it's only a few curl commands.

One other thing I tried is using 8.10.4 instead of 8.10.2 and the problem is also gone (curl works as intended)

I'm not seeing anything relevant between 8.10.2 and 8.10.4: elastic/elasticsearch@v8.10.2...v8.10.4. Would you mind trying 8.10.3 too?

usersina commented on June 6, 2024

Looks like 8.10.3 isn't available as a Docker image, so I cannot easily verify it without building the image myself.

While trying to set up the curl commands for the containerized environment, I deleted my cluster and re-created it. To my surprise, the issue is not happening anymore, neither on 8.10.2 nor on 8.10.4.

Here's my most up-to-date "reproducer":

# Run elasticsearch (give it some time)
docker run --name elasticsearch -p 9200:9200 -e "discovery.type=single-node" -e "xpack.security.enabled=false" -e "xpack.security.http.ssl.enabled=false" elasticsearch:8.10.2

# Health check
curl -X GET "http://localhost:9200/_cluster/health?wait_for_status=yellow&timeout=50s&pretty"

# Create the index mapping
curl --request PUT \
    --url 'http://localhost:9200/products-index?pretty=' \
    --header 'content-type: application/json' \
    --data '{"settings": {"index.mapping.coerce": false},"mappings": {"properties": {"name": { "type": "text" },"price": { "type": "double" },"reviews": { "type": "text" },"createdAt": { "type": "date", "format": "epoch_second" }},"dynamic": true}}'

# Add some data
curl --request POST \
    --url http://localhost:9200/_bulk \
    --header 'content-type: application/x-ndjson' \
    --data-binary @- <<EOF
{ "index": { "_index": "products-index" } }
{ "name": "Bulk Item 666", "price": 666, "reviews": ["Good"], "createdAt": 1697213529 }
{ "index": { "_index": "products-index" } }
{ "name": "Bulk Item 123", "price": 123, "reviews": ["Good"], "createdAt": 1697213529 }
EOF

# Query the data (pretty)
curl --request GET \
    --url 'http://localhost:9200/products-index/_search?pretty=' \
    --header 'content-type: application/json' \
    --data '{"query": {"match_all": {}}}'

# Query the data
curl --request GET \
    --url 'http://localhost:9200/products-index/_search' \
    --header 'content-type: application/json' \
    --header 'user-agent: vscode-restclient' \
    --data '{"query": {"match_all": {}}}'

# Delete the index
curl --request DELETE \
    --url 'http://localhost:9200/products-index?pretty=' \
    --header 'content-type: application/json'

...is what I would like to say, but I finally found the problem.

Culprit (it was me all along)

It turns out that the REST Client extension was, for whatever reason, sending a // epoch_second comment that I had written a while ago, and Elasticsearch was miraculously accepting it... (Of course, I'm not blaming Elasticsearch for my hiccup, not in the least.)

Here's what the request accepted by elasticsearch looks like in REST Client format:

PUT http://localhost:9200/products-index/_doc/1?pretty
Content-Type: application/json

{
    "name": "Samsung S22 Ultra",
    "price": 975.99,
    "reviews": ["Good"],
    "createdAt": {{$timestamp}} // epoch_second
}

For the record, this exact same "REST Client" code generates the following curl command, which is not accepted by Elasticsearch:

#  Faulty curl command with a bad body
curl --request PUT \
  --url 'http://localhost:9200/products-index/_doc/1?pretty=' \
  --header 'content-type: application/json' \
  --header 'user-agent: vscode-restclient' \
  --data '{"name": "Samsung S22 Ultra","price": 975.99,"reviews": ["Good"],"createdAt": 1699042698 // epoch_second}'

{
  "error" : {
    "root_cause" : [
      {
        "type" : "document_parsing_exception",
...
}
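As a side note for anyone who has already indexed documents with stray // comments: a hypothetical salvage is to strip // line comments from the raw response before decoding. This is a rough sketch only; a naive regex like this would also mangle string values that legitimately contain " //" (such as URLs), so re-indexing clean documents is the real fix.

```python
import json
import re

# Naive cleanup: drop everything from " //" to the end of the line.
# CAUTION: illustrative only; corrupts strings containing " //".
COMMENT = re.compile(rb"\s//[^\n]*")

raw = b'{"createdAt": 1697145247 // epoch_second\n}'
doc = json.loads(COMMENT.sub(b"", raw))
print(doc)  # -> {'createdAt': 1697145247}
```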

@pquentin thanks for following through and sorry about such an anticlimactic ending ^^'

Feel free to close the issue!
