symerio / pgeocode
Postal code geocoding and distance calculation
Home Page: https://pgeocode.readthedocs.io/
License: BSD 3-Clause "New" or "Revised" License
Issues #40 and #34 show that the GeoNames data source we rely on has intermittent availability problems. To protect against such failures, could we add a fallback data source or a cache? Both issues were short-lived and resolved on their own.
In issue #40, @richunger pointed out that the data was still available at http://download.geonames.org/export/dump/ while the outage lasted; could we rely on that as a backup data source?
NL and FR are working fine, but trying other countries gives me this error:
KeyError: "There is no item named 'DE.txt' in the archive"
But they are still accessible: https://symerio.github.io/postal-codes-data/data/geonames/DE.txt
Hi,
Is it possible to add support for postal codes in Morocco?
Right now, I'm getting this error when I try to use the country code 'MA' with the Python package:
country=MA is not a known country code. See the README for a list of supported countries
I noticed that the zip codes for 'MA' are available in GeoNames.
Thank you for this package!
Hi,
Useful package. It seems the GeoNames folks have published some more countries:
AZ
CL
CY
EE
FM
HT
KR
MW
PE
PW
RS
SG
Can you update, please?
Recently the download location changed to "https://download.geonames.org/export/dump/{country}.zip".
Simply changing the URL doesn't work: the file that then ends up in ~/pgeocode_data/ looks incorrect, while downloading the file manually or with wget does fetch the right data.
I'm not sure what is going on; either the server serves cached data based on the user agent, or some incorrect caching is happening on the client side.
In any case this is a blocker for pgeocode.
The link https://download.geonames.org/export/zip/{country}.zip is throwing a 404 error.
On Windows 10 with Python 3.9.7, the quickstart example crashes at
nomi = pgeocode.Nominatim('fr')
with the error
urllib.error.URLError: <urlopen error [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed: certificate has expired (_ssl.c:1129)>
This is probably caused by the expired Let's Encrypt root certificate.
A quick fix is to either remove the first URL on line 25 of pgeocode.py, or to turn off SSL verification on line 141:
import ssl
with urllib.request.urlopen(url, context=ssl._create_unverified_context()) as fh:
    ...
There are some postal codes where the latitude and longitude estimated from the postal code fall in the sea, and other instances where the postal code is placed in another city. Here are sample postal codes where this happens:
Pincode latitude longitude
400069 18.9596 72.8604
400079 18.9596 72.88815
400054 18.9292 72.8856
395003 21.2659 72.55363636
395005 21.2218 72.28558182
395001 21.1941 72.5018
Hi,
This is the data fetched from pgeocode for the 80030 postal code:
postal_code 80030
country_code IT
place_name Spartimento, Gallo, Cimitile, Carbonara Di Nol...
state_name Campania
state_code 4
county_name Napoli
county_code NaN
community_name NaN
community_code NaN
latitude 40.926694
longitude 14.520376
accuracy 4.0
This is the raw from GeoNames:
IT 80030 Spartimento Campania 04 Napoli NA 40.891 14.4724 4
IT 80030 Gallo Campania 04 Napoli NA 40.9424 14.5455 4
IT 80030 Cimitile Campania 04 Napoli NA 40.9423 14.5256 4
IT 80030 Carbonara Di Nola Campania 04 Napoli NA 40.8744 14.5788 4
IT 80030 Camposano Campania 04 Napoli NA 40.9524 14.5299 4
IT 80030 San Vitaliano Campania 04 Napoli NA 40.9244 14.4746 4
IT 80030 Castello Di Cisterna Campania 04 Napoli NA 40.9159 14.4112 4
IT 80030 Mariglianella Campania 04 Napoli NA 40.9296 14.437 4
IT 80030 Visciano Campania 04 Napoli NA 40.9238 14.5824 4
IT 80030 Tufino Campania 04 Napoli NA 40.9559 14.5657 4
IT 80030 Liveri Campania 04 Napoli NA 40.9041 14.5654 4
IT 80030 San Paolo Bel Sito Campania 04 Napoli NA 40.9135 14.5486 4
IT 80030 Comiziano Campania 04 Napoli NA 40.9516 14.5512 4
IT 80030 Scisciano Campania 04 Napoli NA 40.9066 14.4745 4
IT 80030 Roccarainola Campania 04 Napoli NA 40.9725 14.5428 4
IT 80030 Schiava Campania 04 Napoli NA 40.9267 14.5204 3
IT 80030 Gargani Campania 04 Napoli NA 40.9267 14.5204 3
The data in the GeoNames dataset is correct; I assume the library is misinterpreting NA as NaN.
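This is consistent with pandas' default NA handling: read_csv treats the literal string "NA" as a missing value unless told otherwise. A minimal reproduction follows; the two-column sample is illustrative, not pgeocode's actual parsing code:

```python
import io
import pandas as pd

# Two columns mimicking the GeoNames TSV, where "NA" is the province
# abbreviation for Napoli rather than a missing value.
raw = "country_code\tcounty_code\nIT\tNA\n"

# Default behaviour: the "NA" token is parsed as missing.
default = pd.read_csv(io.StringIO(raw), sep="\t")
assert default["county_code"].isna().iloc[0]

# Fix: disable the default NA tokens and only treat empty fields as NA.
fixed = pd.read_csv(io.StringIO(raw), sep="\t",
                    keep_default_na=False, na_values=[""])
assert fixed["county_code"].iloc[0] == "NA"
```

Passing keep_default_na=False with an explicit na_values list is the standard pandas way to keep "NA" as a literal string.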
Hello,
I just used the library, but I noticed that the distance returned was incorrect.
dist = pgeocode.GeoDistance('PH')
dist.query_postal_code('0860', '1227')
returns 76 km. I am sure that the distance is not 76 km. The cause may be the latitude/longitude data: when I googled the coordinates returned for 1227, the location was in a very distant province.
I have a set of zip codes that I need to resolve to countries. To do so, I plan to query each Nominatim and return any country where a match is found.
While trying to instantiate each Nominatim, I received an error:
import pgeocode
nomis = {cc: pgeocode.Nominatim(cc) for cc in pgeocode.COUNTRIES_VALID}
When investigating:
for cc in pgeocode.COUNTRIES_VALID:
    try:
        pgeocode.Nominatim(cc)
    except Exception as ex:
        print(cc, ex)
I get:
AS No numeric types to aggregate
LT sequence item 1: expected str instance, float found
VA No numeric types to aggregate
I suggest either investigating these, or dropping them from the valid countries list. Thanks!
While playing with the code, I noticed that for some German postal codes the library returns no information (latitude, longitude, state_name, etc.). I checked them on Google Maps and found that they are indeed valid postal codes.
Here is the list of the postal codes with null results:
Thank you in advance for your support!
Bests,
Amin
I have noticed that with pgeocode v0.2.1, requests to get locations based on Brazilian postal codes are not working. I have tried with and without the hyphen, but it still returns an empty value.
It should be fairly straightforward to implement inverse geocoding, i.e. find the closest city/region/country for given coordinates.
This would require downloading all the datasets however.
Ideally, the nearest neighbor lookup could be done efficiently with a KD-tree structure (e.g. in scipy). Though scipy is a fairly heavy dependency so maybe this could be an optional feature if scipy is installed.
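As a sketch of the idea, a linear-scan nearest-neighbour lookup over a tiny hand-made index is shown below. The real implementation would load the full GeoNames table and build a KD-tree (e.g. scipy.spatial.cKDTree); the sample places and coordinates here are approximate illustrations only:

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in km."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * asin(sqrt(a))

# Tiny hand-made index with approximate coordinates; a real version
# would scan the full dataset or, better, query a KD-tree.
PLACES = [
    ("75013", "Paris 13e", 48.83, 2.36),
    ("69001", "Lyon 1er", 45.77, 4.83),
    ("13001", "Marseille 1er", 43.30, 5.38),
]

def reverse_lookup(lat, lon, places=PLACES):
    """Return the indexed place closest to (lat, lon)."""
    return min(places, key=lambda p: haversine_km(lat, lon, p[2], p[3]))
```

For example, reverse_lookup(48.85, 2.35) returns the Paris entry. The linear scan is O(n) per query, which is exactly what a KD-tree would bring down to O(log n).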
Super easy fix, will save tons of dependency headaches.
pip freeze > requirements.txt
Regarding the query_postal_code function:
As seen in the code below, the actual postal code I'm searching for is 'K2C', but in order to find it I had to insert '5CA' in front of it. My understanding is that the 5 represents the accuracy and CA is the country code.
Why do these first three characters need to be added? Am I right about what they represent?
nomi = pgeocode.Nominatim('CA')
nomi.query_postal_code("5CA K2C")
thanks
Hi,
The query_location function is not working at all. I found a website with background information for pgeocode 0.1.0, and it seems the function body is just pass.
My pgeocode version is 0.2.1.
Given a location and a distance, what's the right way to find populated places that are exactly that distance from the location?
I don't want to search for everything inside a circle, just what is on its perimeter.
Thanks!
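Since exact perimeter matches are vanishingly rare with real coordinates, one practical interpretation is an annulus query: keep the places whose distance from the centre lies within some tolerance of the target radius. A rough sketch follows; the (name, lat, lon) tuples and the tolerance parameter are hypothetical, and this is not an existing pgeocode API:

```python
from math import asin, cos, radians, sin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in km."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371.0 * asin(sqrt(a))

def on_perimeter(center, places, radius_km, tol_km=1.0):
    """Keep (name, lat, lon) places whose distance from `center` lies
    within tol_km of radius_km, i.e. inside a thin ring."""
    lat0, lon0 = center
    return [p for p in places
            if abs(haversine_km(lat0, lon0, p[1], p[2]) - radius_km) <= tol_km]
```

Choosing tol_km trades off between missing places (too tight) and degenerating into a disc query (too loose).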
Hello ,
I am using pgeocode to fetch longitude and latitude from postal codes. I have been using this package for the last few months and keep running into HTTP errors; this now happens more frequently than in the past, and it is hampering my pipeline runs as well. Could you please suggest a solution or workaround for this problem?
Thanks and regards,
nikhil
It might be useful to support alternate download locations in case the GeoNames website goes down. This would also help reproducibility (I'm not sure how often the GeoNames database is updated, or whether that is tracked somewhere).
This would require storing the data somewhere. One possibility for free hosting could be to attach it to Github releases.
For instance, maybe @zaro's implementation in zaro@6a3c743 could be adapted.
I want to use the 'KR' country code, but this package does not offer it by default.
How can I query the 'KR' country code?
The new template is https://download.geonames.org/export/dump/{country}.zip
On the first run, whenever trying to download a new country, you get this error.
A solution has been provided.
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"
[packages]
pgeocode = '>=0.1.0'
[dev-packages]
[requires]
python_version = "3.7"
Traceback (most recent call last):
File ".../geocode.py", line 3, in <module>
nomi = pgeocode.Nominatim('FR')
File ".../lib/python3.7/site-packages/pgeocode.py", line 55, in __init__
self._data_path, self._data = self._get_data(country)
File ".../lib/python3.7/site-packages/pgeocode.py", line 71, in _get_data
reader, encoding, compression = get_filepath_or_buffer(url)
ValueError: too many values to unpack (expected 3)
File: pgeocode.py
- reader, encoding, compression = get_filepath_or_buffer(url)
+ reader, encoding, compression, *rest = get_filepath_or_buffer(url)
There seems to be some inconsistency with how to access data returned by the query_postal_code() function in addition to some errors thrown.
pgeocode==0.2.1
pandas==1.0.3
Here is the creation of the object with a postal code
Python 3.6.8 (default, Apr 25 2019, 21:02:35)
[GCC 4.8.5 20150623 (Red Hat 4.8.5-36)] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import pgeocode
>>> code = "M4B 1B3"
>>> loc = pgeocode.Nominatim('ca')
>>> s = loc.query_postal_code(code)
>>> print(s)
postal_code M4B
country code CA
place_name East York (Parkview Hill / Woodbine Gardens)
state_name Ontario
state_code ON
county_name East York
county_code NaN
community_name NaN
community_code NaN
latitude 43.7063
longitude -79.3094
accuracy 6
Name: 0, dtype: object
Now I will try to access s.country_code and get an error
>>> print(s.country_code)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib64/python3.6/site-packages/pandas/core/generic.py", line 5274, in __getattr__
return object.__getattribute__(self, name)
AttributeError: 'Series' object has no attribute 'country_code'
And now another field with the same underscore styled name
>>> print(s.postal_code)
M4B
Now using the manner that is often seen with Pandas
>>> print(s['country code'])
CA
But when I try this for postal code it throws a different error
>>> print(s['postal code'])
Traceback (most recent call last):
File "/usr/local/lib64/python3.6/site-packages/pandas/core/indexes/base.py", line 4411, in get_value
return libindex.get_value_at(s, key)
File "pandas/_libs/index.pyx", line 44, in pandas._libs.index.get_value_at
File "pandas/_libs/index.pyx", line 45, in pandas._libs.index.get_value_at
File "pandas/_libs/util.pxd", line 98, in pandas._libs.util.get_value_at
File "pandas/_libs/util.pxd", line 83, in pandas._libs.util.validate_indexer
TypeError: 'str' object cannot be interpreted as an integer
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/lib64/python3.6/site-packages/pandas/core/series.py", line 871, in __getitem__
result = self.index.get_value(self, key)
File "/usr/local/lib64/python3.6/site-packages/pandas/core/indexes/base.py", line 4419, in get_value
raise e1
File "/usr/local/lib64/python3.6/site-packages/pandas/core/indexes/base.py", line 4405, in get_value
return self._engine.get_value(s, k, tz=getattr(series.dtype, "tz", None))
File "pandas/_libs/index.pyx", line 80, in pandas._libs.index.IndexEngine.get_value
File "pandas/_libs/index.pyx", line 90, in pandas._libs.index.IndexEngine.get_value
File "pandas/_libs/index.pyx", line 138, in pandas._libs.index.IndexEngine.get_loc
File "pandas/_libs/hashtable_class_helper.pxi", line 1618, in pandas._libs.hashtable.PyObjectHashTable.get_item
File "pandas/_libs/hashtable_class_helper.pxi", line 1626, in pandas._libs.hashtable.PyObjectHashTable.get_item
KeyError: 'postal code'
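One plausible explanation for the asymmetry above is that pandas attribute access only works when the index label is a valid Python identifier, so a label containing a space (as the pasted output suggests for 'country code') requires bracket access with exactly that string. A standalone reproduction with a plain Series, using illustrative labels:

```python
import pandas as pd

# A small Series mimicking the pasted query_postal_code output; one label
# uses an underscore, the other a space, which is enough to reproduce the
# asymmetry between attribute access and bracket access.
s = pd.Series({"postal_code": "M4B", "country code": "CA"})

print(s.postal_code)      # works: "postal_code" is a valid identifier
print(s["country code"])  # labels with spaces need bracket access
# Conversely, s.country_code raises AttributeError and s["postal code"]
# raises KeyError, because neither label exists in the index.
```

So both errors shown above are consistent with the underlying index labels simply not matching the names being accessed.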
The full variant is in the geonames : http://download.geonames.org/export/zip/
Can this be added?
Kudos for the algorithm, which makes work easier.
But there are a few postcodes in Canada that the algorithm is not able to identify, even though these codes exist in the GeoNames files.
M5E 0C5 is one such code for which I couldn't get the latitude and longitude, or a distance.
Hi,
I am querying latitude and longitude from the postcodes of 26 countries. In a test run with 10 rows per country, Portugal (PT) takes much longer to process: all other countries run in 1-3 seconds, while Portugal takes 30-40 seconds.
I'm not very experienced in Python, so my code is probably not the most efficient, but it is odd that one country stands out (suggestions to improve the code below are most welcome).
def get_city(code):
    try:
        return nomi.query_postal_code(code)
    except Exception:
        return ''

for country_code in np.unique(df.COUNTRY_VALUE):
    print(country_code, len(df[df.COUNTRY_VALUE == country_code]))
    start_time = time.time()
    nomi = pgeocode.Nominatim(country_code)
    df.loc[df.COUNTRY_VALUE == country_code, 'LAT'] = df.POSTCODE.apply(get_city).latitude
    df.loc[df.COUNTRY_VALUE == country_code, 'LON'] = df.POSTCODE.apply(get_city).longitude
    print("--- %s seconds ---" % (time.time() - start_time))
Currently we load datasets with pd.read_csv from gzipped CSV. Loading should be much faster if we converted the data to Parquet and used pd.read_parquet (this might also reduce download sizes when using e.g. snappy compression).
The limitation of this approach is that the datasets would need to be hosted somewhere, and a new dependency (pyarrow) would need to be added. I'm not sure that it would be worth it.
Hi @rth! :3 Thanks for your great work and for creating such a useful library. I deeply appreciate your hard work! :3
I love using it, but the dependencies used underneath, like pandas and numpy, can sometimes be problematic. They are quite heavy and can be a bit troublesome to install on Apple devices with M* chips. I was thinking about an alternative that would provide an identical API, but use built-in functionality/libraries such as csv or math instead.
I believe it could be beneficial for many workflows and projects. What do you think about such an idea? :D
The GB dataset for Great Britain only includes outward codes (i.e. only the first half of each postcode). The full dataset is included in GB_full, but currently this fails to load:
>>> import pgeocode
>>> dist = pgeocode.GeoDistance('GB_full')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/rth/src/pgeocode/pgeocode.py", line 174, in __init__
super(GeoDistance, self).__init__(country)
File "/home/rth/src/pgeocode/pgeocode.py", line 49, in __init__
.format(country))
ValueError: country=GB_FULL is not a known country code. See the README for a list of supported countries
Making sure input validation handles this case should fix it, I think.
AFAIK it's the same issue as #40
Hello,
New guy here, using the pgeocode lib for one of my projects. I have just noticed what may be a small bug while generating the -index.txt file, which shifts the mean location for a postcode a bit.
Can anyone please double-check that the latitude is calculated properly? I can see a possible typo in the valid_keys derivation, where latitude is spelled with two Ts.
Location: https://github.com/symerio/pgeocode/blob/main/pgeocode.py#L255
Occurs for PL post codes, e.g. 21-310 or 98-330
I think it's a great package. I am trying to use the package for Canadian postal codes.
query_postal_code averages the longitude over all matches.
nomi.query_postal_code("41-800")
This is from the GeoNames file; 41-800 returns 2 locations:
PL, 41-800, 50.2817, 18.6745
PL,41-800, 50.3055, 18.778
After running: nomi.query_postal_code("41-800")
postal_code 41-800
place_name Gliwice, Zabrze
latitude 50.2817
longitude 18.7263
so the returned longitude = sum of the locations in the file / number of results, i.e. their mean.
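The reported longitude is indeed the arithmetic mean of the two GeoNames rows quoted above, which supports this description (a quick check, independent of pgeocode):

```python
# The two GeoNames longitudes quoted above for PL 41-800.
longitudes = [18.6745, 18.778]

# With several places sharing one postal code, the returned value looks
# like their arithmetic mean:
mean_lon = sum(longitudes) / len(longitudes)  # ~18.72625, reported as 18.7263
```

Averaging coordinates is a reasonable default for a shared postal code, but as this issue shows it can land between the actual places.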
First off, this is a great package which has really helped me out! I have encountered a bug with CA, UK and IE postcodes. The error I am getting is:
ValueError Traceback (most recent call last)
<ipython-input-1174-3c5ae744b055> in <module>
----> 1 import codecs, os;__pyfile = codecs.open('''/tmp/pyjoGiOo''', encoding='''utf-8''');__code = __pyfile.read().encode('''utf-8''');__pyfile.close();os.remove('''/tmp/pyjoGiOo''');exec(compile(__code, '''/home/Downloads//Data_Cleaning/cleaning.py''', 'exec'));
~/Downloads//Data_Cleaning/cleaning.py in <module>
500 lookup.columns =
501
--> 502
503
504 df1_missing_city = df1.groupby('Target country code').apply(lambda x: len(x[x['Target city'].isnull() == True]) / len(x) * 100)
~/anaconda3/lib/python3.8/site-packages/pandas/core/series.py in apply(self, func, convert_dtype, args, **kwds)
4198 else:
4199 values = self.astype(object)._values
-> 4200 mapped = lib.map_infer(values, f, convert=convert_dtype)
4201
4202 if len(mapped) and isinstance(mapped[0], Series):
pandas/_libs/lib.pyx in pandas._libs.lib.map_infer()
~/Downloads//Data_Cleaning/cleaning.py in <lambda>(x)
500 lookup.columns =
501
--> 502
503
504 df1_missing_city = df1.groupby('Target country code').apply(lambda x: len(x[x['Target city'].isnull() == True]) / len(x) * 100)
~/Downloads//Data_Cleaning/cleaning.py in postcode_lookup(postcode, form)
489 'VA',
490 'VI',
--> 491 'WF',
492 'YT',
493 'ZA']
~/anaconda3/lib/python3.8/site-packages/pgeocode.py in query_postal_code(self, codes)
305
306 codes = self._normalize_postal_code(codes)
--> 307 response = pd.merge(
308 codes, self._data_frame, on="postal_code", how="left"
309 )
~/anaconda3/lib/python3.8/site-packages/pandas/core/reshape/merge.py in merge(left, right, how, on, left_on, right_on, left_index, right_index, sort, suffixes, copy, indicator, validate)
72 validate=None,
73 ) -> "DataFrame":
---> 74 op = _MergeOperation(
75 left,
76 right,
~/anaconda3/lib/python3.8/site-packages/pandas/core/reshape/merge.py in __init__(self, left, right, how, on, left_on, right_on, axis, left_index, right_index, sort, suffixes, copy, indicator, validate)
654 # validate the merge keys dtypes. We may need to coerce
655 # to avoid incompatible dtypes
--> 656 self._maybe_coerce_merge_keys()
657
658 # If argument passed to validate,
~/anaconda3/lib/python3.8/site-packages/pandas/core/reshape/merge.py in _maybe_coerce_merge_keys(self)
1163 inferred_right in string_types and inferred_left not in string_types
1164 ):
-> 1165 raise ValueError(msg)
1166
1167 # datetimelikes must match exactly
ValueError: You are trying to merge on float64 and object columns. If you wish to proceed you should use pd.concat
I have traced this error back as far as _normalize_postal_code, which seems to be causing the merge error later in the query_postal_code function, although I haven't been able to figure out what it is about splitting these postcodes that upsets pandas so much.
Is there any plan for the query_location method? I think it would be great to have it. Cheers
The US file no longer exists http://download.geonames.org/export/zip/US.zip and the code throws a 404 when trying to download it.
HTTPError
HTTP Error 404: Not Found
I have pip installed pgeocode with the --user flag as I am on a shared server and lack permissions to write in most places except my home. As a test, I use this demo code:
import pgeocode
nomi = pgeocode.Nominatim('fr')
nomi.query_postal_code("75013")
Is this error below telling me it can't download the geonames database I've asked for?
Thanks.
HTTPError Traceback (most recent call last)
<ipython-input-17-14e76cf7e5d5> in <module>
1 import pgeocode
----> 2 nomi = pgeocode.Nominatim('fr')
3 nomi.query_postal_code("75013")
~/.local/lib/python3.7/site-packages/pgeocode.py in __init__(self, country)
53 "in 1999.")
54 self.country = country
---> 55 self._data_path, self._data = self._get_data(country)
56 self._data_unique = self._index_postal_codes()
57
~/.local/lib/python3.7/site-packages/pgeocode.py in _get_data(country)
69 url = DOWNLOAD_URL.format(country=country)
70 compression = _infer_compression(url, "zip")
---> 71 reader, encoding, compression = get_filepath_or_buffer(url)[:3]
72 with ZipFile(reader) as fh_zip:
73 with fh_zip.open(country.upper() + '.txt') as fh:
/opt/anacondaShell/envs/ScientificPython3Stack/lib/python3.7/site-packages/pandas/io/common.py in get_filepath_or_buffer(filepath_or_buffer, encoding, compression, mode)
200
201 if _is_url(filepath_or_buffer):
--> 202 req = _urlopen(filepath_or_buffer)
203 content_encoding = req.headers.get('Content-Encoding', None)
204 if content_encoding == 'gzip':
/opt/anacondaShell/envs/ScientificPython3Stack/lib/python3.7/urllib/request.py in urlopen(url, data, timeout, cafile, capath, cadefault, context)
220 else:
221 opener = _opener
--> 222 return opener.open(url, data, timeout)
223
224 def install_opener(opener):
/opt/anacondaShell/envs/ScientificPython3Stack/lib/python3.7/urllib/request.py in open(self, fullurl, data, timeout)
529 for processor in self.process_response.get(protocol, []):
530 meth = getattr(processor, meth_name)
--> 531 response = meth(req, response)
532
533 return response
/opt/anacondaShell/envs/ScientificPython3Stack/lib/python3.7/urllib/request.py in http_response(self, request, response)
639 if not (200 <= code < 300):
640 response = self.parent.error(
--> 641 'http', request, response, code, msg, hdrs)
642
643 return response
/opt/anacondaShell/envs/ScientificPython3Stack/lib/python3.7/urllib/request.py in error(self, proto, *args)
561 http_err = 0
562 args = (dict, proto, meth_name) + args
--> 563 result = self._call_chain(*args)
564 if result:
565 return result
/opt/anacondaShell/envs/ScientificPython3Stack/lib/python3.7/urllib/request.py in _call_chain(self, chain, kind, meth_name, *args)
501 for handler in handlers:
502 func = getattr(handler, meth_name)
--> 503 result = func(*args)
504 if result is not None:
505 return result
/opt/anacondaShell/envs/ScientificPython3Stack/lib/python3.7/urllib/request.py in http_error_302(self, req, fp, code, msg, headers)
753 fp.close()
754
--> 755 return self.parent.open(new, timeout=req.timeout)
756
757 http_error_301 = http_error_303 = http_error_307 = http_error_302
/opt/anacondaShell/envs/ScientificPython3Stack/lib/python3.7/urllib/request.py in open(self, fullurl, data, timeout)
529 for processor in self.process_response.get(protocol, []):
530 meth = getattr(processor, meth_name)
--> 531 response = meth(req, response)
532
533 return response
/opt/anacondaShell/envs/ScientificPython3Stack/lib/python3.7/urllib/request.py in http_response(self, request, response)
639 if not (200 <= code < 300):
640 response = self.parent.error(
--> 641 'http', request, response, code, msg, hdrs)
642
643 return response
/opt/anacondaShell/envs/ScientificPython3Stack/lib/python3.7/urllib/request.py in error(self, proto, *args)
567 if http_err:
568 args = (dict, 'default', 'http_error_default') + orig_args
--> 569 return self._call_chain(*args)
570
571 # XXX probably also want an abstract factory that knows when it makes
/opt/anacondaShell/envs/ScientificPython3Stack/lib/python3.7/urllib/request.py in _call_chain(self, chain, kind, meth_name, *args)
501 for handler in handlers:
502 func = getattr(handler, meth_name)
--> 503 result = func(*args)
504 if result is not None:
505 return result
/opt/anacondaShell/envs/ScientificPython3Stack/lib/python3.7/urllib/request.py in http_error_default(self, req, fp, code, msg, hdrs)
647 class HTTPDefaultErrorHandler(BaseHandler):
648 def http_error_default(self, req, fp, code, msg, hdrs):
--> 649 raise HTTPError(req.full_url, code, msg, hdrs, fp)
650
651 class HTTPRedirectHandler(BaseHandler):
HTTPError: HTTP Error 401: authenticationrequired
requirements.txt
numpy==1.18.1
pandas==1.0.1
pgeocode==0.1.2
test.py
import pgeocode
nomi = pgeocode.Nominatim('fr')
error:
>\venv\lib\site-packages\pgeocode.py (line 69, in _get_data)
>from pandas.io.common import get_filepath_or_buffer, _infer_compression