
easypoi's Introduction

How it works

Zhihu: How to fetch all POIs (points of interest) within a specified administrative region in China

Core features

1. Fetch all POIs matching a given keyword within a specified administrative region in China (down to street level). For example, you can fetch every convenience store, mall, supermarket, coffee shop, university, etc. in a city, including longitude/latitude and the province, city, district/county, and street each POI belongs to.

2. Fetch all POIs matching a given keyword within N kilometers of a specified location (an address or a pair of coordinates).

3. Batch mode: combine multiple administrative regions and multiple keywords in one run. For example, you can fetch all supermarkets, malls, and universities in Chengdu, Xi'an, and Shanghai at the same time.
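The features above map onto two Amap Web Service endpoints, one of which (the district API) also appears in the issue tracebacks below; keyword search goes through the place/text API. The sketch below is illustrative only, not the project's actual code: `build_district_params` and `fetch_pois` are hypothetical names, `YOUR_AMAP_KEY` is a placeholder, and parameters should be checked against the Amap Web API documentation.

```python
# Hypothetical sketch of the two Amap Web API calls the tool relies on.
# Not the project's actual code; parameter names follow the Amap docs.
import requests

AMAP_DISTRICT_URL = "https://restapi.amap.com/v3/config/district"
AMAP_PLACE_URL = "https://restapi.amap.com/v3/place/text"


def build_district_params(region_name: str, key: str) -> dict:
    """Query parameters for fetching a region's boundary polyline."""
    return {
        "keywords": region_name,
        "key": key,
        "subdistrict": 3,       # recurse down to street level
        "extensions": "all",    # include the boundary polyline
        "output": "JSON",
    }


def fetch_pois(keyword: str, city: str, key: str, page: int = 1) -> list:
    """Fetch one page of POIs matching `keyword` inside `city`."""
    resp = requests.get(AMAP_PLACE_URL, params={
        "keywords": keyword,
        "city": city,
        "citylimit": "true",    # restrict results to the given city
        "offset": 20,           # results per page
        "page": page,
        "key": key,
    }, timeout=10)
    resp.raise_for_status()
    data = resp.json()
    # By Amap Web API convention, status == "1" means success.
    if data.get("status") != "1":
        raise RuntimeError(f"Amap error: {data.get('info')}")
    return data.get("pois", [])
```

Paging through `fetch_pois(..., page=1), page=2, ...` until an empty list comes back would collect every match for one keyword in one city; the batch mode is just a loop over (region, keyword) pairs.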

Download

https://www.yuque.com/soaringsoul/geotools

If the download link stops working, search for the 人文互联网 official account on WeChat, follow it, and reply "poi" to get the latest download link.

How to run

It is straightforward and self-explanatory, so no further instructions are given here.

Sample output

  • csv (result_csv screenshot)

  • excel (result_excel screenshot)

  • mysql (screenshot)

Contact

If you run into a problem you cannot solve, you can reach me through my personal WeChat official account.

Alternatively, you can file an issue.

License

Apache 2.0 License.

easypoi's People

Contributors

soaringsoul


easypoi's Issues

A connection attempt failed because the connected party did not properly respond after a period of time, or the connected host has failed to respond (WinError 10060)

Traceback (most recent call last):
File "C:\Users\stone\AppData\Local\Programs\Python\Python36\lib\site-packages\urllib3\connectionpool.py", line 600, in urlopen
chunked=chunked)
File "C:\Users\stone\AppData\Local\Programs\Python\Python36\lib\site-packages\urllib3\connectionpool.py", line 354, in _make_request
conn.request(method, url, **httplib_request_kw)
File "C:\Users\stone\AppData\Local\Programs\Python\Python36\lib\http\client.py", line 1239, in request
self._send_request(method, url, body, headers, encode_chunked)
File "C:\Users\stone\AppData\Local\Programs\Python\Python36\lib\http\client.py", line 1285, in _send_request
self.endheaders(body, encode_chunked=encode_chunked)
File "C:\Users\stone\AppData\Local\Programs\Python\Python36\lib\http\client.py", line 1234, in endheaders
self._send_output(message_body, encode_chunked=encode_chunked)
File "C:\Users\stone\AppData\Local\Programs\Python\Python36\lib\http\client.py", line 1026, in _send_output
self.send(msg)
File "C:\Users\stone\AppData\Local\Programs\Python\Python36\lib\http\client.py", line 964, in send
self.connect()
File "C:\Users\stone\AppData\Local\Programs\Python\Python36\lib\site-packages\urllib3\connection.py", line 181, in connect
conn = self._new_conn()
File "C:\Users\stone\AppData\Local\Programs\Python\Python36\lib\site-packages\urllib3\connection.py", line 168, in _new_conn
self, "Failed to establish a new connection: %s" % e)
urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x00000221015CDDD8>: Failed to establish a new connection: [WinError 10060] 由于连接方在一段时间后没有正确答复或连接的主机没有反应,连接尝试失败。

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "C:\Users\stone\AppData\Local\Programs\Python\Python36\lib\site-packages\requests\adapters.py", line 449, in send
timeout=timeout
File "C:\Users\stone\AppData\Local\Programs\Python\Python36\lib\site-packages\urllib3\connectionpool.py", line 638, in urlopen
_stacktrace=sys.exc_info()[2])
File "C:\Users\stone\AppData\Local\Programs\Python\Python36\lib\site-packages\urllib3\util\retry.py", line 399, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='restapi.amap.com', port=80): Max retries exceeded with url: /v3/config/district?keywords=%E4%B8%AD%E7%89%9F%E5%8E%BF&key=19b6a10f26bfc2ef5ff7014d126350f4&subdistrict=3&extensions=all&output=JSON (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x00000221015CDDD8>: Failed to establish a new connection: [WinError 10060] 由于连接方在一段时间后没有正确答复或连接的主机没有反应,连接尝试失败。',))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "C:\Users\stone\AppData\Local\Programs\Python\Python36\lib\site-packages\scrapy\core\engine.py", line 127, in _next_request
request = next(slot.start_requests)
File "D:\AndroidProject\周边商家\周边商家\python\BaiduMapPoiSpider\BaiduMapWebApiSpier\spiders\web_api_spider.py", line 43, in start_requests
poly = get_region_polyline(region_name)
File "D:\AndroidProject\周边商家\周边商家\python\BaiduMapPoiSpider\BaiduMapWebApiSpier\util\geo\amap_get_geopolylines.py", line 20, in get_region_polyline
response = requests.get(url, params=_parms)
File "C:\Users\stone\AppData\Local\Programs\Python\Python36\lib\site-packages\requests\api.py", line 75, in get
return request('get', url, params=params, **kwargs)
File "C:\Users\stone\AppData\Local\Programs\Python\Python36\lib\site-packages\requests\api.py", line 60, in request
return session.request(method=method, url=url, **kwargs)
File "C:\Users\stone\AppData\Local\Programs\Python\Python36\lib\site-packages\requests\sessions.py", line 524, in request
resp = self.send(prep, **send_kwargs)
File "C:\Users\stone\AppData\Local\Programs\Python\Python36\lib\site-packages\requests\sessions.py", line 637, in send
r = adapter.send(request, **kwargs)
File "C:\Users\stone\AppData\Local\Programs\Python\Python36\lib\site-packages\requests\adapters.py", line 516, in send
raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPConnectionPool(host='restapi.amap.com', port=80): Max retries exceeded with url: /v3/config/district?keywords=%E4%B8%AD%E7%89%9F%E5%8E%BF&key=19b6a10f26bfc2ef5ff7014d126350f4&subdistrict=3&extensions=all&output=JSON (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x00000221015CDDD8>: Failed to establish a new connection: [WinError 10060] 由于连接方在一段时间后没有正确答复或连接的主机没有反应,连接尝试失败。',))
2019-07-09 14:28:12 [scrapy.core.engine] INFO: Closing spider (finished)
2019-07-09 14:28:12 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'finish_reason': 'finished',
'finish_time': datetime.datetime(2019, 7, 9, 6, 28, 12, 34469),
'log_count/ERROR': 1,
'log_count/INFO': 7,
'start_time': datetime.datetime(2019, 7, 9, 6, 27, 50, 979586)}
2019-07-09 14:28:12 [scrapy.core.engine] INFO: Spider closed (finished)

After I triggered Baidu's concurrency warning and switched to a new AK, I started getting this connection-failed error. Does this mean I have to pay now, or is something else going wrong?
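For what it's worth, [WinError 10060] in the traceback above is a TCP-level timeout (network, proxy, or firewall), not a quota response: quota and key problems normally come back as an ordinary HTTP response with an error code in the JSON body. A minimal retry wrapper around the failing request might look like the sketch below (`get_with_retry` is a hypothetical helper, not part of the project):

```python
# Hypothetical retry helper: back off and retry on connection failures
# instead of letting a single TCP timeout kill the crawl.
import time
import requests


def get_with_retry(url, params, retries=3, backoff=2.0):
    """GET with simple linear backoff on connection failures."""
    for attempt in range(retries):
        try:
            return requests.get(url, params=params, timeout=10)
        except requests.exceptions.ConnectionError:
            if attempt == retries - 1:
                raise  # out of attempts; surface the original error
            time.sleep(backoff * (attempt + 1))
```

If the error persists across retries and networks, checking whether `restapi.amap.com` is reachable at all from the machine (e.g. in a browser) would separate a local network problem from anything account-related.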

KeyError: 'districts'

2019-07-09 16:45:16 [scrapy.core.engine] INFO: Spider closed (finished)
2019-07-09 16:48:18 [scrapy.utils.log] INFO: Scrapy 1.5.1 started (bot: BaiduMapWebApiSpier)
2019-07-09 16:48:18 [scrapy.utils.log] INFO: Versions: lxml 4.2.5.0, libxml2 2.9.5, cssselect 1.0.3, parsel 1.5.1, w3lib 1.20.0, Twisted 19.2.1, Python 3.6.6 (v3.6.6:4cf1f54eb7, Jun 27 2018, 03:37:03) [MSC v.1900 64 bit (AMD64)], pyOpenSSL 19.0.0 (OpenSSL 1.1.1c 28 May 2019), cryptography 2.7, Platform Windows-10-10.0.17134-SP0
2019-07-09 16:48:18 [scrapy.crawler] INFO: Overridden settings: {'BOT_NAME': 'BaiduMapWebApiSpier', 'LOG_FILE': 'log.txt', 'LOG_LEVEL': 'INFO', 'NEWSPIDER_MODULE': 'BaiduMapWebApiSpier.spiders', 'SPIDER_MODULES': ['BaiduMapWebApiSpier.spiders']}
2019-07-09 16:48:19 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
'scrapy.extensions.telnet.TelnetConsole',
'scrapy.extensions.logstats.LogStats']
2019-07-09 16:48:19 [scrapy.middleware] INFO: Enabled downloader middlewares:
['scrapy.downloadermiddlewares.httpauth.HttpAuthMiddleware',
'scrapy.downloadermiddlewares.downloadtimeout.DownloadTimeoutMiddleware',
'scrapy.downloadermiddlewares.defaultheaders.DefaultHeadersMiddleware',
'scrapy.downloadermiddlewares.useragent.UserAgentMiddleware',
'scrapy.downloadermiddlewares.retry.RetryMiddleware',
'scrapy.downloadermiddlewares.redirect.MetaRefreshMiddleware',
'scrapy.downloadermiddlewares.httpcompression.HttpCompressionMiddleware',
'scrapy.downloadermiddlewares.redirect.RedirectMiddleware',
'scrapy.downloadermiddlewares.cookies.CookiesMiddleware',
'scrapy.downloadermiddlewares.httpproxy.HttpProxyMiddleware',
'scrapy.downloadermiddlewares.stats.DownloaderStats']
2019-07-09 16:48:19 [scrapy.middleware] INFO: Enabled spider middlewares:
['scrapy.spidermiddlewares.httperror.HttpErrorMiddleware',
'scrapy.spidermiddlewares.offsite.OffsiteMiddleware',
'scrapy.spidermiddlewares.referer.RefererMiddleware',
'scrapy.spidermiddlewares.urllength.UrlLengthMiddleware',
'scrapy.spidermiddlewares.depth.DepthMiddleware']
2019-07-09 16:48:19 [scrapy.middleware] INFO: Enabled item pipelines:
['BaiduMapWebApiSpier.pipelines.BaidumapwebapispierPipeline']
2019-07-09 16:48:19 [scrapy.core.engine] INFO: Spider opened
2019-07-09 16:48:19 [scrapy.extensions.logstats] INFO: Crawled 0 pages (at 0 pages/min), scraped 0 items (at 0 items/min)
2019-07-09 16:48:19 [scrapy.core.engine] ERROR: Error while obtaining start requests
Traceback (most recent call last):
File "C:\Users\stone\AppData\Local\Programs\Python\Python36\lib\site-packages\scrapy\core\engine.py", line 127, in _next_request
request = next(slot.start_requests)
File "D:\AndroidProject\周边商家\周边商家\python\BaiduMapPoiSpider\BaiduMapWebApiSpier\spiders\web_api_spider.py", line 43, in start_requests
poly = get_region_polyline(region_name)
File "D:\AndroidProject\周边商家\周边商家\python\BaiduMapPoiSpider\BaiduMapWebApiSpier\util\geo\amap_get_geopolylines.py", line 23, in get_region_polyline
polyline = data['districts'][0]['polyline']
KeyError: 'districts'
2019-07-09 16:48:19 [scrapy.core.engine] INFO: Closing spider (finished)
2019-07-09 16:48:19 [scrapy.statscollectors] INFO: Dumping Scrapy stats:
{'finish_reason': 'finished',
'finish_time': datetime.datetime(2019, 7, 9, 8, 48, 19, 883932),
'log_count/ERROR': 1,
'log_count/INFO': 7,
'start_time': datetime.datetime(2019, 7, 9, 8, 48, 19, 785197)}
2019-07-09 16:48:19 [scrapy.core.engine] INFO: Spider closed (finished)

Only city-level regions work; district/county-level queries fail and I don't know why. I tried Chinese names and different encodings. Could the problem be in region_name_list?
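The `KeyError: 'districts'` arises because `data['districts'][0]['polyline']` is indexed without checking the response first: when the key is invalid, the quota is exhausted, or the region name matches nothing, the Amap district API returns an error JSON that has no `districts` field. A defensive version might look like this sketch (`extract_polyline` is a hypothetical helper, not the repo's `amap_get_geopolylines.py`):

```python
# Hypothetical defensive parse of an Amap district-API response: turn
# missing-key situations into readable errors instead of KeyError.
def extract_polyline(data: dict) -> str:
    """Return the boundary polyline from a district-API response dict."""
    # By Amap convention, status == "1" means success; otherwise `info`
    # carries the reason (e.g. INVALID_USER_KEY, DAILY_QUERY_OVER_LIMIT).
    if data.get("status") != "1":
        raise RuntimeError(f"district query failed: {data.get('info', 'unknown')}")
    districts = data.get("districts") or []
    if not districts or "polyline" not in districts[0]:
        raise RuntimeError("no boundary polyline returned; check the region name")
    return districts[0]["polyline"]
```

With a check like this, a failing district/county query would report the actual API error, which is usually enough to tell a bad region name in region_name_list apart from a key or quota problem.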

error with "Unhandled error in deferred"

@xugongli I'm hitting this error. I installed dependencies with pip install requirements.txt (note: requirements files need the -r flag, i.e. pip install -r requirements.txt) and reinstalled pywin32, but the error persists.
Also: the Baidu AK and Amap (Gaode) AK have been re-set, and MySQL is configured.
@xugongli Any help would be greatly appreciated!
