
pypistats.org's Issues

API: Provide data by package version

What I'd like to know is: version 1.0.0 of my package foobar was downloaded X times, while version 1.1.0 was downloaded Y times.

This would give an idea of my package's popularity over time, without having to record the download count for every single day since the beginning of time (which is what the API's overall endpoint does, but only for recent history).

(I just discovered this repo. For a long time I wanted to get PyPI download counts without having to manipulate BigQuery myself; http://pypistats.org solved this. Thanks!)

-- Louis
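Until a per-version endpoint exists, the closest workaround is summing the daily counts from the existing /overall endpoint. A minimal sketch, assuming the documented response shape ({"data": [{"category", "date", "downloads"}, …]}); treat that shape as an assumption:

```python
import json
from urllib.request import urlopen


def total_downloads(payload, category="without_mirrors"):
    """Sum the daily download counts for one category from an /overall response."""
    return sum(
        row["downloads"]
        for row in payload["data"]
        if row["category"] == category
    )


# Usage (hits the live API):
# with urlopen("https://pypistats.org/api/packages/foobar/overall") as resp:
#     print(total_downloads(json.load(resp)))
```

This only covers the window the API retains, which is exactly why a per-version breakdown would be useful.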

Data import seems not to have run yesterday?

Hello! You probably already know this, but just in case: it looks like the live site doesn't have any data for 7-28, even though we're almost at the end of 7-29.

(Very cool site by the way!)

502 Bad Gateway

This is probably more of a hosting issue, but I don't know where else to report it and don't have the skills to debug it myself. For roughly the past three to five days (I can't remember exactly when it started, but at least three days ago), pypistats.org has been very slow. Most requests take a very long time to come back; sometimes they time out completely, returning "502 Bad Gateway" with "nginx/1.12.1" underneath.

Non-normalised package name

I would like to use the data to correlate with openSUSE package names, which use the 'real' name supplied in setup.py, i.e. not normalised.

I've been doing a bit of research at hugovk/top-pypi-packages#4. The raw data from BigQuery can include this with a very small performance hit, or it can be added afterwards by doing lookups against PyPI directly, which would be significantly more work.

I'm happy to put in the effort; I just want to know where I should put it, here or somewhere else. If here, would you prefer it to come from BigQuery, or to be added afterwards from PyPI? The latter makes sense if there are other details from PyPI that you believe this project needs which can't be obtained from the BigQuery dataset.

Dependencies listed multiple times

If a dependency is listed more than once across install_requires and extras_require (which is perfectly possible), e.g.,

    install_requires=["numpy"],
    extras_require={
        "all": ["netCDF4", "h5py", "lxml"],
        "exodus": ["netCDF4"],
        "hdf5": ["h5py"],
        "xml": ["lxml"],
    },

then it is listed multiple times in the web interface too, where it makes less sense:

[screenshot: the same dependency listed multiple times in the web UI]

Perhaps this list could be deduplicated.

Perhaps optional (extra) dependencies could be listed as such in the web interface, too.
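A minimal sketch of the deduplication suggested above: flatten install_requires and extras_require, then keep only the first occurrence of each dependency (how pypistats.org actually stores these is an assumption here):

```python
install_requires = ["numpy"]
extras_require = {
    "all": ["netCDF4", "h5py", "lxml"],
    "exodus": ["netCDF4"],
    "hdf5": ["h5py"],
    "xml": ["lxml"],
}


def unique_dependencies(install_requires, extras_require):
    """Flatten all dependency lists, dropping duplicates but preserving order."""
    flat = install_requires + [
        dep for deps in extras_require.values() for dep in deps
    ]
    seen = []
    for dep in flat:
        if dep not in seen:
            seen.append(dep)
    return seen
```

For the example above this yields one entry each for numpy, netCDF4, h5py, and lxml, which is what the web interface would ideally display.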

pypistats.org has no new data since July 19

pypistats.org has had no new data since July 19.

The data for all packages is also zero on April 8, 2020 and on July 18, 2020. These may be related and/or due to the same cause (once that cause is discovered).

Add Mirror Software Graphs / Data Analysis

Hi,

Many thanks for this project! It's cool.

I'm a bandersnatch contributor and I'd love to see a page added for mirroring software and the versions of each respective system in use, namely bandersnatch, as I am biased.

Do you have an idea of how you'd like this done if I (or you) were to implement it? Today I query Google, and a nice graph on your page would be amazing and simplify my life! :D

Getting "429 Client Error: TOO MANY REQUESTS" on <20 requests

I have a small script that queries the pypistats.org API to see how much my packages have been downloaded lately. It makes one request to /api/packages/:project/recent per project, and I only have eighteen projects. I usually only run it once every couple of weeks. However, for the past few weeks, running it has failed with a "429 TOO MANY REQUESTS" error somewhere in the middle of the project list. My best guess is that some form of API rate-limiting was implemented after #20, but the limit seems to be set too low. What can I do to get my script working again?
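A client-side workaround sketch, not an official fix: back off exponentially and retry when the API answers 429. The actual rate-limit threshold is unknown, so the delay schedule below is a guess:

```python
import time
from urllib.error import HTTPError
from urllib.request import urlopen


def backoff_delays(base=1.0, factor=2.0, retries=5):
    """Exponential backoff schedule: 1s, 2s, 4s, 8s, 16s by default."""
    return [base * factor ** i for i in range(retries)]


def get_with_retry(url):
    """Fetch a URL, sleeping and retrying on HTTP 429; re-raise other errors."""
    for delay in backoff_delays():
        try:
            with urlopen(url) as resp:
                return resp.read()
        except HTTPError as err:
            if err.code != 429:
                raise
            time.sleep(delay)
    raise RuntimeError(f"still rate-limited after retries: {url}")
```

Spacing the eighteen /recent requests a few seconds apart may be enough on its own; the retry loop just makes the script robust if it isn't.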

500 error

Since yesterday I can't reach pypistats.org anymore. All I get is a page saying "500". I hope it will be up again soon. Thanks for the great service.

Announcing a command-line tool

Hello, thank you very much for creating https://pypistats.org!

And thank you very much for creating an API as well, I've made a command-line tool pypistats that uses the API :)

https://pypi.org/project/pypistats
https://github.com/hugovk/pypistats

Here's example use:

$ pypistats python_minor pillow --last-month
| category | percent | downloads |
|----------|--------:|----------:|
|      2.7 |  46.64% | 1,512,429 |
|      3.6 |  30.34% |   983,838 |
|      3.5 |  12.53% |   406,429 |
|      3.7 |   6.12% |   198,558 |
|      3.4 |   3.41% |   110,552 |
| null     |   0.84% |    27,380 |
|      3.3 |   0.05% |     1,599 |
|      2.6 |   0.05% |     1,581 |
|      3.2 |   0.01% |       246 |
|      3.8 |   0.00% |       133 |
|      2.4 |   0.00% |         7 |
| Total    |         | 3,242,752 |

And the output is actually Markdown (though JSON is available via a switch), so it can be pasted straight into GitHub, where it renders as a table.

This issue is just to let you know and to say thanks, feel free to close it once you've read it :)

Text totals don't seem to match up with graph numbers

It seems like the text totals for a package don't match what the graphs show. In the image below, the text shows 0, 7, and 15 downloads for day, week, and month respectively, but the associated graph seems to show at least an order of magnitude more downloads. Am I reading the graph incorrectly? Perhaps the text totals count downloads without mirrors only?

[screenshot: package text totals alongside the download graph]

7-day smoothing

Downloads have a distinct day-of-week pattern: they're much lower on Saturday and Sunday. A 7-day moving average in the graphs would make long-term trends much easier to see.
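The smoothing being requested can be sketched in a few lines: a trailing 7-day mean over the daily counts, where the first few days simply average whatever history exists so far (how the site's charting layer would consume this is left open):

```python
def moving_average(values, window=7):
    """Trailing moving average; shorter prefix windows are averaged as-is."""
    out = []
    for i in range(len(values)):
        chunk = values[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out
```

Applied to daily download counts, this flattens the weekend dips while keeping the series the same length as the input.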

API: 404 is returned for some endpoints but not others

For packages with zero downloads, these return 404:

https://pypistats.org/api/packages/sugar/recent
https://pypistats.org/api/packages/sugar/overall

However, these return 200, but with empty data:

https://pypistats.org/api/packages/sugar/python_major
https://pypistats.org/api/packages/sugar/python_minor
https://pypistats.org/api/packages/sugar/system

{
"data": [],
"package": "sugar",
"type": "python_major_downloads"
}
{
"data": [],
"package": "sugar",
"type": "python_minor_downloads"
}
{
"data": [],
"package": "sugar",
"type": "system_downloads"
}

Similarly, for packages which don't exist:

https://pypistats.org/api/packages/notfound123444/recent
https://pypistats.org/api/packages/notfound123444/overall

These return "data": []:

https://pypistats.org/api/packages/notfound123444/python_major
https://pypistats.org/api/packages/notfound123444/python_minor
https://pypistats.org/api/packages/notfound123444/system

Should they all return 404?
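If the answer is yes, the fix could live in one shared response helper. A sketch of that server-side behaviour (assuming a view-function structure, which is a guess about this codebase): return 404 whenever the query yields no rows, so python_major/python_minor/system behave like /recent and /overall.

```python
def to_response(package, endpoint_type, rows):
    """Return (status, body): 404 when there is no data for the package."""
    if not rows:
        return 404, {"error": "package not found", "package": package}
    return 200, {"data": rows, "package": package, "type": endpoint_type}
```

Routing every endpoint through a helper like this would make the 404 behaviour consistent by construction rather than per-endpoint.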

Show percentages in the python-version, OS graphs

It'd be handy if the by-Python-version and by-OS graphs showed the percentage of downloads that fell into each category on each day, instead of absolute numbers. We already see total downloads in the graph at the top of each page, and on the graphs below it's hard to see how the categories compare to each other because there's so much day-to-day variation in total downloads.
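The transformation being asked for is simple: for each day, divide each category's count by that day's total so the stacked series sums to 100%. A sketch, assuming rows shaped like the API's {"date", "category", "downloads"} records:

```python
from collections import defaultdict


def daily_percentages(rows):
    """Add a "percent" key: each row's share of its day's total downloads."""
    totals = defaultdict(int)
    for row in rows:
        totals[row["date"]] += row["downloads"]
    return [
        {**row, "percent": 100.0 * row["downloads"] / totals[row["date"]]}
        for row in rows
    ]
```

Plotting "percent" instead of "downloads" removes the day-to-day volume swings and leaves only the category mix, which is what the comparison needs.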

List dependent packages

The current page shows a package's dependencies, but not yet its dependent packages (reverse dependencies). It'd be great to show those as well.

Intermittent 429 RATE LIMIT EXCEEDED

Hi,

I am intermittently getting a 429 RATE LIMIT EXCEEDED error when browsing pypistats.org.

I'm only browsing the odd page, not making tonnes of requests through the API, so this is a little odd. It doesn't happen all the time; maybe 25-30% of requests succeed. I have confirmed this on a different device using a different internet connection, so I think the issue is at your end rather than mine.
