
name-your-contributors's Introduction

Welcome

Documentation, notes, drafts, and information

Welcome to the Maintainer Mountaineer organization on GitHub!

You'll find most of the information you need on our website. If you have public questions, open an issue here, and we'll do our best to respond as soon as possible.

If you have private questions, email [email protected].

Join our Slack Group!

Resources

Here are some awesome, related resources you can check out.

Developed here:

Developed elsewhere:

  • Alex Catch insensitive, inconsiderate writing. Website
  • Code Triage Set up code triaging help for volunteers. Assumes you have volunteers.

Common Questions

Why is everything licensed by Burnt Fen Creative LLC?

That is the shell company @RichardLitt uses for his consulting services. Maintainer Mountaineer aims to someday be its own business, but for now, Burnt Fen Creative is used for licensing and invoicing purposes.

Why are there so many forks?

Maintainer.io policy is to fork repositories when we do public code and repository audits. If we have a fork here, that means we were contacted about auditing a repository. For Maintainer source repositories, you can see this filter on the @mntnr page.

Can I haz stickers?

Yes! Send us an email or open an issue with your name and address, and we'll send a few out to you.

Code of Conduct

Everything @mntnr related follows the Contributor Covenant. Please be nice.

License

CC-BY-SA-NC 4.0 Unlicensed (c) Burnt Fen Creative LLC 2017.

name-your-contributors's People

Contributors

alanshaw, berkmann18, dignifiedquire, gr2m, greenkeeper[bot], jozefizso, richardlitt, tgetgood

name-your-contributors's Issues

Error: no gitconfig to be found at /private/tmp

When I ran

$ name-your-contributors hoodiehq --since=2016-02-01T00:20:000Z

I get

Error: no gitconfig to be found at /private/tmp
    at /Users/gregor/.nvm/versions/node/v4.2.2/lib/node_modules/name-your-contributors/node_modules/gitconfiglocal/index.js:12:27
    at /Users/gregor/.nvm/versions/node/v4.2.2/lib/node_modules/name-your-contributors/node_modules/gitconfiglocal/index.js:45:48
    at FSReqWrap.cb [as oncomplete] (fs.js:212:19)

any idea?

$ node -v && npm -v
v4.2.2
3.7.3

The automated release is failing 🚨

🚨 The automated release from the master branch failed. 🚨

I recommend you give this issue a high priority, so other packages depending on you can benefit from your bug fixes and new features.

You can find below the list of errors reported by semantic-release. Each one of them has to be resolved in order to automatically publish your package. I’m sure you can resolve this 💪.

Errors are usually caused by a misconfiguration or an authentication problem. With each error reported below you will find explanation and guidance to help you to resolve it.

Once all the errors are resolved, semantic-release will release your package the next time you push a commit to the master branch. You can also manually restart the failed CI job that runs semantic-release.

If you are not sure how to resolve this, here are some links that can help you:

If those don’t help, or if this issue is reporting something you think isn’t right, you can always ask the humans behind semantic-release.


The push permission to the Git repository is required.

semantic-release cannot push the version tag to the branch master on remote Git repository.

Please refer to the authentication configuration documentation to configure the Git credentials on your CI environment.


Good luck with your project ✨

Your semantic-release bot 📦🚀

Get email for users

I want to be able to automatically add all contributors into the package.json contributors field. Can we get the email from authors, too, to match this spec?
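For reference, npm's contributors field accepts strings of the form "Name &lt;email&gt; (url)". A minimal sketch of the mapping, assuming contributor objects with name, email, and url fields (hypothetical field names, not necessarily what this module currently returns):

```javascript
// Format a contributor record into npm's "person" string form,
// "Name <email> (url)", which is what package.json accepts in `contributors`.
// The `name`, `email`, and `url` field names are assumptions.
function toPackageJsonPerson ({name, email, url}) {
  let person = name || ''
  if (email) person += ` <${email}>`
  if (url) person += ` (${url})`
  return person.trim()
}

// Map a list of contributor records to a ready-to-paste `contributors` array.
function contributorsField (contributors) {
  return contributors.map(toPackageJsonPerson)
}
```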

Daemonise

Some queries can't be fulfilled at present because they require more than an hour's quota of info.

One strategy to deal with this would be to have nyc run as a daemon: you send it queries, it grabs them over time, and it stores the results in a file. The user should be able to query the status of their running queries to know when the files are safe to copy.

I'm basically thinking of transmission-remote as the interface we want.
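A toy sketch of the daemon's bookkeeping side, assuming an in-memory job table (the class and method names here are invented; the real thing would persist jobs and run them against the API over time):

```javascript
// Each submitted query gets an id, a status, and eventually a result path,
// so a client can poll until the output file is safe to copy.
class QueryQueue {
  constructor () {
    this.jobs = new Map()
    this.nextId = 1
  }

  // Register a query and return its id; the daemon works it off later.
  submit (query) {
    const id = this.nextId++
    this.jobs.set(id, {query, status: 'pending', result: null})
    return id
  }

  // Mark a job finished and record where its results were written.
  complete (id, resultPath) {
    const job = this.jobs.get(id)
    job.status = 'done'
    job.result = resultPath
  }

  // What a `status` command in a transmission-remote-style CLI would call.
  status (id) {
    const job = this.jobs.get(id)
    return job ? {status: job.status, result: job.result} : null
  }
}
```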

Publish

Semantic release has broken. :( I don't know why, and it's annoying because the current version doesn't work.

Grab user URLs

In addition to logins and names, we should grab the GitHub URLs of contributors.

An in-range update of travis-deploy-once is breaking the build 🚨

The devDependency travis-deploy-once was updated from 5.0.8 to 5.0.9.

🚨 View failing branch.

This version is covered by your current version range and after updating it in your project the build failed.

travis-deploy-once is a devDependency of this project. It might not break your production code or affect downstream projects, but probably breaks your build or test tools, which may prevent deploying or publishing.

Status Details
  • continuous-integration/travis-ci/push: The Travis CI build could not complete due to an error (Details).

Release Notes for v5.0.9

5.0.9 (2018-09-27)

Bug Fixes

  • use require.resolve to load babel preset (16292d3)
Commits

The new version differs by 2 commits.

  • 16292d3 fix: use require.resolve to load babel preset
  • 858d475 test: add babel-register.js to test coverage

See the full diff

FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

Error: "Your token has not been granted the required scopes to execute this query"

Unfortunately I cannot get it to work with any token, even if I enable all scopes. I always get this error back:

{
  "data": null,
  "errors": [
    {
      "message": "Your token has not been granted the required scopes to execute this query. The 'email' field requires one of the following scopes: ['user:email', 'read:user'], but your token has only been granted the: ['repo'] scopes. Please modify your token's scopes at: https://github.com/settings/tokens.",
      "locations": [
        {
          "line": 9,
          "column": 1
        }
      ]
    }
  ]
}

I tried to debug it but couldn’t figure it out ... any idea what it could be?

The full command I run is

GITHUB_TOKEN=... node src/cli.js -u octokit -r "rest.js" -a 2018-01-17

Trace Contributions

We already query exactly who does what and where. Which PR of which repo did so and so comment on?

The current aggregation logic throws that away and just returns bucketed counts.

We need the option to keep the entire query response tree (cleaned up a bit, but no info thrown away).

Feature requests

Hi @RichardLitt! I'm looking forward to starting to use this module more; there are just a couple of things I would love to have in order to do so. I would like to know whether they sound like they fit the scope of this module, or whether I should just go ahead and create one that fits my needs. :) The features I'm looking for are:

  • Exposing internal functions for individual use - Improved testing and usage cases
  • Expose a -o --output-path option to be able to specify where the data gets written
  • Use a streaming interface to avoid running out of memory #15
  • Expose a --flat option, where:
    • if true, the output becomes ndjson with the format
//example
"{ timeframe: <2017-10-12--2017-12-20>, project: <org/repo>, user: <username>, issuesCreated: <>,  commits: <>.. plus other data } \n"
  • if false, the output becomes just a coalesced json struct
// example:
{
  "2017-10-12--2017-12-20": {
    "org/repo": {
      user1: {
        // all the interesting stats
      }
      user2: {
        // ...
      }
   }
  }
}
  • support for a config file that lists several repos and timeframes, so that it generates it all in one sweep
  • ability to export data as CSV
  • Also, can the hours be dropped from since and until? Or at least made optional?

Overall, I also just need it to work at all, which I haven't managed so far.

Get emails for users

It looks more and more like this is a must-have. It means another query to get the user object.

Necessary for #46.
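Roughly, the extra lookup could be a per-login GraphQL query like the sketch below (the module's real query in src/graphql.js may differ, and the email field needs a token with the user:email or read:user scope, per the scopes error reported above):

```javascript
// Build a GraphQL query fetching the public profile fields for one login.
// login, name, email, and url are real fields on GitHub's GraphQL User type.
function userQuery (login) {
  return `query {
  user(login: ${JSON.stringify(login)}) {
    login
    name
    email
    url
  }
}`
}
```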

Expose a `--flat` ndjson option

Expose a --flat option, where:

  • if true, the output becomes ndjson with the format
//example
"{ timeframe: <2017-10-12--2017-12-20>, project: <org/repo>, user: <username>, issuesCreated: <>,  commits: <>.. plus other data } \n"
  • if false, the output becomes just a coalesced json struct
// example:
{
  "2017-10-12--2017-12-20": {
    "org/repo": {
      user1: {
        // all the interesting stats
      }
      user2: {
        // ...
      }
   }
  }
}

Asked for in #16.
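A sketch of the flattening itself, going from the coalesced struct above to one JSON record per (timeframe, project, user) triple:

```javascript
// Flatten the nested {timeframe: {project: {user: stats}}} structure into
// newline-delimited JSON, one record per (timeframe, project, user) triple.
function toNdjson (nested) {
  const lines = []
  for (const timeframe of Object.keys(nested)) {
    for (const project of Object.keys(nested[timeframe])) {
      for (const user of Object.keys(nested[timeframe][project])) {
        const stats = nested[timeframe][project][user]
        lines.push(JSON.stringify(Object.assign({timeframe, project, user}, stats)))
      }
    }
  }
  return lines.join('\n')
}
```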

Grab amount and type of interaction as needed

@tgetgood I need a tool that shows me the types of interactions, and the amount of times a user does each, for a repo.

Basically, I need: RichardLitt committed 4 times, reacted 17 times, commented 5 times, etc...

Can we do this, easily?

Finer grained control of data gathered

Grabbing everything like we currently do can get insanely expensive. Ways to get around this:

  • Query tuning: look more carefully at the branching factors in the queries and reduce the cost of the average query

  • Make reactions optional (and opt-in). Reactions, as leaves in the tree, are the most expensive thing to query and not always of interest to end users. So this is a no-brainer.

  • Don't get commit authors from the API by default. This isn't an expensive query, but we have to throttle the pagination, since GitHub returns these requests fast enough for us to appear spammy if we don't wait between them. There are two ways to go here: 1) allow the user to specify the path to a local clone of the repo, which we can query directly, and 2) have a flag to get them from the API. To reiterate, grabbing from the API isn't untenable: plotly.js has ~14000 commits, so it takes 140 requests and 140 quota to get it all. That's not necessarily too much quota, but at a reasonable rate limit for requests the user could be waiting a while. So opt in. This relates to #47.
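The arithmetic in that last bullet can be sketched as a small cost estimator (perPage = 100 is the REST API's page-size maximum; delayMs is whatever politeness interval we settle on):

```javascript
// Estimate the cost of paging through all commits: the number of requests
// (and hence quota spent), plus total wall-clock wait if we pause delayMs
// between consecutive requests to avoid looking spammy.
function paginationCost (totalItems, delayMs, perPage = 100) {
  const requests = Math.ceil(totalItems / perPage)
  // One fewer pause than requests: no need to wait after the last page.
  return {requests, waitMs: Math.max(0, requests - 1) * delayMs}
}
```

With ~14000 commits and a one-second pause, that gives the 140 requests mentioned above and a bit over two minutes of waiting, which is why this should be opt-in.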

Config file support

It would be good to support a config file that lists several repos and timeframes, so that everything can be generated in one sweep.

Requested in #16.
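A sketch of what loading such a file might look like; the {queries: [...]} schema here is a proposal, not an implemented format:

```javascript
// Validate a hypothetical config of the form:
// {"queries": [{"user": "ipfs", "repo": "go-ipfs",
//               "since": "2016-01-01T00:00:00Z", "until": "..."}]}
// and normalize each entry so downstream code can rely on the keys existing.
function validateConfig (config) {
  if (!Array.isArray(config.queries)) {
    throw new Error('config.queries must be an array')
  }
  return config.queries.map(q => {
    if (!q.user) throw new Error('each query needs a user or org name')
    return {
      user: q.user,
      repo: q.repo || null,    // null means "all repos under this user/org"
      since: q.since || null,
      until: q.until || null
    }
  })
}
```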

Supports ranges?

Would be awesome to support time ranges:

nameYourContributors('ipfs', {
  since: '2016-01-15T00:20:24Z',
  until: '2016-10-15T00:20:24Z'  // or "to:"
})

Code overgrabs all contributors to updated threads

If someone commented on a thread three months ago, and someone else commented a day ago, both will be pulled when I search within the past week. This is a serious bug that devalues the stats about the size of a community. Each object will need to be date-sorted, not just the top-level issues and PRs.
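The fix amounts to filtering every nested item by its own createdAt, not by its parent thread's updated time. A minimal sketch:

```javascript
// True only if the item itself was created inside the [since, until] window.
// Assumes each item carries an ISO-8601 createdAt, as GitHub's API returns.
function inWindow (item, since, until) {
  const t = new Date(item.createdAt).getTime()
  return t >= new Date(since).getTime() && t <= new Date(until).getTime()
}

// Apply the window per item, rather than including everyone attached to any
// thread that happened to be touched during the window.
function filterByDate (items, since, until) {
  return items.filter(item => inWindow(item, since, until))
}
```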

Version 10 of node.js has been released

Version 10 of Node.js (code name Dubnium) has been released! 🎊

To see what happens to your code in Node.js 10, Greenkeeper has created a branch with the following changes:

  • Added the new Node.js version to your .travis.yml

If you’re interested in upgrading this repo to Node.js 10, you can open a PR with these changes. Please note that this issue is just intended as a friendly reminder and the PR as a possible starting point for getting your code running on Node.js 10.

More information on this issue

Greenkeeper has checked the engines key in any package.json file, the .nvmrc file, and the .travis.yml file, if present.

  • engines was only updated if it defined a single version, not a range.
  • .nvmrc was updated to Node.js 10
  • .travis.yml was only changed if there was a root-level node_js that didn’t already include Node.js 10, such as node or lts/*. In this case, the new version was appended to the list. We didn’t touch job or matrix configurations because these tend to be quite specific and complex, and it’s difficult to infer what the intentions were.

For many simpler .travis.yml configurations, this PR should suffice as-is, but depending on what you’re doing it may require additional work or may not be applicable at all. We’re also aware that you may have good reasons to not update to Node.js 10, which is why this was sent as an issue and not a pull request. Feel free to delete it without comment, I’m a humble robot and won’t feel rejected 🤖


FAQ and help

There is a collection of frequently asked questions. If those don’t help, you can always ask the humans behind Greenkeeper.


Your Greenkeeper Bot 🌴

Org flag raises GraphQL permissions error

Using the --org flag causes a GraphQL permissions error.

🐕  node src/cli.js -o orbitdb -r example-orbitdb-todomvc
Error: Graphql error: {
  "data": null,
  "errors": [
    {
      "message": "Your token has not been granted the required scopes to execute this query. The 'name' field requires one of the following scopes: ['read:org'], but your token has only been granted the: ['repo', 'user'] scopes. Please modify your token's scopes at: https://github.com/settings/tokens.",
      "locations": [
        {
          "line": 122,
          "column": 1
        }
      ]
    }
  ]
}
    at parseResponse (/Users/richard/src/name-your-contributors/src/graphql.js:414:11)
    at <anonymous>

This works if you use -u instead.

"process out of memory" error

Hi, really interested in using the lib, thanks! Got this error, though, after hanging for a few minutes:

(trusty)warren@localhost:~/sites/plots2$ name-your-contributors ipfs --since=2016-10-15T00:20:24Z
Got response
wrote issue_creators
wrote issue_commenters

<--- Last few GCs --->

  403290 ms: Mark-sweep 701.8 (738.5) -> 699.6 (738.5) MB, 8779.9 / 0 ms [allocation failure] [GC in old space requested].
  413027 ms: Mark-sweep 699.6 (738.5) -> 699.6 (738.5) MB, 9736.6 / 0 ms [allocation failure] [GC in old space requested].
  429222 ms: Mark-sweep 699.6 (738.5) -> 698.8 (738.5) MB, 16195.4 / 13 ms [last resort gc].
  441528 ms: Mark-sweep 698.8 (738.5) -> 697.9 (738.5) MB, 12302.8 / 0 ms [last resort gc].


<--- JS stacktrace --->

==== JS stack trace =========================================

Security context: 0x29d65341 <JS Object>
    1: /* anonymous */(aka /* anonymous */) [/home/warren/.nvm/versions/node/v4.4.0/lib/node_modules/name-your-contributors/node_modules/octokat/dist/node/verb-methods.js:79] [pc=0x25676570] (this=0x29d080dd <undefined>,verbFunc=0x47f8f695 <JS Function module.exports.SimpleVerbs.verbs.create (SharedFunctionInfo 0x3a6b53e5)>,verbName=0x29d25431 <String[6]: create>)
    2: injectVerbMethods [/home/wa...

FATAL ERROR: CALL_AND_RETRY_LAST Allocation failed - process out of memory
Aborted (core dumped)

Bug in Octokat: eTag but not cached response

🐕  node cli.js ipfs --since=2016-10-15T00:20:24Z
Got response
wrote issue_creators
wrote issue_commenters
/Users/richard/src/name-your-contributors/node_modules/octokat/dist/node/plugins/cache-handler.js:49
            throw new Error('ERROR: Bug in Octokat cacheHandler. It had an eTag but not the cached response');

Implement in GraphQL

This entire module would be much, much simpler if it didn't make thousands of API hits. Let's reduce it to a few. This will also make bug fixing much easier.

Add a Cache

To daemonise effectively (regarding #48), we need a local cache on disk. Nothing fancy: just hashes of the queries and timestamps, with a global TTL.

This will also have a beneficial impact on development, and in normal use it will mean that if a query fails because it's too expensive (not in daemon mode) then the data grabbed does not get thrown away when you try again later.

Add more examples to the README

Specifically, we need an example that runs on a repo and ends with a list of users that can be used in a CONTRIBUTORS file. This would probably involve also mentioning whodunnit, which is OK.

API rate limit exceeded error

Is there a way I can send the request authenticated as my github user?

$ name-your-contributors hoodiehq --since=2016-02-01T00:20:000Z
err { [Error: {"message":"API rate limit exceeded for 38.104.218.154. (But here's the good news: Authenticated requests get a higher rate limit. Check out the documentation for more details.)","documentation_url":"https://developer.github.com/v3/#rate-limiting"}]
  status: 403,
  json: 
   { message: 'API rate limit exceeded for 38.104.218.154. (But here\'s the good news: Authenticated requests get a higher rate limit. Check out the documentation for more details.)',
     documentation_url: 'https://developer.github.com/v3/#rate-limiting' } }
Unable to get unique users { [Error: {"message":"API rate limit exceeded for 38.104.218.154. (But here's the good news: Authenticated requests get a higher rate limit. Check out the documentation for more details.)","documentation_url":"https://developer.github.com/v3/#rate-limiting"}]
  status: 403,
  json: 
   { message: 'API rate limit exceeded for 38.104.218.154. (But here\'s the good news: Authenticated requests get a higher rate limit. Check out the documentation for more details.)',
     documentation_url: 'https://developer.github.com/v3/#rate-limiting' } }
err { [Error: {"message":"API rate limit exceeded for 38.104.218.154. (But here's the good news: Authenticated requests get a higher rate limit. Check out the documentation for more details.)","documentation_url":"https://developer.github.com/v3/#rate-limiting"}]
  status: 403,
  json: 
   { message: 'API rate limit exceeded for 38.104.218.154. (But here\'s the good news: Authenticated requests get a higher rate limit. Check out the documentation for more details.)',
     documentation_url: 'https://developer.github.com/v3/#rate-limiting' } }

Design

I think the design is not as good as it could be. Look at git-standup -- tons of stars and committers, and not that different of a product.

Take some time to refactor this, Richard.

Extend the definition of "contributor"

We would like commit authors and people who react to anything in a repo to be included in the definition.

So to that end, we should add two more keys to the returned JSON, commitAuthors and reactors, and extend the logic to populate them.

Edit: PullRequestReviewComment is a distinct entity from PullRequestComment. We should be grabbing those too.

Add Pull Request creators

Currently working on branch feature/add-prs. I found an issue with depaginate, which now needs to be updated in each dep. Fun times.

Work without token

Is there any way to get this module working without a token, on the client side?

API Rate limit issue

When I run:

~
🍔  $ npm install --global name-your-contributors
~
🍔  $ name-your-contributors formly-js
Unable to get unique users { [Error: {"message":"API rate limit exceeded for 24.11.5.123. (But here's the good news: Authenticated requests get a higher rate limit. Check out the documentation for more details.)","documentation_url":"https://developer.github.com/v3/#rate-limiting"}]
  status: 403,
  json: 
   { message: 'API rate limit exceeded for 24.11.5.123. (But here\'s the good news: Authenticated requests get a higher rate limit. Check out the documentation for more details.)',
     documentation_url: 'https://developer.github.com/v3/#rate-limiting' } }
Failed to depaginate
Failed to depaginate
Failed to depaginate
Failed to depaginate
Failed to depaginate
Failed to depaginate
 { [Error: {"message":"API rate limit exceeded for 24.11.5.123. (But here's the good news: Authenticated requests get a higher rate limit. Check out the documentation for more details.)","documentation_url":"https://developer.github.com/v3/#rate-limiting"}]
  status: 403,
  json: 
   { message: 'API rate limit exceeded for 24.11.5.123. (But here\'s the good news: Authenticated requests get a higher rate limit. Check out the documentation for more details.)',
     documentation_url: 'https://developer.github.com/v3/#rate-limiting' } }
Failed to depaginate
 { [Error: {"message":"API rate limit exceeded for 24.11.5.123. (But here's the good news: Authenticated requests get a higher rate limit. Check out the documentation for more details.)","documentation_url":"https://developer.github.com/v3/#rate-limiting"}]
  status: 403,
  json: 
... continues

Any ideas?

I tried it with my own username as well. Same error message.

Errors not useful

I got the following error. It really doesn't help me; somewhere, something failed to paginate. What we should specify is in what dependency, where, and how many hits we have had to the API - maybe that is why we get server error issue.

$ name-your-contributors ipfs --since=2016-02-29T12:00:01Z 
Failed to depaginate
 { [Error: Error: connect ENETUNREACH 192.30.252.127:443
    at Object.exports._errnoException (util.js:860:11)
    at exports._exceptionWithHostPort (util.js:883:20)
    at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1063:14)] status: 0 }
Failed to depaginate
 { [Error: Error: connect ENETUNREACH 192.30.252.127:443
    at Object.exports._errnoException (util.js:860:11)
    at exports._exceptionWithHostPort (util.js:883:20)
    at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1063:14)] status: 0 }
Failed to depaginate
 { [Error: {"message":"Server Error","documentation_url":"https://developer.github.com/v3"}]
  status: 500,
  json: 
   { message: 'Server Error',
     documentation_url: 'https://developer.github.com/v3' } }
Failed to depaginate
 { [Error: {"message":"Server Error","documentation_url":"https://developer.github.com/v3"}]
  status: 500,
  json: 
   { message: 'Server Error',
     documentation_url: 'https://developer.github.com/v3' } }
Failed to depaginate
 { [Error: Error: socket hang up
    at createHangUpError (_http_client.js:209:15)
    at TLSSocket.socketOnEnd (_http_client.js:294:23)
    at emitNone (events.js:72:20)
    at TLSSocket.emit (events.js:166:7)
    at endReadableNT (_stream_readable.js:905:12)
    at doNTCallback2 (node.js:450:9)
    at process._tickCallback (node.js:364:17)] status: 0 }
Failed to depaginate
 { [Error: Error: socket hang up
    at createHangUpError (_http_client.js:209:15)
    at TLSSocket.socketOnEnd (_http_client.js:294:23)
    at emitNone (events.js:72:20)
    at TLSSocket.emit (events.js:166:7)
    at endReadableNT (_stream_readable.js:905:12)
    at doNTCallback2 (node.js:450:9)
    at process._tickCallback (node.js:364:17)] status: 0 }
Failed to depaginate
 { [Error: Error: socket hang up
    at createHangUpError (_http_client.js:209:15)
    at TLSSocket.socketOnEnd (_http_client.js:294:23)
    at emitNone (events.js:72:20)
    at TLSSocket.emit (events.js:166:7)
    at endReadableNT (_stream_readable.js:905:12)
    at doNTCallback2 (node.js:450:9)
    at process._tickCallback (node.js:364:17)] status: 0 }
Failed to depaginate
 { [Error: Error: socket hang up
    at createHangUpError (_http_client.js:209:15)
    at TLSSocket.socketOnEnd (_http_client.js:294:23)
    at emitNone (events.js:72:20)
    at TLSSocket.emit (events.js:166:7)
    at endReadableNT (_stream_readable.js:905:12)
    at doNTCallback2 (node.js:450:9)
    at process._tickCallback (node.js:364:17)] status: 0 }
Failed to depaginate
 { [Error: Error: socket hang up
    at createHangUpError (_http_client.js:209:15)
    at TLSSocket.socketOnEnd (_http_client.js:294:23)
    at emitNone (events.js:72:20)
    at TLSSocket.emit (events.js:166:7)
    at endReadableNT (_stream_readable.js:905:12)
    at doNTCallback2 (node.js:450:9)
    at process._tickCallback (node.js:364:17)] status: 0 }
Failed to depaginate
 { [Error: Error: socket hang up
    at createHangUpError (_http_client.js:209:15)
    at TLSSocket.socketOnEnd (_http_client.js:294:23)
    at emitNone (events.js:72:20)
    at TLSSocket.emit (events.js:166:7)
    at endReadableNT (_stream_readable.js:905:12)
    at doNTCallback2 (node.js:450:9)
    at process._tickCallback (node.js:364:17)] status: 0 }
Failed to depaginate
 { [Error: Error: socket hang up
    at createHangUpError (_http_client.js:209:15)
    at TLSSocket.socketOnEnd (_http_client.js:294:23)
    at emitNone (events.js:72:20)
    at TLSSocket.emit (events.js:166:7)
    at endReadableNT (_stream_readable.js:905:12)
    at doNTCallback2 (node.js:450:9)
    at process._tickCallback (node.js:364:17)] status: 0 }

More flexible command line options

Currently we have to specify either a user and repository name, or an org name. In addition, it should be possible to specify:

  • Just a user name, and have it treated like an org: get all contributors to repos owned by that user
  • Just a repo, in which case the owner is the owner of the auth token
  • Nothing, in which case, if we're in a git repo, we get the origin remote and check that; if there is no origin, or we're not in a git repo, abort
  • Maybe other things too
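The "nothing" case above boils down to parsing the origin remote URL into an owner/repo pair. A sketch covering the common HTTPS and SSH forms:

```javascript
// Derive {owner, repo} from a git origin URL. Handles the two usual shapes:
//   https://github.com/owner/repo(.git)
//   git@github.com:owner/repo(.git)
// Returns null for anything that isn't a GitHub remote.
function parseOrigin (url) {
  const m = url.match(/github\.com[/:]([^/]+)\/([^/]+?)(?:\.git)?$/)
  return m ? {owner: m[1], repo: m[2]} : null
}
```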

Be Polite

If a user requests info on a very active repo, the current strategy is to spam the server until we have all the data. This has resulted in us getting banned (trying to pull everything from plotly.js).

We should throttle requests to make sure we're not issuing more than X per minute. I'm not sure what X should be but that can be tweaked.

In addition we should alert the user when a given request triggers throttling so that they know it's going to take a long time. And give them an option to abort. When applicable we should also give them advice for making a better (narrower) query.
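The throttling itself reduces to a small pure calculation: given when the last request went out and a minimum spacing, how long to sleep before the next one. A sketch (minIntervalMs is the tunable X above, expressed per request rather than per minute):

```javascript
// Milliseconds to wait before the next request so that requests are never
// closer together than minIntervalMs. Pure, so the policy is easy to test;
// the caller would sleep for the returned duration, and could warn the user
// (and offer an abort) whenever it comes back non-zero.
function waitBeforeNext (lastRequestAt, minIntervalMs, now = Date.now()) {
  return Math.max(0, lastRequestAt + minIntervalMs - now)
}
```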
