
async-sema's Introduction


Develop. Preview. Ship.

Documentation · Changelog · Templates · CLI


Vercel

Vercel’s Frontend Cloud provides the developer experience and infrastructure to build, scale, and secure a faster, more personalized Web.

Deploy

Get started by importing a project or using the Vercel CLI. Then, git push to deploy.

Documentation

For details on how to use Vercel, check out our documentation.

Contributing

This project uses pnpm to install dependencies and run scripts.

You can use the dev script to run local changes as if you were invoking Vercel CLI. For example, vercel deploy --cwd=/path/to/project can be run against your local changes as pnpm dev deploy --cwd=/path/to/project.

When contributing to this repository, please first discuss the change you wish to make via GitHub Discussions with the owners of this repository before submitting a Pull Request.

Please read our Code of Conduct and follow it in all your interactions with the project.

Local development

This project is configured in a monorepo, where one repository contains multiple npm packages. Dependencies are installed and managed with pnpm, not npm CLI.

To get started, execute the following:

git clone https://github.com/vercel/vercel
cd vercel
corepack enable
pnpm install
pnpm build
pnpm lint
pnpm test-unit

Make sure all the tests pass before making changes.

Running Vercel CLI Changes

You can use pnpm dev from the cli package to invoke Vercel CLI with local changes:

cd ./packages/cli
pnpm dev <cli-commands...>

See CLI Local Development for more details.

Verifying your change

Once you are done with your changes (we even suggest doing it along the way), make sure all the tests still pass by running:

pnpm test-unit

from the root of the project.

If any test fails, make sure to fix it along with your changes. See Interpreting test errors for more information about how the tests are executed, especially the integration tests.

Pull Request Process

Once you are confident that your changes work properly, open a pull request on the main repository.

The pull request will be reviewed by the maintainers and the tests will be checked by our continuous integration platform.

Interpreting test errors

There are 2 kinds of tests in this repository – Unit tests and Integration tests.

Unit tests are run locally with jest and execute quickly because they are testing the smallest units of code.

Integration tests

Integration tests create deployments to your Vercel account using the test project name. After each test is deployed, the probes key is used to check if the response is the expected value. If the value doesn't match, you'll see a message explaining the difference. If the deployment failed to build, you'll see a more generic message like the following:

[Error: Fetched page https://test-8ashcdlew.vercel.app/root.js does not contain hello Root!. Instead it contains An error occurred with this application.

    NO_STATUS_CODE_FRO Response headers:
       cache-control=s-maxage=0
      connection=close
      content-type=text/plain; charset=utf-8
      date=Wed, 19 Jun 2019 18:01:37 GMT
      server=now
      strict-transport-security=max-age=63072000
      transfer-encoding=chunked
      x-now-id=iad1:hgtzj-1560967297876-44ae12559f95
      x-now-trace=iad1]

In such cases, you can visit the URL of the failed deployment and append /_logs to see the build error. In the case above, that would be https://test-8ashcdlew.vercel.app/_logs

The logs of this deployment will contain the actual error which may help you to understand what went wrong.

Running integration tests locally

While running the full integration suite locally is not recommended, it's sometimes useful to isolate a failing test by running it on your machine. To do so, you'll need to ensure you have the appropriate credentials sourced in your shell:

  1. Create an access token by following the instructions at https://vercel.com/docs/rest-api#creating-an-access-token. Ensure the token scope is for your personal account.
  2. Grab the team ID from the Vercel dashboard at https://vercel.com/<MY-TEAM>/~/settings.
  3. Source these into your shell rc file: echo 'export VERCEL_TOKEN=<MY-TOKEN> VERCEL_TEAM_ID=<MY-TEAM-ID>' >> ~/.zshrc

From there, you should be able to trigger an integration test. Choose one that's already isolated to check that things work:

cd packages/next

Run the test:

pnpm test test/fixtures/00-server-build/index.test.js

@vercel/nft

Some of the Builders use @vercel/nft to tree-shake files before deployment. If you suspect an error with this tree-shaking mechanism, you can create the following script in your project:

const { nodeFileTrace } = require('@vercel/nft');
nodeFileTrace(['path/to/entrypoint.js'], {
  ts: true,
  mixedModules: true,
})
  .then(o => console.log(o.fileList))
  .catch(e => console.error(e));

When you run this script, you'll see all the imported files. If files are missing, the bug is in @vercel/nft and not the Builder.
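If you already know which file is going missing, a variation of the same script can check for it directly. This is only a sketch, not part of the original example: the entrypoint and the paths in expected are placeholders for your project, and traced paths are relative to the working directory.

const { nodeFileTrace } = require('@vercel/nft');

// Sketch: check whether specific files you expect made it into the trace.
// 'path/to/entrypoint.js' and the paths in `expected` are placeholders.
nodeFileTrace(['path/to/entrypoint.js'], { ts: true, mixedModules: true })
  .then(({ fileList }) => {
    const traced = new Set(fileList); // fileList is a Set in recent versions of @vercel/nft
    const expected = ['node_modules/some-dep/package.json'];
    const missing = expected.filter(f => !traced.has(f));
    console.log(
      missing.length ? `Missing from trace: ${missing.join(', ')}` : 'All expected files were traced'
    );
  })
  .catch(err => console.error(err));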

Deploy a Builder with existing project

Sometimes you want to test changes to a Builder against an existing project, maybe with vercel dev or actual deployment. You can avoid publishing every Builder change to npm by uploading the Builder as a tarball.

  1. Change directory to the desired Builder: cd ./packages/node
  2. Run pnpm build to compile TypeScript and other build steps
  3. Run npm pack to create a tarball file
  4. Run vercel *.tgz to upload the tarball file and get a URL
  5. Edit any existing vercel.json project and replace use with the URL
  6. Run vercel or vercel dev to deploy with the experimental Builder


async-sema's People

Contributors

cardin, dsabanin, fauxfaux, gastonfig, leerob, leo, mfix22, olliv, pranaygp, riophae, sergiodxa, timneutkens, weeezes


async-sema's Issues

Way to cancel pending tokens

I find I have the need to do a series of concurrent tasks, with a max concurrency, but also with a max total run time. After this time, whatever hasn't finished or started can be ignored.

I propose to add a new API

sema.cancel()

which rejects all pending .acquire() calls. This way I can do the following:

const timeout = setTimeout(() => sema.cancel(), 5000);
await Promise.all(jobs.map(async job => {
  try {
    await sema.acquire();
  } catch (err) {
    if (err.code === 'CANCELLED') {
      return null;
    }
    throw err;
  }

  try {
    await doWork(job);
  } finally {
    sema.release();
  }
}));
clearTimeout(timeout);

If accepted, I can make a PR.

IE 11 Support

When adding this to a project, my app no longer loads in IE 11. I can't find any polyfills which alleviate the problem.

Has anyone had success running in IE?

What is the `nr` variable?

Sorry to interrupt. What does nr stand for? I thought it was the number of resources, but then what is the capacity variable?

Is it possible to make 1 request every 2 seconds?

I am using the simple RateLimit implementation with the lowest value of 1 (1 request/sec), but I still get 500 errors from the server I am calling. So I wonder if it's possible to further reduce the number of calls, say to 1 every 2 seconds?
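For what it's worth, the timeUnit option used in other issues on this page makes this possible: keep the rate at 1 but widen the window to 2000 ms. A minimal sketch follows; fetchSomething and urls are placeholders, not part of the library.

const { RateLimit } = require('async-sema');

// At most one call per 2000 ms window (the default timeUnit is 1000 ms).
const lim = RateLimit(1, { timeUnit: 2000 });

async function run(urls) {
  for (const url of urls) {
    await lim();               // wait for the next 2-second slot
    await fetchSomething(url); // placeholder for the real request
  }
}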

Rate limit and throttle: Last item throttled

Thanks for this code!

const lim = RateLimit(2, {
  timeUnit: (5000),
});

await Promise.all(
  images.map(async (image, index) => {

    await lim();

    const parsed = await _prepImage.call(this, image);

    await util.downloadImage(parsed.url, parsed.target);

  })
);

From what I can tell, the above code downloads two images at a time and pauses 5 seconds between concurrent downloads.

I am noticing though, the last image download waits when I would like it to exit as soon as it's resolved.

In other words, since there are no more images to download (as it is the last promise to resolve), I don't want it to wait.

Questions:

  • Considering the above code snippet, am I using RateLimit properly (or, is this a good use case for RateLimit)?
  • How can I exit out of the last promise to resolve without pausing?

Many thanks in advance for the help and for sharing this code!

Action required: Greenkeeper could not be activated 🚨

🚨 You need to enable Continuous Integration on all branches of this repository. 🚨

To enable Greenkeeper, you need to make sure that a commit status is reported on all branches. This is required by Greenkeeper because it uses your CI build statuses to figure out when to notify you about breaking changes.

Since we didn’t receive a CI status on the greenkeeper/initial branch, it’s possible that you don’t have CI set up yet. We recommend using Travis CI, but Greenkeeper will work with every other CI service as well.

If you have already set up a CI for this repository, you might need to check how it’s configured. Make sure it is set to run on all new branches. If you don’t want it to run on absolutely every branch, you can whitelist branches starting with greenkeeper/.

Once you have installed and configured CI on this repository correctly, you’ll need to re-trigger Greenkeeper’s initial pull request. To do this, please delete the greenkeeper/initial branch in this repository, and then remove and re-add this repository to the Greenkeeper App’s whitelist on GitHub. You'll find this list on your repo or organization’s settings page, under Installed GitHub Apps.

Some nice way of async iterating?

I'm trying to limit uploads to 10 simultaneously, and I'm doing it like this:

	const uploadSema = new Sema(10)
	for (const upload of uploads) {
		await uploadSema.acquire()
		uploadOne(upload).finally(() => uploadSema.release())
	}
	await uploadSema.drain()

It's reasonably nice, but I was wondering whether there's a way to make it nicer.

I moved the logic to this helper function

export const queuedWork = async (items, fn, workers = 10) => {
	const sema = new Sema(workers)
	let threw = null
	for (const item of items) {
		if (threw) break
		await sema.acquire()
		// eslint-disable-next-line promise/catch-or-return
		Promise.resolve(fn(item))
			.catch(err => {
				threw = err
			})
			.finally(() => sema.release())
	}
	await sema.drain()
	if (threw) throw threw
}

Is this a good way of going about it? Is there maybe a more elegant way?

(wrote these tests too)

test('queuedWork async', async () => {
	const out = []
	await queuedWork([1, 5, 7, 89, 2], async n => out.push(n), 2)
	expect(out).toEqual([1, 5, 7, 89, 2])
})

test('queuedWork sync', async () => {
	const out = []
	await queuedWork([1, 5, 7, 89, 2], n => out.push(n), 2)
	expect(out).toEqual([1, 5, 7, 89, 2])
})

test('queuedWork throws', async () => {
	const out = []
	await expect(
		queuedWork(
			[1, 2, 3, 4, 5],
			async n => {
				if (n === 2) throw new Error('meep')
				else out.push(n)
			},
			2
		)
	).rejects.toThrow('meep')
	expect(out).toEqual([1, 3])
})


test('queuedWork throws sync', async () => {
	const out = []
	await expect(
		queuedWork(
			[1, 2, 3, 4, 5],
			n => {
				if (n === 2) throw new Error('meep')
				else out.push(n)
			},
			2
		)
	).rejects.toThrow('meep')
	expect(out).toEqual([1])
})

Possible p() & v() confusion

Looking at the source code and the examples, it seems to me that the library is using p() to signal/free up a resource, whereas v() is used to wait/"block" on the semaphore.

However, according to the usual definition of a semaphore, p() and v() should be the other way around: P waits on (decrements) the semaphore, while V signals (increments) it. Some resources to verify this are:

Please let me know if I'm misunderstanding the use case here. :)

Intuition surrounding `nr` values

How should one best figure out a meaningful value for this? The examples don't seem to follow any strict pattern.

Could the README include a rough guideline or set of heuristics for choosing this value for a given type of application?
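A rough rule of thumb, sketched from how the semaphore is used in the other examples on this page: nr is the maximum number of callers allowed to hold the semaphore at once, so it is usually sized to the downstream bottleneck (a connection pool size, an API's concurrency limit, the number of workers you want, and so on). The value 4, doWork, and tasks below are placeholders.

const { Sema } = require('async-sema');

// nr = 4: at most four tasks run doWork at the same time.
async function runAll(tasks, doWork) {
  const sema = new Sema(4);
  await Promise.all(
    tasks.map(async task => {
      await sema.acquire();
      try {
        return await doWork(task);
      } finally {
        sema.release();
      }
    })
  );
}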

Add examples to README

All examples are currently in an examples directory, but we should perhaps include the basic one in the README itself.

broken typescript definition

Hi all, currently the TypeScript definition seems broken; the TypeScript compiler complains about malformed type definition file (.d.ts) syntax.

Maybe it's off topic, but I love this package, so I've migrated async-sema to TypeScript. I'd like to discuss publishing a TypeScript version of async-sema. Should I publish it as an individual package, or just send a PR to this repo? Please kindly give me your advice. Thanks :)

.release() does not check actual token count

The nr value given to Sema() does not stop .release() from being called as many times as a user wants.

The example below shows an nr value of 1, but the user is able to obtain 2 concurrent uses.

const Sema = require("async-sema").Sema;

let i = 0;
const a = new Sema(1);
a.release();
a.acquire().then(() => {
	i++;
	console.log(`i = ${i}`);
	return a.acquire();
}).then(() => {
	i++;
	console.log(`i = ${i}`);
	return a.acquire();
});
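For illustration only (this wrapper is not part of the library): a hypothetical subclass that keeps its own count of held permits, which is the kind of check the report says .release() currently skips.

const { Sema } = require('async-sema');

// Hypothetical guard: refuses to release more permits than were acquired.
class CheckedSema extends Sema {
  constructor(nr, opts) {
    super(nr, opts);
    this.held = 0;
  }
  async acquire() {
    const token = await super.acquire();
    this.held++;
    return token;
  }
  release(token) {
    if (this.held <= 0) {
      throw new Error('release() called without a matching acquire()');
    }
    this.held--;
    return super.release(token);
  }
}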

RateLimit does not work with number less than 1.

async function f() {
  const lim = RateLimit(0.5, { timeUnit: 1000 }); // rps
  console.time('order');

  for (let i = 0; i < 100; i++) {
    await lim();
    console.timeLog('order');
  }
  console.timeEnd('order');
}
f();

order: 1.083s
order: 2.085s
order: 3.087s
order: 4.091s

For comparison, expressing the same rate as RateLimit(1, { timeUnit: 2000 }) behaves as expected:

async function f() {
  const lim = RateLimit(1, { timeUnit: 2000 }); // rps
  console.time('order');

  for (let i = 0; i < 100; i++) {
    await lim();
    console.timeLog('order');
  }
  console.timeEnd('order');
}
f();

order: 2.071s
order: 4.075s
order: 6.078s
order: 8.081s

It should work with numbers less than 1, or else the limitation should be documented.
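A sketch of the workaround the second snippet implies, wrapped in a helper (the helper name is made up here and is not an API of the library): keep the rate at 1 and scale timeUnit instead whenever the requested rate is below 1.

const { RateLimit } = require('async-sema');

// Hypothetical helper: fractionalRateLimit(0.5) behaves like RateLimit(1, { timeUnit: 2000 }).
function fractionalRateLimit(rps, { timeUnit = 1000, ...rest } = {}) {
  if (rps >= 1) return RateLimit(rps, { timeUnit, ...rest });
  return RateLimit(1, { timeUnit: timeUnit / rps, ...rest });
}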

Add Sema.run utility for easy task processing

I think adding the following utility function might simplify working with semaphores a lot:

export class Sema {
    public runTask<T>(task: (token?: any) => Promise<T>): Promise<T> {
        return this.acquire().then(token =>
            Promise.resolve(token)
                .then(task)
                .finally(() => this.release(token))
        );
    }
}

It would allow tasks to be queued for processing simply and safely while making sure that acquired tokens are never lost:

const sem = new Sema(3);

// Run many tasks in parallel, limited by sem
for (let i = 0; i < 100; i++) {
    sem.runTask(async () => {
        // do some stuff
    });
}

// Alternatively use Promise.all and map
await Promise.all(items.map(item => sem.runTask(() => processItem(item))))

Args object should be optional

Now the constructor requires an empty object for default args:

new Sema(10, {})

while

new Sema(10)

would be optimal.
