Perforce Buildkite Plugin

A Buildkite plugin that lets you check out code from Perforce Version Control on Windows, Linux and macOS platforms.

  1. Configure at least P4PORT and P4USER (see examples below)
  2. Provision with credentials - a P4TICKETS file is recommended
  3. Optionally customise workspace mapping with stream, sync or view settings.

The P4CLIENT, P4USER and P4PORT used by the plugin are written to a P4CONFIG file at the workspace root and the P4CONFIG env var is set, so build scripts are able to automatically pick up configuration for any further interactions with Perforce.
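
For illustration, the generated P4CONFIG is a standard Perforce config file of KEY=value lines, along these lines (values are examples only, not the plugin's actual output):

P4PORT=perforce:1666
P4USER=username
P4CLIENT=<generated client name>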

Examples

Configuration via env vars

env:
  P4PORT: perforce:1666
  P4USER: username

steps:
  plugins:
    - improbable-eng/perforce: ~

Configuration via plugin

steps:
  plugins:
    - improbable-eng/perforce:
        p4port: perforce:1666
        p4user: username

P4PORT may also be configured by setting BUILDKITE_REPO for your pipeline.

Configuration

Basic

p4user/p4port/p4tickets/p4trust (optional, string)

Override configuration at the User Environment level. May be overridden by P4CONFIG or P4ENVIRO files.

See p4 set for more on system variables and precedence.
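
For example, pointing the plugin at a provisioned tickets file (values and paths below are illustrative, not defaults):

p4port: ssl:perforce:1666
p4user: buildkite
p4tickets: /var/lib/buildkite-agent/.p4tickets
p4trust: /var/lib/buildkite-agent/.p4trust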

fingerprint (optional, string)

Supply a trusted p4 server fingerprint to ensure the server the client connects to has not been MITM'd.
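
For example (placeholder value - use the fingerprint reported by your own server):

fingerprint: AB:CD:EF:12:34:56:78:90:AB:CD:EF:12:34:56:78:90:AB:CD:EF:12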

stream (optional, string)

Which p4 stream to sync, e.g. //dev/minimal. Can be overridden by view.
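
For example:

stream: //dev/minimal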

sync (optional, []string)

List of paths to sync, useful when only a subset of the files in the client's view is required.

sync:
  - //dev/minimal/.buildkite/...
  - //dev/minimal/scripts/...

view (optional, string)

Custom workspace view. Must consist of concrete depot paths. Overrides stream.

view: >-
  //dev/project/... project/...
  //dev/vendor/... vendor/...

Advanced

client_options (optional, string)

Default: clobber.

Additional options for the client workspace, see Options field.

client_options: noclobber nowriteall

client_type (optional, string)

Default: writeable.

readonly and partitioned client workspaces can be used to reduce the impact of automated build systems on Perforce server performance. See the related article Readonly and Partitioned Client Workspaces.

Note that writeable client workspaces must be deleted and re-created to change to readonly or partitioned and vice versa.

Note that readonly or partitioned workspaces do not appear in the db.have table, which prevents them from being used as a revision specifier.

This adds a caveat if you wish to re-use workspace data across different machines: the original client which populated that workspace must have been writeable.

(e.g. If a disk with existing workspace data is attached to a new machine, the plugin will create a new client, read the old workspace name from P4CONFIG and p4 flush //...@<old-workspace>. The flush command fails if the old workspace was not of type writeable)
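
For example, to opt into a partitioned client (an illustrative choice - pick the type that suits your server, per the article above):

client_type: partitioned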

parallel (optional, string)

Default: 0 (no parallelism)

Number of threads to use for parallel sync operations. High values may affect Perforce server performance.
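
For example, to sync with 8 parallel threads (an illustrative value - tune against your own server):

parallel: 8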

share_workspace (optional, bool)

Default: no

Allow multiple Buildkite pipelines to share each stream-specific client workspace.

Useful to avoid syncing duplicate data for large workspaces.

Can only be used with stream workspaces and when no more than one buildkite-agent process is running on that machine.
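
For example:

share_workspace: true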

stream_switching (optional, bool)

Default: no

Allows multiple Buildkite pipelines to share a single client workspace, switching streams as required.

Must have share_workspace: yes to take effect.
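
For example, a sketch enabling both options together:

share_workspace: true
stream_switching: true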

Triggering Builds

There are a few options for triggering builds that use this plugin, listed here from least valuable but most convenient to most valuable but least convenient.

Manual

Relies on people within your team manually clicking New Build within the Buildkite UI.

  • To build the current head revision on the server - accept the defaults.
  • To build a specific revision - paste the revision number into the Commit textbox.
    • Note you can also use more abstract p4 revision specifiers such as @labelname or @datespec
  • To build a shelved changelist - paste your changelist number into the Branch textbox.

Schedule

Scheduled builds with a cron schedule in Buildkite - this requires no additional setup, but provides the slowest response time between a change being made and a build being triggered.

Polling

A service polls your Perforce server for the current head revision and POSTs to the Buildkite API to trigger builds for any new changes. Note that you will need to store state to avoid duplicate or skipped builds.

P4 Trigger

Set up a p4 trigger which POSTs to the Buildkite API to trigger a build. See p4 triggers for more information. Note that this will require admin access to the Perforce server.

See examples for sample p4 trigger scripts.

Contributing

OSX

Run dev/setup_env_osx.sh

A Python virtualenv for running tests, .dev-venv, will be created at the repo root.

Run the test_server_fixture unit test to check everything is set up correctly:

source .dev-venv/bin/activate
pytest python/test_perforce.py -k test_server_fixture

Linux/Windows

TBC, feedback welcome.

Suggested workflow

Making changes to python/

  • Read implementation of test_server_fixture in test_perforce.py
  • Write unit test in test_perforce.py, optionally making changes to the test fixture if required
  • Implement new functionality
  • Iterate via unit test

Making changes to hooks/ and scripts called by hooks

  • Add entries to local-pipeline.yml to test new behaviour, if relevant
  • Run make to start p4d on localhost:1666, vendor the plugin, run the pipeline and kill p4d.

Contributors

ca-johnson, dependabot[bot], gavinelder, helcaraxan, improbaben89, improbable-mattchurch, jordanvogel, super-filip


Issues

Update p4python dependency

Since 2018.2, these changes have happened (https://www.perforce.com/perforce/doc.current/user/p4pythonnotes.txt) (below)

I think one of our internal studios is starting to use macOS, otherwise the only things that look notable are the memory leak fixes.

I'm not sure precisely how to test that a change like this is safe. Is there a test plan/matrix of items to assert against for our platform/Python combinations? I'd be guessing:

  • windows 2019 / linux
  • python 2.7.latest / 3.5.latest / 3.6.latest / 3.7.latest / 3.8.latest (new in 2020.1)

is the matrix. On each, is it sufficient to run the test automation, or are there other assertions to make?

changelog

New functionality in 2020.1

#1979782 (Job #100909) * ***
    Added support for Python 3.8.

#1971901 (Job #102669) * ** ***
	Updated the spec templates to match the 2020.1 Helix Server specs.

New functionality in 2019.1

#1804040 (Job #98777) * ***
P4Python now requires p4api 2019.1
Added OSX support for 10.14,  removed support for 10.8 and earlier
Added additional libraries from p4api 19.1 to the link lines on all platforms.
Fixed name conflicts between p4python modules and p4api extensions.
On Linux, detect glib version so the correct p4api libraries are downloaded and linked.

#1804040 (Job #97426) * ***
 Will now link with either openssl 1.0.2 or 1.1.1

#1804040  (Job #98778) *
On Linux, if -ssl is not specified, will look for compatible ssl libraries,
   and if not found, will download the openssl source, and build and install it.

#1894940 * (Job #98779) *
Fixed Windows issue where OpenSSL library names changed with Openssl 1.1.0+

Bug fixes in 2019.1

#1804040 (Job #98261) *
Changed setup.py to use subclasses, and process arguments the way setuputils wants.

#1894940 (Job #98423) *
Regex change to allow for space/extra characters in OpenSSL version

#1841909 (Job #99571) *
Memory leak in P4Result::Reset

#1841909 (Job #99549) *
Memory leak in Exception Handler

#184040 (Job #98782) *
P4Python does not install correctly with Maya 2019 "mayapy" interpreter

#1812102 (Job #98261) *
P4Python attempts to download P4API every time, ignoring --apidir

Support re-use of synced client workspaces when hostnames change

Use case:

  • Sync perforce on a gcloud instance
  • Save data as an image, so that syncs are incremental
  • Scale out image, avoiding N initial syncs

Current behaviour:

  • Workspace is blatted and synced from scratch because the client name changed

Ideal behaviour:

  • Workspace is flushed to the revision matching the data already on disk

Impl. details:

  • Write a marker file with hostname and revision. If hostname changes, flush to revision.

Allow multiple paths in partial stream sync

Right now the sync parameter takes a single path.
There is no reason why we couldn't take multiple paths, which opens up scope to optimise build performance by only syncing relevant directories.

Considerations:

  • Backwards compatibility (maybe not necessary with a clear message in the changelog)
  • Sync path is used to filter unshelve of pending cl

Install requirements into a venv

This avoids polluting global python installs on agents with our requirements

Checkout hook would look like:

VENV_DIR=.venv  # hypothetical location
if [ ! -d "$VENV_DIR" ]; then
  python -m venv "$VENV_DIR"
fi
. "$VENV_DIR/bin/activate"
pip install -r requirements.txt
python checkout.py

Support unshelving exclusive lock files (+l)

Currently, to perform presubmit tests we unshelve a changelist into the buildkite agent's client workspace.

For exclusive lock files, this can block people from working (it also prevents tests running in parallel)

We should look at a solution which involves p4 print to get shelved file contents instead of actually unshelving them.

Will need to track which files have been modified and clean them up

Remove 'root' setting altogether

Currently used to ensure that, when running the integration test with make, everything ends up in a p4_workspace folder that can be gitignored and cleaned up easily.

Could probably be replaced by overriding BUILDKITE_BUILD_CHECKOUT_PATH or similar.

Write test fixtures as code instead of crafting by-hand

The workflow for modifying test server fixture is to run unit tests, pause them halfway, make changes and save the resulting fixture. (See test_perforce.py:test_fixture)

Instead, it would be better to write as code:

server = setup_server()
server.add_stream("my-stream")

client = setup_client(server, stream="my-stream")
changelist = client.new_changelist()
changelist.add_file('file.txt', content="Hello World")
client.submit(changelist)

changelist = client.new_changelist()
changelist.add_file('file.txt', content="Goodbye World")
client.shelve(changelist)

This would set up a server with one stream, where a client has submitted a changelist and holds a shelved changelist.

This way, we can more easily create and test different scenarios.

Support auto-resolve of shelved changes

When someone kicks off a build with a shelved changelist, there might be merge conflicts when unshelving it. For now, we can live with that (resolve locally, re-shelve and try again), but it would be nice to automatically resolve conflicts with -am.

Resolve: None: The current workspace files are used for the build.

Resolve: Safe (-as): Accepts the file in the depot if it has the only changes. Accepts the file in the workspace if it has the only changes. Doesn’t resolve if both the depot and workspace files have changed.

Resolve: Merge (-am): Accepts the file in the depot if it has the only changes. Accepts the workspace file if it has the only changes. Merges changes if both the depot and workspace files have changed and there are no conflicts.

Resolve: Force Merge (-af): Accepts the file in the depot if it has the only changes. Accepts the workspace file if it has the only changes. Creates a merged file if both the depot and workspace files have changed, even if there are conflicts. Where there are conflicts, both versions are included with text notations indicating the conflicts.

Resolve: Yours (-ay): -- keep your edits: Uses the file that is in the workspace and ignores the version of the file that is in the depot.

Resolve: Theirs (-at) -- keep shelf content: Replaces the copy of the file in the workspace with the revision that is in the depot, discards any changes in the workspace file.

Annotate the build revision #

When a build is started with 'HEAD', we coerce this to whatever the latest commit is during pipeline upload and set it as build metadata, then sync to the same revision in following jobs.

I'm not sure how to override the link to github in the top bar of a build yet, but maybe this is possible - that would be a nice way to surface this info.

For now, it's adequate to send an annotation at the same time you set the build metadata, just so it's at least easily discoverable.

Set BUILDKITE_COMMIT

It's quite common to expect BUILDKITE_COMMIT to be set to something sensible inside a job. We should set this to a Perforce revision number, even though that's slightly weird.

Perhaps BUILDKITE_REVISION would be a better name for this env var, to be a bit more source-control agnostic.

Missing p4config? Delete + recreate client

  • a crutch for when people run multiple workspaces on one machine (causes a race, but fine for now)
  • allows changes like noclobber, allwrite etc. to take effect by just deleting the p4config by hand - no need to manage the workspace myself

Use readonly or partitioned clients by default

From brief discussion on #185

Using writeable clients is usually not necessary in CI and can cause performance regression for 'real' users as the db.have table becomes fragmented over time.

Partitioned is a more balanced choice, but readonly may be the best default to ensure that opening files in the workspace during CI is always done with an understanding of the potential consequences (i.e. we try to support this, but it's not the default mode).

Make sure to bump the major semver version.

Fixture server fails to start vs different versions of p4d

Confirmed compatible version:
Rev. P4D/MACOSX1010X86_64/2018.2/1779952 (2019/04/02).

Currently the error you receive is 'failed to connect to server'.
Really, the error is that the fixture is incompatible with certain versions of p4d.

A) Write instructions to install the correct version
B) Surface the error better

Running a build against a shelved CL that doesn't exist should give a nicer error than a stacktrace.

Versions of relevant software used

Buildkite plugin v4.4.0

What happened

Python stacktrace

What you expected to happen

Something like

ERROR: your CL (75176) does not exist in perforce.

How to reproduce it (as minimally and precisely as possible):

Trigger a build of a shelved CL that does not exist.

Full logs to relevant components

Logs

Running plugin perforce checkout hook (5s)

# A hook runner was written to "/var/lib/buildkite-agent/tmp/20693f98-b9ca-44d8-a1de-a880198dba21/buildkite-agent-bootstrap-hook-runner-571790165" with the following:
$ /var/lib/buildkite-agent/tmp/20693f98-b9ca-44d8-a1de-a880198dba21/buildkite-agent-bootstrap-hook-runner-571790165
You are using pip version 8.1.1, however version 20.2.4 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
10:12:41 p4python INFO: p4 trust -y
10:12:42 p4python INFO: p4 client -o bk-p4-something-something
10:12:42 p4python INFO: p4 client -i
10:12:43 p4python INFO: Client bk-p4-something-something not changed.
10:12:43 p4python INFO: p4 revert -w //...
10:12:43 p4python WARNING: //... - file(s) not opened on this client.
10:12:43 p4python INFO: p4 sync --parallel=threads=0 //my-depot/my-folder/my-second-folder/.buildkite/...@75176
10:12:43 p4python WARNING: //my-depot/my-folder/my-second-folder/.buildkite/...@75176 - file(s) up-to-date.
10:12:44 p4python INFO: p4 describe -S 75177
Traceback (most recent call last):
  File "/var/lib/buildkite-agent/plugins/github-com-improbable-eng-perforce-buildkite-plugin-v4-4-0/hooks/../python/checkout.py", line 47, in <module>
    main()
  File "/var/lib/buildkite-agent/plugins/github-com-improbable-eng-perforce-buildkite-plugin-v4-4-0/hooks/../python/checkout.py", line 37, in main
    repo.p4print_unshelve(changelist)
  File "/var/lib/buildkite-agent/plugins/github-com-improbable-eng-perforce-buildkite-plugin-v4-4-0/python/perforce.py", line 291, in p4print_unshelve
    depotfiles = changeinfo['depotFile']
KeyError: 'depotFile'

Anything else we need to know

Send revision metadata in 'git show' format to buildkite:git:commit

This allows the revision link and build message to be set at runtime

Buildkite parses this special bit of metadata to set BUILDKITE_MESSAGE and BUILDKITE_COMMIT.

"buildkite:git:commit"
Format:
"git", "--no-pager", "show", "HEAD", "-s", "--format=fuller", "--no-color"

Plugin occasionally fails to sync backwards

Agent bless used: https://buildkite.com/improbable/midwinter-bless/builds/92, note CL 68063

Specified CL 68005 for build, with a patch of 68241

Note errors:
Asset 'Cascade.umap' has been saved with engine version newer than current and therefore can't be loaded. CurrEngineVersion: 4.24.3-67998+++midwinter+main AssetEngineVersion: 4.24.3-68054+++midwinter+main

(CL 67998 was editor version compatible with CL 68005)

This would only be present if Cascade.umap wasn't synced to CL 68005. It turns out that the agent running the step was previously synced to CL 68054 from the last build it ran.

https://buildkite.com/improbable/midwinter-build/builds/2697#3a9cc5c2-d324-42b3-8d17-21d32316d055/6-7

Plugin version: perforce-buildkite-plugin#v4.3.2

FR: Allow configuration of the exit code on failure

If the checkout fails for some reason, that's quite likely to be a systemic failure (whether persistent like p4 being down, or transient like a network interruption on a large asset) rather than a user-level failure.

We would like to be able to set up auto-retry on BK jobs so that such failures are automatically retried, but while the exit code for such failures is 1 we cannot distinguish them from other user-level job failures.

We'd like to be able to specify the exit code that p4 checkouts fail with so that we can distinguish that and match it to the exit codes we use for auto-retry.

Support pre-commit testing of shelved changes

We need to support a workflow where a user has a changelist on their machine and wants to test it prior to submission.

  • Can supply a changelist to a build
  • Plugin unshelves this change into the client workspace for each job
  • Unshelved changes are reliably cleaned up for following jobs (i.e. restored to match the #have list)

Things to consider:

  • Downloading large shelves repeatedly - could we cache the shelf in the pipeline upload job?
  • Write out a file which lists all the modified files and the mode they were modified in. We can then 'p4 clean' this list of files to ensure consistency with the workspace's #have list (or something faster, if it comes to me)

Windows Support

  • Add hooks/checkout.bat to support checking out Perforce on Windows
  • Support running unit tests on Windows

Investigate using the pre-built binaries

p4python offers pre-built binaries for different platforms (https://www.perforce.com/downloads/helix-core-api-python) which would allow the pip-install step to be skipped and instead rely on binaries found on the host.

We bake our dependencies into our hosts, so this would be a small optimisation for runtime in terms of speed (~10s on first-install).

I'm mostly interested in it as a way to reduce the run-time dependencies (since pip may try to compile dependencies that are not found, at which point that toolchain must be present for that to succeed). I appreciate that on CI machines, that toolchain is likely to be present regardless, but perhaps not at a workable version or setup.

(I'm not expecting this to necessarily be something we do, but thought I'd write it up in case)

Pre-exit hook fails if venv was never setup

For example, if some other plugin fails, our pre-exit hook will still run but complain that the virtualenv doesn't exist. If the venv doesn't exist, don't bother trying to run.

Only unshelve files which match the configured sync paths

For example, if you only need to sync the .buildkite dir in a bootstrap step, this avoids unnecessary downloading of files you will not use.

Could probably use p4 where to filter out unmapped files (I think the local location comes out as "UNMAPPED" or something).

edit: unmapped files are already omitted; instead we must match against P4Repo.sync_path, as sometimes files are mapped but we just don't want to sync them.

For now, support in p4print_unshelve (see perforce.py) is most useful, as the traditional unshelve is not really in active use

Fix `make local_run` workflow

Currently throws an exception because we depend on metadata being set in these lines:

checkout.py

    description = repo.description(get_users_changelist() or revision.strip('@'))
    set_build_info(revision, description)

Save pending changelists as buildkite artefacts

Not 100% sure about this, but it might be better than the current method of:

  1. Make a copy of the shelved CL
  2. Save the CL number in BK metadata
  3. Prefer this over the user's original, for stricter versioning

Possible benefits:
  A) Less load on the p4 server, no changelists that hang about forever
  B) Faster vs p4 print -o for each file (maybe?)

Major caveat: loses the ability to iterate quickly on a given step by retrying it without needing to run the entire pipeline again.

Support 'git mirrors' style single-checkout

  • Then symlink to that directory from the pipeline-specific directories
  • Only support if you have a single agent running, which we can auto-detect
  • Should be split by stream (unless you want to support stream switching - probably not)

Revert unshelved files pre-exit

Limits the amount of pollution to p4v from build agents that have files checked out

ALTERNATIVE:
see if p4 has some special mode for workspaces that exempts them from this view (including exclusive-checkout files, so people can presubmit test these)

Move coercion of env vars into python side

In the current checkout/checkout.bat, we coerce buildkite plugin vars into P4 env vars.

Better to just do this coercion inside the hook written in Python; this way the platform-specific hook files can just be 'install reqs, invoke python'.

Allow clobbering non-writeable files

If a machine is unexpectedly shut down mid-sync, there can be files on disk that aren't in the #have list yet. This is fine; we should be okay to overwrite non-readonly files - afaik this protection exists to guarantee that you don't blat over users' actual work.

Write a .p4config inside the client workspace

This will make it really easy to:

  • Detect that the entire workspace has been deleted
  • Enable debugging when remoting in without having to gather various bits of context to know what port, user, client etc to connect with
  • Put special config like increasing recv buffer size in there to help everybody automatically

You will need to:

  • Write this file out in init
  • Set P4CONFIG

Provide a better error vs empty shelved changes

Currently, this error surfaces when trying to make a copy of the shelved files.

Instead, we should raise a more informative error saying "change X did not contain any shelved files that are mapped into this workspace/stream"

18:33:09 p4python INFO: p4 sync -q --parallel=threads=0 //depot/dev/.buildkite/...@11702
18:33:10 p4python INFO: p4 revert //...
18:33:10 p4python WARNING: //... - file(s) not opened on this client.
18:33:10 p4python INFO: p4 unshelve -s 11678
18:33:10 p4python WARNING: Change 11678 - no file(s) to unshelve.
18:33:10 p4python INFO: p4 change -o
18:33:10 p4python INFO: p4 change -i
18:33:11 p4python INFO: Change 11705 created.
18:33:11 p4python INFO: p4 changes -c client-name -s pending -m 1
18:33:11 p4python INFO: p4 shelve -c 11705
18:33:11 p4python ERROR: No files to shelve.
Traceback (most recent call last):
  File "/var/lib/buildkite-agent/plugins/github-com-ca-johnson-perforce-buildkite-plugin-override-git-metadata/hooks/../python/checkout.py", line 44, in <module>
    main()
  File "/var/lib/buildkite-agent/plugins/github-com-ca-johnson-perforce-buildkite-plugin-override-git-metadata/hooks/../python/checkout.py", line 35, in main
    changelist = repo.backup(user_changelist)
  File "/var/lib/buildkite-agent/plugins/github-com-ca-johnson-perforce-buildkite-plugin-override-git-metadata/python/perforce.py", line 165, in backup
    self.perforce.run_shelve('-c', backup_cl)
  File "/var/lib/buildkite-agent/.local/lib/python2.7/site-packages/P4.py", line 646, in run_shelve
    return self.run("shelve", *nargs, **kargs)
  File "/var/lib/buildkite-agent/.local/lib/python2.7/site-packages/P4.py", line 611, in run
    raise e
P4.P4Exception: [P4#run] Errors during command execution( "p4 shelve -c 11705" )

[Error]: 'No files to shelve.'

Travis CI failing to install p4python (so cannot pytest)

As part of pip install p4python, it retrieves the P4 API from ftp.perforce.com to build against (since p4python doesn't have any wheels for Linux, it needs to build from source).

curl ftp.perforce.com fails - if we get this to pass it should have no problem installing p4python.

Uniquely identify workspaces more accurately

BUILDKITE_AGENT_ID is too unique - restarting the agent causes this to change and workspaces to thrash

Ideally we would have consistency with:
A) The host
B) That agent, e.g. 1/2/3/4

Ideal workspace name:
bk-p4-<host>-N

Use of `capsys` causes log access errors on unit test failures

noticed in #203

When unit tests fail, there is an issue where the log file is closed prior to the test exiting.

Example error:

Message: 'p4 sync --parallel=threads=0 //...'
Arguments: ()
--- Logging error ---
Traceback (most recent call last):
  File "/Users/carl/.pyenv/versions/3.6.0/lib/python3.6/logging/__init__.py", line 989, in emit
    stream.write(msg)
ValueError: I/O operation on closed file.

Repro caused by removing one of the with exception blocks in a unit test (or adding one where an exception is not raised)

I was able to track this down to the use of capsys in test_server_fixture

If we remove such usage, the issue is resolved. However, it is quite convenient to print the server address to stdout.

Suggested solutions:

  • Find a workaround to allow continued use of capsys
  • Remove the use of capsys and add instructions in test_server_fixture to run pytest with output capturing disabled (e.g. pytest -s) so you can still see the test server address printed to stdout

Annotate a warning/error if unshelved changelist contents differed in any steps

import subprocess
from hashlib import md5

change_hash = md5()
for digest in changeinfo['digests']:
    change_hash.update(digest.encode())

if not set_metadata('buildkite:perforce:changedigest', change_hash.hexdigest()):
    # Metadata already set by an earlier step - compare against it
    if get_metadata('buildkite:perforce:changedigest') != change_hash.hexdigest():
        subprocess.run(['buildkite-agent', 'annotate', '--style', 'error',
                        'Shelved files were modified during the build, some steps were run '
                        'against different versions of your shelved files. Please run another '
                        'build to validate your change prior to submission'], check=True)

"Negative" sync paths

We can specify paths via sync to say we want only these paths, but actually for a particular use case we want to sync everything except one path.

It would be convenient if we could use a Perforce-like workspace view convention in the sync option, like:

//sync/me/...
-//sync/me/not/...

or

-//sync/me/not/...  

(where //... is implicit)

Document how a user might set up a hook to trigger builds

  • A) Put a script on the server that is run on commits (p4 triggers)
  • B) Poll head revision, trigger builds for each revision in between

Ideally we get a sample p4 trigger and/or polling service so people don't have to write this themselves and repeat our mistakes
