

autovmaf

autovmaf - A toolkit to automatically encode multiple bitrates and perform automated VMAF measurements on all of them.

📖 Read the documentation 👀

Report a Bug · Request a Feature


By optimizing ABR ladders for specific content, you make sure there are no wasteful rungs, which has been shown to cut bandwidth usage in half.

Usage

Transcoding and VMAF analysis can be run either in AWS or locally. When running in AWS, you will need a running ECS cluster with a task definition configured to run easyvmaf-s3.

Installation

npm install --save @eyevinn/autovmaf

Environment Variables

A few environment variables can be set. These are:

LOAD_CREDENTIALS_FROM_ENV=true   # Load AWS credentials from environment variables
AWS_REGION=eu-north-1
AWS_ACCESS_KEY_ID=ABCD...
AWS_SECRET_ACCESS_KEY=EFGH...

Generate VMAF measurements

To generate VMAF measurements, you will need to define a job, which can be created with the createJob() function.

const { createJob } = require('@eyevinn/autovmaf');

const vmafScores = await createJob({
  name: 'MyVMAFmeasurements',
  pipeline: 'pipeline.yml',
  encodingProfile: 'profile.json',
  reference: 'reference.mp4',
  models: ['HD', 'PhoneHD'], // optional
  resolutions: [
    {
      // optional
      width: 1280,
      height: 720,
      range: {
        // optional
        min: 500000,
        max: 600000
      }
    }
  ],
  bitrates: [
    // optional
    500000, 600000, 800000
  ],
  method: 'bruteForce' // optional
});

When creating a job, you can specify:

  • Name
    • This will name the folder in which to put the files.
  • Pipeline
    • Path to a YAML-file that defines the pipeline. See examples/pipeline.yml for an example AWS-pipeline.
    • When running locally, pipeline data can be inlined in the job definition.
  • Encoding Profile
    • Path to a JSON-file that defines how the reference should be encoded. When using AWS, this is a MediaConvert configuration. See an example for AWS at examples/aws/encoding-profile.json. For local pipelines, this is key-value pairs that will be passed as command line arguments to FFmpeg. If pipeline data is inlined in the job definition, encodingProfile can be omitted and key-value pairs can instead be set in the ffmpegOptions property of the pipeline object.
  • Reference
    • Path to the reference video to analyze. Normally a local path, but when using AWS, this can also be an S3-URI.
  • Models (optional)
    • A list of VMAF models to use in the evaluation. This can be HD, PhoneHD, or UHD. Defaults to HD.
  • Resolutions (optional)
    • A list of resolutions to test. By default it will test all resolutions in the example ABR-ladder provided by Apple in the HLS Authoring Spec.
    • Range (optional)
      • A min and max bitrate for testing a specific resolution. Adding a range will filter out bitrates that are outside of the given range. It is disabled by default.
  • Bitrates (optional)
    • A list of bitrates to test. By default, a list of bitrates between 150 kbit/s and 9000 kbit/s.
  • Method (optional)
    • The method to use when analyzing the videos. Either bruteForce or walkTheHull. By default bruteForce. NOTE: walkTheHull is not implemented at the moment.
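When running locally, the pipeline data can be inlined in the job definition instead of pointing at a pipeline YAML file. The sketch below is hypothetical: the file names are placeholders, and the pipeline property names follow the local pipeline examples elsewhere in this README.

```javascript
// Hypothetical local job with inlined pipeline data: no pipeline.yml or
// encodingProfile file is needed, since the ffmpeg arguments are given
// directly in pipeline.ffmpegOptions.
// const { createJob } = require('@eyevinn/autovmaf');

const job = {
  name: 'inline-pipeline-test',
  reference: 'reference.mp4', // placeholder reference video
  models: ['HD'],
  resolutions: [{ width: 1280, height: 720 }],
  bitrates: [500000],
  pipeline: {
    ffmpegEncoder: 'libx264',
    ffmpegOptions: {
      '-pix_fmt': 'yuv420p',
      '-preset': 'medium'
    }
  }
};

// const vmafScores = await createJob(job);
```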

Create job using yaml

const { createJob } = require('@eyevinn/autovmaf');
const YAML = require('yaml');
const fs = require('fs');
const parseResolutions = (resolutions) =>
  resolutions.map((resolutionStr) => ({
    width: parseInt(resolutionStr.split('x')[0]),
    height: parseInt(resolutionStr.split('x')[1])
  }));
const jobFile = fs.readFileSync('job.yml', 'utf-8');
const jobData = YAML.parse(jobFile);
const job = {
  ...jobData,
  resolutions:
    jobData['resolutions'] !== undefined
      ? parseResolutions(jobData['resolutions'])
      : undefined
};
createJob(job);

An example of creating a job from a YAML-file can be seen in the examples-folder.
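For reference, a hypothetical job.yml consumed by the snippet above could look like the following. Note that the resolutions are given as WxH strings, which parseResolutions converts to objects; all file names here are placeholders.

```yaml
name: yaml-job-test
pipeline: pipeline.yml
encodingProfile: profile.json
reference: reference.mp4
models:
  - HD
resolutions:
  - 1280x720
  - 1920x1080
bitrates:
  - 500000
  - 800000
```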

Read VMAF-scores

Using getVmaf(), you can read VMAF scores from a JSON file or a directory of JSON files. This works with both local paths and S3 URIs with an "s3://" prefix.

Example:

const vmafFiles = await getVmaf('s3://path/to/vmaf/');

vmafFiles.forEach((file) => {
  console.log(file.filename + ': ' + file.vmaf);
});

CLI Usage

When running with the CLI, all transcoding and VMAF analysis is run locally.

Requirements

Global installation

Installing globally with npm -g makes the autovmaf command available in your path.

npm install -g @eyevinn/autovmaf

Environment variables

  • EASYVMAF_PATH - needs to point to the file easyVmaf.py from your easyVmaf installation.
  • FFMPEG_PATH - only needs to be set if ffmpeg is not in your path.
  • PYTHON_PATH - only needs to be set if python is not in your path.

Command line options

Available command line options for the CLI can be listed with the --help argument:

autovmaf [source]

run transcode and vmaf analysis for videofile source

Commands:
  autovmaf [source]                 run transcode and vmaf analysis for
                                    videofile source                   [default]
  autovmaf suggest-ladder <folder>  Suggest bitrate ladder given vmaf results
  autovmaf export-csv <folder>      Export Vmaf results as csv

Positionals:
  source  SOURCEFILE                                                    [string]

Options:
  --version         Show version number                                [boolean]
  --help            Show help                                          [boolean]
  --resolutions     List of resolutions, ie 1920x1080,1280x720...       [string]
  --bitrates        List of bitrates, ie 800k,1000k,...                 [string]
  --name            Name for this autovmaf run                          [string]
  --models          List of VMAF Models to use                          [string]
  --job             File with job definition                            [string]
  --saveAsCsv       Save VMAF measurements as a .csv file in addition to a JSON
                    file                              [boolean] [default: false]
  --skipTranscode   Skip transcode and run vmaf on already transcoded files
                                                      [boolean] [default: false]
  --skipExisting    Skip transcode for already transcoded files
                                                       [boolean] [default: true]
  --probeBitrate    Read bitrate of transcoded file with ffprobe
                                                      [boolean] [default: false]
  --ffmpeg-options  List of options to pass to ffmpeg, on the form
                    key1=value1:key2=value2                             [string]

Output files will be stored in a folder corresponding to the argument given to the --name option. If resolutions and/or bitrates are not specified, default values will be used; see above.

Providing job definition in a json or yaml file

With the --job option, a path to a YAML or JSON file with a job definition can be passed to the CLI. The values defined in the file can be overridden with other command line options. For instance, the reference video defined in the job file can be overridden by passing a source file on the command line.

Using variables in the job definition

It is possible to iterate over variables other than bitrate and resolution when running a local encode. For instance, to run transcode and VMAF analysis with x265 in CRF mode for a number of CRF values, a job definition like the one below can be used (also available in examples/local/local-job-crf.yaml):

models:
  - HD
resolutions:
  - width: 1920
    height: 1080
bitrates:
  - 0
pipeline:
  ffmpegEncoder: libx265
  singlePass: true
  skipDefaultOptions: true
  ffmpegOptions:
    '-pix_fmt': 'yuv420p'
    '-preset': 'veryslow'
    '-x265-params': 'crf=%CRF%:scenecut=0:keyint=50:min-keyint=50:open-gop=0'
  easyVmafExtraArgs:
    '-threads': 20
pipelineVariables:
  CRF:
    - 22
    - 26
    - 30
    - 34

This will run transcode and VMAF analysis for CRF values 22, 26, 30, and 34. Variables are used in the ffmpeg options by inserting %VARIABLENAME%. This string will then be substituted with a value from the list of values in pipelineVariables.VARIABLENAME. Note that when running a CRF encode or another non-ABR mode, skipDefaultOptions must be set to avoid injecting bitrate options into ffmpeg. Also note that the CLI needs to be run with the --probeBitrate option to get the correct bitrate from the transcoded files.
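To illustrate the substitution (this is a simplified sketch, not the actual autovmaf implementation), each %CRF% placeholder in the option value is replaced with one value from pipelineVariables.CRF per run:

```javascript
// Simplified illustration of %VARIABLENAME% substitution in ffmpeg options.
// The template string is taken from the x265 example above.
const template = 'crf=%CRF%:scenecut=0:keyint=50:min-keyint=50:open-gop=0';
const crfValues = [22, 26, 30, 34]; // pipelineVariables.CRF

// One expanded option string per CRF value, i.e. one transcode per value.
const expanded = crfValues.map((crf) => template.replace(/%CRF%/g, String(crf)));

console.log(expanded[0]); // 'crf=22:scenecut=0:keyint=50:min-keyint=50:open-gop=0'
```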

Generate VMAF measurements example

autovmaf --resolutions 1920x1080,1280x720,960x540 --bitrates 500k,800k,1200k,1600k,2000k,3000k,4000k --name my-autovmaf-test1 my-source-video.mp4

With the above command, when the run is finished, transcoded files will be available in the folder my-autovmaf-test1 and VMAF data in the folder my-autovmaf-test1/HD.

Development

Run tests

npm test

About Eyevinn Technology

Eyevinn Technology is an independent consultancy firm specialized in video and streaming. Independent in the sense that we are not commercially tied to any platform or technology vendor.

At Eyevinn, every software developer consultant has a dedicated budget reserved for open source development and contribution to the open source community. This gives us room for innovation, team building and personal competence development, and it also gives us as a company a way to contribute back to the open source community.

Want to know more about Eyevinn and what it is like to work here? Contact us at [email protected]!

autovmaf's People

Contributors

birme · dependabot[bot] · friday · grusell · jonathanwalter · oscnord · quartercastle · slowmove · zapfire88


autovmaf's Issues

Improve suggest ladder functionality

The current algorithm for generating a bitrate ladder only considers bitrates that were actually measured for inclusion in the ladder. It would make sense to tweak the algorithm to use linear interpolation between measurement points, allowing it to select more optimal bitrates.
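A rough sketch of what such interpolation could look like (a hypothetical helper, not code from the autovmaf repository):

```javascript
// Estimate the VMAF score at an arbitrary bitrate by linear interpolation
// between measured (bitrate, vmaf) points. Bitrates outside the measured
// range are clamped to the nearest measurement.
function interpolateVmaf(points, bitrate) {
  const sorted = [...points].sort((a, b) => a.bitrate - b.bitrate);
  if (bitrate <= sorted[0].bitrate) return sorted[0].vmaf;
  if (bitrate >= sorted[sorted.length - 1].bitrate) return sorted[sorted.length - 1].vmaf;
  for (let i = 0; i < sorted.length - 1; i++) {
    const lo = sorted[i];
    const hi = sorted[i + 1];
    if (bitrate >= lo.bitrate && bitrate <= hi.bitrate) {
      const t = (bitrate - lo.bitrate) / (hi.bitrate - lo.bitrate);
      return lo.vmaf + t * (hi.vmaf - lo.vmaf);
    }
  }
}
```

A ladder builder could then evaluate candidate bitrates between measurement points instead of being limited to the measured ones.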

Update README.md

The readme needs to contain much more information. One should be able to use autoabr without going through the code base.

Handle hardcoded paths correctly

There might be an issue related to hardcoded paths in autovmaf when running the CLI/local jobs on Windows; see: https://github.com/Eyevinn/autovmaf/blob/e17167dff563b5d80e60eb8dcc26973d1ab52140/src/pipelines/local/local-pipeline.ts#L76C67-L76C76

(node:448) NOTE: We are formalizing our plans to enter AWS SDK for JavaScript (v2) into maintenance mode in 2023.

Please migrate your code to use AWS SDK for JavaScript (v3).
For more information, check the migration guide at https://a.co/7PzMCcy
(Use `node --trace-warnings ...` to show where the warning was created)
info: Creating job MyVMAFMeasurements.
info: ffmpegOptions: ["-vf","scale=640:360","-c:v","libx264","-b:v","150000","-maxrate","150000","-bufsize","300000"]
autovmaf <source>

run transcode and vmaf for videofile source

Commands:
  autovmaf <source>                 run transcode and vmaf for videofile source
                                                                       [default]
  autovmaf suggest-ladder <folder>  Suggest bitrate ladder given vmaf results

Positionals:
  source  SOURCEFILE                                                    [string]

Options:
  --help            Show help                                          [boolean]
  --version         Show version number                                [boolean]
  --resolutions     List of resolutions, ie 1920x1080,1280x720...       [string]
  --bitrates        List of bitrates, ie 800k,1000k,...                 [string]
  --name            Name for this autovmaf run
                                        [string] [default: "MyVMAFMeasurements"]
  --models          List of VMAF Models to use          [string] [default: "HD"]
  --ffmpeg-options  List of options to pass to ffmpeg, on the form
                    key1=value1:key2=value2                             [string]

Error: ffmpeg exited with code 4294967294: Error opening output file /dev/null.
Error opening output files: No such file or directory

    at ChildProcess.<anonymous> (C:\Users\kalle\AppData\Roaming\nvm\v18.18.0\node_modules\@eyevinn\autovmaf\node_modules\fluent-ffmpeg\lib\processor.js:182:22)
    at ChildProcess.emit (node:events:517:28)
    at ChildProcess.emit (node:domain:489:12)
    at ChildProcess._handle.onexit (node:internal/child_process:292:12)
PS C:\Users\kalle\Documents\Works\Gigset\AutoVMAF>

It should be possible to set bitrate range for a resolution

Currently all bitrates will be encoded and measured for all resolutions. It would be nice to be able to set a bitrate range for a resolution (optional).

Proposal:

{
    name: "MyVMAFmeasurements",
    pipeline: "pipeline.yml",
    encodingProfile: "profile.json",
    reference: "reference.mp4",
    models: ["HD", "PhoneHD"],
    resolutions: [
    { 
       width: 1280, 
       height: 720,
       range: {
         min: 500000,
         max: 600000
       }    
    }],
    bitrates: [500000, 600000, 800000],
    method: "bruteForce"
}

autovmaf wants to change profile parameters through its interface, while the OSAAS API gets the profile over HTTP. Make a decision and implement.

OSAAS encore api profile parameter description
User can provide a url to a profiles file on instance creation. If no such url is provided a url to default profiles is used.

If this is the case, then the user is required to have the profile hosted on http? Right now the default profiles are on github. I assume similar hosting would be required if one was to use their own profile.yml. As autovmaf is designed today, the user expects to have the ability to affect the resolution and bitrate of the transcoding output through the autovmaf interface. How do we want to solve this?

Local profile.yml pushed somewhere where the OSAAS encore api can access it?

Deliverable
Should be able to start a job in exact same process flow as local and aws pipelines. This likely implies that we will need some way for the autovmaf job parameters to be pushed to the hosted encore profile. Current control flow is that a .yml profile is stored on github or other http host, thereby accessible by the encore OSAAS api?

https://svt.github.io/encore-doc/#profiles
As described by the SVT encore documentation
A Profile can be seen as a general abstraction of an FFmpeg configuration - example, bitrate to use, thumbnail generation, the codec to use.
A Profile specifies a big part of the configuration used by an Encore Job - metadata, FFmpeg configuration and specific codec configuration.
A Profile is specified in the yaml-format.

Check if VMAF output file already exists before starting a task in ECS

Currently we do not check if an output json file for a specific resolution/bitrate already exists before starting an ECS task. There should be a check so that we do not run the same VMAF measurement task multiple times.

async analyzeQuality(reference: string, distorted: string, output: string, model: QualityAnalysisModel): Promise<string> {
  let outputFilename: string;
  if (isS3URI(output)) {
    const outputUrl = new URL(output);
    // Remove initial '/' in pathname
    outputFilename = outputUrl.pathname.substring(1);
  } else {
    outputFilename = output;
  }
  const outputBucket = this.configuration.outputBucket;
  const outputObject = outputFilename;
  const outputURI = `s3://${outputBucket}/results/${outputObject}`;
  const referenceFilename = await this.uploadIfNeeded(reference, outputBucket, path.dirname(outputObject));
  const distortedFilename = await this.uploadIfNeeded(distorted, outputBucket, path.dirname(outputObject));
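A hedged sketch of such a pre-flight check. The S3 HEAD call is injected as a function so the sketch is SDK-version agnostic; with AWS SDK for JavaScript v3, headObject could wrap s3.send(new HeadObjectCommand({ Bucket, Key })), which rejects when the object does not exist.

```javascript
// Hypothetical pre-flight check: skip starting an ECS task if the VMAF
// result object already exists. `headObject(bucket, key)` is an injected
// function that resolves if the object exists and rejects otherwise.
async function vmafResultExists(headObject, bucket, key) {
  try {
    await headObject(bucket, key);
    return true; // result already in S3: skip the measurement task
  } catch (err) {
    return false; // not found (or inaccessible): run the measurement
  }
}
```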

Help for running autovmaf in Docker-Container

Hi Devs,

Many Kudos for this excellent piece of software!

I'd like to create a Dockerfile for running autovmaf in a container, but currently it fails, hopefully you can help me?

Dockerfile:

FROM python:latest

COPY --from=mwader/static-ffmpeg:5.1.2 /ffmpeg /usr/local/bin/
COPY --from=mwader/static-ffmpeg:5.1.2 /ffprobe /usr/local/bin/

RUN apt update && sh -c 'curl -sL https://deb.nodesource.com/setup_19.x | bash -'
RUN apt install nodejs -y
RUN npm install -g @eyevinn/autovmaf

RUN git clone https://github.com/gdavila/easyVmaf.git
RUN pip3 install ffmpeg-progress-yield

ENV EASYVMAF_PATH=/easyVmaf/
ENV FFMPEG_PATH=/usr/local/bin/
ENV PYTHON_PATH=/usr/bin/

Building image:

docker buildx build --platform linux/amd64,linux/arm64 . -f Dockerfile -t r.gl-systemhaus.de/tools/autovmaf:latest --push

Running container fails:

docker run  -v /home/martini/:/tmp -it r.gl-systemhaus.de/tools/autovmaf:latest /bin/bash
autovmaf --resolutions 1920x1080,1280x720,960x540 --bitrates 500k,800k,1200k,1600k,2000k,3000k,4000k --name my-autovmaf-test1 /tmp/1080P50.mp4

Error: spawn /usr/local/bin/ EACCES
    at ChildProcess._handle.onexit (node:internal/child_process:285:19)
    at onErrorNT (node:internal/child_process:483:16)
    at process.processTicksAndRejections (node:internal/process/task_queues:82:21) {
  errno: -13,
  code: 'EACCES',
  syscall: 'spawn /usr/local/bin/',
  path: '/usr/local/bin/',
  spawnargs: [ '-formats' ]
}

=> I'm not an expert in node, so I don't know how to fix this error: Error: spawn /usr/local/bin/ EACCES

Make it possible to run autoabr as a lambda

Currently, when running a VMAF analysis, the program runs locally. This can cause issues if, for example, the internet connection is lost or the computer goes to sleep.

It would therefore be nice to have an option to run it as a lambda or something similar, so it isn't dependent on a local setup.


Revise S3 bucket file structure

Proposed new file structure:

bucket-name (or maybe S3 prefix)
├── encoded-files
│   └── job-name
│       └── HxW
│           ├── HxW_b.mp4
│           └── HxW_b2.mp4
├── reference-files
│   └── high-res-reference.mxf
└── results
    └── job-name
        └── model
            ├── HxW_b2_vmaf.json
            └── HxW_b_vmaf.json

replace fluent-ffmpeg with custom cli wrapper + pidusage package?

I looked into how we are measuring CPU usage with our fluent-ffmpeg fork, and I can see some issues with it. For starters, time is a shell builtin in bash, zsh and fish, so the binary is not needed there, and if it is not installed the local pipeline will fail. It will also fail on Windows. Maybe it should degrade gracefully instead, so we just don't get any CPU measurements, but I think we can still get them with pidusage (which should work cross-platform).

This only works if we have access to the pid. From what I have seen fluent-ffmpeg doesn't expose the pid or the process, but it's really not that hard to write a cli wrapper for ffmpeg. I have already done that previously (this doesn't help "build" the command for you unlike fluent-ffmpeg, making it much lighter and easier to maintain, it just helps you run the command and get the progress updates).

The pidusage package also recommends process.cpuUsage, which also works cross-platform, but it is not available on the child process, and I'm not sure if it's possible to use it with child processes. Perhaps you can if you fork the process and then spawn a command from there, but I'm not sure if it measures the child process CPU usage then.

Retry on AWS timeout

The current maxWaitTime for ECS tasks and MediaConvert jobs are set to 3600 seconds. This should be increased and we should not allow the service to crash if a timeout happens. Instead a retry should occur so that the job doesn't need to be restarted.

async waitForObjectInS3(S3Bucket: string, S3Key: string): Promise<boolean> {
try {
await waitUntilObjectExists({ client: this.s3, maxWaitTime: AWSPipeline.MAX_WAIT_TIME }, { Bucket: S3Bucket, Key: S3Key });
return true;
} catch (error) {
logger.error(`Error waiting for object ${S3Key} in bucket ${S3Bucket}: \n Error: ${error}`);
return false;
}
}
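A hedged sketch of a generic retry helper that a wait like the one above could be wrapped in, so a single timeout does not crash the whole job:

```javascript
// Retry an async operation a few times with a fixed delay between attempts,
// rethrowing the last error only after all attempts are exhausted.
async function withRetries(fn, attempts = 3, delayMs = 1000) {
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      if (i === attempts - 1) throw err; // out of attempts: give up
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}
```

For the S3 wait above, fn could wrap the waitUntilObjectExists call, turning one timeout into a retry instead of a crash.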

Write great docs

A good README.md and some awesome 🌈 documentation. How hard could it be?

MkDocs is nice.
