
Learning Type-Aware Embeddings for Fashion Compatibility

fashion-compatibility contains a PyTorch implementation of our paper. If you find this code or our dataset useful in your research, please consider citing:

    @inproceedings{VasilevaECCV18FasionCompatibility,
      author    = {Mariya I. Vasileva and Bryan A. Plummer and Krishna Dusad and Shreya Rajpal and Ranjitha Kumar and David Forsyth},
      title     = {Learning Type-Aware Embeddings for Fashion Compatibility},
      booktitle = {ECCV},
      year      = {2018}
    }

This code was tested on an Ubuntu 16.04 system using PyTorch version 0.1.12. It is based on the official implementation of the Conditional Similarity Networks paper.

Usage

You can download the Polyvore Outfits dataset, including the splits and the questions for the compatibility and fill-in-the-blank tasks, from here (6G). The code assumes it is unpacked in a directory called data; if you choose a different directory, simply set the --datadir argument. You can see a listing and description of the model options with:

    python main.py --help
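
For example, if the dataset lives somewhere other than ./data, pass its location explicitly (the path below is just a placeholder):

    python main.py --datadir /path/to/polyvore_outfits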

Using a pre-trained model

We have provided a pre-trained model for the nondisjoint data split, which you can download here (11M). This model learns diagonal projections from the general embedding to type-specific compatibility spaces, which are L2-normalized after the projection is applied. You can test this model using:

    python main.py --test --l2_embed --resume runs/nondisjoint_l2norm/model_best.pth.tar
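
A diagonal projection is simply a learned elementwise mask followed by L2 normalization. The sketch below illustrates the idea in modern PyTorch; it is not the repository's code, and the class name and sizes are illustrative placeholders:

    import torch
    import torch.nn.functional as F

    # Minimal sketch of the idea, NOT the repository's implementation.
    # n_conditions and embed_dim are illustrative placeholders.
    class DiagonalTypeProjection(torch.nn.Module):
        def __init__(self, n_conditions=66, embed_dim=64):
            super().__init__()
            # One learned scaling vector (i.e., diagonal matrix) per type pair.
            self.masks = torch.nn.Parameter(torch.ones(n_conditions, embed_dim))

        def forward(self, general_embedding, condition_idx):
            # Elementwise multiplication == applying a diagonal projection.
            projected = general_embedding * self.masks[condition_idx]
            # L2-normalize after applying the projection.
            return F.normalize(projected, p=2, dim=-1)

Two items are then compared in the space selected by their type pair, e.g. the (tops, shoes) condition when scoring a top against a pair of shoes.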

This code includes some minor modifications that result in better performance than the version used for our camera-ready. For example, our pre-trained model should provide a compatibility AUC of 0.88 and a fill-in-the-blank accuracy of 57.6, a little better than the 0.86 AUC / 55.3 accuracy reported for our best model in the paper.
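
For reference, compatibility AUC here is an ordinary ROC AUC over scalar outfit compatibility scores; a generic computation (not the repository's evaluation code, and with made-up example numbers) looks like:

    from sklearn.metrics import roc_auc_score

    # Hypothetical scores: label 1 = compatible outfit, 0 = incompatible.
    labels = [1, 0, 1, 1, 0, 0]
    scores = [0.91, 0.35, 0.72, 0.64, 0.48, 0.22]
    print(roc_auc_score(labels, scores))  # 1.0 here, since the ranking is perfect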

Training a new model

To reproduce the pre-trained model above, we used the following command:

    python main.py --name {your experiment name} --learned --l2_embed

By default, the code reports results on the test set after training. However, if you want to re-run the test later, you must use the same flags during testing as you did during training. For example, if you trained with the --use_fc flag to learn fully connected type-specific embeddings rather than a (diagonal) mask, at test time you would use:

    python main.py --test --use_fc --resume runs/{your experiment name}/model_best.pth.tar
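
Conceptually, --use_fc replaces the per-condition diagonal mask with a full linear layer per condition. A rough sketch of the difference, with all names and sizes as illustrative assumptions rather than values read from the repo:

    import torch

    embed_dim, n_conditions = 64, 66  # illustrative sizes

    # Default: one learned scaling vector per condition (a diagonal projection).
    masks = torch.nn.Parameter(torch.ones(n_conditions, embed_dim))

    # --use_fc variant: a full learned linear map per condition instead.
    fcs = torch.nn.ModuleList(
        torch.nn.Linear(embed_dim, embed_dim) for _ in range(n_conditions)
    )

    x, cond = torch.randn(embed_dim), 3
    diag_out = x * masks[cond]  # diagonal projection
    fc_out = fcs[cond](x)       # fully connected projection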
