
odl's Introduction

================================================
LIBOL - A Library for Online Learning Algorithms
================================================                    

                    V0.3.0
                    
Available at http://libol.stevenhoi.org

Authors: Steven C.H. Hoi, Jialei Wang, Peilin Zhao

	[email protected]

	Nanyang Technological University
	School of Computer Engineering	
	Singapore 639798	

Scientific results produced using the software provided shall acknowledge
the use of LIBOL. Please cite as

    Steven C.H. Hoi, Jialei Wang, and Peilin Zhao.
    LIBOL: A Library for Online Learning Algorithms.
    Nanyang Technological University, 2012.
    Software available at http://libol.stevenhoi.org/

Moreover, the authors of LIBOL shall be informed about such publications.

About LIBOL
===========================================================================
LIBOL is a simple package for solving large-scale online learning tasks. 
This version has a total of 16 different linear online algorithms for binary classification, 
and a total of 13 different algorithms and variants for multiclass classification. 

Specifically, LIBOL consists of a family of first-order online learning algorithms as follows:
- Perceptron: the classical online learning algorithm (Rosenblatt, 1958);
- ALMA: A New Approximate Maximal Margin Classification Algorithm (Gentile, 2001);
- ROMMA: the Relaxed Online Maximum Margin Algorithm (Li and Long, 2002);
- OGD: the Online Gradient Descent (OGD) algorithms (Zinkevich, 2003);
- PA: Passive Aggressive (PA) algorithms (Crammer et al., 2006);
and a family of second-order online learning algorithms as follows:
- SOP: the Second Order Perceptron (SOP) algorithm (Cesa-Bianchi et al., 2005);
- CW: the Confidence-Weighted (CW) learning (Dredze et al., 2008);
- IELLIP: online learning algorithms by the Improved Ellipsoid method (Yang et al., 2009);
- AROW: the Adaptive Regularization of Weight Vectors (Crammer et al., 2009);
- NAROW: a New variant of Adaptive Regularization (Orabona and Crammer, 2010);
- NHERD: the Normal Herding method via Gaussian Herding (Crammer and Lee, 2010);
- SCW: the recently proposed Soft Confidence-Weighted algorithms (Wang et al., 2012).
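
To make the flavor of these methods concrete, the following is a minimal, illustrative Python/NumPy sketch of two of the first-order updates listed above (the Perceptron and the Passive-Aggressive rule for labels y in {-1, +1}); it sketches the textbook update rules and is not LIBOL's own implementation.

    import numpy as np

    def perceptron_update(w, x, y):
        # Mistake-driven: change the weights only when the current prediction is wrong.
        if y * np.dot(w, x) <= 0:
            w = w + y * x
        return w

    def pa_update(w, x, y):
        # Passive-Aggressive: move just enough to satisfy the unit margin constraint.
        loss = max(0.0, 1.0 - y * np.dot(w, x))   # hinge loss
        tau = loss / (np.dot(x, x) + 1e-12)       # PA step size
        return w + tau * y * x

The second-order methods in the list additionally maintain a covariance-like matrix over the weights, which is why they are grouped separately.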

This document briefly explains the usage of LIBOL. A more detailed manual can be found on the LIBOL web site.

To get started, please read the ``Quick Start'' section first.
For developers, please check "run_experiment.m" to learn how to extend LIBOL when developing your own new algorithms.

Table of Contents
=================

- Quick Start
- Installation
- Examples
- Additional Information

Quick Start
===========

See the section ``Installation'' for installing LIBOL.

Type "make", and then "demo" to get started 

Installation
============

Simply type `make' to compile a list of core C++ programs (including "libsvmread") and to automatically add the paths of the core folders ("algorithms", "algorithms_mex", "libsvm", "weka2matlab").

If you do not wish to re-compile the library, you can simply type 'init' to initialize the library by adding the necessary paths. 

Examples
========

>> demo

Apply the online "Perceptron" algorithm on the default data set "svmguide3"

Note: By default, one random trial is used for automated parameter selection, and a total of 20 runs are then performed to obtain averaged results. The same settings apply to the rest of the examples below.

>> demo('bc')

Demo of binary classification. Apply the online "Perceptron" algorithm on the default data set "svmguide3"

>> demo('mc')

Demo of multiclass classification. Apply the online multi-class Perceptron ("PerceptronM") algorithm on the default data set "glass"

>> demo('bc','PA','svmguide3')

Apply the online "PA" algorithm on the matlab-format binary-class data set "svmguide3"

>> demo('bc','PA','w1a','libsvm','c')

Apply the online "PA" algorithm on the libsvm-format binary-class data set "w1a" (where the core function is based on the 'C' implementation).

>> demo('bc','Perceptron','sonar','arff')

Apply the "Perceptron" algorithm on the weka-ARFF-format data set "sonar.arff"  % This requires to install WEKA and add WEKA.jar path to the matlab Java class path

>> demo('mc','M_PA','glass')

Apply the online multiclass "PA" (M_PA) algorithm on the matlab-format multiclass data set "glass" (where the core function uses the 'matlab' implementation by default).

>> run_experiment('bc')

Run the entire set of binary classification experiments for side-by-side comparisons of all algorithms on the default data set "svmguide3" (by default using the matlab implementation)

>> run_experiment('mc')

Run the entire set of multiclass classification experiments for side-by-side comparisons of all algorithms on the default data set "glass" (by default using the matlab implementation)

>> run_experiment('bc','svmguide3','mat','c')

Run the entire set of binary classification experiments for side-by-side comparisons of all algorithms on the matlab-format data set "svmguide3" using C implementation

>> run_experiment('mc','glass','mat','c')

Run the entire set of multi-class experiments for side-by-side comparisons of all algorithms on the matlab-format data set "glass" using C implementation

Additional Information
======================

If you have any questions or comments, please send an email to [email protected]

For more info about LIBOL, please visit http://libol.stevenhoi.org/

Last updated date: 12 December, 2013.

odl's People

Contributors

doyensahoo, phquang


odl's Issues

Updating stages of alpha are not clear in the code

Thank you for the implementation.

I was trying to convert your implementation to be compatible with TensorFlow backend. Comparing the paper with your implementation, I had several issues tracking the updates of the adaptation weights "alphas".

  • the "adaptive_weight" configuration is not used. I guess it is also related to updating alphas
  • I couldn't track where you are doing the smoothing and normalizing stages of alphas as stated in "Algorithm 1" in your paper.

Thanks in advance for your reply and help.
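
For readers with the same question, here is a hedged NumPy sketch of the discount/smooth/normalize sequence that Hedge-style weight updates are usually described with; it is written from that generic description, not from this repository's code, and the beta and smoothing values are illustrative only.

    import numpy as np

    def update_alphas(alphas, losses, beta=0.99, smoothing=0.2):
        # alphas: current per-classifier weights; losses: per-classifier losses in [0, 1]
        alphas = alphas * np.power(beta, losses)               # discount by exponentiated loss
        alphas = np.maximum(alphas, smoothing / len(alphas))   # smoothing: floor each weight
        return alphas / alphas.sum()                           # normalize back to a distribution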

Error while executing train.py

Hello Sir,

I am getting the error "run() got an unexpected keyword argument 'hedge'" while executing train.py.

Can you please let me know what might be the cause of it?

Thanks & Regards
Mrugank Akarte
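
A hedged sanity check, not an answer from the maintainers: an unexpected 'hedge' keyword can mean that the stock Keras file is still being imported instead of the modified training.py mentioned in the other issues. The snippet below (illustrative only) simply prints which training.py Python actually loads and whether it mentions hedge at all.

    import inspect
    import keras.engine.training as training

    print(training.__file__)                       # which training.py is on the path
    print('hedge' in inspect.getsource(training))  # True only if the patched file is loaded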

tensorflow version issue

Sorry, could you tell me which TensorFlow version your environment uses?
I get an error when I use the repo's dataset and code, and I cannot reproduce the HBP training demo.

model.compile(optimizer = optim, loss = loss_dict, hedge = config['hedge'],loss_weights = loss_weights, metrics = ['accuracy'])

File "/usr/local/python27/lib/python2.7/site-packages/keras/engine/training.py", line 684, in compile
total_loss = K.sum(K.dot(self.holder, self.metrics_tensors))
File "/usr/local/python27/lib/python2.7/site-packages/keras/backend/tensorflow_backend.py", line 790, in dot
if ndim(x) is not None and (ndim(x) > 2 or ndim(y) > 2):
File "/usr/local/python27/lib/python2.7/site-packages/keras/backend/tensorflow_backend.py", line 396, in ndim
dims = x.get_shape()._dims
AttributeError: 'list' object has no attribute 'get_shape'

It seems to be a TensorFlow error.
I am using the Keras version the repo specifies, and TensorFlow 1.2.0 as the backend.

tensorflow issue

Hi, I have installed the dependencies Theano 0.8.2 and Keras 1.2.1 and replaced Keras's keras/engine/training.py file with the modified training.py. Then, when I try to run main.py, there is an error regarding TensorFlow, although you mentioned in a previous issue that you did not use the TensorFlow backend.

It looks like TensorFlow is automatically imported when Keras is imported, so I guess I should install TensorFlow as one more dependency. But which version of TensorFlow did you use?

(onlineDL) E:\ODL\src\hbp>python main.py -c hb19.yaml
Using TensorFlow backend.
Traceback (most recent call last):
  File "main.py", line 7, in
    from keras.datasets import mnist
  File "E:\miniconda3_64\envs\onlineDL\lib\site-packages\keras\__init__.py", line 4, in
    from . import engine
  File "E:\miniconda3_64\envs\onlineDL\lib\site-packages\keras\engine\__init__.py", line 10, in
    from .training import Model
  File "E:\miniconda3_64\envs\onlineDL\lib\site-packages\keras\engine\training.py", line 20, in
    from .. import optimizers
  File "E:\miniconda3_64\envs\onlineDL\lib\site-packages\keras\optimizers.py", line 9, in
    import tensorflow as tf
ImportError: No module named tensorflow
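
A hedged note rather than an official answer: the traceback shows Keras selecting the TensorFlow backend ("Using TensorFlow backend.") even though the setup above installs Theano. Keras 1.x reads its backend from ~/.keras/keras.json or from the KERAS_BACKEND environment variable, so one way to avoid installing TensorFlow is to force the Theano backend before Keras is imported:

    import os
    os.environ['KERAS_BACKEND'] = 'theano'   # must be set before the first keras import

    import keras                             # should now report "Using Theano backend."
    from keras.datasets import mnist

Alternatively, editing the "backend" field in ~/.keras/keras.json to "theano" has the same effect.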

Having problem running the demo

I have installed the required dependencies, but when I run the demo as instructed, the following error appears:
Traceback (most recent call last):
  File "main.py", line 110, in
    my_callback = main(sys.argv[1:], 0)
  File "main.py", line 99, in main
    model.compile(optimizer = optim, loss = loss_dict, hedge = config['hedge'], loss_weights = loss_weights, metrics = ['accuracy'])
  File "/usr/local/lib/python2.7/site-packages/keras/engine/training.py", line 685, in compile
    total_loss = K.sum(K.dot(self.holder, self.metrics_tensors))
  File "/usr/local/lib/python2.7/site-packages/keras/backend/tensorflow_backend.py", line 790, in dot
    if ndim(x) is not None and (ndim(x) > 2 or ndim(y) > 2):
  File "/usr/local/lib/python2.7/site-packages/keras/backend/tensorflow_backend.py", line 396, in ndim
    dims = x.get_shape()._dims
AttributeError: 'list' object has no attribute 'get_shape'

How do I solve this?

The following problems occurred when running train.py. Would you please help me? Thank you very much.

Traceback (most recent call last):
  File "C:/Users/58413/Desktop/ODL-master/ODL-master/src/train.py", line 61, in
    model.compile(optimizer=optim, loss = ['mse','mse','mse'], hedge = True, loss_weights = [1.0/3]*3, metrics =['accuracy'])
  File "C:\Anaconda3\lib\site-packages\keras\engine\training.py", line 684, in compile
    total_loss = K.sum(K.dot(self.holder, self.metrics_tensors))
  File "C:\Anaconda3\lib\site-packages\keras\backend\tensorflow_backend.py", line 790, in dot
    if ndim(x) is not None and (ndim(x) > 2 or ndim(y) > 2):
  File "C:\Anaconda3\lib\site-packages\keras\backend\tensorflow_backend.py", line 396, in ndim
    dims = x.get_shape()._dims
AttributeError: 'list' object has no attribute 'get_shape'

The following problems occurred when running "cd src/hbp -> python main.py -c hb19.yaml" Would you please help me? Thank you very much.

Traceback (most recent call last):
  File "main.py", line 110, in <module>
    my_callback = main(sys.argv[1:], 0)
  File "main.py", line 99, in main
    model.compile(optimizer = optim, loss = loss_dict, hedge = config['hedge'],loss_weights = loss_weights, metrics = ['accuracy'])
  File "/home/tyan/.local/lib/python2.7/site-packages/keras/engine/training.py", line 684, in compile
    total_loss = K.sum(K.dot(self.holder, self.metrics_tensors))
  File "/home/tyan/.local/lib/python2.7/site-packages/keras/backend/theano_backend.py", line 238, in dot
    if is_sparse(x):
  File "/home/tyan/.local/lib/python2.7/site-packages/keras/backend/theano_backend.py", line 49, in is_sparse
    return th_sparse_module and isinstance(tensor.type, th_sparse_module.SparseType)
AttributeError: 'list' object has no attribute 'type'
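
Several of the issues above end in the same failure: the modified training.py calls K.dot(self.holder, self.metrics_tensors) while self.metrics_tensors is a plain Python list, which has no get_shape attribute (TensorFlow backend) and no type attribute (Theano backend). As a purely illustrative sketch, not the repository's actual fix, a weighted total loss over a list of scalar loss tensors can be formed without K.dot; the variable names below are hypothetical stand-ins.

    from keras import backend as K

    losses = [K.variable(0.3), K.variable(0.5), K.variable(0.2)]   # stand-ins for scalar loss tensors
    weights = [K.variable(1.0 / 3) for _ in losses]                # stand-ins for the holder entries
    total_loss = sum(w * l for w, l in zip(weights, losses))       # weighted sum without K.dot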
