Comparative Experiments

Introduction

This project compares the performance of different algorithms on a given dataset: each algorithm is wrapped in a common experiment interface, run on the same data, and scored with a shared set of metrics.

Dependencies

  • Python 3.11
  • Numpy
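
Assuming the repository follows a standard Python package layout (a pyproject.toml or setup.py at the root), an editable install from a clone of the repository would typically be:

pip install -e .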

Usage

The example below defines three mock experiments, runs them on a shared dataset, scores each one with four metrics, and exports the results to a CSV file:

import numpy as np
from comparative_experiments.experiment import SingleExperiment, ExperimentComparator
from comparative_experiments.metrics import mse, rmse, mae, r2


def main():
    # Define the metrics to be used for comparison
    metrics = {
        'MSE': mse,
        'RMSE': rmse,
        'MAE': mae,
        'R2': r2
    }

    # Initialize the comparator
    comparator = ExperimentComparator(metrics=metrics)

    # Define three mock run callables with simple, predictable behavior
    def mock_run_callable1(X, y):
        return X  # Simple mock behavior for testing

    def mock_run_callable2(X, y):
        return X * 2

    def mock_run_callable3(X, y):
        return X * 3

    experiment1 = SingleExperiment(name="Mock Experiment 1", run_callable=mock_run_callable1)
    experiment2 = SingleExperiment(name="Mock Experiment 2", run_callable=mock_run_callable2)
    experiment3 = SingleExperiment(name="Mock Experiment 3", run_callable=mock_run_callable3)

    # Add the experiments to the comparator
    comparator.add_experiment(experiment1)
    comparator.add_experiment(experiment2)
    comparator.add_experiment(experiment3)

    # Set the data for the experiments
    X_test = np.array([1, 2, 3])
    y_test = np.array([2, 4, 6])
    comparator.set_data(X=X_test, y=y_test)

    # Run the experiments and evaluate them
    results = comparator.run()

    # Print the results
    for experiment_name, experiment_metrics in results.items():
        print(f"Experiment: {experiment_name}")
        for metric_name, value in experiment_metrics.items():
            print(f"{metric_name}: {value}")

    # Export the results to a CSV file
    save_path = "./experiment_results.csv"
    comparator.export_results(save_path)


if __name__ == "__main__":
    main()

Or run the example script:

python ./examples/basic_usage.py

Output

Experiment         MSE                RMSE               MAE  R2
Mock Experiment 1  4.666666666666667  2.160246899469287  2.0  -0.75
Mock Experiment 2  0.0                0.0                0.0  1.0
Mock Experiment 3  4.666666666666667  2.160246899469287  2.0  -0.75
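
These numbers follow from the standard definitions of the four metrics. For instance, Mock Experiment 1 predicts [1, 2, 3] against targets [2, 4, 6], giving squared errors of 1, 4, and 9. A minimal check with NumPy, assuming comparative_experiments.metrics uses the conventional formulas (which is consistent with the values above):

import numpy as np

y_true = np.array([2, 4, 6])   # targets used in the example
y_pred = np.array([1, 2, 3])   # output of Mock Experiment 1 (identity)

mse = np.mean((y_true - y_pred) ** 2)           # (1 + 4 + 9) / 3 = 4.666...
rmse = np.sqrt(mse)                             # sqrt(14 / 3) = 2.1602...
mae = np.mean(np.abs(y_true - y_pred))          # (1 + 2 + 3) / 3 = 2.0
ss_res = np.sum((y_true - y_pred) ** 2)         # 14
ss_tot = np.sum((y_true - y_true.mean()) ** 2)  # 8
r2 = 1 - ss_res / ss_tot                        # 1 - 14/8 = -0.75

print(mse, rmse, mae, r2)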

Test

Run the test suite with:

pytest

Future Work

  • Improve the stability of the comparator
  • SingleExperiment logging (a rough sketch of one possible approach follows below)
  • ExperimentComparator logging
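
Logging is not yet implemented. One possible direction is to use only the standard library logging module to wrap a run callable before handing it to SingleExperiment; the with_logging helper below is hypothetical and not part of the current API:

import logging

import numpy as np

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("comparative_experiments")


def with_logging(name, run_callable):
    """Wrap a run callable so each call logs its input and output shapes."""
    def wrapped(X, y):
        logger.info("Running %s on X with shape %s", name, np.shape(X))
        predictions = run_callable(X, y)
        logger.info("%s returned predictions with shape %s", name, np.shape(predictions))
        return predictions
    return wrapped


# A wrapped callable keeps the same (X, y) signature as a plain one, so it can be
# passed to SingleExperiment in the usage example without other changes, e.g.
# SingleExperiment(name="Mock Experiment 1",
#                  run_callable=with_logging("Mock Experiment 1", mock_run_callable1))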
