
matrix-profile's Introduction

Matrix Profile Library

A Java library for Matrix Profile


Overview

matrix profile example

The Matrix Profile has the potential to revolutionize time series data mining because of its generality, versatility, simplicity and scalability. In particular, it has implications for time series motif discovery, time series joins, shapelet discovery (classification), density estimation, semantic segmentation, visualization, rule discovery, clustering, etc.

The advantages of using the Matrix Profile (over hashing, indexing, brute forcing a dimensionality reduced representation etc.) for most time series data mining tasks include:

  • It is exact: For motif discovery, discord discovery, time series joins etc., the Matrix Profile based methods provide no false positives or false dismissals.
  • It is simple and parameter-free: In contrast, the more general spatial access method algorithms typically require building and tuning spatial access methods and/or hash functions.
  • It is space efficient: Matrix Profile construction algorithms require only an inconsequential space overhead, linear in the time series length with a small constant factor, allowing massive datasets to be processed in main memory.
  • It allows anytime algorithms: While our exact algorithms are extremely scalable, for extremely large datasets we can compute the Matrix Profile in an anytime fashion, allowing ultra-fast approximate solutions.
  • It is incrementally maintainable: Having computed the Matrix Profile for a dataset, we can incrementally update it very efficiently. In many domains this means we can effectively maintain exact joins/motifs/discords on streaming data forever.
  • It does not require the user to set similarity/distance thresholds: For time series joins, the Matrix Profile provides full joins, eliminating the need to specify a similarity threshold, which is an unintuitive task for time series.
  • It can leverage hardware: Matrix Profile construction is embarrassingly parallelizable, both on multicore processors and in distributed systems.
  • It has time complexity that is constant in subsequence length: This is a very unusual and desirable property; all known time series join/motif/discord algorithms scale poorly as the subsequence length grows. In contrast, we have shown time series joins/motifs with subsequence lengths up to 100,000, at least two orders of magnitude longer than any other work we are aware of.
  • It can be constructed in deterministic time: All join/motif/discord algorithms we are aware of can take radically different times to finish on two (even slightly) different datasets. In contrast, given only the length of the time series, we can precisely predict in advance how long it will take to compute the Matrix Profile.
  • It can handle missing data: Even in the presence of missing data, we can provide answers which are guaranteed to have no false negatives.

For more information about the Matrix Profile, check out The UCR Matrix Profile Page.

In version 0.0.2 we implemented the MPdist measure. The useful properties of MPdist include:

  • Ability to compare time series of different lengths.
  • Robustness to spikes, dropouts, wandering baseline, missing values, and other issues that are common outside of benchmark datasets.
  • The amplitude and offset invariance offered by DTW and Euclidean distance, as well as additional invariances, including phase invariance, order invariance, linear trend invariance, and stutter invariance.
  • Scalability to large datasets.

We followed the fast MPdist algorithm described in the section "Speeding up MPdist Search" of the official paper.

Usage

  • MPdist (query by content)
  • STMP (two time series)
  • STMP (self join)
  • STAMP (two time series)
  • STAMP (self join)
  • Misc:
    • MASS v2.0
    • Fast moving SD
    • Fast moving average

Note: We use ND4j as the time series representation. You can find more information about ND4j here.
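The Misc list above mentions fast moving average and fast moving SD utilities. Such utilities are typically built on the cumulative-sum trick, which computes every window's mean and standard deviation in O(n) total. The sketch below is a plain-Java illustration of that idea only; the class and method names are hypothetical and are not the library's actual API:

```java
import java.util.Arrays;

public class MovingStats {
    // Moving mean over windows of length w via cumulative sums: O(n) total.
    static double[] movingMean(double[] ts, int w) {
        int n = ts.length - w + 1;
        double[] cum = new double[ts.length + 1];
        for (int i = 0; i < ts.length; i++) cum[i + 1] = cum[i] + ts[i];
        double[] mean = new double[n];
        for (int i = 0; i < n; i++) mean[i] = (cum[i + w] - cum[i]) / w;
        return mean;
    }

    // Moving (population) standard deviation via cumulative sums of squares:
    // sd = sqrt(E[x^2] - E[x]^2) per window.
    static double[] movingSd(double[] ts, int w) {
        int n = ts.length - w + 1;
        double[] cum = new double[ts.length + 1];
        double[] cumSq = new double[ts.length + 1];
        for (int i = 0; i < ts.length; i++) {
            cum[i + 1] = cum[i] + ts[i];
            cumSq[i + 1] = cumSq[i] + ts[i] * ts[i];
        }
        double[] sd = new double[n];
        for (int i = 0; i < n; i++) {
            double mean = (cum[i + w] - cum[i]) / w;
            double meanSq = (cumSq[i + w] - cumSq[i]) / w;
            // clamp tiny negative values caused by floating-point rounding
            sd[i] = Math.sqrt(Math.max(0.0, meanSq - mean * mean));
        }
        return sd;
    }

    public static void main(String[] args) {
        double[] ts = {0.0, 6.0, -1.0, 2.0, 3.0, 1.0, 4.0};
        System.out.println(Arrays.toString(movingMean(ts, 4)));
        System.out.println(Arrays.toString(movingSd(ts, 4)));
    }
}
```

The same pass can feed both statistics into MASS-style distance computations, which is why the two are grouped together above.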

Installation

You can pull the Matrix Profile library from the central Maven repository; just add the following to your pom.xml file:

<dependency>
   <groupId>io.github.ensozos</groupId>
   <artifactId>matrix-profile</artifactId>
   <version>0.0.3</version>
</dependency>

For Gradle users, add this to build.gradle:

compile 'io.github.ensozos:matrix-profile:0.0.3'

Example

The user needs to create a MatrixProfile object and pass the time series (INDArrays) as parameters:

  MatrixProfile matrixProfile = new MatrixProfile();
  
  int window = 4;
  INDArray target = Nd4j.create(new double[]{0.0, 6.0, -1.0, 2.0, 3.0, 1.0, 4.0}, new int[]{1, 7});
  INDArray query = Nd4j.create(new double[]{1.0, 2.0, 0.0, 0.0, -1}, new int[]{1, 5});

  Pair<INDArray, INDArray> profile = matrixProfile.stamp(target, query, window);

For the Matrix Profile distance you need to create an MPdistance object:

  MPdistance mpDist = new MPdistance();
  
  int window = ...;
  INDArray target = Nd4j.create(...);
  INDArray query = Nd4j.create(...);

  mpDist.getMPdistance(target, query, window);

Other projects with Matrix Profile

License

Distributed under the MIT license. See LICENSE for more information.

Paper Citation

  • Yeh, Chin-Chia Michael; Zhu, Yan; Ulanova, Liudmila; Begum, Nurjahan; Ding, Yifei; Dau, Anh; Silva, Diego; Mueen, Abdullah; Keogh, Eamonn (2016). Matrix Profile I: All Pairs Similarity Joins for Time Series: A Unifying View That Includes Motifs, Discords and Shapelets. ICDM 2016, pp. 1317-1322. doi:10.1109/ICDM.2016.0179.

  • Abdullah Mueen, Yan Zhu, Michael Yeh, Kaveh Kamgar, Krishnamurthy Viswanathan, Chetan Kumar Gupta and Eamonn Keogh (2015), The Fastest Similarity Search Algorithm for Time Series Subsequences under Euclidean Distance

  • Matrix Profile XII: MPdist: A Novel Time Series Distance Measure to Allow Data Mining in More Challenging Scenarios. Shaghayegh Gharghabi, Shima Imani, Anthony Bagnall, Amirali Darvishzadeh, Eamonn Keogh. ICDM 2018.

matrix-profile's People

Contributors

ensozos


matrix-profile's Issues

MatrixProfile.stamp does not give deterministic results for self-join case

For the versions of stamp and stmp that take a query parameter, I see that the results are the same between them. However, for the self-join case (no query provided), the results for stamp are non-deterministic. I'm sure this has to do with Random not using a seed, but shouldn't the result be the same even if the indices are processed in a different order each time?

When I run this test, I get different results for the second part of the pair nearly every time.

   INDArray shortTargetSeries = Nd4j.create(new double[]{0.0, 6.0, -1.0, 2.0, 3.0, 1.0, 4.0}, new int[]{1, 7});

   @Test
    public void testMatrixProfileSelfJoinStampWindow4() {
        int window = 4;
        Pair<INDArray, INDArray>expectedResultWhenSelfJoin = new Pair<>(
                Nd4j.create(new double[]{1.7308, POSITIVE_INFINITY, POSITIVE_INFINITY, 1.7308}, new int[]{1, 4}),
                Nd4j.create(new double[]{3.0000,    2.0000,    2.0000,     0}, new int[]{1, 4})
        );
        Pair<INDArray, INDArray> pair = matrixProfile.stamp(shortTargetSeries, window);
        assertEquals(expectedResultWhenSelfJoin.toString(), pair.toString());
    }

The first part of the pair in the result is always the same, but the second part of the pair is random. It is of the form
[3.0000, x, x, 0]
where x is 0, 1, 2, or 3. Why? Is this a bug or my misunderstanding?

Offset of buffer can not be >= Integer.MAX_VALUE

Hello.

Working with a time series of size 50,000, I got the following exception:

Exception in thread "main" java.lang.IllegalArgumentException: Offset of buffer can not be >= Integer.MAX_VALUE
at org.nd4j.linalg.api.ndarray.BaseNDArray.offset(BaseNDArray.java:5216)
at org.nd4j.linalg.api.ndarray.BaseNDArray.subArray(BaseNDArray.java:2558)
at org.nd4j.linalg.api.ndarray.BaseNDArray.get(BaseNDArray.java:5013)
at io.github.ensozos.utils.CustomOperations.centeredMovingMinimum(CustomOperations.java:67)
at io.github.ensozos.core.MPdistance.getMPdistance(MPdistance.java:59)
at Main.main(Main.java:138)

Is it a fault (restriction) of nd4j?
Is it something else?

Thanks for your time!

How to avoid NaN and Infinity values in the distance profile?

While adding more tests, I noticed that there are a number of cases where the distance part of the matrix profile can have Infinity and NaN values. One simple example (but not the only one) is a straight line.

If I run matrix profile on this straight line series of 10 y values

1.2, 1.2, 1.2, 1.2, 1.2, 1.2, 1.2, 1.2, 1.2, 1.2

with a window of 5, then I get MP distance values of

NaN, NaN, NaN, POSITIVE_INFINITY, 3.1623, POSITIVE_INFINITY

And MP index values of

2.0000, 3.0000, 4.0000, 5.0000, 1.0000, 5.0000

There are 6 values in the profile, as expected: the length should be length - windowSize + 1 = 10 - 5 + 1 = 6.
I don’t understand yet why there are NaN values. I was expecting all 0’s for the distances. Maybe it has to do with z-normalization.

I will look at the code more closely and try to make a proposal and perhaps PR to avoid these values.
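The z-normalization hunch above is easy to verify in isolation: a constant window has zero standard deviation, so z-normalizing it divides 0.0 by 0.0, which is NaN in Java. A plain-Java sketch (not the library's code; names are hypothetical):

```java
public class ZNormDemo {
    // z-normalize a subsequence: (x - mean) / sd.
    // For a constant window, sd == 0, so every element becomes 0.0 / 0.0 == NaN.
    static double[] zNormalize(double[] window) {
        double mean = 0.0;
        for (double v : window) mean += v;
        mean /= window.length;
        double var = 0.0;
        for (double v : window) var += (v - mean) * (v - mean);
        double sd = Math.sqrt(var / window.length);
        double[] out = new double[window.length];
        for (int i = 0; i < window.length; i++) out[i] = (window[i] - mean) / sd;
        return out;
    }

    public static void main(String[] args) {
        double[] flat = {1.2, 1.2, 1.2, 1.2, 1.2};
        // Constant window -> sd == 0 -> every normalized value is NaN.
        System.out.println(Double.isNaN(zNormalize(flat)[0])); // prints true
    }
}
```

This suggests any fix would need to special-case zero-variance windows (e.g. treat them as all-zero after normalization) before the distances are computed.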

Exception in thread `main`

When trying to use getMPdistance(), I get the following error:

Exception in thread "main" java.lang.NoSuchMethodError: 'org.nd4j.linalg.api.ndarray.INDArray org.nd4j.linalg.factory.Nd4j.zeros(long, long)'
	at io.github.ensozos.core.MPdistance.getMassDistMatrix(MPdistance.java:116)
	at io.github.ensozos.core.MPdistance.getMPdistance(MPdistance.java:52)
	at Main.Main.main(Main.java:29)

I suppose it is an error related to org.nd4j.linalg.api.ndarray.INDArray not having a zeros(long, long) method, but I am not proficient enough in Java to investigate further.
Any ideas on what to do with it?

Thanks!

Add accuracy parameter to stamp

The advantage of stamp over stmp is that it is an anytime algorithm.
Currently, its anytime nature is not exploitable. There are two ways we might fix that.

  1. Make the method asynchronous, and then provide a method to stop it in the middle of its execution and return an intermediate result.
  2. Add an "accuracy" parameter so that only x% of the random steps are executed before terminating.

I believe that both can be supported and would be useful, but 2) is easier. I would like to start with 2).
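Option 2 can be sketched independently of the library's internals: shuffle the candidate indices (an anytime STAMP already processes them in random order) and stop after a fraction of them. The class, method name, and signature below are hypothetical, purely to illustrate the shape of the change:

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.Random;

public class AnytimeLoop {
    // Process only ceil(accuracy * n) of the n randomly ordered steps,
    // returning how many steps were actually executed.
    static int runApproximately(int n, double accuracy, Random rnd) {
        List<Integer> indices = new ArrayList<>();
        for (int i = 0; i < n; i++) indices.add(i);
        Collections.shuffle(indices, rnd);          // random order, as in STAMP
        int limit = (int) Math.ceil(accuracy * n);  // accuracy in [0, 1]
        int executed = 0;
        for (int idx : indices) {
            if (executed >= limit) break;           // early termination: anytime behavior
            // ... compute the distance profile for subsequence idx and merge it ...
            executed++;
        }
        return executed;
    }

    public static void main(String[] args) {
        // With accuracy 0.25, only a quarter of the 100 steps run.
        System.out.println(runApproximately(100, 0.25, new Random(42))); // prints 25
    }
}
```

Because the processed indices are a uniform random sample, the partial result is an unbiased approximation that converges to the exact Matrix Profile as accuracy approaches 1.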

Add unit tests that show usage for some simple datasets

Currently there are no unit tests at all, and it's not clear from the README how the code can be applied to a real dataset. Some unit tests would help ensure changes do not break anything, give instruction on usage, and give confidence that the code works as expected.

I will work on a PR to add some tests.

Improve unit tests

There are a few minor enhancements that I would like to make.
I will submit a PR shortly.

Running Matrix-Profile inside spark on Windows

I have a spark project on linux where I was able to add code that depended on Matrix-Profile by adding

"io.github.ensozos" % "matrix-profile" % "0.0.3" % "provided"

to the build.sbt file.
I used "provided" so that the huge number of C++ library dependencies required by Matrix-Profile would not be included in the jar that I deploy to spark-jobserver.
Spark still needs those dependencies, so I need to add them to the spark-submit command that is used to launch Spark, using the --packages option. This is what I added:

--packages "io.github.ensozos:matrix-profile:0.0.3,org.nd4j:nd4j-native-platform:1.0.0-beta2"

Then when I run the spark-submit command to start spark-jobserver (using server_start.bat) on Windows, it downloads a lot of dependencies that look correct.
Here is a sampling of what I see in the console:

:: loading settings :: url = jar:file:/C:/apps/spark-2.3.0-bin-windows/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
io.github.ensozos#matrix-profile added as a dependency
org.nd4j#nd4j-native-platform added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
        confs: [default]
        found io.github.ensozos#matrix-profile;0.0.3 in central
        found org.nd4j#nd4j-native-platform;1.0.0-beta2 in central
        found org.bytedeco.javacpp-presets#openblas-platform;0.3.0-1.4.2 in central
        found org.bytedeco.javacpp-presets#openblas;0.3.0-1.4.2 in central
        found org.bytedeco#javacpp;1.4.2 in central
        found org.bytedeco.javacpp-presets#mkl-platform;2018.3-1.4.2 in central
        found org.bytedeco.javacpp-presets#mkl;2018.3-1.4.2 in central
        found org.bytedeco.javacpp-presets#mkl-dnn-platform;0.15-1.4.2 in central
        found org.bytedeco.javacpp-presets#mkl-dnn;0.15-1.4.2 in central
        found org.nd4j#nd4j-native;1.0.0-beta2 in central
        found org.nd4j#nd4j-native-api;1.0.0-beta2 in central
        found org.nd4j#nd4j-buffer;1.0.0-beta2 in central
        found org.nd4j#nd4j-context;1.0.0-beta2 in central
        found org.nd4j#nd4j-common;1.0.0-beta2 in central
        found org.nd4j#jackson;1.0.0-beta2 in central
        found org.yaml#snakeyaml;1.12 in central
        found org.codehaus.woodstox#stax2-api;3.1.4 in central
        found joda-time#joda-time;2.2 in central
        found org.slf4j#slf4j-api;1.7.21 in central
        found commons-io#commons-io;2.5 in central
        found org.apache.commons#commons-math3;3.5 in central
        found org.apache.commons#commons-lang3;3.6 in central
        found org.apache.commons#commons-compress;1.16.1 in central
        found org.objenesis#objenesis;2.6 in central
        found com.google.guava#guava;20.0 in central
        found commons-codec#commons-codec;1.10 in local-m2-cache
        found org.nd4j#nd4j-api;1.0.0-beta2 in central
        found com.vlkan#flatbuffers;1.2.0-3f79e055 in central
        found com.github.os72#protobuf-java-shaded-351;0.9 in central
        found com.github.os72#protobuf-java-util-shaded-351;0.9 in central
        found com.google.code.gson#gson;2.7 in central
        found uk.com.robust-it#cloning;1.9.3 in central
        found net.ericaro#neoitertools;1.0.0 in central
        found com.github.wendykierp#JTransforms;3.1 in central
        found pl.edu.icm#JLargeArrays;1.5 in central
downloading https://repo1.maven.org/maven2/io/github/ensozos/matrix-profile/0.0.3/matrix-profile-0.0.3.jar ...
        [SUCCESSFUL ] io.github.ensozos#matrix-profile;0.0.3!matrix-profile.jar (103ms)
:
:
downloading https://repo1.maven.org/maven2/pl/edu/icm/JLargeArrays/1.5/JLargeArrays-1.5.jar ...
        [SUCCESSFUL ] pl.edu.icm#JLargeArrays;1.5!JLargeArrays.jar (70ms)
:: resolution report :: resolve 20313ms :: artifacts dl 15756ms
        :: modules in use:
        com.github.os72#protobuf-java-shaded-351;0.9 from central in [default]
        com.github.os72#protobuf-java-util-shaded-351;0.9 from central in [default]
        com.github.wendykierp#JTransforms;3.1 from central in [default]
        com.google.code.gson#gson;2.7 from central in [default]
        com.google.guava#guava;20.0 from central in [default]
        com.vlkan#flatbuffers;1.2.0-3f79e055 from central in [default]
        commons-codec#commons-codec;1.10 from local-m2-cache in [default]
        commons-io#commons-io;2.5 from central in [default]
        io.github.ensozos#matrix-profile;0.0.3 from central in [default]
        joda-time#joda-time;2.2 from central in [default]
        net.ericaro#neoitertools;1.0.0 from central in [default]
        org.apache.commons#commons-compress;1.16.1 from central in [default]
        org.apache.commons#commons-lang3;3.6 from central in [default]
        org.apache.commons#commons-math3;3.5 from central in [default]
        org.bytedeco#javacpp;1.4.2 from central in [default]
        org.bytedeco.javacpp-presets#mkl;2018.3-1.4.2 from central in [default]
        org.bytedeco.javacpp-presets#mkl-dnn;0.15-1.4.2 from central in [default]
        org.bytedeco.javacpp-presets#mkl-dnn-platform;0.15-1.4.2 from central in [default]
        org.bytedeco.javacpp-presets#mkl-platform;2018.3-1.4.2 from central in [default]
        org.bytedeco.javacpp-presets#openblas;0.3.0-1.4.2 from central in [default]
        org.bytedeco.javacpp-presets#openblas-platform;0.3.0-1.4.2 from central in [default]
        org.codehaus.woodstox#stax2-api;3.1.4 from central in [default]
        org.nd4j#jackson;1.0.0-beta2 from central in [default]
        org.nd4j#nd4j-api;1.0.0-beta2 from central in [default]
        org.nd4j#nd4j-buffer;1.0.0-beta2 from central in [default]
        org.nd4j#nd4j-common;1.0.0-beta2 from central in [default]
        org.nd4j#nd4j-context;1.0.0-beta2 from central in [default]
        org.nd4j#nd4j-native;1.0.0-beta2 from central in [default]
        org.nd4j#nd4j-native-api;1.0.0-beta2 from central in [default]
        org.nd4j#nd4j-native-platform;1.0.0-beta2 from central in [default]
        org.objenesis#objenesis;2.6 from central in [default]
        org.slf4j#slf4j-api;1.7.21 from central in [default]
        org.yaml#snakeyaml;1.12 from central in [default]
        pl.edu.icm#JLargeArrays;1.5 from central in [default]
        uk.com.robust-it#cloning;1.9.3 from central in [default]
        :: evicted modules:
        org.objenesis#objenesis;2.1 by [org.objenesis#objenesis;2.6] in [default]
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   36  |   35  |   35  |   1   ||   36  |   36  |
        ---------------------------------------------------------------------
:: retrieving :: org.apache.spark#spark-submit-parent
        confs: [default]
        36 artifacts copied, 0 already retrieved (81691kB/453ms)
[2018-11-26 07:37:35,849] WARN  doop.util.NativeCodeLoader [] [] - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Warning: Local jar C:\Users\BBE\.ivy2\jars\org.nd4j_nd4j-native-1.0.0-beta2.jar does not exist, skipping.
Warning: Local jar C:\Users\BBE\.ivy2\jars\org.bytedeco.javacpp-presets_mkl-2018.3-1.4.2.jar does not exist, skipping.
Warning: Local jar C:\Users\BBE\.ivy2\jars\org.bytedeco.javacpp-presets_mkl-dnn-0.15-1.4.2.jar does not exist, skipping.

However, when I try to run anything through spark-jobserver, I get errors like this:

[2018-11-26 07:41:36,286] ERROR .apache.spark.SparkContext [] [akka://JobServer/user/context-supervisor/sql-context] - Failed to add file:/C:/Users/BBE/.ivy2/jars/org.bytedeco.javacpp-presets_mkl-2018.3-1.4.2.jar to Spark environment
java.io.FileNotFoundException: Jar C:\Users\BBE\.ivy2\jars\org.bytedeco.javacpp-presets_mkl-2018.3-1.4.2.jar not found
        at org.apache.spark.SparkContext.addJarFile$1(SparkContext.scala:1807)
        at org.apache.spark.SparkContext.addJar(SparkContext.scala:1837)
        at org.apache.spark.SparkContext$$anonfun$12.apply(SparkContext.scala:457)
        :
file:/C:/Users/BBE/.ivy2/jars/org.bytedeco.javacpp-presets_mkl-dnn-0.15-1.4.2.jar to Spark environment
java.io.FileNotFoundException: Jar C:\Users\BBE\.ivy2\jars\org.bytedeco.javacpp-presets_mkl-dnn-0.15-1.4.2.jar not found
        at org.apache.spark.SparkContext.addJarFile$1(SparkContext.scala:1807)
        at org.apache.spark.SparkContext.addJar(SparkContext.scala:1837)
        at org.apache.spark.SparkContext$$anonfun$12.apply(SparkContext.scala:457)
        at org.apache.spark.SparkContext$$anonfun$12.apply(SparkContext.scala:457)
        at scala.collection.immutable.List.foreach(List.scala:381)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:457)
        at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2486)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:930)
        at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:921)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:921)
        at spark.jobserver.context.SessionContextFactory.makeContext(SessionContextFactory.scala:36)
        at spark.jobserver.context.SessionContextFactory.makeContext(SessionContextFactory.scala:23)
        at spark.jobserver.context.SparkContextFactory$class.makeContext(SparkContextFactory.scala:64)
        :  

When I look in the .ivy2\jars directory, I do see what look to be the correct Windows versions of these jars:

org.bytedeco.javacpp-presets_mkl-2018.3-1.4.2-windows-x86_64.jar
org.bytedeco.javacpp-presets_mkl-dnn-0.15-1.4.2-windows-x86_64.jar
org.bytedeco.javacpp-presets_openblas-0.3.0-1.4.2-windows-x86_64.jar
:

But that is not what it appears to be looking for in the above errors. Is there a way to get it to look for the correct platform-specific versions of these jars? Maybe I need to compile the jar on Windows instead of Linux before deploying it. In the past it always worked fine to build on Linux and deploy to Windows, but maybe these C++ dependencies change that.

Issues when trying to update ND4J dependency to 1.0.0-M2.1

Hi ensozos,

I am trying to use your library in a project for tuning a particle ion source (similar to what is shown here: Ion Source Optimization Using Bi-Objective Genetic and Matrix-Profile Algorithm). In the paper I used a Python implementation of matrix-profile, but now I want to move the logic to Java. One goal is to update the nd4j dependency to version 1.0.0-M2.1, which unfortunately has breaking changes (introduced with nd4j 1.0.0-beta4).

One issue I am facing when running the unit tests with the updated dependency is that all calls of the form

INDArray.get(INDArrayIndex... indexes)

need to be two-dimensional now. This is not a general issue, but when changing the code in MatrixProfileCalculator::MPRunnable::run() from

    @Override
    public void run() {
        INDArray distanceProfile      = distProfile.getDistanceProfile(timeSeriesA, timeSeriesB, index, window);
        INDArray distanceProfileIndex = distProfile.getDistanceProfileIndex(tsBLength, index, window);

        if (trivialMatch) {
            INDArrayIndex[] indices = new INDArrayIndex[] { NDArrayIndex.interval(
                            Math.max(0, index - window / 2),
                            Math.min(index + window / 2 + 1, tsBLength)) };
            distanceProfile.put(indices, Double.POSITIVE_INFINITY);
        }

        updateProfile(distanceProfile, distanceProfileIndex);
    }

to (adding NDArrayIndex.all() to be compatible with nd4j 1.0.0-M2.1)

    @Override
    public void run() {
        INDArray distanceProfile      = distProfile.getDistanceProfile(timeSeriesA, timeSeriesB, index, window);
        INDArray distanceProfileIndex = distProfile.getDistanceProfileIndex(tsBLength, index, window);

        if (trivialMatch) {
            INDArrayIndex[] indices = new INDArrayIndex[] { NDArrayIndex.all(), NDArrayIndex.interval(
                            Math.max(0, index - window / 2),
                            Math.min(index + window / 2 + 1, tsBLength)) };
            distanceProfile.put(indices, Double.POSITIVE_INFINITY);
        }

        updateProfile(distanceProfile, distanceProfileIndex);
    }

I get errors like

  java.lang.IllegalStateException: Indices are out of range: Cannot get interval index Interval(b=0,e=5,s=1) on array with size(1)=4. Array shape: [1, 4], indices: [all(), Interval(b=0,e=5,s=1)]

for all "testMatrixProfileSelfJoin*" test cases of the Matrix Profile test. The cause of this error is that distanceProfile.put(...) (which calls INDArray.get()) fails, because the IntervalIndex in indices is larger than the distanceProfile array. This, in turn, is because the IntervalIndex is created using the size of tsB, which is larger than distanceProfile.

One way to cure this is to change the code in MatrixProfileCalculator to

    @Override
    public void run() {
        INDArray distanceProfile      = distProfile.getDistanceProfile(timeSeriesA, timeSeriesB, index, window);
        INDArray distanceProfileIndex = distProfile.getDistanceProfileIndex(tsBLength, index, window);

        if (trivialMatch) {
            INDArrayIndex[] indices = new INDArrayIndex[] { NDArrayIndex.all(), NDArrayIndex.interval(
                            Math.max(0, index - window / 2),
                            Math.min(index + window / 2 + 1, distanceProfile.length())) };
            distanceProfile.put(indices, Double.POSITIVE_INFINITY);
        }

        updateProfile(distanceProfile, distanceProfileIndex);
    }

I am not sure if this is the right approach, as it changes the logic of calculating the Matrix Profile. With this change the tests no longer throw errors, but I get assertion failures in the Windows8, 2SawTeeth, 2Humps, ... tests; Windows4, Windows5, StraightLine, and Plateau become green.

Could you have a look into this?

Add MIT license.txt file

The README says

Distributed under the MIT license. See license.txt for more information.

However, I do not see a license.txt file anywhere.
