janjakubnanista / downsample
Collection of several downsampling methods for time series visualisation purposes.
License: MIT License
Is it possible to support ArrayBuffer and similar structures like Uint8Array, Float32Array, etc.? Ideally it would be great if I could pass a Uint8Array as an argument and get an array of the same type back. Right now I'd have to convert the ArrayBuffer to the DataPoint structure, and that's very inefficient in my case.
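For illustration, a minimal sketch of the conversion I mean, assuming the tuple-based DataPoint shape that LTTB accepts (toDataPoints is a hypothetical helper, and this copy is exactly the overhead I'd like to avoid):

import { LTTB } from 'downsample';

// Hypothetical helper: expand a typed array of y-values into the
// [x, y] tuples the library currently accepts, using the element
// index as x.
function toDataPoints(values: Float32Array): Array<[number, number]> {
  const points: Array<[number, number]> = new Array(values.length);
  for (let i = 0; i < values.length; i++) {
    points[i] = [i, values[i]];
  }
  return points;
}

const raw = new Float32Array([10, 15, 13, 17]);
const reduced = LTTB(toDataPoints(raw), 2);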
I'm using Plotly, and my tooling generates data formatted like this:
var trace1 = {
x: [1, 2, 3, 4],
y: [10, 15, 13, 17],
mode: 'markers',
type: 'scatter'
};
var trace2 = {
x: [2, 3, 4, 5],
y: [16, 5, 11, 9],
mode: 'lines',
type: 'scatter'
};
var trace3 = {
x: [1, 2, 3, 4],
y: [12, 9, 15, 12],
mode: 'lines+markers',
type: 'scatter'
};
var data = [trace1, trace2, trace3];
Plotly.newPlot('myDiv', data);
Since there's a lot of data (the reason I'm downsampling in the first place!), changing data shapes twice would be a big overhead, unless there's an efficient way I'm missing.
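To make the round trip concrete, here is a sketch of the two shape changes involved, assuming the tuple-based LTTB signature (trace1 is the object defined above; the cast sidesteps the return typing):

import { LTTB } from 'downsample';

// First shape change: zip Plotly's parallel x/y arrays into tuples.
const zipped = trace1.x.map(
  (x: number, i: number): [number, number] => [x, trace1.y[i]]
);

// Cast to a plain tuple array so .map below type-checks.
const reduced = LTTB(zipped, 2) as unknown as Array<[number, number]>;

// Second shape change: unzip the tuples back into Plotly's format.
const downsampledTrace = {
  x: reduced.map((p) => p[0]),
  y: reduced.map((p) => p[1]),
  mode: 'lines',
  type: 'scatter',
};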
import { LTTB } from 'downsample/methods/LTTB'
const t = LTTB([[1, 2]], 1)
// Is an array at runtime
console.log(Array.isArray(t))
// ...but is not typed as one, so this call fails to type-check
t.map(console.log)
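A workaround sketch until the typings are fixed; the cast only asserts what the Array.isArray check above already shows at runtime:

import { LTTB } from 'downsample/methods/LTTB'

const t = LTTB([[1, 2]], 1)

// Assert that the result is a plain array of tuples so that array
// methods such as .map type-check.
const points = t as unknown as Array<[number, number]>
points.map((p) => console.log(p))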
Thanks for the module! Is there a way to import a specific method in Node v8.5+ with the --experimental-modules flag? import { LTTB } from 'downsample' fails with:
SyntaxError: The requested module 'downsample' does not provide an export named 'LTTB'
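A likely workaround, assuming the package is published as CommonJS: under --experimental-modules, Node exposes a CommonJS module only as a default export, so named imports fail and you have to destructure instead.

// Import the CommonJS module's single default export, then destructure.
import downsample from 'downsample';
const { LTTB } = downsample;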
Hello, we were playing around with the periodic data demo and had a question regarding the ASAP downsampling method.
When I 'downsample' 1000 original points to 1000 downsampled points, I get a graph that looks like this:
[screenshot: 1000 -> 1000 result, ASAP trendline overlaid on the original data]
I would say the ASAP trendline matches the original data fairly accurately: it applies a little bit of smoothing, but the original data is pretty much intact.
However, when I downsample 900 original points to 900 downsampled points, I get a graph that looks like this:
[screenshot: 900 -> 900 result, noticeably smoother]
Theoretically, it seems to me that mapping 1000 -> 1000 or 900 -> 900 should make no difference in the amount of detail preserved, but in practice there is quite a bit of difference: the 900 -> 900 run lost significantly more detail than the 1000 -> 1000 run. We were wondering whether this is somehow a characteristic of the ASAP downsampling method, or a potential bug. We would like to get results like the 1000 -> 1000 run consistently, preserving roughly that level of detail after downsampling.
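One plausible explanation, an assumption on my part rather than anything confirmed by the library: ASAP picks its smoothing window from the autocorrelation of the input, so two truncations of the same periodic series can select different windows even when no points are dropped. A minimal reproduction sketch, assuming ASAP is exported from the package root with the same (data, targetLength) signature as LTTB:

import { ASAP } from 'downsample';

// The same synthetic periodic series, truncated to two lengths.
const series = (n: number): Array<[number, number]> =>
  Array.from({ length: n }, (_, i): [number, number] => [i, Math.sin(i / 10)]);

// Neither call reduces the point count, but the smoothing window
// chosen for 900 points may differ from the one chosen for 1000.
const smooth1000 = ASAP(series(1000), 1000);
const smooth900 = ASAP(series(900), 900);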