
rem's Introduction

rem

🧠 Remember everything. (very alpha - download anyway)


🚨 Looking for contributions / help! 🚨

I would love to keep this project alive and growing, but can't do it alone.

If you're at all interested in contributing, please feel free to reach out, start a discussion, open a PR, look at issues, look at roadmap below, etc.

Something not working properly? There's no telemetry or tracking, so I won't know! Please log an issue, or take a crack at fixing it yourself and submit a PR! Have feature ideas? Log an issue!


Original Demo

An open source approach to locally recording everything you view on your Mac (prefer other platforms? Come help build xrem, the cross-platform version of this project).

Note: Only tested on Apple Silicon, but there is now an Intel build


This is an early version (rem could use your help!)

Please log any bugs / issues you find!

Looking at this code and grimacing? Want to help turn this project into something awesome? Please contribute. I haven't written Swift since 2017. I'm sure you'll write better code than me.


I think the idea of recording everything you see has the potential to change how we interact with our computers, and believe it should be open source.

Also, from a privacy / security perspective, this is like... pretty scary stuff, and I want the code open so we know for certain that nothing is leaving your laptop. Even telemetry has the potential to leak private info.

This is 100% local. Please, read the code yourself.

Also, that means there is no tracking / analytics of any kind, which means I don't know you're running into bugs when you do. So please report any / all you find!

Features:

  • Automatically take a screenshot every 2 seconds, recognizing all text, with an approach that is efficient in both disk space and energy use (a rough sketch of one capture/OCR pass appears just after this list)
  • Go back in time (full-screen scrubber of everything you've viewed)
  • Copy text from back in time
  • Search everything you've viewed with keyword search (and filter by application)
  • Easily grab recent context for use with LLMs
  • First Intel build (please help test!)
  • It "works" with external / multiple monitors connected
Roadmap:

  • Natural language search / agent interaction via updating local vector embedding
  • Novel search experiences like spatial / similar images
  • More search filters (by time, etc.)
  • Fine-grained purging / trimming / selecting recording
  • Better / First-class multi-monitor support

Getting Started

  • Download the latest release, or build it yourself!
  • Launch the app
  • Click the brain
  • Click "Start Remembering"
  • Grant it access to "Screen Recording", i.e. permission to take screenshots every 2 seconds
  • Click "Open timeline" or "Cmd + Scroll Up" to open the timeline view
    • Scroll left or right to move in time
  • Click "Search" to open the search view
    • Search your history and click on a thumbnail to go there in the timeline
  • In timeline, give Live Text a second and then you can select text
  • Click "Copy Recent Context" to grab a prompt for interacting with an LLM with what you've seen recently as context
  • Click "Show Me My Data" to open a finder window where rem stores SQLite db + video recordings
  • Click "Purge All Data" to delete everything (useful if something breaks)

(that should be all that's needed)

Build it yourself

  • Clone the repo: git clone --recursive -j8 https://github.com/jasonjmcghee/rem.git (or run git submodule update --init --recursive after cloning)
  • Open project in Xcode
  • Product > Archive
  • Distribute App
  • Custom
  • Copy App

FAQ

  • Where is my data?
    • Click "Show Me My Data" in the tray / status icon menu
    • Currently it is stored in: ~/Library/Containers/today.jason.rem/Data/Library/Application Support/today.jason.rem
    • It was originally: ~/Library/Application\ Support/today.jason.rem/

(Never)AQ

  • Wow that logo is so great, you're an artist. Can I see your figma?

Xcode + copy / paste from history:

(demo video: Screen.Recording.2023-12-27.at.8.38.07.PM.mov)

rem's People

Contributors

caetano-dev, jasonjmcghee, ruslanjabari, seletz, sudocurse


rem's Issues

Skip processing identical screens

I noticed that the videos and search results contain quite a lot of near-identical images which are not worth recording and OCRing.

The trick here is probably to detect how much has changed between frames. This could save a lot of processing resources and shrink the disk footprint.
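
One cheap way to do that (a sketch, not what rem does today): downsample each frame to a small grayscale grid, compare it against the previous frame's grid, and skip OCR and storage when almost nothing changed. Grid size and thresholds below are made up.

    import CoreGraphics
    import Foundation

    // Hypothetical change detector: shrink two frames to a small grayscale
    // grid and measure the fraction of cells that differ noticeably.
    func changedFraction(_ a: CGImage, _ b: CGImage, grid: Int = 32) -> Double {
        func luminanceGrid(_ image: CGImage) -> [UInt8] {
            var pixels = [UInt8](repeating: 0, count: grid * grid)
            pixels.withUnsafeMutableBytes { buffer in
                let context = CGContext(
                    data: buffer.baseAddress, width: grid, height: grid,
                    bitsPerComponent: 8, bytesPerRow: grid,
                    space: CGColorSpaceCreateDeviceGray(),
                    bitmapInfo: CGImageAlphaInfo.none.rawValue)!
                context.interpolationQuality = .low
                context.draw(image, in: CGRect(x: 0, y: 0, width: grid, height: grid))
            }
            return pixels
        }
        let (ga, gb) = (luminanceGrid(a), luminanceGrid(b))
        let changed = zip(ga, gb).filter { abs(Int($0.0) - Int($0.1)) > 16 }.count
        return Double(changed) / Double(grid * grid)
    }

    // e.g. skip the frame when under 1% of the grid changed:
    // if changedFraction(previous, current) < 0.01 { continue }
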

Recording Permissions never seems to get detected

I'm having a bit of a strange issue.

I have double-checked that screen recording permissions for Rem.app are allowed. However, every time I click "start recording" from the menu bar, rem just asks for permissions again and brings up System Settings.

I have attempted a reinstall and the compat version.

This is on an M1 MacBook Pro.

»Open Timeline« shows blank grey screen

Hi! I wanted to try out the app: I clicked "Start Remembering", used my computer for about 10 minutes, clicked "Stop Remembering" and then "Open Timeline". It opened a fullscreen, blank grey window with no controls and no content. Is there any way I can send you debug information?

Also, clicking on "Search" results in the screen I attached; it seems the screenshots are not loading correctly?

I'd love to be able to help more on this!

(screenshot attached: Bildschirmfoto 2023-12-28 um 11 35 04)

Check SQLite.framework reference -- iOS?

Summary

I had some trouble getting an initial build to work in Xcode after cloning SQLite.swift.

I had to update the (relative path) reference to SQLite.swift from within Xcode (took an embarrassingly long time because I'm not a daily Xcode user...) but then I ran into another issue: a build failure due to a missing build directory:

Showing All Messages
No such file or directory: '/Users/<username>/Library/Developer/Xcode/DerivedData/rem-gzgveujhyqjmemcyrgrqjanzsmqi/Build/Products/Debug-iphoneos/SQLite.framework/SQLite'

Note: Debug-iphoneos - I don't have anything except the macOS SDKs installed.

I poked around on the internet and tried to read between the lines, but ultimately fixed it by deleting SQLite.framework in the rem target Build Phases section under both Link Binary with Libraries and Embed Frameworks and re-adding it, taking care to add the macOS SDK, not the iPhone version. After that my builds succeeded.

(screenshot attached)

Additional Info

I'm not sure if it's portable, but the relevant git diff in my .xcodeproj file is:

diff --git a/rem.xcodeproj/project.pbxproj b/rem.xcodeproj/project.pbxproj
index 6f2ec5e..9a66d6f 100644
--- a/rem.xcodeproj/project.pbxproj
+++ b/rem.xcodeproj/project.pbxproj
@@ -7,6 +7,8 @@
 	objects = {

 /* Begin PBXBuildFile section */
+		78349C542B3DE8630009C28F /* SQLite.framework in Embed Frameworks */ = {isa = PBXBuildFile; fileRef = 96E66BC32B2F5745006E1E97 /* SQLite.framework */; settings = {ATTRIBUTES = (CodeSignOnCopy, RemoveHeadersOnCopy, ); }; };
+		78349C582B3DE9050009C28F /* SQLite.framework in Frameworks */ = {isa = PBXBuildFile; fileRef = 96E66BC32B2F5745006E1E97 /* SQLite.framework */; };
 		961C95DA2B2E19B30093F228 /* remApp.swift in Sources */ = {isa = PBXBuildFile; fileRef = 961C95D92B2E19B30093F228 /* remApp.swift */; };
 		961C95DC2B2E19B30093F228 /* ContentView.swift in Sources */ = {isa = PBXBuildFile; fileRef = 961C95DB2B2E19B30093F228 /* ContentView.swift */; };
 		961C95E12B2E19B40093F228 /* Preview Assets.xcassets in Resources */ = {isa = PBXBuildFile; fileRef = 961C95E02B2E19B40093F228 /* Preview Assets.xcassets */; };
@@ -19,8 +21,6 @@
 		961C96152B2EBEE50093F228 /* DB.swift in Sources */ = {isa = PBXBuildFile; fileRef = 961C96142B2EBEE50093F228 /* DB.swift */; };
 		969BA2EC2B3D1D46009EE9C6 /* SettingsManager.swift in Sources */ = {isa = PBXBuildFile; fileRef = 969BA2EB2B3D1D46009EE9C6 /* SettingsManager.swift */; };
 		969F3EFF2B3A8C4D0085787B /* HotKey in Frameworks */ = {isa = PBXBuildFile; productRef = 969F3EFE2B3A8C4D0085787B /* HotKey */; };
-		969F3F042B3B70560085787B /* SQLite.framework in Frameworks */ = {isa = PBXBuildFile; fileRef = 96E66BBF2B2F5745006E1E97 /* SQLite.framework */; };
-		969F3F052B3B70560085787B /* SQLite.framework in Embed Frameworks */ = {isa = PBXBuildFile; fileRef = 96E66BBF2B2F5745006E1E97 /* SQLite.framework */; settings = {ATTRIBUTES = (CodeSignOnCopy, RemoveHeadersOnCopy, ); }; };
 		969F3F082B3B7C7C0085787B /* RemFileManager.swift in Sources */ = {isa = PBXBuildFile; fileRef = 969F3F072B3B7C7C0085787B /* RemFileManager.swift */; };
 		969F3F0B2B3CB2110085787B /* Field.swift in Sources */ = {isa = PBXBuildFile; fileRef = 969F3F0A2B3CB2110085787B /* Field.swift */; };
 		969F3F0D2B3CCEC30085787B /* Ask.swift in Sources */ = {isa = PBXBuildFile; fileRef = 969F3F0C2B3CCEC30085787B /* Ask.swift */; };
@@ -162,7 +162,7 @@
 			dstPath = "";
 			dstSubfolderSpec = 10;
 			files = (
-				969F3F052B3B70560085787B /* SQLite.framework in Embed Frameworks */,
+				78349C542B3DE8630009C28F /* SQLite.framework in Embed Frameworks */,
 			);
 			name = "Embed Frameworks";
 			runOnlyForDeploymentPostprocessing = 0;
@@ -201,7 +201,7 @@
 			isa = PBXFrameworksBuildPhase;
 			buildActionMask = 2147483647;
 			files = (
-				969F3F042B3B70560085787B /* SQLite.framework in Frameworks */,
+				78349C582B3DE9050009C28F /* SQLite.framework in Frameworks */,
 				969F3EFF2B3A8C4D0085787B /* HotKey in Frameworks */,
 			);
 			runOnlyForDeploymentPostprocessing = 0;

Other

OMG thank you for posting this. I've been meaning to take a poke at it, but it's much easier to have a skeleton to work inside since I'm not a Swift/Objective-C user normally.

"Recording" itself isn't happening (for some users, still unclear who/why)

Rem worked great on my M1 MBP, but it isn't functioning as intended on my M1 iMac.

Toggling 'timeline' shows: (screenshot attached)

And using the 'search' feature shows: (screenshot attached)

Searching a specific query does show results matching the word(s), but without the actual thumbnail/recording. Clicking on any search result shows the first screenshot all over again.

'Show Me My Data' opens Finder with several mp4 files, but I'm unable to open any of them: QuickTime says the files are incompatible (and they all weigh 0 KB). Unsure why the recording isn't occurring, as I've granted screen recording permissions.

Feature request: Exclude Applications

I'd like to request a feature for it to stop taking screen shots if a specific application is in focus. That way if there is an application that works with sensitive data, it will never record it.

Thank you for considering this feature.
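
For what it's worth, the check itself could be tiny; a sketch (the bundle ID and function name here are invented, and this isn't wired into rem):

    import AppKit

    // Hypothetical exclusion list of sensitive apps, keyed by bundle ID.
    let excludedBundleIDs: Set<String> = ["com.example.passwordmanager"]

    // Skip capture while an excluded app is frontmost.
    func shouldCaptureNow() -> Bool {
        guard let frontmost = NSWorkspace.shared.frontmostApplication,
              let bundleID = frontmost.bundleIdentifier else { return true }
        return !excludedBundleIDs.contains(bundleID)
    }
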

Build error: error: Build input file cannot be found: '/Users/kurt/GitHub/rem/rem/SettingsManager.swift'. Did you forget to declare this file as an output of a script phase or custom build rule which produces it? (in target 'rem' from project 'rem')

Steps to reproduce:

  1. Install Xcode on a Mac M1
  2. Add GitHub account
  3. Add Apple account and enable signing for personal team; let Xcode manage it
  4. Fork rem repo (kurtseifried/rem)
  5. Open project, clone git repo kurtseifried/rem
  6. Import SQLite from https://github.com/stephencelis/SQLite.swift
  7. Clean build folder to be on the safe side
  8. Hit build, get error:

error: Build input file cannot be found: '/Users/kurt/GitHub/rem/rem/SettingsManager.swift'. Did you forget to declare this file as an output of a script phase or custom build rule which produces it? (in target 'rem' from project 'rem')

Am I missing something obvious here?

Hooks / Triggers

What if there was a concept of "hooks"?

Users could set up a trigger so that, when captured data (text, images, etc.) meets certain criteria, something happens.


In general, I would love to lay a foundation of access to the data in a way that opens up doors.
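
To make the idea concrete, one speculative shape the API could take (every name here is invented):

    import Foundation

    // A frame's worth of captured data, as a hook would see it.
    struct FrameData {
        let timestamp: Date
        let applicationName: String
        let recognizedText: String
    }

    // A hook inspects each processed frame and fires when its criteria match.
    protocol Hook {
        func matches(_ frame: FrameData) -> Bool
        func fire(_ frame: FrameData)
    }

    // Example: notify whenever a given keyword shows up on screen.
    struct KeywordHook: Hook {
        let keyword: String
        func matches(_ frame: FrameData) -> Bool {
            frame.recognizedText.localizedCaseInsensitiveContains(keyword)
        }
        func fire(_ frame: FrameData) {
            print("\(frame.timestamp): saw \"\(keyword)\" in \(frame.applicationName)")
        }
    }
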

Investigate: compute embeddings via CoreML model

(Splitting out this discussion from #17; putting it here to document what I tried in case someone else wants to follow up)

I attempted to convert the gte-small model from HuggingFace from pytorch --> CoreML and integrated it into rem.

Attempt #1: just use the CoreML model that somebody uploaded to the HF repo a few weeks ago.

Result: I was able to easily get a tokenizer imported via swift-transformers, and import the CoreML model, but the actual model prediction resulted in NaNs.

Attempt #2: convert the model myself using the huggingface exporters project.

Result: conversion fails in the validation phase, because it outputs NaNs... (see a pattern here? :) )

Attempt #3: manual conversion following the coremltools documentation.

Result: kind of a few different things, but mostly: NaNs.

I'm unclear whether converting a pytorch embedding model specifically is something that's supported/intended by coremltools. They include a lot of models that seem much more complicated than a BERT embedding model should be, but 🤷.

After a lot of poking and tweaking of inputs, I was able to get the pytorch model loaded into CoreML in fp16 format (it was defaulting to fp32 for some reason -- I think that's why the model uploaded to HF was so big to begin with). At that point I got fp32 <--> fp16 compatibility issues from coremltools, which is a definite improvement, but... still not functional.

Error:

... snip ...
  File "/Users/robertgay/.pyenv/versions/exporters/lib/python3.10/site-packages/coremltools/converters/mil/mil/operation.py", line 190, in __init__
    self._validate_and_set_inputs(input_kv)
  File "/Users/robertgay/.pyenv/versions/exporters/lib/python3.10/site-packages/coremltools/converters/mil/mil/operation.py", line 503, in _validate_and_set_inputs
    self.input_spec.validate_inputs(self.name, self.op_type, input_kvs)
  File "/Users/robertgay/.pyenv/versions/exporters/lib/python3.10/site-packages/coremltools/converters/mil/mil/input_type.py", line 137, in validate_inputs
    raise ValueError(msg)
ValueError: In op, of type layer_norm, named input.5, the named input `epsilon` must have the same data type as the named input `gamma`. However, epsilon has dtype fp32 whereas gamma has dtype fp16.

Summary

So... I'm going to table this for now, given that there's already a more flexible/probably less finicky alternative (the rust lib + bindings). It was fun while it lasted, but there are only so many hours in the day. 😅

(feel free to close this, I just didn't want to clutter up #17 given that there are ~3 discussions happening there right now.)

hevc_videotoolbox support

We can get better performance in terms of encoding efficiency, along with smaller file sizes.

Just swapping in hevc_videotoolbox for h264_videotoolbox appears to "work", but we need a way to extract the frames, and the current approach breaks.

No investigation has been conducted yet.

To be clear: our custom ffmpeg build has hevc_videotoolbox support too (literally the only other thing it supports), so no changes are needed there.
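
For reference, a sketch of what the swap looks like when shelling out to the bundled ffmpeg (the path and most arguments are illustrative, not rem's actual invocation):

    import Foundation

    func encodeChunk(frameListPath: String, outputPath: String) throws {
        let process = Process()
        process.executableURL = URL(fileURLWithPath: "/path/to/bundled/ffmpeg") // placeholder path
        process.arguments = [
            "-f", "concat", "-safe", "0", "-i", frameListPath,
            "-c:v", "hevc_videotoolbox",   // was: h264_videotoolbox
            "-tag:v", "hvc1",              // lets QuickTime/AVFoundation read the result
            outputPath
        ]
        try process.run()
        process.waitUntilExit()
    }

The -tag:v hvc1 part may matter for the frame-extraction breakage: ffmpeg tags HEVC in mp4 as hev1 by default, which some Apple frameworks refuse to open.
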

Intel version gobbling mem

I've been running the Intel version on and off on an intermittently-used 2019 16" MBP. I hadn't noticed this until recently (so perhaps it's related to a recent Sonoma developer beta?), but it's been gobbling memory and causing my MBP to crash.

I'd recently recovered from a crash and hadn't yet restarted rem, but had Activity Monitor up and was doing some "normal computing". I started rem, started recording, piddled around a bit and wandered away for a few minutes (maybe 10?). WindowServer memory was at ~1.5 GB, and when I came back it was almost at 10 GB. I then played back a rem video from a little while before, while the laptop was sitting mostly idle, and watched WindowServer grow 4-5 GB in memory in the span of a single mp4.

I'm going to stop running it here, at least until the next beta update, and see how things look then; perhaps others not running the Sonoma beta are having better luck.

Better multi-monitor support

Recording all connected displays at all times is going to dramatically impact performance.

We need to think this through and consider a "record active display" kind of experience.

But the drawback here, regardless, is that the captured screen no longer has the guarantee of matching the laptop screen resolution. Honestly, that's a problem in general if you plug in a monitor.

Anyway, would be great to think this through / plan it out.
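
As a starting point, one cheap definition of "active display" (a sketch, untested) is the screen currently containing the mouse pointer:

    import AppKit

    // Returns the CGDirectDisplayID of the screen under the mouse pointer.
    func activeDisplayID() -> CGDirectDisplayID? {
        let mouse = NSEvent.mouseLocation
        guard let screen = NSScreen.screens.first(where: { $0.frame.contains(mouse) })
        else { return nil }
        // NSScreen exposes its display ID through deviceDescription.
        let key = NSDeviceDescriptionKey("NSScreenNumber")
        return (screen.deviceDescription[key] as? NSNumber).map {
            CGDirectDisplayID($0.uint32Value)
        }
    }

The display holding the frontmost window would be another reasonable definition.
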

Add option to Disable/Customize Shortcut

I use Figma and regularly use Cmd+Scroll to navigate the canvas. It seems like this activates the timeline, which means I can't use rem at the same time as Figma.
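
A sketch of what gating the gesture behind a setting could look like ("timelineGestureEnabled" is a hypothetical defaults key, and this may not match how rem actually installs its event monitor):

    import AppKit

    // Only react to Cmd+Scroll Up when the user has the gesture enabled.
    let monitor = NSEvent.addGlobalMonitorForEvents(matching: .scrollWheel) { event in
        guard UserDefaults.standard.bool(forKey: "timelineGestureEnabled"),
              event.modifierFlags.contains(.command),
              event.scrollingDeltaY > 0 else { return }
        // open the timeline here
    }
    // keep `monitor` alive for as long as the gesture should work
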

Settings Window Does Not Use MacOS Standard

It seems to me that the current settings implementation does not use the standard macOS UI for settings.

Pressing Cmd-, should open the application settings; instead, it opens an empty window.

This should be an easy fix -- at least in my ObjC days it was pretty simple to add a standard settings window. I might try to get this going -- no promises, though, as I haven't done macOS UI development in a very long time. I'll see how it goes.
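
In SwiftUI (which rem appears to use, judging by remApp.swift/ContentView.swift), the standard route is a Settings scene, which creates the conventional settings window and binds Cmd-, automatically. A minimal sketch with placeholder views, not rem's actual app struct:

    import SwiftUI

    @main
    struct RemAppSketch: App {
        var body: some Scene {
            MenuBarExtra("rem", systemImage: "brain") {
                Button("Start Remembering") { /* ... */ }
            }
            Settings {
                Form {
                    Toggle("Only OCR frontmost window", isOn: .constant(false))
                }
            }
        }
    }
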

Optionally only OCR the content of the frontmost application

To reduce noise and increase data quality, it might be good to OCR only the frontmost application window's portion of the captured screenshots.

I propose to add a new setting for this -- default off.

The implementation should be straightforward. I'll try to get this going and add a PR.
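
A sketch of the OCR side of this (it assumes the frontmost window's rectangle in image coordinates is already known at capture time; the coordinate conversion itself is elided, and names are placeholders):

    import CoreGraphics
    import Vision

    // Crop the captured frame to the frontmost window before running OCR,
    // falling back to the full frame when no window rect is available.
    func recognizeText(in frame: CGImage, frontmostWindowRect: CGRect?) throws -> [String] {
        var target = frame
        if let rect = frontmostWindowRect, let cropped = frame.cropping(to: rect) {
            target = cropped
        }
        let request = VNRecognizeTextRequest()
        try VNImageRequestHandler(cgImage: target).perform([request])
        return (request.results ?? []).compactMap { $0.topCandidates(1).first?.string }
    }
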

“rem.app” is damaged and can’t be opened. You should eject the disk image.

(screenshot attached)

Just thought I'd let you know that I cannot open the app on an M1 MacBook Air.

Model Name: MacBook Air
Model Identifier: MacBookAir10,1
Model Number: MGN63RU/A
Chip: Apple M1
Total Number of Cores: 8 (4 performance and 4 efficiency)
Memory: 8 GB
System Firmware Version: 10151.61.4
OS Loader Version: 10151.61.4
Serial Number (system): C02DP9EMQ6L4
Hardware UUID: 2F02303C-6478-55AD-80EA-0EFE3AAFB5DC
Provisioning UDID: 00008103-001D29322E62001E
Activation Lock Status: Enabled

When rem is Remembering, mediaanalysisd spins up taking 30%-50% CPU

Issue:

When rem is running, and actively Remembering a session, Apple's mediaanalysisd also starts up. It will run constantly with CPU between 30% - 50% of total usage for the entire session. (rem itself consumes around 5% CPU.) After selecting Stop Remembering in the rem menu, the process peaks briefly and then drops away.

What is expected:

Not sure...

Why is this an issue?

For an app which derives its functionality from running all day, triggering another process that consumes considerable resources might lead to unnecessary battery consumption.

Discussion

In comparison, using QuickTime to record a (full screen) capture session does not trigger mediaanalysisd. Once a QuickTime screen capture movie is recorded and available in a QuickTime window, its text is also selectable, with copy to clipboard available.

QuickTime's screencapture process hovers between 15% - 20% CPU.

rem, when running, shows an Energy Impact of between 100 and 300 on my M1 MBP 14", on battery. In comparison, with QuickTime running and full-screen recording, Energy Impact for the screencapture process is around 10 - 15, an order of magnitude less.

Is mediaanalysisd doing something specific and of value for rem? Is it actively spawned by rem, or spawned by macOS due to the way the video is being stored?

Is there anything to be learned from the way Quicktime does its screen capture that would make rem more efficient for long-term active sessions?

Animated wallpaper causes CPU/battery drain even when not recording

I haven't had a chance to dig into this much, but I was running a debug build of rem that was not set to remember anything. And I happened to check the battery drain:

(screenshot attached)

I checked Activity Monitor and the rem process was pulling 100% CPU doing... something? Not sure what.

Tossing this here in case someone else sees the same behavior with more details.

edit: FYI, my build was based on 4ea0929 with no local changes.

Window retrieval for application identification uses `isActive`, which can match multiple windows; it's the incorrect field to use

I've been poking at the screenshot capture with the goal of filtering to only the foreground application, and I noticed that what's being captured, even on main, doesn't seem to match what I'd expect given the assumed intent of the code.

In particular,

let window = shareableContent.windows.first { $0.isActive }

looks like it's attempting to find the active window, where "active" is presumably meant to be "the one on top"(?). But there are actually a lot of windows that have isActive = true when I'm running.

Adding the following debugging code to that method:

            shareableContent.windows.forEach() {
                if $0.isActive {
                    print("Active window: \($0.title)")
                }
            }

results in basically every open application on my machine, plus some Desktop-y and menubar-y looking things:

active window: Optional("Offscreen Wallpaper Window")
active window: Optional("Wallpaper-")
active window: Optional("Item-0")
active window: Optional("Recents")
active window: Optional("rem — remApp.swift")
active window: Optional("Item-0")
active window: Optional("Screenshot 2023-12-28 at 7.16.08 PM")
... snip ...

I spent some time looking at the various ScreenCaptureKit APIs, including SCContentFilter, and SCScreenshotManager, but haven't figured out how one would go about getting the actual highlighted/selected window. Just wanted to flag this in case someone driving by knew the answer. :)
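
One heuristic workaround (a sketch; not verified against all the oddities above): match windows against NSWorkspace's frontmost application and keep only normal-layer, on-screen windows, which should exclude the wallpaper and menu bar items:

    import AppKit
    import ScreenCaptureKit

    func frontWindow(in content: SCShareableContent) -> SCWindow? {
        guard let app = NSWorkspace.shared.frontmostApplication else { return nil }
        return content.windows.first {
            $0.owningApplication?.processID == app.processIdentifier
                && $0.windowLayer == 0   // skips wallpaper / status items
                && $0.isOnScreen
        }
    }
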

If quit before video processed, frames still exist in sqlite db, but produce grey screen

For example, frames without associated video:

sqlite> select * from video_chunks order by id desc limit 2;
10|/Users/jason/Library/Application Support/today.jason.rem/output-1703765761.890883.mp4
9|/Users/jason/Library/Application Support/today.jason.rem/output-1703765749.5049071.mp4

sqlite> select * from frames order by id desc limit 2;
80|11|1|2023-12-28T12:16:06.476|System Settings
79|11|0|2023-12-28T12:16:04.026|System Settings
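
A cleanup pass at startup could drop these orphans; a sketch using SQLite.swift (rem's DB layer), where the chunk-id column name is a guess from the dump above and may not match rem's actual schema:

    import SQLite

    // Delete frames that reference a video chunk that was never written.
    func purgeOrphanedFrames(db: Connection) throws {
        _ = try db.run("""
            DELETE FROM frames
            WHERE chunk_id NOT IN (SELECT id FROM video_chunks)
        """)
    }
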

Screen Recording Permissions don't re-prompt

What I Did:
Tried to allow screensharing permissions

What Happened:
When I was prompted to allow screen-sharing I already had a System Settings dialog open, so macOS couldn't open the Screenshare pane for me to allow rem.

Once I'd gotten out of there I wasn't in the screenshare pane, and when I went back to rem it was still greyed out in the menu bar and the "Start Recording" menu item was still available. I re-clicked it a few times but rem didn't enable and didn't ask me to enable screensharing.

Once I went into the screensharing pane and allowed rem (which was in the list - I didn't have to go find it) I was able to "Start Recording" and rem enabled itself.

What I Expected (or at least a would-be-nice):
If rem doesn't have the appropriate permissions, and so can't enable recording when you try to enable it, it would be amazing if it re-started the allow-permissions workflow (if that's even possible - I think I've seen it elsewhere)

Context:
2021 16" MBP
Sonoma 14.2.1
Built from source with Xcode 15.1 at 6200775
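
For what it's worth, a permission preflight along these lines might help (a sketch; note that CGRequestScreenCaptureAccess only shows the system prompt once per app launch, so a fully automatic re-prompt may not be possible):

    import CoreGraphics

    // Returns true if screen recording is already allowed; otherwise asks
    // macOS to prompt (which it will do at most once per launch).
    func ensureScreenRecordingPermission() -> Bool {
        if CGPreflightScreenCaptureAccess() { return true }
        return CGRequestScreenCaptureAccess()
    }
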

Bugs I’ve found

Hello,

First, I must say thank you for this project, as it could become a real replacement for RewindAI, which I couldn't morally set up as it wasn't open source (my Mac has my entire life on it, so I'm really careful).

I’m here to report several bugs I’ve encountered:

  • rem quits itself at random times. I tried to understand or find out when the app "crashes", but I couldn't, as it would work continuously for hours and then, when I check the app icon at the top, it's gone. I'll try to investigate this further.
  • when using the timeline feature (really love it), if you go all the way to the right, i.e. to the present time, it shows a grayish image and then you can't use the timeline anymore. You have to quit the app and "start remembering" again.
  • also when using the timeline, some images often can't be loaded (there's a message I don't remember), and perhaps more often, images are just grayish even though the app was remembering at that time.
  • when using the search feature, if you click on a result, the image doesn't show up; instead it does weird things to the screen, so you have to escape the menu.

I think that’s all for the bugs.

There’s just one single feature that would be enough to make the app really work (even if so many things can be added to it, of course), this is the presence of the timeline in the timeline. Not knowing when a picture is dated is really unpractical for fast search.

All in all, I really hope this project has a bright future, as I think that having open source AI is a must have.

Cheers.
