The render() method takes all the streams and a view, and generates an output. Views can be anything that follows a common interface of input and output functions.
Deliverables
design the basic logic of view functions
implement "html5_video_wall" as a proof-of-concept view
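One way the common view contract could look, sketched in plain JavaScript. The input()/output() function names, the render() signature, and the video-wall markup are all illustrative assumptions, not the real implementation:

```javascript
// Hypothetical view contract: every view exposes input() to receive
// stream entries and output() to produce the final artifact.
function render(streams, view) {
  // Feed every entry of every stream into the view...
  for (const stream of streams) {
    for (const entry of stream) view.input(entry)
  }
  // ...then ask the view to generate its output.
  return view.output()
}

// A toy "html5_video_wall"-style view that wraps entries in <video> tags.
function createVideoWallView() {
  const sources = []
  return {
    input(entry) { sources.push(entry) },
    output() {
      return '<div class="wall">' +
        sources.map((src) => `<video src="${src}"></video>`).join('') +
        '</div>'
    }
  }
}
```

The point of the contract is that render() never needs to know what kind of output a view produces; any object with those two functions qualifies.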
Datagram is pretty useless if you can't discover what others have created with their datagrams. searchIndex() allows the user to search the Open Discovery Index for all published datagrams. The default ODI is Datagram's own, but it can be changed.
Deliverables
returns a set of sharelinks based on search phrase
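A minimal sketch of the search step, assuming the index is a list of entries with a description and a sharelink (the entry shape and field names are assumptions for illustration):

```javascript
// Hypothetical searchIndex(): filter a published-datagram index by a
// search phrase and return the matching sharelinks.
function searchIndex(index, phrase) {
  const needle = phrase.toLowerCase()
  return index
    .filter((entry) => entry.description.toLowerCase().includes(needle))
    .map((entry) => entry.sharelink)
}
```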
Core Adapters are a way to express the type of a Core and what to expect from its data content. You can use Core Adapters to create your own Cores from scratch or easily import existing data constructs as Cores. Core Adapters should be flexible enough to express any type of digital content.
Deliverables
specification of how to design a core adapter
specification of how core adapters are handled by the hypervisor
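One possible shape for an adapter, sketched under the assumption that an adapter is a descriptor carrying a type tag plus encode/decode functions; the createCoreAdapter() helper and the '@type' field are hypothetical:

```javascript
// Hypothetical Core Adapter shape: tells the hypervisor what type of
// content a Core holds and how to serialize it to and from buffers.
function createCoreAdapter({ type, encode, decode }) {
  if (typeof type !== 'string') throw new Error('adapter needs a type')
  return { '@type': type, encode, decode }
}

// Example: a JSON adapter that stores arbitrary objects as buffers.
const jsonAdapter = createCoreAdapter({
  type: 'json',
  encode: (obj) => Buffer.from(JSON.stringify(obj)),
  decode: (buf) => JSON.parse(buf.toString())
})
```

Because the adapter is just data plus two pure functions, importing an existing data construct as a Core reduces to writing its encode/decode pair.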
Every project needs a logo, preferably a pixel-perfect one that took an unnecessary amount of time to tweak just right, with that extra oomph that nobody except me will ever notice.
Deliverables
black & white logo
conveys that datagram is your connection to data
works in favicon size and in billboard size
is easily understandable for everybody with basic knowledge of tech
uses simple shapes so it renders nicely on all DPIs
is just a first version that can and will be improved in the future
At the moment Datagram has no interface outside the library API. This limitation makes it unattractive to anybody except those who are integrating Datagram as part of their own system. So we need a user interface, and what better place to start than a good old CLI.
The first version of the CLI is pure commands, meaning there are no daemons or background processing. Later on we need to expand the CLI with daemon/background capabilities so you don't need to run it manually every time.
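The daemonless model could look like this: each invocation maps argv straight to one library call and then exits. The command names and handler signatures below are made up for illustration:

```javascript
// Hypothetical daemonless CLI dispatch: one process per command,
// no background state between invocations.
const commands = {
  create: (args) => `created datagram ${args[0]}`,
  open: (args) => `opened ${args[0]}`
}

function runCli(argv) {
  const [cmd, ...args] = argv
  if (!commands[cmd]) return `unknown command: ${cmd}`
  return commands[cmd](args)
}
```

A later daemon mode would only change the dispatch target (a long-running process instead of a direct call); the command table can stay the same.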
Sometimes when you try to download a bigger file, Datagram unexpectedly closes the connection. I suspect this is due to a sudden memory spike during serialization and encryption.
At the moment Datagram caches everything it downloads into a local storage. Most of the time this is what the user wants, especially when using dg as a library, but this behavior leads to duplicate data when the use case is to export data from remote datagrams as files.
Each stream already has the capability of cleaning up its cache; it just needs to be hooked up as an option.
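Hooking that up could be as small as an export option that calls the existing cleanup after the data has been written out. The option name and stream methods here are stand-ins, not the real API:

```javascript
// Hypothetical export path: read the stream's data, then optionally
// invoke the stream's existing cache cleanup so no duplicate copy
// is left in local storage.
function exportData(stream, { clearCacheAfterExport = false } = {}) {
  const data = stream.read()
  if (clearCacheAfterExport) stream.clearCache()
  return data
}
```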
User should be able to remove any added Core. When a Core is removed, it should also be remembered and blocked in the future.
Replication already has the ability to select which Cores the user wants to share and receive. Combined with a persisted block list in the meta core, we can easily remove blocked cores from both the want and share lists on the fly.
Deliverables
user can use remove_core(core_key) to remove a core
core is removed from meta core
core is automatically removed from want list in replication
core is automatically removed from sharing list in replication
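The deliverables above can be sketched as one function over simplified stand-in data structures (a Map for the meta core, a Set for the block list, plain arrays for replication's lists):

```javascript
// Sketch of remove_core(core_key): drop the core from the meta core,
// persist it on the block list so it stays blocked in the future, and
// prune it from replication's want and share lists.
function removeCore(state, coreKey) {
  state.metaCore.delete(coreKey)
  state.blockList.add(coreKey) // remembered, so it stays blocked
  state.replication.want = state.replication.want.filter((k) => k !== coreKey)
  state.replication.share = state.replication.share.filter((k) => k !== coreKey)
  return state
}
```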
At the moment if you want to get the list of data you have in your datagram, you need to download everything in the datagram, and then get the data names. This works when you are running local datagrams but fails when you just want to see inside a remote datagram.
This can be solved by creating an index array in each datagram at the key _index. Every time new data is stored, its key is appended to _index. The list method can then be refactored to fetch only the _index key.
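The _index idea in miniature, with a Map standing in for the datagram's key-value storage:

```javascript
// Every put appends the key to the index array stored at `_index`...
function put(store, key, value) {
  store.set(key, value)
  const index = store.get('_index') || []
  index.push(key)
  store.set('_index', index)
}

// ...so list() only has to fetch the single `_index` key instead of
// downloading everything in the datagram.
function list(store) {
  return store.get('_index') || []
}
```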
The current version of Datagram uses standard file system calls. To get more speed and avoid consuming memory like Chrome, we need streams. All internal parts already support streams, including encryption, so this doesn't require major changes.
The only bigger question is serialization, which is done with msgpack right now.
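The memory win comes from processing fixed-size chunks instead of whole files. A sketch of that idea, with a generic per-chunk transform standing in for serialization/encryption (this does not use the real Datagram internals):

```javascript
// A generator yields fixed-size chunks so only one chunk is held in
// memory at a time, instead of the whole file.
function* chunked(buffer, size) {
  for (let i = 0; i < buffer.length; i += size) {
    yield buffer.subarray(i, i + size)
  }
}

// Apply a per-chunk transform (stand-in for serialization/encryption)
// and reassemble the result.
function processChunks(buffer, size, transform) {
  const out = []
  for (const chunk of chunked(buffer, size)) out.push(transform(chunk))
  return Buffer.concat(out)
}
```

The open msgpack question is whether the serializer can work on such chunks independently, or needs framing so each chunk is a complete msgpack value.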
Admin stream is a stream in Datagram that stores all admin actions. Admin actions can be anything from adding new admins to marking certain blocks in certain streams as spam. Users of a specific datagram can decide freely whether they want to follow admin actions or not.
The feature list should communicate clearly what features need to be built, initial ideas on how to build them, and what the end result should look like so that it works with everything else.
Now anybody can add streams, which makes it easy to abuse the original sharer. Sigrid gives us the ability to control who can add new streams to a replicated Datagram. @telamon has been working on this challenge at https://github.com/telamon/multifeed-sigrid and we talked about integrating sigrid into Datagram.
Currently sigrid supports one-way authorization, meaning that once you authorize someone, you can't take it back. We can remedy this by letting the owner create authorized users that are replicated to all replicants as an "admin stream". When the owner wants to remove an admin, the admin is marked as unauthorized in the "admin stream".
Deliverables
by default Container must be locked to creator's key
user can add additional admins with add_admin(owner_key) and receive admin_key
added admin can authorize themselves with authorize(admin_key)
user can remove admins with remove_admin(admin_key)
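The revocable-admin idea on top of one-way authorization can be sketched as a log fold: the owner appends add/remove actions to the admin stream, and every replicant computes the current admin set from the log. The action shape below is a simplified assumption:

```javascript
// Replay the admin stream (an append-only log of admin actions) to
// derive the currently authorized admin set. Removal works even though
// the underlying authorization is one-way, because replicants honor
// the latest state of the log rather than the original grant.
function currentAdmins(adminStream) {
  const admins = new Set()
  for (const action of adminStream) {
    if (action.type === 'add_admin') admins.add(action.key)
    if (action.type === 'remove_admin') admins.delete(action.key)
  }
  return admins
}
```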
Core Service is used to request a core. If Core Definition is provided, a new core is created according to the definition. If keys are provided, Core Service will attempt to open an existing core.
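The Core Service decision reduces to one branch: a definition means "create new", keys mean "open existing". The createCore/openCore callbacks here are hypothetical stand-ins for the real internals:

```javascript
// Sketch of the Core Service request path: dispatch on which input
// was provided, error out if neither is present.
function requestCore({ definition, keys }, { createCore, openCore }) {
  if (definition) return createCore(definition) // new core from definition
  if (keys) return openCore(keys)               // open an existing core
  throw new Error('need either a Core Definition or keys')
}
```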
The open({ sharelink }) API method opens other datagrams based on the sharelink information. The Open Discovery Index where the sharelink is searched defaults to Datagram's own, but can be customized.
At the moment Datagram requires you to have the right user credentials. If these are not provided, it won't initialize. And even if it did initialize, it wouldn't open any data, because the data would fail verification due to the missing user id. This is fine for use cases where the user sends data between their own devices, but it prohibits sharing a datagram with others.
To make multi-user work, verification needs to be made optional, and Datagram needs a built-in user registry where the user can save read keys from other users for data verification.
Deliverables
Implement user registry
Refactor data fetch code to make verification & decryption optional
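A sketch of how the two deliverables could fit together: a registry keyed by user id, and a fetch path that verifies only when a read key is available (or when verification is explicitly required). The entry shape and the signature check are simplified assumptions, not the real crypto:

```javascript
// Hypothetical user registry: stores read keys from other users,
// indexed by user id.
function createUserRegistry() {
  const keys = new Map()
  return {
    addReadKey: (userId, key) => keys.set(userId, key),
    getReadKey: (userId) => keys.get(userId)
  }
}

// Fetch with optional verification: no read key means the data is
// returned unverified unless the caller demands verification.
function fetchData(entry, registry, { requireVerification = false } = {}) {
  const key = registry.getReadKey(entry.userId)
  if (!key) {
    if (requireVerification) throw new Error('no read key for ' + entry.userId)
    return entry.data // verification skipped
  }
  // Stand-in check: a real implementation would verify a signature here.
  if (entry.sig !== key) throw new Error('verification failed')
  return entry.data
}
```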
"stream" is much more analogous in action to what "cores" do technically. It makes no sense to create a new abstraction term "core" on top of "stream" when it brings no benefits over the abstraction it builds upon.