
Comments (6)

avescodes commented on June 11, 2024


avescodes commented on June 11, 2024


danielcompton commented on June 11, 2024

I've been thinking about this a little bit; here are my current thoughts.

Background

In development, it is often desirable to work with a "development" database, and rework its schema multiple times before committing to a particular approach and deploying to production. Often the development database has useful working state in it which is inconvenient or slow to recreate.

Datomic doesn't allow schema to be retracted or excised, so it is not possible to completely roll back a migration the way you could in a traditional SQL database.

Idea

When conforming schemas, Conformity could (optionally) check whether the norms that were previously transacted still match their current definitions. If there is a difference, the user can choose from several strategies to bring the db into alignment. This is intended only for use in development; schema migration in production is a separate issue.

Schema strategy:

  • No check
  • Warn - Print a warning to the console (probably best for production)
  • Rename - rename all conformed attributes that have changed to a synthetic name, and then reapply the new attributes

Data strategy:

  • Warn - warn that the data is now using renamed attributes
  • Retract - retract any data that was transacted since the modified norm was applied
  • Excise - excise any data that was transacted since the modified norm was applied

I'm not sure what should happen for data that was applied in a norm. Maybe the same as any data that was transacted since the norm?

This is a little bit hazy for me, so some of the details above may not make sense or be possible, but this is the general direction I'm thinking of.
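To make that a little more concrete, here is a rough sketch of what the calling code might look like. None of these options exist in Conformity today; the keywords, and the idea of passing an options map to ensure-conforms at all, are purely illustrative:

;; hypothetical options -- not part of Conformity's current API
(require '[io.rkn.conformity :as c])

(def norms-map (c/read-resource "my-schema.edn"))

(c/ensure-conforms conn norms-map
                   {:on-schema-change :rename    ; :none, :warn, or :rename
                    :on-data-change   :retract}) ; :warn, :retract, or :excise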


kennethkalmer commented on June 11, 2024

My approach is to use datomock in my REPL and in my tests for the initial work of fleshing out the schema, with a helper function that looks more or less like this:

(require '[datomic.api :as d] '[datomock.core :as datomock])

(defn mock-conn! [conn]
  ;; fork the real connection, migrate the fork, and def it for REPL use
  (def mconn (app.db.core/migrate-schema! (datomock/fork-conn conn)))
  (def mdb (d/db mconn)))

I then iterate on a mock connection in the REPL, and the tests give me some really good feedback. This also has the benefit of validating my transactions as I go.
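For example, a quick REPL session with this helper might look something like the following (the connection var and attribute names are just placeholders):

;; purely illustrative -- real-conn and :user/nickname are placeholders
(mock-conn! real-conn)
@(d/transact mconn [{:db/ident       :user/nickname
                     :db/valueType   :db.type/string
                     :db/cardinality :db.cardinality/one}])
;; the tweak exists only on the fork, never on the real db
(d/q '[:find ?e . :where [?e :db/ident :user/nickname]] (d/db mconn))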

It is not perfect (yet). I want to improve the flow and be able to swap out the real Datomic connection with some development middleware so I can actually test my full app against the mocked/conformed database. I'm relying on my tests for this feedback, and I still find myself having to add additional transactions after I've conformed and the tests pass.


daemianmack commented on June 11, 2024

In case it's useful as a dissenting data point, I don't personally feel the anticipated benefit of this feature sufficiently offsets the increase in complexity or API surface area.

To me, it feels more like a workflow issue than a tooling one.

In the past, I've always addressed the problem of arriving at a correct norm by iterating on small, focused schema alterations via some mix of REPL testing and Datomic's mem DB before committing final PR-worthy changes to the schema description -- similar in spirit to the approach described by @kennethkalmer.

This hasn't been nearly painful enough that I've wanted extra tooling support for it.

Just wanted to throw that out there in case it helps highlight an alternative path. Happy to discuss further -- in particular, I'd be curious to see a concrete code situation that constitutes a strong pro argument for this change; perhaps I've just been lucky?


danielcompton commented on June 11, 2024

One strong use case we have is that our front-end developers run our back-end Datomic application, but don't usually work with its code much. They need a persistent database so that they can keep the state they've built up over a few days of developing a feature. When I push an update that changes an existing migration (one that hasn't made it to prod yet), I'd like Conformity to automatically migrate the data, or at least warn them that their schema isn't up to date.

Currently, I just tell them when to drop the entire DB, which works, but isn't the best, and sometimes I forget to tell them, leading to strange errors.

