
beam-monorepo's Issues

allow disabling generation of @spec on the overridable functions when using OnePiece.Commanded.ValueObject

Problem Statement

Summary

When overriding a function that's defoverridable, there's no way to override the original type spec, only add to it. This generates some really surprising behavior.

Details

All of the problems I've encountered come from something as innocuous as this:

defmodule MyErrorValue do
  use OnePiece.Commanded.ValueObject

  @type error_type() :: :busted | :broken | :angry
  @type error_reason() :: String.t()

  @type t() :: %__MODULE__{
    type: error_type(),
    reason: error_reason()
  }

  embedded_schema do
    field :type, Ecto.Enum, values: [:busted, :broken, :angry]
    field :reason, :string
  end

  @spec new(error_type()) :: t()
  def new(error_type) when is_atom(error_type) do
    %__MODULE__{type: error_type}
  end

  @spec new(error_type(), error_reason()) :: t()
  def new(error_type, error_reason) when is_atom(error_type) and is_binary(error_reason) do
    %__MODULE__{type: error_type, reason: error_reason}
  end
end

What is the spec of new/1? Most people would think it's just what's on the screen. But, in reality, it's:

@spec new(map() | :busted | :broken | :angry) :: {:ok, t()} | {:error, any()} | t()

For extra fun, what's the spec of new/2? Most people would think it's what's on the screen... and this time they'd be right. Things get even stranger if you have a default argument, since a single definition then produces both arities of the function with wildly differing specs.

I also intentionally wrote in a subtle bug: I "accidentally" messed up the return type of new/1 (it should either return a result tuple, or we should be overriding new!/1 instead). Ideally Dialyzer would notice the spec mismatch and complain. What actually happens is that you end up with a wildly permissive, totally incorrect spec. It will certainly allow more things than intended. Worse, the real success typing can't handle the parameters the spec claims to accept, and the result will only ever be one of the declared options.

For real fun, let's say you decide to accept a struct instead of an atom. Now things get really weird. This is because all structs are also maps. Since there's already a spec that takes a map, another spec that takes a struct would "overlap" the possible inputs. Dialyzer doesn't like this and complains about "overlapping domains". In this case it just discards the spec completely while emitting a rather cryptic warning.
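A minimal sketch of how the overlap arises (the module and SomeOtherStruct are hypothetical, shown only to illustrate the Dialyzer behavior described above):

```elixir
defmodule OverlapDemo do
  use OnePiece.Commanded.ValueObject

  embedded_schema do
    field :reason, :string
  end

  # The generated new/1 already carries a spec accepting map().
  # Every struct is also a map, so this spec's domain overlaps
  # with the generated one. Instead of checking both contracts,
  # Dialyzer discards them and emits an "overlapping domains"
  # warning that points at neither spec clearly.
  @spec new(SomeOtherStruct.t()) :: t()
  def new(%SomeOtherStruct{reason: reason}) do
    %__MODULE__{reason: reason}
  end
end
```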

Analysis

Many developers probably aren't equipped to understand what's going on here. The population of our community that fluently understands macros, speaks Dialyzer, and is likely to look at what __using__ is even doing here is rather small.

On one hand, you don't even know that you're doing it. If you override the function and add a new spec, the original spec is invisible unless you have some way to inspect it. Most people won't even know that something is off unless something hints at it (i.e. an IDE popup or Dialyzer error). In this way, it very much violates the Rule of Least Surprise.
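One way to actually see the hidden, merged spec is to ask the compiled BEAM file for it (run in IEx, assuming the MyErrorValue module from above has been compiled):

```elixir
# Code.Typespec.fetch_specs/1 reads every @spec attached to a
# compiled module, including the generated ones you never wrote.
# It returns {:ok, specs}, where each entry is
# {{function_name, arity}, [erlang_abstract_format_spec]}.
{:ok, specs} = Code.Typespec.fetch_specs(MyErrorValue)

for {{fun, arity}, _forms} <- specs do
  IO.puts("#{fun}/#{arity} has an explicit or generated @spec")
end
```

This is how you can confirm that new/1 carries more spec clauses than the one visible in the source file.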

Since Dialyzer will believe you if you lie to it, it will further complicate things by hallucinating that the spec is meaningful. Other times, it will notice that the success typing isn't correct either, thereby snapping back to reality in a way that isn't entirely helpful.

While the false warnings can be frustrating, the scarier thing is the absence of warnings where there should be some. They get suppressed by the odd but looser compound spec, and the developer is none the wiser.

At the point that you get to the whole "overlapping domains" and "unsupported contract" stuff, most people are just lost. The code they wrote looks simple. They think they know what the spec is, because it's "right there in the file". So far, most of the developers I've talked to had just ended up deciding that Dialyzer had been trying experimental drugs, shrugged, and moved on.

Our Issues

While it's perhaps questionable whether it's wise to change a function to an incompatible type, I've found a number of places where it's been done in our codebase, and it's pretty widespread. I would not be surprised if other people run into this as well. Unfortunately, the default response of many developers has been "just ignore Dialyzer". 😢

Solution Brainstorm

The easiest fix for me would be for OnePiece to simply not generate the specs. I think you could just take an option like disable_type_generation: true, put the @spec declarations inside an if, and it would just work. I'd be glad to code this up if you'd accept it.
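Inside __using__/1 that could look roughly like this. To be clear, this is a sketch under the assumptions above: the disable_type_generation option does not exist yet, and the spec shapes shown are the ones quoted earlier in this issue, not the actual OnePiece source:

```elixir
defmacro __using__(opts \\ []) do
  quote bind_quoted: [opts: opts] do
    # Skip emitting @spec when the caller opts out. The functions
    # themselves are still generated and remain overridable; only
    # the contracts that would be merged with an override's own
    # @spec are suppressed.
    unless Keyword.get(opts, :disable_type_generation, false) do
      @spec new(map()) :: {:ok, t()} | {:error, any()}
      @spec new!(map()) :: t()
    end

    # ... function definitions and defoverridable as before ...
  end
end
```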

One other thought I had was to move the specs to the implementations in ValueObject instead of putting them in the generated code. I'm pretty sure that Dialyzer would infer the typing through that function call and do essentially the same thing. But since the specs wouldn't be on the generated function itself, specifying one on an override would entirely replace the inferred one.
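A sketch of that shape (the apply_new/2 helper name and its body are hypothetical; the point is only where the @spec lives):

```elixir
defmodule OnePiece.Commanded.ValueObject do
  # The contract lives on the shared implementation, not on the
  # function injected into the caller's module.
  @spec apply_new(module(), map()) :: {:ok, struct()} | {:error, any()}
  def apply_new(module, attrs) do
    # ... existing changeset/cast logic ...
  end

  defmacro __using__(_opts) do
    quote do
      # No @spec here: Dialyzer infers new/1's typing through the
      # apply_new/2 call, and an explicit @spec on an override
      # fully replaces that inference instead of merging with it.
      def new(attrs) do
        OnePiece.Commanded.ValueObject.apply_new(__MODULE__, attrs)
      end

      defoverridable new: 1
    end
  end
end
```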

This also makes it possible to override those functions with incompatible versions. While that might not be wise, at least the spec wouldn't be subtly broken.

OnePiece.Commanded.Event with JsonbSerializer and events evolution

Problem Statement

Sorry for bringing this up here; I tried to find OnePiece.Commanded-related discussions on ElixirForum but didn't find any, and this is quite a specific question, so I decided to give it a shot here.

I'm trying to use OnePiece.Commanded in my new project and am evaluating the option of using its JsonbSerializer. It expects all events to be built with the Event module, or at least to have a new!/1 function.

As I understand it, the idea of the ValueObject sitting underneath the Event is to use Ecto to deserialize supported types into a runtime event struct, and this is totally legit and sound.

The question is: what approach do you think would work best in the future, when it becomes necessary to evolve events?

Solution Brainstorm

I'm not a big fan of keeping old structures around just to support legacy events; it invites a lot of confusion and mistakes during development, especially when a team is big enough. With an Upcaster protocol implementation it is possible to update some of an event's fields to the new shape, or even build a completely new current runtime representation of the event. But if we keep only the current shape of the event structure, casting breaks: either silently, when some fields just don't find their way into the pre-upcast structure, or loudly, when fields can't be cast anymore.

The original Commanded JsonDecoder protocol gives each event's implementation the power to cast specific values. But if we enforce casting right after reading an event from the event store, that won't work. And yes, it is possible to redefine the new!/1 function to do things there, but then it replaces the Upcaster protocol in many ways and requires using internal ValueObject implementation functions to continue casting.
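For reference, reshaping a legacy event with Commanded's Upcaster protocol looks roughly like this (the event module and its fields are hypothetical; only the Commanded.Event.Upcaster protocol and its upcast/2 callback are from Commanded itself):

```elixir
defimpl Commanded.Event.Upcaster, for: MyApp.Events.SomethingHappened do
  # Runs on every event read from the event store, after
  # deserialization but before the event reaches any handler.
  # Here we map a legacy nil field onto the current shape.
  def upcast(%MyApp.Events.SomethingHappened{} = event, _metadata) do
    %{event | reason: event.reason || "unknown"}
  end
end
```

The tension described above is exactly about ordering: with JsonbSerializer, Ecto casting into the current struct happens before upcast/2 ever sees the event.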

I understand that it is possible to just continue using the original setup with the Commanded JsonDecoder, but I would kindly ask you to share your thoughts on the topic, if you don't mind.
