
aws-sdk-ruby-record's Issues

Update Request Parameters

Record.update! only accepts record attributes as parameters, disallowing the configuration of 'return_values' or 'return_consumed_capacity'.

To reiterate the solution I described in #59:

To handle this, I created a SimpleDelegator which delegates to an Aws::DynamoDB::Client and provides additional functionality:

  require 'delegate'

  class InterceptClient < SimpleDelegator

    attr_accessor :update_item_params

    # inject additional parameters into every update_item request
    def update_item(request)
      super(request.merge(update_item_params))
    end
  end

I then created a new_client method for generating clients (mainly exposed so it can be overridden in tests, if necessary):

  # Spawns a new client with necessary request parameter overrides.
  def self.new_client
    client = InterceptClient.new(Aws::DynamoDB::Client.new(stub_responses: Rails.env.test?))
    client.update_item_params = { return_values: 'ALL_NEW' }
    client
  end

This is severely limiting because the class and all of its instances share the same client: ALL updates must return 'ALL_NEW' attributes, and there is no way to safely change that value per call in a multi-threaded environment (we use Shoryuken to process SQS messages and read/write DynamoDB).
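
One mitigation I've considered: make the override thread-local, so each worker thread can set its own value without racing. A rough sketch against the InterceptClient above:

class InterceptClient < SimpleDelegator
  # Sketch: store the override per thread instead of per instance, so
  # concurrent Shoryuken workers can each pick their own return_values.
  def update_item_params
    Thread.current[:update_item_params] || {}
  end

  def update_item_params=(params)
    Thread.current[:update_item_params] = params
  end

  def update_item(request)
    super(request.merge(update_item_params))
  end
end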

before / after callbacks

Thanks for putting this out, it's great! One thing I could see helping is a set of before/after callbacks for save/update/etc. I'm going to start working on something (a rough sketch of what I mean is below), and wanted to be sure it wasn't already in progress. Let me know!
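
Here's a minimal sketch of the kind of hook I have in mind, doable today by prepending a module over #save (the before_save/after_save names are hypothetical, not part of aws-record):

require 'aws-record'
require 'securerandom'

# Minimal callback sketch: prepend over #save so hooks run around the
# library's own save logic.
module SaveCallbacks
  def save(opts = {})
    before_save if respond_to?(:before_save, true)
    result = super
    after_save if respond_to?(:after_save, true)
    result
  end
end

class MyModel
  include Aws::Record
  prepend SaveCallbacks
  string_attr :uuid, hash_key: true

  def before_save
    self.uuid ||= SecureRandom.uuid # e.g. assign a key if none was given
  end
end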

datetime_attr with millisecond resolution?

I was wondering if someone could point me to, or provide, an example of how to use a custom formatter for datetime_attr? I would like to store datetimes in DynamoDB with millisecond resolution, e.g. "2017-09-10T06:32:53.484Z".

Am I right that this would still work as a range key, and that BETWEEN would work when querying for rows between two millisecond-resolution datetimes? A sketch of what I'm imagining is below.
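
As I read the code, a custom marshaler only needs to respond to type_cast and serialize, and can be attached with the generic attr method; and since UTC ISO8601 strings sort lexicographically, BETWEEN on the range key should behave as expected. A sketch:

require 'aws-record'
require 'time'

# Sketch of a millisecond-resolution datetime marshaler (an assumption
# about the marshaler interface, not a documented recipe).
module MillisecondTimeMarshaler
  def self.type_cast(raw_value, options = {})
    case raw_value
    when nil, ''
      nil
    when Time
      raw_value.utc
    else
      Time.parse(raw_value.to_s).utc
    end
  end

  def self.serialize(raw_value, options = {})
    time = type_cast(raw_value, options)
    time && time.iso8601(3) # 3 fractional digits, e.g. "2017-09-10T06:32:53.484Z"
  end
end

class Event
  include Aws::Record
  string_attr :id, hash_key: true
  # For TableMigration to build the key schema, the marshaler may also need
  # to expose a DynamoDB type; for plain reads/writes this should suffice.
  attr :occurred_at, MillisecondTimeMarshaler, range_key: true
end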

Namespacing

Under the current DSL, namespacing table names for a specific environment (such as staging_2_events) requires loading an environment config in each model and explicitly setting the table name, or monkey-patching the underlying implementation.

It would be awesome if aws-record supported namespacing in an initializer or some other mechanism, similar to the way Dynamoid does, for cases like ours where we're using a different set of table names for different environments (the workaround we're considering is sketched below). Thoughts?
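
For context, the workaround we're considering looks roughly like this; TABLE_NAMESPACE is our own config value, and a supported initializer would replace this boilerplate:

# Rough workaround sketch: a mixin that applies an environment-specific
# prefix to every table name.
module NamespacedRecord
  def self.included(base)
    base.include(Aws::Record)
    prefix = ENV.fetch('TABLE_NAMESPACE') # e.g. "staging_2"
    base.set_table_name("#{prefix}_#{base.name.downcase}s") # naive pluralization
  end
end

class Event
  include NamespacedRecord
  string_attr :id, hash_key: true
end
# Event.table_name #=> "staging_2_events"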

Support for binary types? Normal Marshal classes?

We have some information we need to encrypt in our DynamoDB store. It appears there are no attr methods for declaring objects that will be of type B in DynamoDB.

I'm trying to build an appropriate marshaler, but I'm also a bit stymied by the unfamiliar abstract interface (compared to, say, Ruby's Marshal class, AR's serialization support, or DataMapper's dm-types) for these in this library. In what way does type_cast imply "read from database representation"? And why is it always called, even when writing to the database representation?

If it has to work this way, what class can my marshaler assume the raw data is in when read?

The table does not have the specified index: my_index

Hello,

I'm having trouble creating a secondary index.

In my model, I define the index with:

global_secondary_index :my_index,
  hash_key: :identity_id,
  range_key: :range_uid,
  projection: { projection_type: "ALL" }

And before running my tests I have:

migration = Aws::Record::TableMigration.new(type)
migration.create!(
  provisioned_throughput: {
    read_capacity_units: 5,
    write_capacity_units: 2
  },
  global_secondary_index_throughput: {
    my_index: {
      read_capacity_units: 5,
      write_capacity_units: 2
    }
  }
)
migration.wait_until_available

When I run my tests, I have the following error:

     Aws::DynamoDB::Errors::ValidationException:
       The table does not have the specified index: my_index

So, it seems that the migration is not actually creating the index? Do I need to do something to actually create the index?

DynamoDB Local

Is there a way to access DynamoDB Local with this gem?
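
For reference, the pattern that works (also used in the "Why become nil?" issue below) is to configure the model with a client pointed at the local endpoint:

require 'aws-record'

# Point the model at DynamoDB Local (port 8000 is its default).
local_client = Aws::DynamoDB::Client.new(
  region: 'us-east-1', # any region string; DynamoDB Local ignores it
  endpoint: 'http://localhost:8000',
  access_key_id: 'fake',
  secret_access_key: 'fake'
)

class MyModel
  include Aws::Record
  string_attr :id, hash_key: true
  configure_client(client: local_client)
end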

Projecting Specific Columns in find

As of now, Cars.find(id: 1, manufacturer_id: 1) returns the whole item. But suppose I have a structure like this:
{ id: 1, manufacturer_id: 1, style: { color: red, rim_present: true } }, and I only want the style portion of the car, not the whole record. The only option now is query, even though I already know the primary key (id, manufacturer_id) values.
Is something like Cars.find(id: 1, manufacturer_id: 1, projection_expression: 'style') possible, or something along those lines?
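
For anyone else landing here, one workaround sketch: issue a query with the full primary key and a projection_expression, since the options pass through to the underlying client:

# Workaround sketch: query by the full primary key and project only the
# "style" attribute.
Cars.query(
  key_condition_expression: '#id = :id AND #mid = :mid',
  projection_expression: '#style',
  expression_attribute_names: {
    '#id'    => 'id',
    '#mid'   => 'manufacturer_id',
    '#style' => 'style'
  },
  expression_attribute_values: { ':id' => 1, ':mid' => 1 }
).first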

Nested List Attributes & Type Casting

I've noticed that when I use a nested list, integers come back as BigDecimal even though they were PUT into the item using the N type. This means I have to type cast the items in my collection as needed. That seems fine and natural to me, but I wanted to make sure it isn't an issue, or whether there is some better direction/method to use.

For example, as it is now, my collection attribute uses an _list suffix in its name, and in my model I have a list method that iterates over the _list data and casts it accordingly. Sound good?

Known Issue: Some types do not dirty track correctly.

I've discovered this around set types, but it likely exists in other types too - I need to verify. Here's an example:

require 'aws-record'

class Model
  include Aws::Record
  string_attr     :uuid, hash_key: true
  string_set_attr :problem
end

item = Model.new
item.uuid = "uuid"
item.problem = ["one"]
item.save # Initial save

from_db = Model.find(uuid: "uuid")
from_db.dirty? # => false, which is correct
from_db.problem.add("two")
from_db.dirty? # => false, which is arguably incorrect
from_db.save # Will not save the set changes.

The problem is that dirty tracking does not detect in-place mutation of an attribute's value; it only detects when the attribute is assigned a different object. For example, this snippet is a workaround:

from_db = Model.find(uuid: "uuid")
from_db.problem = from_db.problem.dup.add("two")
from_db.dirty? # => true
from_db.save # properly saves the set

I'm working on the correct fix for this, but it should be known that this problem exists.

List of classes

The current documentation states that a list may include only:

  • Hash
  • Array
  • String
  • Numeric
  • Boolean
  • IO
  • Set
  • nil

I've been looking over the code and was hoping there was a way to define a formal class within a list. For example:

class Order
  include Aws::Record
  list_attr :line_items, of: LineItem # `of:` is the proposed interface
end

The exact interface for how this would be accomplished is up for debate, but if this is a feature within the scope of this gem I would be willing to contribute it.

"limit" on scan

My code:

Model

class Project
  include Aws::Record
  string_attr     :id, hash_key: true
  string_attr     :title
  string_attr     :description
  string_attr     :status
end

Controller

class ProjectsController < ApplicationController
    def index
     options = {:limit => params[:limit]}
     projects = Project.scan(options)
    end
end

Output
All records


Also tried

class ProjectsController < ApplicationController
    def index
     projects = Project.build_scan.limit(params[:limit]).complete!
    end
end

Output
All records

But

class ProjectsController < ApplicationController
    def index
     projects = Project.build_scan.limit(params[:limit]).complete!
     projects.page # if I remove this line, the output is all records
     projects
    end
end

Output
All records, as per the limit. Can you help explain why it works only when .page is used?

#empty? on ItemCollection always returns false

On a table that has no matches, I ran #empty? and #count expecting to get true and 0, respectively. The following is what I got from rails console.

irb(main):018:0> Resource.scan({filter_expression: "resource_id = ABC"}).empty?
[Aws::DynamoDB::Client 200 0.103658 0 retries] scan(filter_expression:"resource_id = ABC",table_name:"resources_test")

=> false
irb(main):019:0> Resource.scan({filter_expression: "resource_id = ABC"}).count
[Aws::DynamoDB::Client 200 0.067079 0 retries] scan(filter_expression:"resource_id = ABC",table_name:"resources_test")

=> 0

If I'm reading this correctly, the test cases for #empty? also always expect false.

Am I missing something here?

Table Migrations?

Hi Alex! I'm back again with another question, this time regarding table migrations. Let's say that I have a model defined called Student with a grade_level (integer, hash key) and a user_uuid (string, range key):

class Student
  include Aws::Record
  integer_attr :grade_level, hash_key: true
  string_attr :user_uuid, range_key: true
end

I want to add a list of classes to each student, but I can't do so! Let's say I want to update a student:

student = Student.find(grade_level: 10, user_uuid: 'abfag-1234-fhasdf')
student.classes = ["Spanish", "English"]
student.update!

returns an error: undefined method 'classes='

This makes sense, of course, as the point of the ORM abstraction is a way to bring some structure to models so they aren't random sets of key value pairs all over the tables. My guess is that I'm supposed to migrate the Students table to add an attribute? I went to check the documentation and the #update method said to defer to the AWS-SDK V2 documentation, which brought me this:

#update_table(options = {}) ⇒ Types::UpdateTableOutput

Modifies the provisioned throughput settings, global secondary indexes, or DynamoDB Streams settings for a given table. 

source: (https://docs.aws.amazon.com/sdkforruby/api/Aws/DynamoDB/Client.html#update_table-instance_method)

I can't find anywhere that tells me I can add/remove attributes from a table! It looks to me like the attribute CRUD operations live in #update_item, as detailed:

#update_item(options = {}) ⇒ Types::UpdateItemOutput

Edits an existing item's attributes, or adds a new item to the table if it does not already exist. You can put, delete, or add attribute values. You can also perform a conditional update on an existing item (insert a new attribute name-value pair if it doesn't exist, or replace an existing name-value pair if it has certain expected attribute values). 

source: (https://docs.aws.amazon.com/sdkforruby/api/Aws/DynamoDB/Client.html#update_item-instance_method).

I must be confused, because from my perspective, CRUD operations on models are handled on a per-item basis by the aws-sdk, but there's no setter method for new attributes in the aws-record abstraction? Your help is greatly appreciated.
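
If I'm understanding the library correctly, the answer may simply be that DynamoDB is schemaless outside of key attributes, so no table migration is needed; declaring the attribute on the model should be enough. A sketch of what I mean:

require 'aws-record'

# Sketch: non-key attributes need no migration; just declare them.
class Student
  include Aws::Record
  integer_attr :grade_level, hash_key: true
  string_attr :user_uuid, range_key: true
  string_set_attr :classes # or list_attr, if order/duplicates matter
end

student = Student.find(grade_level: 10, user_uuid: 'abfag-1234-fhasdf')
student.classes = ['Spanish', 'English']
student.save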

ValidationError could have a more helpful message

"Validation hook returned false!" leaves you wondering what caused the hook to return false. When using ActiveModel::Validations the details of what validations failed are available by errors.full_messages.

I wonder if we could optionally use the ActiveModel::Validations errors.full_messages, or create a hook that would allow a consumer to inject whatever text they wanted for the ValidationError message.

Datetime format stored and queried

When storing a UTC datetime object with date_time_marshaler.rb, the stored value is in "2017-01-01T09:05:00+00:00" format, not "2017-01-01T09:05:00Z".
This makes it necessary to manually format datetimes with the following code before using them in a query, to match what date_time_marshaler.rb does internally.

date_string = DateTime.parse(Time.at(date.to_i).utc.to_s).iso8601 #=> "2017-01-01T09:05:00+00:00" (matches the stored format)
date_string = date.iso8601 #=> "2017-01-01T09:05:00Z" (what one would naturally write; does not match)

I feel it will be helpful if we can save datetime in "2017-01-01T09:05:00Z" format, or if we can pass a datetime object in expression_attribute_values.

Conditional put

Hi! Is there any way to do a conditional put request? The case: I read some data, change it with some operations, and I want to save it back only if the data in DynamoDB is still the same as when I read it.

Maybe you could give me some advice on how to implement this with a few changes.
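
A sketch of the usual pattern via the low-level client, since the record API doesn't appear to expose condition expressions for saves; the version attribute here is my own optimistic-locking convention, not part of the gem:

require 'aws-sdk' # v2; in v3 the client lives in aws-sdk-dynamodb

client = Aws::DynamoDB::Client.new

# Optimistic-locking sketch: the write succeeds only if the version we
# originally read (1 here) is still current.
begin
  client.put_item(
    table_name: 'my_table',
    item: { 'id' => '123', 'payload' => 'new data', 'version' => 2 },
    condition_expression: '#v = :expected',
    expression_attribute_names: { '#v' => 'version' },
    expression_attribute_values: { ':expected' => 1 }
  )
rescue Aws::DynamoDB::Errors::ConditionalCheckFailedException
  # Someone else wrote first; re-read and retry, or surface the conflict.
end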

New TableMigration with local DynamoDB instance returns NoMethodError

I'm having trouble creating migrations with a local version of DynamoDB. I'm supplying TableMigration with a client that (tested successfully) connects to the local DynamoDB instance, and am using valid AWS credentials, like so:

client = Aws::DynamoDB::Client.new(
  region: 'localhost', 
  endpoint: 'http://localhost:8000', 
  access_key_id: 'goodkey', 
  secret_access_key: 'validsecret')
migration = Aws::Record::TableMigration.new(MyModel, opts: {:client => client})

I've also tried it with a valid AWS region (e.g. us-west-1). The error I receive in each instance is:

NoMethodError: undefined method `match' for nil:NilClass
  block in partition_matching_region at aws-sdk-core-2.3.8/lib/aws-sdk-core/endpoint_provider.rb:67
                                find at org/jruby/RubyEnumerable.java:633
           partition_matching_region at aws-sdk-core-2.3.8/lib/aws-sdk-core/endpoint_provider.rb:66
                       get_partition at aws-sdk-core-2.3.8/lib/aws-sdk-core/endpoint_provider.rb:54
                        endpoint_for at aws-sdk-core-2.3.8/lib/aws-sdk-core/endpoint_provider.rb:26
                             resolve at aws-sdk-core-2.3.8/lib/aws-sdk-core/endpoint_provider.rb:10
                             resolve at aws-sdk-core-2.3.8/lib/aws-sdk-core/endpoint_provider.rb:80
           block in RegionalEndpoint at aws-sdk-core-2.3.8/lib/aws-sdk-core/plugins/regional_endpoint.rb:24
                                call at aws-sdk-core-2.3.8/lib/seahorse/client/configuration.rb:64
           block in resolve_defaults at aws-sdk-core-2.3.8/lib/seahorse/client/configuration.rb:199
                                each at org/jruby/RubyArray.java:1593
                                each at aws-sdk-core-2.3.8/lib/seahorse/client/configuration.rb:57
                    resolve_defaults at aws-sdk-core-2.3.8/lib/seahorse/client/configuration.rb:198
                            value_at at aws-sdk-core-2.3.8/lib/seahorse/client/configuration.rb:194
                    block in resolve at aws-sdk-core-2.3.8/lib/seahorse/client/configuration.rb:183
                            each_key at org/jruby/RubyHash.java:1511
                                each at lib/ruby/stdlib/set.rb:306
                             resolve at aws-sdk-core-2.3.8/lib/seahorse/client/configuration.rb:183
                      apply_defaults at aws-sdk-core-2.3.8/lib/seahorse/client/configuration.rb:171
                              build! at aws-sdk-core-2.3.8/lib/seahorse/client/configuration.rb:144
                        build_config at aws-sdk-core-2.3.8/lib/seahorse/client/base.rb:68
                          initialize at aws-sdk-core-2.3.8/lib/seahorse/client/base.rb:19
                                 new at aws-sdk-core-2.3.8/lib/seahorse/client/base.rb:105
                          initialize at aws-record-1.0.0.pre.8/lib/aws-record/record/table_migration.rb:34

As you can see, I'm using JRuby, but as the library is the same it seems unlikely that this is the problem. Any thoughts? Am I setting up the local table migration correctly?
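
Follow-up for anyone who hits this: re-reading TableMigration's initializer, I believe the client belongs directly in the options hash rather than nested under an :opts key, so the custom client above was silently ignored and a default client (with no usable region) was built instead:

migration = Aws::Record::TableMigration.new(MyModel, client: client)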

Seahorse::Client::NetworkingError (SSL_connect returned=1 errno=0 state=error: wrong version number)

Hello, I am trying to run a table migration on a local instance of DynamoDB (using the Docker image). Here's what I am trying:

irb(main):035:0> client = Aws::DynamoDB::Client.new(region: 'local', access_key_id: 'foobar', secret_access_key: 'foobar')
irb(main):036:0> migration = Aws::Record::TableMigration.new(Model, client: client)
irb(main):037:1* migration.create!(
irb(main):038:2*   provisioned_throughput: {
irb(main):039:2*     read_capacity_units: 5,
irb(main):040:2*     write_capacity_units: 2
irb(main):041:1*   },
irb(main):042:2* global_secondary_index_throughput: {
irb(main):043:3*   gsi: {
irb(main):044:3*     read_capacity_units: 3,
irb(main):045:3*     write_capacity_units: 1
irb(main):046:2*   }
irb(main):047:1* }
irb(main):048:0> )
Seahorse::Client::NetworkingError (SSL_connect returned=1 errno=0 state=error: wrong version number)

The local instance of DynamoDB is running, since I am able to query it from my terminal.

I can't tell what is wrong here. Would appreciate any guidance!

Derived Column Values

Question: is there a good way to have columns whose values are derived from other columns, and to have those values be correctly populated for all interfaces? For example:

class MyRecord
  include Aws::Record

  integer_attr :pid, hash_key: true
  integer_attr :shd, derived_from: :pid { |pid| pid % 100 } # proposed syntax
end

`derived_from` would take an array of fields and would execute the block when any referenced column changes, passing their values to the block either separately or as a hash.

Then one would expect:

record = MyRecord.new(pid: 1234)
record.shd #=> 34

My work-around for this was along the lines of:

def pid=(value)
  set_attribute(:pid, value) # copied from Aws::Record generated method since 'super' doesn't exist.
  self.shd = pid % 100
  value
end

Which works for the simple create/save case:

MyRecord.new(pid: 1234).save!
record = MyRecord.find(pid: 1234)
record.shd #=> 34

But, as I just discovered, does not work for updates (because no instance is ever involved):

MyRecord.update(pid: 3456) # upserts a new record
record = MyRecord.find(pid: 3456)
record.shd #=> nil

Would love to see a generalized way of doing this; seems pertinent for dynamically generating GSIs and having everything play nicely...

My current workaround will be (though now I need a base record class rather than a module so that super actually works):

def update(opts)
  inject_default_opts!(opts)
  super(opts)
end

No link to example app

The documentation is ok, but an example Ruby on Rails 5 app with migrations would be truly awesome.

I imagine such an example exists somewhere. How else did you test this? :)

Can you please add a link to it in the README?

Batch write support

It appears our application is going to need batch write support; we're only seeing about 70 WOPS (write operations per second) out of our application, and our design goal for this version is about 250 (the table is provisioned for 1500 at the moment).

What's your best suggestion for faking it until aws-record can do it?
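
Until it's supported natively, the least-bad fake I've found is the low-level client's batch_write_item; a sketch, assuming each record's to_h returns its attribute hash ('my_table' and records are placeholders):

client = Aws::DynamoDB::Client.new

# Batch-write 25 items at a time (the BatchWriteItem cap), retrying
# whatever comes back unprocessed.
put_requests = records.map do |record|
  { put_request: { item: record.to_h } }
end

put_requests.each_slice(25) do |batch|
  request_items = { 'my_table' => batch }
  until request_items.empty?
    resp = client.batch_write_item(request_items: request_items)
    request_items = resp.unprocessed_items # empty hash once everything lands
  end
end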

instantiate method

While working with aws-record I found it extremely helpful to define an instantiate method:

# @param [Hash] attrs
#
# @return [Aws::Record]
def instantiate(attrs)
  record = new
  data = record.instance_variable_get('@data')
  attrs_for_extraction = attrs.with_indifferent_access # requires ActiveSupport

  attributes.each do |name, attr|
    data[name] = attr.extract(attrs_for_extraction)
  end

  record
end

Very similar to _build_items_from_response in item_collection.rb.

Whenever I work with the low-level dynamodb_client, I end up using MyRecord.instantiate to build records from the response.

Would be nice to have an officially maintained method like instantiate as part of the public API.

Consistent read in `find`?

As far as I can tell, the request_options gets nothing from the hash we pass in, so there's no way to specify that a find should be consistent. This definitely causes problems in our test environment, where we get "heisenspecs": tests that sometimes pass and sometimes fail, depending on when "eventually" happens. I'd be happy to work on a patch... but given that the arguments to find are a hash, it would mean introducing reserved words (consistent_read) in a place that doesn't obviously have to worry about them now.
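
In the meantime, the workaround we use in specs is dropping to the low-level client for reads that must be consistent; turning the raw hash back into a record needs something like the instantiate helper proposed elsewhere in this tracker:

# Workaround sketch: a consistent GetItem via the low-level client.
resp = MyModel.dynamodb_client.get_item(
  table_name: MyModel.table_name,
  key: { 'id' => 'abc' },
  consistent_read: true
)
raw_item = resp.item # a plain attribute hash, not an Aws::Record instance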

Survey/Question: Are you using TableMigration? How?

I've gotten some feedback about the Aws::Record::TableMigration class. Along those lines, I'm working on enhancements to that library. I'd like to hear what is and is not going well with that class from any current users.

Alternatively, if you're only integrating with existing DynamoDB tables, are the model classes giving you all the functionality you need to do this? If not, where is it falling short?

The '#find' command doesn't properly populate database_attribute_name if symbolized

To reproduce, consider the following class:

class FindTestTable
  include Aws::Record

  string_attr :hash, hash_key: true, database_attribute_name: :hk
  string_attr :range, range_key: true, database_attribute_name: :rk
end

This class will work for creating new items and saving them, but if you run the following, you'll have a problem:

item = FindTestTable.find(hash: 'validh', range: 'validr')
item.to_h # will have nil values for both keys

This works correctly if you stringify the values, as is documented:

class FindTestTable
  include Aws::Record

  string_attr :hash, hash_key: true, database_attribute_name: 'hk'
  string_attr :range, range_key: true, database_attribute_name: 'rk'
end

I think this should be enhanced to properly read from the database when you use symbols for the database_attribute_name; it was likely an oversight on my part that it didn't to begin with. I also think this is likely a decent first issue for someone looking to get familiar with the deserialization logic.

Survey/Question: Add a lightweight, optional validation module?

Version 1.0.0.pre.7 removed the built-in #valid? and #errors methods, directing you to validation libraries like ActiveModel::Validations instead. A question to active followers/users: Does anybody find this to be a difficult approach to work with?

ActiveModel provides a rich set of validations, and re-implementing them seems wasteful. Some will also enjoy keeping the library lightweight by not always requiring ActiveModel when it isn't needed for your use case. However, I do see a potential middle ground: does anybody need some validation functionality, but can't or won't use an outside library?

I want to hear about those use cases, if they exist. And if they do, I'm open to ideas for what you would need in an optional module that provides validation hooks.

Configuration with a Single Table

I've been integrating this library into a new Ruby application running on AWS Lambda. Overall, I'm new to Lambda and DynamoDB, so I apologize if some of this is obvious.

Per the DynamoDB docs, it seems that a well-structured application should include only one table. I've reviewed some AWS documentation and videos on how to structure keys and indexes so that a single table can hold a variety of information while still maintaining the ability to query for distinct pieces of data, rather than having separate User, Product, and Order tables.

However, my reading of this library and some examples seems to imply the use of several tables: in particular, defining specific properties/indexes on the model, and passing that model as an argument to TableMigration.

Is the expectation that this library is used with a single-table application?

Sub-par validations

Hey guys, first off, thank you for making this gem! I love the fact that your architecture is true to DynamoDB, instead of forcing a fit with the ActiveRecord API.

I've started using aws-record in a new project and so far so good. The one thing that I'm unhappy with are validators.

Validators must define the validate(value) method. This is ok for very simple use cases, like validating the presence of a value, ensuring the value falls within a certain range, etc.

Unfortunately, this structure makes it impossible to handle more complex validation scenarios. For example, let's say I have a record with two fields: ingredient and quantity. If I want to validate that quantity falls within a specified range for a given ingredient, I have no way of doing this.

IMO you guys should either:

  1. Not support validations and leave those up to the user.
  2. Pass the record itself as an argument to the validate method. This seems like more of a hack than a proper solution, since now it no longer makes sense to define "validators" at the field level.
  3. Take inspiration from ActiveRecord validators, and define validations at the record level.

Why become nil?

Thank you for making a wonderful Gem!

During development, I use local DynamoDB.
Data saves to DynamoDB, but scan returns nil for some reason.

require 'aws-record'

class Transaction
  include Aws::Record
  set_table_name "hoges"

  string_attr :id, hash_key: true
  boolean_attr :processed

  local_client = Aws::DynamoDB::Client.new(region: "localhost", endpoint: "http://localhost:8000")
  self.configure_client(client: local_client)
end

scan = Transaction.scan(
  expression_attribute_names: {"#p": "processed"},
  expression_attribute_values: {":val": {"BOOL": false}},
  filter_expression: "#p = :val",
)

puts scan # nil
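
A hedged guess at the cause: with the SDK's simple attribute translation, expression_attribute_values take plain Ruby values rather than typed maps like {"BOOL": false}, and an ItemCollection is lazily enumerated rather than printable with puts. Something like this may behave better:

scan = Transaction.scan(
  expression_attribute_names: { '#p' => 'processed' },
  expression_attribute_values: { ':val' => false }, # plain value, not {"BOOL": false}
  filter_expression: '#p = :val'
)

scan.each { |transaction| puts transaction.id }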

Attribute's default_value can be modified

It's possible to modify an attribute's default_value and have those modifications carry over to new instances of the model.

Example:

class MyModel
  include Aws::Record

  map_attr :things, default_value: {}
end
> MyModel.new.things[:foo] = :bar
=> :bar
> MyModel.new.things
=> {:foo=>:bar}
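
Until this is fixed, a defensive workaround is to avoid shared mutable defaults and hand each instance a fresh object:

class MyModel
  include Aws::Record

  map_attr :things # no default_value, so no hash is shared across instances

  def initialize(attrs = {})
    super
    self.things ||= {} # each instance gets its own empty hash
  end
end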

Is query limit supported?

I'm having some trouble limiting the results of a query on an index. I keep trying to impose a limit of 50, but I still get all the results that match the index query. Is limit supported, or am I missing something?

Post.query(
  index_name: "blog_id-posted_time-index",
  key_condition_expression: "#H = :h AND #R < :r",
  limit: 50,
  expression_attribute_names: {
    "#H" => "blog_id",
    "#R" => "posted_time"
  },
  expression_attribute_values: {
    ":h" => blog_uuid,
    ":r" => timestamp
  }
)

Where Post includes Aws::Record.

Record Instance Update

Right now save (or save!) encapsulates the selection between a put and an update, with the ability to force put. I recently wanted a "find or create" method, and naively did:

record = MyRecord.find(key: 123)
record ||= MyRecord.new(key: 123)
...
record.save! #=> Conditional write exception

I end up receiving the conditional write exception because some other process is trying to create the same record.

To fix this, I moved to using an update:

MyRecord.update(key: 123, value: 'some new data')

However, we have a GSI which is dynamically generated from the record's values, and update never takes that into account (see issue #58). Thus, the preferred solution would be:

MyRecord.new(key: 123, value: 'some new data').upsert!

To achieve this, I copied the else branch from ItemOperations#_perform_save, which builds an update request for dirty columns and creates the record if it does not already exist; hence upsert.

Batch get support

I was diving into the code and I see methods for #scan and #query, but not one for #batch_get_item. Is that something you'd like to see added at the ItemRecord level, or should clients subclass and add methods as appropriate? In that case, will methods like #_build_item_from_resp or #_build_items_from_response become public?

I was thinking of a method like #find_items that would take an array of primary key hashes and handle unprocessed keys appropriately (I've worked at a low level with the Ruby SDK V1, so I'm not sure whether the V1 or V2 client handles unprocessed items appropriately).

Happy to send a PR if this isn't on a roadmap.
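
For reference, the workaround I'm using today via the low-level client (minus the unprocessed-keys retry loop a real #find_items would need):

# Workaround sketch: raw BatchGetItem keyed to the model's table.
resp = MyRecord.dynamodb_client.batch_get_item(
  request_items: {
    MyRecord.table_name => {
      keys: [{ 'id' => '1' }, { 'id' => '2' }]
    }
  }
)
raw_items = resp.responses[MyRecord.table_name] # plain hashes, not records
# resp.unprocessed_keys should be re-requested until it comes back empty.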

Rspec testing/mocks

Hi, I just started using this gem in a Rack (Sinatra) project and I love it! I'm starting to write some rspec tests and was wondering what's the best way to stub the calls to Aws::Record.

For example, my model looks like this:

    class Resource
      include Aws::Record
      configure_client client: ddb_client

      string_attr   :id, hash_key: true
      string_attr   :type
      string_attr   :status
      map_attr      :data
    end

In my code I have something like this:

Resource.find(id: uuid)

What's the best way to stub that call (and possibly others, e.g. scan or query)?
I guess I could do this, but I'm not sure how to set up the Record object:

expect(Resource).to receive(:find).and_return(record)

Any examples would be appreciated!
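
One approach that avoids mocking aws-record itself is the SDK's built-in response stubbing combined with configure_client, so find exercises the real marshaling path; a sketch (the attribute values are made up):

# Stub at the client layer so Resource.find runs against canned data.
stub_client = Aws::DynamoDB::Client.new(stub_responses: true)
stub_client.stub_responses(:get_item, item: {
  'id'     => 'uuid-123',
  'type'   => 'document',
  'status' => 'active'
})
Resource.configure_client(client: stub_client)

resource = Resource.find(id: 'uuid-123')
expect(resource.status).to eq('active')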

ability to conditionally update capacity

We have a rails application that uses dynamo. It's useful to have table definitions as migrations to ensure developers have up-to-date schemas. However, changing capacity is something that belongs in configuration management (e.g. terraform) rather than in my application code.

One solution is to have a flag in TableConfig that says, "only set capacity on table creation". Slowly working through that in this branch: https://github.com/NoRedInk/aws-sdk-ruby-record/tree/optionally-update-throughput

Why was this gem created?

Hi,

I'm going to start a rather big project using Ruby and DynamoDB. I first stumbled upon this gem: https://github.com/Dynamoid/Dynamoid and then came across this one too. Now I'm wondering: why was this one created a year ago, when Dynamoid was already around? Is there any particular reason that could guide me towards which one I should use... :-)

Query by time?

It appears the attribute marshalers know about date-time, but the query stack does not. What's the best workaround for now? I'd like to make sure I marshal exactly like the attribute marshalers do, so it appears I should reach deeply into the Marshaler::DateTime class to get my value translated?

With a hash key of "data_source_id" and a sort key of "input_file_mtime", I try a query like:

Model:

class Thang
  include Aws::Record
  set_table_name 'thangs'
  integer_attr :data_source_id, hash_key: true
  datetime_attr :input_file_mtime, range_key: true
end

Query:

key_condition_expression:
  '#data_source_id = :data_source_id AND #input_file_mtime = :timestamp',
expression_attribute_names: {
  '#data_source_id' => 'data_source_id',
  '#input_file_mtime' => 'input_file_mtime',
},
expression_attribute_values: {
  ':timestamp' => opts[:input_file_mtime],
  ':data_source_id' => opts[:data_source_id],
},

Result:

     ArgumentError:
       unsupported type, expected Hash, Array, Set, String, Numeric, IO, true, false, or nil, got Time
     # /home/rcobb/.rvm/gems/ruby-2.1.5@stone/gems/aws-sdk-core-2.6.5/lib/aws-sdk-core/dynamodb/attribute_value.rb:47:in `format'
     # /home/rcobb/.rvm/gems/ruby-2.1.5@stone/gems/aws-sdk-core-2.6.5/lib/aws-sdk-core/dynamodb/attribute_value.rb:16:in `marshal'
     # /home/rcobb/.rvm/gems/ruby-2.1.5@stone/gems/aws-sdk-core-2.6.5/lib/aws-sdk-core/plugins/dynamodb_simple_attributes.rb:191:in `translate'
     # /home/rcobb/.rvm/gems/ruby-2.1.5@stone/gems/aws-sdk-core-2.6.5/lib/aws-sdk-core/plugins/dynamodb_simple_attributes.rb:185:in `block in map'
     # /home/rcobb/.rvm/gems/ruby-2.1.5@stone/gems/aws-sdk-core-2.6.5/lib/aws-sdk-core/plugins/dynamodb_simple_attributes.rb:184:in `each'
     # /home/rcobb/.rvm/gems/ruby-2.1.5@stone/gems/aws-sdk-core-2.6.5/lib/aws-sdk-core/plugins/dynamodb_simple_attributes.rb:184:in `with_object'
     # /home/rcobb/.rvm/gems/ruby-2.1.5@stone/gems/aws-sdk-core-2.6.5/lib/aws-sdk-core/plugins/dynamodb_simple_attributes.rb:184:in `map'
     # /home/rcobb/.rvm/gems/ruby-2.1.5@stone/gems/aws-sdk-core-2.6.5/lib/aws-sdk-core/plugins/dynamodb_simple_attributes.rb:201:in `translate_complex'
     # /home/rcobb/.rvm/gems/ruby-2.1.5@stone/gems/aws-sdk-core-2.6.5/lib/aws-sdk-core/plugins/dynamodb_simple_attributes.rb:193:in `translate'
     # /home/rcobb/.rvm/gems/ruby-2.1.5@stone/gems/aws-sdk-core-2.6.5/lib/aws-sdk-core/plugins/dynamodb_simple_attributes.rb:166:in `block in structure'
     # /home/rcobb/.rvm/gems/ruby-2.1.5@stone/gems/aws-sdk-core-2.6.5/lib/aws-sdk-core/plugins/dynamodb_simple_attributes.rb:165:in `each'
     # /home/rcobb/.rvm/gems/ruby-2.1.5@stone/gems/aws-sdk-core-2.6.5/lib/aws-sdk-core/plugins/dynamodb_simple_attributes.rb:165:in `with_object'
     # /home/rcobb/.rvm/gems/ruby-2.1.5@stone/gems/aws-sdk-core-2.6.5/lib/aws-sdk-core/plugins/dynamodb_simple_attributes.rb:165:in `structure'
     # /home/rcobb/.rvm/gems/ruby-2.1.5@stone/gems/aws-sdk-core-2.6.5/lib/aws-sdk-core/plugins/dynamodb_simple_attributes.rb:152:in `apply'
     # /home/rcobb/.rvm/gems/ruby-2.1.5@stone/gems/aws-sdk-core-2.6.5/lib/aws-sdk-core/plugins/dynamodb_simple_attributes.rb:121:in `translate_input'
     # /home/rcobb/.rvm/gems/ruby-2.1.5@stone/gems/aws-sdk-core-2.6.5/lib/aws-sdk-core/plugins/dynamodb_simple_attributes.rb:111:in `call'
     # /home/rcobb/.rvm/gems/ruby-2.1.5@stone/gems/aws-sdk-core-2.6.5/lib/aws-sdk-core/plugins/param_converter.rb:20:in `call'
     # /home/rcobb/.rvm/gems/ruby-2.1.5@stone/gems/aws-sdk-core-2.6.5/lib/aws-sdk-core/plugins/response_paging.rb:26:in `call'
     # /home/rcobb/.rvm/gems/ruby-2.1.5@stone/gems/aws-sdk-core-2.6.5/lib/seahorse/client/plugins/response_target.rb:21:in `call'
     # /home/rcobb/.rvm/gems/ruby-2.1.5@stone/gems/aws-sdk-core-2.6.5/lib/seahorse/client/request.rb:70:in `send_request'
     # /home/rcobb/.rvm/gems/ruby-2.1.5@stone/gems/aws-sdk-core-2.6.5/lib/seahorse/client/base.rb:207:in `block (2 levels) in define_operation_methods'
     # /home/rcobb/.rvm/gems/ruby-2.1.5@stone/gems/aws-record-1.0.1/lib/aws-record/record/item_collection.rb:79:in `items'
     # /home/rcobb/.rvm/gems/ruby-2.1.5@stone/gems/aws-record-1.0.1/lib/aws-record/record/item_collection.rb:40:in `each'
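
The workaround that eventually worked for me was serializing the value into the same string format the datetime marshaler stores (see the "Datetime format stored and queried" issue above), instead of handing the query a raw Time:

require 'date'

# Workaround sketch: convert the Time to the marshaler's stored format
# (a UTC ISO8601 string, e.g. "2017-01-01T09:05:00+00:00") before querying.
timestamp = opts[:input_file_mtime].getutc.to_datetime.iso8601

Thang.query(
  key_condition_expression:
    '#data_source_id = :data_source_id AND #input_file_mtime = :timestamp',
  expression_attribute_names: {
    '#data_source_id' => 'data_source_id',
    '#input_file_mtime' => 'input_file_mtime',
  },
  expression_attribute_values: {
    ':timestamp' => timestamp,
    ':data_source_id' => opts[:data_source_id],
  }
)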

Convert Date/Time to utc and iso8601 string when using datetime_attr

I've used the datetime_attr of aws-record, but since this type is not available in DynamoDB, it is converted to a string.

Yet, it is not really convenient to store dates without any convention in DynamoDB, because I want to sort by date. Is it possible to convert dates/times to UTC and ISO8601 before storing them in DynamoDB, or to create a new attribute type that does so (something like utc_time_attr)?

It is really easy to convert to this format using Ruby:

Time.parse(my_date).utc.iso8601

Something like this could work:

require 'time'

module UtcTimeMarshaler
  def self.type_cast(raw_value, options = {})
    case raw_value
    when nil
      nil
    when ''
      nil
    when Integer
      Time.at(raw_value).utc.iso8601
    when Time
      raw_value.utc.iso8601
    when DateTime
      raw_value.to_time.utc.iso8601
    else
      Time.parse(raw_value.to_s).utc.iso8601
    end
  end

  def self.serialize(raw_value, options = {})
    time_string = type_cast(raw_value)
    case time_string
    when nil
      nil
    when String
      time_string # type_cast above already returns a UTC ISO8601 string
    else
      msg = "expected a Time-like value or nil, got #{raw_value.class}"
      raise ArgumentError, msg
    end
  end
end

Nested attributes / associations?

Hi, I'm using DynamoDB to store some user metadata. The idea is that I can query for a single user ID to find sets of data about that user. Think of a soundcloud-like app where users have different playlists of songs: "Electronic", "Shower Music", "Graduation Playlist".

The document would look something like this:

{
  "uuid": 87,
  "playlists": [
    {
      "name": "Electronic",
      "songs": [12, 17, 33, 24, 49]
    },
    {
      "name": "Shower Music",
      "songs": [4, 15, 7]
    },
    {
      "name": "Graduation Playlist",
      "songs": [42]
    }
  ]
}

However, after digging through your documentation/source code (maybe I didn't dig deep enough?), I can't seem to find anywhere that gives the ability to define nested hashes by default on the models. Is this intended, i.e. models should only have single-level attributes? The solution that comes to mind for me is associations, but I don't think aws-record supports associations (correct me if I'm wrong).

What's the best way to move forward on this?
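
For completeness, the closest I've gotten is map_attr / list_attr, which do round-trip nested hashes and arrays even though the nesting isn't modeled as first-class attributes or associations; a sketch:

class UserMeta
  include Aws::Record
  integer_attr :uuid, hash_key: true
  list_attr :playlists # elements can themselves be nested hashes

  # Plain-Ruby convenience accessor; no schema awareness involved.
  def playlist(name)
    (playlists || []).find { |p| p['name'] == name }
  end
end

user = UserMeta.new(uuid: 87)
user.playlists = [
  { 'name' => 'Electronic', 'songs' => [12, 17, 33, 24, 49] },
  { 'name' => 'Shower Music', 'songs' => [4, 15, 7] }
]
user.save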

If you forget to create a schema, weird error messages ensue

I am a stupid idiot, and I forgot to add hash_key: true to the actual, uh, hash_key. This is a dumb thing that dumb people do and I lost hours from it due to my own stupidity.

However, the library never told me "Hey, idiot, I don't know how to handle your aws-record object because you never told me what your actual keys were." That would've been nice :(

I suspect, also, that you can probably inadvertently specify two hash keys, and the library won't care.

Ideally, when the class is first created might be the best time to check for this, but I don't think you ever get a signal that 'oh, yeah, the user is finished defining their class'.

So instead, maybe we could put something in the save() method: when it walks through the keys to see if any of them have changed (to decide whether this is a 'new record' or an 'update'), it could use the following algorithm (sketched in code after the list):

  1. Check to see if a temporary variable (that was initialized to false) is true, if so skip this check. Something like schema_checked?
  2. Otherwise, we look for exactly-only-one hash key, and optionally, perhaps only-one range key. More than one range key or not exactly one hash key should throw an exception, something like Aws::Record::SchemaNotSpecified or Aws::Record::SchemaInvalid, ideally with some helpful further information like "No Hash Key defined" or "More than one Hash Key defined" or "More than one Range Key Defined".
  3. Set schema_checked? to true, so this code doesn't get run again.

Perhaps that optimization isn't necessary, but lots of people who use DynamoDB use it for performance reasons, and so if they're really beating up on the DB, then repeatedly walking through the defined schema seems wasteful.
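
To make that concrete, a rough sketch of steps 1-3; the attribute introspection here is hypothetical, and the error classes are the ones proposed above:

def check_schema!
  return if @schema_checked # step 1: skip once validated

  # step 2: hypothetical introspection over the declared attributes
  hash_keys  = self.class.attributes.count { |_name, attr| attr.hash_key? }
  range_keys = self.class.attributes.count { |_name, attr| attr.range_key? }

  if hash_keys == 0
    raise Aws::Record::SchemaNotSpecified, 'no hash key defined'
  elsif hash_keys > 1
    raise Aws::Record::SchemaInvalid, 'more than one hash key defined'
  elsif range_keys > 1
    raise Aws::Record::SchemaInvalid, 'more than one range key defined'
  end

  @schema_checked = true # step 3: don't run the walk again
end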

How to inherit common attributes from a base class?

I have 2 models that share a number of attributes, but have some of their own. Is there any way I can use this library to do this?

If I put the include on the super class:

class Animal
  include Aws::Record
  string_attr :name
  integer_attr :age
end

class Dog < Animal
  boolean_attr :family_friendly
end

class Cat < Animal
  integer_attr :num_of_wiskers
end

Then either Dog.new or Cat.new raises:

NoMethodError: undefined method `register_attribute' for nil:NilClass

If I move include Aws::Record into the subclasses, loading Animal stops working.

If I include Aws::Record in all 3 classes, the subclasses don't inherit attributes because of the way Aws::Record::Attributes.included works.

I can make this work using:

class Animal
  def self.attributes
    { name: 'string', age: 'integer'}
  end
end

class Cat < Animal
  include Aws::Record
  Animal.attributes.each_pair do |name, type|
    send("#{type}_attr", name)
  end
  integer_attr :num_of_wiskers
end

class Dog < Animal
  include Aws::Record
  Animal.attributes.each_pair do |name, type|
    send("#{type}_attr", name)
  end
  boolean_attr :family_friendly
end

but doing this feels like having to work around the library's limitations. Is there a better way?
