
crystal-memcached's People

Contributors

comandeo, gdiggs, jhass, mamantoha, mauricioabreu, samfrench, teodor-pripoae


crystal-memcached's Issues

implement: cas (check and set) command

I don't know much about this feature beyond the fact that a library in another language uses it. I'm trying to offload some work using Crystal, but no current Crystal memcached library supports the check-and-set/compare-and-swap (CAS) functionality for memcached.

This was something I found along the way (ignore the site's name; it has a lot of important information at the beginning), but I've never worked much with sockets or byte-level protocols, so I'm not even sure where to start.

Hopefully that helps someone with implementing this feature.
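For reference, the memcached binary protocol's "set" request carries a 64-bit CAS field in its 24-byte header; when it is non-zero, the server applies the write only if the stored CAS matches, replying with status 0x0002 (key exists) on a mismatch. Below is a minimal sketch of that framing in plain Crystal — `cas_set_frame` is a hypothetical helper name, not part of this shard:

```crystal
# Builds a memcached binary-protocol "set" request with a CAS value.
# Header layout per the memcached binary protocol spec; `cas_set_frame`
# is a hypothetical name, not part of crystal-memcached.
def cas_set_frame(key : String, value : String, cas : UInt64,
                  flags : UInt32 = 0_u32, expiry : UInt32 = 0_u32) : Bytes
  extras = 8                                        # flags (4 bytes) + expiry (4 bytes)
  body = (extras + key.bytesize + value.bytesize).to_u32
  io = IO::Memory.new
  io.write_byte 0x80_u8                             # magic: request
  io.write_byte 0x01_u8                             # opcode: set
  io.write_bytes key.bytesize.to_u16, IO::ByteFormat::BigEndian
  io.write_byte extras.to_u8                        # extras length
  io.write_byte 0_u8                                # data type
  io.write_bytes 0_u16, IO::ByteFormat::BigEndian   # vbucket id
  io.write_bytes body, IO::ByteFormat::BigEndian    # total body length
  io.write_bytes 0_u32, IO::ByteFormat::BigEndian   # opaque
  io.write_bytes cas, IO::ByteFormat::BigEndian     # non-zero => compare-and-swap
  io.write_bytes flags, IO::ByteFormat::BigEndian
  io.write_bytes expiry, IO::ByteFormat::BigEndian
  io << key
  io << value
  io.to_slice
end

frame = cas_set_frame("counter", "42", 7_u64)
puts frame.size # => 41 (24-byte header + 8 extras + 7 key + 2 value)
```

A real implementation would also need to read the CAS value back out of bytes 16..23 of each "get" response so the caller has something to pass in here.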

The client is extremely slow

Hi! Thanks for the library!

We were implementing a memcached backend for Kiwi (a unified interface for key-value stores) and discovered that the client is, for some reason, very slow.

The following program (100 set operations) takes about 4 seconds. I tried another client (Ruby) and it executed almost immediately, so I conclude the problem is on the client side.

require "memcached"

memcached = Memcached::Client.new

start_time = Time.now
100.times do |i|
  memcached.set(i.to_s, i.to_s)
end
duration = Time.now - start_time
puts duration.to_f

Thanks!
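For what it's worth, 4 s / 100 operations is about 40 ms per set, which is the classic signature of Nagle's algorithm interacting with delayed ACKs on a write-then-read pattern. One thing worth ruling out is whether the client socket enables TCP_NODELAY; the flag itself is plain Crystal stdlib, and the throwaway local listener below exists only so the example runs without a real memcached:

```crystal
require "socket"

# Throwaway listener so the connect below succeeds without a real memcached.
server = TCPServer.new("127.0.0.1", 0)
socket = TCPSocket.new("127.0.0.1", server.local_address.port)
socket.tcp_nodelay = true   # flush small writes immediately instead of batching
puts socket.tcp_nodelay?    # => true
socket.close
server.close
```

When benchmarking, `Time.monotonic` is also preferable to `Time.now`, since wall-clock time can jump while the program runs.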

Implement key hashing

The client should be able to hash keys across multiple servers. Hashing should be consistent.
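A minimal sketch of what consistent hashing could look like — CRC32 and 100 virtual points per server are arbitrary illustrative choices, and `HashRing` is not an existing class in this shard:

```crystal
require "digest/crc32"

# Toy consistent-hash ring: each server gets many virtual points on the ring;
# a key maps to the first point at or after the key's hash, wrapping around.
# Adding or removing one server then only remaps keys near that server's points.
class HashRing
  def initialize(servers : Array(String), replicas : Int32 = 100)
    @ring = [] of {UInt32, String}
    servers.each do |server|
      replicas.times do |i|
        @ring << {Digest::CRC32.checksum("#{server}-#{i}"), server}
      end
    end
    @ring.sort_by! { |(hash, _)| hash }
  end

  def server_for(key : String) : String
    h = Digest::CRC32.checksum(key)
    entry = @ring.bsearch { |(hash, _)| hash >= h } || @ring.first # wrap around
    entry[1]
  end
end

ring = HashRing.new(["cache1:11211", "cache2:11211", "cache3:11211"])
puts ring.server_for("user:42") # same key always maps to the same server
```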

Tag New Release

First off, thank you @comandeo for this library. I'm new to Crystal and it's been very helpful to have all these nice, free shards to install.

I see someone fixed the issue I've been having the last few days back in January with commit 7049e6d, but it has not been tagged in an official release.

Should I just change my branch to master or is there a plan to incorporate this patch in a future tagged release?

Thank you.

Fails with large values in memcached

The client seems to fail when we get and set large values in memcached. This is shown by a failing test I have added:

https://github.com/samfrench/crystal-memcached/blob/large-value-test/spec/memcached/client_spec.cr#L11-L17

We tried the Dalli Ruby client to get and set the same value against the same memcached instance, and that worked fine, so the problem appears to be limited to this client.

We have looked into the code and tried to provide a fix, but we do not understand the logic around getting and setting values in memcached. We are raising this issue because it will probably be much easier for you to fix than for us.

Using memcached client with large values truncates response

Hi @comandeo, thank you for fixing #9 previously. I have been looking again at getting and setting large values in memcached and it does look like there is still an issue. All the specs are fine and pass for me, but when I use this shard in my project, I always get a truncated response.

E.g.

File called example.cr

require "memcached"

client = Memcached::Client.new("127.0.0.1", 11211)
value = "LargeValue" * 1000
client.set("LargeKey", value)
response = client.get("LargeKey")
puts response.to_s.bytesize
puts response

Then I run:

crystal run example.cr

The output I get:

10000
LargeValueLargeValueLargeValueLargeValue… (the string repeats; only 8164 of the expected 10000 characters are printed, ending mid-word with "Larg")

I would expect 10000 characters of output, since "LargeValue" is 10 characters and multiplying by 1000 gives 10000. For some reason I only get 8164 characters. Before setting the item in the cache I verified it is the full 10000 characters. It is not always exactly 8164 characters, but I never get the full response. Checking the bytesize of the string suggests it is 10000 bytes, yet I don't see them all.

I tried to add a spec for this, but it seems to be an issue only when running as part of an application. I wondered if it was the memcached image I was using, but since the specs pass with it, I think the item only gets truncated when retrieving it from the cache. Any help in this area would be appreciated.
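One pattern worth checking in the shard's socket code (a guess, not a verified diagnosis): Crystal's `IO#read(slice)` fills at most `slice.size` bytes and may return early, and a TCP read frequently stops near a socket-buffer boundary, which would fit a truncation just under 8 KB. `IO#read_fully` loops until the slice is filled. The toy `ChunkyIO` below returns at most 8 bytes per read call to make the difference visible:

```crystal
# Toy read-only IO that returns at most 8 bytes per read call,
# mimicking a socket that delivers data in pieces.
class ChunkyIO < IO
  def initialize(@data : Bytes)
    @pos = 0
  end

  def read(slice : Bytes) : Int32
    count = {slice.size, 8, @data.size - @pos}.min
    @data[@pos, count].copy_to(slice)
    @pos += count
    count
  end

  def write(slice : Bytes) : Nil
    raise IO::Error.new("read-only")
  end
end

data = Bytes.new(20) { |i| i.to_u8 }

short = Bytes.new(20)
n = ChunkyIO.new(data).read(short)  # a single read may stop early
puts n                              # => 8

full = Bytes.new(20)
ChunkyIO.new(data).read_fully(full) # loops until the slice is filled
puts full == data                   # => true
```

If the shard reads the value body with a single `read` call, switching that call to `read_fully` (sized from the response's total-body-length field) would be the natural fix.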

How to configure or set options?

Is there a way to specify options or configuration values?

Many memcached wrappers/libraries have this concept. See:

My ultimate goal is to use MemCachier on Heroku for my Crystal/Kemal app. Most of their examples recommend changing numerous settings.

From their Ruby examples:

{
  :failover => true,            # default is true
  :socket_timeout => 1.5,       # default is 0.5
  :socket_failure_delay => 0.2, # default is 0.01
  :down_retry_delay => 60       # default is 60
}

From their Python examples:

{
  # Faster IO
  'tcp_nodelay': True,

  # Keep connection alive
  'tcp_keepalive': True,

  # Timeout for set/get requests
  'connect_timeout': 2000, # ms
  'send_timeout': 750 * 1000, # us
  'receive_timeout': 750 * 1000, # us
  '_poll_timeout': 2000, # ms

  # Better failover
  'ketama': True,
  'remove_failed': 1,
  'retry_timeout': 2,
  'dead_timeout': 30,
}
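crystal-memcached has no such options today, so any design is speculative; one idiomatic Crystal shape would be a small options struct with named defaults (all names and default values below are hypothetical), which the client could map onto stdlib features such as `TCPSocket`'s `connect_timeout` argument and `read_timeout=` setter:

```crystal
# Hypothetical options struct — none of these names exist in crystal-memcached.
struct MemcachedOptions
  getter connect_timeout : Time::Span
  getter read_timeout : Time::Span
  getter tcp_nodelay : Bool

  def initialize(@connect_timeout = 2.seconds,
                 @read_timeout = 750.milliseconds,
                 @tcp_nodelay = true)
  end
end

opts = MemcachedOptions.new(read_timeout: 500.milliseconds)
puts opts.read_timeout.total_milliseconds # => 500.0

# A client constructor could then apply them, e.g.:
#   socket = TCPSocket.new(host, port, connect_timeout: opts.connect_timeout)
#   socket.read_timeout = opts.read_timeout
#   socket.tcp_nodelay  = opts.tcp_nodelay
```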
