redis-bloomfilter's Introduction

redis-bloomfilter

Requires the redis gem.

Adds the Redis::Bloomfilter class, which can be used as a distributed Bloom filter implementation on Redis.

A Bloom filter is a space-efficient probabilistic data structure that is used to test whether an element is a member of a set.
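For intuition, here is a minimal in-memory sketch of the idea (illustration only, not this gem's actual implementation): each element is hashed to a handful of bit positions, which are set on insert and checked on lookup. Lookups can yield false positives but never false negatives.

require "digest"

# Toy Bloom filter kept in a plain Ruby array, for illustration only.
class ToyBloomFilter
  def initialize(bits, hashes)
    @bits   = Array.new(bits, false)
    @hashes = hashes
  end

  def insert(element)
    positions(element).each { |i| @bits[i] = true }
  end

  # True if the element is *probably* in the set (false positives possible),
  # false if it is definitely not (no false negatives).
  def include?(element)
    positions(element).all? { |i| @bits[i] }
  end

  private

  # Derive @hashes pseudo-independent bit positions for an element.
  def positions(element)
    (1..@hashes).map do |seed|
      Digest::MD5.hexdigest("#{seed}:#{element}").to_i(16) % @bits.size
    end
  end
end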

Installation

$ gem install redis-bloomfilter

Testing

$ bundle install
$ rake

Drivers

The library provides two drivers:

  • A pure Ruby implementation
  • A server-side version based on Lua scripting, available for Redis >= 2.6

How to use

require "redis-bloomfilter"

# Creates a Bloom filter using the default Ruby driver
# Expected number of elements: 10,000
# Max error rate: 1%
# Key name on Redis: my-bloom-filter
# Redis: 127.0.0.1:6379, or pass an existing connection
@bf = Redis::Bloomfilter.new(
  :size => 10_000, 
  :error_rate => 0.01, 
  :key_name => 'my-bloom-filter'
)

# Insert an element
@bf.insert "foo"
# Check if an element exists
puts @bf.include?("foo") # => true
puts @bf.include?("bar") # => false

# Empty the BF and delete the key stored on redis
@bf.clear

# Using the Lua driver: only available on Redis >= 2.6.0
# This driver should be preferred because it is faster
@bf = Redis::Bloomfilter.new(
  :size => 10_000, 
  :error_rate => 0.01, 
  :key_name   => 'my-bloom-filter-lua',
  :driver     => 'lua'
)

# Specify a redis connection:
# @bf = Redis::Bloomfilter.new(
#   :size => 10_000, 
#   :error_rate => 0.01, 
#   :key_name   => 'my-bloom-filter-lua',
#   :driver     => 'lua',
#   :redis      => Redis.new(:host => "10.0.1.1", :port => 6380)
# )
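As a usage sketch built only on the calls shown above, the filter works well as a fast "probably seen before?" check, e.g. when deduplicating a stream of items (urls and process below are hypothetical application code):

require "redis-bloomfilter"

seen = Redis::Bloomfilter.new(
  :size       => 1_000_000,
  :error_rate => 0.01,
  :key_name   => 'seen-urls',
  :driver     => 'lua'
)

urls.each do |url|
  # include? can return a false positive (so a never-seen URL may be skipped),
  # but it never returns a false negative.
  next if seen.include?(url)
  seen.insert(url)
  process(url) # hypothetical application code
end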

Performance & Memory Usage

---------------------------------------------
Benchmarking lua driver with 1000000 items
              user     system      total        real
insert:   38.620000  17.690000  56.310000 (160.377977)
include?: 43.420000  20.600000  64.020000 (175.055146)

---------------------------------------------
Benchmarking ruby driver with 1000000 items
              user     system      total        real
insert:  125.910000  20.250000 146.160000 (195.973994)
include?:121.230000  36.260000 157.490000 (231.360137)

The Lua version is roughly 3 times faster than the pure-Ruby version.

Lua code is taken from https://github.com/ErikDubbelboer/redis-lua-scaling-bloom-filter

1,000,000 items occupy roughly 1.5 MB on Redis.
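That figure is consistent with the standard Bloom filter sizing formula m = -n * ln(p) / (ln 2)^2 bits, assuming the 1% error rate used in the examples above; a quick back-of-the-envelope check in Ruby:

n = 1_000_000 # expected elements
p = 0.01      # target error rate

bits = (-n * Math.log(p) / Math.log(2)**2).ceil
puts bits                                # => 9585059, about 9.6 bits per element
puts (bits / 8.0 / 1024 / 1024).round(2) # => 1.14, i.e. ~1.14 MB of raw bit space

The remaining difference from the measured ~1.5 MB is plausibly Redis bitmap/key overhead and rounding of the filter size by the driver.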

Contributing to redis-bloomfilter

  • Check out the latest master to make sure the feature hasn't been implemented or the bug hasn't been fixed yet.
  • Check out the issue tracker to make sure someone hasn't already requested it and/or contributed it.
  • Fork the project.
  • Start a feature/bugfix branch.
  • Commit and push until you are happy with your contribution.
  • Make sure to add tests for it. This is important so I don't break it in a future version unintentionally.
  • Please try not to mess with the Rakefile, version, or history. If you want to have your own version, or it is otherwise necessary, that is fine, but please isolate the change to its own commit so I can cherry-pick around it.

Copyright

Copyright (c) 2013 Francesco Laurita. See LICENSE.txt for further details.

Contributors

alanmccann, idoa01, taganaka

redis-bloomfilter's Issues

License missing from gemspec

Some companies will only use gems with a certain license.
The canonical and easy way to check is via the gemspec, e.g.

spec.license = 'MIT'
# or
spec.licenses = ['MIT', 'GPL-2']

There is even a License Finder to help companies ensure all gems they use
meet their licensing needs. This tool depends on license information being available in the gemspec.
Including a license in your gemspec is a good practice, in any case.

How did I find you?

I'm using a script to collect stats on gems, originally looking for download data, but I decided to collect licenses too and open issues for missing ones as a public service :)
https://gist.github.com/bf4/5952053#file-license_issue-rb-L13
So far it's going pretty well.

Hiredis Dependency

Since it's just a transport mechanism, is there any reason why the hiredis dependency is required? We are unfortunately in a situation where we can't use it, for various reasons.

Consider removing `remove` method from the driver

I recently came across a nasty bug that I believe is a combination of having a remove option and the underlying Lua implementation, which scales the Bloom filter and increments the counter on each write call.

Even if you roll the dice with an error rate, remove in a Bloom filter that flips bits seems unsafe. Any time the same bit gets flipped to 1 by two different pieces of data, you risk removing both pieces of data if you subsequently remove either one by setting that bit back to 0.

But with scaling in the mix, the issue is even thornier. Since we increment the count each time the insert function is called, any operation (insert, remove, or even an insert with duplicate data) will increment the counter. This inflates the count that determines whether a new index, i.e. a new Redis key, should be added, and in turn causes unintended scaling.

What's more, you get unintended corruption and don't actually remove data if you have started adding new keys in your scaled Bloom filter but try to remove something that was added to a previous key.

You don't know deterministically which key any one piece of data was written to, and you only ever add to the latest key.

Here's an example:

# initialize with a fixed size
size = 10
name = 'test'

# set up more elements than the configured size
ids = []
12.times { ids << SecureRandom.uuid }

bf = Redis::Bloomfilter.new(
              :size       => size,
              :error_rate => 0.001, # 0.1%
              :key_name   => name,
              :driver     => 'lua',
              :redis      => $redis
            )

# insert over size so we generate 2 buckets
ids.each { |d| bf.insert(d) }

# verify bucket count
[15] pry(main)> $redis.keys('test*')
=> ["test:1", "test:2", "test:count"]

# verify insert count (lua code increments count before each insert)
[16] pry(main)> $redis.get("test:count")
=> "12"

# all ids should be present -- confirmed
ids.each { |d| puts "INCLUDE #{d}?: #{bf.include?(d)}" }

INCLUDE 7a90b81e-9f21-47d2-a9f7-8533c8537e54?: true
INCLUDE 5b1ded55-f662-4fe7-822e-ba955e04503a?: true
INCLUDE 78ebf0f1-d8fc-4e35-9e07-9a25ef25e850?: true
INCLUDE 6070dc01-1a6d-4f35-b114-48e0ff92b61c?: true
INCLUDE 1d4b6147-c972-4433-a728-5b14f3c8b61b?: true
INCLUDE 7d1d4fc2-4137-4fba-9329-9f743a9f8a5b?: true
INCLUDE 9f76d6ff-6ca2-4a3c-a250-21dc13ffa712?: true
INCLUDE 9fdfa9d7-0c2a-4dfe-964e-7459f67eec0b?: true
INCLUDE 0674efdd-1111-4273-9927-4375656ee045?: true
INCLUDE ccb48632-1430-490b-a196-5a5da27f545d?: true
INCLUDE 43817598-7a16-40b1-a78b-881efd736576?: true
INCLUDE dacc26f9-40a4-4efd-be6c-09b3ed910b78?: true

# remove the first id, which we assume is written in bucket 1
[18] pry(main)> bf.remove(ids.first)
=> nil

# our write-to-latest theory is that it won't actually flip the bits in the first bucket -- confirmed
[19] pry(main)> bf.include?(ids.first)
=> true

# but we will increment count -- bloating the index calculation
[20] pry(main)> $redis.get("test:count")
=> "13"

# run an include check to see if one removal was enough to corrupt data in bucket 2 -- confirmed
ids.each { |d| puts "INCLUDE #{d}?: #{bf.include?(d)}" }

INCLUDE 7a90b81e-9f21-47d2-a9f7-8533c8537e54?: true
INCLUDE 5b1ded55-f662-4fe7-822e-ba955e04503a?: true
INCLUDE 78ebf0f1-d8fc-4e35-9e07-9a25ef25e850?: true
INCLUDE 6070dc01-1a6d-4f35-b114-48e0ff92b61c?: true
INCLUDE 1d4b6147-c972-4433-a728-5b14f3c8b61b?: true
INCLUDE 7d1d4fc2-4137-4fba-9329-9f743a9f8a5b?: true
INCLUDE 9f76d6ff-6ca2-4a3c-a250-21dc13ffa712?: true
INCLUDE 9fdfa9d7-0c2a-4dfe-964e-7459f67eec0b?: true
INCLUDE 0674efdd-1111-4273-9927-4375656ee045?: true
INCLUDE ccb48632-1430-490b-a196-5a5da27f545d?: true
INCLUDE 43817598-7a16-40b1-a78b-881efd736576?: true
INCLUDE dacc26f9-40a4-4efd-be6c-09b3ed910b78?: false

# clean-up the key
bf.clear

So I think for safety, you should remove the remove calls.

A nice update might be to implement something like this from the original repo where we only increment the counter if the data isn't already present. Of course, checking all the related keys would be the most thorough way to do this, but probably at the cost of performance of a full check before each insert.

Let me know if I can help with the actual implementation if this is an update you're interested in making.

Our workaround for now is removing the remove calls in our calling code and doing a check-before-insert each time.
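For reference, a minimal sketch of that workaround (the wrapper name and method here are ours, not part of redis-bloomfilter):

# Hypothetical wrapper used in calling code; it avoids remove entirely and only
# inserts when the element is not already reported as present, so the Lua
# counter is not inflated by duplicates.
class CheckedBloomFilter
  def initialize(bloomfilter)
    @bf = bloomfilter
  end

  # Returns true if the element was newly inserted, false if it was
  # (probably) already present.
  def insert_if_absent(element)
    return false if @bf.include?(element)
    @bf.insert(element)
    true
  end

  def include?(element)
    @bf.include?(element)
  end
end

Note that the check-then-insert is not atomic, so concurrent writers can still double-count; doing the membership check inside the Lua script, as suggested above, would be the more thorough fix.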

Support connection pools

It would be nice if I could pass you our Redis connection pool and your gem would check out connections as necessary to perform operations. See the connection_pool gem.
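Until that is supported natively, one possible workaround with the connection_pool gem is to check out a connection per batch of operations and pass it through the existing :redis option (this sketch assumes the filter object itself is cheap to construct):

require "connection_pool"
require "redis"
require "redis-bloomfilter"

POOL = ConnectionPool.new(:size => 5, :timeout => 5) { Redis.new(:host => "127.0.0.1", :port => 6379) }

POOL.with do |conn|
  bf = Redis::Bloomfilter.new(
    :size       => 10_000,
    :error_rate => 0.01,
    :key_name   => 'my-bloom-filter-lua',
    :driver     => 'lua',
    :redis      => conn
  )
  bf.insert "foo"
end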
