elasticsearch_autocomplete's Issues

How to index more fields of the model?

I want to index a primary_tag field with the name field, so that I don't have to hit the database after getting results from Elasticsearch. Do I need to provide custom mappings for this?
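One direction that might work (a sketch based on the custom-mapping pattern used in other issues below; the Product class and primary_tag column are purely illustrative) is to pass skip_settings and declare the extra field in your own mapping:

class Product < ActiveRecord::Base
  include ElasticsearchAutocomplete::ModelAddition

  # keep the gem's autocomplete behaviour for :name, but take over the mapping
  ac_field :name, :skip_settings => true

  model = self
  settings ElasticsearchAutocomplete::Analyzers::AC_BASE do
    mapping do
      # the gem's multi_field config for the autocomplete attribute
      indexes :name, model.ac_index_config(:name)
      # extra field stored in the index so no database lookup is needed
      indexes :primary_tag, :type => 'string', :index => :not_analyzed
    end
  end
end

Without load: true, the primary_tag value would then come back in the Elasticsearch _source rather than from the database.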

Strange Behaviour

I can't get your gem working at all.
Every time I call the ac_search method I receive:

NoMethodError (undefined method `match' for #<Tire::Search::Query:0x5bcff50 @value={}>):

And when I index everything at the beginning (rake environment tire:import CLASS='User' FORCE=true), I get:

[IMPORT] Deleting index 'nilclass_users'
[IMPORT] Creating index 'nilclass_users' with mapping:
{"user":{"properties":{"first_name":{"type":"multi_field","fields":{"first_name":{"type":"string"},"ac_first_name":{"type":"string","search_analyzer":"ac_search
","include_in_all":false,"index_analyzer":"ac_edge_ngram"},"ac_word_first_name":{"type":"string","search_analyzer":"ac_search","include_in_all":false,"index_ana
lyzer":"ac_edge_ngram_word"}}}}}}

[IMPORT] Starting import for the 'User' class

#15/15 | 100% #######################################################

Import finished in 0.04500 seconds

What is the issue here?

Strange behavior for the index prefix

Hi, Great Gem btw :) Saved me a bit of work. One snag: I noticed the following:

1.9.3p385 :003 > ElasticsearchAutocomplete.defaults
=> {:attr=>:name, :localized=>false, :mode=>:word, :index_prefix=>"nilclass"}
1.9.3p385 :004 > ElasticsearchAutocomplete.default_index_prefix
=> "treehouse"

For some reason, it's naming the index nilclass_<pluralized model name> instead of treehouse_<pluralized model name>.

Just FYI. I only noticed it because I have a resque worker that manually rebuilds the index and the names didn't match :)
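If a workaround is needed in the meantime, one option (untested; it assumes the defaults hash shown above can be mutated from an initializer before the models are loaded) is to set the prefix explicitly:

# config/initializers/elasticsearch_autocomplete.rb
# Assumption: mutating the defaults hash here takes effect before ac_field runs.
ElasticsearchAutocomplete.defaults[:index_prefix] = ElasticsearchAutocomplete.default_index_prefix
# or hard-code the app name:
# ElasticsearchAutocomplete.defaults[:index_prefix] = 'treehouse'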

Support for Mongoid

I noticed that Mongoid is not supported out of the box. The reason is that the railtie doesn't work since ActiveSupport.on_load :active_record never fires when using Mongoid.

lib/elasticsearch_autocomplete/railtie.rb

module ElasticsearchAutocomplete
  class Railtie < Rails::Railtie
    initializer 'elasticsearch_autocomplete.model_additions' do
      ActiveSupport.on_load :active_record do
        include ElasticsearchAutocomplete::ModelAddition
      end
    end
  end
end

The solution is quite simple: include ElasticsearchAutocomplete::ModelAddition in the model directly.

class Post
  include Mongoid::Document
  include ElasticsearchAutocomplete::ModelAddition
end

Perhaps it should just be documented that you can add the ModelAddition include directly to a model when using Mongoid.

Improvement Suggestion

@leschenko

How do you think we could modify the ac_search method to accept an optional filter that keeps the current user out of the result list?

I have this working in my model, but I can't figure out how to make it work with your gem.
Right now it sits in a plain Tire search, but I would like to fold it into ac_search itself, perhaps via an optional parameter passed to the method (it needs to be a method argument rather than an option, since the current user can't be accessed from inside the User model):

(1) This Tire filter excludes the current user from the results returned by Elasticsearch (it lives in the User model):

filter :ids, :values => self.without_current_user(user).map(&:id)

(2) A scope that excludes a specific user (the current user) from all users; it takes an instance of the current user (e.g. @user):

def self.without_current_user(user)
  where('users.id != ?', user.id)
end

Looking forward to hearing from you.

Best
Dinuz
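A rough sketch of what this could look like, built only from the snippets above (the ac_search_excluding method name and its signature are purely illustrative and not part of the gem's API):

class User < ActiveRecord::Base
  include ElasticsearchAutocomplete::ModelAddition

  ac_field :first_name

  def self.without_current_user(user)
    where('users.id != ?', user.id)
  end

  # Illustrative wrapper: same Tire pieces as above, with an :ids filter that
  # only admits users other than the one passed in.
  def self.ac_search_excluding(params, user, options = {})
    allowed_ids = without_current_user(user).map(&:id)
    tire.search(load: true, page: params[:page], per_page: options[:per_page] || 10) do
      query { string params[:query], default_operator: 'AND' } if params[:query].present?
      filter :ids, :values => allowed_ids
    end
  end
end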

How to specify an alternative host/port?

I am deploying to Heroku and was wondering how to specify an alternative host (bonsai.io)?
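For the host part, Tire has its own configuration block, so something along these lines in an initializer should work (the exact ENV variable name depends on the Bonsai add-on, so treat it as an assumption):

# config/initializers/tire.rb
Tire.configure do
  # assumption: BONSAI_URL holds the full cluster URL provided by the add-on
  url ENV['BONSAI_URL'] || 'http://localhost:9200'
end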

Also, is there a smooth way to import the model data from Postgres to ES without using Tire?

Thanks

Error Running Example | analyzer on field [ac_term] must be set when search_analyzer is set

I keep getting this error when I try to create the index by running the following command:

$ rake autocomplete
[ERROR] There has been an error when creating the index -- elasticsearch returned:
400 : {"error":{"root_cause":[{"type":"mapper_parsing_exception","reason":"analyzer on field [ac_term] must be set when search_analyzer is set"}],"type":"mapper_parsing_exception","reason":"mapping [autocomplete]","caused_by":{"type":"mapper_parsing_exception","reason":"analyzer on field [ac_term] must be set when search_analyzer is set"}},"status":400}
[ERROR] There has been an error when creating the index -- elasticsearch returned:
400 : {"error":{"root_cause":[{"type":"mapper_parsing_exception","reason":"analyzer on field [ac_term] must be set when search_analyzer is set"}],"type":"mapper_parsing_exception","reason":"mapping [autocomplete]","caused_by":{"type":"mapper_parsing_exception","reason":"analyzer on field [ac_term] must be set when search_analyzer is set"}},"status":400}

Thanks for your help!
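For context: this error typically appears when running the gem's Tire-era mappings against Elasticsearch 2.x or newer, where the index_analyzer mapping option (visible in the import log of another issue above) was removed and any field that sets search_analyzer must also set analyzer. A hedged sketch of a workaround using a custom mapping (the field and analyzer names follow the generated mapping, but this is not an official fix):

settings ElasticsearchAutocomplete::Analyzers::AC_BASE do
  mapping do
    # was index_analyzer in the generated mapping; ES 2.x expects analyzer
    indexes :ac_term, :type => 'string',
                      :analyzer => 'ac_edge_ngram',
                      :search_analyzer => 'ac_search',
                      :include_in_all => false
  end
end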

merge old tire search class method with elasticsearch_autocomplete

This is my old tire class method:

def self.search(params)
  tire.search(load: true, page: params[:page], per_page: 9) do
    query do
      boolean do
        must { string params[:query], default_operator: "AND" } if params[:query].present?
        must { range :published, lte: Time.zone.now }
        must { term :post_type, params[:post_type] } if params[:post_type].present?
      end
    end
    highlight :title, :description, :options => { :tag => '<strong>', :fragment_size => 170, :number_of_fragments => 5 }
    sort { by :created_at, "desc" }
    facet "posttypes" do
      terms :post_type
    end
  end
end

I would like to use these options and merge them with the analyzers from the elasticsearch_autocomplete gem.

How can I do it?

Thanks!
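One way this could be wired up (a sketch assembled from the snippets in these issues, not a verified recipe): keep ac_field for the autocomplete sub-fields and their analyzers, and keep the old query as a separate class method next to ac_search.

class Post < ActiveRecord::Base
  include ElasticsearchAutocomplete::ModelAddition

  # gem-managed autocomplete fields and analyzers
  ac_field :title

  # the old full-text search, unchanged apart from formatting
  def self.full_search(params)
    tire.search(load: true, page: params[:page], per_page: 9) do
      query do
        boolean do
          must { string params[:query], default_operator: 'AND' } if params[:query].present?
          must { range :published, lte: Time.zone.now }
          must { term :post_type, params[:post_type] } if params[:post_type].present?
        end
      end
      highlight :title, :description, :options => { :tag => '<strong>', :fragment_size => 170, :number_of_fragments => 5 }
      sort { by :created_at, 'desc' }
      facet('posttypes') { terms :post_type }
    end
  end
end

ac_search and full_search would then share the same index; only the query DSL differs.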

How can I provide the _source block to mapping?

Hi,

I am trying to limit the number of fields the search returns. I usually do this by providing the _source block like so:

mapping :_source => {
  :enabled => true,
  :includes => [] # array of fields I want to include
} do
  ...
end

Is it possible to use skip_settings: true, provide only the _source block, and keep the rest of the settings the same? Please explain how!

Cheers
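A possible sketch, borrowing the custom-mapping pattern from the multi-word facets issue below (skip_settings plus your own settings/mapping block; the Article class and field list are illustrative):

class Article < ActiveRecord::Base
  include ElasticsearchAutocomplete::ModelAddition

  ac_field :title, :skip_settings => true

  model = self
  settings ElasticsearchAutocomplete::Analyzers::AC_BASE do
    # only store/return these fields in _source
    mapping :_source => { :enabled => true, :includes => %w(title) } do
      indexes :title, model.ac_index_config(:title)
    end
  end
end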

special characters ñ or áéíóú and multi-word facets

I have a problem with special characters ñ or áéíóú.

my model:

class Car
  ac_field :name, :description, :city, :skip_settings => true

  def self.ac_search(params, options={})
    tire.search load: true, page: params[:page], per_page: 9 do
      query do
        boolean do
          must { string params[:query], default_operator: "AND" } if params[:query].present?
          must { term :city, params[:city] } if params[:city].present?
        end
      end
      filter :term, city: params[:city] if params[:city].present?
      facet "city" do
        terms :city
      end
    end
  end

end

This version works fine with special characters, e.g.:

Querying for Martin, I get all results containing Martín, martín, martin, and Martin.

With this approach, the problem is the following:

What gets indexed and faceted is individual words. E.g. a city tagged ["San Francisco", "Madrid"] ends up with three separate tags. Similarly, a query for "san francisco" (must { term 'city', params[:city] }) fails, while a query for "San" or "Francisco" succeeds. The desired behaviour is that the tag be atomic, so only a "San Francisco" (or "Madrid") tag search should succeed.

To fix this problem I created a custom mapping:

model = self
settings ElasticsearchAutocomplete::Analyzers::AC_BASE do
  mapping _source: {enabled: true, includes: %w(name description city)} do
    indexes :name, model.ac_index_config(:name)
    indexes :description, model.ac_index_config(:description)
    indexes :city, :type => 'string', :index => :not_analyzed
  end
end

With this mapping the multi-word problem is fixed, and facets on the city field now work fine:

Instead of getting the facets San and Francisco, I now get San Francisco.

The problem now is that, with this mapping in the model, the search no longer finds results with special characters, e.g.:

Querying for Martin, I only get results containing Martin and martin.

I'm using Mongoid instead of ActiveRecord.

How can I fix this problem?

Thanks!
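One possible direction (plain Elasticsearch analysis rather than anything the gem provides; the keyword_folded analyzer name is made up, and this assumes AC_BASE is a symbol-keyed settings hash that ActiveSupport's Hash#deep_merge can extend): inside the model, replace the not_analyzed city mapping with a custom analyzer built from the keyword tokenizer plus the lowercase and asciifolding filters, so the value stays atomic for facets and term queries but matches accent-insensitively.

# Sketch: assumes AC_BASE is a plain settings hash (it is passed to `settings`
# above) with symbol keys, and that Hash#deep_merge from ActiveSupport is available.
city_analysis = {
  :analysis => {
    :analyzer => {
      :keyword_folded => {
        :type => 'custom',
        :tokenizer => 'keyword',              # "San Francisco" stays one token
        :filter => %w(lowercase asciifolding) # Martín matches martin
      }
    }
  }
}

model = self
settings ElasticsearchAutocomplete::Analyzers::AC_BASE.deep_merge(city_analysis) do
  mapping _source: {enabled: true, includes: %w(name description city)} do
    indexes :name, model.ac_index_config(:name)
    indexes :description, model.ac_index_config(:description)
    indexes :city, :type => 'string', :analyzer => 'keyword_folded'
  end
end

Note that facets on an analyzed field return the analyzed tokens, so the facet value would come back as san francisco rather than San Francisco; if the original casing matters you may need a separate not_analyzed sub-field just for faceting.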

How to add a char_filter to the existing "ac_edge_ngram_full" analyzer?

These are the settings for the custom analyzer that I want to combine with the ac_edge_ngram_full analyzer:

{
    "settings" : {
        "analysis" : {
            "char_filter" : {
                "my_mapping" : {
                    "type" : "mapping",
                    "mappings" : ["(=>", ")=>", "\\u0020-\\u0020=>\\u0020", "\\u0020-=>\\u0020", "-\u0020=>"]
                }
            },
            "analyzer" : {
                "my_ngram_analyzer" : {
                    "tokenizer" : "my_ngram_tokenizer",
                    "filter" : ["lowercase", "asciifolding"],
                    "char_filter" : ["my_mapping"]
                }
            },
            "tokenizer" : {
                "my_ngram_tokenizer" : {
                    "type" : "nGram",
                    "min_gram" : "1",
                    "max_gram" : "50",
                    "token_chars": [ "letter", "digit", "whitespace", "punctuation"]
                }
            }
        }
    }
}

I have read the wiki page about custom mappings, but I could not figure out how to add the char_filter to the ac_edge_ngram_full analyzer, nor how to add an extra token_chars property to the ac_edge_ngram_full tokenizer. Also, would the code be able to handle unicode whitespace?

I believe forking the repository and making the changes would not be a good option.
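Rather than forking, one direction to experiment with (a sketch only; it assumes ElasticsearchAutocomplete::Analyzers::AC_BASE is a plain Ruby hash with symbol keys, and that both the analyzer and the tokenizer are registered under the name ac_edge_ngram_full, neither of which I have verified) is to deep-merge the extra analysis settings into what you pass to `settings`:

extra_analysis = {
  :analysis => {
    :char_filter => {
      :my_mapping => {
        :type => 'mapping',
        :mappings => ['(=>', ')=>'] # plus your whitespace/hyphen rules
      }
    },
    :analyzer => {
      # re-open the gem's analyzer definition and attach the char_filter
      :ac_edge_ngram_full => { :char_filter => %w(my_mapping) }
    },
    :tokenizer => {
      # add token_chars to the tokenizer (name assumed to match the analyzer)
      :ac_edge_ngram_full => { :token_chars => %w(letter digit whitespace punctuation) }
    }
  }
}

settings ElasticsearchAutocomplete::Analyzers::AC_BASE.deep_merge(extra_analysis) do
  mapping do
    # your field mappings, as in the custom-mapping examples above
  end
end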
