elastic / elasticsearch-js
Official Elasticsearch client library for Node.js
Home Page: https://ela.st/js-client
License: Apache License 2.0
I am using v0.10.25 and am still getting this. It was working fine two days ago.
Elasticsearch ERROR: 2014-03-19T08:05:18Z
Error: Request error, retrying -- connect ECONNREFUSED
at Log.error (/home/ajay/node_modules/elasticsearch/src/lib/log.js:213:60)
at checkRespForFailure (/home/ajay/node_modules/elasticsearch/src/lib/transport.js:185:18)
at HttpConnector.<anonymous> (/home/ajay/node_modules/elasticsearch/src/lib/connectors/http.js:150:7)
at ClientRequest.bound (/home/ajay/node_modules/elasticsearch/node_modules/lodash-node/modern/internals/baseBind.js:56:17)
at ClientRequest.EventEmitter.emit (events.js:95:17)
at Socket.socketErrorListener (http.js:1547:9)
at Socket.EventEmitter.emit (events.js:95:17)
at net.js:441:14
at process._tickCallback (node.js:415:13)
Elasticsearch WARNING: 2014-03-19T08:05:18Z
Unable to revive connection: http://127.0.0.1:9200/
Elasticsearch WARNING: 2014-03-19T08:05:18Z
No living connections
{ message: 'No Living connections' }
elasticsearch cluster is down!
I'm having trouble getting the client to reconnect to my host once it has lost connectivity; it simply isn't reconnecting.
Here are the steps I'm using to simulate the error:
A warning is logged on each failed request made with the client:
Unable to revive connection: https://user:pass@ip
I can see from the warning log that it is making HEAD re-connect requests, but each response comes back undefined, even though internet connectivity has been restored and the ES instance itself is up, working fine, and receiving the HEAD requests (as evidenced by its own logs).
Here's the log trace output from each retry attempt:
Method:
HEAD
Request Params:
{ method: 'HEAD',
protocol: 'https:',
auth: 'user:pass',
hostname: '123.123.123.123',
port: 443,
path: '/',
headers: undefined,
agent:
{ options: { maxSockets: 11, minSockets: 10, rejectUnauthorized: false },
requests: {},
sockets: { '123.123.123.123:443': [] },
freeSockets: { '123.123.123.123:443': [] },
maxSockets: 11,
minSockets: 10,
_events: { free: [Function] } } }
Body: undefined
responseBody: undefined
responseStatus: undefined
This is the client config:
{
  sniffAfterConnectionFault: true,
  maxRetries: 100,
  deadTimeout: 1000
}
I am using a custom HTTP connector, but it isn't doing much:
var ESHttpConnector = require('elasticsearch/src/lib/connectors/http');
var util = require('util');
var CustomHttpConnector = function (host, config) {
ESHttpConnector.call(this, host, config);
};
util.inherits(CustomHttpConnector, ESHttpConnector);
CustomHttpConnector.prototype.makeAgentConfig = function(config) {
return {
maxSockets: config.maxSockets,
minSockets: config.minSockets,
rejectUnauthorized: config.rejectUnauthorized
};
};
module.exports = CustomHttpConnector;
Can you confirm that the ES client will attempt to reestablish lost connections? Is there something else I need to do to get it working?
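For reference, a minimal sketch gathering the revival-related options from the config above into one place. This is a hedged summary, not a confirmed fix:

```javascript
// A sketch of the retry/revival options from the config above gathered into
// one object. Option names are copied from the snippet above -- check them
// against the docs for your client version, since an unrecognized option may
// be silently ignored.
function makeRevivalConfig(host) {
  return {
    host: host,
    sniffAfterConnectionFault: true, // re-sniff the cluster after a failure
    maxRetries: 100,                 // per-request retries across connections
    deadTimeout: 1000                // ms before a dead connection is re-pinged
  };
}

// var elasticsearch = require('elasticsearch');
// var client = new elasticsearch.Client(makeRevivalConfig('https://user:pass@host'));
var config = makeRevivalConfig('https://user:pass@host');
```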
I noticed there were commits for versions 1.2.0 and 1.3.0, but no official tags or zip/tarball releases have been published?
Will 1.3.0 have bower support as per issue #6 ?
We currently use grunt with grunt-connect-proxy for local development so that we can code our front end to use relative path AJAX requests to avoid cross site request issues.
When we deploy the application to other environments we simply configure the load balancer to perform the same type of proxying of relative path requests to other servers/ports server side.
I'm attempting to configure the esFactory as such:
angular.module('components.search')
.service('es', function (esFactory) {
return esFactory({
host: '/services/elastic'
});
});
and using elasticsearch.angular.js but the AJAX requests are being made to http://localhost/services/elastic instead of /services/elastic
This is causing collisions with the way grunt runs the application in development on a url like http://localhost:9000/#/
Can we update the code so it does not prefix the AJAX requests with http://localhost when I specify a host with a leading slash?
This might just be achievable through a new configuration param:
angular.module('components.search')
.service('es', function (esFactory) {
return esFactory({
host: '/services/elastic',
relative: true
});
});
and specifically handled in elasticsearch.angular.js when making the call with $http
this.$http({
...
url: this.host.makeUrl(params),
...
})
When executing a query I get the message 'Unable to parse/serialize body'. Here is some example code:
var es = require('elasticsearch');
var c = new es.Client({log: 'trace'});
// log r with text s
var ok = function(r) { console.log('succes', r); }
var fail = function(r) { console.log('failed', r); }
// this one works
c.search({
index: 'places',
q: 'pants'
}).then(ok, fail);
// fails: message: 'Unable to parse/serialize body'
c.search({
index: 'places',
type: 'snackbar',
body: {
query: {
match: {
heading: 'het'
}
}
}
}).then(ok, fail);
// fails, same.
c.search({
index: 'places',
body: { query: { match_all: {} } }
}).then(ok, fail);
This is the result of running it:
node bug.js
Elasticsearch INFO: 2014-01-08T18:48:32Z
Adding connection to http://localhost:9200/
Elasticsearch DEBUG: 2014-01-08T18:48:32Z
starting request { method: 'POST',
path: '/places/_search',
query: { q: 'pants' } }
Elasticsearch DEBUG: 2014-01-08T18:48:32Z
starting request { method: 'POST',
path: '/places/snackbar/_search',
body: { query: { match: [Object] } },
query: {} }
Elasticsearch DEBUG: 2014-01-08T18:48:32Z
starting request { method: 'POST',
path: '/places/_search',
body: { query: { match_all: {} } },
query: {} }
Elasticsearch TRACE: 2014-01-08T18:48:32Z
curl 'http://localhost:9200/places/_search?q=pants&pretty=true' -XPOST
<- 200
{
"took": 1,
"timed_out": false,
"_shards": {
"total": 3,
"successful": 3,
"failed": 0
},
"hits": {
"total": 0,
"max_score": null,
"hits": []
}
}
Elasticsearch INFO: 2014-01-08T18:48:32Z
Request complete
succes { body:
{ took: 1,
timed_out: false,
_shards: { total: 3, successful: 3, failed: 0 },
hits: { total: 0, max_score: null, hits: [] } },
status: 200 }
Elasticsearch TRACE: 2014-01-08T18:48:32Z
curl 'http://localhost:9200/places/_search?pretty=true' -XPOST -d '{
"query": {
"match_all": {}
}
}'
<- 200
{"took":1,"timed_out":false,"_shards":{"total":3,"successful":3,"failed":0},"hits":{"total":1,"max_score":1.0,"hits":[{"_index":"places","_type":"snackbar","_id":"1","_score":1.0, "_source" : {heading:"Het Geveltje", text: "Foo"}}]}}
Elasticsearch INFO: 2014-01-08T18:48:32Z
Request complete
failed { message: 'Unable to parse/serialize body',
body: '{"took":1,"timed_out":false,"_shards":{"total":3,"successful":3,"failed":0},"hits":{"total":1,"max_score":1.0,"hits":[{"_index":"places","_type":"snackbar","_id":"1","_score":1.0, "_source" : {heading:"Het Geveltje", text: "Foo"}}]}}',
status: 200 }
Elasticsearch TRACE: 2014-01-08T18:48:32Z
curl 'http://localhost:9200/places/snackbar/_search?pretty=true' -XPOST -d '{
"query": {
"match": {
"heading": "het"
}
}
}'
<- 200
{"took":2,"timed_out":false,"_shards":{"total":3,"successful":3,"failed":0},"hits":{"total":1,"max_score":0.19178301,"hits":[{"_index":"places","_type":"snackbar","_id":"1","_score":0.19178301, "_source" : {heading:"Het Geveltje", text: "Foo"}}]}}
Elasticsearch INFO: 2014-01-08T18:48:32Z
Request complete
failed { message: 'Unable to parse/serialize body',
body: '{"took":2,"timed_out":false,"_shards":{"total":3,"successful":3,"failed":0},"hits":{"total":1,"max_score":0.19178301,"hits":[{"_index":"places","_type":"snackbar","_id":"1","_score":0.19178301, "_source" : {heading:"Het Geveltje", text: "Foo"}}]}}',
status: 200 }
I don't understand what I'm doing wrong here.
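One thing worth checking before blaming the client: the two failing traces above return `_source` with unquoted keys (`{heading:"Het Geveltje", text: "Foo"}`), which is not valid JSON. A quick check:

```javascript
// The raw bodies of the two failing responses contain an unquoted _source,
// e.g. {heading:"Het Geveltje", text: "Foo"} -- that is not valid JSON, so
// JSON.parse rejects it, which would explain 'Unable to parse/serialize body'
// even though the HTTP status is 200.
var invalid = '{"_source": {heading:"Het Geveltje", text: "Foo"}}';
var valid   = '{"_source": {"heading": "Het Geveltje", "text": "Foo"}}';

function parses(s) {
  try { JSON.parse(s); return true; } catch (e) { return false; }
}

console.log(parses(invalid)); // false
console.log(parses(valid));   // true
```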
I have an EC2 Amazon cluster created using ES tutorial to create EC2 ES cluster.
Update: it doesn't have anything to do with the cluster; a simple ES server is not working either.
I am writing the front end. I used #19 to get started.
I put code for search in the controller - that works when the controller is being instantiated.
elasticClient.search({index: 'myindex', q: 'Sail'}, function (error, response) {
console.log(error);
console.log(response);
return response;
});
Response:
INFO: 2014-01-12T20:10:21Z
Adding connection to http://ec2-myinstance.amazonaws.com:9200/
elasticsearch-angular.js:17745
DEBUG: 2014-01-12T20:10:21Z
starting request { path: '/_cluster/nodes', method: 'GET' }
elasticsearch-angular.js:17745
DEBUG: 2014-01-12T20:10:21Z
starting request { method: 'GET', path: '/_cluster/health', query: {} }
elasticsearch-angular.js:17745
DEBUG: 2014-01-12T20:10:21Z
starting request { method: 'POST', path: '/myindex/_search', query: { q: 'Sail' } }
elasticsearch-angular.js:17745
XHR finished loading: "http://ec2-myinstance.amazonaws.com:9200/_cluster/health". angular.js:7991
INFO: 2014-01-12T20:10:21Z
Request complete
elasticsearch-angular.js:17745
XHR finished loading: "http://ec2-myinstance.amazonaws.com:9200/myindex/_search?q=Sail". angular.js:7991
INFO: 2014-01-12T20:10:21Z
Request complete
elasticsearch-angular.js:17745
undefined app.js:13
Object {took: 16, timed_out: false, _shards: Object, hits: Object}
app.js:14
XHR finished loading: "http://ec2-myinstance.amazonaws.com:9200/_cluster/nodes". angular.js:7991
INFO: 2014-01-12T20:10:21Z
Request complete
Now, if the same code is put inside a method in the controller and that method is invoked later, the request times out.
$scope.search = function() {
elasticClient.search({index: 'myindex', q: 'Sail'}, function (error, response) {
console.log(error);
console.log(response);
return response;
});
};
elasticsearch-angular.js:17745
INFO: 2014-01-12T20:10:21Z
Adding connection to http://10.191.xxx.xxx:9200/
elasticsearch-angular.js:17745
DEBUG: 2014-01-12T20:10:36Z
starting request { method: 'POST', path: '/myindex/_search', query: { q: 'Sail' } }
elasticsearch-angular.js:17745
DEBUG: 2014-01-12T20:10:36Z
starting request { method: 'POST', path: '/myindex/_search', query: { q: 'Sail' } }
elasticsearch-angular.js:17745
RequestTimeout {message: "Request Timeout after 30000ms", stack: ""} app.js:46
undefined app.js:47
RequestTimeout {message: "Request Timeout after 30000ms", stack: ""} app.js:46
undefined
POST http://10.191.xxx.xxx:9200/scrapy/_search?q=Sail angular.js:7991
DEBUG: 2014-01-12T20:15:21Z
starting request { path: '/_cluster/nodes', method: 'GET' }
elasticsearch-angular.js:17745
GET http://10.191.xxx.xxx:9200/_cluster/nodes angular.js:7991
DEBUG: 2014-01-12T20:20:21Z
starting request { path: '/_cluster/nodes', method: 'GET' }
elasticsearch-angular.js:17745
GET http://10.191.xxx.xxx:9200/_cluster/nodes angular.js:7991
DEBUG: 2014-01-12T20:25:22Z
starting request { path: '/_cluster/nodes', method: 'GET' }
elasticsearch-angular.js:17745
GET http://10.191.xxx.xxx:9200/_cluster/nodes angular.js:7991
DEBUG: 2014-01-12T20:30:22Z
starting request { path: '/_cluster/nodes', method: 'GET' }
elasticsearch-angular.js:17745
GET http://10.191.xxx.xxx:9200/_cluster/nodes angular.js:7991
DEBUG: 2014-01-12T20:35:23Z
starting request { path: '/_cluster/nodes', method: 'GET' }
Any ideas? I have tried other tools and they all work fine. The GETs on private ip in the above response don't look too encouraging to me. Help appreciated!
It's nice that you publish new versions, but removing old versions from npm is very bad practice: it breaks the installation of every package that depends on an older version.
Did you plan to publish elasticsearch-js into the Bower package repository: http://sindresorhus.com/bower-components/#!/search/elasticsearh ?
I'm impatient to use elasticsearch-js instead of elastic.js, replacing the dependency "elastic.js": "1.1.1" in my bower.json with "elasticsearch": "1.0.0".
var elastic = new elasticsearch.Client({
host: this.config.get('elastic:host') + ':' + this.config.get('elastic:port'),
});
var id = {};
var update = {};
elastic
.update({ id : id, index : app.config.get('elastic:index'), type : 'type', body : update })
.then(function() {})
.otherwise(function(error) {
console.log('error', error);
} );
This example fails silently: no request is made to the Elasticsearch server, and no error message explains why.
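Note that the snippet above passes `{}` for both `id` and `body`. A purely illustrative helper that rejects a non-scalar id up front, so the failure is visible instead of silent (the guard is not part of the client API):

```javascript
// Illustrative only: reject a non-scalar id before handing the params to
// client.update, so a mistake like id = {} fails loudly.
function makeUpdateParams(index, type, id, doc) {
  if (typeof id !== 'string' && typeof id !== 'number') {
    throw new TypeError('id must be a string or number, got ' + typeof id);
  }
  return { index: index, type: type, id: id, body: { doc: doc } };
}

var params = makeUpdateParams('myindex', 'type', '42', { title: 'foo' });
// elastic.update(params).then(...).otherwise(...);
```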
As TypeScript has hit 1.0 RC1, the language should be stable.
It would be really useful (and powerful) if you could provide type definitions for Elasticsearch: elasticsearch helps solve enterprise problems, and TypeScript is the answer to scaling JavaScript apps (front and back).
TypeScript definitions are similar to header files; they help a statically typed language such as TypeScript make sense of an API.
See this repo containing many defs: https://github.com/borisyankov/DefinitelyTyped
I was trying to figure out why my application would increase its memory usage while idle. I believe that there is a memory leak during sniff(). I had a quick look inside the source code but I don't know what is causing it yet. Since I cannot look over it until sometime next week, I thought I would report it now.
Here is code that reproduces it: (sniffInterval is aggressive to demonstrate this)
var es = require('elasticsearch');
var client = es.Client({sniffInterval: 1});
setInterval(function() {
console.log(process.memoryUsage());
}, 1000);
Even just watching top, I can see memory usage increase. When running the same code under memwatch, it reports a memory leak:
MEMLEAK { start: Wed Mar 26 2014 14:25:04 GMT-0700 (PDT),
end: Wed Mar 26 2014 14:25:17 GMT-0700 (PDT),
growth: 3133672,
reason: 'heap growth over 5 consecutive GCs (13s) - 827.59 mb/hr' }
The download link in the README.md is broken: https://download.elasticsearch.org/elasticsearch/elasticsearch-js/1.0.3/elasticsearch-js.zip
Using current 1.5.2 version, browser client, grunt browserify task reports:
Running "browserify:basic" (browserify) task
Error: module "./buffer_ieee754" not found from ".../js/elasticsearch.js"
I scanned through the code and there are a lot of requires for buffer_ieee754, but the module itself is not present and does not appear to be included in the build.
Hi
With Angular 1.2.4 and elasticsearch.angular.js 1.0.0 I've got the following error when calling the fullTextSearch function:
Uncaught TypeError: Object function AngularConnector(host, config) {
ConnectionAbstract.call(this, host, config);
this.defer = config.defer;
this.$http = config.$http;
} has no method '$http' elasticsearch.angular-1.0.0.js:16912
AngularConnector.request elasticsearch.angular-1.0.0.js:16912
sendReqWithConnection elasticsearch.angular-1.0.0.js:18086
utils.applyArgs elasticsearch.angular-1.0.0.js:18504
bound elasticsearch.angular-1.0.0.js:7167
My Angular configuration is:
angular.module('myApp.services', [])
.service('es', ['esFactory', function (esFactory, $http) {
return esFactory({
hosts: [
'localhost:9200'
],
log: 'trace',
sniffOnStart: true
});
}])
.factory('searchService', ['es', function(es) {
var searchServiceInstance = {
"fullTextSearch" : function(from, size, text) {
es.search( {
"from": from,
"size": size,
"query": { ... }
}).then(function (resp) {
var hits = resp.body.hits;
});
}};
return searchServiceInstance;
}]);
Do you know if it could be a configuration mistake?
And do you have an Angular sample?
Regards,
Antoine
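One detail worth checking in the config above: the annotation array for the 'es' service lists only 'esFactory', while the function declares two parameters. The following is not Angular code, just the arity rule behind array-style DI annotations:

```javascript
// With array-style DI annotations, the number of string annotations should
// match the function's parameter count; any extra parameter (like $http in
// the service above) is injected as undefined.
function annotationsMatch(annotated) {
  var fn = annotated[annotated.length - 1];
  return annotated.length - 1 === fn.length;
}

console.log(annotationsMatch(['esFactory', function (esFactory, $http) {}])); // false
console.log(annotationsMatch(['esFactory', function (esFactory) {}]));        // true
```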
Hi,
this is my config:
My elasticsearch is on 127.0.0.1:9200
My node file:
"use strict";
var mongoose = require('mongoose'),
elasticsearch = require('elasticsearch'),
fs = require('fs'),
config = require('../../config/config');
var elasticSearchClient = new elasticsearch.Client({log:'trace'});
I get this crash:
Elasticsearch INFO: 2014-01-23T23:33:33Z
Adding connection to http://127.0.0.1:9200/
/var/www/speedealing/nodejs/node_modules/elasticsearch/src/lib/connection_pool.js:284
if (!connection.id) {
^
TypeError: Cannot read property 'id' of undefined
at ConnectionPool.removeConnection (/var/www/speedealing/nodejs/node_modules/elasticsearch/src/lib/connection_pool.js:284:18)
at ConnectionPool.setHosts (/var/www/speedealing/nodejs/node_modules/elasticsearch/src/lib/connection_pool.js:322:10)
at new Transport (/var/www/speedealing/nodejs/node_modules/elasticsearch/src/lib/transport.js:66:25)
at Object.EsApiClient (/var/www/speedealing/nodejs/node_modules/elasticsearch/src/lib/client.js:49:22)
at new Client (/var/www/speedealing/nodejs/node_modules/elasticsearch/src/lib/client.js:60:10)
at Object.<anonymous> (/var/www/speedealing/nodejs/app/routes/elasticsearch.js:8:27)
at Module._compile (module.js:456:26)
at Object.Module._extensions..js (module.js:474:10)
at Module.load (module.js:356:32)
at Function.Module._load (module.js:312:12)
24 Jan 00:33:33 - [nodemon] app crashed - waiting for file changes before starting...
Can you help me?
Thanks
Herve
npm ERR! Error: No compatible version found: bluebird@'^1.1.1'
What's going on?
http://elasticsearch.github.io/elasticsearch-js/index.html#logging suggests that I simply pass in a function that has error/warning functions defined.
In reality I had to write the following:
function ProxyLogger(log, config) {
LoggerAbstract.call(this, log, config);
}
util.inherits(ProxyLogger, LoggerAbstract);
ProxyLogger.prototype.write = function (label, message) {
label = label.toLowerCase();
message = '[ELASTICSEARCH] ' + message;
switch (label) {
case 'error':
case 'info':
case 'debug':
app.log[label](message);
break;
case 'trace':
app.log.info(message);
break;
case 'warning':
app.log.warn(message);
break;
default:
app.log.error('Unknown elasticsearch label: ' + label);
app.log.error(message);
}
};
following the example a few paragraphs further down, under the heading "To use your own logger class, just pass it in as the logger's type".
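For comparison, a hypothetical sketch of wiring a logger class like ProxyLogger into the client config. Whether the constructor goes directly into `log` or into `{ type: ... }` depends on the client version; treat this as an assumption to verify:

```javascript
// Hypothetical wiring for a custom logger class such as ProxyLogger above.
// StubLogger stands in for the real subclass (which needs LoggerAbstract).
// Assumption to verify: the client treats a constructor in `log` as a
// custom logger type; some versions may want { log: { type: StubLogger } }.
function StubLogger(log, config) {
  // a real implementation would call LoggerAbstract.call(this, log, config)
}

var config = {
  host: 'localhost:9200',
  log: StubLogger
};
// var client = new elasticsearch.Client(config);
```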
I would like to know how to update the mapping of an index.
Any help on this regard is highly appreciated.
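A sketch of what a put-mapping call might look like with this client. Two assumptions to verify against your ES/client versions: the method name `indices.putMapping`, and a request body keyed by the type name. The names 'tweet' and 'message' are placeholders:

```javascript
// Builds put-mapping params; assumption: the HTTP body wraps the properties
// under the type name, as in {"tweet": {"properties": {...}}}.
function makePutMappingParams(index, type, properties) {
  var body = {};
  body[type] = { properties: properties };
  return { index: index, type: type, body: body };
}

var params = makePutMappingParams('myindex', 'tweet', {
  message: { type: 'string' }
});
// client.indices.putMapping(params, function (err, resp) { ... });
```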
http://elasticsearch.github.io/elasticsearch-js/api.html#bulk
It looks like the JSON array shown above is incorrect.
[
{ index: { _index: 'myindex', _type: 'mytype', _id: 1 } }, // action description
{ title: 'foo' }, // the document to index
{ update: { _index: 'myindex', _type: 'mytype', _id: 2 } }, // action description
{ doc: { title: 'foo' } // the document to update
{ delete: { _index: 'myindex', _type: 'mytype', _id: 3 }, // action description
// no document needed for this delete
]
Should be this or similar:
[
{ index: { _index: 'myindex', _type: 'mytype', _id: 1 } }, // action description
{ title: 'foo' }, // the document to index
{ update: { _index: 'myindex', _type: 'mytype', _id: 2 } }, // action description
{ doc: { title: 'foo' } }, // the document to update
{ delete: { _index: 'myindex', _type: 'mytype', _id: 3 } }// action description
// no document needed for this delete
]
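The corrected array above can be built and passed as the `body` of a bulk call. A sketch, using the placeholder index/type/id values from the docs example:

```javascript
// Builds the corrected bulk body: alternating action descriptions and
// documents, with no document following the delete action.
function makeBulkBody() {
  return [
    { index:  { _index: 'myindex', _type: 'mytype', _id: 1 } },
    { title: 'foo' },                                          // doc to index
    { update: { _index: 'myindex', _type: 'mytype', _id: 2 } },
    { doc: { title: 'foo' } },                                 // partial doc
    { delete: { _index: 'myindex', _type: 'mytype', _id: 3 } } // no doc needed
  ];
}

// client.bulk({ body: makeBulkBody() }, function (err, resp) { ... });
```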
It would be easier to adopt if you had examples for popular use cases.
I am looking for how to run filter queries.
Queries like:
{
"filtered" : {
"query" : {
"match" : {
"_all" : "example words"
}
},
"filter" : {
or: [
{
"term": {"privacy": "public"}
},
{
"term": {"added_by": "someone"}
}
]
}
}
}
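With this client, a query like the one above goes under `body.query` of a search call. A sketch, with 'myindex' as a placeholder index name:

```javascript
// Wraps the filtered query above into search params; pass the result to
// client.search(). Index name and term values mirror the example above.
function makeFilteredSearch(words) {
  return {
    index: 'myindex',
    body: {
      query: {
        filtered: {
          query: { match: { _all: words } },
          filter: {
            or: [
              { term: { privacy: 'public' } },
              { term: { added_by: 'someone' } }
            ]
          }
        }
      }
    }
  };
}

// client.search(makeFilteredSearch('example words')).then(ok, fail);
```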
Hi,
I've just tried the jQuery client from the master build, and whenever I call var e = $.es.Client({})
I get a recursive loop until a call-stack error is thrown (RangeError: Maximum call stack size exceeded).
In https://github.com/elasticsearch/elasticsearch-js/blob/1.5/src/lib/client.js#L32:
function Client(config) {
config = config || {};
function EsApiClient() {
// our client will log minimally by default
if (!config.hasOwnProperty('log')) {
config.log = 'warning';
}
// ...snip...
config.host = 'http://localhost:9200';
// ...snip...
config.sniffEndpoint = '/_cluster/nodes';
And again in transport.js:
function Transport(config) {
var self = this;
config = config || {};
var LogClass = (typeof config.log === 'function') ? config.log : require('./log');
config.log = this.log = new LogClass(config);
and maybe a few more places. This breaks code that uses a shared config object to instantiate multiple es clients.
This is similar to an issue fixed in #51.
I'd submit a PR to fix this, but I can't sign the CLA.
(is there an email list for this project?)
I installed this into an Angular application with:
"bower install elasticsearch-js --save"
When attempting to include the elasticsearch.angular.js module, it complains about 'require' not being defined. require is used on lines 7 and 8:
var AngularConnector = require('./lib/connectors/angular');
var Client = require('./lib/client');
I'm new to JS package management. Should RequireJS be listed as a dependency in the package.json?
It seems the include and exclude params are ignored when doing a getSource and specifying id, index and type.
http://elasticsearch.github.io/elasticsearch-js/api.html#getsource
The referenced get API doesn't seem to have a method to exclude fields:
http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/docs-get.html
Currently when I search on an index that doesn't exist the errors object is being passed to my promise as a ConnectionFault error instead of a 404 error.
The elasticsearch-js library works perfectly on my localhost, but once I push to an Ubuntu server, a simple request to my ES index produces the error below:
Elasticsearch ERROR: 2014-02-14T12:59:47Z
Error: Request error, retrying -- connect ENOENT
at Log.error (/var/www/omgtransitapp/releases/20140214125838/api/node_modules/elasticsearch/src/lib/log.js:213:60)
at checkRespForFailure (/var/www/omgtransitapp/releases/20140214125838/api/node_modules/elasticsearch/src/lib/transport.js:185:18)
at HttpConnector.<anonymous> (/var/www/omgtransitapp/releases/20140214125838/api/node_modules/elasticsearch/src/lib/connectors/http.js:137:7)
at ClientRequest.bound (/var/www/omgtransitapp/releases/20140214125838/api/node_modules/elasticsearch/node_modules/lodash-node/modern/internals/baseBind.js:52:17)
at ClientRequest.EventEmitter.emit (events.js:101:17)
at Socket.socketErrorListener (_http_client.js:239:9)
at Socket.EventEmitter.emit (events.js:101:17)
at net.js:434:14
at process._tickCallback (node.js:599:11)
Through a lot of trial and error I was able to get the client to connect by commenting out the agent parameter in the reqParams hash in http.js. Does anyone know if this is a legitimate bug, or if I have something configured incorrectly? It looks like agent is derived from another module, "ForeverAgent".
var reqParams = {
method: params.method || 'GET',
protocol: host.protocol + ':',
auth: host.auth,
hostname: host.host,
port: host.port,
path: (host.path || '') + (params.path || ''),
headers: headers,
//agent: this.agent
};
It helps to front-end elasticsearch with services like APIGEE that can act as a proxy.
Usually this means connecting to something like my.apigee.com/v1/proxy/_search
instead of my.es.com/_search
... notice the difference being an additional prefix in the path /v1/proxy/
Can the elasticsearch client initialization be tweaked to accommodate this by accepting a URL prefix that it adds to all standard requests?
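This may already be possible: the 1.x HTTP connector appears to build request paths as `(host.path || '') + params.path`, so a host object with a `path` property may act as exactly this kind of prefix. A sketch, with hostnames and prefix as placeholders; verify against your client version:

```javascript
// Assumption to verify: a `path` on a host object is prepended to every
// request path, e.g. /v1/proxy + /_search -> /v1/proxy/_search.
var config = {
  host: {
    protocol: 'https',
    host: 'my.apigee.com',
    port: 443,
    path: '/v1/proxy'
  }
};

// var elasticsearch = require('elasticsearch');
// var client = new elasticsearch.Client(config);
console.log(config.host.path + '/_search'); // '/v1/proxy/_search'
```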
When I set logging to trace mode, the log lines show the wrong hostname, "localhost:9200", even though the client didn't connect to localhost.
Elasticsearch TRACE: 2014-01-28T01:09:17Z
curl 'http://localhost:9200/...
This is my sample setting.
--->
{
host: "www.foo.com:80",
log: "trace"
}
--->
I'm using 1.3.0 elasticsearch package in nodejs.
Can you document routing parameters/options during index & search please?
According to the 1.0 HTTP API documentation, the correct format for getting a mapping for a specific index and type is:
/<%=index%>/<%=type%>/_mapping
http://www.elasticsearch.org/guide/en/elasticsearch/reference/1.x/indices-get-mapping.html
However, in the elasticsearch.js 1.0 API, the format is:
/<%=index%>/_mapping/<%=type%>
https://github.com/elasticsearch/elasticsearch-js/blob/1.5/src/lib/apis/1_0.js#L2526
This results in an incorrect response, which looks like this:
{
_index: '<index>',
_type: '_mapping',
_id: '<type>',
exists: false
}
Additionally, this makes me sad.
I'm using the angular build and noticed that the error field in the ping callback comes back undefined for an offline Elasticsearch cluster in 1.3.0, but as a proper error in 1.1.0.
I am new to elasticsearch and this js client as well. Here is the scenario I was testing that did not work. Am I doing something wrong with the config, or is it a bug?
2 Nodes - ES1, ES2 (both Master eligible, data)
config = {'hosts': ['es-1:9200', 'es-2:9200'],
'sniffOnStart': true,
'sniffInterval': 3000,
'log': [{'type': 'stdio',
'levels': ['error', 'warning']}]}
ES1 is down.
elasticsearch-js requests to add to the index, ES2 adds to the index ==> Correct
ES2 is also down
Displays warning message and re-tries only ES2
ES1 is up
Displays warning message and re-tries only ES2 (does not connect to ES1)
Hello,
I have a problem: I would like to add a river (in my case the MongoDB river) directly from Node.js if it does not already exist.
Currently I do that in bash:
curl -XPUT 'http://localhost:9200/_river/mongodb/_meta' -d '{
"type": "mongodb_feeds",
"mongodb": {
"db": "myapp",
"collection": "feeds"
},
"index": {
"name": "feeds",
"type": "documents"
}
}'
I'm trying to understand how to do this with an Elasticsearch request in Node.js. I searched the Elasticsearch API but couldn't find an answer.
Thank you!
PR Coming Shortly.
I've no clue what's wrong here. I played around with some options but didn't succeed.
client = new elasticsearch.Client({
host: 'localhost:9200',
agent: true,
maxSockets: 5,
connectionClass: "http",
keepAlive: false
// ,log: 'trace'
});
Working in OSX, node 0.10.25, es 1.0.1
Elasticsearch ERROR: 2014-02-26T16:03:58Z
Error: Request error, retrying -- connect EADDRNOTAVAIL
at Log.error (/Users/oliver/Documents/node/splimporter/node_modules/elasticsearch/src/lib/log.js:213:60)
at checkRespForFailure (/Users/oliver/Documents/node/splimporter/node_modules/elasticsearch/src/lib/transport.js:185:18)
at HttpConnector.<anonymous> (/Users/oliver/Documents/node/splimporter/node_modules/elasticsearch/src/lib/connectors/http.js:138:7)
at ClientRequest.bound (/Users/oliver/Documents/node/splimporter/node_modules/elasticsearch/node_modules/lodash-node/modern/internals/baseBind.js:52:17)
at ClientRequest.EventEmitter.emit (events.js:95:17)
at Socket.socketErrorListener (http.js:1547:9)
at Socket.EventEmitter.emit (events.js:95:17)
at net.js:441:14
at process._tickCallback (node.js:415:13)
Even when there are no pending requests, client.close() does not release an Elasticsearch connection until maxKeepAliveTime elapses (which defaults to 5 minutes).
Expected: the connection closes immediately so that, for example, your node process can terminate immediately.
I'm using version 1.4.0
The browser builds for 1.0.3 are broken. They are empty files.
Hi there,
unfortunately it is quite difficult for a newbie in both Angular and Elasticsearch to get a grasp of how to instantiate a client with elasticsearch.angular.js according to the documentation.
The example stated on http://www.elasticsearch.org/guide/en/elasticsearch/client/javascript-api/current/index.html#_setting_up_the_client_in_the_browser
// elasticsearch.angular.js creates an elasticsearch
// module, which provides an esFactory
var app = angular.module('app', ['elasticsearch']);
app.service('es', function (esFactory) {
return esFactory({ ... });
});
is different to the example provided under
http://elasticsearch.github.io/elasticsearch-js/
module.service('es', function (esFactory) {
return esFactory({
host: 'localhost:9200',
// ...
});
});
and in both cases I'm missing the step on how to actually retrieve the client. Could you please enhance the documentation with an explicit example?
Sorry to bother you,
many thanks and regards,
Kalumet
For the following code:
var elasticsearch = require('elasticsearch');
var client = new elasticsearch.Client({
host: '127.0.0.1:9200'
});
client.ping({
requestTimeout: 1000,
hello: "elasticsearch!"
}, function (error) {
if (error) {
console.log(error);
console.error('elasticsearch cluster is down!');
} else {
console.log('All is well');
}
});
Result in 0.11.10:
Elasticsearch ERROR: 2014-02-16T06:23:10Z
Error: Request error, retrying -- connect ENOENT
at Log.error (/Users/selva/my_repos/node_something/node_modules/elasticsearch/src/lib/log.js:213:60)
at checkRespForFailure (/Users/selva/my_repos/node_something/node_modules/elasticsearch/src/lib/transport.js:185:18)
at HttpConnector.<anonymous> (/Users/selva/my_repos/node_something/node_modules/elasticsearch/src/lib/connectors/http.js:137:7)
at ClientRequest.bound (/Users/selva/my_repos/node_something/node_modules/elasticsearch/node_modules/lodash-node/modern/internals/baseBind.js:52:17)
at ClientRequest.EventEmitter.emit (events.js:101:17)
at Socket.socketErrorListener (_http_client.js:239:9)
at Socket.EventEmitter.emit (events.js:101:17)
at net.js:434:14
at process._tickCallback (node.js:599:11)
Elasticsearch WARNING: 2014-02-16T06:23:10Z
Unable to revive connection: http://127.0.0.1:9200/
Elasticsearch WARNING: 2014-02-16T06:23:10Z
No living connections
{ [Error: No Living connections] message: 'No Living connections' }
elasticsearch cluster is down!
Whereas in v0.10.25
All is well
When passing a config object to new elasticsearch.Client(), the Client modifies the passed in config object in such a way that it cannot be used to create a second client.
Example:
var elasticsearch = require('elasticsearch');
var config = {
log : "warning"
};
console.dir(config);
var es1 = new elasticsearch.Client(config);
console.dir(config);
var es2 = new elasticsearch.Client(config);
> node es-config-bug.js
{ log: 'warning' }
{ log:
{ _events:
{ closing: [Object],
error: [Function: bound],
warning: [Function: bound] } },
host: 'http://localhost:9200',
hosts: 'http://localhost:9200',
maxSockets: 10,
maxKeepAliveRequests: 0,
maxKeepAliveTime: 300000 }
/Users/jvonnieda/Desktop/test/node_modules/elasticsearch/src/lib/log.js:44
throw new TypeError('Invalid logging output config. Expected either a lo
^
TypeError: Invalid logging output config. Expected either a log level, array of log levels, a logger config object, or an array of logger config objects.
at new Log (/Users/jvonnieda/Desktop/test/node_modules/elasticsearch/src/lib/log.js:44:13)
at new Transport (/Users/jvonnieda/Desktop/test/node_modules/elasticsearch/src/lib/transport.js:17:27)
at Object.EsApiClient (/Users/jvonnieda/Desktop/test/node_modules/elasticsearch/src/lib/client.js:49:22)
at new Client (/Users/jvonnieda/Desktop/test/node_modules/elasticsearch/src/lib/client.js:60:10)
at Object.<anonymous> (/Users/jvonnieda/Desktop/test/es-config-bug.js:10:11)
at Module._compile (module.js:456:26)
at Object.Module._extensions..js (module.js:474:10)
at Module.load (module.js:356:32)
at Function.Module._load (module.js:312:12)
at Function.Module.runMain (module.js:497:10)
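Until the mutation is fixed, a workaround is to give each Client its own config object. A small demonstration that a fresh object per construction keeps the second one clean:

```javascript
// Workaround sketch: produce a fresh config per Client so mutations made by
// the first constructor can't corrupt the second construction.
function freshConfig() {
  return { log: 'warning' };
}

var a = freshConfig();
var b = freshConfig();
a.log = { _events: {} }; // simulate the Client replacing config.log
console.log(b.log); // 'warning' -- the second config is untouched

// var es1 = new elasticsearch.Client(freshConfig());
// var es2 = new elasticsearch.Client(freshConfig());
```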
In response to #64, I'm trying to implement a custom connection class to add rejectUnauthorized: false to the http/https socket options returned in makeAgentParams.
Using your previous gist, I'm trying to implement my own makeAgentParams; however, it never gets called.
If I add createAgent to my custom connector prototype, that gets called, but not makeAgentParams.
Why? Must I implement all prototype methods for this to work?
Here's a little test script.
var elasticsearch = require('elasticsearch');
var util = require('util');
var ESHttpConnector = require('elasticsearch/src/lib/connectors/http');
function CAAHttpConnector(host, config) {
ESHttpConnector.call(this, host, config);
}
util.inherits(CAAHttpConnector, ESHttpConnector);
CAAHttpConnector.prototype.makeAgentParams = function(config) {
// never gets called
console.log('makeAgentParams');
};
var client = new elasticsearch.Client({
connectionClass: CAAHttpConnector
});
client.ping({ requestTimeout: 10000, ping: true }, function(err) {
if(err) {
// returns Error: Request error, retrying -- connect ECONNREFUSED
// which makes sense as I haven't specified my hosts
// but I do add hosts, I get the untrusted cert error and makeAgentParams is never called
return console.log(err);
}
});
Are there any plans to implement the query dsl in the javascript library such as the one that exists in the java library?
I'm using version 1.5.5 and 0.90.5. In the code at https://gist.github.com/johnywith1n/9013097, I have a class that resets an index: it creates the index if it doesn't exist, and deletes and recreates it if it does.
At the bottom, I chain two promises together to create two indices. The program terminates if the indices do not already exist, but doesn't terminate if they do. Also, it seems that if I only reset one index instead of both, the program terminates regardless of whether or not it already exists.
I'm not sure what's keeping the program alive in the case where it doesn't terminate, but I'm guessing the client is holding something open, so I thought it might be interesting to you.
For some reason the same authentication that I provide to a client works for client.search but not for client.update. Is there some connection in the code that's missing? Has anyone else seen this issue?
The same request, with body and URL copied out of Chrome's network console, works fine in Sense, so the request isn't malformed or impossible; the auth credentials just aren't working with the update call.
Can you document the connection pool with examples?
How should we handle cases where Elasticsearch goes down and the connection is lost?
Is there something to reconnect/auto-reconnect, or a way to use the connection pool for this?
btw, Great effort. Thank you for this :)
Add documentation that the promise callback will be called with an object with two properties: status and body.
status is the http status and body is the parsed elasticsearch response.
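For illustration, a sketch of consuming that shape; the example response object below is made up:

```javascript
// The promise resolves with { status: <http status>, body: <parsed response> }
// per the description above.
function describeResponse(resp) {
  return 'HTTP ' + resp.status + ', ' + resp.body.hits.total + ' hits';
}

var example = { status: 200, body: { took: 1, hits: { total: 0, hits: [] } } };
console.log(describeResponse(example)); // 'HTTP 200, 0 hits'
```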
As of right now, the build process uses browserify to generate browser compatible versions of the elasticsearch-js client.
I'm still looking for a way to publish the results of this build process to bower without having to programmatically modify several child repositories. Any suggestions are welcome.