
rdfstore-js's Introduction

# rdfstore-js

Chat: https://gitter.im/antoniogarrote/rdfstore-js

Important Note

Many features present in the 0.8.X versions have been removed in 0.9.X. Some of them will be added back in upcoming versions; others, like the MongoDB backend, have been discarded. Please read this README carefully to find the current set of features.

Introduction

rdfstore-js is a pure JavaScript implementation of an RDF graph store with support for the SPARQL query and data manipulation language.

var rdfstore = require('rdfstore');

rdfstore.create(function(err, store) {
  store.execute('LOAD <http://dbpedia.org/resource/Tim_Berners-Lee> INTO GRAPH <http://example.org/people>', function() {

	store.setPrefix('dbp', 'http://dbpedia.org/resource/');

	store.node(store.rdf.resolve('dbp:Tim_Berners-Lee'),  "http://example.org/people", function(err, graph) {

	  var peopleGraph = graph.filter(store.rdf.filters.type(store.rdf.resolve("foaf:Person")));

	  store.execute('PREFIX rdf:  <http://www.w3.org/1999/02/22-rdf-syntax-ns#>\
					 PREFIX foaf: <http://xmlns.com/foaf/0.1/>\
					 PREFIX : <http://example.org/>\
					 SELECT ?s FROM NAMED :people { GRAPH ?g { ?s rdf:type foaf:Person } }',
					 function(err, results) {

					   console.log(peopleGraph.toArray()[0].subject.valueOf() === results[0].s.value);

					 });
	});

  });
});

rdfstore-js can be executed in a web browser or included as a library in a Node.js application. It can also be executed as a stand-alone SPARQL endpoint accepting SPARQL RDF Protocol HTTP requests. Go to the bottom of this page to find some application examples using the library.

The current implementation is far from complete but it already passes all the test cases for the SPARQL 1.0 query language and supports data manipulation operations from the SPARQL 1.1/Update version of the language.

Some other features included in the library are the following:

  • SPARQL 1.0 support
  • SPARQL 1.1/Update support
  • Partial SPARQL 1.1 query support
  • JSON-LD parser
  • Turtle/N3 parser
  • W3C RDF Interfaces API
  • RDF graph events API
  • Custom filter functions
  • Browser persistence using IndexedDB

Documentation

Documentation for the store can be found here.

SPARQL support

At the moment rdfstore-js supports SPARQL 1.0 and most of SPARQL 1.1/Update. Only some parts of the SPARQL 1.1 query language have been implemented so far.

This is a list of the different kinds of queries currently implemented:

  • SELECT queries
  • UNION, OPTIONAL clauses
  • NAMED GRAPH identifiers
  • LIMIT, OFFSET
  • ORDER BY clauses
  • SPARQL 1.0 filters and builtin functions
  • variable aliases
  • variable aggregation: MAX, MIN, COUNT, AVG, SUM functions
  • GROUP BY clauses
  • DISTINCT query modifier
  • CONSTRUCT queries
  • ASK queries
  • INSERT DATA queries
  • DELETE DATA queries
  • DELETE WHERE queries
  • WITH/DELETE/INSERT/WHERE queries
  • LOAD queries
  • CREATE GRAPH clauses
  • DROP DEFAULT/NAMED/ALL/GRAPH clauses
  • CLEAR DEFAULT/NAMED/ALL/GRAPH clauses
  • FILTER EXISTS / NOT EXISTS operators
  • BIND
  • FILTER IN / NOT IN operators

##Installation

The library can be installed using NPM:

$ npm install rdfstore

The library can also be installed via bower using a global module:

$ bower install rdfstore

##Building

Before running the build script, you must install JavaScript dependencies with npm (npm is shipped with node):

$ npm install

The library can be built using gulp:

$ gulp

The browser version can be built using the 'browser' gulp target:

$ gulp browser

Tests

To execute the whole test suite of the library, including the DAWG test cases for SPARQL 1.0 and the test cases for SPARQL 1.1 implemented at the moment, a gulp target can be executed:

$ gulp specs

Additionally, there are some smoke tests for both browser versions that can be found in the 'spec/browser' directory.

API

This is a small overview of the rdfstore-js API.

###Store creation

//nodejs only
var rdfstore = require('rdfstore');

// in the browser the rdfstore object
// is already defined

// alt 1
rdfstore.create(function(err, store) {
  // the new store is ready
});


// alt 2
new rdfstore.Store(function(err, store) {
  // the new store is ready
});

###Query execution

// simple query execution
store.execute("SELECT * { ?s ?p ?o }", function(err, results){
  if(!err) {
	// process results
	if(results[0].s.token === 'uri') {
	  console.log(results[0].s.value);
	}
  }
});

// execution with an explicit default and named graph

var defaultGraph = [{'token':'uri', 'value': graph1}, {'token':'uri', 'value': graph2}, ...];
var namedGraphs  = [{'token':'uri', 'value': graph3}, {'token':'uri', 'value': graph4}, ...];

store.executeWithEnvironment("SELECT * { ?s ?p ?o }", defaultGraph,
  namedGraphs, function(err, results) {
  if(!err) {
    // process results
  }
});

###CONSTRUCT queries and the RDF Interfaces API

var query = "CONSTRUCT { <http://example.org/people/Alice> ?p ?o } \
			 WHERE { <http://example.org/people/Alice> ?p ?o  }";

store.execute(query, function(err, graph){
  if(graph.some(store.rdf.filters.p(store.rdf.resolve('foaf:name')))) {
    var nameTriples = graph.match(null,
                                  store.rdf.createNamedNode(store.rdf.resolve('foaf:name')),
                                  null);

    nameTriples.forEach(function(triple) {
      console.log(triple.object.valueOf());
    });
  }
});

###Loading remote graphs

rdfstore-js will try to retrieve remote RDF resources over the network when a 'LOAD' SPARQL query is executed. The Node.js build of the library will use regular TCP sockets and perform proper content negotiation. It will also follow a limited number of redirections. The browser build will try to perform an AJAX request to retrieve the resource using the correct HTTP headers. Nevertheless, this implementation is subject to the limitations of the same-origin policy implemented in current browsers, which prevents cross-domain requests. Redirections, even for the same domain, may also fail because the browser removes the 'Accept' HTTP header of the original request. rdfstore-js relies on the jQuery JavaScript library to perform cross-browser AJAX requests. This library must be linked in order to execute 'LOAD' requests in the browser.

store.execute('LOAD <http://dbpedialite.org/titles/Lisp_%28programming_language%29>\
               INTO GRAPH <lisp>', function(err){
  if(!err) {
    var query = 'PREFIX foaf:<http://xmlns.com/foaf/0.1/> SELECT ?o \
                 FROM NAMED <lisp> { GRAPH <lisp> { ?s foaf:page ?o} }';
    store.execute(query, function(err, results) {
      // process results
    });
  }
});

###High level interface

The following interface is a convenience API for working with JavaScript code instead of raw SPARQL query strings. It is built on top of the RDF Interfaces W3C API.

/* retrieving a whole graph as a JS Interface API graph object */

store.graph(graphUri, function(err, graph){
  // process graph
});


/* Exporting a graph to N-Triples (this function is not part of the W3C API) */
store.graph(graphUri, function(err, graph){
  var serialized = graph.toNT();
});


/* retrieving a single node in the graph as a JS Interface API graph object */

store.node(subjectUri, function(err, node) {
  //process node
});

store.node(subjectUri, graphUri, function(err, node) {
  //process node
});



/* inserting a JS Interface API graph object into the store */

// inserted in the default graph
store.insert(graph, function(err) {}) ;

// inserted in graphUri
store.insert(graph, graphUri, function(err) {}) ;



/* deleting a JS Interface API graph object from the store */

// deleted from the default graph
store.delete(graph, function(err){});

// deleted from graphUri
store.delete(graph, graphUri, function(err){});



/* clearing a graph */

// clears the default graph
store.clear(function(err){});

// clears a named graph
store.clear(graphUri, function(err){});



/* Parsing and loading a graph */

// loading local data
store.load("text/turtle", turtleString, function(err, results) {});

// loading remote data
store.load('remote', remoteGraphUri, function(err, results) {});



/* Registering a parser for a new media type */

// The parser object must implement a 'parse' function
// accepting the data to parse and a callback function.

store.registerParser("application/rdf+xml", rdXmlParser);

###RDF Interface API

The store object includes an 'rdf' object implementing an RDF environment as described in the RDF Interfaces 1.0 W3C working draft. This object can be used to access the full RDF Interfaces 1.0 API.

var graph = store.rdf.createGraph();
graph.addAction(store.rdf.createAction(store.rdf.filters.p(store.rdf.resolve("foaf:name")),
                                       function(triple){ var name = triple.object.valueOf();
                                                         name = name.slice(0,1).toUpperCase()
                                                                + name.slice(1, name.length);
                                                         triple.object = store.rdf.createNamedNode(name);
                                                         return triple;}));

store.rdf.setPrefix("ex", "http://example.org/people/");
graph.add(store.rdf.createTriple( store.rdf.createNamedNode(store.rdf.resolve("ex:Alice")),
								  store.rdf.createNamedNode(store.rdf.resolve("foaf:name")),
								  store.rdf.createLiteral("alice") ));

var triples = graph.match(null, store.rdf.createNamedNode(store.rdf.resolve("foaf:name")), null).toArray();

console.log("worked? "+(triples[0].object.valueOf() === 'Alice'));

###Default Prefixes

Default RDF namespaces can be specified using the registerDefaultNamespace function. These namespaces will be included automatically in all queries. If the same namespace is specified by the client in the query string, the new prefix will shadow the default one. A collection of common namespaces like rdf, rdfs, foaf, etc. can be registered automatically using the registerDefaultProfileNamespaces function.

new rdfstore.Store({name:'test', overwrite:true}, function(err,store){
	store.execute('INSERT DATA {  <http://example/person1> <http://xmlns.com/foaf/0.1/name> "Celia" }', function(err){

	   store.registerDefaultProfileNamespaces();

	   store.execute('SELECT * { ?s foaf:name ?name }', function(err,results) {
		   test.ok(results.length === 1);
		   test.ok(results[0].name.value === "Celia");
	   });
	});
});
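
A single default namespace can be registered in a similar way. A minimal sketch, assuming registerDefaultNamespace takes the same (prefix, uri) argument order as setPrefix:

store.registerDefaultNamespace('foaf', 'http://xmlns.com/foaf/0.1/');

// 'foaf:' can now be used in queries without an explicit PREFIX declaration
store.execute('SELECT * { ?s foaf:name ?name }', function(err, results) {
  // process results
});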

###JSON-LD Support

rdfstore-js implements parsers for Turtle and JSON-LD. The specification of JSON-LD is still an ongoing effort. You may expect to find some inconsistencies between this implementation and the actual specification.

var jsonld = {
  "@context":
  {
    "rdf": "http://www.w3.org/1999/02/22-rdf-syntax-ns#",
    "xsd": "http://www.w3.org/2001/XMLSchema#",
    "name": "http://xmlns.com/foaf/0.1/name",
    "age": {"@id": "http://xmlns.com/foaf/0.1/age", "@type": "xsd:integer" },
    "homepage": {"@id": "http://xmlns.com/foaf/0.1/homepage", "@type": "xsd:anyURI" },
    "ex": "http://example.org/people/"
  },
  "@id": "ex:john_smith",
  "name": "John Smith",
  "age": "41",
  "homepage": "http://example.org/home/"
};

store.setPrefix("ex", "http://example.org/people/");

store.load("application/ld+json", jsonld, "ex:test", function(err,results) {
  store.node("ex:john_smith", "ex:test", function(err, graph) {
	// process graph here
  });
});

###Events API

rdfstore-js implements an experimental events API that allows clients to observe changes in the RDF graph and receive notifications when parts of the graph change. The two main event functions are subscribe, which makes it possible to set up a callback function that will be invoked each time triples matching a certain pattern passed as an argument are added or removed, and startObservingNode, whose callback will be invoked with the updated version of the node each time triples are added to or removed from it.

var cb = function(event, triples){
  // it will receive a notification whenever a triple matching
  // the pattern s:http://example/book, p:*, o:*, g:*
  // is inserted or removed.
  if(event === 'added') {
	console.log(triples.length+" triples have been added");
  } else if(event === 'deleted') {
	console.log(triples.length+" triples have been deleted");
  }
}

store.subscribe("http://example/book",null,null,null,cb);


// .. do something;

// stop receiving notifications
store.unsubscribe(cb);

The main difference between both methods is that the subscribe callback receives only the triples that have changed, while the startObservingNode callback always receives the whole node with its updated triples. startObservingNode passes the node as an RDF Interfaces graph object.

var cb = function(node){
  // it will receive the updated version of the node each
  // time it is modified.
  // If the node does not exist, the graph received will
  // not contain triples.
  console.log("The node has now "+node.toArray().length+" nodes");
}

// if only two arguments are passed, the default graph will be used.
// A graph uri can be passed as an optional second argument.
store.startObservingNode("http://example/book",cb);


// .. do something;

// stop receiving notifications
store.stopObservingNode(cb);

In the same way, the startObservingQuery and stopObservingQuery functions make it possible to set up callbacks for whole SPARQL queries. The store will try to be smart and avoid unnecessary re-evaluations of these queries after quad insertions/deletions. Nevertheless, overly broad queries must be used carefully with the events API.
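
A minimal sketch, assuming startObservingQuery takes the query string and a callback invoked with the updated result bindings, and that stopObservingQuery unregisters that callback:

var queryCb = function(results) {
  // invoked with the current results every time the answer to the query changes
  console.log("the query now returns " + results.length + " rows");
};

store.startObservingQuery("SELECT * { ?s ?p ?o }", queryCb);

// .. do something;

// stop receiving notifications
store.stopObservingQuery(queryCb);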

###Custom Filter Functions

Custom filter functions can be registered in the store using the registerCustomFunction function. This function receives two arguments: the name of the custom function and the associated implementation. These functions will be available in a SPARQL query using the custom prefix. You can also use a full URI to identify the function being registered. The function implementation will receive two arguments: an object linking to the store's query filters engine, and a list with the actual arguments. Arguments will consist of literal or URI objects. Results from the function must also be literal or URI objects.

The query filters engine can be used to access auxiliary functions: to transform literals into JavaScript types using the effectiveTypeValue function, to compute boolean values using effectiveBooleanValue, to build boolean literal objects (ebvTrue, ebvFalse), or to return an error with ebvError. Documentation and source code for the QueryFilters object in the 'js-query-engine' module can be consulted for additional helper functions.

The following test shows a simple example of how custom functions can be invoked:

new rdfstore.Store({name:'test', overwrite:true}, function(err,store) {
    store.load(
        'text/n3',
        '@prefix test: <http://test.com/> .\
         test:A test:prop 5.\
         test:B test:prop 4.\
         test:C test:prop 1.\
         test:D test:prop 3.',
        function(err) {

            // instead of 'my_addition_check' a full URI can be used: 'http://test.com/my_fns/my_addition_check'
            store.registerCustomFunction('my_addition_check', function(engine,args) {
                // equivalent to var v1 = parseInt(args[0].value), v2 = parseInt(args[1].value);
                var v1 = engine.effectiveTypeValue(args[0]);
                var v2 = engine.effectiveTypeValue(args[1]);

                // equivalent to return {token: 'literal', type:"http://www.w3.org/2001/XMLSchema#boolean", value:(v1+v2<5)};
                return engine.ebvBoolean(v1+v2<5);
            });

            store.execute(
                'PREFIX test: <http://test.com/> \
                 SELECT * { ?x test:prop ?v1 .\
                            ?y test:prop ?v2 .\
                            filter(custom:my_addition_check(?v1,?v2)) }',
                function(err, results) {
                    test.ok(results.length === 3);
                    for(var i=0; i<results.length; i++) {
                        test.ok(parseInt(results[i].v1.value) + parseInt(results[i].v2.value) < 5);
                    }
                    test.done();
                }
            );
        });
});

###Persistence

The store can be persisted in the browser using IndexedDB as the backend. In order to make the store persistent, the 'persistent' flag must be set to true in the store creation options. Additionally, a 'name' option can also be passed for the store. Different persistent instances of the store can be opened using different names.
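
A minimal sketch of creating a persistent store with these options, following the constructor style used elsewhere in this README (the store name 'my_app_store' is just an example):

new rdfstore.Store({persistent: true, name: 'my_app_store'}, function(err, store) {
  if(!err) {
    // quads inserted here are stored in IndexedDB and survive page reloads
  }
});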

###Controlling the frequency of function yielding

The performance of the store can be improved by reducing the frequency with which the 'nextTick' mechanism is used to unwind the call stack. You can reduce this frequency by invoking the yieldFrequency function on the Store object and passing a bigger number:

var rdfstore = require('rdfstore')
rdfstore.Store.yieldFrequency(200); // will only yield after 200 invocations of nextTick

If the number is too big, it can produce stack overflow errors during execution. If you find this problem, reduce the value provided to yieldFrequency.

##Dependencies

The library includes dependencies on two Semantic Web libraries for parsing:

  • N3.js library, developed by Ruben Verborgh and released under the MIT license.

  • jsonld, developed by Digital Bazaar and released under the New BSD license.

##Frontend

A stand-alone frontend for the store, built using Electron, has been added in version 0.9.7. You can build the frontend by running the command:

$ gulp frontend

The file will be added under the releases directory.

##Contributing

rdfstore-js is still at the beginning of its development. If you take a look at the library and find a way to improve it, please ping us. We'll be very grateful for any bug report or pull request.

Author

Antonio Garrote, email:[email protected], twitter:@antoniogarrote.

License

Licensed under the MIT License, copyright Antonio Garrote 2011-2015

rdfstore-js's Issues

build error N3Lexer.js

Just trying to build the code. It runs but seems to fail as shown below

 $ ./make.rb browser
  BROWSER CONFIGURATION
*** loading configuration
*** building distribution directory
(!) dist directory already exits
*** processing ./src/js-trees/src/utils.js
 * modifying: exports.Utils = {};
 -> var Utils = {};
 * ignoring: var Utils = exports.Utils;
*** processing ./src/js-trees/src/in_memory_b_tree.js
 * modifying: exports.InMemoryBTree = {};
 -> var InMemoryBTree = {};
 * ignoring: var InMemoryBTree = exports.InMemoryBTree;
*** processing ./src/js-rdf-persistence/src/quad_index_common.js
 * modifying: exports.QuadIndexCommon = {};
 -> var QuadIndexCommon = {};
 * ignoring: var QuadIndexCommon = exports.QuadIndexCommon;
*** processing ./src/js-rdf-persistence/src/quad_index.js
 * modifying: exports.QuadIndex = {};
 -> var QuadIndex = {};
 * ignoring: var QuadIndex = exports.QuadIndex;
 * writing right MemoryTree
 * ignoring: var Utils = require("./../../js-trees/src/utils").Utils;
 * ignoring: var QuadIndexCommon = require("./quad_index_common").QuadIndexCommon;
*** processing ./src/js-rdf-persistence/src/quad_backend.js
 * modifying: exports.QuadBackend = {};
 -> var QuadBackend = {};
 * ignoring: var QuadBackend = exports.QuadBackend;
 * ignoring: var Utils = require("./../../js-trees/src/utils").Utils;
 * ignoring: var QuadIndexCommon = require("./quad_index_common").QuadIndexCommon;
 * ignoring: var QuadIndex = require("./quad_index").QuadIndex;
*** processing ./src/js-rdf-persistence/src/lexicon.js
 * modifying: exports.Lexicon = {};
 -> var Lexicon = {};
 * ignoring: var Lexicon = exports.Lexicon;
 * ignoring: var QuadIndexCommon = require("./quad_index_common").QuadIndexCommon;
*** processing ./src/js-communication/src/ajax_transport.js
 * modifying: exports.NetworkTransport = {};
 -> var NetworkTransport = {};
 * ignoring: var NetworkTransport = exports.NetworkTransport;
*** processing ./src/js-communication/src/jsonld_parser.js
 * ignoring: var Utils = require("./../../js-trees/src/utils").Utils;
 * modifying: exports.JSONLDParser = {};
 -> var JSONLDParser = {};
 * ignoring: var JSONLDParser = exports.JSONLDParser;
*** processing ./node_modules/n3/lib/N3Lexer.js
./make.rb:516:in `initialize': No such file or directory - ./node_modules/n3/lib/N3Lexer.js (Errno::ENOENT)
    from ./make.rb:516:in `open'
    from ./make.rb:516:in `block (2 levels) in process_files_for_browser'
    from ./make.rb:514:in `each'
    from ./make.rb:514:in `block in process_files_for_browser'
    from ./make.rb:501:in `open'
    from ./make.rb:501:in `process_files_for_browser'
    from ./make.rb:556:in `make_browser'
    from ./make.rb:598:in `<main>'

As I am new to JS and ruby I am not sure how problematic this is.
Btw. you may want to add a link to the ruby gem ( for people like me )

Stand-alone SPARQL end-point- access remotely in javascript

I could run rdfstore-js as a stand-alone SPARQL endpoint. The example in the README file sends a SPARQL query using the command line program curl:

curl -v -d "default-graph-uri=http://test.com/graph1" --data-urlencode "query=select * { ?s ?p ?o } limit 3" -H "Accept: application/rdf+xml" http://localhost:8080/sparql

I need to access it remotely using JavaScript in a browser, not from a command line program. The examples in the README do not show how to access a remote SPARQL endpoint in JavaScript; they just show how to create local rdfstores and query them. Can anyone help me with this issue?
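
As a rough sketch (this is plain HTTP against the SPARQL protocol, not an rdfstore-js API), the curl request above could be reproduced from a modern browser like this, assuming the endpoint is reachable from the page and allows cross-origin requests:

var endpoint = 'http://localhost:8080/sparql';
var query = 'SELECT * { ?s ?p ?o } LIMIT 3';

fetch(endpoint + '?query=' + encodeURIComponent(query), {
  headers: { 'Accept': 'application/rdf+xml' } // same Accept header as in the curl example
}).then(function(response) {
  return response.text();
}).then(function(body) {
  console.log(body); // raw response from the endpoint
});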

Maybe implement and extend the evented API in node.js over redis pub/sub

I think the evented API is an awesome feature in rdfstore-js and it is even more awesome in an environment like node. :D Notifications when some data has changed, or even when it gets accessed from the store in "realtime", are getting more and more important for enterprises.
If we could implement and further advance the Events API in rdfstore-js over redis pub/sub it would be easier to build software and tools on top of a triple store. It would also lead to independent and self-contained small software parts, which in turn can be scaled better and developed independently from each other.

Add some array methods to RDF Interfaces

IMHO this should be added to the RDF Interfaces spec but I don't know where to add that, so I'll put it here for the moment :)

I think it would make sense to add them to the Graph object, which only provides forEach and some functions so far.
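
As a stopgap with the current API, the graph can be converted to a plain array with toArray() and the usual Array methods used from there; a small example, assuming a store and graphUri as in the README sections above:

store.graph(graphUri, function(err, graph) {
  var names = graph.toArray()
    .filter(function(t) { return t.predicate.valueOf() === 'http://xmlns.com/foaf/0.1/name'; })
    .map(function(t) { return t.object.valueOf(); });
  console.log(names);
});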

Querying a set of named RDF graphs without naming the graphs

I'm prototyping with the persistent node/mongo store and have a data model that makes use of rdfg:subGraphOf predicates to relate graphs. I want to query across a set of graphs without explicitly naming the graphs in a manner described in http://www.snee.com/bobdc.blog/2009/03/querying-a-set-of-named-rdf-gr.html, but my queries return [].

I'm still ramping up on the whole rdf/sparql space but I guess the issue pertains to the definition of the 'default graph'?

In the node/mongo combination the default seems to be the empty set; SELECT * { ?s ?p ?o } and SELECT DISTINCT ?g WHERE { GRAPH ?g { ?s ?p ?o . } } both return [].

I'm wondering if there is another way to query multiple graphs with query-derived graph names in the node/mongo configuration?

Thx,

(W3C) RDF JavaScript Libraries Community Group

Hi @antoniogarrote and @ALL

A while ago a few folks started this community group: http://www.w3.org/community/rdfjs/

I see it as a great place to coordinate all our efforts around RDF-related libraries in JavaScript! Currently 26 people participate in it, including @RubenVerborgh and many others who maintain relevant libraries.

I would like to encourage you to check out mailing list archive:
http://lists.w3.org/Archives/Public/public-rdfjs/

Still very young wiki:
http://www.w3.org/community/rdfjs/wiki/Main_Page

Hope you'll find it an interesting community to collaborate with 😄

Does not work with latest node.js unstable

I'm trying the following code : http://pastebin.com/5nPJRqpe
Error logs :

B:\eclipse\workspace\tests>node distant-rdf.js

node.js:207
        throw e; // process.nextTick error, or 'error' event on first tick
              ^
Error: No such module
    at Object.<anonymous> (B:\eclipse\workspace\tests\node_modules\rdfstore\node
_modules\webworker\lib\webworker.js:35:26)
    at Module._compile (module.js:432:26)
    at Object..js (module.js:450:10)
    at Module.load (module.js:351:31)
    at Function._load (module.js:310:12)
    at Module.require (module.js:357:17)
    at require (module.js:368:17)
    at B:\eclipse\workspace\tests\node_modules\rdfstore\index.js:35472:14
    at Object.<anonymous> (B:\eclipse\workspace\tests\node_modules\rdfstore\inde
x.js:36637:2)
    at Module._compile (module.js:432:26)

Versions :
node : 0.5.8
npm : 1.0.97
rdfstore: 0.4.2

Not a javascript (nor node) pro !

Thanks !

SPARQL parser cannot handle escaped backslashes

After figuring out my workaround for #54, I encountered new problems with string escaping.

Consider this triple, stored in a text file:

<http://foo> <http://bar> "b\\a\\z" .

loaded like this:

var triple;
$.get("sample.n3", function(txt){ triple = txt });

then added to a store like this:

store.load('text/turtle', triple, function(){})

and can be retrieved like this:

store.execute(
  'SELECT * WHERE {<http://foo> <http://bar> ?o}',
  function(e, rows){console.log(rows[0].o.value)}
)
">> b\a\z"

There is no way whatsoever to delete this data with SPAR*L:

store.execute('DELETE DATA {' + triple + '}', function(){})
// SyntaxError: Expected [2] Query or [30] Update but "D" found."

I currently have no workaround for this, other than replacing all of the users' \ with /, which is not really acceptable, but good enough to go forward with.

?x ?p ?x issue

I'm trying to write some queries that match triples having the same resource for the subject and object, i.e. "WHERE {?x ?p ?x}". In the examples below (one a CONSTRUCT, the other a SELECT) there are no triples that match this pattern, and yet the queries are returning results.

Am I missing something here or is this a bug?

thanks!

require('rdfstore').create(function(store) {
    store.load(
        'text/n3', 
        '<http://A> <http://B> <http://C>.',
        function(success) {
            store.execute(
                'CONSTRUCT { ?x ?p ?x } WHERE { ?x ?p ?x }',
                function(success, results) {
                    results.triples.forEach(function(result){
                       console.log(result.subject.nominalValue,
                                    result.predicate.nominalValue,  
                                    result.object.nominalValue);
                       });
                 }
            );
        }); 
 });

// => http://C http://B http://C

require('rdfstore').create(function(store) {
    store.load(
        'text/n3', 
        '<http://A> <http://B> <http://C>.',
        function(success) {
            store.execute(
                'SELECT * WHERE { ?x ?p ?x }',
                function(success, results) {
                    results.forEach(function(result){
                       console.log(result.x.value,
                                    result.p.value);
                       });
                 }
            );
        }); 
});

// => http://C http://B

No support for casting in FILTER -- e.g. xsd:dateTime(?s)

I'm having trouble with queries that attempt to cast a plain literal as an xsd:dateTime. For example:

PREFIX rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#>
PREFIX sp: <http://smartplatforms.org/terms#>
PREFIX dcterms: <http://purl.org/dc/terms/>
SELECT  ?fill_date
WHERE {
  ?m rdf:type sp:Medication .
  ?m sp:fulfillment ?fill.
  ?f dcterms:date ?fill_date.  
  FILTER( xsd:dateTime(?fill_date) > "2009-01-01T00:00:00Z"^^xsd:dateTime )
}

Make the graph rdf-interface api available to the store

For now, to get a handle to the rdf-interface you need to either query the store or use the graph() or node() methods. These will return a graph object (snapshot) with the rdf-interface api. I would like to have the rdf-interface api available on the whole store, without creating snapshots and using callbacks.

The filter function could use the indexes that are available in the store backend

Cannot connect to MongoDB Backend with Auth

I'm unable to connect to a MongoDB backend using a username/password, provided in the standard Mongo URL format. The following error shows up:

Attempting to create store...

node.js:201
        throw e; // process.nextTick error, or 'error' event on first tick
              ^
TypeError: Cannot read property 'arbiterOnly' of undefined
    at /Users/robbinsd/Dropbox/Repos/rdfstorenode/node_modules/rdfstore/node_modules/mongodb/lib/mongodb/connection/server.js:552:22
    at [object Object].checkoutReader (/Users/robbinsd/Dropbox/Repos/rdfstorenode/node_modules/rdfstore/node_modules/mongodb/lib/mongodb/connection/server.js:569:16)
    at /Users/robbinsd/Dropbox/Repos/rdfstorenode/node_modules/rdfstore/node_modules/mongodb/lib/mongodb/db.js:1249:79
    at Db._executeQueryCommand (/Users/robbinsd/Dropbox/Repos/rdfstorenode/node_modules/rdfstore/node_modules/mongodb/lib/mongodb/db.js:1456:5)
    at Cursor.nextObject (/Users/robbinsd/Dropbox/Repos/rdfstorenode/node_modules/rdfstore/node_modules/mongodb/lib/mongodb/cursor.js:446:13)
    at Array.0 (/Users/robbinsd/Dropbox/Repos/rdfstorenode/node_modules/rdfstore/node_modules/mongodb/lib/mongodb/cursor.js:164:12)
    at EventEmitter._tickCallback (node.js:192:40)

See this gist for my source code: https://gist.github.com/2001409

GeoSpatial queries

We've done a SPARQL based POC using the node/mongodb version of rdfstore, and it would be really nice if we could augment our POC with range & distance based geospatial queries. In looking through the recent commit history I see support for custom filter functions and I think this might be a reasonable way to achieve this result for a POC, but I don't have the resources to add this support in rdfstore, or the time to move to some other solution. I'm wondering if there is anyone with the time and expertise willing to add limited geo query support to the nodejs version rdfstore for a fee?
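
As a very rough sketch of that idea, a naive distance check could be exposed through the custom filter function mechanism documented earlier in this README. It computes a haversine distance in plain JavaScript and will not use any backend index, so it only illustrates the approach (the function name and argument layout are made up for this example, and args is assumed to be a plain array of literal objects):

// custom:within_km(?lat1, ?lon1, ?lat2, ?lon2, ?km) -> boolean literal
store.registerCustomFunction('within_km', function(engine, args) {
  var v = args.map(function(a) { return parseFloat(a.value); });
  var toRad = function(d) { return d * Math.PI / 180; };
  var dLat = toRad(v[2] - v[0]), dLon = toRad(v[3] - v[1]);
  var a = Math.sin(dLat/2) * Math.sin(dLat/2) +
          Math.cos(toRad(v[0])) * Math.cos(toRad(v[2])) *
          Math.sin(dLon/2) * Math.sin(dLon/2);
  var km = 6371 * 2 * Math.atan2(Math.sqrt(a), Math.sqrt(1 - a));
  return engine.ebvBoolean(km < v[4]);
});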

Enhancement request: more metadata about query results

When executing a SPARQL select query via Store.execute, the results come back as an array of object literals, where each key is one of the variables in the select. This makes two use cases harder, where the UI allows users to write or edit their own queries:

  • when presenting results to the user, it is harder to know which variables were part of the original query, as variables that occur in OPTIONAL clauses do not necessarily appear as keys in the return values;
  • when presenting results to the user, it is harder to list the variables in the same order as the user articulated in the original query, so the user has to work harder to relate the results to the query they wrote.

Both of these use cases could be addressed by providing a more structured return value which includes some metadata (e.g. an ordered list of the variables bound by the query) as well as the result bindings.

An alternative would be to allow the Store object to expose the SparqlParser object, so that we could at least re-parse the user's query to extract bound variables, or other structural features.

How-to: Serialize results to rdf/xml, turtle, json-ld or N-triples format

Hi!

I need to serialize graph (result from CONSTRUCT or SELECT query) in different RDF serialization formats: rdf/xml, turtle, json-ld and N-triples.
I use the graph.toNT() method to get the results in N-Triples; what about RDF/XML and the other formats?

Is there some way to serialize the results in other formats?

Thanks, all the best!

Network error

I'm trying to run latest build in browser with "Loading remote graphs" example and it gives me following error:
rdf_store_min.js:605 Error loading graph
rdf_store_min.js:605 Network error: [object Object]
Tried to run it in latest FF, Safari and Chrome (dev branch). Also tried to change the graph URI to a local one - this also doesn't help

Mongo persistence setup?

I'm trying out the MongoDb RDF persistence but can't seem to get any statements actually written into Mongo; my test script below. Is it necessary to run the SPARQL endpoint for persistence? The launch script for that isn't included with the npm install distribution, and after building the node.js version (from zip) the script throws error cannot find module './index'.

Any help appreciated :)

var rdfstore = require('rdfstore');

new rdfstore.Store({
    persistent: true,
    engine: 'mongodb',
    name: 'mwa',              // quads in MongoDB will be stored in a DB named mwa
    overwrite: false,         // delete all the data already present in the MongoDB server
    mongoDomain: 'localhost', // location of the MongoDB instance, localhost by default
    mongoPort: 27017          // port where the MongoDB server is running, 27017 by default
}, function( store ){

    store.execute('SELECT * { ?s ?p ?o }', function(success, results) {
        console.log('\n\nFIRST SELECT returned ' + success + ' and results ' + JSON.stringify(results) );
        if ( results.length == 0 ) {
            // try an insert and then a query
            store.execute('INSERT DATA { <http://example/book3> <http://example.com/vocab#title> <http://test.com/example> }', function(result, msg){
                console.log('\n\nINSERT returned ' + result + ' and msg ' + JSON.stringify(msg) );
                store.execute('SELECT * { ?s ?p ?o }', function(success, results) {
                    console.log('select returned ' + success + ' and results ' + JSON.stringify(results) );
                });
            });
        }
    });

});

Malfunction of Store.Store.prototype.insert() if no graph to insert to is defined.

The insert() of Triples does not work if there is no graph defined to insert into.

if(graph != null) {
    query = "INSERT DATA { GRAPH " + this._nodeToQuery(graph) +" { "+ query + " } }";
} else {
    query = "INSERT DATA { " + this._nodeToQuery(graph) +" { "+ query + " }";
}

The second query is wrong. I did try to change to:

query = "INSERT DATA { " + query + " }";

But without luck, this query is not working either.

SPARQL DELETE does not delete everything

Given the dataset in the following turtle snippet:

https://gist.github.com/castanea/8009815

When running the following DELETE-queries:

DELETE WHERE { GRAPH <http://webidp.local/idp> {  <http://webidp.local/certs#2111edf479941934cb7b0fd437eea5e207901ae7> ?p ?o . } }
DELETE WHERE { GRAPH <http://webidp.local/idp> { <http://webidp.local/webids/test/a7937b64b8caa58f03721bb6bacf5c78cb235febe0e70b1b84cd99541461a08e> ?p ?o . } }
DELETE WHERE { GRAPH <http://webidp.local/idp> {  ?user <http://webidp.local/vocab#webID> <http://webidp.local/webids/test/a7937b64b8caa58f03721bb6bacf5c78cb235febe0e70b1b84cd99541461a08e> . } }

Not all the data expected gets deleted, the following triple remains in store:

<http://webidp.local/certs#2111edf479941934cb7b0fd437eea5e207901ae7> <http://webidp.local/vocab#base64DER> "MIIEYzCCA0ugAwIBAgIUIRHt9HmUGTTLew/UN+6l4geQGucwDQYJKoZIhvcNAQELBQAwgYQxLTArBgNVBAoTJEJlcm5lIFVuaXZlcnNpdHkgb2YgQXBwbGllZCBTY2llbmNlczEvMC0GA1UECxMmRW5naW5lZXJpbmcgYW5kIEluZm9ybWF0aW9uIFRlY2hub2xvZ3kxCzAJBgNVBAYTAkNIMRUwEwYDVQQDEwxCRkggV2ViSUQgQ0EwHhcNMTMxMjAyMTM0ODMzWhcNMTQxMjAyMTM0ODMzWjBUMRYwFAYDVQQDEw1KdXN0dXMgVGVzdHVzMQswCQYDVQQGEwJDSDEtMCsGA1UEChMkQmVybmUgVW5pdmVyc2l0eSBvZiBBcHBsaWVkIFNjaWVuY2VzMIIBIjANBgkqhkiG9w0BAQEFAAOCAQ8AMIIBCgKCAQEA3S/I/LvgOlMHEGi4NL1xpcEdLx0YH7zxMCtN+Fl6re6eYdXd/b5iOmCnjr0WC1Mg6KfEOWwMtila4zx3akn9DcFd53aULDVcfG4vpSogxeJJtWbzTuzqtU3R8DWeRaZf3+TpgWY/tZrYdiuNVoTAbV0EytOtiKZuMRzkaOXPhUjtcc+4nlsuQjEvC3OxEEzhFyNVDznP9TkcAglPwnuhYeaQZrBwaobDm7GyajB3K6mZzyK5brHK6e20yFBSNFWK9mjrp7nIGHHgcPP9+TzfOuY+3XrNY2rBFYHkXZxN0aK0Cppq80f0ZRmzPfasbeh65xiFy5IMDlXCNINo8LN8wQIDAQABo4H7MIH4MAkGA1UdEwQCMAAwCwYDVR0PBAQDAgeAMB0GA1UdJQQWMBQGCCsGAQUFBwMCBggrBgEFBQcDBDCBjAYDVR0RAQH/BIGBMH+GZ2h0dHBzOi8vbG9jYWxob3N0Ojg0NDMvaWQvdGVzdC9hNzkzN2I2NGI4Y2FhNThmMDM3MjFiYjZiYWNmNWM3OGNiMjM1ZmViZTBlNzBiMWI4NGNkOTk1NDE0NjFhMDhlI3Byb2ZpbGWBFGp1c3R1cy50ZXN0dXNAYmZoLmNoMBEGCWCGSAGG+EIBAQQEAwIHgDAdBgNVHQ4EFgQUlfYRktGPiLrfuMHbLWYXq6RVx7QwDQYJKoZIhvcNAQELBQADggEBAFEkViztWGOAQHdMKoMfSIQYN0Mpyhpzh2zD0dkwHIRH0J1KC3mWHgp4T5e5VW9HSbxI/ROfp4FFIjcLszL8vSQjMSwjyt7P9Ca0fNRLlaCDIswZjjAxGxAyd8PqYspRpQYdSpneYJF0PC/CkyNbWII0QxjsjtDBUCvZqZK965OFpq2sCvo3aa0vOmhdjYjHSFVWdSXf4RSb61bL14sYzb0TT2kMdVNibsRszcRCiCRfH1Jk5pj2MqShV2+7OXzC2YKAuJmbrhOCmOE06FaRBJbZU9TpoBWvEEOJGl7SI4n7VBBlRr8CjS2uANFlj1J8Gs5UYxXDaf6fV4pgQxASspw="^^<http://www.w3.org/2001/XMLSchema#base64Binary> . 

Besides this one, three other triples remain, which is correct and on purpose.

I verified the behaviour several times with different datasets (looking the same in terms of lengths etc.) and think that it is related to the size of the subject of the triple in question. The same behaviour seems to occur when not specifying the type base64Binary for it.

Additionally, I ran a test using Fuseki with the exact same data and get the expected result (the triple given above beeing deleted as well).

close method

A close method, finishing the execution of the store and releasing all the resources being used, must be implemented.

env is undefined in line 5275 (better error message?)

I am trying to load a JSON-LD file into the store with

store.load("application/ld+json", data, function(success, results) {

and I have included these two JS files:

  http://code.jquery.com/jquery-1.9.1.min.js
  https://raw.github.com/antoniogarrote/rdfstore-js/master/dist/browser/rdf_store.js

but I got the following error: is it possible to get a better error message?

rdf_store.js (line 5275)
env is undefined

rdf_store.js (line 5276)
Utils.lexicalFormBaseUri@https://raw.github.com/antoniogarrote/rdfstore-js/master/dist/browser/rdf_store.js:391
Utils.lexicalFormTerm@https://raw.github.com/antoniogarrote/rdfstore-js/master/dist/browser/rdf_store.js:404
jsonld.toTriples/callback@https://raw.github.com/antoniogarrote/rdfstore-js/master/dist/browser/rdf_store.js:2701
jsonld.toTriples@https://raw.github.com/antoniogarrote/rdfstore-js/master/dist/browser/rdf_store.js:2748
JSONLDParser.parser.parse@https://raw.github.com/antoniogarrote/rdfstore-js/master/dist/browser/rdf_store.js:5150
RDFLoader.RDFLoader.prototype.tryToParse@https://raw.github.com/antoniogarrote/rdfstore-js/master/dist/browser/rdf_store.js:5267
Store.Store.prototype.load@https://raw.github.com/antoniogarrote/rdfstore-js/master/dist/browser/rdf_store.js:31581
loadData/<@file:///User
b.Callbacks/c@http://code.jquery.com/jquery-1.9.1.min.js:3
b.Callbacks/p.fireWith@http://code.jquery.com/jquery-1.9.1.min.js:3
k@http://code.jquery.com/jquery-1.9.1.min.js:5
.send/r@http://code.jquery.com/jquery-1.9.1.min.js:5

new fast turtle parser (npm n3)

http://ruben.verborgh.org/blog/2013/04/30/lightning-fast-rdf-in-javascript/

"Node.js has spawned a new, asynchronous generation of tools. Asynchronous thinking is different from traditional stream processing: instead of actively waiting for data in program routines, you write logic that acts when data arrives. JavaScript is an ideal language for that, because callback functions are lightweight. I have written a parser for Turtle, an RDF serialisation format, that uses asynchrony for maximal performance."

Turtle parser fails for large files

After reading that you loaded 100,545 triples into memory for the LUBM benchmark I wanted to try loading my own data into rdfstore. However, store.load fails for a large Turtle file with more than 3,000 triples.
The general structure of my data looks like this:

@prefix cal: <http://127.0.0.1:5984/cal/> .
@prefix verb: <http://127.0.0.1:5984/cal/verb/> .
cal:d31615229aa3c0fd05b1c5c25c46a987 verb:adjclose "10829.68" ;
    verb:close "10829.68" ;
    verb:date "2010-10-01" ;
    verb:high "10907.41" ;
    verb:low "10759.14" ;
    verb:open "10789.72" ;
    verb:volume "4298910000" .

Smaller subsets of the data with 100, 300, and 1,000 triples are imported flawlessly in approximately 1 sec, 5 secs, and 17 secs, respectively. For 3,000 triples and more, both Node and Chrome stay unresponsive. I tried to track down the problem and it seems that the process gets stuck in the parse method of TurtleParser.

Maybe an event-based parser or chunking would help. In the meantime, could loading the data as JSON be an intermediate solution?
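
As a rough sketch of the chunking idea (not an rdfstore-js feature), the Turtle text could be split into statements and loaded in batches, yielding to the event loop between batches. The naive split below breaks on literals that contain a dot followed by a newline, so it is only illustrative:

function loadInChunks(store, turtle, chunkSize, done) {
  // keep @prefix declarations and prepend them to every chunk
  var statements = turtle.split(/\.\s*\n/).filter(function(s) { return s.trim().length > 0; });
  var prefixes   = statements.filter(function(s) { return /^\s*@prefix/.test(s); }).join(' .\n');
  var data       = statements.filter(function(s) { return !/^\s*@prefix/.test(s); });

  (function next(i) {
    if(i >= data.length) { return done(null); }
    var chunk = (prefixes ? prefixes + ' .\n' : '') + data.slice(i, i + chunkSize).join(' .\n') + ' .';
    store.load('text/turtle', chunk, function(err) {
      if(err) { return done(err); }
      setTimeout(function() { next(i + chunkSize); }, 0); // yield between batches
    });
  })(0);
}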

ACID support

Are there any chances to get a full ACID version of rdfstore-js?
MongoDB itself doesn't support ACID transactions so maybe using a different DB could work?

null or xsd:string ?

In QueryFilter there is a question:

       if(builtincall === 'str') {
            if(ops[0].token === 'literal') {
                // lexical form literals
                return {token: 'literal', type:null, value:""+ops[0].value}; // type null? or "http://www.w3.org/2001/XMLSchema#string"
            } else if(ops[0].token === 'uri'){
                // codepoint URIs
                return {token: 'literal', type:null, value:ops[0].value}; // idem
            } else {
                return QueryFilters.ebvFalse();
            }
        }

I'd say it should be xsd:string. There is no literal without a type URI anymore.

BUG: inserting triples in JSON-LD format is broken

In Utils.lexicalFormBaseUri = function(term, env)

env is undefined

when in jsonld.toTriples a callback is constructed where no env is passed to lexicalFormBaseUri

as s, p, o are already regular URIs ( -> jsonld.normalize).

It takes "years" to load ~10MB of triples

Hi!

I tried to load a (local) file with about 10MB of N-Triples but it takes so much time. Actually it never finishes...
Loading of 4K triples takes about 1 minute.

Where can be the problem? Thanks!

Literals with escaped characters loaded from N3 cannot be referenced with SPAR(Q|U)L

When literals get OIDs from being loaded from N3, they will be added to the literalsToOID hash with any escaped characters (\n, \t, \", \b, \r, \\) de-referenced.

Consider this triple, stored in a text file:

<http://foo> <http://bar> "baz\nbaz" .

loaded like this:

var triple;
$.get("sample.n3", function(txt){ triple = txt });

then added to a store like this:

store.load('text/turtle', triple, function(){})

There is no way whatsoever to delete this data with SPAR*L:

store.execute('DELETE DATA {' + triple + '}', function(){})
">> Error unregistering quad"
">> Cannot read property '1' of undefined"
">> ERROR unregistering quad"

Whereas if the same triple is loaded via SPAR*L, it's fine:

store.execute('INSERT DATA {' + triple + '}', function(){})
store.execute('DELETE DATA {' + triple + '}', function(){})

My work around is to manually transform incoming n3 to SPAR*L. I am not supporting the python-like """ string format, but it is good enough, I guess, as I control how they are constructed, but this seems like a pretty tough thing to handle if you don't control both sides.

Naming convention about blank node identifier

Hello,

rdfstore.js sets the identifier for a blank node by adding the prefix ":" to a blankNodeCounter, e.g. ":1".
After getting the query result from rdfstore.js, I intend to use the jsonld module to generate a JSON-LD object. Unfortunately, this kind of blank node identifier is not supported by json-ld. I read both specifications and couldn't find anything about the naming convention for blank nodes. I do not understand why JSON-LD can only read blank node identifiers like "_:b121", meaning that after the prefix there must be a character and then a number series.

I am creating this issue to ask whether rdfstore could support this kind of naming convention for blank nodes in the DB as well.

Best regards,
Kevin

Make it possible to load data from remote sparql endpoints via construct queries

So when you do:

rdfstore.create(function(store) {
  store.load('sparql', {
    endpoint: 'http://dbpedia.org/sparql',
    accept: 'text/rdf+n3',
    query : 'CONSTRUCT {<http://dbpedia.org/resource/Resource_Description_Framework> ?p ?o . } WHERE {<http://dbpedia.org/resource/Resource_Description_Framework> ?p ?o } LIMIT 100'
  }, function(success, results) {
  }
});

The contents get loaded into the given graph

Turtle Parser - Fails on comment

I have noticed that the Turtle parser fails when a comment is present in the .n3 document.

The Turtle language spec allows for comments denoted by a "#" like so:

@prefix : <http://test.test#> .
@prefix foaf: <http://xmlns.com/foaf/0.1/> .

# This line is a comment

:me a foaf:person;
    foaf:name "David" ;
    .

Loading this document logs error:

parsing error with mime type : SyntaxError: Expected "@base", "@prefix", [145] WS or [54] TriplesBlock but "D" found.

If the comment sits after the first triple, this similar error is logged instead:

parsing error with mime type : SyntaxError: Expected "@base", "@prefix" or [54] TriplesBlock but "A" found.

Finally, if the comment sits at the top of the document, the error is:

parsing error with mime type : SyntaxError: Expected "@base", "@prefix", [145] WS or [54] TriplesBlock but "x" found.

Comments at the end of the document cause no error.

I will be addressing this issue in the next few days, but I just wanted to log it here since you're much more familiar with the code than I :D

Thank you for the license change - really enjoying the library!
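
Until the parser handles comments, a naive preprocessing workaround (not part of rdfstore-js) is to drop comment-only lines before calling store.load. This sketch only removes lines whose first non-whitespace character is '#', so trailing comments and '#' characters inside IRIs or literals are left untouched; turtleDocument stands for the Turtle text being loaded:

function stripTurtleCommentLines(turtle) {
  return turtle.split('\n')
    .filter(function(line) { return line.trim().indexOf('#') !== 0; })
    .join('\n');
}

store.load('text/turtle', stripTurtleCommentLines(turtleDocument), function(err) {
  // ...
});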

Support for custom functions

Is there currently support for custom functions in SPARQL queries? If yes, how are these defined; if no, are they on the roadmap?

IndexedDB Backend

rdfstore-js is great

An IndexedDB backend would be nice, given the storage limitations associated with LocalStorage (e.g. 5 Mb hard limit in Chrome).

<literal>> (badly-formed endpoint output XML)

In the SPARQL endpoint output there is a badly-formed XML representation of literals (<literal>>). The problem is caused by this code:

nextResult = nextResult+"<literal>";
if(result[p].lang != null ) {
    nextResult = nextResult + ' xml:lang="'+result[p].lang+'" ';
}
if(result[p].type != null ) {
    nextResult = nextResult + ' datatype="'+result[p].type+'" ';
}
nextResult = nextResult+">"+Server.xmlEncode(result[p].value)+"</literal>";

in the files /dist/nodejs/server.js and /src/js-store/src/server.js. The problem could probably be solved by removing the > from the first line:

nextResult = nextResult+"<literal";
