
Comments (6)

mbdavid avatar mbdavid commented on May 19, 2024

Hello @sherry-ummen! It's easy to change this limit from 1MB to 16MB; you just need to change it here:

https://github.com/mbdavid/LiteDB/blob/master/LiteDB/Document/BsonDocument.cs#L14
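For illustration, this is the kind of one-line change meant here, as a sketch only; the actual constant name and surrounding code in BsonDocument.cs may differ between versions:

```csharp
// LiteDB/Document/BsonDocument.cs (sketch; the constant name is an assumption)
public class BsonDocument : BsonValue
{
    // Default limit is 1MB; raising it to 16MB trades memory for capacity
    public const int MAX_DOCUMENT_SIZE = 16 * 1024 * 1024; // was 1024 * 1024
}
```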

But...

The question is: why are your documents so big? I think 1MB is already a really big document, and I always try to keep mine under 100KB.
Big documents consume too much memory (the whole document must be loaded into memory) and are slow in read and write operations. Remember: LiteDB is an embedded database, so this memory is consumed on the "client" side.

  • Can you split your document into several smaller documents? You can use DbRef<T> to "join" them.
  • Does your document contain a file or a byte array? You can use FileStorage.
  • Does your document contain a big text? You can use FileStorage too.
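The first suggestion can be sketched as a model split; all class and property names below are illustrative assumptions, not from the LiteDB codebase:

```csharp
// Sketch: keep the heavy payload out of the main document via DbRef<T>
public class Drawing
{
    public int Id { get; set; }
    public string Name { get; set; }
    // Stored as a reference and loaded only when needed, instead of embedded
    public DbRef<GeometryData> Geometry { get; set; }
}

public class GeometryData
{
    public int Id { get; set; }
    public byte[] Payload { get; set; }
}
```

This keeps the frequently-read document small while the heavy payload lives in its own collection.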

Take a look at the MongoDB documentation about data modeling. All MongoDB data modeling concepts are valid for LiteDB as well:
http://docs.mongodb.org/manual/core/data-modeling-introduction/

from litedb.

sherry-ummen avatar sherry-ummen commented on May 19, 2024

Thanks Mauricio.

Yes, the document is text and it's big: basically data related to graphical objects. It's legacy code which generates these big objects, so it's very difficult to change that behavior.

Why is it slow? Is it the serializer that is slow? We are currently using MongoDB, and if the document size is more than 16MB we store it as a blob.

But we want an embedded database, and LiteDB suits us best.


mbdavid avatar mbdavid commented on May 19, 2024

There is no problem serializing/deserializing big documents; LiteDB uses TextReader/TextWriter to avoid performance problems. But documents are treated as a single unit, so when you need to read a big document you must read all of its data pages, store them all in memory (CacheService), and deserialize all the bytes. Saving has the same problem: even a minimal change must serialize the whole document and rewrite all of its pages.

FileStorage (like MongoDB GridFS) works by splitting content into separate documents. To store big files, LiteDB splits the content into 1MB chunks and stores them one at a time. After each chunk, LiteDB clears its cache to avoid using too much memory.
https://github.com/mbdavid/LiteDB/blob/master/LiteDB/Database/FileStorage/LiteFileStorage.cs#L49
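A usage sketch of that chunked storage; the exact FileStorage method names may vary between LiteDB versions, so treat this as an assumption to check against your version:

```csharp
// Sketch: storing and reading a big payload through FileStorage
using (var db = new LiteDatabase("mydata.db"))
{
    // Upload: the content is split into 1MB chunks internally
    using (var input = File.OpenRead(@"C:\big-object.json"))
    {
        db.FileStorage.Upload("$/objects/42", input);
    }

    // Download: read it back as a stream, chunk by chunk
    using (var output = db.FileStorage.OpenRead("$/objects/42"))
    {
        // consume the stream here
    }
}
```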


sherry-ummen avatar sherry-ummen commented on May 19, 2024

OK, so reading all the data pages is the problem? Then that should not be an issue in the case of an SSD, right? And would memory-mapped I/O help?


mbdavid avatar mbdavid commented on May 19, 2024

You cannot avoid reading all pages if your document is big and you need all of it. For better performance, an SSD is great, and so is more RAM.

Reading 16MB documents is not a big issue if you read only one or two documents at a time. If you read many, I recommend "closing" and "re-opening" the database (using (var db = new LiteDatabase(...)) { ... }).

I have plans (it's on my todo list) to implement a better cache service that automatically clears unused cache pages, which would avoid the need to close/re-open the database.
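The close/re-open advice above can be sketched like this; MyDoc, the collection name, and the batching are illustrative assumptions:

```csharp
// Sketch: bounding memory by disposing the database between batches of reads
foreach (var batch in idBatches)
{
    using (var db = new LiteDatabase("mydata.db"))
    {
        var col = db.GetCollection<MyDoc>("docs");
        foreach (var id in batch)
        {
            var doc = col.FindById(id);
            // process doc...
        }
    } // Dispose drops cached pages, releasing the memory held by big documents
}
```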


