hello there! you're awesome, this plugin saves my life.
I have 400k books with 70k duplicates.
The issue I have is very specific to the implementation. As I understand it, when merging books the plugin first copies them to tmpfs (RAM) and then does its magic. While it ran, I was watching the total file and directory count in my Calibre library. At first it was decreasing (from about 1,370k files/dirs down to about 1,300k). Then, after roughly 8 hours of processing, it started going up again and reached about 1,305k — and then crashed with "no space left" on tmpfs. I got a Calibre error and clicked OK, restarted Calibre to free the RAM, and ran the duplicates scan again with the same parameters; the found-duplicates count had dropped from 70k to 50k. So I guess it processed about 20k files before crashing.
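In case anyone wants to reproduce the observation, this is roughly how I was watching the counts — a minimal sketch, not anything Calibre-specific; the library path is just a placeholder you'd adjust to your setup:

```python
import os

def count_entries(root):
    """Count all files and directories under root (the numbers I was watching)."""
    files = dirs = 0
    for _dirpath, dirnames, filenames in os.walk(root):
        dirs += len(dirnames)    # subdirectories seen at this level
        files += len(filenames)  # files seen at this level
    return files, dirs  # note: root itself is not counted

# hypothetical path -- replace with your actual library location
# files, dirs = count_entries("/path/to/Calibre Library")
# print(f"{files + dirs} entries ({files} files, {dirs} dirs)")
```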
The question is: do I lose any books forever if it crashes mid-process? Are they removed from the filesystem once they're copied to RAM? I suspect so, because the file count drops while it loads them into RAM — but I can't tell whether they're lost for good in that case, since I can't track which books are being processed.
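Until someone who knows the implementation confirms, one way I could check whether anything actually disappeared is to snapshot the set of files in the library before and after a merge run and diff the two sets. A sketch of that idea (the paths are placeholders, and this assumes the library folder is directly readable):

```python
import os

def snapshot(root):
    """Return the set of all file paths under root, relative to root."""
    paths = set()
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            paths.add(os.path.relpath(os.path.join(dirpath, name), root))
    return paths

# before = snapshot("/path/to/Calibre Library")  # hypothetical path
# ... run the merge ...
# after = snapshot("/path/to/Calibre Library")
# missing = before - after  # files that vanished during the run
```

Any paths left in `missing` that aren't accounted for by intentional duplicate removal would be candidates for lost books.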
thank you!