Comments (8)
Thanks for flagging, @byi8220! We put this patch in to give default samplers the faster behaviour by default. I'll look at pulling this out into a non-monkey-patch solution.
cc @gokulavasan see the original issue in HF Accelerate on assumptions on number of calls to iter: huggingface/accelerate#2894
I see three approaches to fix this:
1) We fork the dataloader `__init__` code. This is probably the best way, but it introduces more forked code.
2) We reproduce the sampler/batch_sampler setup in `__init__` before we call `super().__init__(...)`. This seems hacky and not significantly better than 1).
3) We check `isinstance` after `super().__init__()` and replace the sampler. That heads down old-school Lightning territory and will surely lead to head-scratching and further problems down the road.

I'm going to go with 1).
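To make option 1) concrete, here is a hedged sketch using stand-in classes (none of these are the real torch/torchdata APIs): the subclass forks the parent's default-construction logic, so a *default* sampler is built as the stateful variant up front, while anything the user passes in explicitly is left alone.

```python
class Sampler:
    """Stand-in for a plain, non-stateful sampler."""
    def __init__(self, n):
        self.n = n

    def __iter__(self):
        return iter(range(self.n))


class StatefulSampler(Sampler):
    """Stand-in stateful sampler: can snapshot and restore its position."""
    def __init__(self, n):
        super().__init__(n)
        self.pos = 0

    def state_dict(self):
        return {"pos": self.pos}

    def load_state_dict(self, state):
        self.pos = state["pos"]


class DataLoader:
    """Stand-in for the parent loader: creates a plain default sampler."""
    def __init__(self, n, sampler=None):
        self.sampler = sampler if sampler is not None else Sampler(n)


class StatefulDataLoader(DataLoader):
    """Approach 1): fork the default-construction logic instead of
    monkey-patching or swapping the sampler after the fact."""
    def __init__(self, n, sampler=None):
        if sampler is None:
            # Forked default path: build the stateful variant directly.
            sampler = StatefulSampler(n)
        super().__init__(n, sampler=sampler)


dl = StatefulDataLoader(5)
print(type(dl.sampler).__name__)  # StatefulSampler (upgraded default)
```

The key property is that the upgrade happens only on the default path, before the parent constructor runs, so no global classes are patched.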
Thanks for resolving this! This appears to have fixed the breaking tests in accelerate, and the repro above shows there is no more monkey-patching going on.
However, it might be worth mentioning that if one passes an existing non-stateful BatchSampler to the StatefulDataLoader constructor, the constructed StatefulDataLoader will use the provided sampler as-is. I'm only pointing this out because it wasn't clear to me whether this is the intended behavior.
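As a hedged sketch of that behavior (stand-in classes, not the real torchdata API): a user-supplied non-stateful batch sampler is kept as-is, which means a later state snapshot has no sampler position to record.

```python
class BatchSampler:
    """Stand-in non-stateful batch sampler: yields fixed-size batches."""
    def __init__(self, n, batch_size):
        self.n, self.batch_size = n, batch_size

    def __iter__(self):
        batch = []
        for i in range(self.n):
            batch.append(i)
            if len(batch) == self.batch_size:
                yield batch
                batch = []


class StatefulDataLoader:
    """Stand-in loader: respects whatever batch_sampler the user provides."""
    def __init__(self, batch_sampler):
        self.batch_sampler = batch_sampler

    def state_dict(self):
        # Only samplers exposing state_dict() can be snapshotted; a plain
        # BatchSampler contributes nothing, so a resume would have to
        # fall back to fast-forwarding.
        get_state = getattr(self.batch_sampler, "state_dict", lambda: None)
        return {"sampler": get_state()}


loader = StatefulDataLoader(BatchSampler(10, batch_size=2))
print(type(loader.batch_sampler).__name__)  # BatchSampler (kept as provided)
print(loader.state_dict())                  # {'sampler': None}
```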
Repro output with nightly `0.7.1.dev20240703+cpu` (the important lines are the ones marked `+`, shown as a green diff):
--------------------------------------------------------------------------------
BatchSampler before importing `stateful_dataloader`: <class 'torch.utils.data.sampler.BatchSampler'>
RandomSampler before importing `stateful_dataloader`: <class 'torch.utils.data.sampler.RandomSampler'>
--------------------------------------------------------------------------------
type(non_stateful_dataloader.batch_sampler): <class 'torch.utils.data.sampler.BatchSampler'>
type(non_stateful_dataloader.batch_sampler.sampler): <class '__main__.MyRandomSamplerWrapper'>
type(non_stateful_dataloader.batch_sampler.sampler.original_sampler): <class 'torch.utils.data.sampler.RandomSampler'>
--------------------------------------------------------------------------------
+ BatchSampler after importing `stateful_dataloader`: <class 'torch.utils.data.sampler.BatchSampler'>
+ RandomSampler after importing `stateful_dataloader`: <class 'torch.utils.data.sampler.RandomSampler'>
--------------------------------------------------------------------------------
+ # Even after the fix, stateful_dataloader has a non-stateful sampler
type(stateful_dataloader.batch_sampler): <class 'torch.utils.data.sampler.BatchSampler'>
type(stateful_dataloader.batch_sampler.sampler): <class '__main__.MyRandomSamplerWrapper'>
type(stateful_dataloader.batch_sampler.sampler.original_sampler): <class 'torch.utils.data.sampler.RandomSampler'>
--------------------------------------------------------------------------------
type(stateful_dataloader_2.batch_sampler): <class 'torchdata.stateful_dataloader.sampler.BatchSampler'>
type(stateful_dataloader_2.batch_sampler.sampler): <class 'torch.utils.data.sampler.SequentialSampler'>
--------------------------------------------------------------------------------
@byi8220 thanks for the details! If I understand correctly: the user has explicitly passed in a non-stateful BatchSampler to the DataLoader constructor? In this case, I think this is correct: we should respect what the user has passed in and not try to override it under the hood. I've seen code in other libraries that overrides user-supplied objects like this, and it can cause some nasty surprises and unintuitive behaviour that is very hard to debug.
> user has explicitly passed in a non-stateful BatchSampler to the DataLoader constructor?

Yes

> In this case, I think this is correct, we should respect what the user has passed in and not try to override it under the hood.

This makes sense. I was just curious whether this would cause an issue with saving a state dict from, or loading one into, this dataloader.
That's a great call. Yes, it might cause an issue if a checkpoint was saved before this code change and then loaded with the new code, but it might also just fall back to fast-forwarding; I haven't tried it. We'll be cutting a release this month, so once that's out it should be easier to manage these types of changes.
For the case where users are explicitly passing in a BatchSampler, they can import it from `torchdata.stateful_dataloader.sampler` instead of from `torch.utils.data`.
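A minimal sketch of that import change, guarded so it degrades gracefully when torchdata is not installed (the guard itself is an assumption for portability, not part of the advice above):

```python
# Prefer the stateful BatchSampler, which participates in dataloader
# checkpointing; fall back to None when torchdata is unavailable here.
try:
    from torchdata.stateful_dataloader.sampler import BatchSampler
except ImportError:
    BatchSampler = None  # torchdata is not installed in this environment

if BatchSampler is not None:
    print("BatchSampler from:", BatchSampler.__module__)
```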
I'm not super familiar with the stateful_dataloader code, but I agree with your hunch that it should fall back to fast-forwarding.
Still, it might be worth adding a warning? At worst, it might help catch a hard-to-find bug; and even in the best case, isn't fast-forwarding an iterator costly?
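To illustrate that cost, here is a hedged sketch in plain Python (no torchdata): resuming without sampler state means replaying and discarding every item up to the checkpoint, whereas a stateful sampler could restore its position directly from `state_dict()`.

```python
def fast_forward(it, skip):
    """Advance an iterator by consuming `skip` items; every consumed item
    is wasted work proportional to how far into the epoch the checkpoint
    was taken."""
    wasted = 0
    for _ in range(skip):
        next(it)
        wasted += 1
    return it, wasted


# Resuming at step 999_000 of a million-step epoch discards 999_000 items.
it, wasted = fast_forward(iter(range(1_000_000)), skip=999_000)
print(next(it), wasted)  # 999000 999000
```

A stateful sampler avoids this replay entirely, which is why a warning on the fast-forward path could be useful.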