mivg / sled
The official repository for the paper Efficient Long-Text Understanding with Short-Text Models (Ivgi et al., 2022)
License: MIT License
In usage_example.py, generated_output is always equal to the prefix (or, when there is no prefix, to the document content). I tried it with a few different input texts.
from sled import SledTokenizer, SledForConditionalGeneration

# Load the SLED-wrapped BART model and its tokenizer
model = SledForConditionalGeneration.from_pretrained('tau/bart-large-sled')
tokenizer = SledTokenizer.from_pretrained('tau/bart-large-sled')
inputs = tokenizer("Hello, my dog is cute", return_tensors="pt")

# Move the model and inputs to GPU 0, then run a forward pass
model = model.to(0)
inputs = {k: v.to(0) for k, v in inputs.items()}
model(**inputs)
The above code produces the following error.
Relevant package versions (Python 3.10):
torch==1.13.1+cu116
torchaudio==0.13.1+cu116
torchmetrics==0.11.3
torchvision==0.14.1+cu11
transformers==4.26.1
py-sled==0.1.7
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/ga2530/miniconda3/envs/bhc/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1194, in _call_impl
    return forward_call(*input, **kwargs)
  File "", line 2, in _forward
  File "/home/ga2530/miniconda3/envs/bhc/lib/python3.10/site-packages/sled/modeling_sled.py", line 756, in _forward
    forward_kwargs["encoder_outputs"] = self._run_sliding_window_forward(
  File "/home/ga2530/miniconda3/envs/bhc/lib/python3.10/site-packages/sled/modeling_sled.py", line 476, in _run_sliding_window_forward
    return self._run_sliding_window_forward_stacked(args_tensor_inds, kwargs_tensor_keys, s, *args,
  File "/home/ga2530/miniconda3/envs/bhc/lib/python3.10/site-packages/sled/modeling_sled.py", line 562, in _run_sliding_window_forward_stacked
    encoder_outputs2 = self._underlying_model.forward(
  File "/home/ga2530/miniconda3/envs/bhc/lib/python3.10/site-packages/transformers/models/bart/modeling_bart.py", line 1393, in forward
    lm_logits = lm_logits + self.final_logits_bias.to(lm_logits.device)
AttributeError: 'MockTensorForConditionalGeneration' object has no attribute 'device'
The instructions for using SLED are a bit confusing; it's not clear how one should use SLED when working with non-BART models.
When loading a model checkpoint of type AutoModelForSeq2SeqLM with AutoModel, the state dict will not be loaded correctly, as some of the parameter names differ between the two classes.
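The mismatch can be illustrated without SLED itself. The sketch below uses two toy PyTorch modules (hypothetical stand-ins for the AutoModel and AutoModelForSeq2SeqLM classes, not the real transformers code): the seq2seq-style model carries an extra head, so loading its checkpoint into the bare model with strict=False silently reports those parameters as unexpected rather than loading them.

```python
import torch.nn as nn

# Toy stand-ins (assumed names, for illustration only): the task-specific
# model has an extra lm_head whose parameter names the bare model lacks.
class BareModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Linear(4, 4)

class Seq2SeqModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Linear(4, 4)
        self.lm_head = nn.Linear(4, 8)

# Loading the seq2seq checkpoint into the bare model: the head weights
# are skipped and surface only in the returned incompatible-keys report.
ckpt = Seq2SeqModel().state_dict()
result = BareModel().load_state_dict(ckpt, strict=False)
print(result.unexpected_keys)  # the lm_head.* parameters
```

With strict=True (the default) the same call raises a RuntimeError instead, which is why loading a checkpoint with a class whose parameter names differ fails or drops weights.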
Hi,
Thanks for releasing the code for SLED.
The README suggests editing the config appropriately to use SLED with other base models from Hugging Face. However, this only works with Hugging Face models. Is there a way to interface SLED with models that are not on Hugging Face?
A description of how to go about that and what code changes (in SLED and in the base model) might be needed would be really helpful.
Thanks!
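One possible direction (purely an illustrative sketch, not SLED's documented API): wrap the custom model in a thin adapter that mimics the transformers calling convention that SLED's sliding-window code expects, i.e. accept input_ids / attention_mask keyword arguments and return an object exposing last_hidden_state. The class and names below are all hypothetical.

```python
import torch
import torch.nn as nn
from types import SimpleNamespace

# Hypothetical adapter: exposes a Hugging-Face-style forward signature
# around an arbitrary encoder that only knows how to map token ids to
# hidden states.
class EncoderAdapter(nn.Module):
    def __init__(self, custom_encoder):
        super().__init__()
        self.custom_encoder = custom_encoder

    def forward(self, input_ids=None, attention_mask=None, **kwargs):
        hidden = self.custom_encoder(input_ids)
        # Mimic transformers' BaseModelOutput with a simple namespace.
        return SimpleNamespace(last_hidden_state=hidden)

# Toy "non-Hugging-Face" encoder: an embedding followed by a linear layer.
toy = nn.Sequential(nn.Embedding(100, 16), nn.Linear(16, 16))
adapter = EncoderAdapter(toy)
out = adapter(input_ids=torch.tensor([[1, 2, 3]]))
print(out.last_hidden_state.shape)  # torch.Size([1, 3, 16])
```

A real integration would need more than this (a config object, decoder wiring, generation hooks), so confirmation from the maintainers on the required interface would still be needed.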