Comments (2)
Thanks @ho4040, we are taking a look and will get back to you shortly.
I was able to get adequate output on inf2.8xlarge:
['Jihye\'s Persona: A 22-year-old woman working part-time at a convenience store in Seoul.
<START>
You:...
Jihye: Welcome, man.
You: hello?
Jihye: You can use the bathroom now. I\'ll be right here, waiting.
Jihye: Please do yourself a favor and be fast about it. I\'m not here for your business. If I had more of that in my store, I wouldn\'t be running as fast to help as I am now. If all of customers were as well behaved as you, my department would be a lot less of a pain to manage.
Jihye: Let\'s not get into any more of an argument. You seem impatient to get back to your business. I\'ll wait for you again when you\'re finished. Good luck.
Jihye: If you\'re finished, I mean. (I\'ve been waiting a while...)
<STOP>
Jihye: *I sigh.*
Shit... I wonder how bad of a week it would have to be for a customer like him...
*It wasn\'t exactly surprising that customers like this were']
I had to make a few changes to get it running on a smaller machine:
I used smaller parameters here:
GPTJForSampling.from_pretrained('./pygmalion-6b-split', n_positions=256, batch_size=1, tp_degree=1, amp='f16')
and a shorter sequence length for sampling, timing the second sample() call (the first serves as a warmup):
neuron_model.sample(input_ids, sequence_length=256)
start = time.time()
neuron_model.sample(input_ids, sequence_length=256)
I then ran it with FI_EFA_FORK_SAFE=1 set in the environment.
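For context, a minimal end-to-end sketch of the reduced configuration above. The to_neuron() compile step, the tokenizer setup, and the prompt are my assumptions filled in from the usual transformers-neuronx sampling flow; they are not shown in the original comment:

import os
import time

# Set this before the Neuron runtime initializes; equivalently, launch the
# script with FI_EFA_FORK_SAFE=1 on the command line as described above.
os.environ.setdefault("FI_EFA_FORK_SAFE", "1")

from transformers import AutoTokenizer
from transformers_neuronx.gptj.model import GPTJForSampling  # assumed import path

# Load the pre-split checkpoint with the reduced parameters from the comment above.
neuron_model = GPTJForSampling.from_pretrained(
    './pygmalion-6b-split', n_positions=256, batch_size=1, tp_degree=1, amp='f16')
neuron_model.to_neuron()  # compile the graph and load weights onto the NeuronCore

# Tokenizer and prompt are placeholders, not part of the original report.
tokenizer = AutoTokenizer.from_pretrained('PygmalionAI/pygmalion-6b')
input_ids = tokenizer.encode("Jihye's Persona: ...", return_tensors='pt')

neuron_model.sample(input_ids, sequence_length=256)  # first call: warmup
start = time.time()
output = neuron_model.sample(input_ids, sequence_length=256)  # timed run
print(f"sampled in {time.time() - start:.2f}s")
print(tokenizer.decode(output[0]))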
Environment:
Rocky Linux 9.2, Podman container running Python 3.8 and transformers_neuronx 0.5.58
I'm not sure which revision of pygmalion I have; it could be an old one. Here is the sha256sum of the first checkpoint shard:
# sha256sum pytorch_model-00001-of-00002.bin
88ba2b44537f444e3fad92dff6962ac8c0b983427523484f98e7acf2d71fd65e pytorch_model-00001-of-00002.bin
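If you want to check the shard hash from Python instead of the shell, here is a minimal sketch using the standard-library hashlib; the filename and expected digest are the ones from the listing above:

import hashlib

def sha256_of(path, chunk_size=1 << 20):
    # Stream the file in 1 MiB chunks so large checkpoint shards
    # don't need to fit in memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Compare against the digest reported above to identify the checkpoint revision.
print(sha256_of("pytorch_model-00001-of-00002.bin"))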