Comments (17)
Two days, no reply. I'm getting frustrated.
from llama.
I asked ChatGPT for the response time of facebookResearch/llama, but it couldn't tell me.
from llama.
Received the model weights today.
How big is the download out of interest?
Downloading still.
The 7B parameter model is 13.5 GB.
The 13B parameter model is 26 GB.
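Those sizes are consistent with the weights being stored in half precision. A rough sanity check, assuming 2 bytes per parameter (fp16, which the thread does not state explicitly):

```python
def approx_size_gb(n_params: float, bytes_per_param: int = 2) -> float:
    """Approximate checkpoint size in decimal gigabytes,
    assuming each parameter is stored as a 2-byte (fp16) value."""
    return n_params * bytes_per_param / 1e9

print(approx_size_gb(7e9))   # → 14.0, close to the reported 13.5 GB
print(approx_size_gb(13e9))  # → 26.0, matches the reported 26 GB
```

The small gap for the 7B model is plausible since the reported figure may be in GiB or exclude tokenizer/metadata files.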
from llama.
My application has received neither a rejection nor an approval.
from llama.
One day, no reply.
from llama.
Checking minute by minute, waiting for anyone who has succeeded.
from llama.
I have been testing with the help of ChatGPT for a whole day, without success. Also waiting for someone who has applied successfully.
from llama.
4 days, no reply... it's OK, guys.
from llama.
Did you use a .edu email to apply? I did, but no reply after 3 days.
from llama.
Not me. I don't have an .edu. I'm just a regular Joe ☹
from llama.
There was no reply to my school email application either.
from llama.
I requested on Feb 27 at 12 PM KST (UTC: Feb 27, 3 AM).
I applied on Mar 2 at 11 AM KST (UTC: Mar 2, 2 AM).
I filled in my research publications, but I haven't done research on NLP.
I used a .edu email.
from llama.
@shaswatsaxena, how long did you wait?
from llama.
@Annusha 5 days
from llama.
Applied on 31st May. It's 6th June today. Used a .edu email ID. No response. :|
from llama.
Related Issues (20)
- System Info
- Architecture
- Agnostic Atheist AI not Normal
- Discussing a potential bias in Llama2-Chat that can lead to content safety issues
- download.sh didn't work well
- parameter count of Llama2-70B and Llama2-13B
- Change the name of openai to closeai and change the project name to openai.
- Error: llama runner process no longer running: 3221225785
- [Generation, Question] Why does the `seed` have to be the same in different processors (`Llama.build`)?
- how can i evaluate mathematic datasets like GSM8K?
- Test Tokenizer gives Incorrect padding error
- No response from request to access models
- how to download this model
- Providing SHA-256 hashes
- This PR will implement code for reproducing results in the following paper:
- Unable to access the Hugging Face Llama-3 model repo
- [Parallel MD5] Accelerating `download.sh`
- LLaMA3 supports an 8K token context length. When continuously pretraining with proprietary data, the majority of the text data is significantly shorter than 8K tokens, resulting in a substantial amount of padding. To enhance training efficiency and effectiveness, it is necessary to merge multiple short texts into a longer text, with the length remaining below 8K tokens. However, the question arises: how should these short texts be combined into a single training sequence? Should they be separated by delimiters, or should an approach involving masking be used during the pretraining process?
- Oddities downloading the 8b-instruct model
- How to infer answer using llama2-7b-hf?
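One of the related issues above asks how to combine short pretraining samples into sequences under the 8K context limit. A minimal sketch of one common approach — greedy concatenation with a separator token between documents — might look like this (the `EOT` token id and the `pack` helper are illustrative assumptions, not part of the llama codebase):

```python
EOT = 0  # hypothetical end-of-text token id; real tokenizers define their own

def pack(samples: list[list[int]], max_len: int = 8192) -> list[list[int]]:
    """Greedily concatenate tokenized documents into sequences of at most
    max_len tokens, appending a separator after each document."""
    packed, current = [], []
    for tokens in samples:
        # +1 accounts for the separator appended after this document
        if current and len(current) + len(tokens) + 1 > max_len:
            packed.append(current)
            current = []
        current.extend(tokens + [EOT])
    if current:
        packed.append(current)
    return packed

# e.g. pack([[1, 2], [3, 4], [5, 6]], max_len=6)
# → [[1, 2, 0, 3, 4, 0], [5, 6, 0]]
```

Whether to additionally mask cross-document attention during pretraining is a separate design choice; this sketch only shows the packing step, and it does not truncate documents longer than `max_len`.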