Comments (10)
This is good work, thank you.
A few days ago I remember getting an F1-score above 88, but the run I did last night scored below 88. Could it be because the two entity fully-connected layers use the same weights after the update? Could you share how you tune hyperparameters such as the learning rate to get better results?
I ask because I want to apply R-BERT to my own dataset, but the results are not very good so far.
Thanks.
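The shared-weights question above can be made concrete with a small sketch. This is illustrative only, not the repo's actual code: `RBertHead`, `share_entity_fc`, and all dimensions are hypothetical names chosen for the example, following the general R-BERT head layout (a [CLS] vector and two averaged entity vectors, each passed through a fully-connected layer before concatenation).

```python
import torch
import torch.nn as nn

class RBertHead(nn.Module):
    """Toy R-BERT-style classification head (illustrative, not the repo's code).

    The flag share_entity_fc switches between one fully-connected layer
    reused for both entity vectors (a weight update moves both entity
    representations together) and two independent layers.
    """

    def __init__(self, hidden=768, num_labels=19, share_entity_fc=True):
        super().__init__()
        self.cls_fc = nn.Linear(hidden, hidden)
        self.e1_fc = nn.Linear(hidden, hidden)
        # Shared weights: e2 simply reuses the e1 layer object,
        # so both entities are transformed by the same parameters.
        self.e2_fc = self.e1_fc if share_entity_fc else nn.Linear(hidden, hidden)
        self.classifier = nn.Linear(hidden * 3, num_labels)
        self.act = nn.Tanh()

    def forward(self, cls_vec, e1_avg, e2_avg):
        # Concatenate the three transformed vectors, then classify.
        h = torch.cat([
            self.act(self.cls_fc(cls_vec)),
            self.act(self.e1_fc(e1_avg)),
            self.act(self.e2_fc(e2_avg)),
        ], dim=-1)
        return self.classifier(h)
```

Training both variants on the same seed would show empirically whether sharing the entity layer is what moves the F1-score, which is what the comment above is asking about.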
Hello, have you managed to reproduce 89.25? The highest I have reached after tuning is 88.39; what is your best score?
from r-bert.
Hello, I have not tuned the hyperparameters for this. We can discuss it; add me on QQ: 276431860.
Hello, did you train this on Chinese data?
No, I just ran this code as-is, using the SemEval-2010 dataset of course.
Hehe. Do you know of any similar Chinese datasets? Thanks.
I previously collected some resources on traditional Chinese medicine datasets, though I have not looked through them carefully; you can take a look:
http://cips-chip.org.cn/2020/eval2
https://github.com/GanjinZero/awesome_Chinese_medical_NLP
https://kns.cnki.net/kcms/detail/detail.aspx?dbcode=CMFD&dbname=CMFD202001&filename=1019668145.nh&v=PoGN%25mmd2FoPjZYSA%25mmd2Bk7VryERaP9ChrEabWA0giTnbdc%25mmd2FfyZZ5hDJj1Jw9gwEErKppD5Y
https://github.com/xiaopangxia/TCM-Ancient-Books
https://github.com/yao8839836/PTM
https://github.com/yao8839836/CEMRClass
Hello, I ran this code and only got 87.95. How did you tune it up to 88.39? Would you mind sharing your method? Thanks.
Just try different random seeds; I got that score with the seed set to 2333. You can give it a try.
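The seed trick above can be sketched as follows. `set_seed` is an illustrative helper, not the repo's actual function, and in the real R-BERT training script you would also seed numpy and torch; the sketch is kept dependency-free here.

```python
import os
import random

def set_seed(seed: int = 2333) -> None:
    """Fix the random seed so repeated runs are comparable.

    In the actual training code you would additionally call
    numpy.random.seed(seed), torch.manual_seed(seed), and
    torch.cuda.manual_seed_all(seed); those lines are omitted
    to keep this sketch stdlib-only.
    """
    os.environ["PYTHONHASHSEED"] = str(seed)
    random.seed(seed)

# Two runs with the same seed draw identical values, so any
# F1 difference between runs comes from the config, not the RNG.
set_seed(2333)
run_a = [random.random() for _ in range(3)]
set_seed(2333)
run_b = [random.random() for _ in range(3)]
assert run_a == run_b
```

Sweeping several seeds and reporting the spread (rather than one lucky seed) gives a fairer picture of whether a change really improves the model.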
Mul-BERT reaches 90.72 (Macro-F1) on the SemEval 2010 Task 8 dataset, and the method is very simple. Recommended: https://github.com/DongPoLI/Mul-BERT
Related Issues (16)
- Question about F1 results HOT 2
- Why is the result.txt file empty after running python3 main.py --do_train --do_eval? HOT 6
- Mul-BERT reaches 90.72 (Macro-F1) on the SemEval 2010 Task 8 dataset; the method is very simple
- The information of e1 and e2 is obtained in predict.py, but predict.py does not seem to be used during training
- Training on a custom dataset
- Custom dataset with different relations
- Running predict.py fails with FileNotFoundError: [Errno 2] No such file or directory: './model\\training_args.bin'
- How can I train my own dataset? HOT 4
- CUDA out of memory HOT 2
- How can I retrain the model from the past trained model ? HOT 2
- Some model files might be missing... HOT 5
- Segmentation fault (core dumped) HOT 1
- How can I train R-BERT on a Chinese dataset? HOT 1
- Check out this relation-extraction implementation reaching 89.69%, recommended: https://github.com/DongPoLI/EC-BERT HOT 1
- FClayer for two entities