Comments (5)
Do you observe high variation in the training speed, or is it consistently low? I assume you are taking the average value. Could you please share a section of the workers' log output containing multiple lines of the reported training speed?
There are a few things you can try. If any of the following works for you, please let us know. Although the following environment variables start with MXNET, they apply to workers of any framework (TF/MXNet/PyTorch), because the parameter server is based on MXNet.
- For the parameter servers, set export MXNET_OMP_MAX_THREADS=10 if you have 16 CPU cores per server, or export MXNET_OMP_MAX_THREADS=4 if you only have 8 CPU cores.
- Set export MXNET_CPU_WORKER_NTHREADS=32. This may speed up the parameter server.
- Start more parameter server instances. For example, when you have two physical machines to run the servers, you can start 4 (DMLC_NUM_SERVER=4), i.e. two server instances per physical machine. This will increase network bandwidth utilization, especially when a single TCP flow cannot saturate your bandwidth (see the sketch after this list).
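For concreteness, a launch script for one of the two server machines might look like the sketch below. This is only an illustration: the scheduler address and port are placeholders, the thread counts follow the numbers above, and the bpslaunch entry point may instead be python3 launcher/launch.py in older BytePS releases.

# Sketch: run two BytePS server instances on one of the two physical machines.
export DMLC_ROLE=server
export DMLC_NUM_WORKER=2           # total number of worker machines
export DMLC_NUM_SERVER=4           # total server instances across both machines
export DMLC_PS_ROOT_URI=10.0.0.1   # scheduler IP (placeholder)
export DMLC_PS_ROOT_PORT=1234      # scheduler port (placeholder)
export MXNET_OMP_MAX_THREADS=10    # assuming 16 CPU cores per server machine
export MXNET_CPU_WORKER_NTHREADS=32
bpslaunch &                        # first server instance
bpslaunch &                        # second server instance
wait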
@bobzhuyb thank you for your reply.
All my RDMA machines are in use for training BERT; I will try it next week.
Actually, I want to use this kit to train BERT, but when I used it to test the training speed of a BERT-Large model (vocab size 110k), the speedup ratio was much lower than Horovod's. I will upload my experiment results next week.
@MarvinLong Do you have NVLinks? If not, BytePS may not give you a speedup, because it is optimized for the scenario where the network bandwidth is the bottleneck; with 80Gbps RDMA and no NVLinks, the bottleneck is inside each machine.
If you have NVLinks, a properly configured BytePS should give you a significant speedup compared with Horovod, especially with 3+ workers.
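If you are unsure whether your machines have NVLinks, one quick check (assuming a standard NVIDIA driver installation) is the GPU topology matrix:

nvidia-smi topo -m   # NV1/NV2/... between GPU pairs indicates NVLink; PIX/PHB/SYS indicate PCIe-only paths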
@bobzhuyb Yes, I have 4 NVLink machines with IB. As you advise, I can use them as 2 workers and 2 servers.
Related Issues (20)
- How to use gradient accumulate in BytePS torch DDP? HOT 5
- The byteps in K8S Pod doesn't have DMLC_WORKER_ID configured.
- Stuck in the bps.init(). HOT 7
- Is it right to do allreduce immediately for non-zero ranks in bytescheduler? HOT 2
- When will sparse models be supported?
- Are there plans to support CPU-only training? Our workers also run on CPU machines. HOT 2
- benchmark with cross barrier error
- Successfully installed BytePS but cannot import byteps.torch or byteps.tensorflow HOT 2
- Running multiple workers on a single GPU machine
- Release BytePS docker image support for TF2
- Installation error HOT 1
- Communication failure in MXNet with BytePS HOT 3
- support for fault tolerance and straggler mitigation
- broadcast and is_initialized api are not supported with pytorch.
- Supported environment
- Installation problem
- Mistakes of Workload calculation HOT 5
- How is the tensorflow scheduler plugin used in tf_benchmark_cnn.py? HOT 1
- segmentation fault while launching the worker HOT 1
- Is there any benchmark comparison with Megatron-LM ?