## Usage

This directory contains examples that use the `heturun` command.

* Data Parallel (MLP model and WDL model):
```bash
# Local data parallel using AllReduce
heturun -c local_allreduce.yml python run_mlp.py --config lar
# Local data parallel using AllReduce for dense parameters and PS for sparse (embedding) parameters
heturun -c local_ps.yml python run_wdl.py --config lhy
# Local data parallel using PS
heturun -c local_ps.yml python run_mlp.py --config lps
heturun -c local_ps.yml python run_wdl.py --config lps
# Distributed data parallel using AllReduce
heturun -c remote_allreduce.yml python run_mlp.py --config rar
# Distributed data parallel using AllReduce for dense parameters and PS for sparse (embedding) parameters
heturun -c remote_ps.yml python run_wdl.py --config rhy
# Distributed data parallel using PS
heturun -c remote_ps.yml python run_mlp.py --config rps
heturun -c remote_ps.yml python run_wdl.py --config rps
```
* For other parallel schemes, please refer to the `parallel` directory.