BERT MULTI GPU

Multi-GPU training on a single machine for BERT (BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding).

REQUIREMENTS

Python 3

TensorFlow 1.12.0

TRAINING

0. Edit the input and output file names in create_pretraining_data.py and run_pretraining_gpu.py (a sketch of the relevant flag definitions follows these steps).

1. Run create_pretraining_data.py.

2. Run run_pretraining_gpu.py.
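The edit in step 0 typically amounts to changing flag defaults. A minimal sketch, assuming the scripts keep the upstream BERT flag names (input_file, output_file); the default paths here are placeholders, not the repo's values:

```python
import tensorflow as tf

flags = tf.flags
FLAGS = flags.FLAGS

# Point these defaults at your own corpus and desired TFRecord output.
flags.DEFINE_string("input_file", "./sample_text.txt",
                    "Input raw text file (or comma-separated list of files).")
flags.DEFINE_string("output_file", "./tf_examples.tfrecord",
                    "Output TFRecord file (or comma-separated list of files).")
```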

PARAMETERS

Set n_gpus in run_pretraining_gpu.py to the number of GPUs available on the machine. A minimal illustration of the underlying multi-tower pattern follows.
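A minimal sketch of single-machine multi-GPU data parallelism in the TF 1.x tower style this repo targets. The toy model, the average_gradients helper, and all sizes are illustrative assumptions, not the repo's actual code:

```python
import tensorflow as tf

n_gpus = 2  # mirrors the n_gpus setting in run_pretraining_gpu.py

def tower_loss(features, labels):
    # Toy stand-in for the BERT pre-training loss; variables are
    # shared across towers via AUTO_REUSE.
    logits = tf.layers.dense(features, 2, name="toy_clf",
                             reuse=tf.AUTO_REUSE)
    return tf.reduce_mean(
        tf.nn.sparse_softmax_cross_entropy_with_logits(
            labels=labels, logits=logits))

def average_gradients(tower_grads):
    # Average each variable's gradient over all towers.
    averaged = []
    for grads_and_vars in zip(*tower_grads):
        grads = [g for g, _ in grads_and_vars]
        averaged.append((tf.reduce_mean(tf.stack(grads), axis=0),
                         grads_and_vars[0][1]))
    return averaged

batch = 16  # must be divisible by n_gpus
x = tf.placeholder(tf.float32, [batch, 8])
y = tf.placeholder(tf.int32, [batch])
opt = tf.train.AdamOptimizer(1e-4)

# One tower per GPU: split the batch, compute per-tower gradients,
# then apply the cross-tower average.
tower_grads = []
for i, (xs, ys) in enumerate(zip(tf.split(x, n_gpus),
                                 tf.split(y, n_gpus))):
    with tf.device("/gpu:%d" % i):
        tower_grads.append(opt.compute_gradients(tower_loss(xs, ys)))
train_op = opt.apply_gradients(average_gradients(tower_grads))
```

On a machine with fewer GPUs than n_gpus, building the session with tf.ConfigProto(allow_soft_placement=True) lets TensorFlow fall back to available devices.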

DATA

In sample_text.txt, each sentence ends with \n and paragraphs are separated by an empty line, as illustrated by the short reader below.
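A small sketch of reading this format; read_paragraphs is a hypothetical helper, not part of the repo:

```python
def read_paragraphs(path):
    # One sentence per line; a blank line closes the current paragraph.
    paragraphs, current = [], []
    with open(path, "r", encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if line:
                current.append(line)
            elif current:
                paragraphs.append(current)
                current = []
    if current:
        paragraphs.append(current)
    return paragraphs

# e.g. read_paragraphs("sample_text.txt") ->
# [["First sentence.", "Second sentence."], ["Next paragraph, first sentence."]]
```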
