NVIDIA BioBERT, an optimized version of BioBERT, was created specifically for the biomedical and clinical domains, giving this community easy access to state-of-the-art NLP models.
TRENDS IN NLP & SPEECH
NLP’s ImageNet Moment has Arrived
You don’t need a PhD in ML to do
industrial-strength NLP.
LOWER BARRIER TO ENTRY
Textual data is still largely untapped in
healthcare, despite its value.
UNSTRUCTURED & UNTAPPED
Pre-train a large language model once and
fine-tune it many times for different use cases.
BioBERT beats BERT on biomedical tasks.
ClinicalBERT beats BioBERT on clinical tasks.
DOMAIN SPECIFIC BEATS GENERIC
GROWTH OF MULTI-MODAL DATASETS
The Transformer and its derivatives, such as BERT and
XLNet, produce game-changing performance improvements.
DRAMATICALLY IMPROVING ALGORITHMS
CONVERSATIONAL AI NEEDS LARGE MODELS
EHR data, PubMed literature, Clinical Notes,
Imaging, Devices, Patient Communications, Social
Media.
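The pre-train-once, fine-tune-many pattern above can be sketched in miniature. This is a toy illustration only: a fixed random projection stands in for a frozen pre-trained encoder (in practice, a large Transformer such as BERT), and only a small task-specific head is trained per use case. All names and tasks here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a pre-trained encoder: a fixed (frozen) random projection.
# In practice this would be a large Transformer such as BERT.
W_enc = rng.normal(size=(8, 4))

def encode(x):
    """Frozen 'pre-trained' encoder: maps raw features to embeddings."""
    return np.tanh(x @ W_enc)

def fine_tune_head(X, y, steps=500, lr=0.5):
    """Train only a small logistic head on top of the frozen embeddings."""
    Z = encode(X)                            # encoder weights never change
    w = np.zeros(Z.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(Z @ w)))   # logistic head
        w -= lr * Z.T @ (p - y) / len(y)     # gradient step on head only
    return w

# Two different "use cases" share the same frozen encoder:
X = rng.normal(size=(64, 8))
y_sentiment = (X[:, 0] > 0).astype(float)    # toy task 1
y_triage = (X[:, 1] > 0).astype(float)       # toy task 2
w_sentiment = fine_tune_head(X, y_sentiment)
w_triage = fine_tune_head(X, y_triage)
```

The expensive step (pre-training the encoder) happens once; each new task costs only a cheap head, which is why one BERT-style model can serve many downstream use cases.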
USE CASES IN HEALTHCARE
Text Classification
Sentiment Analysis
Intent Classification
Message Triaging
Claims Processing
Named Entity Recognition
Information Extraction
Features in ML models
Knowledge Graphs
Automatic Weak Labeling
De-identification
Question Answering
Answer questions posed in
natural language
Chatbots
Text Summarization
Summarize physician
notes, radiology reports
etc.
Speech Recognition
Call Center optimization
Voice commands
Machine Translation
Patient Engagement
Published Literature
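To make one of the use cases above concrete, here is a minimal rule-based sketch of de-identification. The patterns and the note text are purely illustrative; real de-identification covers many more PHI categories (HIPAA lists 18) and typically relies on trained NER models rather than hand-written rules.

```python
import re

# Illustrative patterns only; a real system would handle names,
# addresses, and many other PHI categories, usually via NER models.
PATTERNS = {
    "DATE": r"\b\d{1,2}/\d{1,2}/\d{2,4}\b",
    "PHONE": r"\b\d{3}-\d{3}-\d{4}\b",
    "MRN": r"\bMRN:?\s*\d+\b",
}

def deidentify(note: str) -> str:
    """Replace matches of each pattern with a category placeholder."""
    for label, pattern in PATTERNS.items():
        note = re.sub(pattern, f"[{label}]", note)
    return note

note = "Seen on 03/14/2019, MRN: 123456. Call 555-867-5309 to follow up."
print(deidentify(note))
# → Seen on [DATE], [MRN]. Call [PHONE] to follow up.
```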
RACE TO CONVERSATIONAL AI
Exceeding Human-Level Performance
GLUE Leaderboard, 2017 → today: Google (Transformer), Google (BERT),
Facebook (RoBERTa), Alibaba (Enriched BERT base), Uber (Plato),
Microsoft (MT-DNN), Baidu (ERNIE)
DOMAIN SPECIFIC BEATS GENERIC
BioBERT
• Pre-trained on top of BERT using
PubMed data
• Beats BERT on biomedical tasks.
ClinicalBERT(s)
• Pre-trained on top of BioBERT using
clinical notes
• Beats BioBERT on clinical tasks.
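The domain-specific pre-training behind BioBERT and ClinicalBERT reuses BERT's masked-language-model objective, just on PubMed abstracts or clinical notes. A toy sketch of the masking step follows; the 15% rate comes from the BERT recipe, while the tokenization, seed, and sentence are illustrative (BERT's full recipe also leaves 10% of selected tokens unchanged and swaps 10% for random tokens, which this sketch omits).

```python
import random

def mask_tokens(tokens, mask_rate=0.15, seed=1):
    """Replace ~mask_rate of tokens with [MASK]; return masked seq and targets.

    Simplified: every selected token becomes [MASK]. The model is then
    trained to predict the original token at each masked position.
    """
    rng = random.Random(seed)
    masked, targets = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_rate:
            targets[i] = tok          # original token the model must predict
            masked.append("[MASK]")
        else:
            masked.append(tok)
    return masked, targets

tokens = "the patient denies chest pain and shortness of breath".split()
masked, targets = mask_tokens(tokens)
```

Because the objective needs no labels, any in-domain corpus (PubMed, clinical notes) can be used directly, which is what lets domain-specific models beat generic ones.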
TRAIN USING NGC
Optimized, Scalable & Easy to Use
For comparison, the BioBERT paper reported 10+ days
(240+ hours) to train on an 8× 32 GB V100 system.
https://news.developer.nvidia.com/biobert-optimized/