Pre-trained language models in biomedical domain: A systematic survey

B Wang, Q Xie, J Pei, Z Chen, P Tiwari, Z Li… - ACM Computing …, 2023 - dl.acm.org
Pre-trained language models (PLMs) have been the de facto paradigm for most natural
language processing tasks. This also benefits the biomedical domain: researchers from …

Pre-trained Language Models in Biomedical Domain: A Systematic Survey

B Wang, Q Xie, J Pei, Z Chen, P Tiwari, Z Li, J Fu - 2023 - researchgate.net
As the principal method of communication, humans usually record information and
knowledge in the form of token sequences, e.g., natural languages, time series, constructed …

Pre-trained Language Models in Biomedical Domain: A Systematic Survey

B Wang, Q Xie, J Pei, Z Chen, P Tiwari, Z Li - arXiv preprint arXiv …, 2021 - arxiv.org
Pre-trained language models (PLMs) have been the de facto paradigm for most natural
language processing (NLP) tasks. This also benefits the biomedical domain: researchers from …