Natural language generation and understanding of big code for AI-assisted programming: A review

MF Wong, S Guo, CN Hang, SW Ho, CW Tan - Entropy, 2023 - mdpi.com
This paper provides a comprehensive review of the literature on the use of Natural Language
Processing (NLP) techniques, with a particular focus on transformer …

Deep learning in electron microscopy

JM Ede - Machine Learning: Science and Technology, 2021 - iopscience.iop.org
Deep learning is transforming most areas of science and technology, including electron
microscopy. This review paper offers a practical perspective aimed at developers with …

CodeT5+: Open code large language models for code understanding and generation

Y Wang, H Le, AD Gotmare, NDQ Bui, J Li… - arXiv preprint arXiv …, 2023 - arxiv.org
Large language models (LLMs) pretrained on vast amounts of source code have achieved
prominent progress in code intelligence. However, existing code LLMs have two main limitations in …

CodeRL: Mastering code generation through pretrained models and deep reinforcement learning

H Le, Y Wang, AD Gotmare… - Advances in Neural …, 2022 - proceedings.neurips.cc
Program synthesis or code generation aims to generate a program that satisfies a problem
specification. Recent approaches using large-scale pretrained language models (LMs) have …
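
To make the reinforcement learning recipe behind this line of work concrete, below is a minimal, self-contained sketch of the core idea: sample a candidate program from the model, run the problem's unit tests, and use pass/fail as a scalar reward weighting the sample's log-likelihood (REINFORCE). All names and values here are illustrative toy stand-ins, not CodeRL's actual API.

    import torch

    def reinforce_loss(log_probs: torch.Tensor, reward: float) -> torch.Tensor:
        # log_probs: per-token log-probabilities of the sampled program.
        # A positive reward reinforces the sample; a negative one suppresses it.
        return -reward * log_probs.sum()

    def run_unit_tests(program: str, tests: list[tuple[str, str]]) -> float:
        # Toy reward: +1 if the program passes every test, -1 otherwise.
        env: dict = {}
        try:
            exec(program, env)
            for call, expected in tests:
                if repr(eval(call, env)) != expected:
                    return -1.0
            return 1.0
        except Exception:
            return -1.0

    # A sampled candidate program and placeholder per-token log-probabilities.
    candidate = "def add(a, b):\n    return a + b"
    reward = run_unit_tests(candidate, [("add(2, 3)", "5")])
    log_probs = torch.log(torch.tensor([0.9, 0.8, 0.95]))  # stand-in values
    loss = reinforce_loss(log_probs, reward)
    print(reward, loss.item())

In the paper itself, the reward comes from executing unit tests on programs sampled from a pretrained LM, with a learned critic refining credit assignment; the toy above only mirrors the shape of that training signal.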

Competition-level code generation with AlphaCode

Y Li, D Choi, J Chung, N Kushman, J Schrittwieser… - Science, 2022 - science.org
Programming is a powerful and ubiquitous problem-solving tool. Systems that can assist
programmers or even generate programs themselves could make programming more …

Is ChatGPT the ultimate programming assistant -- how far is it?

H Tian, W Lu, TO Li, X Tang, SC Cheung… - arXiv preprint arXiv …, 2023 - arxiv.org
Recently, the ChatGPT LLM has received great attention: it can be used as a bot to discuss
source code, prompting it to suggest changes, provide descriptions, or even …
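
As a concrete illustration of the code-discussion use case described above, the sketch below asks a chat LLM to suggest changes to a small function. It assumes the openai Python package (1.x) and an OPENAI_API_KEY environment variable; the model name and prompt are illustrative choices, not the paper's evaluation protocol.

    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    snippet = "def mean(xs):\n    return sum(xs) / len(xs)"  # fails on empty input
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": "You are a code review assistant."},
            {"role": "user", "content": f"Suggest changes to this function:\n{snippet}"},
        ],
    )
    print(response.choices[0].message.content)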

UniXcoder: Unified cross-modal pre-training for code representation

D Guo, S Lu, N Duan, Y Wang, M Zhou… - arXiv preprint arXiv …, 2022 - arxiv.org
Pre-trained models for programming languages have recently demonstrated great success in
code intelligence. To support both code-related understanding and generation tasks …

CodeGeeX: A pre-trained model for code generation with multilingual evaluations on HumanEval-X

Q Zheng, X Xia, X Zou, Y Dong, S Wang, Y Xue… - arXiv preprint arXiv …, 2023 - arxiv.org
Large pre-trained code generation models, such as OpenAI Codex, can generate syntactically
and functionally correct code, making programmers more productive and our …

CodeT5: Identifier-aware unified pre-trained encoder-decoder models for code understanding and generation

Y Wang, W Wang, S Joty, SCH Hoi - arXiv preprint arXiv:2109.00859, 2021 - arxiv.org
Pre-trained models for Natural Languages (NL) like BERT and GPT have recently been shown
to transfer well to Programming Languages (PL) and largely benefit a broad set of …
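
Because CodeT5 is pretrained with T5-style span denoising over code (made identifier-aware in the paper), the released checkpoint can fill a masked span out of the box. A minimal sketch, assuming the Hugging Face transformers library and the public Salesforce/codet5-base checkpoint:

    from transformers import RobertaTokenizer, T5ForConditionalGeneration

    tokenizer = RobertaTokenizer.from_pretrained("Salesforce/codet5-base")
    model = T5ForConditionalGeneration.from_pretrained("Salesforce/codet5-base")

    # Mask part of the function with a sentinel token; the model was pretrained
    # to reconstruct such spans.
    code = "def greet(user): print(f'hello <extra_id_0>!')"
    input_ids = tokenizer(code, return_tensors="pt").input_ids
    generated_ids = model.generate(input_ids, max_length=8)
    print(tokenizer.decode(generated_ids[0], skip_special_tokens=True))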

On the opportunities and risks of foundation models

R Bommasani, DA Hudson, E Adeli, R Altman… - arXiv preprint arXiv …, 2021 - arxiv.org
AI is undergoing a paradigm shift with the rise of models (e.g., BERT, DALL-E, GPT-3) that are
trained on broad data at scale and are adaptable to a wide range of downstream tasks. We …