Natural language generation and understanding of big code for AI-assisted programming: A review
MF Wong, S Guo, CN Hang, SW Ho, CW Tan - Entropy, 2023 - mdpi.com
This paper provides a comprehensive review of the literature concerning the utilization of
Natural Language Processing (NLP) techniques, with a particular focus on transformer …
Deep learning in electron microscopy
JM Ede - Machine Learning: Science and Technology, 2021 - iopscience.iop.org
Deep learning is transforming most areas of science and technology, including electron
microscopy. This review paper offers a practical perspective aimed at developers with …
CodeT5+: Open code large language models for code understanding and generation
Large language models (LLMs) pretrained on vast source code have achieved prominent
progress in code intelligence. However, existing code LLMs have two main limitations in …
CodeRL: Mastering code generation through pretrained models and deep reinforcement learning
Program synthesis or code generation aims to generate a program that satisfies a problem
specification. Recent approaches using large-scale pretrained language models (LMs) have …
Competition-level code generation with AlphaCode
Programming is a powerful and ubiquitous problem-solving tool. Systems that can assist
programmers or even generate programs themselves could make programming more …
Is ChatGPT the ultimate programming assistant -- how far is it?
Recently, the ChatGPT LLM has received great attention: it can be used as a bot for
discussing source code, prompting it to suggest changes, provide descriptions or even …
UniXcoder: Unified cross-modal pre-training for code representation
Pre-trained models for programming languages have recently demonstrated great success
on code intelligence. To support both code-related understanding and generation tasks …
CodeGeeX: A pre-trained model for code generation with multilingual evaluations on HumanEval-X
Large pre-trained code generation models, such as OpenAI Codex, can generate syntactically
and functionally correct code, making the coding of programmers more productive and our …
CodeT5: Identifier-aware unified pre-trained encoder-decoder models for code understanding and generation
Pre-trained models for Natural Languages (NL) like BERT and GPT have recently been
shown to transfer well to Programming Languages (PL) and largely benefit a broad set of …
On the opportunities and risks of foundation models
AI is undergoing a paradigm shift with the rise of models (e.g., BERT, DALL-E, GPT-3) that are
trained on broad data at scale and are adaptable to a wide range of downstream tasks. We …