Thu 7 Dec 2023, 15:15 - 15:30, at Golden Gate C1 - Models of Code and Documentation. Chair(s): Gema Rodríguez-Pérez

Software is constantly changing, requiring developers to perform several derived tasks in a timely manner, such as describing the intention behind a code change or identifying defect-prone code changes. Considering that the cost of dealing with these tasks can account for a large proportion (typically around 70 percent) of total development expenditure, automating such processes would significantly lighten developers' workloads. To achieve this goal, existing approaches mainly rely on training deep learning models from scratch or fine-tuning existing pre-trained models on such tasks, both of which have weaknesses. Specifically, the former uses comparatively small-scale labelled data for training, making it difficult to learn and exploit the domain knowledge of programming languages hidden in the large amounts of unlabelled code in the wild; the latter can hardly fully leverage the learned knowledge of the pre-trained model, as existing pre-trained models are designed to encode a single code snippet rather than a code change (i.e., the difference between two code snippets). We propose to pre-train a model specially designed for code changes to better support developers in software maintenance. To this end, we first collect a large-scale dataset containing 1.5M+ pairs of code changes and commit messages. Based on these data, we curate five different tasks for pre-training, which equip the model with diverse domain knowledge about code changes. We then fine-tune the pre-trained model, CCT5, on three widely studied tasks incurred by code changes and two tasks specific to the code review process. Results show that CCT5 outperforms both conventional deep learning approaches and existing pre-trained models on these tasks.
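
As a rough illustration of the fine-tuning setup the abstract describes (not CCT5's released implementation), the minimal Python sketch below fine-tunes the publicly available Salesforce/codet5-base checkpoint to generate commit messages from code diffs. The JSONL file name and its "diff"/"message" fields are hypothetical placeholders, and the hyperparameters are arbitrary.

from datasets import load_dataset
from transformers import (AutoTokenizer, DataCollatorForSeq2Seq,
                          Seq2SeqTrainer, Seq2SeqTrainingArguments,
                          T5ForConditionalGeneration)

# Stand-in checkpoint; CCT5's own weights and tokenizer are not assumed here.
checkpoint = "Salesforce/codet5-base"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = T5ForConditionalGeneration.from_pretrained(checkpoint)

# Hypothetical JSONL file: one {"diff": ..., "message": ...} object per line.
data = load_dataset("json", data_files="code_changes.jsonl")["train"]

def preprocess(example):
    # Source sequence: the code change (e.g., a unified diff).
    inputs = tokenizer(example["diff"], truncation=True, max_length=512)
    # Target sequence: the commit message describing the change.
    inputs["labels"] = tokenizer(example["message"], truncation=True,
                                 max_length=64)["input_ids"]
    return inputs

data = data.map(preprocess, remove_columns=data.column_names)

trainer = Seq2SeqTrainer(
    model=model,
    args=Seq2SeqTrainingArguments(output_dir="codechange-ft",
                                  per_device_train_batch_size=8,
                                  num_train_epochs=3),
    train_dataset=data,
    # Pads inputs and labels to a common length within each batch.
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()

The same (diff, message) pairing could in principle back other tasks the abstract mentions, e.g. swapping the target sequence for a defect-proneness label when fine-tuning for just-in-time defect prediction.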

Thu 7 Dec

Displayed time zone: Pacific Time (US & Canada)

14:00 - 15:30
Models of Code and Documentation (Research Papers / Journal First / Ideas, Visions and Reflections) at Golden Gate C1
Chair(s): Gema Rodríguez-Pérez (University of British Columbia)
14:00 (15m) Talk
On the Usage of Continual Learning for Out-of-Distribution Generalization in Pre-trained Language Models of Code
Research Papers
Martin Weyssow (DIRO, Université de Montréal), Xin Zhou (Singapore Management University, Singapore), Kisub Kim (School of Computing and Information Systems, Singapore Management University), David Lo (School of Computing and Information Systems, Singapore Management University), Houari Sahraoui (DIRO, Université de Montréal)
14:15 (15m) Talk
A Vision on Intentions in Software Engineering
Ideas, Visions and Reflections
Jacob Krüger (Eindhoven University of Technology), Yi Li (Nanyang Technological University), Chenguang Zhu (Meta), Marsha Chechik (University of Toronto), Thorsten Berger (Ruhr University Bochum), Julia Rubin (University of British Columbia, Canada)
14:30 (15m) Paper
Automated Identification of Toxic Code Reviews Using ToxiCR
Journal First
Jaydeb Sarker (Department of Computer Science, Wayne State University), Asif Kamal Turzo (Wayne State University), Amiangshu Bosu (Wayne State University), Ming Dong (Wayne State University)
14:45 (15m) Talk
GrACE: Language Models Meet Code Edits
Research Papers
Priyanshu Gupta (Microsoft), Avishree Khare (Microsoft), Yasharth Bajpai (Microsoft), Saikat Chakraborty (Microsoft Research), Sumit Gulwani (Microsoft), Aditya Kanade (Microsoft Research India), Arjun Radhakrishna (Microsoft), Gustavo Soares (Microsoft), Ashish Tiwari (Microsoft)
15:00 (15m) Talk
Recommending Analogical APIs via Knowledge Graph Embedding
Research Papers
Mingwei Liu (Fudan University), Yanjun Yang (Fudan University), Yiling Lou (Fudan University), Xin Peng (Fudan University), Zhong Zhou (Fudan University), Xueying Du (Fudan University), Tianyong Yang (Fudan University)
15:15 (15m) Talk
[Remote] CCT5: A Code-Change-Oriented Pre-Trained Model
Research Papers
Bo Lin (National University of Defense Technology), Shangwen Wang (National University of Defense Technology), Zhongxin Liu (Zhejiang University), Yepang Liu (Southern University of Science and Technology), Xin Xia (Huawei Technologies), Xiaoguang Mao (National University of Defense Technology)