This photo was taken during my internship at Mila.
Hi there! I’m Ning Dai, a first-year PhD student in the School of Electrical Engineering and Computer Science at Oregon State University (advised by Prof. Liang Huang).
Previously, I was an undergraduate research assistant in the FudanNLP Group (advised by Prof. Xipeng Qiu).
My research focuses on Natural Language Processing and Machine Learning. I have explored a range of topics in NLP and ML, including Reinforcement Learning, Dependency Parsing, Text Generation, and Self-supervised Representation Learning. My current interests include:
Multilingual Representation Learning
(e.g. Probing/Understanding Multilingual Pre-training Models)
Unsupervised Sequence-to-Sequence Learning
(e.g. Unsupervised Machine Translation, Unpaired Text Style Transfer)
Semantic Parsing and Program Synthesis
(e.g. Code Generation)
Style Transformer: Unpaired Text Style Transfer without Disentangled Latent Representation (ACL 2019) [paper] [code]
Ning Dai, Jianze Liang, Xipeng Qiu, Xuanjing Huang
Pre-trained Models for Natural Language Processing: A Survey [paper]
Xipeng Qiu, Tianxiang Sun, Yige Xu, Yunfan Shao, Ning Dai, Xuanjing Huang
ByteDance AI Lab
Sept 2020 ~ Aug 2021 ‖ Research Intern, MLNLC Group
AWS Shanghai AI Lab
Nov 2019 ~ Aug 2020 ‖ Applied Scientist Intern, Prof. Zheng Zhang’s Group
Montreal Institute for Learning Algorithms (Mila)
Jul 2019 ~ Oct 2019 ‖ Research Intern, Prof. Christopher Pal’s Group