About Me
I’m a Ph.D. candidate at the Okumura–Takamura–Funakoshi Lab at Tokyo Institute of Technology. My research mainly focuses on word representation learning (e.g., word2vec) and neural language models. My email is x@gmail.com, where x is yukunfg.
Education
- 2020.4 - present: Ph.D. candidate, Tokyo Institute of Technology
  - Supervisor: Prof. Manabu Okumura
- 2018.4 - 2020.3: M.E., Tokyo Institute of Technology
  - Supervisor: Prof. Hiroya Takamura
- 2012.9 - 2016.7: B.E., Beijing Language and Culture University
  - Supervisor: Prof. Dong Yu
Work Experience
- 2022.04 - 2022.09: Applied Scientist Intern, Amazon Alexa AI, Cambridge, UK
  - Mentor and manager: Tom Ayoola, Andrea Pierleoni
- 2019.11 - 2020.02: Research Intern, Google Research, Mountain View, U.S.
  - Hosts: Amir Fayazi, Abhinav Rastogi
- 2017.02 - 2018.03: R&D Engineer, Sogou Inc., Beijing, China
- 2015.08 - 2017.02: R&D Intern/Engineer, Baidu Inc., Beijing, China
  - Intern: 5/2015-07/2016; Full-time: 7/2016-02/2017
Publications
- Yukun Feng, Amir Fayazi, Abhinav Rastogi and Manabu Okumura, Efficient Entity Embedding Construction from Type Knowledge for BERT, Findings of the Asia-Pacific Chapter of the Association for Computational Linguistics (AACL-IJCNLP Findings 2020). [code]
- Yukun Feng, Chenlong Hu, Hidetaka Kamigaito, Hiroya Takamura and Manabu Okumura, Improving Character-Aware Neural Language Model by Warming Up Character Encoder under Skip-gram Architecture, The International Conference Recent Advances in Natural Language Processing (RANLP 2021). [code] [slide]
- Yijin Xiong, Yukun Feng, Hao Wu, Hidetaka Kamigaito, Hiroya Takamura and Manabu Okumura, Fusing Label Embedding into BERT: An Efficient Improvement for Text Classification, Findings of the Association for Computational Linguistics (ACL-IJCNLP 2021 Findings).
- Chenlong Hu, Yukun Feng, Hidetaka Kamigaito, Hiroya Takamura and Manabu Okumura, One-class Text Classification with Multi-Modal Deep Support Vector Data Description, The European Chapter of the Association for Computational Linguistics (EACL 2021).
- Yukun Feng, Chenlong Hu, Hidetaka Kamigaito, Hiroya Takamura and Manabu Okumura, A Simple and Effective Usage of Word Clusters for CBOW Model, The Asia-Pacific Chapter of the Association for Computational Linguistics (AACL-IJCNLP 2020). [code] [slide]
- Yukun Feng, Hidetaka Kamigaito, Hiroya Takamura and Manabu Okumura, A Simple and Effective Method for Injecting Word-level Information into Character-aware Neural Language Models, The SIGNLL Conference on Computational Natural Language Learning (CoNLL 2019). [code] [poster]
- Yasufumi Taniguchi, Yukun Feng, Hiroya Takamura and Manabu Okumura, Generating Live Soccer-Match Commentary from Play Data, Thirty-Third AAAI Conference on Artificial Intelligence (AAAI 2019).
- Yukun Feng, Dong Yu, Jian Xu and Chunhua Liu, Semantic Frame Labeling with Target-based Neural Model, Sixth Joint Conference on Lexical and Computational Semantics (*SEM 2017, ACL workshop).
- Yukun Feng, Yipei Xu and Dong Yu, An End-to-End Approach to Learning Semantic Frames with Feedforward Neural Network, NAACL 2016 Student Research Workshop.
- Yukun Feng, Qiao Deng and Dong Yu, BLCUNLP: Corpus Pattern Analysis for Verbs Based on Dependency Chain, International Workshop on Semantic Evaluation (SemEval-2015, NAACL Workshop).