Joo-Kyung Kim



Email: jookyk at amazon.com | LinkedIn | Resume

I am a senior applied scientist at Amazon AGI. You can just call me JK.
My current major research interests are conversational AI with large language models, efficient post-training, LLM evaluation, agentic RAG, language grounding, and multi-hop reasoning.
At Amazon AGI, I have been designing and building machine learning models and systems to improve Alexa's core AI components, such as mixed-initiative conversational interaction, clarification/disambiguation, dynamic hypothesis routing, and personalized name-free interaction.
I received my PhD degree from the Department of Computer Science and Engineering at The Ohio State University, advised by Eric Fosler-Lussier.
During my PhD, I did three research internships: deep learning for conversational understanding at Microsoft (summer 2016), deep and recursive neural networks for constituency parsing at NEC Laboratories America (summer 2015), and deep and recurrent neural networks for named entity recognition at Nuance Sunnyvale (summer 2014).
Between 2012 and 2014, I worked on hierarchical compositions of acoustic features and multiple target labels from various languages with deep neural networks for IARPA's Babel Program.
I received my MS degree from Seoul National University, advised by Byoung-Tak Zhang, and my BE degree from Sogang University.
Before OSU, I worked at Naver for two years as a software engineer (manager), developing Internet search services. As alternative military service, I developed mobile games at ZIO Interactive and financial risk management software at IBK System.

Selected Publications
Jinyoung Park, Minseok Joo, Joo-Kyung Kim, and Hyunwoo J. Kim, "Generative Subgraph Retrieval for Knowledge Graph-Grounded Dialog Generation," EMNLP, pp. 21167-21182, 2024. [PDF]
Jihyung Kil, Farideh Tavazoee, Dongyeop Kang, and Joo-Kyung Kim, "II-MMR: Identifying and Improving Multi-modal Multi-hop Reasoning in Visual Question Answering," ACL Findings, 2024. [PDF]
Shirley Anugrah Hayati, Taehee Jung, Tristan Bodding-Long, Sudipta Kar, Abhinav Sethy, Joo-Kyung Kim, and Dongyeop Kang, "Chain-of-Instructions: Compositional Instruction Tuning on Large Language Models," arXiv:2402.11532, 2024. [PDF]
Jinyoung Park, Ameen Patel, Omar Zia Khan, Hyunwoo J. Kim, and Joo-Kyung Kim, "Graph Elicitation for Guiding Multi-Step Reasoning in Large Language Models," arXiv:2311.09762, 2023. [PDF]
Taehee Jung, Joo-Kyung Kim, Sungjin Lee, and Dongyeop Kang, "Cluster-Guided Label Generation in Extreme Multi-Label Classification," EACL, pp. 1670-1685, 2023. [PDF]
Joo-Kyung Kim, Guoyin Wang, Sungjin Lee, and Young-Bum Kim, "Deciding Whether to Ask Clarifying Questions in Large-Scale Spoken Language Understanding," ASRU, 2021. [PDF]
Joo-Kyung Kim and Young-Bum Kim, "Pseudo Labeling and Negative Feedback Learning for Large-scale Multi-label Domain Classification," ICASSP, pp. 7964-7968, 2020. [PDF]
Joo-Kyung Kim and Young-Bum Kim, "Supervised Domain Enablement Attention for Personalized Domain Classification," EMNLP, pp. 894-899, 2018. [PDF]
Joo-Kyung Kim and Young-Bum Kim, "Joint Learning of Domain Classification and Out-of-Domain Detection with Dynamic Class Weighting for Satisficing False Acceptance Rates," Interspeech, pp. 556-560, 2018. [PDF]
Young-Bum Kim, Dongchan Kim, Joo-Kyung Kim, and Ruhi Sarikaya, "A Scalable Neural Shortlisting-Reranking Approach for Large-Scale Domain Classification in Natural Language Understanding," NAACL, pp. 16-24, 2018. [PDF]
Joo-Kyung Kim, "Linguistic Knowledge Transfer for Enriching Vector Representations," PhD Dissertation, The Ohio State University, 2017. [PDF]
Joo-Kyung Kim, Young-Bum Kim, Ruhi Sarikaya, and Eric Fosler-Lussier, "Cross-Lingual Transfer Learning for POS Tagging without Cross-Lingual Resources," EMNLP, pp. 2822-2828, 2017. [PDF] [Code]
Joo-Kyung Kim, Gokhan Tur, Asli Celikyilmaz, Bin Cao, and Ye-Yi Wang, "Intent Detection using Semantically Enriched Word Embeddings," IEEE Workshop on Spoken Language Technology (SLT), pp. 414-419, 2016. [PDF]
Joo-Kyung Kim, Marie-Catherine de Marneffe, and Eric Fosler-Lussier, "Adjusting Word Embeddings with Semantic Intensity Orders," ACL 2016 Workshop on Representation Learning for NLP (RepL4NLP), pp. 62-69, 2016. [PDF] [Word vector files]
Joo-Kyung Kim, Marie-Catherine de Marneffe, and Eric Fosler-Lussier, "Neural word embeddings with multiplicative feature interactions for tensor-based compositions," NAACL 2015 Workshop on Vector Space Modeling for NLP (VSM), pp. 143-150, 2015. [PDF]
Joo-Kyung Kim and Marie-Catherine de Marneffe, "Deriving adjectival scales from continuous space word representations," EMNLP, pp. 1625-1630, 2013. [PDF]
Joo-Kyung Kim and Byoung-Tak Zhang, "Evolving hypernetworks for pattern classification," IEEE Congress on Evolutionary Computation (CEC), pp. 1856-1862, 2007.
Joo-Kyung Kim, Byung Soo Kim, Oh Hyuk Kwon, Seung Kon Hwang, Jung-Woo Ha, Chan-Hoon Park, Duck Jin Chung, Chong Ho Lee, Jaehyun Park, and Byoung-Tak Zhang, "A DNA computing-inspired silicon chip for pattern recognition," 13th International Meeting on DNA Computing (DNA), 2007.
Byoung-Tak Zhang and Joo-Kyung Kim, "DNA hypernetworks for information storage and retrieval," 12th International Meeting on DNA Computing (DNA), pp. 298-307, 2006.