Akhila Yerukola

Hi! I am a first-year Ph.D. student at Carnegie Mellon University's School of Computer Science in the Language Technologies Institute. I am fortunate to be advised by Maarten Sap.

Previously, I worked as a Senior AI Research Engineer at the Artificial Intelligence Center in Samsung Research America (SRA), where I was advised by Hongxia Jin. I received my Master's in Computer Science from Stanford University in 2019. I was a part of the Stanford NLP Group, where I was advised by Professor Chris Manning. Before that, I received my B.Tech in Computer Science from the National Institute of Technology Tiruchirappalli (NIT Trichy), Tamil Nadu, India in 2016.

My research interests are in natural language processing (NLP), specifically low-resource settings, explainability of natural language generation (NLG), and commonsense reasoning.

Here is my CV [Updated Dec 2021].

Email:  (Click to unscramble)
GitHub  /  Google Scholar  /  Twitter  /  LinkedIn

News
  •   [Aug 2022]  I joined CMU's LTI to pursue my PhD!
Publications
  • The GEM Benchmark: Natural Language Generation, its Evaluation and Metrics
    Sebastian Gehrmann, Tosin Adewumi, Karmanya Aggarwal, Pawan Sasanka Ammanamanchi, Aremu Anuoluwapo, Antoine Bosselut, Khyathi Raghavi Chandu, Miruna Clinciu, Dipanjan Das, Kaustubh D Dhole, Wanyu Du, Esin Durmus, Ondřej Dušek, Chris Emezue, Varun Gangal, Cristina Garbacea, Tatsunori Hashimoto, Yufang Hou, Yacine Jernite, Harsh Jhamtani, Yangfeng Ji, Shailza Jolly, Dhruv Kumar, Faisal Ladhak, Aman Madaan, Mounica Maddela, Khyati Mahajan, Saad Mahamood, Bodhisattwa Prasad Majumder, Pedro Henrique Martins, Angelina McMillan-Major, Simon Mille, Emiel van Miltenburg, Moin Nadeem, Shashi Narayan, Vitaly Nikolaev, Rubungo Andre Niyongabo, Salomey Osei, Ankur Parikh, Laura Perez-Beltrachini, Niranjan Ramesh Rao, Vikas Raunak, Juan Diego Rodriguez, Sashank Santhanam, João Sedoc, Thibault Sellam, Samira Shaikh, Anastasia Shimorina, Marco Antonio Sobrevilla Cabezudo, Hendrik Strobelt, Nishant Subramani, Wei Xu, Diyi Yang, Akhila Yerukola, Jiawei Zhou
    [Website]
    Association for Computational Linguistics (ACL) Workshop, 2021.

  • Data Augmentation for Voice-Assistant NLU using BERT-based Interchangeable Rephrase
    Akhila Yerukola, Mason Bretan, Hongxia Jin
    * equal contribution
    European Chapter of the Association for Computational Linguistics (EACL), 2021.

  • Do Massively Pretrained Language Models Make Better Storytellers?
    Abigail See, Aneesh Pappu, Rohun Saxena, Akhila Yerukola, Christopher D. Manning
    * equal contribution
    Conference on Computational Natural Language Learning (CoNLL), 2019.
    [Code][bib]

Teaching Experience
  • In Spring 2019, I was a TA for CS224U (Natural Language Understanding), taught by Christopher Potts and Bill MacCartney. I worked with a team of 10 CAs for 250+ students, and mentored 10+ student teams for the course project. I also taught a lecture on “Probing black box models.” (Link)
  • In Fall 2018, I was a TA for CS229 (Machine Learning), taught by Andrew Ng. I worked with a team of 30+ CAs for 850+ students. I also mentored 30 student teams for the course project.
  • In Fall 2013, I taught the basics of C++, Java, and Android application development to freshmen at NIT Trichy. [Undergraduate]
Selected Awards
  • Samsung Research America (SRA) President's Award for Q4 2020 and Q3 2021.
  • Second Best Project Award, CS231n (Convolutional Neural Networks for Visual Recognition) 2018.
  • Second Best Project Award, CS224n (NLP with Deep Learning) 2018.
  • Third Highest GPA Award, Department of Computer Science, NITT, 2015.
  • Full undergraduate scholarship, Ministry of Human Resource Development (MHRD), India 2012-2016.

This website template is from Jon Barron and Jeff Donahue.