Akhila Yerukola

Hi! I am an AI Research Engineer at Samsung Research America (SRA). I work at the Artificial Intelligence Center, advised by Hongxia Jin.

I graduated from Stanford University with a Master's in Computer Science in 2019. I was part of the Stanford NLP Group, where I was advised by Professor Chris Manning. I received my B.Tech in Computer Science from the National Institute of Technology Tiruchirappalli, Tamil Nadu, India, in 2016.

My current research interests include low-resource NLP, language generation in dialog systems, and commonsense reasoning in AI/deep learning.

Here is my CV [Updated Oct 2020].

Email:
GitHub  /  Google Scholar  /  Twitter  /  LinkedIn

Publications
  • The GEM Benchmark: Natural Language Generation, its Evaluation and Metrics
    Sebastian Gehrmann, Tosin Adewumi, Karmanya Aggarwal, Pawan Sasanka Ammanamanchi, Aremu Anuoluwapo, Antoine Bosselut, Khyathi Raghavi Chandu, Miruna Clinciu, Dipanjan Das, Kaustubh D Dhole, Wanyu Du, Esin Durmus, Ondřej Dušek, Chris Emezue, Varun Gangal, Cristina Garbacea, Tatsunori Hashimoto, Yufang Hou, Yacine Jernite, Harsh Jhamtani, Yangfeng Ji, Shailza Jolly, Dhruv Kumar, Faisal Ladhak, Aman Madaan, Mounica Maddela, Khyati Mahajan, Saad Mahamood, Bodhisattwa Prasad Majumder, Pedro Henrique Martins, Angelina McMillan-Major, Simon Mille, Emiel van Miltenburg, Moin Nadeem, Shashi Narayan, Vitaly Nikolaev, Rubungo Andre Niyongabo, Salomey Osei, Ankur Parikh, Laura Perez-Beltrachini, Niranjan Ramesh Rao, Vikas Raunak, Juan Diego Rodriguez, Sashank Santhanam, João Sedoc, Thibault Sellam, Samira Shaikh, Anastasia Shimorina, Marco Antonio Sobrevilla Cabezudo, Hendrik Strobelt, Nishant Subramani, Wei Xu, Diyi Yang, Akhila Yerukola, Jiawei Zhou
    [Website]

  • Data Augmentation for Voice-Assistant NLU using BERT-based Interchangeable Rephrase
    Akhila Yerukola, Mason Bretan, Hongxia Jin
    * equal contribution
    European Chapter of the Association for Computational Linguistics (EACL), 2021.

  • Do Massively Pretrained Language Models Make Better Storytellers?
    Abigail See, Aneesh Pappu, Rohun Saxena, Akhila Yerukola, Christopher D. Manning
    * equal contribution
    Conference on Computational Natural Language Learning (CoNLL), 2019.
    [Code] [bib]

Teaching Experience
  • In Spring 2019, I was a TA for CS224U (Natural Language Understanding), taught by Christopher Potts and Bill MacCartney. I worked with a team of 10 CAs for 250+ students and mentored 10+ student teams on the course project. I also taught a lecture on “Probing black box models.” (Link)
  • In Fall 2018, I was a TA for CS229 (Machine Learning), taught by Andrew Ng. I worked with a team of 30+ CAs for 850+ students. I also mentored 30 student teams on the course project.
Selected Awards
  • Second Best Project Award, CS231n (Convolutional Neural Networks for Visual Recognition) 2018.
  • Second Best Project Award, CS224n (NLP with Deep Learning) 2018.
  • Third Highest GPA Award, Department of Computer Science, NITT, 2015.
  • Full undergraduate scholarship, Ministry of Human Resource Development (MHRD), India, 2012–2016.

This website template is from Jon Barron and Jeff Donahue.