Growing Up: I was born and raised in Kolkata, the cultural capital of India. It is home to 3 of the 5 Nobel laureates who were Indian citizens (or 4 of the 9 Nobel laureates of Indian origin), and also home to the most amazing sweets.
Academia: I spent four unforgettable years of undergraduate study at Jadavpur University, where I was taught how to become a better human being and was exposed to different aspects of life. Research-wise, I worked on Signal Processing, Control Systems, and Nonlinear Dynamical Systems during my undergraduate days. I was advised by Prof. Amitava Gupta and Prof. Ratna Ghosh, and collaborated very closely with Dr. Saptarshi Das, Dr. Indranil Pan, and Shantanu Das, Scientist H+, BARC. Together we authored several research papers during my undergraduate years. During my UG research I proposed a new generalized arbitrary-order (fractional derivative) Butterworth filter, which still gives me immense pleasure.
In 2013 I moved to the US and joined the University of California, Irvine for my MS studies. I was advised by Prof. Animashree Anandkumar, and later I worked with and co-authored a paper with Prof. Charless Fowlkes. During my MS I was a visiting research scholar at TTIC, hosted by Prof. David McAllester, and also worked as an intern at Toyota Research in North America (TRINA), where we developed stereo algorithms to make driverless cars work. I also interned at the wonderful start-up FEM inc, which has since been acquired by AC Nielsen.
Building Alexa: After my MS, I joined Amazon Alexa AI in June 2016, after briefly working as a data scientist at Schlumberger and at eBay Customer Insights and Analytics in 2015. I worked as an Applied Scientist at Amazon Alexa AI for three years, until June 2019. At Amazon, I was involved from the very conception of Alexa and got to propose, design, and implement solutions to a wide range of NLP problems [I built and launched NER, intent classifiers, and domain classifiers for the mainstream Alexa models]. The most exciting part of my journey @Amazon was learning how to build large NLP systems: wiring up different NLP components together in a production setting. During my last year or so, as a senior scientist, I focused mostly on multi-turn dialog agents; in particular, I built dialog simulators and led that effort along with a couple of junior scientists and SDEs [Here is what we launched and here is my mention on LinkedIn re: launch]. My work @Amazon has produced a paper and two patents. Here are some articles about my work @Amazon: Amazon Alexa Science Blog, Packt Blog Post.
Back in Academia: I am currently doing a PhD (July 2019 – Present) at The University of Texas at Austin, advised by Prof. Inderjit S. Dhillon and Prof. Sujay Sanghavi. I am a member of WNCG and the Oden Institute for Computational Engineering and Sciences.
- Faster non-convex federated learning via global and local momentum (arXiv)
R. Das, A. Acharya, A. Hashemi, S. Sanghavi, I. Dhillon, U. Topcu.
To appear in the Conference on Uncertainty in Artificial Intelligence (UAI), 2022. (Spotlight)
- Robust Training in High Dimensions via Block Coordinate Geometric Median Descent (arXiv, slides, poster, code)
A. Acharya, A. Hashemi, P. Jain, S. Sanghavi, I. Dhillon, U. Topcu.
In International Conference on Artificial Intelligence and Statistics (AISTATS), March 2022.
- On the Benefits of Multiple Gossip Steps in Communication-Constrained Decentralized Federated Learning (pdf)
A. Hashemi, A. Acharya, R. Das, H. Vikalo, S. Sanghavi, I. Dhillon.
IEEE Transactions on Parallel and Distributed Systems (IEEE TPDS), December 2021.
- Online embedding compression for text classification using low rank matrix factorization (pdf)
A. Acharya, R. Goel, A. Metallinou, I. Dhillon.
In the AAAI Conference on Artificial Intelligence (AAAI), pp. 6196-6203, 2019.