I'm a computer science student at Columbia University.
My research interests are primarily in deep learning, machine learning, speech recognition, computer vision, natural language processing, reinforcement learning, and GPU computing. I'm also broadly interested in robotics, quantum computing, meta-learning, AI for strategic game playing, evolutionary and DL-driven art, the technological singularity, and related topics.

I'm currently doing research with Prof. Yoshua Bengio and Prof. Aaron Courville at MILA. We just submitted a paper on deep semi-supervised learning to ICLR 2016 for review. In the paper, I introduce a new variant of the Ladder Network that sets a new state of the art on permutation-invariant MNIST in both the fully supervised and semi-supervised settings.

In summer 2015, I interned at Baidu AI Lab under the supervision of chief scientist Andrew Ng, AI Lab director Adam Coates, and research scientist Dario Amodei. We worked on Deep Speech 2, an industrial-scale end-to-end speech recognition engine for noisy and accented English and Mandarin.

In spring 2015, I designed Laminar, a comprehensive and scalable deep learning library for training and deploying feed-forward and recurrent neural networks of arbitrary topology. The framework was built from scratch in 18,000 lines of C++11. I had the privilege of being mentored by Bjarne Stroustrup, the creator of the C++ programming language.

In 2014, I worked as a research assistant at the Columbia NLP Group, advised by Prof. Michael Collins and IBM senior researcher Brian Kingsbury. I was involved in the IARPA-funded Babel speech recognition project, where we scaled up and improved the Rahimi-Recht SVM kernel approximation method for Bengali and Cantonese phoneme classification.
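The Rahimi-Recht method makes kernel SVMs scale by replacing the RBF kernel with an explicit random Fourier feature map, so a linear classifier on the transformed features approximates the kernel machine. A minimal NumPy sketch of the feature map (illustrative only; the function name and parameters here are not from the Babel project):

```python
import numpy as np

def random_fourier_features(X, n_features=500, gamma=1.0, seed=0):
    """Approximate the RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)
    via random Fourier features (Rahimi & Recht)."""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Frequencies drawn from the kernel's spectral density: N(0, 2*gamma*I).
    W = rng.normal(scale=np.sqrt(2 * gamma), size=(d, n_features))
    b = rng.uniform(0, 2 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

# The inner product of the features approximates the kernel: z(x)^T z(y) ~ k(x, y).
X = np.random.default_rng(1).normal(size=(100, 10))
Z = random_fourier_features(X, n_features=5000, gamma=0.5)
K_approx = Z @ Z.T
K_exact = np.exp(-0.5 * np.sum((X[:, None] - X[None, :]) ** 2, axis=-1))
```

A linear SVM trained on `Z` then stands in for an RBF-kernel SVM on `X`, which is what makes the approach tractable for speech-sized phoneme datasets.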

Also in 2014, I collaborated remotely with Stanford AI Lab on the Autonomous Driving Project, advised by Prof. Andrew Ng and Adam Coates.

Earlier, I was a research assistant at the Columbia Computer Vision Group with Prof. Shree Nayar and Prof. Daniel Hsu on an interdisciplinary project with the astrophysics department. I also participated in a ChemE ontology-based knowledge-engine project with the Complex Resilient Intelligent Systems lab, led by Prof. Venkat Venkatasubramanian.

As for more personal projects, I conceived and developed Quark++, an efficient C++ quantum computer simulator capable of running generic quantum gate algorithms such as Shor's factorization and Grover's search. I subsequently designed Quarklang, a Python-like quantum computing simulation language whose compiler is implemented in OCaml.
Over the summer of 2013, I wrote a FIDE-tournament-compliant chess engine that could easily defeat the best of my amateur chess friends.
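To give a flavor of what a state-vector quantum simulator computes, Grover's search fits in a few lines of NumPy (a generic sketch of the algorithm, not Quark++'s actual C++ API):

```python
import numpy as np

def grover_search(n_qubits, target):
    """State-vector simulation of Grover's search for one marked item."""
    N = 2 ** n_qubits
    state = np.full(N, 1 / np.sqrt(N))    # uniform superposition (H on every qubit)
    iterations = int(round(np.pi / 4 * np.sqrt(N)))
    for _ in range(iterations):
        state[target] *= -1               # oracle: flip the phase of the target
        mean = state.mean()
        state = 2 * mean - state          # diffusion: inversion about the mean
    return int(np.argmax(state ** 2))     # measure: most probable outcome

grover_search(8, target=42)  # → 42
```

The ~pi/4 * sqrt(N) iteration count is the textbook optimum; the simulator tracks all 2^n amplitudes explicitly, which is why such tools are limited to modest qubit counts.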