Why hello stranger‽ (Yep, that's an interrobang). Unsure what brings you here, but hope you find this morass of factoids about me useful in some way.
My name is Vinay Uday Prabhu and I am pretty sure I was born an ailurophile and an Indian. I like tardigrades, crocodilians, statistical learning, Toto's Africa and biryani.
I (un)fortunately live in the SF Bay Area and am employed as the Chief Scientist at UnifyID, where I do machine learning, dad jokes, bad puns and consume copious amounts of sparkling water (& hence belch a lot). I also host the UnifyID AI fellowship, through which I mentor researchers working on projects spanning areas such as sensor signal processing, machine learning for human kinematics, security & deep learning. You should totally consider applying if you live in the Bay Area.
Email: vinay followed by 'up' at gmail
Hawt nyoos yo!
A: Stanford-HAI seminar talk
I have been passionately working on curating a set of resources and well-informed opinions that highlight the four horsemen of ethical malice in peer-reviewed machine learning literature. I will be presenting a Zoom talk on this under the aegis of the Stanford-HAI weekly seminar series on April 17, 2020, at 11:00am.
B: ICLR - 2020
Say what?! Daniel and Avoy's hard work paid off!
Daniel Wu*, Avoy Datta*, Vinay Prabhu, “BiPedalNet: Binarized Neural Networks for Resource-Constrained On-Device Gait Identification”, Proceedings of the Practical ML for Developing Countries Workshop @ ICLR 2020
Daniel J. Wu, Andrew C. Yang, Vinay Prabhu, “Afro-MNIST: Synthetic generation of MNIST-style datasets for low-resource languages”, Proceedings of the Practical ML for Developing Countries Workshop @ ICLR 2020
1: “Kannada-MNIST: Spurring the MNIST-moment for the numeral scripts in the developing world”, Proceedings, ML for the Developing World (ML4D) workshop
2: “Grassmannian Packings in Neural Networks: Learning with Maximal Subspace Packings for Diversity and Anti-Sparsity”, Workshop on Information Theory and Machine Learning (ITML-2019)
3: “Deep Connectomics Networks: Neural network architectures inspired by neuronal networks”, Neuro-AI Workshop.
The video of our ICML-19 talk on stealing machine learning models using noise is up!