Hello, World! 👋

Hi, I'm Vinayak Singh Bhadoriya, a Computer Science graduate from The University of Manchester, currently pursuing my Master's at NYU Courant. I have a keen interest in Machine Learning and Artificial Intelligence, and have worked on projects spanning topics from RNA folding and Natural Language Inference to Neural Network Optimization.

You can scroll down to learn a little more about me, or head over to the Articles page if you're interested in my thoughts on Life, The Universe, and Everything.

Education

BSc Computer Science (2021-2024) | The University of Manchester
MSc Computer Science (2024-2026) | NYU Courant

Work Experience

Business Technology Intern

Discover Financial Services | June 2023 - August 2023

Worked with the Business Technology team, specifically on the Diners Club International portal. Built an automated health check for the portal using Playwright and Java, saving the team hours of manual testing every day.

Academic Notetaker

Randstad | September 2022 - December 2023

Provided support to students with disabilities, attending lectures and seminars and taking accurate notes for students who found it difficult to take notes themselves.

Summer Intern

EY | August 2022 - September 2022

Shadowed a Tax Associate and contributed to the development of EY’s DigiTDS product, which automates the ‘Tax Deducted at Source’ and ‘Tax Collected at Source’ processes.

Projects

RNA Secondary Structure Prediction Using Machine Learning

As part of my undergraduate thesis, I built a machine learning model that predicts the folded secondary structure of a given RNA sequence, combining traditional machine learning techniques with the dynamic programming algorithm developed by Zuker and Stiegler.

The report is available here
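
To give a flavour of the dynamic-programming side of this, here is a minimal sketch in Python. It uses the simpler Nussinov base-pair-maximisation recurrence rather than the Zuker-Stiegler free-energy model used in the thesis, and the toy sequence is purely illustrative.

```python
# Minimal sketch: Nussinov-style dynamic programming for RNA folding.
# This is a simpler stand-in for the Zuker-Stiegler energy-minimisation
# algorithm; it only maximises the number of non-crossing base pairs.

def can_pair(a: str, b: str) -> bool:
    """Watson-Crick and wobble pairs."""
    return {a, b} in ({"A", "U"}, {"G", "C"}, {"G", "U"})

def nussinov(seq: str, min_loop: int = 3) -> int:
    """Return the maximum number of non-crossing base pairs for `seq`."""
    n = len(seq)
    dp = [[0] * n for _ in range(n)]
    # Fill the table by increasing subsequence length.
    for length in range(min_loop + 1, n):
        for i in range(n - length):
            j = i + length
            best = dp[i][j - 1]                  # case: j is unpaired
            for k in range(i, j - min_loop):     # case: j pairs with some k
                if can_pair(seq[k], seq[j]):
                    left = dp[i][k - 1] if k > i else 0
                    best = max(best, left + 1 + dp[k + 1][j - 1])
            dp[i][j] = best
    return dp[0][n - 1] if n else 0

if __name__ == "__main__":
    print(nussinov("GGGAAAUCC"))  # toy sequence, prints the pair count
```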

Natural Language Inference

A Natural Language Inference (NLI) project in which I implemented models using traditional machine learning techniques such as Logistic Regression, Random Forest, and Gradient Boosting, as well as deep learning approaches like the neural network architecture proposed in the SNLI paper by Bowman et al. (2015).

View on GitHub
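
For a feel of the traditional-ML side of the project, here is a minimal, illustrative sketch (not the actual repository code): TF-IDF features over premise/hypothesis pairs fed to a Logistic Regression classifier. The toy examples are made up; the real project trains on the SNLI dataset.

```python
# Minimal sketch of a traditional NLI baseline: TF-IDF + Logistic Regression.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# (premise, hypothesis, label) triples; labels follow the SNLI scheme.
pairs = [
    ("A man is playing a guitar.", "A person is making music.", "entailment"),
    ("A man is playing a guitar.", "The man is asleep.", "contradiction"),
    ("A man is playing a guitar.", "The man is on a stage.", "neutral"),
]

# Simple feature: join premise and hypothesis into one string per example.
texts = [f"{p} [SEP] {h}" for p, h, _ in pairs]
labels = [label for _, _, label in pairs]

model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
model.fit(texts, labels)

print(model.predict(["A woman plays a violin. [SEP] Someone is making music."]))
```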