
My Projects

This page summarizes the significant projects from my engineering journey.

Autonomous Ground Vehicle with 3D Perception for Mapping / Multi-Object Pose Tracking

Built a ground robot for environment exploration: a 3-DOF manipulator with an Intel RealSense D435i RGB-D camera mounted on a tracked differential-drive mobile robot, fully controlled with ROS on a Jetson Nano board. The robot uses EKF localization, fusing wheel odometry with an IMU (MPU6050) for state estimation. Real-Time Appearance-Based Mapping (RTAB-Map) builds the 3D space and 2D grid maps and handles localization. Scripted a ROS multi-object tracker node that projects detected objects into 3D space and broadcasts them to the main TF tree. Built an organized repository of ROS packages for the robot's configuration, control, perception, and navigation.
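
The sensor-fusion idea behind the EKF localization can be sketched in a few lines. This is a minimal planar EKF, not the actual node used on the robot: it predicts with a unicycle wheel-odometry model and corrects the heading with an IMU yaw reading. All noise values are illustrative placeholders, not tuned parameters.

```python
import numpy as np

def ekf_predict(x, P, v, w, dt, Q):
    """Predict step driven by wheel odometry (v, w). State is [x, y, theta]."""
    theta = x[2]
    # Unicycle motion model
    x_pred = x + np.array([v * np.cos(theta) * dt,
                           v * np.sin(theta) * dt,
                           w * dt])
    # Jacobian of the motion model w.r.t. the state
    F = np.array([[1, 0, -v * np.sin(theta) * dt],
                  [0, 1,  v * np.cos(theta) * dt],
                  [0, 0,  1]])
    return x_pred, F @ P @ F.T + Q

def ekf_update_yaw(x, P, yaw_meas, r_yaw):
    """Correct the heading with an IMU yaw measurement."""
    H = np.array([[0.0, 0.0, 1.0]])   # we observe theta only
    y = yaw_meas - x[2]               # innovation
    S = H @ P @ H.T + r_yaw
    K = (P @ H.T) / S                 # Kalman gain (3x1)
    x_new = x + (K * y).ravel()
    P_new = (np.eye(3) - K @ H) @ P
    return x_new, P_new

x = np.zeros(3)
P = np.eye(3) * 0.1
Q = np.eye(3) * 1e-3
x, P = ekf_predict(x, P, v=1.0, w=0.0, dt=0.1, Q=Q)
x, P = ekf_update_yaw(x, P, yaw_meas=0.05, r_yaw=0.01)
```

After the update, the heading estimate sits between the odometry prediction and the IMU reading, and its variance shrinks.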

3-DOF Desktop Robot for Autonomous Object Tracking in 3D space

Added 3D perception applications to a desktop 3-DOF robot originally designed to mimic the dexterity of the human head. The result is a ROS package for the Intel RealSense D435i on a 3-DOF manipulator that can be used for indoor mapping and localization of objects in the world frame, with the added advantage of the robot's dexterity. The 3-DOF manipulator is a self-built custom robot, and its URDF, including the depth sensor, is provided. The package covers rosserial communication with Arduino nodes to control the robot's joint states, plus the PCL pipelines required for autonomous mapping, localization, and tracking of objects in real time.
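
The core geometric step in localizing an object in the world frame can be sketched as follows: deproject a pixel plus its depth reading into a 3D point in the camera frame with a pinhole model, then apply the camera-to-world pose (what a TF broadcast encodes). The intrinsics and camera pose below are illustrative values, not the real D435i calibration.

```python
import numpy as np

def deproject(u, v, depth, fx, fy, cx, cy):
    """Pixel (u, v) with depth (m) -> 3D point in the camera frame."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.array([x, y, depth])

def camera_to_world(p_cam, R, t):
    """Apply the camera-to-world transform (rotation R, translation t)."""
    return R @ p_cam + t

# An object detected at the image center, 1.5 m away
p_cam = deproject(u=320, v=240, depth=1.5, fx=600, fy=600, cx=320, cy=240)
R = np.eye(3)                    # assume camera aligned with world axes
t = np.array([0.0, 0.0, 0.4])    # camera mounted 0.4 m above the base
p_world = camera_to_world(p_cam, R, t)
```

In the real package the transform would come from the TF tree rather than a hard-coded pose.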

ROS Autonomous SLAM with Rapidly Exploring Random Tree (RRT)

Developed a ROS package for autonomous environment exploration using SLAM in a Gazebo simulation. The robot uses a laser range sensor to construct a map of the world dynamically while exploring it with the Rapidly-exploring Random Tree (RRT) algorithm, and relies on the ROS Navigation stack and RViz to visualize its perception of the environment.
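
The exploration algorithm itself is easy to sketch. This is a minimal 2D RRT in free space (no obstacle checking), not the package's frontier-detection node: grow a tree from the start by steering a fixed step toward random samples, and stop when a node lands near the goal. Step size and bounds are arbitrary illustrative values.

```python
import math
import random

def rrt(start, goal, step=0.5, bounds=(0.0, 10.0), max_iters=10000, seed=0):
    rng = random.Random(seed)
    nodes = [start]
    parent = {0: None}
    for _ in range(max_iters):
        sample = (rng.uniform(*bounds), rng.uniform(*bounds))
        # nearest node in the tree
        i = min(range(len(nodes)),
                key=lambda k: math.dist(nodes[k], sample))
        nx, ny = nodes[i]
        d = math.dist((nx, ny), sample)
        if d == 0:
            continue
        # steer at most one step toward the sample
        s = min(step, d)
        new = (nx + s * (sample[0] - nx) / d,
               ny + s * (sample[1] - ny) / d)
        nodes.append(new)
        parent[len(nodes) - 1] = i
        if math.dist(new, goal) < step:
            # walk parents back to the start to recover the path
            path, k = [], len(nodes) - 1
            while k is not None:
                path.append(nodes[k])
                k = parent[k]
            return path[::-1]
    return None

path = rrt(start=(0.0, 0.0), goal=(8.0, 8.0))
```

A real exploration node would additionally reject edges that cross occupied cells in the laser-built map.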

Interactive Virtual Pen using MNIST and deep learning (My first webinar)

github July 2020

This is a Python program that uses deep learning and image processing to create a virtual pen: the user hovers a pen tip of the configured color in front of the webcam to write digits. A deep learning model trained on the MNIST dataset (extract data.rar to obtain it) recognizes the written digits. The project uses Keras for deep learning and OpenCV for image processing.
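
The pen-tracking half of the idea is colour segmentation: threshold each frame in HSV to isolate the pen tip, then use the blob's centroid as the drawing point. Below is a pure-NumPy stand-in for the `cv2.inRange`/contour-centroid steps, run on a tiny synthetic frame; the HSV range is an illustrative blue, not the project's configured colour.

```python
import numpy as np

def in_range(hsv, lo, hi):
    """Boolean mask of pixels whose HSV values fall inside [lo, hi]."""
    return np.all((hsv >= lo) & (hsv <= hi), axis=-1)

def centroid(mask):
    """Centroid (row, col) of the mask, or None if nothing matched."""
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None
    return ys.mean(), xs.mean()

# Tiny synthetic 4x4 "frame" with a 2x2 blue patch in one corner
frame = np.zeros((4, 4, 3))
frame[:2, :2] = [110, 200, 200]   # H, S, V of the pen tip
mask = in_range(frame,
                np.array([100, 150, 50]),
                np.array([130, 255, 255]))
tip = centroid(mask)
```

Accumulating successive tip positions onto a canvas produces the stroke that is finally fed to the MNIST classifier.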

Robotic Prosthesis Research with Virtual Reality and Motion Capture

Researcher at Assistive Wearable Robotics Lab (AWEAR), University at Buffalo

Currently developing a virtual-reality research platform for robotic prosthetic arm design using the MuJoCo physics engine and the OpenVR SDK. The platform will be used to generate and test machine learning models, and to operate and validate robotic prosthetic arms with Delsys EMG sensors. This research is conducted under Dr. Jiyeon Kang.

Smart Rollator with Rocker Bogie Mechanism and IMU-Linear Actuators based weight distribution control - Product Inaivi

Jan 2018 – Apr 2019

An advanced rollator (wheeled walker) that helps elderly and disabled users walk over obstacle-filled ground, travel uphill and downhill, and climb steps. It uses a rocker-bogie suspension design with linear actuators that adjust the user's weight distribution over the rollator on different terrains.
Developed the unique mechanical design and a mechatronic system consisting of linear actuators, an Arduino Mega board, BLDC motors with feedback control, and IoT connectivity that elderly users can rely on.
I am now looking forward to developing it into a product.
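
The weight-distribution control can be sketched as a simple proportional mapping: read the terrain pitch from the IMU and command a bounded linear-actuator extension to keep the user's load centred. The gain and stroke limit below are illustrative numbers, not the product's tuning.

```python
def actuator_command(pitch_deg, gain_mm_per_deg=2.0, max_stroke_mm=50.0):
    """Map terrain pitch (+ uphill) to a clamped actuator extension in mm."""
    cmd = gain_mm_per_deg * pitch_deg
    # Clamp to the physical stroke of the linear actuator
    return max(-max_stroke_mm, min(max_stroke_mm, cmd))

level = actuator_command(0.0)     # flat ground -> no correction
uphill = actuator_command(10.0)   # 10 deg slope -> 20 mm extension
capped = actuator_command(60.0)   # steep slope -> clamped at 50 mm
```

A deployed controller would also filter the IMU signal and rate-limit the actuators so corrections feel smooth to the user.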

Singularity Analysis and simulation of Fanuc LR Mate 200id Industrial Robot

November 2020

This project consists of a step-by-step kinematic and dynamic analysis of the 6-DOF industrial robot Fanuc LR Mate 200iD. The forward and inverse kinematics were derived with kinematic decoupling, and the Jacobian of the decoupled forward kinematics was used to find the robot's singularity conditions. MATLAB was used for simulation.
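
The singularity test used in the analysis reduces to checking where det(J) vanishes. As a numeric illustration, here is the position Jacobian of a planar 2R arm (arbitrary unit link lengths), not the full 6-DOF Fanuc Jacobian: the fully stretched elbow is singular, a bent elbow is not.

```python
import numpy as np

def jacobian_2link(t1, t2, l1=1.0, l2=1.0):
    """Position Jacobian of a planar 2R arm; det(J) = l1*l2*sin(t2)."""
    return np.array([
        [-l1 * np.sin(t1) - l2 * np.sin(t1 + t2), -l2 * np.sin(t1 + t2)],
        [ l1 * np.cos(t1) + l2 * np.cos(t1 + t2),  l2 * np.cos(t1 + t2)],
    ])

# Stretched arm (t2 = 0) loses a direction of motion: det(J) = 0
det_singular = np.linalg.det(jacobian_2link(0.3, 0.0))
# A bent elbow keeps the Jacobian full rank
det_regular = np.linalg.det(jacobian_2link(0.3, 0.7))
```

The same determinant (or rank) check, applied to the decoupled wrist and arm Jacobians, yields the 200iD's singularity conditions.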

Self Driving Car using Deep learning and Computer Vision in Unity

July 2020

Built a self-driving car in the Unity physics engine using computer vision and deep learning. Trained several deep learning models with non-sequential architectures across different Unity environments.

Pattern Recognition of Multiple Datasets

github January 2020 - May 2020

Created a Jupyter notebook repository experimenting with different concepts of pattern recognition, written for my Pattern Recognition course and working on the MNIST dataset. It covers learning methods such as Support Vector Machines, Neural Networks, Generative Models, Probabilistic Graphical Models, and Linear Discriminant functions, mostly implemented with Keras and TensorFlow.
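
One of the repository's themes in miniature: a linear discriminant g(x) = w·x + b learned with the classic perceptron update rule. This toy version trains on a tiny linearly separable set (not MNIST) purely to show the update step.

```python
import numpy as np

def perceptron(X, y, epochs=20, lr=1.0):
    """Learn a linear discriminant with the perceptron rule; y in {-1, +1}."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (w @ xi + b) <= 0:   # misclassified -> move the boundary
                w += lr * yi * xi
                b += lr * yi
    return w, b

X = np.array([[2.0, 1.0], [1.5, 2.0], [-1.0, -1.5], [-2.0, -0.5]])
y = np.array([1, 1, -1, -1])
w, b = perceptron(X, y)
preds = np.sign(X @ w + b)
```

On separable data the perceptron converges to a boundary that classifies every training point correctly, which is what the assertion below checks.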

Applied Deep Learning Repository

github  July 2020

Created a Jupyter notebook repository applying deep learning to different applications. Each notebook gives a walkthrough from scratch to the final results visualization. The methods include multilayer perceptrons, CNNs, GANs, autoencoders, and both sequential and non-sequential deep learning models, applied to image classification, time-series prediction, recommendation systems, anomaly detection, and data analysis.

Touch-less Clock-Time System for University at Buffalo

Feb 2020 – May 2020

Developed a completely touch-less user/employee attendance system for the University at Buffalo that uses face recognition and deep learning to identify users and clock them in and out. The system includes an interactive Python bot that the user operates through voice inputs and guided voice outputs. A cloud database built on Google Firebase stores all user information along with a 128-dimensional face descriptor computed for each face. A back-end monitoring web interface built with Node-RED lets the admin monitor activity in real time and spot anomalies.
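
In principle the 128-dimensional descriptors are matched by nearest-neighbour distance: compare a probe descriptor against each enrolled user and accept the closest one under a threshold. The descriptors below are random stand-ins for real embeddings, and 0.6 is a common rule-of-thumb cutoff for face embeddings, not the system's tuned value.

```python
import numpy as np

def identify(probe, enrolled, threshold=0.6):
    """Return the best-matching user id, or None if no one is close enough."""
    best_id, best_dist = None, threshold
    for user_id, desc in enrolled.items():
        d = np.linalg.norm(probe - desc)
        if d < best_dist:
            best_id, best_dist = user_id, d
    return best_id

rng = np.random.default_rng(0)
alice = rng.normal(size=128)
bob = rng.normal(size=128)
enrolled = {"alice": alice, "bob": bob}

match = identify(alice + 0.01, enrolled)       # near-duplicate descriptor
stranger = identify(rng.normal(size=128), enrolled)  # unenrolled face
```

A near-duplicate of an enrolled descriptor matches its owner, while an unrelated random vector falls outside the threshold and is rejected.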

The HapTap - An elderly haptics-based monitoring device

Aug 2018 – Present

The HapTap is a haptics-based assistive device with the sensors required for health monitoring and an IMU to track the user, while also helping them communicate through haptics. It is an IoT device with its own standalone server and multi-device interface, usable in any medical environment where users must be tracked without physically volunteering information. The haptics concept was originally developed to help people with cerebral palsy.

Jet Engine Modelling

Jan 2016 – May 2016

Modeled a jet engine from scratch using SolidWorks. The model contains more than 85 distinct parts and over 200 repeated parts.
This project was done for our machine drawing course.

Module-I Assistive Technology for Disabled People Device Concept

Jan 2017 – Apr 2018

It is an IoT-based modular device that can be used by people with a range of disabilities.
The modular system lets the user attach whichever assistive module suits their disability.
This project was presented at the IIT Shaastra 2017 Accessibility Summit.

Smart Industrial Jacket

Jan 2017 – Apr 2017

A smart industrial jacket that monitors worker safety and detects emergencies using IoT and machine learning algorithms.

Tesla Turbine - CFD Analysis and Prototype

Jan 2017

The Tesla turbine is a bladeless turbine proposed by Nikola Tesla in which a compressed fluid spins a stack of closely spaced circular plates through the boundary-layer effect and the fluid's loss of kinetic energy. A computational fluid dynamics analysis was performed on this design to validate the concept.

Automated Home Gardening System

Sep 2016

It is an automated gardening system that tracks environmental values (moisture, sunlight, and water consumption) and waters the plant accordingly for better efficiency.
The data is stored in an IoT cloud and can be accessed through the web or Android app interface I developed.
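
The watering logic can be sketched as a threshold rule on the moisture reading. The threshold, dosing gain, and sunlight adjustment below are illustrative values, not the deployed configuration.

```python
def watering_decision(moisture_pct, light_lux, threshold=30.0):
    """Return millilitres to dispense for the current sensor reading."""
    if moisture_pct >= threshold:
        return 0.0                    # soil moist enough, skip watering
    deficit = threshold - moisture_pct
    ml = deficit * 10.0               # dose proportional to the deficit
    if light_lux > 50000:
        ml *= 0.8                     # reduce dosing in strong midday sun
    return ml

no_water = watering_decision(moisture_pct=55.0, light_lux=10000)
dry = watering_decision(moisture_pct=20.0, light_lux=10000)
dry_sunny = watering_decision(moisture_pct=20.0, light_lux=60000)
```

Each decision, along with the sensor readings, is what gets logged to the IoT cloud for the app interface to display.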

Gyro assisted Smart Helmet - Motorbike Safety

Dec 2015 – Apr 2016

A smart motorbike helmet assisted by a gyro sensor, using an algorithm to help the rider handle high speeds and turns safely.

Laser Listening Device

Jul 2015

A long-range listening device that picks up sound from behind reflective surfaces by means of a laser. It has a laser transmitter and a receiver; the reflected beam, modulated by the surface's vibrations, is converted back into a sound signal.
