Engineering Portfolio


About


I am currently studying Computer Engineering at Union College in upstate New York. I constantly strive to improve through creative solutions, something that shows in my engineering projects, cooking, and fermentation practices, which is how I tend to spend my free time, along with drawing. My main projects involve various forms of automation, agriculture, quadcopters, and other remotely operated devices, but my interests are wide and ever expanding.

As I do not have time to constantly update my portfolio, I have linked an album here to which I consistently add engineering-related material, though without context.


Large Projects


Semi-Autonomous Hexacopter

A project with the goal of making a user following multicopter

NEM-pi

Porting a network mapping system (NeMS) onto new portable hardware

Modular Aquaponics

A research project based upon creating an open source automated and modular food growth platform

Submersible

A research project with the goal of building a machine to retrieve a car from the bottom of a lake

Single Sensor Bluetooth Multilateration

Research performed at Lawrence Livermore National Laboratory on locating Bluetooth devices.

Senior Capstone: The AFμS Project

Autonomous Flocking micro Submersibles.

Hexapod Project

Building a large scale hexapodal robot

Semi-Autonomous Hexacopter

2015-2016


This project's objective was to make a small, airborne, fully autonomous drone that followed a user. Possible applications include lighting, video recording, navigation assistance, etc.

Two main problems were addressed in the design and construction of this project. The first was to create a multicopter with enough lift to carry any equipment required, and the second was to design a method for locating and following the user.

To address the issue of lift, I built a custom hexacopter controlled by an ATmega microcontroller. The microcontroller, programmed in C++, sent signals imitating those from a standard RC receiver. Using a prebuilt flight control system gave me precisely tuned sensors and flight controllers out of the box, letting me focus on other components of the project, such as the tracking algorithm and hexacopter logistics.

After several iterations of the hexacopter's tracking methods, I settled on a system that used a camera to track the location of color patterns.

Below are several videos displaying the final design in flight. Due to safety concerns, only the rotation was autonomous; however, the other degrees of motion were all tested individually and worked as well.



Details of the entire process are provided below.


RSSI Trilateration

My first tracking method was based on the RSSI of my phone's hotspot.

To do this I used an Electric Imp-based wireless board and developed several methods of homing in on the phone's signal. The largest issue with this method was that signal strength falls off logarithmically with distance, so the readings became indiscernible at any real distance from my phone.

The method I came up with involved moving in a direction until the strength changed, noting how far that was, then turning and repeating the process. This gave three points, along with the signal strength at each, which allowed calculating the angle of my phone's hotspot relative to the robot. It would then travel in that direction until the signal strength began to decrease, at which point it would repeat the process.
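The original Electric Imp code is not shown here, but the geometry of the three-point step can be sketched as follows: fitting the plane rssi = a·x + b·y + c to three (x, y, RSSI) samples gives a gradient (a, b) that points toward increasing signal strength, i.e. roughly toward the transmitter. The function name and coordinate convention are my own illustrative choices.

```python
import math

def bearing_from_samples(samples):
    """Estimate the direction of increasing signal strength from three
    (x, y, rssi) samples by fitting the plane rssi = a*x + b*y + c.
    The gradient (a, b) points roughly toward the transmitter.
    Returns the bearing in radians (0 = +x axis)."""
    (x1, y1, s1), (x2, y2, s2), (x3, y3, s3) = samples
    # Solve the 3x3 system for a and b using Cramer's rule.
    det = x1 * (y2 - y3) - y1 * (x2 - x3) + (x2 * y3 - x3 * y2)
    if det == 0:
        raise ValueError("sample points are collinear")
    a = (s1 * (y2 - y3) - y1 * (s2 - s3) + (s2 * y3 - s3 * y2)) / det
    b = (x1 * (s2 - s3) - s1 * (x2 - x3) + (x2 * s3 - x3 * s2)) / det
    return math.atan2(b, a)
```

With samples where the signal is strongest toward +x, the bearing comes out near zero; the robot would then drive along that bearing until the strength starts to drop.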

Unfortunately, the changes in signal strength were so slight that this method would only work over a very large area, so I had to find a better one.


Signal Tracking

In order to test the different methods for locating the user, I designed and built several wheeled robots, for both simplicity and safety reasons.

The first method implemented for tracking the user was simple and quite inefficient: the robot moved forward until the signal strength began to decrease, then changed direction and repeated the process.

A test of the above method is linked here


Pattern Recognition

The method I ended up using was based on a camera tracking a specific color pattern. This offered the benefit of being much more accurate than the other methods I had tried, though only while in direct line of sight of the pattern. To increase the accuracy of the rotation, I wrote a PID controller to ensure smooth rotation and efficient tracking.
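The actual gains and loop rate from the hexacopter are not recorded here, so this is a minimal sketch of the PID idea: the error is the color pattern's horizontal offset from the image center, and the output is a yaw-rate command.

```python
class PID:
    """Minimal PID controller: error is the pattern's horizontal offset
    from the image centre (pixels), output is a yaw-rate command."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        # Accumulate the integral term and difference the error for the
        # derivative term; the first call has no derivative contribution.
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

A typical use would be something like `yaw = pid.update(pattern_x - frame_width / 2, 1 / 30)` each camera frame; the gains here are placeholders, not the tuned values from the project.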

A test of this is linked here


The Hexacopter

The hexacopter went through quite a few iterations to find the best weight-to-sturdiness ratio and to ensure the best placement of electronics.

NEM-pi

2017


During a summer internship at Lawrence Livermore National Laboratory, I ported a network mapping system (NeMS) to a Raspberry Pi. At the cost of compute power, this drastically increased portability, allowing the system to be carried in a pocket.

I also designed and wrote a set of automation scripts to automatically run the required services and collect the data in a format that is easy to pull off the device.

As a result, the whole system can now be plugged in and left until the desired data is collected, and the data can then easily be moved to a more powerful computer for analysis.

This project was presented at a formal symposium, and the poster used is shown below.

During this time I also worked on a project porting AES encryption to IoT devices in C++, though it was not completed while I was there, and I am not able to display anything related to it.

Modular Aquaponics

A research project to create an open-source, automated, modular food-growth platform and research testbed.

2017-present


When I first learned about hydroponics, I was immediately fascinated by the concept. While researching the different methods of construction, I began to think of ways the process could be automated.

I decided this would be a good project to do through the school, so several other interested students and I formed a proposal for an aquaponics research project on campus. This proposal can be viewed here.

Unfortunately, this proposal was denied as too costly, so I reworked it into a modular system to fit various budgets, along with some updated methods; the revised proposal was approved and can be viewed here. During my sophomore year only the tank had arrived, so while it was being prepared for aquaponics use, we ran several small-scale hydroponics systems using both the DWC and Kratky methods, as can be seen in the gallery. Systems for automated fish feeding and maintenance were being designed as well.

Upon my return to Union at the beginning of 2019, the project resumed. The water for the tank was prepared and fish were chosen to start the initial bacterial cycle. To filter the water, plants needed to be introduced to the system immediately, even though it was not yet ready for the automation aspects. Lettuce and tomato plants were chosen simply for their availability, but future crops will be selected more carefully. At the moment there are several issues with the timing of the water cycle that prevent proper aeration of the plants' roots, but this will be fixed with an improved pump system.

A prototype automatic fish feeder was designed and 3D printed to ensure the fish survived over the summer, but the goal is to eventually replace it with a black soldier fly larvae hatchery. This progress can be seen in the photo gallery below. The fish feeder ran on an embedded microcontroller programmed in C.

Research Project on Submersible Vehicle

2017-present


Another student and I are resuming a project to build a submersible vehicle to locate a sunken car in a lake. As the exterior of the submarine and the ballast system were designed by now-graduated students, we can only adjust the design of the interior structure, even though it is not ideal. We have gutted and redesigned their power distribution, control system, and method of steering the sub from the surface.

I designed the new control system around two Raspberry Pis connected by several hundred feet of Ethernet. They communicate over a Python socket script, allowing control from a video game controller. The Raspberry Pi on the sub streams live video to the screen of the Pi with the controller. The next step is waterproof testing in our pool.
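The actual control script is not published here; the sketch below assumes a simple newline-delimited JSON protocol over TCP, where the topside Pi sends controller readings and the sub's Pi applies them. The field names, port, and framing are my own illustrative choices, not the project's real protocol.

```python
import json
import socket

def encode_command(throttle, yaw, ballast):
    """Pack one controller reading as a newline-delimited JSON frame
    (field names are illustrative)."""
    cmd = {"throttle": throttle, "yaw": yaw, "ballast": ballast}
    return (json.dumps(cmd) + "\n").encode()

def decode_command(frame):
    """Parse a frame back into a command dict on the sub side."""
    return json.loads(frame.decode())

def serve(host="0.0.0.0", port=5005, apply_command=print):
    """Run on the sub's Pi: accept one topside connection and apply each
    command frame as it arrives (port number is hypothetical)."""
    with socket.create_server((host, port)) as srv:
        conn, _ = srv.accept()
        with conn, conn.makefile("rb") as stream:
            for line in stream:
                apply_command(decode_command(line))
```

The topside script would open a plain `socket.create_connection` to the sub, read the game controller in a loop, and send one `encode_command(...)` frame per reading.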

Unfortunately the dome did not survive a poorly controlled depth test led by another department, and at the moment I have nothing to do on this project.


Single Sensor Bluetooth Multilateration

2018


During an internship at Lawrence Livermore National Laboratory, I researched the feasibility of using a single moving sensor to perform multilateration on Bluetooth devices and map out their locations, and I wrote a system in Python to do so. Due to time and classification constraints, my investigation was less thorough than I would have liked, but I condensed my findings into an IEEE-formatted paper, which I presented at an IEEE conference and which will be published in ECAI 2019.

Development of this system presented quite a few challenges at the junction between real-time data collection and data analysis, and gave me significant insight into location and positioning algorithms. As this particular project is owned by LLNL, I am not able to continue this line of research, but I look forward to applying what I learned to future projects.
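LLNL's implementation cannot be shown, but the core multilateration step, estimating a transmitter's position from distances measured at several points along a single sensor's path, can be sketched with linearized least squares. This is the textbook approach, not necessarily the one used in the paper.

```python
def multilaterate(positions, distances):
    """Estimate a 2-D transmitter location from distance measurements
    taken at known sensor positions. Subtracting the first circle
    equation from the rest linearises the problem, which is then solved
    by least squares via the 2x2 normal equations."""
    (x1, y1), d1 = positions[0], distances[0]
    rows, rhs = [], []
    for (xi, yi), di in zip(positions[1:], distances[1:]):
        rows.append((2 * (xi - x1), 2 * (yi - y1)))
        rhs.append(d1**2 - di**2 - x1**2 - y1**2 + xi**2 + yi**2)
    # Normal equations: (A^T A) p = A^T b, with p = (x, y).
    ata = [[sum(r[i] * r[j] for r in rows) for j in range(2)] for i in range(2)]
    atb = [sum(r[i] * v for r, v in zip(rows, rhs)) for i in range(2)]
    det = ata[0][0] * ata[1][1] - ata[0][1] * ata[1][0]
    x = (atb[0] * ata[1][1] - atb[1] * ata[0][1]) / det
    y = (ata[0][0] * atb[1] - ata[1][0] * atb[0]) / det
    return x, y
```

With a single moving sensor, `positions` are simply the sensor's own locations at the times each RSSI-derived distance was measured; converting RSSI to distance (via a path-loss model) is the harder, noisier half of the problem.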

My paper, Single Sensor Bluetooth Multilateration from Arbitrary Locations, will be on IEEE Xplore when the journal is published, but the full paper is available here.


As the majority of the research I did during this time remains classified, I cannot present my code or most of my work. However, here is an example of the documentation I kept for my supervisor while researching Bluetooth sniffing.

Senior Capstone: The AFμS Project

2019-present


For my senior capstone, I will be working on The AFμS Project, which stands for Autonomous Flocking micro Submersibles. The aim is to design and build affordable, small submersibles capable of working in a swarm. Several companies currently make small submersibles with these capabilities, but most are too expensive for individual or undergraduate-level research. The final product should be a platform on which researchers can mount their own sensors and easily use for whatever data collection they need.

The first design report describing the project from the computer engineering side is linked here.


Currently this project consists of a team of two computer engineering students, myself and Jacob Karaul, as well as two mechanical engineering students, Alex Pradhan and Sam Veith.

Along with the technological challenges that will accompany this project, which will be documented here as they unfold, there is the challenge of organizing development across a multidisciplinary team. To confront this, I researched methodologies often used by startups in similar fields, several of which we have implemented in our planning and ideation sessions; a gallery of these is included below.



A live updated gallery of related material may be found here, but content on this site will not be updated until the project is complete.

Hexapod

2019-present


To provide a source of events for the theme house I managed in college, I decided to design and build a large-scale hexapod over the course of about a year.

As the budget was limited, careful thought was put into the selection of the motors, sensors, SBC, control electronics, and power systems.

I used the event budget to purchase the parts and worked alongside Alex Pradhan on the physical design of the hexapod; he then made 3D-printable CAD files, which I printed out over time. Each servo is rated at 20 kg of torque, allowing for a hefty payload. Six ultrasonic rangefinders will be mounted around the circumference of the robot. The largest hardware issue is the power supply, as each servo can draw up to 20 A and we would like to maximize runtime.

On the software side, obstacle avoidance is my largest concern, as I believe it will be relatively simple to get standard motion dynamics working based on what others have done in the field. The goal is to run ROS on this system.


Smaller Projects


Homelab

Various media and misc. servers

Body heat Seebeck generator

Harvesting electricity from body heat

Remote Door Unlocking Mechanism

A project for a class

3D Printed Mechanical Hand

Designed based on human physiology

Proof-of-Concept for Heterogeneous GPU Computing

Lattice-Boltzmann Fluid Simulations with Image Capture Analysis for User-drawn Boundary Conditions

Light Automation and Control System

Ongoing light automation system for personal use

Facial Recognition "Security" system

Uses facial recognition to send Telegram messages based on who is at the front door

Homelab setup

2016-present


Though there have been quite a few hardware implementations, I have continuously run at least one media streaming server since 2016. The first was a joint operation with my friend Shingo Lavine, where we laboriously figured out how to turn an old Mac mini into an Ubuntu server. We then spent weeks getting Plex to properly interact with other programs, as well as setting up port forwarding to "safely" expose a streaming port.

Since then, I have become significantly more adept at Linux administration; on average it now takes me about two hours to set up a new media server to my preferences from scratch. The hardware has ranged from a series of old Macs to various generations of Raspberry Pis, custom-built computers, and repurposed laptops.

My current system is my previous laptop, modified to lie flat against a wall. It runs a media suite of interconnected Docker containers, as well as several automation scripts that interact with other systems on my network. A supplementary external hard drive is automatically passed the media files least likely to be streamed, keeping the more frequently accessed files on local storage.
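The real automation scripts are not shown here; this is a minimal sketch of the tiering idea, using last access time as the "least likely to be streamed" signal. The paths and size budget are hypothetical, not the actual server's configuration.

```python
import os
import shutil

def tier_cold_files(local_dir, external_dir, keep_bytes):
    """Move the least-recently-accessed files from local storage to the
    external drive until the local library fits within keep_bytes."""
    files = []
    for root, _, names in os.walk(local_dir):
        for name in names:
            path = os.path.join(root, name)
            files.append((os.path.getatime(path), os.path.getsize(path), path))
    total = sum(size for _, size, _ in files)
    # Oldest access time first: those are the coldest files.
    for _, size, path in sorted(files):
        if total <= keep_bytes:
            break
        shutil.move(path, os.path.join(external_dir, os.path.basename(path)))
        total -= size
```

A cron job calling `tier_cold_files("/srv/media", "/mnt/external", 400 * 2**30)` (illustrative paths) would keep roughly the hottest 400 GB local and push everything colder to the external drive.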

Body Heat Seebeck Generator

2016-2017


At the 2016 Cleantech Hackathon in Troy, NY, several teammates and I made a wearable band that traps most of the body heat within the area it covers and uses the Seebeck effect to generate small amounts of electricity. The energy is stored in a large capacitor, which discharges with enough current to drive a joule thief, producing enough voltage to trickle-charge a LiPo battery. We used a variety of materials to get an optimal temperature differential, including copper cloth and thermal paste.

We were given the Presidential Green Grant from Union College to continue development on this project, focusing on thermal retention, power amplification, and ergonomic design, but after extensive testing we found that with current technology the system is simply too inefficient.

Remote Door Unlocking Mechanism

2016


Another student and I designed a prototype of a mechanical device to wirelessly unlock a door from the other side, eliminating the need for keys. A small script hosting an HTML server ran on an ESP8266, which was wired to two large continuous-rotation servos. These were mounted on a 3D-printed gear mechanism we specifically designed to provide the right amount of torque to turn the type of handle in our dorm rooms. The mechanism was designed entirely in SolidWorks and 3D printed.

3D Printed Mechanical Hand with Human-Based Physiology

2014-2015


As a final project for a computer-aided design and fabrication class, I designed and 3D printed a hand based on human physiology that used motorized tendons to provide semi-realistic movement and a range of motion similar to a human hand's.

The largest design challenge was a system that let each section of a finger close simultaneously while still maintaining a realistic range of motion for all the fingers. I designed a joint that allows the tendon to move freely between the two sections, with angled connections that stop the sections at the ends of rotation without a severe increase in friction.

Proof-of-Concept for Heterogeneous GPU Computing: Lattice-Boltzmann Fluid Simulations with Image Capture Analysis for User-drawn Boundary Conditions

2016


During an internship at Lawrence Livermore National Laboratory, Shingo Lavine and I clustered Jetson TK1 boards with the goal of running a distributed fluid simulation whose boundaries came from hand-drawn images. To do this, we installed the packages required to run a Sailfish lattice-Boltzmann fluid simulation and wrote a Python script that used OpenCV to capture an image, identify the drawing within it, and convert it into boundary conditions for the simulation.

This project was then presented at an on-site poster symposium.

Light Automation and Control System

2013-present


I have been more or less continuously working on electronics automation systems for my room. When I started, there were no readily available IoT devices, so everything I've built uses a Particle Photon microcontroller, which runs C++.

Over the years I have implemented a variety of control methods, ranging through physical controls, software integration, and full automation.

Early system setup

Initial direct control was solely through buttons, along with several basic automation features to toggle my lights and/or fan. I set up the Photon to ping my phone on my house's network; when it saw my phone newly connect (implying I had just arrived home), it would turn on my lights. I also had timing-based functions that would wake me up by turning off my fan and slowly lighting up the room as my alarm.
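The Photon firmware itself is C++, but the arrival detection reduces to firing only on the offline-to-online edge of the ping results, so a ping that keeps succeeding while I am home does not retrigger the lights. A Python sketch of that edge detector:

```python
def arrival_detector(on_arrive):
    """Return an update(ping_ok) function that tracks whether the phone
    answers pings and calls on_arrive only on the offline -> online
    edge (i.e. when the phone has just joined the network)."""
    state = {"online": False}

    def update(ping_ok):
        # Fire only on the transition from absent to present.
        if ping_ok and not state["online"]:
            on_arrive()
        state["online"] = ping_ok

    return update
```

In the real firmware the equivalent logic wraps a periodic ICMP ping to the phone's static IP and calls the lights-on routine in place of `on_arrive`.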

For this version, I wrote an app for the Pebble watch in C that allowed toggling each of these via the buttons on the watch.


When I arrived at college, I no longer had control over the local network, so I needed to develop new methods of control. I had already implemented webhooks, which I connected to via IFTTT, and I also developed a locally hosted webpage so my roommate could control the lights.

Dissatisfied with IFTTT's geospatial triggering, I started working on always-listening voice recognition, with the goal of being able to simply tell the room what I wanted to happen.

Eventually I switched to an Echo, and then a Google Home, for the voice control aspect of this system. My development focus then shifted to making nice lighting features for the individually addressable RGB strips. When I wrote the initial library for these functions, I did not know about interrupts, so I had to get creative to make the transition from one light feature to the next smooth.

The latest version of this system is displayed below.

This version smoothly fades from each RGB value to the next and allows wirelessly synchronized spectrum shifting via an attached potentiometer. Other control methods I have set up are a command-line script, a Python library, phone screen widgets, and voice control.
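The Photon library itself is C++ and not reproduced here; this is a Python sketch of the per-channel linear interpolation behind the smooth fades, stepping each channel proportionally so all three arrive at the target color together.

```python
def fade(start, end, steps):
    """Yield the intermediate colours of a linear fade between two RGB
    triples, one colour per frame, ending exactly on the target."""
    for i in range(1, steps + 1):
        t = i / steps
        # Interpolate each channel independently, rounding to 0-255 ints.
        yield tuple(round(s + (e - s) * t) for s, e in zip(start, end))
```

On the strip, each yielded triple would be written to the LEDs with a short delay between frames; more steps gives a slower, smoother fade.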

As the Photon uses an in-browser IDE, it does not play well with version control, and therefore I have only recently added a version of this code to my GitHub. The code spans a significant growth in my coding abilities, and I have not yet had time to refactor everything to use proper practices.

It was annoying to run downstairs to see who was at the front door, and a live video feed is not always the most convenient solution, so I implemented a system that recognizes people and texts the house group chat about who has arrived.

 

The wiring solution

 

Front view

 

This system allows the option to dynamically add unrecognized faces through text.

If a face is unrecognized, the system sends an image to everyone in the house. If it is someone known who does not live in the house, it announces that the person has arrived, and if it is a house member, it welcomes them home.

I also implemented a very basic self-improving system: if an unknown face is closely followed by a known face, and only one face is in frame, it adds the unrecognized encoding to the encodings for the known face.
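A sketch of that logic with toy 2-D encodings: the real system uses dlib's 128-dimensional face encodings, the 0.6 tolerance is dlib's conventional match threshold, and the single-face-in-frame check is elided here for brevity.

```python
def match(encoding, known, tolerance=0.6):
    """Return the name of the closest known encoding within tolerance,
    or None for an unknown face (Euclidean distance, as dlib uses)."""
    best_name, best_dist = None, tolerance
    for name, refs in known.items():
        for ref in refs:
            dist = sum((a - b) ** 2 for a, b in zip(encoding, ref)) ** 0.5
            if dist < best_dist:
                best_name, best_dist = name, dist
    return best_name

def absorb_unknown(history, known):
    """If an unknown face was closely followed by a known face, fold the
    unknown encoding into that person's reference list. history holds
    (matched_name_or_None, encoding) pairs, newest last."""
    if len(history) >= 2 and history[-2][0] is None and history[-1][0] is not None:
        known[history[-1][0]].append(history[-2][1])
```

Over time this widens each person's reference set with real captures from the door camera, at the cost of occasionally absorbing a stranger if two people arrive back to back, which is why the single-face check matters in practice.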

This system uses dlib, via a facial recognition wrapper written by someone else.

Miscellaneous


Various Remote Controlled Vehicles

2010-present


I build remote-controlled vehicles of varying design and complexity in my spare time.


First test flight of flying wing.


An RC snow sled built in a few hours.


Dangerous Things I Did Before I Knew Better

2012-2015


Please do not attempt to replicate anything seen here; at the time, I did not have a full grasp of how easily these projects could have killed me.


Coilgun

I built a coilgun at age 15 using capacitors harvested from disposable camera flashes, each charged to 320 V. I am unable to locate the pictures and videos I took at the time, but I will upload them when found. The capacitor bank took around 40 minutes to charge, using the charging circuitry from 7 cameras wired in parallel.
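For a rough sense of scale: assuming seven capacitors of a typical photoflash value of about 120 µF each (both the capacitor count and the capacitance are my assumptions; neither was recorded), the bank's stored energy follows from E = ½CV² per capacitor.

```python
def bank_energy(capacitance_f, voltage, count):
    """Total energy in joules for a bank of identical capacitors,
    using E = 1/2 * C * V**2 per capacitor."""
    return 0.5 * capacitance_f * voltage**2 * count
```

Under those assumptions, `bank_energy(120e-6, 320, 7)` comes out to roughly 43 J, enough to make the nail test below unsurprising and the safety warning above well earned.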

This system was successfully able to accelerate a nail into a book several inches thick from several feet away.


High Voltage

At some point during my freshman year of high school, I read Nikola Tesla's biography, which inspired me to play with very high voltage. I got my hands on a neon sign transformer, which I used to make a Jacob's ladder. Again, I cannot find the footage of this, but once found it will be attached. I then wanted to build a Tesla coil, and after some basic research, decided to build my own electrolytic capacitors for it.

Note how in the following video I was holding the live wire with both hands, providing a great way to quickly end my life if anything had gone wrong.


I continued enjoying this field, and eventually learned to only use one hand, though in the following video I am not being much safer.


I have yet to build a Tesla coil.


I eventually moved on to microwave oven transformers, as they could be sourced for free. During this period I decided that safety deserved more weight than I had been giving it, and built something I dubbed the "safety box", which had a dead man's switch ensuring the power would turn off if I was shocked.


At the time I made an Instructable about its construction, which can be found here.

I continued to play with microwave oven transformers, rewiring some to be high-current, low-voltage, which I used to weld coins together. At some point I connected two banks of 3 parallel transformers in antiparallel series, and the resulting arcs were long enough to make me realize it would be dangerous to go any further without better facilities.

If the videos and photos are found, they will be uploaded.


Things that don't require much explanation


In 2014 I decided to build a mechanical keyboard to practice my soldering. I designed the case in SolidWorks and cut it on a ShopBot.

From around 2011: the first "robot" I made without kit parts, using a cake pan and parts salvaged from an RC car. I am unsure where I got the motors.

In 2010, another student and I competed in a statewide robotics competition. We won Judges' Choice and placed 9th overall out of 50 middle and high school teams.

I have designed and built several cryptocurrency miners, basing the designs on GPU price, compute power, power usage, and availability.

From 2016: a thrown-together Halloween prop using parts I had lying around.

Some of the things I have designed and printed that do not fall under full projects.

In 2016 I designed and programmed a Metroid-themed watchface for the Pebble watch in C. The time slowly drifted up and down, the number of missiles indicated battery level, and the energy bar showed the seconds.

In 2012 I made a lockscreen for jailbroken iPhones that shows the solar system at a 1:3.154e+7 second time scale. The model is linked here.