Game and Tool Development


FERT: This is an acronym for Facial Expression Recognition Tracker, a convolutional neural network (CNN) built with TensorFlow and Keras, two open-source libraries widely used for deep learning in the field of machine learning (ML). The system recognises facial expressions and classifies them into emotion categories such as ‘Disgust’, ‘Anger’, and ‘Happy’. The model was trained on a Kaggle dataset, originally compiled for an ML conference several years ago, containing several thousand pre-classified images on which the new classifications were based. The trained model was deployed with the Flask framework, after which it could be applied to real-time video streams and image data, as shown on the left.

FERT could potentially be a useful tool in the health industry, tracking patient comfort levels during medical procedures in situations where patients are unable to verbally express their discomfort. It could also be used in the political arena to track public expressions as a reflection of popular opinion on candidates.

Working on this project was an interesting and informative experience. AI and ML had always been frightening, mysterious terms that popped up in computer jargon but remained largely unfamiliar realms to me. Some of that mystery was alleviated when I was able to harness the potential of ML in this project, and I learnt a great deal about working with large datasets while independently delving into a field I had been fascinated by for a long time.
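As an illustration of the final classification step, here is a minimal sketch of how a CNN’s softmax output might be mapped to an emotion label. The label list, its ordering, and the function names are assumptions for illustration, not taken from the actual FERT code:

```python
# Hypothetical post-processing sketch: map a CNN's softmax output
# vector to an emotion label. The label list and its ordering are
# assumptions for illustration, not the actual FERT configuration.
EMOTIONS = ["Anger", "Disgust", "Fear", "Happy", "Sad", "Surprise", "Neutral"]

def classify(probabilities):
    """Return the emotion whose predicted probability is highest."""
    if len(probabilities) != len(EMOTIONS):
        raise ValueError("expected one probability per emotion class")
    best_index = max(range(len(EMOTIONS)), key=lambda i: probabilities[i])
    return EMOTIONS[best_index]

# Example: a softmax vector that peaks on the fourth class.
print(classify([0.05, 0.02, 0.03, 0.70, 0.10, 0.05, 0.05]))  # prints "Happy"
```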

Access the code here.

The Kaggle dataset used for training the FERT model
A demonstration of the functional FERT model when applied to a YouTube video
A walkthrough of the primary features of Wanderlust

Wanderlust: This is a travel application that lets users customise a travel itinerary for themselves and their friends, choosing from a range of hotel, club, dining, and attraction options. Users can also collaborate in real time with friends and family members on the application, so that multiple individuals can work together simultaneously to plan their perfect getaway. The application is login- and profile-driven: once users create a profile, they do not have to log in every time. Their profile is saved to the database, and they can subsequently find custom-made trips created in accordance with their preferences.

While the application is currently limited to North American states, it will eventually be extended to include other geographical locations and options, so that it can help users plan a vacation with friends and colleagues more comprehensively by providing a wider array of places. In the future, the application will also support payment options like Venmo and PayPal, so that users can manage finances for trip planning on the application itself.

Designing the user interface for this application was a fun way to get my hands dirty with wire-framing tools like Balsamiq and Figma, while also learning more about the field of human-computer interaction (HCI). Moreover, it helped me learn more about user-centered design and the importance of making stakeholders a priority in the development cycle.
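As a rough sketch of the profile-driven trip matching described above (all class names and fields here are hypothetical, not Wanderlust’s actual schema), a saved profile might be filtered against candidate trips like this:

```python
from dataclasses import dataclass, field

# Hypothetical data model for illustration only; Wanderlust's real
# database schema and matching logic are not shown in this sketch.
@dataclass
class Profile:
    name: str
    preferences: set = field(default_factory=set)  # e.g. {"dining", "attractions"}

@dataclass
class Trip:
    destination: str
    categories: set

def suggest_trips(profile, trips):
    """Return trips offering at least one of the user's preferred categories."""
    return [t for t in trips if t.categories & profile.preferences]

user = Profile("Ada", {"dining", "attractions"})
options = [Trip("New York", {"dining", "clubs"}), Trip("Fargo", {"hotels"})]
print([t.destination for t in suggest_trips(user, options)])  # ['New York']
```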

Access the code here.

Robotic Foosball Table: My final project for my Art and Interactivity course at Carleton was an interactive foosball game built on the principles of Arduino design, programmed in C, and making use of light sensors, servos, sound sensors, beepers, and LED lights to create a great user experience. The table and the figures themselves were designed and built from scratch out of slabs of wood and spray paint, and the table resembled a real-life football field, with the players sporting Carleton and St. Olaf jerseys.

The interactive game worked as follows: when the ball entered one of the goal slots, it blocked the light sensor in that compartment, reducing the light reaching the sensor. This triggered the sound sensor and a beeper, which played celebratory goal chants; an LED light, which served as a jumbotron; and some smaller lights, which acted as score markers – each lit light indicated one point for the team that scored.

This project was the perfect example of the intersection of art and computing and helped highlight how interlinked the two domains are. It was also a nice way to work with both my hands and my mind to produce something that was fun and valuable at the same time.

Access the code here.

A working version of the robotic foosball table for CS 232
An example of Chef Raimsey’s custom GUI screen

Chef Raimsey: This is a dessert recipe generator that draws on a corpus of recipe files scraped from AllRecipes.com and generates a well-formatted recipe of ingredients and corresponding proportions through a fun graphical display. It takes the user’s favorite dessert ingredient and any provided allergies into account when generating recipes. To achieve this, it makes use of NLP techniques including a Doc2Vec semantic vector model, web scraping and extraction, custom unigram tagging, tokenizing, corpus preprocessing, and GPT-2 text generation. The AI recipe generator was my final project for an NLP course I was taking at Carleton, and it enabled me to incorporate several techniques I had learnt over the previous few months into a functional system. Working with web-based packages and graphics to build a usable, user-friendly AI application also taught me the importance of collaborative work in achieving the desired results on wide-scope projects.
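Two of the preprocessing steps named above – tokenizing with unigram counting, and allergy-aware filtering – can be sketched roughly like this (function names and the tiny corpus are hypothetical, not Chef Raimsey’s actual code):

```python
import re
from collections import Counter

# Hypothetical preprocessing sketch: tokenize scraped recipe text,
# count unigrams across the corpus, and drop ingredients matching the
# user's allergies. Names are illustrative, not the project's code.

def tokenize(text):
    """Lowercase recipe text and split it into word tokens."""
    return re.findall(r"[a-z]+", text.lower())

def unigram_counts(corpus):
    """Count unigram frequencies across a list of recipe strings."""
    counts = Counter()
    for doc in corpus:
        counts.update(tokenize(doc))
    return counts

def filter_allergens(ingredients, allergies):
    """Remove any ingredient that mentions one of the user's allergens."""
    allergies = {a.lower() for a in allergies}
    return [i for i in ingredients
            if not any(a in i.lower() for a in allergies)]

corpus = ["2 cups flour, 1 cup sugar", "1 cup sugar, 3 eggs"]
print(unigram_counts(corpus)["sugar"])                           # 2
print(filter_allergens(["peanut butter", "sugar"], ["peanut"]))  # ['sugar']
```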

Access the code here.
