Hi! Thanks for visiting my portfolio! The content here is divided into four parts:
The video below showcases works composed for projects I have worked on as a freelancer, as a staff composer at Noise Distillery/Clean Cuts, and for my own personal fulfillment. Full versions of the music tracks are included below.
The video clips are included for reference and do not represent how the music appears in the film or game.

Streaming music links:




These are projects where I built all aspects of the sonic experience: sound design, music, and implementation in Wwise and Unity.

0:00 Tetris. Using the core game's source code found on GitHub, I modified the game code to create a more immersive sonic experience, with Wwise as the audio engine.
All sound is synced to the underlying musical grid: player actions and game events trigger musical SFX which combine with the backing music to create a tapestry of sounds. The background music is divided into 10 layers which are gradually added into the mix as the player approaches the level objective.
This project was designed for my Music for New Media 4 course in 2024.
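The layered approach above can be sketched in a short script. This is a hypothetical illustration, not the project's actual code: the RTPC name "Music_Progress" and the lines-cleared objective are stand-ins, with each of the 10 Wwise music layers assumed to fade in above a different threshold of the RTPC.

```csharp
using UnityEngine;

// Hypothetical sketch of the layer-stacking idea; RTPC name and
// objective values are illustrative, not from the actual project.
public class MusicLayerController : MonoBehaviour
{
    [SerializeField] private int linesCleared;       // updated by the game code
    [SerializeField] private int linesForLevelUp = 10;

    void Update()
    {
        // Map progress toward the level objective onto a 0-100 RTPC.
        // Inside Wwise, each music layer's volume curve opens up
        // above a different threshold of this RTPC.
        float progress = 100f * linesCleared / linesForLevelUp;
        AkSoundEngine.SetRTPCValue("Music_Progress", progress);
    }
}
```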
1:31 Unity Karting Sample Project. This was conceived as a simple demonstration of the vertical layering technique in video game scoring. The final lap triggers an additional musical layer to ramp up the excitement for the player, and a low-pass filter is applied when the player collides with the walls.
All audio is implemented directly in Unity using custom C# scripts that I wrote, and I replaced all SFX with my own sounds.
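A minimal sketch of this kind of Unity-only implementation, assuming all stems play in sync from the start so that only volume changes (the class, field, and method names here are illustrative, not the project's actual scripts):

```csharp
using UnityEngine;

// Hypothetical sketch of vertical layering without middleware:
// an extra stem fades in on the final lap, and wall impacts
// momentarily close a low-pass filter on the music bus.
public class FinalLapLayer : MonoBehaviour
{
    [SerializeField] private AudioSource finalLapLayer;   // extra stem, starts silent
    [SerializeField] private AudioLowPassFilter musicFilter;
    [SerializeField] private float fadeSpeed = 0.5f;      // volume units per second

    private bool onFinalLap;

    public void OnFinalLapStarted() => onFinalLap = true;

    public void OnWallCollision()
    {
        // Darken the mix briefly on impact.
        musicFilter.cutoffFrequency = 800f;
    }

    void Update()
    {
        // Fade the extra layer in once the final lap begins.
        float target = onFinalLap ? 1f : 0f;
        finalLapLayer.volume =
            Mathf.MoveTowards(finalLapLayer.volume, target, fadeSpeed * Time.deltaTime);

        // Let the filter cutoff recover to fully open after a collision.
        musicFilter.cutoffFrequency =
            Mathf.MoveTowards(musicFilter.cutoffFrequency, 22000f, 20000f * Time.deltaTime);
    }
}
```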
2:34 Wiggle Challenge. Interactive score for a therapeutic game developed with the Johns Hopkins School of Medicine as part of the 2020 Discovery Awards:
https://research.jhu.edu/johns-hopkins-discovery-awards-2020-awardees/
The game is used to treat stroke patients who have lost some or all of their fine motor control. Patients use specially built controllers, and input from these controllers is mapped directly to a variety of music parameters inside Wwise. The use of interactive music is being studied to determine whether patient outcomes improve as a result of direct auditory feedback to their movement.
To allow continuous tempo adjustment, the music was implemented as MIDI inside Wwise, triggering virtual instruments built in the actor-mixer hierarchy from samples available under Creative Commons licenses.
To keep the memory footprint low, no dynamic layers were used. Instead, note velocity and MIDI CC messages modulate volume and filter cutoff via RTPCs.
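The controller-to-RTPC mapping can be sketched as below. This is an assumption-laden illustration: the RTPC names and the normalized 0-1 input range are placeholders, not the project's real parameter names.

```csharp
using UnityEngine;

// Hypothetical sketch: normalized controller input drives Wwise RTPCs,
// giving the patient continuous auditory feedback to their movement.
// RTPC names here are illustrative only.
public class ControllerToRtpc : MonoBehaviour
{
    // Called with 0-1 input from the custom therapy controller.
    public void OnControllerInput(float normalizedValue)
    {
        // One axis of movement can feed several parameters at once,
        // e.g. playback tempo and the filter cutoff of a MIDI-driven
        // instrument in the actor-mixer hierarchy.
        AkSoundEngine.SetRTPCValue("Music_Tempo", normalizedValue * 100f);
        AkSoundEngine.SetRTPCValue("Filter_Cutoff", normalizedValue * 100f);
    }
}
```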
As Principal Investigator, I oversaw a team of three undergraduates who worked on the project: Matthew Flynn, Naomi Biela, and Cade Mullen. Each contributed their own music for different levels/areas of the game using the principles described above.
As a supplement to the above section, this video focuses only on sound design.

For complete audio redesigns including music, SFX, and implementation, see this video:
https://youtu.be/feBkLYcnmc4
0:00 Logo for an internal production team at a university. I was responsible for music, sound design, and the final mix; the version here is the SFX-only mix.
0:15 Sound design for the game One Shot Gladiator, created as part of a game jam with Fire Totem Games. I was responsible for all audio, including SFX and music.
1:07 Turning a squeaky drum pedal into a soundscape drone or pad.
The video below showcases additional examples of adaptive music, implementation, and programming. These projects were done in Wwise, FMOD, and Unity.

0:00 Cube Re-score using Wwise. This is based on Audiokinetic's Wwise 201
materials and is used as an example project in my courses at Peabody.
1:55 Adaptive score in FMOD. Made for a playable demo of Marvel Powers
United VR at Comic-Con. This level did not make it into the final game.
5:09 John Coltrane's "Giant Steps" is long regarded as the ultimate test for jazz improvisers, and I created a C# script to dynamically play along with its chord changes. The pickup sound itself was designed from scratch using Native Instruments' Massive.
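The core of such a script is tracking which chord is active at a given beat of the form. The sketch below is my own illustration, not the actual script; the class and method names are hypothetical, and only the opening bars of the standard "Giant Steps" changes are filled in.

```csharp
// Hypothetical sketch of chord-change tracking for a play-along script.
// The progression below covers only the opening bars of the form.
public class ChordFollower
{
    // (chord symbol, duration in beats), at 4 beats per bar.
    private static readonly (string Chord, int Beats)[] Changes =
    {
        ("Bmaj7", 2), ("D7", 2),     // bar 1
        ("Gmaj7", 2), ("Bb7", 2),    // bar 2
        ("Ebmaj7", 4),               // bar 3
        ("Am7", 2), ("D7", 2),       // bar 4
        // ... remainder of the form
    };

    // Returns the chord sounding at a given absolute beat,
    // wrapping around when the form repeats.
    public string ChordAtBeat(int beat)
    {
        int total = 0;
        foreach (var c in Changes) total += c.Beats;

        int pos = beat % total;
        foreach (var c in Changes)
        {
            if (pos < c.Beats) return c.Chord;
            pos -= c.Beats;
        }
        return Changes[0].Chord; // unreachable, satisfies the compiler
    }
}
```

At runtime, the follower's current chord would select which notes or samples the script is allowed to play on each beat.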