The Mayborn Museum (owned by Baylor University) sent an email to all the computer science students at Baylor asking if anybody was interested in a job coding a pair of audio instrument applications for an exhibit. Having no prior experience in audio software development but a passion for both music and coding, I said “What the heck” and applied. That night I began researching how I was actually going to build an audio instrument application. My main motivation for taking on the project was to learn audio software development, since I've always loved making music and using music software.
After getting hired, the first application I was assigned was a multi-voice sampler instrument that plays four different tones (piano, brass, strings, and sine synth) and is triggered by keyboard input. The application also shows two visualizations of the generated audio on the screen: an oscilloscope and a frequency spectrum display. For the final exhibit, my program was connected to a giant piano keyboard, played with the users’ feet, which triggered the notes and sounds. This obviously made the project about ten times cooler.
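To give a flavor of the pieces involved, here is a minimal sketch (not my actual exhibit code, which used different tooling) of the sine-synth voice plus the math behind the two displays: the oscilloscope just draws the raw samples, while the frequency spectrum display comes from an FFT of those same samples. The sample rate, note duration, and function names are my own assumptions for illustration.

```python
import numpy as np

SAMPLE_RATE = 44100  # assumed CD-quality sample rate

def note_to_freq(midi_note):
    """Standard equal-temperament mapping: MIDI note 69 = A4 = 440 Hz."""
    return 440.0 * 2 ** ((midi_note - 69) / 12)

def sine_voice(freq, duration=0.5, amplitude=0.3):
    """Generate one note of the sine-synth voice as an array of samples.
    These raw samples are what an oscilloscope display would draw."""
    t = np.arange(int(SAMPLE_RATE * duration)) / SAMPLE_RATE
    return amplitude * np.sin(2 * np.pi * freq * t)

def spectrum(samples):
    """Magnitude spectrum of a sample buffer, the data behind a
    frequency-spectrum display."""
    mags = np.abs(np.fft.rfft(samples))
    freqs = np.fft.rfftfreq(len(samples), 1 / SAMPLE_RATE)
    return freqs, mags
```

A sampler voice like the piano or brass would replace `sine_voice` with playback of a pre-recorded sample pitched to the requested note, but the keyboard-triggered note-to-frequency step and the visualization math stay the same.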
The second instrument application I created for the exhibit is meant to teach children the basic relationship between the pitch and frequency of a sound. Played on a touch screen, the program generates a pure tone (a synthesized sine wave) that rises in pitch as the user’s finger moves from left to right across the screen. The application displays this tone on an oscilloscope to show how the waveform changes with pitch, and a graph across the bottom shows the relationship between pitch and frequency as the user moves their finger around the touchscreen. Finally, a microphone input lets the user see how their voice affects the generated tone in the oscilloscope visualizer.