
The Joy-Mustick

Edmond Nolan
University of Limerick
Castletroy, Co. Limerick
0861537@studentmail.ul.ie

ABSTRACT
This paper provides a detailed overview of the development of the Joy-Mustick, a new instrument for musical expression (NIME). The instrument comprises an Arduino microcontroller, which reads movement data from a joystick controlled by the user. This interaction is used to produce and manipulate audio in the Max/MSP environment according to a set of parameter mappings. The instrument was created as one half of a performance group, to perform a composition on a chosen theme: replicating the sounds children produce when interacting with toys and games.

Keywords
Music Controller, Computer Music, Arduino, Max/MSP, Pitch Shift, Time Stretch

1. INTRODUCTION
For this project the authors had to design and create a new instrument for musical expression (NIME), which would then be played alongside another instrument as part of a group performance. Each group chose a theme and used it as the basis for its design; our chosen theme was "How children interact with toys and games". This paper first examines the initial theme and concept of the instrument and how we approached its development. It then describes the instrument design, including how the joystick's features are mapped through the Arduino and into Cycling '74's Max/MSP. Finally, the paper outlines this author's contribution to the instrument, before offering a critique of the instrument with a view to future development.

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Copyright remains with the author(s).

2. APPROACH AND CONCEPT

2.1 Theme and Initial Concept
The approach to designing this instrument stemmed from our group's performance theme, which centered on re-creating how children interact with toys and games. With the opposite group designing an instrument to replicate a child playing with blocks and toys, and the ensuing chaos that comes from that, we chose to examine the current console generation and the more melodic, controlled nature of gaming.

Initially, our project plan was to augment an Xbox 360 controller with a number of sensors, including an infrared sensor and a flex-bend sensor. However, we decided early on to replace the Xbox 360 controller with a different controller, and to avail of that controller's built-in functions rather than augmenting one with sensors. These ideas were influenced by a principle for designing computer music controllers, which states that "everyday objects suggest amusing controllers" [1].

2.2 Project Approach
We chose as our controller an old arcade-style joystick of the kind used with a number of consoles such as the Commodore 64. As these joysticks contain a number of switches for movement, we decided to map the numerous permutations of these movements as a way for the user to control the audio output. As per the project outline, we were required to use an Arduino microcontroller to read the data from the joystick, and in order to create a NIME we decided to design an interface using Max/MSP. To stay true to our agreed theme, we focused on a number of sampled sounds from console games such as Super Mario and Sonic the Hedgehog, among others. Alongside these samples, we researched a number of 8-bit music compositions as inspiration for creating our own 8-bit music [2]. An outline of the controller's mappings was created from the outset, with a number of sequencers and time/pitch-shift capabilities being a priority for our instrument.

3. INSTRUMENT DESIGN
As stated in Section 2.2, the Joy-Mustick consists of a joystick (seen in Figure 1), an Arduino microcontroller, and a number of sound samples controlled in Max/MSP.

Figure 1: External view of the Joystick controller

3.1 Hardware
After examining the wiring of the joystick, we decided to have the joystick plug directly into the Arduino microcontroller. As pictured in Figure 2, there are seven wires in the joystick: four carry the directional signals, one takes the signal from the fire buttons, and the remaining two are the power and ground lines.

Figure 2: Internal view of the Joystick controller

All seven wires were soldered to new pin headers so that they could be plugged into the Arduino's ports.

3.3.1 Sequencer No. 1


This was the first musical aspect of the instrument to be constructed: a drum sequencer. To switch on this sequencer, the joystick must be moved in a specific sequence: left, up, then fire. The patch contains kick, snare and hi-hat samples loaded in from buffers.
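The left-up-fire switching described above can be sketched as a small state machine. The following Python is an illustrative assumption about how such combo detection might behave, not the actual Max/MSP logic; the event names and class are hypothetical.

```python
# Hypothetical sketch (not the authors' Max/MSP patch): detecting the
# "left, up, fire" joystick sequence and toggling a sequencer on or off.
class ComboToggle:
    """Toggle a flag when a specific sequence of joystick events arrives."""

    def __init__(self, combo):
        self.combo = list(combo)   # e.g. ["left", "up", "fire"]
        self.progress = 0          # how much of the combo has matched so far
        self.running = False       # sequencer on/off state

    def feed(self, event):
        # Advance through the combo; a wrong event resets the progress.
        if event == self.combo[self.progress]:
            self.progress += 1
            if self.progress == len(self.combo):
                self.progress = 0
                self.running = not self.running
        else:
            # The stray event may itself restart the combo.
            self.progress = 1 if event == self.combo[0] else 0
        return self.running

toggle = ComboToggle(["left", "up", "fire"])
for e in ["left", "up", "fire"]:   # completing the combo switches it on
    state = toggle.feed(e)
```

Repeating the same three maneuvers would toggle the sequencer back off.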

3.2 Arduino
For constructing this instrument, it was necessary to read in data from the joystick to be used to control the audio. The first step was to read in data from the correct ports. The movement and fire-button wires referred to in Section 3.1 are plugged into digital ports 2-6 on the Arduino, while the two remaining wires (the live and the ground) are inserted into the 5V and ground slots (as illustrated in Figure 3).

Figure 3: Arduino and Joystick wiring

The coding of the Arduino allows it to read the signals from these wires and to transfer the data for processing by Max/MSP. The SARCduino code, which was provided to us during the coding process, was used as a reference for reading in the data [3]. The code (as seen in Figure 4) reads the data from digital ports 2-6 and communicates with Max/MSP through the serial port.

Figure 5: Drum Sequencer Patch

When left, up and fire are maneuvered on the joystick, a toggle switch turns on the metro object that starts the sequencer. The sequencer is connected to a number of bang objects that trigger certain samples, much like a drum machine. These samples are then fed into a degrade~ object, which takes any given signal and reduces its sampling rate and bit depth as desired; specific values for the sample rate and bit depth are kept constant for the performance. Also incorporated into this patch is a tempo controller, which allows the user to change the tempo in real time by holding down the fire button for varying lengths of time: pressing the button a number of times rapidly speeds the sequencer up, while pressing and holding it for a moment slows it down. The last four button-hold durations are recorded; the patch then calculates their average and outputs the desired tempo.
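The averaging behaviour above can be sketched as follows. The mapping from average hold duration to beat interval is an assumption for illustration; the paper does not give the exact formula used in the Max/MSP patch, only that short rapid presses speed the sequencer up and a long hold slows it down.

```python
from collections import deque

# Hypothetical sketch of the tempo controller: the last four button-hold
# durations are averaged and mapped to a metronome interval. The linear
# mapping below (base * (1 + average hold)) is an illustrative assumption.
class TempoController:
    def __init__(self, history=4, base_interval_ms=250.0):
        self.durations = deque(maxlen=history)  # last N hold times (seconds)
        self.base = base_interval_ms

    def press(self, hold_seconds):
        self.durations.append(hold_seconds)
        return self.interval_ms()

    def interval_ms(self):
        # Short, rapid presses -> short average -> faster sequencer;
        # a long hold -> long average -> slower sequencer.
        avg = sum(self.durations) / len(self.durations)
        return self.base * (1.0 + avg)

tempo = TempoController()
for hold in [0.1, 0.1, 0.1, 0.1]:   # four quick taps
    interval = tempo.press(hold)     # 250 * 1.1 = 275 ms between beats
```

A single long hold then pulls the rolling average up, lengthening the interval on the next beats.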

3.3.2 Sequencer No. 2


Figure 4: The Arduino code

3.3 Max/MSP
The Max/MSP patch is where the design and the mapping of the joystick's control over the audio lie. The patch allows audio to be played back and manipulated in a number of ways, depending on the user's interaction with the joystick. The patch sections are outlined as follows: Sequencer No. 1; Sequencer No. 2; Pitch Shifting & Playback Rate; Panning; Markov Chain; Playback of Random Samples.

This was the second musical aspect constructed for the instrument. The patch consists of a number of sequencers that set each other off. To switch it on, the joystick must be moved in a specific sequence: fire, back, then left. The patch plays a number of MIDI notes.

Figure 6: Melody Sequencer Patch

The first part of this patch, shown in Figure 6, is similar to the drum sequencer. Instead of triggering samples, however, it triggers one of two MIDI note values, in this case two 'A' notes. These note values are only useful for MIDI playback; to suit an 8-bit music theme, an 8-bit-sounding synthesizer is required. Our synthesizer was influenced by the ES P plug-in in Logic Pro 9, which is able to create an 8-bit sound [4]. This plug-in was re-constructed in Max/MSP simply as a rectangle-wave generator, a frequency input and an ADSR (Attack, Decay, Sustain, Release) envelope. The mtof object converts each MIDI note into a frequency that can be read by the rect~ object, and the synthesized notes are fed out through the DAC.

Figure 7: Melody Sequencer Patch Part Two

The next part of the patch, shown in Figure 7, contains the melody of the instrument. It is switched on once the previous sequencer has gone through two cycles. This sequencer is similar to the previous one, but the melody notes are loaded in depending on which cycle the sequencer is going through. On the first cycle, the melody notes are read in from a small sequencer, shown on the right of Figure 7. These notes arrive in the form of a message, which is sent to the unpack object; unpack sends each note out as its own message. When the select object selects one of these MIDI notes, the note is sent to the mtof object, which converts it to a frequency value that can be read by a copy of the synthesizer described above.
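The note path of the melody sequencer, MIDI note to frequency (the mtof step) to rectangle wave, can be sketched outside Max/MSP. The sample rate, duty cycle and naive rendering below are illustrative assumptions, not the rect~ object's internals.

```python
# A sketch of the melody sequencer's signal path: MIDI note -> frequency
# (mtof) -> naive rectangle (pulse) wave, mimicking the 8-bit ES P-style
# synthesizer described above. Parameters here are illustrative.
def mtof(midi_note):
    """Convert a MIDI note number to frequency in Hz (A4 = note 69 = 440 Hz)."""
    return 440.0 * 2.0 ** ((midi_note - 69) / 12.0)

def rect_wave(midi_note, n_samples=64, sample_rate=44100, duty=0.5):
    """Generate a naive rectangle wave (+1/-1) for the given MIDI note."""
    freq = mtof(midi_note)
    out = []
    for n in range(n_samples):
        phase = (freq * n / sample_rate) % 1.0   # position within one cycle
        out.append(1.0 if phase < duty else -1.0)
    return out

a4 = mtof(69)        # 440.0 Hz
wave = rect_wave(69) # first samples of an A4 rectangle wave
```

The two 'A' notes mentioned above would simply be two inputs to `mtof`, an octave apart (e.g. notes 57 and 69 giving 220 Hz and 440 Hz).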

3.3.3 Pitch Shifting and Playback Rate


The next musical aspect of the instrument is the ability to change the pitch and playback rate of the Mario sample. To switch on this patch, the joystick must be moved up and then right. Figure 8 shows this patch.

Figure 8: Pitch Shifting and Playback Rate Patch

At the center of this patch is the patcher object elastic, which lets the user change the pitch and playback rate of the sample; this object is given as an example in the Max/MSP example files. Both the pitch and the playback rate are changed using only the joystick, through the following maneuvers: back and left speeds up the playback of the sample; back and right plays the sample in reverse; fire and left increases the pitch; and fire and right decreases the pitch. Each of these processes has two possible values. For example, to increase the playback speed the user moves the joystick back and left, raising the playback speed to 1.5 times normal; when back and left are maneuvered a second time, the playback rate is increased to twice the normal speed. This was achieved using the counter and select objects, along with basic calculations with the + and − objects.
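The two-stage speed-up can be sketched as a counter stepping through a list of rates, analogous to the counter and select objects mentioned above. The wrap-around back to normal speed after the second step is an assumption for illustration.

```python
# Hypothetical sketch of the counter/select rate logic: each "back and
# left" maneuver steps the playback rate through a fixed list (1.5x on
# the first press, 2x on the second). Wrap-around behaviour is assumed.
class RateStepper:
    RATES = [1.0, 1.5, 2.0]   # normal, first maneuver, second maneuver

    def __init__(self):
        self.count = 0

    def back_and_left(self):
        # Acts like Max's counter object feeding a select object.
        self.count = (self.count + 1) % len(self.RATES)
        return self.RATES[self.count]

stepper = RateStepper()
first = stepper.back_and_left()    # 1.5x normal speed
second = stepper.back_and_left()   # 2x normal speed
```

The reverse-playback and pitch maneuvers would be analogous steppers over their own value lists.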

3.3.4 Panning
The Mario sample can also be panned hard left, slightly left, hard right or slightly right. This is achieved by maneuvering the joystick left or right and then fire, and holding the joystick in that position for a certain amount of time, depending on how far the user wishes to pan the sound in either direction. Figure 9 shows this patch.


Figure 9: Panning Patch

To achieve either of the two panning intervals, the joystick must be held in one of the two maneuvers just described for a certain amount of time. To pan the sample slightly left, the maneuver must be held for more than 1 second and less than 3 seconds; to pan the sample hard left, it must be held for over 3 seconds. The same applies on the right-hand side. To return the sample to a centrally panned position, the joystick is held in either maneuver for less than 1 second.
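The timing rule above can be written as a simple threshold function. The numeric pan positions (-1 for hard left, 0 for center, +1 for hard right) are assumed values for illustration; the patch itself only distinguishes the four positions plus center.

```python
# A sketch of the hold-duration panning rule described above.
# Pan positions are assumed: -1 = hard left, 0 = center, +1 = hard right.
def pan_position(side, hold_seconds):
    """Map a held left/right+fire maneuver to a pan position."""
    sign = -1.0 if side == "left" else 1.0
    if hold_seconds < 1.0:
        return 0.0          # under 1 s: return to center
    if hold_seconds < 3.0:
        return 0.5 * sign   # between 1 s and 3 s: pan slightly
    return 1.0 * sign       # over 3 s: pan hard

slightly_left = pan_position("left", 2.0)   # -0.5
hard_right = pan_position("right", 4.0)     # 1.0
```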

Figure 11: Play Random Sample code

4. PERSONAL CONTRIBUTION
The author's main contribution to the instrument was to utilize Max/MSP for the creation of old video game music, primarily 8-bit chiptunes. The author researched different composers for inspiration, Dan Deacon being one of them [2], and also noted the large number of 8-bit remixed songs available on various web sites. The author was also responsible for designing and creating a number of the audio controls mentioned earlier: Sequencer No. 1, Sequencer No. 2, pitch shifting, playback rate, and panning.

3.3.5 Markov Chain


A further musical implementation for the instrument allows the user to enable a Markov process for playing back a number of samples. A Markov chain describes how a system moves from one state to another, with each possible next state assigned a probability. In the case of this patch, the chain has a number of weightings that determine the playback of the seven available samples (as seen in Figure 10).

5. CONCLUSION AND CRITIQUE


Overall, the development of this instrument has been a successful endeavor. Although our original idea needed tweaking in the planning stage, the project stayed true to the agreed performance theme by allowing the user to create and play music with the joystick controller. The Max/MSP patch succeeded in offering the user a number of options for controlling the playback of samples. Although the joystick successfully controlled all the parameters we assigned to it, its limited design and small number of possible maneuvers meant that we were unable to offer further audio controls to the user. We had hoped to include a modulation patch in the instrument, but were forced to remove it from the planned list of controls due to the joystick's limitations. This would certainly be something to improve in any future development of the Joy-Mustick.

We believe that, overall, the redesign of the joystick as a NIME was a success, particularly due to the easy-to-understand joystick controls, which require no prior practice or musical talent, as well as the simple and intuitive design of the joystick itself. With a view towards further development, we believe that increasing the complexity of the instrument would alleviate the possibility that the user becomes bored once the instrument's novelty has worn off, something that often stems from the easy-to-learn nature of some instruments [5].

Figure 10: Overview of the Markov Chain Patch

The user begins the patch by maneuvering the joystick with the left, fire and up commands. Should the user wish to end the playback, they may either repeat the joystick commands, which stops and resets the sequence immediately, or click a bang, which ends the sequence on the next eighth beat. Although the user cannot adjust the probability weightings of the patch during the performance, they are free to set different figures beforehand.
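The weighted playback described above can be sketched as a random walk over a transition table. The table and its uniform weights below are illustrative placeholders, since the paper states that the performer sets the figures before the performance.

```python
import random

# Hypothetical sketch (not the authors' Max/MSP patch) of weighted Markov
# playback: each row of the transition table holds the weights for moving
# from the current sample to each of the seven samples.
SAMPLES = ["s1", "s2", "s3", "s4", "s5", "s6", "s7"]

def next_sample(current, transitions, rng=random):
    """Pick the next sample index using the current state's weight row."""
    return rng.choices(range(len(SAMPLES)), weights=transitions[current], k=1)[0]

# Placeholder weights: every transition equally likely.
transitions = [[1] * 7 for _ in range(7)]

rng = random.Random(42)   # seeded so the walk is repeatable
state = 0
played = []
for _ in range(8):        # eight beats of playback
    state = next_sample(state, transitions, rng)
    played.append(SAMPLES[state])
```

Biasing a row (e.g. giving one column most of the weight) makes that sample dominate whenever the chain is in the corresponding state.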

3.3.6 Playback of Random Samples


The Joy-Mustick also allows the user to trigger the playback of a number of samples that have been loaded into the patch. These samples (in keeping with the performance theme) were obtained online and are taken from SEGA's Sonic the Hedgehog game. As can be seen in Figure 11, by moving the joystick to the back position and hitting the fire button, the patch will randomly choose one of five samples, which are sent to the DAC for playback. This gives the user an opportunity to produce immediate, random sound at any given interval.

6. REFERENCES
[1] P. Cook, "Principles for designing computer music controllers", in Proceedings of the 2001 Conference on New Interfaces for Musical Expression, 2001, pp. 1-4.

[2] Dan Deacon - CRYSTAL CAT, 2007. [Online]. Available: http://www.youtube.com/watch?v=vFlBJ1xZK10/. [Accessed: 02-Dec-2011].

[3] SARCduino - MuSE. [Online]. Available: http://www.musicsensorsemotion.com/2010/03/08/sarcduino/. [Accessed: 05-Dec-2011].

[4] Logic Pro Tutorial - Classic Video Game Music, 2010. [Online]. Available: www.youtube.com/watch?v=zQFYMppaMFQ/. [Accessed: 02-Dec-2011].

[5] T. Blaine, "The convergence of alternate controllers and musical interfaces in interactive entertainment", in Proceedings of the 2005 Conference on New Interfaces for Musical Expression, 2005, pp. 27-33.
