JET Content Authoring Guidelines
JET Interactive Music Engine
Version 1.0
Authored by SONiVOX
www.sonivoxrocks.com
Copyright 2009 Sonic Network, Inc.
This document contains content creation
guidelines for composers and sound designers authoring music and sound effects
for the SONiVOX JET platform. JET is an
interactive music player for small embedded devices, including the Google Android
platform. It allows applications to include interactive music soundtracks, in MIDI format, that respond in real-time to game play
events and user interaction.
JET works in conjunction with SONiVOX's
Embedded Audio Synthesizer (EAS), which is the MIDI
playback device for Android. Both the
JET and EAS engines are integrated into the Android embedded platform and are
also built into the JET Creator application. As such, the JET content author can
be sure that playback will sound exactly the same in JET Creator
and in the final Android application running on Android mobile devices.
The JET content author works in up to three
different applications to create JET content: a standard MIDI
sequencer (Logic, Cubase, etc.), optionally a DLS2 instrument editor (such as Awave),
and the JET Creator application to add and audition JET interactive elements.
The final result is a .jet file that the
content author gives to the application programmer for use in the game or
application.
It is important to use a common set of
terms to minimize confusion. Since JET uses MIDI
in a unique way, normal industry terms may not always suffice. Here are the
definitions of the terms as they are used in this document and in the JET Creator
application:
Channel: MIDI data associated with a specific MIDI
channel. Standard MIDI allows for 16 channels of MIDI
data, each of which is typically associated with a specific instrument.
Controller: A MIDI event consisting of a
channel number, controller number, and a controller value. The MIDI spec associates many controller numbers with
specific functions, such as volume, expression, sustain pedal, etc. JET also
uses controller events as a means of embedding special control information in a
MIDI sequence to provide for audio
synchronization.
DAW: Digital Audio Workstation. A common term for MIDI
and audio sequencing applications such as Logic, SONAR, Cubase and others.
EAS: Embedded Audio Synthesizer. The
name of the SONiVOX MIDI synthesizer engine.
JET: Jet Interactive Engine. The name of the SONiVOX JET interactive
music engine.
Segment: A musical section such as a chorus or verse that is a component of
the overall composition. In JET, a segment can be an entire MIDI file or
derived from a portion of a MIDI file.
SMF-0: Standard MIDI File Type 0, a MIDI file that contains a single
track, but may be made up of multiple channels of MIDI
data.
SMF-1: Standard MIDI File Type 1, a MIDI file that contains one or more
tracks, and each track may in turn be made up of one or more channels of MIDI data. By convention, each channel is stored on a
separate track in an SMF-1 file. However, it is possible to have multiple MIDI
channels on a single track, or multiple tracks that contain data for the same MIDI channel.
Track: A single track in a DAW containing a timed sequence of MIDI events. Be careful not to confuse Tracks with
Channels. A MIDI file may contain many tracks with several tracks utilizing the
same MIDI channel.
1 The JET Interactive Music Concept
Interactive music can be defined as music
that changes in real-time according to non-predictable events such as user
interaction or game play events. In this way, interactive music is much more
engaging, as it has the ability to match the energy and mood of a game much
more closely than a pre-composed piece that never changes. In some applications
and games, interactive music is central to the game play. Guitar Hero is one
such popular game. When the end user successfully captures the musical notes
coming down the fret board, the music adapts itself and simultaneously keeps
score of successes and failures. JET allows for these types of music-driven
games as well.
There are several methods for making and
controlling interactive music, and JET is one such method. This section
describes the features of JET and how they might be used in a game or software
application. It also describes how JET can be used to save memory in small-footprint
devices such as Android-enabled mobile handsets.
JET supports a flexible music format that
can be used to create extended musical sequences with a minimal amount of data.
A musical composition is broken up into segments that can be sequenced to
create a longer piece. The sequencing can be fixed at the time the music file
is authored, or it can be created dynamically under program control.
Figure 1: Linear Music Piece
This diagram shows how musical segments are
stored. Each segment is authored as a separate MIDI
file. A post-processing tool combines the files into a single container file.
Each segment can contain alternate music tracks that can be muted or un-muted
to create additional interest. An example might be a brass accent in the chorus
that is played only the last time through. Also, segments can be transposed up
or down.
The bottom part of the diagram shows how
the musical segments can be recombined to create a linear music piece. In this
example, the bridge might end with a half-step key modulation and the remaining
segments could be transposed up a half-step to match.
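On Android, this kind of program control is exposed through the android.media.JetPlayer class. The following is a minimal sketch of the linear case; the file path and segment numbers are illustrative assumptions, not values defined by this document:

    import android.media.JetPlayer;

    JetPlayer jet = JetPlayer.getJetPlayer();
    jet.loadJetFile("/sdcard/linear_piece.jet");  // illustrative path
    // Arguments: segmentNum, libNum, repeatCount (0 = play once),
    // transpose (semitones), muteFlags (one bit per track), userID.
    jet.queueJetSegment(0, 0, 0, 0, 0, (byte) 0); // verse, as authored
    jet.queueJetSegment(1, 0, 0, 1, 0, (byte) 1); // next segment, up a half-step
    jet.play();

The muteFlags bitmask is how the alternate tracks described above are silenced or revealed at queue time.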
Figure 2: Non-Linear Music Piece
In this diagram, we see a non-linear music
piece. The scenario is a first-person-shooter (FPS) and JET is providing the
background music. The intro plays as the level is loading and then transitions
under program control to the Searching segment. This segment is repeated
indefinitely, perhaps with small variations (using the mute/un-mute feature)
until activity in the game dictates a change.
As the player nears a monster lair, the
program starts a synchronized transition to the Danger segment, increasing the
tension level in the audio. As the player draws closer to the lair, additional
tracks are un-muted to increase the tension.
As the player enters into combat with the
monster, the program starts a synchronized transition to the Combat segment.
The segment repeats indefinitely as the combat continues. A Bonus Hit
temporarily un-mutes a decorative track that notifies the player of a
successful attack, and similarly, another track is temporarily un-muted to
signify when the player receives Special Damage.
At the end of combat, the music transitions
to a victory or defeat segment based on the outcome of battle.
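In application code, this scenario reduces to a handful of JetPlayer calls. The sketch below continues with the jet instance from the earlier sketch and assumes segment numbers and mute masks (SEG_*, DANGER_MUTES, TENSION_MUTES) that a real project would define itself:

    static final int SEG_SEARCHING = 1, SEG_DANGER = 2, SEG_COMBAT = 3; // assumed

    // Level loaded: loop the Searching segment (repeatCount -1 = indefinitely).
    jet.queueJetSegment(SEG_SEARCHING, 0, -1, 0, 0, (byte) 0);

    // Player nears the lair: switch to Danger at the next transition point.
    jet.clearQueue();
    jet.queueJetSegment(SEG_DANGER, 0, -1, 0, DANGER_MUTES, (byte) 0);

    // Player draws closer: reveal extra tension tracks, synchronized.
    jet.setMuteFlags(TENSION_MUTES, true); // true = apply at next segment boundary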
JET can also synchronize the muting and
un-muting of tracks to events in the music. For example, in the FPS game, it
would probably be desirable to place the musical events relating to bonuses and
damage as close to the actual game event as possible. However, simply un-muting
a track at the moment the game event occurs might result in a music clip
starting in the middle. Alternatively, a clip could be started from the
beginning, but then it wouldn't be synchronized with the other music tracks.
However, with the JET sync engine, a clip
can be started at the next opportune moment and maintain synchronization. This
can be accomplished by placing a number of short music clips on a decorative
track. A MIDI event in the stream signifies
the start of a clip and a second event signifies the end of a clip. When the
application calls the JET clip function, the next clip in the track is allowed
to play fully synchronized to the music. Optionally, the track can be
automatically muted by a second MIDI event.
Figure 3: Synchronized Mute/Unmute
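From the application's side, triggering a synchronized clip like this is a single call. A sketch, assuming a clip ID (CLIP_BONUS_HIT) that would be assigned when the clip events are authored in JET Creator:

    // Player lands a bonus hit: the next authored clip on the decorative
    // track plays in sync with the music, then the track mutes again.
    jet.triggerClip(CLIP_BONUS_HIT); // clip ID from JET Creator (assumption)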
JET provides an audio synchronization API
that allows game play to be synchronized to events in the audio. The mechanism
relies on data embedded in the MIDI file at
the time the content is authored. When the JET engine senses an event during
playback, it generates a callback into the application program. The timing of
the callback can be adjusted to compensate for any latency in the audio
playback system so that audio and video can be synchronized. The diagram below
shows an example of a simple music game that involves pressing the left and
right arrows in time with the music.
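On Android, these callbacks arrive through JetPlayer's event listener. A minimal sketch; handleMusicEvent is a hypothetical game hook, not part of the JET API:

    jet.setEventListener(new JetPlayer.OnJetEventListener() {
        public void onJetEvent(JetPlayer player, short segment, byte track,
                               byte channel, byte controller, byte value) {
            // An event authored into the MIDI file was reached during playback.
            handleMusicEvent(controller, value); // hypothetical game hook
        }
        public void onJetNumQueuedSegmentUpdate(JetPlayer player, int nbSegments) { }
        public void onJetPauseUpdate(JetPlayer player, int paused) { }
        public void onJetUserIdUpdate(JetPlayer player, int userId, int repeatCount) { }
    });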
Figure 4: Music Game with Synchronization
The arrows represent events in the music
sequence where game events need to be synchronized. In this case, the blue
arrow represents a time where the player is supposed to press the left button,
and the red arrow is for the right button. The yellow arrow tells the game
engine that the sequence is complete. The player is allowed a certain time
window before and after the event to press the appropriate key.
If an event is received and the player has
not pressed a button, a timer is set to half the length of the window. If the
player presses the button before the timer expires, the game registers a
success, and if not, the game registers a failure.
If the player presses the button before the
event is received, a timer is set to half the length of the window. If an event
is received before the timer expires, the game registers a success, and if not,
the game registers a failure. Game play might also include bonuses for getting
close to the timing of the actual event.
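One possible shape for this timing logic is sketched below; the window length and the now(), registerSuccess(), and failure-timer details are illustrative assumptions:

    private static final long WINDOW_MS = 400;   // total window (assumption)
    private long eventTime = -1, pressTime = -1;

    void onMusicEvent() {                        // called from the JET event callback
        if (pressTime >= 0 && now() - pressTime <= WINDOW_MS / 2) {
            registerSuccess();                   // press came first, within the half-window
            pressTime = -1;
        } else {
            eventTime = now();                   // start the half-window timer for the press
        }
    }

    void onButtonPress() {
        if (eventTime >= 0 && now() - eventTime <= WINDOW_MS / 2) {
            registerSuccess();                   // event came first, press in time
            eventTime = -1;
        } else {
            pressTime = now();                   // start the half-window timer for the event
        }
    }
    // A half-window timer that expires with no matching press or event
    // registers a failure.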
2 The JET Authoring Process
To author JET files and hear them play back interactively,
the content author will work in two or three applications which are designed to
work together smoothly. The first is any off-the-shelf MIDI sequencing application or Digital Audio Workstation
that supports VST (for PC) or AU (for Mac) plugins, such as Logic, SONAR, or Cubase.
Here the author will compose their MIDI music
files using the SONiVOX EAS Synth plugin as the playback synthesizer.
Once the composer has completed their MIDI file(s), they import them into the JET Creator application.
Here the author will set up and audition the conditions for interactive playback
within the JET-enabled game.
Optionally, the author may elect to create
a custom DLS soundbank. This can be created in any off-the-shelf DLS authoring
application, such as Awave from MJSoft, and loaded into JET Creator along with
the MIDI files.
Below is an overview of this process. A
more detailed explanation of each step follows.
- Launch digital audio workstation (DAW)
- Assign the SONiVOX EAS Synth plugin as the playback
synthesizer
- Optionally load a custom DLS2 soundset
- Compose and save MIDI file(s)
- Launch the JET Creator application
- Create segments using the MIDI
and DLS2 source files
- Add interactive elements
- Audition interactive elements
- Save and Export JET files for use in the Android application
Launch DAW: Content authors will need to
use a third-party MIDI authoring application to compose their MIDI
files. It is recommended they use a digital audio workstation (DAW) application
that supports VST or AU plugins, as this will enable them to listen to the EAS
MIDI synthesizer and DLS2 soundsets that will be utilized in the Android
application itself. Some examples of popular DAWs include SONAR (PC) and Logic
(Mac).
Assign the SONiVOX EAS Synth plugin as the
playback synthesizer: The SONiVOX EAS Synth plugin is a VST- and AU-compatible
virtual instrument that plugs into VST or AU compatible DAWs. This software
plugin uses the same SONiVOX EAS MIDI synthesizer engine and default General
MIDI wavetable soundset inherent in
Android. Using this plugin allows content authors to hear the exact audio
rendering of the instruments and MIDI files
that will be used in their Android applications.
Optionally load a DLS2 soundset: The SONiVOX
EAS Synth plugin allows for the loading of any DLS2-compatible soundset for
playback. These could include a new GM wavetable set, or a small collection of
just a few custom instruments for a given application. Note that the DLS file does
not replace the internal GM wavetable used by the EAS engine; DLS soundsets
play in conjunction with the internal GM wavetable.
Compose MIDI file(s): Compose MIDI soundtracks for the Android application.
Launch JET Creator: Once all DLS2 and MIDI source files have been authored, the content author
should launch JET Creator and begin creating JET segments. The segments
will reference the MIDI files and any custom
DLS2 soundbanks.
Assign JET segment attributes: After
creating segments, the content author assigns interactive elements. Interactive elements
include mute and unmute settings for individual tracks in the MIDI file(s) as
well as MIDI controller numbers that serve as
events in the game. These attributes tell the JET engine how and when to play
the different musical segments according to the JET API commands in the Android
application. See below for more detail on this.
Audition interactive playback: After
assigning the segment attributes and creating the JET file, the content author
can audition all interactive playback elements in the JET Audition window.
Save the .jtc file: After the author is
satisfied with the result, it is recommended they save the JET Creator .jtc
file, which preserves their settings, references to source files, etc.
Export files: Exporting the JET Creator
file will bundle all source files and their attributes into a single .zip file.
The zip file will also contain a .jet file for use by the Android application.
3 EAS Synth Virtual Instrument Plugin
Included in the JET Creator package is the
EAS software synthesizer in plugin format. The EAS Synth plugin allows the
composer to hear the instruments used in Android as they are composing their MIDI sequence. The EAS Synth plugin also allows for the
loading of custom DLS2 sounds.
3.1 Installing the EAS Synth Plugin
Follow the instructions for your individual
DAW to install and utilize the plugin. For Mac users this will typically
involve copying the EAS Synth.component file into your plugins folder, which
is usually located at /Library/Audio/Plug-ins/Components. PC users will want to
install the EAS Synth.dll into the plugin folder that their DAW requires.
3.2 Requirements and Settings for Using the EAS Synth Plugin
The EAS Synth is an embedded synthesizer
for small mobile devices. This means it does not have the flexibility of the
high-end synthesizers typically utilized in a professional application such as
Logic, Digital Performer, etc. As such, only the following attributes are
supported.
Macintosh:
- Mac OS X (Intel) Macs
- ASIO-supported soundcards
- Sample Rate: 44100 Hz
- Buffer Size: 256 kbytes

PC:
- Windows 2000 or Vista operating systems
- ASIO-supported soundcards
- Sample Rate: 44100 Hz
- Buffer Size: 256 kbytes
3.3 Assigning MIDI Tracks to Use the EAS Synth
Each DAW has its own particular method of
assigning MIDI tracks to virtual instrument
plugins such as the SONiVOX EAS Synth. Please consult the user manual for your
DAW for detailed instructions. Below are some general guidelines for Logic
(Mac) and SONAR (PC).
3.3.1 Logic 8
The SONiVOX EAS Synth virtual instrument is
a multi-timbral synthesizer (i.e., it plays back multiple instruments on unique
MIDI channels in a single instance). In Logic
8, however, you'll want to set up 16 Logic Instruments,
each with its own instance of the EAS Synth. Each Instrument should be assigned
its own MIDI channel. Use channel 10 for
drums. The reason for this is that MIDI controller messages, such as Volume
(CC7) and Pan (CC10), will not be channelized if the plugin is assigned to only
a single Instrument and all MIDI tracks are
set to play back on that Instrument. In order for each MIDI
channel to respond to its own controller messages, you must assign 16 different
EAS Synth instances to 16 unique Logic Instruments.
A Logic 8 template file has been included
in the Android Cupcake release to facilitate the above.
Playback in Logic 8 may require you to be
in record-enable mode for each track you are auditioning. To record-enable
multiple tracks, hold down the Option key.
To write out a standard MIDI
(Type 1) file from Logic, you need to use the File > Export command. IMPORTANT:
Most edits in Logic are non-destructive edits, meaning they are not modifying
the actual data but rather adding an overlay onto the data. Quantize is one
such non-destructive edit. Therefore, when you export a MIDI
file, you may not see your quantization settings.
In addition, the mix parameters for volume,
pan, and program changes may not appear in the event list and therefore may not
write out with the MIDI file. Before exporting
a MIDI file in Logic, it is recommended you do
the following:
- Select All and use the Insert MIDI > Insert MIDI Settings as Events command.
- Select All and use the Apply Quantization Settings Destructively command.
3.3.2 Cakewalk SONAR 7
SONAR 7 is a bit easier to set up, use, and
save in than Logic 8. Simply open or start a new MIDI
file. Go to the Insert menu and select Insert Soft Synth > SONiVOX > EAS
Synth. Then assign each MIDI track's output to
the EAS Synth. There is no need to record-enable a track to hear it play back.
When saving, be sure to select MIDI Type 1.
SONAR 8 works similarly to SONAR 7.
3.3.3 Digital Performer
We've seen some instances when creating
content with Digital Performer where notes with a non-zero release velocity
will generate an extra note-on event in the EAS synth. If you are hearing a
doubling, editing the release velocities to zero should fix this problem.
3.4 Using Custom DLS2 Soundsets
The SONiVOX EAS Synthesizer supports two
simultaneous soundsets or wavetables. One is the internal General MIDI wavetable
inherent to the SONiVOX EAS Synthesizer. The other is a Downloadable Sounds
Level 2 (DLS2) soundset. The internal wavetable is a GM Level 1 compliant
wavetable with 127 melodic instruments and 1 drum kit. It is in a proprietary
SONiVOX format. DLS2 is an open format published by the MIDI
Manufacturers Association.
In the Android Cupcake release, the
internal wavetable is only 200 kbytes, deliberately small in order to be compatible
with all Android devices, including those that do not have a lot of memory. DLS2 soundsets can
be any size that a particular device supports. Upgraded (larger) internal
wavetables as well as custom DLS2 instruments can be licensed from SONiVOX.
3.4.1 Loading a DLS2 Soundset
To load a custom soundset, click on the
Load DLS button in the EAS Synth plugin interface. Browse to the DLS2 file you
wish to load and click OK. Only DLS Level 2 formatted soundsets are
supported.
3.4.2 Using a DLS2 Soundset
Since both the internal EAS GM wavetable
and a custom DLS2 soundset are used simultaneously, you must be sure you have
your MIDI Program Changes set correctly. DLS2 instruments must be assigned to a
Bank other than the default GM bank
used by the internal synthesizer.
The internal EAS synthesizer is assigned to
Banks 121 (melodic instruments) and 120 (drum instruments). This follows the
General MIDI Level 1 specification. Note: Most MIDI
sequencers require you to use Bank 0 to select the default wavetable. Custom
DLS2 soundsets, therefore, should utilize a different Bank. We recommend Bank
1.
The EAS synth supports MSB (Controller 0)
and LSB (Controller 32) Bank change messages. There are two places you need to set
this Bank and Program Change number. The first is in your DLS2 soundset. Using
Bank 1, each Instrument would be assigned MSB 1, LSB 0, then the Instrument
Program Change number. The second place to use the Bank and Program Change
number is in your MIDI sequence.
In your MIDI
track, the MSB should be sent first, followed by the LSB, and then the Program
Change. For example, if your DLS2 instrument is assigned MSB 1, LSB 0,
Program 1, you would send CC0 = 1, followed by CC32 = 0, followed by Program
Change 1. This might look like the following in an event window (positions are illustrative):
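    Position   Event                 Value
    1.1.000    Controller 0 (MSB)    1
    1.1.010    Controller 32 (LSB)   0
    1.1.020    Program Change        1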
4 JET Creator Guidelines
JET Creator is the desktop application
where you'll edit and audition the JET interactive music elements. For details
on the JET Creator application, please see the JET Creator User Manual. Below
are some additional guidelines to help you out.
4.1 Order of Tasks
As with all projects, it's best to discuss and
design the interactive music scheme with the game designer and programmer
before beginning your composition. An outline and/or specification can go a
long way in saving you from having to redo things after the game is in place.
In general you'll want to first write your
music in your DAW of choice, the way you're used to composing, then break up the
final MIDI file as needed for the application.
Next, move to JET Creator and create all of your music segments, in an order
that makes them easy to preview when played in sequence. Then add the JET Events to
control the segments via the Android game, and audition them as needed in JET
Creator. Finally, save the project in JET Creator and hand off the .jet file to
the programmer to integrate into the game. After previewing in the game, there will likely
be changes to the MIDI file(s) and JET Creator
attributes.
4.2 Conserving Memory
If you're trying to conserve memory,
compose as few MIDI files as possible and create several segments from each MIDI file. For example, a 12-bar MIDI
file with three 4-bar sections, A, B, and C, can create a much longer song.
Simply create multiple segments that reference the one MIDI
file, then order them however you like. For example, A, A, B, A, C, A, B, A, A
would create a 36-bar song. Use JET to add repeats, transpose segments, and
interactively mute and unmute tracks to make it even more interesting.
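As a sketch of how the programmer could queue that ordering, assuming sections A, B, and C were exported from JET Creator as segments 0, 1, and 2 (an assumption for illustration):

    int[] order = {0, 0, 1, 0, 2, 0, 1, 0, 0};  // A A B A C A B A A = 36 bars
    for (int segment : order) {
        jet.queueJetSegment(segment, 0, 0, 0, 0, (byte) 0);
    }
    jet.play();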
4.3 Replicate
To make adding segments or events faster,
use the Replicate command. Replicate can add multiple segments or events at one
time and uses an offset parameter and prefix naming convention to keep things
easy to read. The MOVE command is also useful for moving multiple events by a
set number of measures, beats or ticks.
4.4 Interactive Options
There are several interactive audio
concepts possible in JET. Below are a few examples, although we hope developers
will come up with others we haven't thought of! These are:
4.4.1 Multiple Segment Triggering
In this method the application triggers
specific segments based on events in the game. For example, a hallway
with lots of fighting might trigger segment 1, and a hallway with no fighting
might trigger segment 2. Using JET TriggerClips in conjunction with this method
creates even more diversity.
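A sketch of the application side, with the segment numbers assumed:

    void enterHallway(boolean fighting) {
        jet.clearQueue();
        // Loop whichever segment suits the current game state.
        jet.queueJetSegment(fighting ? SEG_FIGHT_HALL : SEG_QUIET_HALL,
                            0, -1, 0, 0, (byte) 0);
    }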
4.4.2 Mute Arrays
In this method the application triggers
mute and unmute events on specific tracks in a single MIDI sequence. For example, a hallway with lots of
fighting might play MIDI tracks 1-16, and a
hallway with no fighting might play the same MIDI file but mute tracks 9-16.
Using JET TriggerClips in conjunction with this method creates even more
diversity.
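A sketch using JetPlayer's mute array; the array indexes tracks from 0, so MIDI tracks 9-16 map to indexes 8-15 (an assumption about how the tracks were authored):

    boolean[] mutes = new boolean[JetPlayer.getMaxTracks()];
    for (int t = 8; t < 16; t++) {
        mutes[t] = true;               // silence tracks 9-16 for the quiet hallway
    }
    jet.setMuteArray(mutes, true);     // true = change at the next segment boundary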
4.4.3 Music Driven Gameplay
Music driven gaming is similar to what
Guitar Hero and JETBOY have done, in that the music content determines how
graphic events are displayed. The application then queries the user response to
the graphic events and interactively modifies the music in response. In this
method the game utilizes JET Application Events, MIDI controllers that are
embedded in the MIDI file and read by the game
in real-time. Based on the user response, multiple segment triggering and/or
mute arrays can be set.
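Tying the pieces together: the game reads the embedded controllers in the JET event callback, scores the player's response, and adapts the music. The controller number and the constants below are assumptions for illustration:

    public void onJetEvent(JetPlayer player, short segment, byte track,
                           byte channel, byte controller, byte value) {
        if (controller == APP_EVENT_CONTROLLER) { // number chosen at authoring time
            showNoteTarget(value);                // hypothetical: display the graphic event
        }
    }

    void onPlayerResponse(boolean success) {
        if (success) {
            jet.setMuteFlags(FULL_MIX, true);     // reward: open up more tracks
        } else {
            jet.queueJetSegment(SEG_SPARSE, 0, -1, 0, 0, (byte) 0); // thin the music out
        }
    }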