
M. Rauterberg (Ed.): ICEC 2004, LNCS 3166, pp. 529–540, 2004. © IFIP International Federation for Information Processing 2004

iFP: A Music Interface Using an Expressive Performance Template

Haruhiro Katayose 1,2 and Keita Okudaira 1

1 School of Science and Technology, Kwansei Gakuin University, Sanda 669-1337, Hyogo, Japan

{katayose, keita}@ksc.kwansei.ac.jp
http://ist.ksc.kwansei.ac.jp/~katayose/

2 PRESTO, JST, JAPAN

Abstract. This paper describes a performance interface called iFP that enables players to play music as if they had the hands of a virtuoso. iFP is a tapping-style musical interface that refers to a pianist's expressiveness described in a performance template. The paper describes the scheduler, which allows a player to mix her/his own intention with the expressiveness in the performance template, and the user interfaces. The results of a subjective study suggest that using the expression template and the tapping-style interface contributes to the subject's joy of playing music. This result is also supported by a brain activation study done using near-infrared spectroscopy.

1 Introduction

Music has been an important form of entertainment since ancient days. Recently, interfaces for playing music have been incorporated into video games. In 2003, the game “Taiko-no-Tatsujin” (Master of Drumming), manufactured by NAMCO Ltd., sold more than 1.2 million copies, reaching 5th place in game sales for that year. Many people enjoy playing such music games. However, the object of most of these products is to have players compete over who displays the most rhythmic accuracy for a given piece of music, not to provide a pleasurable musical experience for the players.

Musical activities foster self-expression, the acquisition of new skills, the appreciation of good art, and the sharing of experiences with others. Above all, the performance itself is the fundamental musical activity. Although it is fun to play a musical instrument, more than a few people have experienced embarrassment due to their lack of skill in playing one. Sometimes this may even be the reason for a musician quitting and giving up a means of self-expression. Interactive musical instruments are meant to overcome this problem. They are expected to give users a chance to express what they would like to express, even if they lack certain musical skills.

The score follower based on beat tapping proposed by Mathews [1] is a simple, intuitive music interface for expressing tempo and dynamics, intended especially for amateurs. Mathews’ work has been followed by various conducting systems [2,3,4].


If the note descriptions of the score given to the system are nominal (quantized), the player's expression is limited to tempi and dynamics. We designed a score follower called iFP, which utilizes expression templates derived from virtuoso performances [5]. iFP enables its users to enjoy the experience of playing music as if they had the hands of a virtuoso.

The next section outlines the design of iFP. We then describe the scheduler that realizes a mixture of the player’s intention and the expressiveness described in the performance template. After introducing the user interfaces, we discuss the effectiveness of using the expressive performance template, as determined from a subjective evaluation and an observation of the test subjects’ brain activity.

2 System Overview

In this section, we briefly describe the iFP design and some of its functions. We then illustrate how the expressive performance template is utilized in iFP and describe its principal functions.

2.1 Utilizing a Template

Beat tapping is an intuitive interface for inputting tempo and dynamics to a performance system. However, the player cannot express delicate sub-beat nuances with beat tapping alone. The primary goal of utilizing the expressive performance template is to fill in expression at the sub-beat level. The player’s intention and the expression model described in the template are mixed as shown in Fig. 1.

[Fig. 1 appears here. It depicts a three-dimensional expression space with axes for tempo, beat dynamics, and delicate control within a beat. The template's expression vector and the user's gesture/intention are combined, relative to a datum point without expression, into the adopted weighted expression.]

Fig. 1. Conceptual overview of performance calculation. The performance data are given by a mixture of the player’s intention and the expressiveness described in the performance template. In this three-dimensional space, the vertical axis denotes the variance of the deviations of all notes within the beat.

The player is allowed to vary the weight parameters dynamically by using sliders, each of which is multiplied with the deviation of tempo, dynamics, or delicate nuance within a beat. If a weight parameter is set to 0%, the corresponding expression in the template has no effect. On the other hand, if it is set to 120%, for example, the player emphasizes the deviations of the template.
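As a rough illustration of this weighting scheme, the following Python sketch shows how a 0% weight reduces a template value to its deadpan nominal value, while a 120% weight exaggerates the template's deviation. The function name and representation are hypothetical, not iFP's actual code:

```python
# Hypothetical sketch: blending a nominal (deadpan) value with a template
# value under a user-controlled weight. Not iFP's internal implementation.

def apply_weight(nominal: float, template_value: float, weight: float) -> float:
    """weight = 0.0 ignores the template; weight = 1.2 (120%) exaggerates it."""
    deviation = template_value - nominal
    return nominal + weight * deviation

# A beat whose deadpan tempo is 120 BPM but which the template plays at 126 BPM:
print(apply_weight(120.0, 126.0, 0.0))  # 120.0 -- template has no effect
print(apply_weight(120.0, 126.0, 1.2))  # 127.2 -- deviation emphasized by 120%
```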


iFP also provides a morphing function to interpolate (or extrapolate) between two different expressive performance templates of a musical piece.
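The morphing can be pictured as linear inter/extrapolation over the aligned deviation values of the two templates. A minimal sketch, under the simplifying assumption that each template is reduced to a flat, note-aligned list of deviation values:

```python
# Hypothetical sketch: morphing two expressive templates of the same piece.
# Each template is reduced here to a flat, note-aligned list of deviations.

def morph(template_a: list[float], template_b: list[float], m: float) -> list[float]:
    """m = 0.0 reproduces A, m = 1.0 reproduces B; values outside [0, 1]
    extrapolate beyond either template's expressive tendencies."""
    return [(1.0 - m) * a + m * b for a, b in zip(template_a, template_b)]

print(morph([0.10, -0.05], [0.04, 0.02], 0.5))  # halfway between A and B
print(morph([0.10, -0.05], [0.04, 0.02], 1.5))  # extrapolated past B
```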

2.2 Outline of Scheduling

Schedulers of interactive music systems have to calculate the timing of notes dynamically. iFP adopts a predictive scheduler, which arranges the notes from the current beat to the next beat by using the history of the player's taps. An important point in using a predictive scheduler is that tap (beat) detection and the scheduling of notes should be independent. This yields two merits: 1) compensation of the delay when using a MIDI-driven robotic acoustic instrument, and 2) easy implementation of an automatic performance mode (a sequencer of the performance template). A predictive scheduler might produce undesired gaps between the predicted beat timing and the player's actual tap. Especially if the gap is a delay, it may be perceived as spoiling the performance. We prepared two countermeasures to improve the response: one is a function that urges the system forward when the player's tap precedes the scheduled time, and the other is receiving multiple taps for a given tactus (see the Expressive Performance Template section).

2.3 Functions

The features described above and the other characteristic functions are summarized as follows:
• Utilization of an expressive performance template
• Real-time control of the weight parameters regarding expression
• Morphing of two different performance templates
• Predictive scheduling that allows the player to tap on an arbitrary beat
• Pauses (breaths) inserted based on release timing
• Real-time visualization (feedback) of expressiveness
• Gestural input using a conducting sensor or a MIDI keyboard

2.4 Expressive Performance Template

Fig. 2 shows a part of a performance template. The left column gives the start timing of the events. Information about each note, other than its start timing, is placed in brackets: in order, the bracketed terms are the deviation of the start time, the note name, the velocity, the duration, and the deviation of the duration. In this format, the tempo is described using the descriptor BPM, followed by the tempo (in beats per minute) and the beat name to which the tempo applies.


.....
2.00 BPM 126.2 4
2.00 (0.00 E3 78 3.00 -0.11)
=2
1.00 TACTUS 2 4
1.00 BPM 128.1 4
1.00 (0.00 C#4 76 0.75 -0.09) (0.04 E1 60 1.00 -0.13)
1.75 (0.10 D4 77 0.25 -0.14)
2.00 BPM 130.0 4
2.00 (0.00 B3 75 1.00 -0.03) (0.00 G#3 56 1.00 0.03)
3.00 BPM 127.7
3.00 (0.00 B3 72 1.00 0.00) (0.09 G#3 56 1.00 -0.12) (0.14 D3 57 1.00 -0.21)
=3
1.00 TACTUS 1 4
1.00 BPM 127.6 4
1.00 (0.00 B3 77 2.00 -0.05) (0.00 G#3 47 2.00 -0.05) (-0.06 D4 57 2.00 -0.32)
3.00 BPM 129.7 4
3.00 (0.00 F#4 75 1.00 -0.15) (0.00 D4 54 1.00 0.03)
=4
1.00 BPM 127.7 4
1.00 (0.00 D#4 73 0.75 -0.38) (0.02 C4 65 0.75 -0.08)

Fig. 2. Description of a performance template.

iFP’s predictive scheduler continues the performance even if the performer stops tapping; the player does not have to tap every beat. However, players often wish to tap to each note instead of each beat. We therefore introduced a descriptor, TACTUS, that explicitly describes how many taps are received per beat. The following bracketed expression is an example of a TACTUS description: (1.00 TACTUS 2 4). It means that after time 1.00, two taps are received per quarter note; in other words, the system receives a tap every eighth note after time 1.00.

It is not easy to obtain a quantized notation, because local tempi vary by more than a factor of two from the average tempo, and manual quantization is extremely troublesome. We therefore designed a tool that identifies the notes in the performance given only a sparse score and then assigns a canonical notation value and a deviation to each note [6]. DP matching is used in the first step, and a hidden Markov model (HMM) is used in the second step to assign time values to the notes. This tool enables us to prepare error-free expressive performance templates by giving only 10% of the notes as guides. At present, we have made over 100 expressive performance templates.
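To make the format concrete, here is a small parser sketch for the line types visible in Fig. 2. The line grammar is inferred from the description above, and the field names are our own, not iFP's internal ones:

```python
import re

# Sketch of a parser for the template format of Fig. 2 (grammar inferred
# from the paper's description; record field names are hypothetical).

NOTE = re.compile(r"\(([-\d.]+) ([A-G]#?\d) (\d+) ([-\d.]+) ([-\d.]+)\)")

def parse_line(line: str):
    line = line.strip()
    if line.startswith("="):                 # measure marker, e.g. "=2"
        return ("measure", int(line[1:]))
    time, rest = line.split(" ", 1)
    if rest.startswith("BPM"):               # e.g. "1.00 BPM 128.1 4"
        _, bpm, *beat = rest.split()
        return ("tempo", float(time), float(bpm))
    if rest.startswith("TACTUS"):            # e.g. "1.00 TACTUS 2 4"
        _, taps, beat = rest.split()
        return ("tactus", float(time), int(taps), int(beat))
    # Otherwise: note events, each "(onset-dev name velocity dur dur-dev)"
    notes = [(float(d), name, int(vel), float(dur), float(ddur))
             for d, name, vel, dur, ddur in NOTE.findall(rest)]
    return ("notes", float(time), notes)

print(parse_line("1.00 (0.00 C#4 76 0.75 -0.09) (0.04 E1 60 1.00 -0.13)"))
```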

3 Scheduler

In this section, we describe the scheduler that realizes a mixture of the player’s intention and the expressiveness described in the performance template.

3.1 Calculation of Tempo

The tempo is calculated using 1) the average tempo obtained from the recent history (tactus count: α) of the tempo, 2) the tempo reflecting the player's veer, computed from the differential of the two most recent taps, and 3) the tempo prescribed in the performance template [$\mathrm{Tempo}_T$]. Let stdTempo denote the overall average tempo of the template, and $w_H$, $w_P$, and $w_T$ denote the weights for 1), 2), and 3), respectively. The tempo after the $n$th tactus, $\mathrm{BPM}_n$, is calculated as:

$$\mathrm{BPM}_n = \frac{w_T\left(\mathrm{Tempo}_T - \mathrm{stdTempo}\right) + w_H \cdot \frac{1}{\alpha}\sum_{k=n-\alpha}^{n-1}\left(\mathrm{BPM}_k - \mathrm{stdTempo}\right)}{w_T + w_H} \cdot \left(\frac{\mathrm{BPM}_{n-1}}{\mathrm{BPM}_{n-2}}\right)^{w_P} \qquad (1)$$

Fig. 3 shows an example of the tempo calculation. If the player sets $w_T$ to a larger value, more template data are imported, and the player can feel as if conducting a pianist. Setting $w_P$ to a larger value quickens the response. Setting $w_H$ to a larger value stabilizes the tempo of the music, which is pulled toward the recent average tempo. The user can set the parameters as s/he likes.

Fig. 3. Calculation of tempo.
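A minimal sketch of Eq. (1) follows. Because the equation is reconstructed from a garbled source, treat its exact form, in particular the step that adds stdTempo back to obtain an absolute tempo, as our assumption:

```python
# Sketch of Eq. (1) as reconstructed above. Re-adding std_tempo to obtain
# an absolute tempo is our assumption; the source rendering is ambiguous.

def next_bpm(history: list[float], tempo_t: float, std_tempo: float,
             w_t: float, w_h: float, w_p: float, alpha: int = 4) -> float:
    recent = history[-alpha:]                      # last alpha tactus tempi
    hist_dev = sum(b - std_tempo for b in recent) / len(recent)
    blended = (w_t * (tempo_t - std_tempo) + w_h * hist_dev) / (w_t + w_h)
    veer = (history[-1] / history[-2]) ** w_p      # player's recent accel./decel.
    return (std_tempo + blended) * veer

# Template says 128 BPM here; the player has been speeding up toward 126 BPM.
print(next_bpm([120.0, 122.0, 125.0, 126.0], tempo_t=128.0, std_tempo=124.0,
               w_t=0.5, w_h=0.5, w_p=0.3))
```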

3.2 Improvement of Response and Estimation of Beat Position

The problem with using predictive control is the possibility of undesired gaps between the predicted beat time and the actual user input. We introduced the following procedures to close the gap when the player's tap for the next beat arrives before the scheduled (predicted) timing. Fig. 4 shows the scheduling status when a) the player’s tap precedes the scheduled beat and b) the player’s tap follows the scheduled beat. Here, the adjustment level is a parameter that determines how much the scheduler shrinks the gap between the scheduled (predicted) time and the player's actual tap when the beat is detected before the scheduled time. If the adjustment level is set to 50%, the system issues the events that correspond to the notes on the beat at the middle position between the player’s tap and the scheduled time.


Fig. 4. Beat position: a) the player’s tap precedes the scheduled beat; b) the player’s tap follows the scheduled beat.

Estimating the beat position in music is one of the challenging unsolved problems in cognitive musicology. Nevertheless, the scheduler has to estimate the position of the current beat in order to predict the next beat. iFP’s scheduler estimates the position of the current beat by inter/extrapolating between the timing of the player’s tap and the time at which the scheduler issued the events corresponding to the beat. The adjustment ratio employed here is the value used for this inter/extrapolation. If the adjustment ratio is set to a higher/lower value, the performance tends toward the player’s/template’s intention.
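The adjustment behaves like linear interpolation between the scheduled time and the tap time. A minimal sketch, with hypothetical names:

```python
# Hypothetical sketch of the adjustment described above: when a tap arrives
# before the predicted beat, the issued beat time is pulled toward the tap.

def adjusted_beat_time(tap_time: float, scheduled_time: float,
                       adjustment_level: float) -> float:
    """adjustment_level = 0.0 keeps the schedule, 1.0 jumps to the tap,
    and 0.5 issues the beat midway between tap and scheduled time."""
    return scheduled_time + adjustment_level * (tap_time - scheduled_time)

print(adjusted_beat_time(tap_time=0.96, scheduled_time=1.00,
                         adjustment_level=0.5))  # -> 0.98
```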

3.3 Calculation of Note Event Timing

The timing of each note event (note-on, note-off) is calculated using $\mathit{IOI}_n$, given by the inverse of $\mathrm{BPM}_n$ (see the Calculation of Tempo section), as follows:

$$\mathit{Time}_{each\_note\_issue} = \mathit{IOI}_n \cdot \left(\mathit{pos}_{T\_each\_note} + \mathit{dev}_{T\_each\_note} \cdot w_{T\_dev}\right) \qquad (2)$$

where $\mathit{Time}_{each\_note\_issue}$ [s] is the time after the identified current beat, $\mathit{pos}_{T\_each\_note}$ is the scheduled time of the note without deviation, $\mathit{dev}_{T\_each\_note}$ is the value of the deviation term of the note, and $w_{T\_dev}$ is the weighting factor for the template. When $w_{T\_dev} = 0$, the temporal deviation below the beat level is mechanical.
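A direct transcription of Eq. (2) in Python, under the assumption that $\mathit{IOI}_n$ is expressed in seconds per beat (i.e., $60/\mathrm{BPM}_n$):

```python
# Transcription of Eq. (2), assuming IOI_n = 60 / BPM_n (seconds per beat).

def note_issue_time(bpm_n: float, pos: float, dev: float, w_t_dev: float) -> float:
    """Time (s) after the current beat at which to issue the note event."""
    ioi_n = 60.0 / bpm_n
    return ioi_n * (pos + dev * w_t_dev)

# With w_t_dev = 0 the note falls exactly on its quantized sub-beat position.
print(note_issue_time(120.0, pos=0.5, dev=-0.14, w_t_dev=0.0))  # 0.25 s
print(note_issue_time(120.0, pos=0.5, dev=-0.14, w_t_dev=1.0))  # 0.18 s
```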

3.4 Calculation of Velocity (Note Intensity)

The notes are classified into control notes (notes on the beat) and the remainder. First, the system decides the beat velocity $V_{beat}$. It is calculated by considering how loudly or quietly the player and the machine (performance template) intend to play the note on the beat:

$$V_{beat} = V_{std} + \frac{(V_T - V_{std}) \cdot w_{T\_v} + (V_U - V_{std}) \cdot w_{U\_v}}{w_{T\_v} + w_{U\_v}} \qquad (3)$$

where $V_{std}$ is the standard (average) velocity of all notes in the template, $V_T$ is the average note-on velocity within the beat, $V_U$ is the velocity given by the player, and $w_{T\_v}$ and $w_{U\_v}$ are the weights for $V_T$ and $V_U$, respectively. When $w_{T\_v}$ and $w_{U\_v}$ are 0, the intensity deviation is mechanical.


The velocity of each note, $V_{each\_note\_issue}$, is calculated as:

$$V_{each\_note\_issue} = V_{beat} \cdot \left(1 + V_{T\_each\_note\_dev} + V_{U\_each\_dev}\right) \qquad (4)$$

where $V_{T\_each\_note\_dev}$ stands for the deviation in the template, and $V_{U\_each\_dev}$ stands for the player's intensity veer:

$$V_{T\_each\_note\_dev} = \frac{V_{T\_each\_note} - V_T}{V_T} \cdot w_{T\_dev} \qquad (5)$$

$$V_{U\_each\_dev} = \frac{V_{U\_current} - V_{U\_prior}}{V_{U\_prior}} \cdot \left(\mathit{pos}_{T\_each\_note} + \mathit{dev}_{T\_each\_note} \cdot w_{T\_dev}\right) \cdot w_{Ud\_v} \qquad (6)$$

where $V_{T\_each\_note}$ is each note-on velocity within the beat, $V_{U_n}$ denotes the velocity given by the player's $n$th tap (so $V_{U\_current}$ and $V_{U\_prior}$ are the current and previous tap velocities), and $w_{Ud\_v}$ denotes the weight for the player's veer.
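A compact sketch of Eqs. (3)-(5), leaving out the veer term of Eq. (6) for brevity; the variable names mirror the symbols in the equations:

```python
# Sketch of Eqs. (3)-(5); the player's veer term of Eq. (6) is set to zero.

def beat_velocity(v_std: float, v_t: float, v_u: float,
                  w_t_v: float, w_u_v: float) -> float:
    """Eq. (3): blend the template's (v_t) and the user's (v_u) loudness."""
    return v_std + ((v_t - v_std) * w_t_v + (v_u - v_std) * w_u_v) / (w_t_v + w_u_v)

def note_velocity(v_beat: float, v_t_each_note: float, v_t: float,
                  w_t_dev: float) -> float:
    """Eqs. (4)-(5): scale the beat velocity by the template's deviation."""
    template_dev = (v_t_each_note - v_t) / v_t * w_t_dev
    return v_beat * (1.0 + template_dev)

vb = beat_velocity(v_std=64, v_t=72, v_u=80, w_t_v=0.5, w_u_v=0.5)  # 76.0
print(note_velocity(vb, v_t_each_note=76, v_t=72, w_t_dev=1.0))     # ~80.2
```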

4 User Interface

4.1 GUI and Gestural Interface

Fig. 5 shows the standard GUI for characterizing the performance. The user is given sliders to set the weight parameters regarding tempo (template/user), dynamics (velocity: template/user), and deviation (delicate control within a beat: template).

Fig. 5. Standard GUI for setting parameters.

iFP also provides a morphing interface for two templates of a musical piece. The player can interpolate and extrapolate the performance using the morphing parameters. The player is also allowed to use MIDI instrument peripherals instead of the software sliders. If the radio button “keying” is selected, the system accepts the player's beat taps. If “auto” is selected, the system does not accept beat taps, and the expressive performance template is played without the control of the player’s beat taps (automatic mode). The gestural conducting sensor is based on capacitive transducers (see Fig. 6).

The beat signal is issued when the hand reaches its lowest position. The range of the hand movement is assigned to the dynamics for the next beat. When the hand stays lower than a certain threshold, the system holds the performance, i.e., gives a rest.


Fig. 6. Playing with the gestural conducting interface based on capacitive transducers.

4.2 Visualization

iFP provides real-time visualization of the performance trajectory in a three-dimensional space, as shown in Fig. 7. The axes are the tempo, the dynamics, and the summed variance of the expression deviations within a beat. The user can observe the trajectory from various viewpoints. If the player uses iFP in automatic (sequencer) mode, this visualization function shows the expressiveness of the performance template itself.

Fig. 7. Example of visualization of a performance. K.331 played by S. Bunin.

5 Evaluation

We conducted two experiments to verify the effectiveness of using expressive performance templates and gestural input devices. One was an evaluation of the players' introspection, and the other was a brain activity measurement using near-infrared spectroscopy (NIRS).


5.1 Introspection Evaluation

We focused on the “controllable” and “expressive” aspects for the introspection study. “Controllable” stood for the difficulty of playing the music; “expressive” stood for how well the player could express the music. For the experiment, we used a conducting interface and an expressive template of “When You Wish Upon A Star” for piano. The system parameters for this experiment were set by a music teacher who is also a conductor, so that the performance taste would be close to conducting, as shown in Fig. 5. We interviewed forty subjects, whose musical experience ranged from 0 to 33 years, asking them which performance (with or without the performance template) was more “controllable” and which was more “expressive”. We limited the practice time to 10 minutes in this experiment. The results are shown in Table 1.

Table 1. Introspection regarding expression template use. The value in each cell is the number of subjects who preferred the condition.

                                        Controllable:           Controllable:
                                        better with template    better without template   Sum
Expressive: better with template             13                       15                  28
Expressive: better without template           0                       12                  12
Sum                                          13                       27                  40

We investigated the responses of the 27 subjects who answered that controllability was better without the template, by changing the parameters affecting controllability. All of these subjects answered that controllability improved when both the adjustment level and the adjustment ratio were set to 100%. This means that a dis-coincidence between the player’s own taps and the heard beats makes the performance difficult. However, some of the musically experienced subjects commented that neither the adjustment level nor the adjustment ratio should be so high if one wants to gain expressiveness, and that the dis-coincidence plays an important role in expressiveness. It is interesting to consider this point in relation to what makes ensembles splendid.

Next, we investigated learning effects for five of the 15 subjects who answered that the expressive performance template contributes to expressiveness but not to controllability. Four of the five changed their opinion, preferring to use the template also for controllability, after learning. These results seem to indicate that the expressive performance template contributes to both expressiveness and controllability once one has learned how to play music using iFP.

5.2 Evaluation Using NIRS

Physiological measurements are good for verifying subjective introspection results. Brain activity is a most promising measure of what a subject is thinking and feeling. Recently, a relatively new technique, near-infrared spectroscopy (NIRS), has been used to measure changes in cerebral oxygenation in human subjects [7]. Changes in oxyhemoglobin (HbO) and deoxyhemoglobin (Hb) detected by NIRS reflect changes in neurophysiological activity and, as a result, may be used as an index of brain activity. It has been reported that the Fz area of the brain is deactivated (HbO decreases) when a subject is relaxed, in meditation, or immersed in playing games. We measured brain activity around the Fz area while subjects played with iFP or performed other musical tasks (control tasks).

Fig. 8 shows the results of experiments investigating the effect of using an expressive performance template and comparing input interfaces, for a subject who answered that the expressive performance template contributes to both expressiveness and controllability. She was educated in music and received her Master of Music degree from a music university. We can see the decrease in HbO when the subject played with the expressive performance template. The decrease was most salient with the expression template and the conducting interface. These results correspond very well to the subjects’ introspective reports regarding pleasantness. The rightmost data were obtained by chance; it is interesting to see the response of the subject when something unexpected happened.

[Fig. 8 appears here: HbO traces over 2-minute windows, with the task interval marked, for two conditions: “With expression template, interface: conducting sensor” (one trial marked “error”) and “With expression template, interface: keyboard”.]

Fig. 8. Brain activity measured with NIRS during performance of “When You Wish Upon A Star” using iFP. Arrows show the duration of the performance. A single emitting source fiber was positioned at Fz. The red line shows the sum of HbO.

Fig. 9. Brain activity measured with NIRS during listening to music and playing with iFP.

Fig. 9 is a comparison with other musical activities. HbO was lower when the subject listened to the music carefully while imagining the next musical progressions, and when she played with iFP (subject A). The decrease was more salient with iFP. The rightmost data were obtained when the subject was shaking her hands without playing with iFP. These results also correspond very well to the subjects’ introspective reports.

We also conducted experiments on the learning effect for a subject who answered that the expressive performance template contributes to expressiveness but not to controllability. This person is an experienced organist. In the first trial, there was no salient feature in the HbO while the subject used the conducting sensor, and the HbO fell while the subject used the keyboard. After the subject practiced with iFP for about one month, the behavior of the HbO while using the keyboard was the same as in the first trial; in contrast, the HbO increased while using the conducting sensor. The subject reported that it was difficult to control the conducting sensor. It seems that the subject came to be obliged to “think” about how to control the conducting sensor. We should thus continue our experiments to assess the learning effect in this regard.

Although the interpretation of the deactivation at Fz is itself still controversial [8], we can say that the subjects' evaluations of their own introspection when using iFP match the NIRS observations of brain activity.

In summary, the experimental results reported here suggest that 1) use of the expression template and 2) use of gestural input interfaces contribute to a player’s experience of pleasantness when performing a musical activity.

6 Conclusion

This paper introduced a performance interface called iFP for playing expressive music with a conducting interface. MIDI-formatted expressive performances played by pianists were analyzed and transformed into performance templates, in which the deviations from the printed notation values are described separately. Using the template as a skill complement, a player can play music expressively both above and below the beat level. The scheduler of iFP allows the player to mix her/his own intention with the expressiveness in the performance template. The results of a forty-subject user study suggested that using the expression template contributes to a player’s joy of expressing music. This conclusion is also supported by the results of the brain activity measurements.

We are just beginning our experiments using NIRS. We would like to trace the changes in the subjects' introspection and brain activity as they learn to play with iFP. We are also interested in investigating interactions between brain regions while subjects play music. We would like to propose a design guideline for music games based on further investigations. Another important task is to provide more data to be used with iFP. So far, a tool that produces a performance template from MIDI-formatted data has been completed. We would like to improve the tool so that it can convert acoustic music into an expressive performance template.

Acknowledgement. The authors would like to thank Mr. K. Noike, Ms. M. Hashida, and Mr. K. Toyoda for their contributions to the study. Prof. Hiroshi Hoshina and Mr. Yoshihiro Takeuchi made valuable and very helpful comments. This research was supported by PRESTO, JST, JAPAN.


References

1. Mathews, M.: The Conductor Program and Mechanical Baton. Proc. Intl. Computer Music Conf. (1989) 58-70

2. Morita, H., Hashimoto, S., Ohteru, S.: A Computer Music System that Follows a Human Conductor. IEEE Computer (1991) 44-53

3. Usa, S., Mochida, Y.: A Multi-modal Conducting Simulator. Proc. Intl. Computer Music Conf. (1998) 25-32

4. Nakra, T. M.: Synthesizing Expressive Music Through the Language of Conducting. J. of New Music Research 31, 1 (2002) 11-26

5. Katayose, H., Okudaira, K.: sfp/punin: Performance Rendering Interfaces using Expression Model. Proc. IJCAI-03 Workshop on Methods for Automatic Music Performance and their Applications in a Public Rendering Contest (2003) 11-16

6. Toyoda, K., Katayose, H., Noike, K.: Utility System for Constructing a Database of Performance Deviation. SIGMUS-51, IPSJ (2003, in Japanese) 65-70

7. Eda, H., Oda, I., Ito, Y., et al.: Multi-channel Time-resolved Optical Tomographic Imaging System. Rev. Sci. Instrum. 70 (1999) 3595-3602

8. Matsuda, G., Hiraki, K.: Frontal Deactivation in Video Game Players. Proc. Conf. of Intl. Simulation and Gaming Assoc. (ISAGA) (2003) 799-808