How can it be used in the classroom?
The interface becomes part of the teacher's lesson plan and replaces the traditional blackboard. Explanations and traditional teaching methods can be displayed simultaneously with the movement of the bow on the visual interface. The student can then internalize the process by trying it out, and additionally get help and feedback from the teacher.
Process, visualization and listening: while listening and watching, the student tries to imitate the motion (without the bow).
Exercise: the movement can be practiced alone or along with music. In the proposed interface, vertical lines will indicate the pulse.
Results: the student can record what he or she has done, then watch and listen to it.
All exercises are rated, and the ratings work cumulatively; at the end, the level achieved is represented by a color.
Violin Painting also offers the possibility of online exchanges between students. This is recommended for students at the same school, with the same teacher, or living in the same city, because motivation can be enhanced through competition.
The method is already developed, and it is now clear how it will be used in the classroom as well as in daily practice. The next step is to organize all the musical material we need. Which music are we going to use? What is the best way to pack all the information we want to work with into a short musical piece?
We need to make a recording in the next few days and then integrate it into the interface.
Based on previous observations and tests, the visualisation still needs more tuning, but it is working with some changes such as:
Due to lack of space on Vimeo, please click on the following links to watch the videos:
During our latest sessions, we aimed to fine-tune the visual response to the violinist's bow strokes so that they could be better represented on the interface.
Basically, this prototype has three lines: the original sample (green) and two follower lines (purple) that try to catch up with the sample, so you get a kind of "karaoke" interaction; properly defining the user experience, interface design and player interaction is the next step.
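A minimal sketch of how this catch-up behaviour could work (hypothetical code, not the prototype's actual implementation): each purple follower line eases toward the green sample trajectory by a fixed fraction per animation frame.

```python
# Hypothetical sketch: follower lines easing toward the reference trajectory.

def ease_towards(follower, sample, factor):
    """Move each point of the follower a fraction of the way to the sample."""
    return [f + factor * (s - f) for f, s in zip(follower, sample)]

sample = [0.0, 0.5, 1.0, 0.5, 0.0]                    # green reference bow trajectory
followers = [[0.0] * len(sample) for _ in range(2)]   # the two purple lines
factors = [0.2, 0.1]                                  # each follower catches up at its own rate

for frame in range(10):                               # simulate a few animation frames
    followers = [ease_towards(f, sample, k) for f, k in zip(followers, factors)]

print(followers)   # both followers have crept towards the green sample
```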
We faced different challenges:
1. Define the coordinate system: figure out how we should work with the coordinate system (we are visualising in 2D while tracking 3D movements).
2. Do the math: apply linear transformations to every value we want to work with (see the sketch after this list).
3. Define the origin and reference of the system: one size fits all! It should work whether the player is short or tall.
4. What works best, position information or speed-change information? Actually both, but speed changes seem to work best for easily drawing and fine-tuning the particles and their changes in direction and speed.
5. Are we translating the bow strokes into the interface adequately? We think we are getting there! We want to achieve a proper sense of the metaphor between a bow stroke and a brush stroke.
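As a rough illustration of points 2 and 3, here is a minimal sketch (the joint names, ranges and screen size are assumptions, not the project's actual values) that linearly maps a 3D Kinect hand position, expressed relative to the shoulder and normalised by the player's arm length, onto 2D screen coordinates:

```python
# Hypothetical sketch: map a 3D joint position onto 2D screen coordinates,
# normalising by the player's size so short and tall players draw comparably.

def linear_map(value, in_min, in_max, out_min, out_max):
    """Linearly transform a value from one range to another."""
    t = (value - in_min) / (in_max - in_min)
    return out_min + t * (out_max - out_min)

def to_screen(hand, shoulder, arm_length, width=1280, height=720):
    """Project the hand position, relative to the shoulder and scaled by
    arm length, onto screen coordinates (the depth axis is dropped here)."""
    dx = (hand[0] - shoulder[0]) / arm_length   # normalised horizontal offset
    dy = (hand[1] - shoulder[1]) / arm_length   # normalised vertical offset
    x = linear_map(dx, -1.0, 1.0, 0, width)
    y = linear_map(dy, -1.0, 1.0, height, 0)    # screen y grows downwards
    return x, y

# Example: hand and shoulder positions in metres, arm length 0.6 m
print(to_screen((0.3, 1.2, 2.0), (0.0, 1.4, 2.1), 0.6))
```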
[ylwm_vimeo]56986051[/ylwm_vimeo]
We have been trying out the different effects that can be used in the interface in relation to the movements of the right hand (to measure the violin bow hold). We base our work on the five different bow strokes from the last session. The movement of the right hand is not always properly reflected on the interface; after several tests, we realized we were not analyzing the depth factor. Our next attempts will focus on improving how well the movements are captured. Some videos will follow next time!
[ylwm_vimeo]56952054[/ylwm_vimeo]
We started to analyze the sound of the open violin strings: G, D, A and E. Since their frequencies are far apart from each other, we assumed they would be easier to pick up. Later we tried to analyze consecutive notes played a little faster. The result was positive: the program picks up the different frequencies without problems.
How does this process work?
We have a feature-extraction analyzer which captures sound through a standard laptop microphone (it would also be possible to use any other one). We focused on the "raw MIDI pitch", which basically tells us at which frequency the violin is vibrating. This MIDI data is translated into OSC (Open Sound Control) messages, which can later be used to control visual environments in order to produce sound visualizations.
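As an illustration of the pitch-to-OSC step, here is a minimal sketch using the python-osc library; the OSC address /violin/pitch and the port are made-up examples, not the project's actual settings.

```python
import math
from pythonosc.udp_client import SimpleUDPClient   # pip install python-osc

def frequency_to_midi(freq_hz):
    """Convert a frequency in Hz to a (possibly fractional) MIDI pitch number."""
    return 69 + 12 * math.log2(freq_hz / 440.0)

client = SimpleUDPClient("127.0.0.1", 9000)         # visual environment listening for OSC

freq = 440.0                                        # open A string of the violin
client.send_message("/violin/pitch", frequency_to_midi(freq))
```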
The same thing happens with the Kinect. Motion features such as the positions and speeds of the joints of both arms (hand, elbow and shoulder) are also translated into OSC messages. All this information can now be read and used by sound-synthesis or animation software such as Processing or Quartz Composer.
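A corresponding sketch for the motion side (again with hypothetical OSC addresses), where a joint's speed is simply approximated from its positions in two consecutive Kinect frames:

```python
from pythonosc.udp_client import SimpleUDPClient

def joint_speed(prev_pos, curr_pos, dt):
    """Approximate joint speed (m/s) from two consecutive 3D positions."""
    dx, dy, dz = (c - p for c, p in zip(curr_pos, prev_pos))
    return (dx * dx + dy * dy + dz * dz) ** 0.5 / dt

client = SimpleUDPClient("127.0.0.1", 9000)

prev = (0.30, 1.20, 2.00)   # right hand position at the previous frame (metres)
curr = (0.34, 1.22, 1.99)   # right hand position at the current frame

client.send_message("/violin/right_hand/position", list(curr))
client.send_message("/violin/right_hand/speed", joint_speed(prev, curr, dt=1 / 30))
```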
The next steps include translating the information from both movement and sound into visual representations.
[ylwm_vimeo]56952053[/ylwm_vimeo]
Our research has already started. We have tried two different programs for working with the Microsoft Kinect: Ethno Tekh's Ethno Tracker and FAAST. While Talía played the violin, we observed which movements and parts of the body each skeleton tracker detects. The elbow, wrist and hand were to be analysed. Having tested different bow strokes from different perspectives, we noticed that the Ethno Tracker picked up the signal better when the violin player was in front of the machine. Here are the results of the different bowing techniques:
The results were quite satisfactory, and we have already seen what it is possible to analyse with the Kinect. The next step is to try to train the machine to recognize the bow strokes by itself, although this step is not yet required for drawing the graphics in the interface.
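As a very rough idea of what such training could eventually look like (purely illustrative: the features, labels and classifier choice are assumptions, not part of the project), bow strokes summarised as small feature vectors could be fed to an off-the-shelf classifier:

```python
# Hypothetical sketch: classify bow strokes from simple motion features
# with a k-nearest-neighbour classifier (scikit-learn).
from sklearn.neighbors import KNeighborsClassifier

# Each row: [mean hand speed, peak hand speed, stroke duration] (made-up values)
features = [
    [0.4, 0.6, 0.50], [0.5, 0.7, 0.45],   # detache examples
    [0.9, 1.5, 0.15], [1.0, 1.6, 0.12],   # spiccato examples
]
labels = ["detache", "detache", "spiccato", "spiccato"]

model = KNeighborsClassifier(n_neighbors=1)
model.fit(features, labels)

print(model.predict([[0.95, 1.4, 0.14]]))   # -> ['spiccato']
```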
Let’s keep up the good work!
Some examples of what has been done so far in this field…
There are various references that work as mobile apps or as a sort of video game:
Rehearsal: An app for practicing musicians
All of the previous references were designed as mobile applications, and as stand-alone rather than collaborative apps.
The use of these apps in pedagogical work, and their efficacy, still have to be further researched and analysed.
Björk's Biophilia album app: a well-known application, very interactive and visually compelling, although focused on the artist's album.
In order to come up with an idea of how to start working out the relations between visuals and sound, we first had to gain a deeper understanding of sound, especially of the relations between the physical aspects of sound, digital sound, acoustics and musical sound.
Summary
A theoretical understanding of sine waves, harmonic tones, inharmonic complex tones, and noise is useful for understanding the nature of sound. However, most sounds are actually complicated combinations of these theoretical descriptions, changing from one instant to another. For example, a bowed string might include noise from the bow scraping against the string, variations in amplitude due to variations in bow pressure and speed, changes in the prominence of different frequencies due to bow position, changes in amplitude and in the fundamental frequency (and all its harmonics) due to vibrato movements in the left hand, etc.
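As a concrete illustration (an arbitrary toy example, not an attempt to model a real violin), a few lines of numpy are enough to combine some harmonics, a little noise for the bow scrape, and a slow vibrato on the fundamental:

```python
import numpy as np

sr = 44100                                   # sample rate in samples per second
t = np.arange(sr) / sr                       # one second of time values
f0 = 440.0                                   # fundamental frequency (open A string)

inst_freq = f0 * (1 + 0.005 * np.sin(2 * np.pi * 6 * t))        # ~6 Hz vibrato wobble
phase = 2 * np.pi * np.cumsum(inst_freq) / sr                   # integrate frequency to phase
tone = sum((1.0 / k) * np.sin(k * phase) for k in range(1, 6))  # first five harmonics
noise = 0.02 * np.random.randn(len(t))                          # "bow scrape" noise
signal = tone + noise
signal /= np.max(np.abs(signal))                                # normalise the amplitude
```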
Digital representation of sound (as explained in the Max/MSP help guide)
“To understand how a computer represents sound, consider how a film represents motion. A movie is made by taking still photos in rapid sequence at a constant rate, usually twenty-four frames per second. When the photos are displayed in sequence at that same rate, it fools us into thinking we are seeing continuous motion, even though we are actually seeing twenty-four discrete images per second. Digital recording of sound works on the same principle. We take many discrete samples of the sound wave’s instantaneous amplitude, store that information, then later reproduce those amplitudes at the same rate to create the illusion of a continuous wave.”
But we also needed to understand its limits and advantages:
Sound explained from its acoustic properties:
*Pitch: is a perceptual attribute of sounds, defined as the frequency of a sine wave that is matched to the target sound in a psychoacoustic experiment. If the matching cannot be accomplished consistently by human listeners, the sound does not have pitch.
*Fundamental frequency: is the corresponding physical term and is defined for periodic or nearly periodic sounds only. For these classes of sounds, fundamental frequency is defined as the inverse of the period. In ambiguous situations, the period corresponding to the perceived pitch is chosen (see the sketch after these definitions).
*Melody: is a series of single notes arranged in a musically meaningful succession.
*Chord: is a combination of three or more simultaneous notes. A chord can be consonant or dissonant, depending on how harmonious the pitch intervals between the component notes are.
*Harmony: refers to the part of musical art or science which deals with the formation and relations of chords.
*Harmonic analysis: deals with the structure of a piece of music with regard to the chords of which it consists.
*Musical meter: this term has to do with the rhythmic aspects of music. It refers to the regular pattern of strong and weak beats in a piece of music. Perceiving the meter can be characterized as a process of detecting moments of musical stress in an acoustic signal and filtering them so that the underlying periodicities are discovered. The perceived periodicities (pulses) at different time scales together constitute the meter. Meter estimation at a certain time scale takes place, for example, when a person taps a foot to music.
*Timbre, or sound colour, is a perceptual attribute which is closely related to the recognition of sound sources and answers the question "what something sounds like". Timbre is not explained by any simple acoustic property, and the concept is therefore traditionally defined by exclusion: "timbre is the quality of a sound by which a listener can tell that two sounds of the same loudness and pitch are dissimilar".
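To connect the "inverse of the period" definition above back to the pitch feature extraction described earlier, here is a minimal sketch (not the analyzer actually used in the project) that estimates the period of an audio frame by autocorrelation and inverts it to obtain the fundamental frequency:

```python
import numpy as np

def estimate_f0(frame, sr, fmin=180.0, fmax=700.0):
    """Rough fundamental-frequency estimate via autocorrelation.
    The default search range roughly covers the violin's open strings (G3 to E5)."""
    corr = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lo, hi = int(sr / fmax), int(sr / fmin)     # lag range to search, in samples
    period = lo + np.argmax(corr[lo:hi])        # lag with the strongest self-similarity
    return sr / period                          # frequency is the inverse of the period

sr = 44100
frame = np.sin(2 * np.pi * 440.0 * np.arange(2048) / sr)   # synthetic open A string
print(estimate_f0(frame, sr))                               # ~440 Hz
```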
The four aspects of musical sound:
After all, music is, quite simply, sound. When we arrange these characteristics in such a way that we find the result "pleasing" to listen to, we call it music, although the term "pleasant" is closely tied to the subjectivity of perception.
(1) Pitch – the highness or lowness of sound
(2) Duration – the length of time a musical sound continues
(3) Intensity – the loudness or softness of a musical sound
(4) Timbre/Tone color – the distinctive tonal quality of the instrument producing the sound.
Explained in more depth here