Commit 750b31b

style fixes
1 parent 212dd78


README.md

Lines changed: 4 additions & 2 deletions
@@ -3,15 +3,17 @@ Authors: Teresa Pelinski [@pelinski](https://github.com/pelinski) and Gonzalo Ni

Wekitunenator 3000 is our final project for the Advanced Interface Design course of the Sound and Music Computing Master at the Music Technology Group, Universitat Pompeu Fabra (Barcelona).

-[![Wekitunenator 3000](http://img.youtube.com/vi/M6bAb42s-lo/0.jpg)](http://www.youtube.com/watch?v=M6bAb42s-lo "Wekitunenator 3000")
+<div style="display:block;text-align:center">
+<a href="http://www.youtube.com/watch?v=M6bAb42s-lo"><img src="http://img.youtube.com/vi/M6bAb42s-lo/0.jpg"/></a>
+</div>

Wekitunenator is an instrument that applies sound effects in real time to the user's voice. The applied sound effects are selected with a MIDI controller, and the parameters regulating them are modified with the user's hand movements. Machine learning is used to map hand movements to effect parameter values. However, this instrument tries to move away from the conventional notion of machine learning (ML) as a control paradigm and intends to use ML as a discovery tool instead. This was inspired by Sonami and Fiebrink's paper [Reflections on Eight Years of Instrument Creation with Machine Learning](https://www.nime.org/proceedings/2020/nime2020_paper45.pdf), presented at NIME 2020.

For modeling the mappings between hand positions and effects parameters, Wekinator is used. The model can be trained directly from the MIDI controller by selecting the effects (with buttons) and the parameter values (with knobs) and hitting the `REC` button. The `REC`, `TRAIN`, and `RUN` buttons are controlled from the MIDI interface (if no MIDI controller is available, the buttons in the on-screen interface can also be operated with the mouse). This way, the user can dynamically train and use the model without having to use the keyboard or mouse. When running the model on unseen combinations of effects, the mappings, and hence the resulting sounds, will be unpredictable. A rough sketch of this workflow follows.
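
The project's own `handleOSC` code is not shown in this diff, but as a minimal sketch of the idea: a Processing sketch can translate MIDI button presses into Wekinator's documented OSC control messages (`/wekinator/control/startRecording`, `/wekinator/control/train`, `/wekinator/control/startRunning`). The CC numbers (20–22), device index, and local port below are assumptions for illustration; 6448 is Wekinator's default OSC listening port.

```processing
// Illustrative only: not the project's actual handleOSC code.
// Assumed mapping: CC 20 = REC, CC 21 = TRAIN, CC 22 = RUN.
import themidibus.*;
import oscP5.*;
import netP5.*;

MidiBus midi;
OscP5 osc;
NetAddress wekinator;

void setup() {
  midi = new MidiBus(this, 0, -1);                 // first MIDI input, no MIDI output
  osc = new OscP5(this, 12000);                    // local receive port (arbitrary)
  wekinator = new NetAddress("127.0.0.1", 6448);   // Wekinator's default input port
}

// Called by The MidiBus whenever a control-change message arrives.
void controllerChange(int channel, int number, int value) {
  if (value == 0) return;                          // ignore button release
  if (number == 20) osc.send(new OscMessage("/wekinator/control/startRecording"), wekinator);
  if (number == 21) osc.send(new OscMessage("/wekinator/control/train"), wekinator);
  if (number == 22) osc.send(new OscMessage("/wekinator/control/startRunning"), wekinator);
}

void draw() {}
```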

The hand positions are tracked by a lightweight version of [@faaip](https://github.com/faaip/)'s [handPose-OSC](https://github.com/faaip/HandPose-OSC). The code can be found [here](https://github.com/gonski/HandPose-OSC). The communication between this model, the MIDI controller, Wekinator, and the Digital Audio Workstation (Reaper) is handled by the Processing sketch `handleOSC`.
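
As a skeleton of what such an OSC relay does (again, not the repository's actual `handleOSC`): incoming hand keypoints are repacked as Wekinator model inputs on `/wek/inputs`, and the model's `/wek/outputs` come back as effect-parameter values. The incoming address `/handpose` and port 8338 are assumptions, as is the assumption that Wekinator is configured to send its outputs back to that same port.

```processing
// Illustrative only: a skeleton of an OSC relay between HandPose-OSC,
// Wekinator, and the DAW. Addresses and ports marked below are assumptions.
import oscP5.*;
import netP5.*;

OscP5 osc;
NetAddress wekinator;

void setup() {
  osc = new OscP5(this, 8338);                     // assumed incoming OSC port
  wekinator = new NetAddress("127.0.0.1", 6448);   // Wekinator's default input port
}

void oscEvent(OscMessage m) {
  if (m.checkAddrPattern("/handpose")) {           // assumed keypoint address
    // Repack the hand keypoints as Wekinator model inputs.
    OscMessage inputs = new OscMessage("/wek/inputs");
    for (int i = 0; i < m.typetag().length(); i++) {
      inputs.add(m.get(i).floatValue());
    }
    osc.send(inputs, wekinator);
  } else if (m.checkAddrPattern("/wek/outputs")) {
    // Model outputs: one float per effect parameter. Forwarding them on
    // to Reaper (e.g. as MIDI CC) is omitted here.
    println("first effect parameter: " + m.get(0).floatValue());
  }
}

void draw() {}
```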
## Pipeline
-<div style="text-align:center"><img src="./.bin/pipeline.jpeg"/></div>
+<div style="display:block;text-align:center"><img src="./.bin/pipeline.jpeg"/></div>

## Setup
Tested on macOS Catalina (10.15.7).
