# Wekitunenator 3000
Authors: Teresa Pelinski ([@pelinski](https://github.com/pelinski)) and Gonzalo Nieto ([@gonznm](https://github.com/gonznm)).
Wekitunenator 3000 is our final project for the Advanced Interface Design course of the Sound and Music Computing Master at the Music Technology Group, Universitat Pompeu Fabra (Barcelona).
Wekitunenator is an instrument that applies sound effects in real time to the user's voice. The applied sound effects are selected with a MIDI controller, and the parameters regulating them are modified with the user's hand movements. Machine learning (ML) is used to map hand movements to sound effect values. However, this instrument tries to move away from the conventional notion of ML as a control paradigm and intends to use ML as a discovery tool. This was inspired by Sonami and Fiebrink's paper [Reflections on Eight Years of Instrument Creation with Machine Learning](https://www.nime.org/proceedings/2020/nime2020_paper45.pdf), presented at NIME 2020.
For modeling the mappings between hand positions and effects parameters, Wekinator is used. The model can be trained directly from the MIDI controller by selecting the effects (with buttons) and the parameter values (with knobs) and hitting the `REC` button. The `REC`, `TRAIN` and `RUN` buttons can also be triggered from the MIDI controller (if no MIDI controller is available, the buttons in the on-screen interface can be clicked with the mouse). This way, the user can dynamically train and use the model without touching the keyboard or mouse. When running the model on unseen combinations of effects, the mappings, and hence the resulting sounds, will be unpredictable.
<p style="text-align: center;">(icons created by <a href="https://thenounproject.com/eucalyp/">Eucalyp</a> under CC BY license)</p>
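As a sketch of how the `REC`, `TRAIN` and `RUN` buttons drive Wekinator, the snippet below builds the OSC control addresses that Wekinator listens for on its input port (6448 by default). This only formats the messages — in Wekitunenator the actual OSC sending is done by the Processing sketch, and the button names here are illustrative:

```python
# Map interface buttons to the OSC control addresses Wekinator
# listens for on its input port (6448 by default). Button names
# on the left are placeholders for this sketch.
WEKINATOR_CONTROLS = {
    "REC_ON": "/wekinator/control/startRecording",
    "REC_OFF": "/wekinator/control/stopRecording",
    "TRAIN": "/wekinator/control/train",
    "RUN_ON": "/wekinator/control/startRunning",
    "RUN_OFF": "/wekinator/control/stopRunning",
}

def control_message(button: str) -> str:
    """Return the OSC address for a button press; raise on unknown buttons."""
    try:
        return WEKINATOR_CONTROLS[button]
    except KeyError:
        raise ValueError(f"unknown button: {button}")
```

Sending, for instance, `/wekinator/control/train` while examples are recorded is what lets the whole train/run loop happen from the controller alone.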
Watch a short demo and explanation [here](https://youtu.be/M6bAb42s-lo).
The hand positions are tracked by a lightweight version of [@faaip](https://github.com/faaip/)'s [handPose-OSC](https://github.com/faaip/HandPose-OSC). The code can be found [here](https://github.com/gonski/HandPose-OSC). The communication between this model, the MIDI controller, Wekinator, and the Digital Audio Workstation (Reaper) is handled by the Processing sketch `handlerOSC`.
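Hand-tracking models of this kind emit a set of 2D landmark points per frame, while Wekinator expects a flat list of floats on its `/wek/inputs` address. A minimal sketch of that flattening step (the 21-landmark count follows the HandPose model; normalizing by frame size is an assumption, not necessarily what `handlerOSC` does):

```python
def pack_landmarks(landmarks, width, height):
    """Flatten (x, y) hand landmarks into a normalized float list,
    suitable as the payload of a Wekinator /wek/inputs message."""
    inputs = []
    for x, y in landmarks:
        inputs.append(x / width)   # normalize x to 0..1
        inputs.append(y / height)  # normalize y to 0..1
    return inputs

# HandPose-style models typically report 21 landmarks per hand;
# these dummy points stand in for a tracked frame.
points = [(i * 10.0, i * 5.0) for i in range(21)]
vec = pack_landmarks(points, 640.0, 480.0)
```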
This application will start processing information from your camera in order to extract the points of your hand. Now you can open the OSC handler.
### OSC handler
Run `handlerOSC.app`. If HandPose-OSC is also running, you will be able to see the points of your hand in green over a black background.
Select your MIDI input device and press `UPDATE`. The FX switches are controlled by MIDI pitch indexes 0, 1, 2, 3 and 4. The `REC`, `TRAIN` and `RUN` buttons can be triggered with MIDI pitch indexes 5, 6 and 7. If you don't have a MIDI controller, you can also control Wekitunenator by clicking on the dark blue buttons. If you want to modify the effects parameters during training using the knobs on your controller, you will probably need to change the values of `knobCCs` in `handlerOSC.pde` and assign them to the corresponding CC numbers sent by your controller.
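MIDI knobs send Control Change values in the range 0–127, while effect parameters are typically normalized to 0.0–1.0. A sketch of the kind of remapping involved (the CC numbers in `knobCCs` below are placeholders, standing in for the array of the same name in `handlerOSC.pde` — substitute whatever CCs your controller actually sends):

```python
# Placeholder CC numbers: replace with the CCs your controller sends,
# mirroring the knobCCs values in handlerOSC.pde.
knobCCs = [21, 22, 23, 24]

def knob_to_param(cc, value, cc_list=knobCCs):
    """Map a MIDI Control Change message to (knob_index, normalized_value).
    Returns None if the CC is not one of the configured knobs."""
    if cc not in cc_list:
        return None
    return cc_list.index(cc), value / 127.0
```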
### Wekinator
You can download Wekinator [here](http://www.wekinator.org/downloads/). Once downloaded, open the program and select `File>Open project...`.
You will be able to see that this Wekinator project receives 27 inputs.
### Reaper
You can download Reaper [here](https://www.reaper.fm/download.php). Once installed, open `wekitunenator/Reaper/liveInput_FXs.RPP`. You may need to modify your audio device input/output configuration in `Preferences>Audio>Device`.
#### Install FXs
Download and install the free edition of [Graillon 2](https://www.auburnsounds.com/products/Graillon.html), a live voice changer, in order to have auto-tune.
Load user FX presets by dragging and dropping the *presets.ReaperConfigZip* file from Explorer or Finder into REAPER's arrange window.
#### Buffer size
In order to obtain good real-time performance, you should modify the default buffer size in `REAPER>Preferences>Audio>Device`. Check the `Request block size` box and set it to 64. Depending on your computer, you may need a larger block size if you hear clicks.
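As a rough guide to why 64 samples works well, the latency a buffer adds is simply its size divided by the sample rate. A quick check (the 44.1 kHz sample rate here is an assumption — use your project's actual rate):

```python
def buffer_latency_ms(block_size, sample_rate):
    """Latency in milliseconds introduced by one audio buffer."""
    return 1000.0 * block_size / sample_rate

# 64 samples at 44.1 kHz is under 1.5 ms per buffer; larger blocks
# trade added latency for fewer audible clicks (dropouts).
latency = buffer_latency_ms(64, 44100)
```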
* In the `Control Surface Settings` window, in the `Control surface mode` dropdown menu, select `OSC (Open Sound Control)`. In the `Pattern config` menu, select `Processing`. In `Mode`, select `Local port`. Then set `Local listen port` to `6449` and tick the option `Allow binding messages to REAPER actions and FX learn`. You won't need to modify the local IP. Your configuration should look like this (the local IP might differ).