MIDI routings and layers/splits

I am currently a brainspawn forte user, and I am very interested in your product as a replacement for my current setup.
Is it possible, for example, to connect a few instances of some VSTis (like Kontakt, B4, etc.) and switch the MIDI input to them on the fly?
Which configuration would you recommend? Of course I do not want to re-load samples in the middle of a song, so it would be nice to change only the MIDI routing and the final mixer.

Hi Luigi,

I think what you want to do would work perfectly. MIDI input routings are not dynamic, but there are several mechanisms to cope with that:

1) MIDI keyboard focus (part of the control focus feature)
2) Scene snapshots
3) Live-sets

With (1) you just need to tell RP in the config which MIDI channel your MIDI keyboard is connected to. Once it is set up, the module that has control focus (green frame around its GUI) will receive keyboard input automatically. You can change the control focus by defining keyboard mappings: display the context menu for the module, select "quick keyboard map...", then press a key. Afterwards, pressing that key will set the control focus to that module and you can start playing it right away.

If you don't want to use your computer keyboard (or the mouse), you can use (2). Just configure your N instruments to receive MIDI from your keyboard's channel(s) (in the context menu select "receive MIDI" and the correct channel). At this point they will all play in unison when you hit your keys, which is obviously not what you want. You then need to mute all instruments except one and save a scene snapshot, then un-mute another instrument, mute the others, and save a second scene snapshot, and so on. This way, whenever you recall a scene snapshot, only a single instrument will be active and will play (the others, being muted, will not receive the MIDI note-on messages).
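To make the mechanism concrete, here is a minimal Python sketch of the scene-snapshot idea (the instrument and snapshot names are just examples, not anything from the product):

```python
# Illustrative sketch (not the product's actual data model): each scene
# snapshot stores a mute flag per instrument, and recalling a snapshot
# determines which instrument receives MIDI note-on messages.

snapshots = {
    "piano solo": {"piano": False, "b4": True, "kontakt": True},
    "organ solo": {"piano": True, "b4": False, "kontakt": True},
}

def active_instruments(snapshot_name):
    """Return the instruments that will actually play for a snapshot."""
    mutes = snapshots[snapshot_name]
    return [name for name, muted in mutes.items() if not muted]

print(active_instruments("organ solo"))  # only the B4 plays
```

All instruments stay loaded (no sample re-loading); only the mute flags change when a snapshot is recalled.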

You can then switch scene snapshots on the fly using your MIDI controller or your computer keyboard (e.g. create a quick keyboard mapping by right-clicking on the scene snapshots floating window and selecting "quick keyboard map"... then, once the control port is selected, you can change snapshots using your numeric pad numbers and/or the plus and minus keys).

If you can afford a few seconds of interruption between your songs, then with (3) you can load a completely different document (including patch, samples, MIDI mappings, etc.). This way each document can have a different instrument mapped to your MIDI input.

Alright, I think this should cover your needs one way or another.
Live Factory Team
Thanks for the reply. What would you suggest for layering/MIDI splitting? Say that in a single song I have:
- intro: piano+strings
- verse: pads + some weird effect split at the top
- rit: some other sound layered with the piano, and some heavy bass in the low keys
and so on. How could I accomplish this?

Many thanks for your time
It should be possible, but maybe not with the flexibility you would like. As with the MIDI routings, the built-in MIDI effects are not dynamic (i.e. you configure them once for the document/song).

You can create splits/layers/merges (and, since version 1.1, also velocity layers/splits/compression/expansion) and it will work as expected. But it becomes problematic if you re-use one instrument (e.g. the piano in your example) and want it configured differently for different parts of your song.

We could improve that aspect in the future if there is enough demand. But as it is right now, you would have to do the following:

- For the piano: set its "notes range" MIDI effect (context menu > MIDI effects > notes range) so that it only plays the mid to high octaves
- For the strings: do the same as for the piano
- For the heavy bass: set its "notes range" to play only the low octaves (note that you can then remap the received notes to another octave using the "notes transpose" MIDI effect, so that, for instance, instead of playing octaves 1 to 3 it would play 4 to 6, even though technically you'll play it with the 1-to-3 keys)
- For the verse part: do the same to split between the pads and the weird patch
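As an illustration of how such a "notes range" + "notes transpose" chain behaves for the bass, here is a rough Python sketch (the function names and note numbers are hypothetical, not the product's API, and MIDI octave numbering varies by convention):

```python
# Hypothetical sketch of a "notes range" filter followed by a
# "notes transpose" shift, as used for the heavy bass split.

def notes_range(note, low, high):
    """Pass the note only if it falls inside the split range."""
    return note if low <= note <= high else None

def notes_transpose(note, semitones):
    """Shift a surviving note by a number of semitones (12 per octave)."""
    return None if note is None else note + semitones

def bass_chain(note):
    # Accept only low keys (MIDI notes 24-59, roughly octaves 1-3),
    # then shift them up three octaves so the bass patch plays 4-6.
    return notes_transpose(notes_range(note, 24, 59), +36)

print(bass_chain(36))  # a low key sounds three octaves higher
print(bass_chain(72))  # a high key is filtered out (None)
```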

Then create an "intro" scene snapshot where all instruments are muted except the piano and strings. Another snapshot for "verse" with all muted except the pads and the weird patch. And so on...

Note that if the various sounds are just programs of the same instruments/synths, the scene snapshots can also recall presets and programs properly. But as I said, re-using instruments can be problematic because the MIDI effects setups are static.

That being said, instruments generally use very little CPU when they are not receiving notes and are not playing. So you could load many instruments side by side and use each of them to play just a single patch/part, on demand (as soon as they are muted they don't receive any MIDI notes).
Live Factory Team
Well, muting instruments is not an option for me, since I do not want to cut the tails of the sounds.
I was aiming to use a combination of some VST MIDI plug-ins, like

http://www.fsynthz.com/F_Prog8.htm (I really like the cross-channel overview)
http://kx77free.free.fr/English-page-vst.html (kx-midi-filter)

to build a patchbay like this one:
http://img.informer.com/screenshots/56/56051_2.png (<< see the right midi panel)

With this hypothetical setup, I would have:
- MIDI in from the master keyboard, always firing on ch1
- 1 big "MIDI router/splitter/transposer" that changes its setup at every sound change (1 config for the verse, 1 for the rit, 1 for the intro, etc.)
This would act as a central point of organization for the patches
- N VSTis that are never muted and never change their presets. They simply play when the corresponding MIDI in fires
- a final mixer that, again, changes its setup at every sound change (because maybe the "piano" sound should be louder in the intro than in the verse)
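To make the idea concrete, the router I have in mind could be sketched like this (section and instrument names are made up):

```python
# Hypothetical central MIDI router: one routing table per song
# section, swapped at every sound change (all names illustrative).

routings = {
    "intro": {1: ["piano", "strings"]},
    "verse": {1: ["pads", "weird_fx"]},
    "rit":   {1: ["piano", "other_layer", "heavy_bass"]},
}

def route(section, channel):
    """Return the instruments fed by a MIDI channel in this section."""
    return routings[section].get(channel, [])

print(route("verse", 1))  # the master keyboard (ch1) feeds pads + fx
```

The VSTis themselves never change; only this table does, so the sound tails keep ringing through the sound change.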

What do you think? Is that feasible? Do you already have such a MIDI processor in your product? If not, may I suggest one :P
This is actually not a bad idea at all. We could provide such a live programmable "MIDI router/hub" module relatively easily.

Meanwhile, you can use the VST MIDI effects you listed. You might have to create several instances of them and hook them all up to your single MIDI input. Each instance would then send MIDI to one of your instruments. You would use scene snapshots to recall presets of these MIDI effects, muting and unmuting them on demand, and in this case you would still get the tails of your sounds.

Another solution we are considering is to introduce "control focus groups", in the same vein as the current monitor and record groups. You would define "slots" containing one or more control focus targets, and you would be able to switch them on the fly using a mapping of your choice (MIDI or keyboard). This way, using the MIDI keyboard control focus, you could automatically redirect your master keyboard to one or many targets, and again you wouldn't lose your tails.

Either or both of these solutions could be implemented in the next release, but in the meantime you can try using 3rd-party VST MIDI effects.
Live Factory Team
I have a couple of upcoming gigs, so I do not have time for these tests right now :( The UI idea is very nice, especially if coupled with a touchscreen monitor to make a programmable controller.
I will send you a richer review once I have stress-tested it a bit.

I have spent about one hour with the demo: I admit that it is not much, but the user interface is great and kills every other modular tool I have tried (like Bidule, AudioMulch, etc.). Really nice and effective as a live environment!

A big showstopper for me is the lack of explicit MIDI connections, i.e. all the MIDI routing is done via context menus rather than via connections (like the "white" wires of Bidule).
My main reason to change hosts and move into the modular world is to have 100% control over my MIDI routings, so sadly the lack of this feature is a big issue for me :(
Really, it shouldn't be a showstopper at all, as the functionality is the same: you still have 100% control. The decision we made (on purpose) is well motivated and will allow us to implement some nice features later, including the dynamic MIDI router module you suggested.

In some prototypes we had MIDI routing done in "Patch" mode, but we were not satisfied with the result. Not only did it make the patch more confusing (to the point we considered having an "audio" vs "MIDI" patching mode), it was also not user-friendly and not good-looking. With a large patch you had to drag MIDI cables across a large distance to reach the top of the patch where the MIDI input pins were materialized. It was impractical and visually poor, as all cables converged on the same location, with no relation to the module layout, which is dictated mostly by the audio links.

We want to keep the patching experience as simple as possible because it is very daunting for some users, which is why we decided to have it handle audio only; everything control-related (including MIDI) happens in "Edit" mode.

The functionality is exactly the same, as you can have 1:1 and 1:N MIDI routings (many modules can receive from the same input). For MIDI control mapping you have the quick MIDI map feature, which happens in "Edit" mode (because you start it from the controls on the module GUIs). MIDI output is done using the special "External MIDI gear" module, as it makes more sense this way in our architecture.

As it is, those MIDI routing links are static, but the fact that they are defined in "Edit" mode will allow us to make them dynamic in the future. Currently all "controller" class internal modules use this dynamic mechanism (check the "LFO" controller module, for instance). This means that presets and scene snapshots can recall different targets, so the same sequencer or modulation source can control target A for preset A, then target B for preset B.
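As a rough sketch of that dynamic-target mechanism (the names here are illustrative, not our internal API): the preset stores the target itself, so recalling a different preset re-points the same modulation source.

```python
# Illustrative only: a controller module whose target is part of the
# preset, so the same LFO can drive different parameters per preset.

presets = {
    "A": {"target": "filter.cutoff", "rate_hz": 0.5},
    "B": {"target": "synth.volume",  "rate_hz": 2.0},
}

class LFO:
    def recall(self, preset_name):
        """Load a preset, including which parameter the LFO controls."""
        self.target = presets[preset_name]["target"]
        self.rate_hz = presets[preset_name]["rate_hz"]

lfo = LFO()
lfo.recall("A")
print(lfo.target)   # filter.cutoff
lfo.recall("B")
print(lfo.target)   # synth.volume
```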

This is very useful because you only need to map your controller-class modules once to your hardware controllers, and you can re-use them throughout your performance. This is the exact same mechanism that would be used if we implement the MIDI hub/router module you suggested.

I hope this was clear enough, but really you are not losing any bit of control; in fact, you are gaining more possibilities.
Live Factory Team
Clearly from your answer it seems that you are targeting people who will have complex audio paths but relatively "simple" MIDI ones, since you cannot remember all the settings/connections you made via the context menus.

My use case is probably an odd one, but it is exactly the opposite :) As a VSTi player, my audio path will be as simple as VSTis -> mixer -> final VST effects -> audio card output, while in the MIDI area I am planning to add complex stuff to help me play complex studio-like parts live (like pressing a footswitch and triggering a bunch of automations, arpeggiators, held notes with reset after a timeout, etc.). In this respect, I really like the idea of having two views, a MIDI patch and an audio patch, so that the user chooses what he wants :)
Well, the idea is that with such a dynamic MIDI router you wouldn't have to remember connections and channel numbers; you would just create named presets or scene snapshots for that. Thank you for your input; we will see what we can do to support your use case, which is indeed quite advanced.

By the way, you may already be able to achieve what you want with the current version by using 3rd-party VST MIDI plug-ins.
Live Factory Team
