There might be a roundabout way to do this, but at present all I can do is a bit of guessing . . . I think that NOTION 3 can send MIDI commands in real-time via its MIDI Out functionality, which supports four MIDI ports or devices (A, B, C, D) and some number of channels per port or device . . .
In NOTION3 you can configure up to four external devices that can provide instrument sounds in response to MIDI commands. You inform NOTION3 about these devices in Preferences and assign each device a letter, A through D, as a shorthand description.
The NOTION 3 User Manual information on this is a bit sparse, as is the information about using channels, but depending on the way you read the documentation, it appears to be possible . . .
If you like, you can have NOTION3 read a score and translate notation and some score markings for one or more parts into MIDI signals. The external equipment then reads these signals to produce certain instrument sounds, instead of NOTION3. Since the sound is generated at the external device and not at your computer, many studio setups include an external mixer, digital audio workstation, or other audio equipment that can “sum” together the sounds from NOTION3 with the sounds from the MIDI device.
Similarly, the Kontakt 5 Application Reference specifically mentions that what it calls an "Instrument Bank" can be configured to interact with a specific MIDI port, channel, or whatever . . .
Banks allow you to combine up to 128 Instruments into a container that responds to a single MIDI input channel; you can then switch the active Instrument by sending MIDI program change messages on this channel. This allows you to create General MIDI-compatible sound sets, or combine Instruments that contain various articulations of the same acoustic instrument into one slot. A typical example of an Instrument Bank would be a number of violin Instruments that contain legato, detaché, staccato, and pizzicato Samples, respectively, with the different articulations and playing techniques being switchable via program change messages.
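As an illustration of the mechanism the Kontakt documentation is describing (this sketch is mine, not anything from the manuals), switching the active Instrument in a Bank comes down to a two-byte MIDI program change message: a status byte of 0xC0 plus the channel number, followed by a data byte selecting one of the 128 slots:

```python
def program_change(channel, program):
    """Build the raw bytes of a MIDI program change message.

    channel: 0-15 (MIDI channels 1-16), program: 0-127 (bank slot).
    """
    if not 0 <= channel <= 15:
        raise ValueError("channel must be 0-15")
    if not 0 <= program <= 127:
        raise ValueError("program must be 0-127")
    # Status byte is 0xC0 | channel, followed by a single data byte.
    return bytes([0xC0 | channel, program])

# Selecting slot 3 (say, a pizzicato Instrument) on MIDI channel 1:
print(program_change(0, 3).hex())  # c003
```

Whatever sends the message (NOTION 3, a DAW, or a test utility) just needs to emit those two bytes on the channel the Bank is listening to.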
THOUGHTS

One of the things I always do is look for what one might call "unusual" ways to do stuff, which from the perspective of Computer Science involves discovering things that software will do but that are not so obvious. Sometimes these things are made difficult by design, typically by the folks who design and program operating systems. For example, Mac OS X and Windows generally make it very difficult to have a bit of FUN with self-modifying programs, but there is a way to do it in Windows, at least through Windows XP, where you can create VBScript scripts on the fly in real-time and then run them. Another way is to create your own interpretive programming language in Visual Basic. But even though there are ways to do it, the operating system folks tend to make it difficult, toward the goal of discouraging folks from doing it, since once you discover how to cause a computer to become intelligent, everything gets a bit surreal . . .
[NOTE: I like to design computer programs that create computer programs and then run them, but when an operating system allows this to be done easily, it can be difficult for the operating system to maintain control, since programmers are not always so skilled in controlling recursion, hence the primary reason it usually is discouraged vigorously by design. It was difficult to do in Windows until VBScript appeared, although so long as you could jump into MS-DOS (command line mode) for a while, it was not so difficult, especially when combined with hooks, really . . . ]
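For what it is worth, the VBScript trick described above has an analogue in most scripting languages. Here is a minimal Python sketch of a program that creates a bit of program text at runtime and then runs it; the function name and numbers are purely for illustration:

```python
# A program that creates a small program as text and then runs it,
# in the spirit of generating a VBScript file on the fly and executing it.
generated_source = (
    "def triple(x):\n"
    "    return 3 * x\n"
)

# Compile and execute the generated text, collecting its definitions
# into a dictionary rather than polluting the caller's namespace.
namespace = {}
exec(compile(generated_source, "<generated>", "exec"), namespace)

print(namespace["triple"](14))  # 42
```

This is exactly the kind of thing operating-system and language designers tend to fence off in more locked-down environments, for the control reasons noted above.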
So, depending on the way one reads the bits of information I quoted from the NOTION 3 User Manual and the Kontakt 5 Application Reference (see above), I do not exclude a few possibilities, and I am intrigued by some of them, which might not be so difficult to test once I make a bit more sense of MIDI . . .
On the other hand, I am not convinced that articulations actually need to be specific for particular instruments, since this makes no intuitive sense if you think about it for a while . . .
Obviously, doing staccato on a violin is different in terms of playing technique from doing staccato on a trumpet, but I think the music notation is the same . . .
If you look at the way Miroslav Philharmonik (IK Multimedia) works, each instance can have as many as 16 "instruments", each of which is assigned a "part" and a "channel". Without creating any custom rules, if you watch it responding to articulations in the music notation, you will observe that, depending on the specific articulation, a particular "instrument" does the audio generation for the specific note(s) . . .
I have not done a lot with Kontakt 5, but this appears to be similar to the way an Instrument Bank works, so one possibility is not to use any specific Kontakt 5 custom rules, but instead to determine what NOTION 3 sends to Kontakt 5 for different articulations; you then can create an Instrument Bank that has the correct set of articulations to respond appropriately to whatever NOTION 3 sends, although perhaps not . . .
The other way, which I think is more complex, is to use MIDI Out to control Kontakt 5 directly when it is running in standalone mode, which might work, although perhaps not . . .
On the Mac, which is where I do everything, if NOTION 3 can control Kontakt 5 via MIDI Out ports, channels, devices, or whatever, then the only problem I envision at the moment is getting the Kontakt 5 generated audio into a Digital Audio Workstation (DAW) application (for example, Digital Performer 7.24 [MOTU]). The reason is that since NOTION 3 is not generating the audio, the audio needs to come from Kontakt 5 and be sent to the DAW application. This probably is already handled for the instruments for which NOTION 3 is generating audio, since for those instruments the NOTION 3 generated audio can be sent to the DAW application via ReWire . . .
One of the problems is that Kontakt 5 does not do ReWire, but Kontakt 5 can operate as a VSTi virtual instrument in a DAW application, provided the DAW application supports VSTi virtual instruments. In that case, the problem is a matter of determining whether NOTION 3 can send commands via MIDI Out to Kontakt 5 when Kontakt 5 is functioning as a VSTi virtual instrument inside a DAW application, although I suppose it might be possible for the DAW application itself to receive the MIDI Out commands from NOTION 3. At this point the entire thing is a bit mind-boggling, and until I have time to do a few experiments, I simply do not know . . .
Regarding custom rules, this line from the NOTION 3 sample custom rules file is intriguing:
<plugin id=""> <!-- Enter PluginID here or use <midi-out id="X"> where X = A,B,C or D for MIDI out port rules-->
And this might be a useful clue: instead of focusing the Kontakt 5 custom rules on the plug-in ID, focus them on MIDI Out ports . . .
If you read it literally without doing a lot of thinking, it tends to suggest that you can make articulations specific to a MIDI Out port (A, B, C, D) . . .
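If that reading is correct, then something like the sketch below is what I would try first. Note that this is purely hypothetical on my part: only the <midi-out id="X"> element itself comes from the comment in the sample custom rules file, and the body would be whatever rule content normally goes inside <plugin id="">:

```xml
<midi-out id="A"> <!-- rules scoped to MIDI Out port A instead of a plug-in -->
    <!-- the same articulation rules that would otherwise go inside <plugin id=""> -->
</midi-out>
```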
There is another bit of interesting information in the NOTION 3 User Manual, which makes a lot of presumptions regarding specific knowledge of complex sound production systems: the part that refers to being able to send NOTION 3 stereo groups, or alternatively up to 32 stereo channels, to external sound processing equipment, where everything is mixed externally. As best as I can determine at present, this is something which might be practical to do with a MOTU 828mk3 Hybrid audio interface, which I have, since it has its own mixer (CueMix FX). In that case, the problem continues to be determining how to redirect (a) the audio generated by Kontakt 5, when it is running in standalone mode but is being controlled via MIDI Out from NOTION 3, to (b) the external mixer (CueMix FX) or to a DAW application, when everything is running on the same computer, except of course for the external processing that the MOTU 828mk3 Hybrid audio interface does . . .
SUMMARY

At present, all this stuff is mind-boggling, but I am beginning to envision a strategy that might work. While it might not work so easily with Kontakt 5, it might work easily with Reason 6.5 (Propellerhead Software), since getting the Reason 6.5 generated audio into Digital Performer 7.24 is easy: Reason 6.5 supports ReWire, and I already have verified that it works nicely with Digital Performer 7.24 . . .
Explained another way, there might be a way to do what you want to do, but it probably is a bit different, and the strategy I use in this scenario is to start doing experiments toward the goal of discovering how everything works, which (a) is both annoying and frustrating; (b) typically takes several months; but (c) usually maps to discovering something useful, which I call a "workaround" . . .
And it is important to understand that a "workaround" usually is different from what you intuitively wanted to do originally, but it solves the problem, which is my primary focus here in the sound isolation studio. While stuff might not work the way I would like it to work, so long as I can devise a "workaround", I am happy, which is fabulous . . .
Fabulous!