
Lock-Track Feature Request & velocity adjustment

Postby SouLcRusaDer_kA » Tue May 13, 2014 1:14 pm

OK, I can't find a lock-track feature in Notion.
It's important for composing and is a real time-saver,
so I hope the Notion team can make it happen soon.

Say I have several tracks in Notion:
what is the easiest way to adjust the velocity?
I find that using the sequencer overlay to handle velocity is pretty slow compared to using a DAW
(since I prefer writing music in notation style, I don't consider composing directly in a DAW right now).
Can I adjust it in a DAW through ReWire (with Notion as slave)?
Any alternatives would be appreciated :)
SouLcRusaDer_kA
 
Posts: 60
Joined: Sat Nov 09, 2013 10:14 am

Re: Lock-Track Feature Request & velocity adjustment

Postby wcreed51 » Tue May 13, 2014 4:21 pm

Hairpins and dynamic markings would be the standard way
Bill Reed
Notion 4, Sibelius 7.5, Finale 2011/14, Overture 4, Cubase 7.5
Win8 x64, 32GB RAM
M-Audio ProFire 2626
Kontakt, VSL VI Pro, VE Pro, EWQL Orch, Choirs and Pianos
wcreed51
 
Posts: 754
Joined: Wed Oct 07, 2009 10:50 am
Location: Berkshires, MA USA

Re: Lock-Track Feature Request & velocity adjustment

Postby SouLcRusaDer_kA » Wed May 14, 2014 8:09 am

wcreed51 wrote:Hairpins and dynamic markings would be the standard way

Yes, but using hairpins and dynamic markings is much slower than using a piano-roll event editor.
If I have 3 or 4 tracks whose velocities need fairly intensive handling,
then it's going to be a disaster if I do it with hairpins/dynamic markings/the sequencer overlay.

[Attachment: velocity.JPG]

(I'm sure you know the event editor very well; it's just for better understanding.)
Also, importing MIDI usually requires cleaning some data by hand, and I need to reload the VSTs,
which is pretty troublesome.
Is there any other way to go?
SouLcRusaDer_kA
 
Posts: 60
Joined: Sat Nov 09, 2013 10:14 am

Re: Lock-Track Feature Request & velocity adjustment

Postby idiotSavant » Thu May 15, 2014 9:16 pm

This is a long-requested feature. Some way to graphically draw velocities would be the bomb.
Michael

Notion 4 Mac
VSL Dimension Strings
VSL Special Edition 1,2, Plus
Miroslav Philharmonik
Apple Logic
idiotSavant
 
Posts: 302
Joined: Tue Dec 18, 2012 8:20 pm
Location: San Francisco, CA

Re: Lock-Track Feature Request & velocity adjustment

Postby Surfwhammy » Fri May 16, 2014 3:29 am

These are my thoughts at present on the general ideas of (a) locking tracks and (b) using MIDI sequencer style velocity controls for music notation . . .

[NOTE for SouLcRusaDer_kA: My reply to your original post follows later, which includes a detailed overview that explains how to use both NOTION 4 and the DAW application to create the scenario where you can use the DAW application to edit MIDI velocity at the individual note level. It is an advanced activity, but it is supported. This is a bit complex, so it takes me a while to work through everything logically, hence the short novel . . . :P ]

(1) Locking a track in the music notation universe does not make a lot of sense conceptually, since (a) it is easy to create a copy of a NOTION 4 score via "Save As . . . "; (b) when working with music notation nothing changes unless you specifically change it; and (c) one can "lock a track" by recording its generated audio as a soundbite in a Digital Audio Workstation (DAW) application via a ReWire 2 session, noting that in the grand scheme of everything I consider NOTION 4 to be part of a complete digital music production system rather than a complete digital music production system by itself . . .

[Image: Complete Digital Music Production System]

(2) Regarding velocity, I think that doing this using traditional marks is the preferred strategy when one is focused on music notation, but I understand the logic which suggests using the MIDI sequencer strategy, at least from the perspective that NOTION 4 ultimately communicates with VSTi virtual instrument engines via MIDI, hence everything happens based on MIDI messages and parameters . . .

But I think there are other considerations, one of which involves the activities of arranging, producing, and mixing, with the latter two activities being best performed in a DAW application for a variety of reasons, noting that in some respects arranging is a part of composing and producing, although I usually promote it to a higher level, since it is an important activity . . .

With the caveat that producing and mixing strategies and techniques generally are strongly dependent on specific genres, my perspective on the idea of adjusting the velocity of individual notes is that it tends to fascinate and mesmerize composers who do not understand producing and mixing in the digital music production universe, because even for more traditional musical genres it is largely frivolous, and the frivolity is a direct consequence of the basic rules of acoustic physics, which in the digital music production universe are the defining rules . . .

Explained another way, devoting great attention to the way each note is played makes good sense when one is working with real musicians who are playing real instruments or singing, since in that scenario the most subtle nuances are possible when the musicians and singers are skilled. But when everything switches to virtual instruments, digitized sampled sounds, and MIDI, there are other considerations, because it is completely and totally different in every respect and will continue to be so for perhaps another few decades, depending primarily on how quickly advances in computing machines and artificial intelligence algorithms occur; at present the required technology is not available in any practical way, if the specific technologies even exist . . .

Why do I suggest this?

Great question!

When one attends what I call a "traditional" concert by a symphonic orchestra where there is no sound system and everything is real, the rules are different, and in this scenario music notation and the conductor provide guidance, but ultimately people (musicians and singers) and real instruments are making the sounds in a way that makes it practical to focus intensely on minutiae . . .

However, in the digital music production universe everything is vastly different, and among other things one of the basic rules of acoustic physics provides the clue that quite a few key aspects of audio are logarithmic and geometric, which specifically is the case with such things as volume, panning, pitches, tones, textures, and harmonies. This is true in both the analog and the digital subverses; there is a bit more of what one might call "wiggle room" in the analog subverse, but even then everything is done according to the rules of electromagnetism and mechanical physics . . .

As it pertains to volume, this maps to a combination of volume level and perceived loudness, where the general rule of acoustic physics is that for a sound to be perceived as roughly twice as loud, its power needs to be increased about 10 times (an increase of about 10 dB), hence the logarithmic unit called the "decibel (dB)". There are different types of decibels, though; one of the key types is a unit of sound pressure level ("dB SPL"), which is different from the decibel used for volume sliders in a DAW application or in the NOTION 4 Mixer . . .
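
For reference, here is a minimal sketch of the decibel arithmetic described above, using the 10 * log10 rule for power ratios; the loudness interpretations are perceptual rules of thumb rather than hard physics . . .

Code:
import math

def power_ratio_to_db(ratio):
    # dB = 10 * log10(P / P_ref) for power (not amplitude) ratios
    return 10 * math.log10(ratio)

print(power_ratio_to_db(10))    # 10.0 dB -> perceived as roughly twice as loud
print(power_ratio_to_db(100))   # 20.0 dB -> roughly four times as loud
print(power_ratio_to_db(0.5))   # about -3 dB -> half the power, only slightly quieter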

The dynamic range of normal human hearing is quite amazing, and based on devoting a bit of attention to doing some difficult calculations involving physics and chemistry, as best as I can determine the human hearing apparatus can detect the sound made by a single electron vibrating, or more specifically the changes in standard atmospheric pressure made by the motion of a single electron, and this is on the extreme pianissimo side of the dynamic range of normal human hearing, where the other side includes considerably more violent perturbations . . .

[NOTE: The logic I used for the calculation is that an electron has a definite size, which I think can be approximated, which is what I did; and then I mapped it to how much air it would move at standard atmospheric pressure. There were some intermediate presumptions and calculations, but I think it makes a bit of sense, especially since it fits nicely and consistently with the universe of quantum electrodynamics and the human eye being able to perceive a single photon, since an electron can transform into a photon, more or less, depending on the way one interprets Feynman diagrams . . . ]

In this Feynman diagram, an electron and a positron annihilate, producing a photon (represented by the blue sine wave) that becomes a quark–antiquark pair, after which the antiquark radiates a gluon (represented by the green helix).


[Image: Feynman diagram ~ SOURCE: Feynman Diagrams (wikipedia) ]

[The value of the minimal threshold of hearing] has wide acceptance as a nominal standard threshold and corresponds to 0 decibels. It represents a pressure change of less than one billionth of standard atmospheric pressure.


[NOTE: In this usage, "decibels" is "dB SPL", as indicated by the information in the second sentence, and this is completely and totally different from the "0 dB" setting of a volume slider in a DAW application or the NOTION 4 Mixer, as is the case with "0 dB" for an external digital audio and MIDI interface like the MOTU 828x or the PreSonus AudioBox 1818VSL, where for the external digital audio and MIDI interfaces "0 dB" maps to a setting of "10" on a Marshall electric guitar amplifier with the Nigel Tufnel (Spinal Tap) option, which extends the volume level to "11" or its more colloquial value "+6" . . . ]

[Image: threshold of hearing chart ~ SOURCE: Threshold of Hearing (HyperPhysics) ]
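
As a quick arithmetic check of the "one billionth" figure, assuming the standard 20 micropascal reference pressure for 0 dB SPL . . .

Code:
p_ref = 20e-6     # 0 dB SPL reference pressure, in pascals
p_atm = 101325.0  # standard atmospheric pressure, in pascals
print(p_ref / p_atm)  # about 2e-10, i.e. well under one billionth of atmospheric pressure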

There is another very important aspect to this, and it involves what happens when the velocity parameter is changed for a digitally sampled sound, where specifically this is done as a computation rather than by having the real musician play the note at a different speed or with different force . . .

The digitally sampled sound has an intrinsic or native velocity, and the corresponding MIDI note event parameter ranges from 0 to 127; but changing the velocity only specifies a different value for that parameter, which does not change the way the real musician played the real instrument, hence the value is essentially arbitrary and is not the same as having the real musician play the real instrument at the desired velocity . . .
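
To illustrate the computation (this is only a hypothetical sketch, not how any particular sample library actually works), a sampler with no separate velocity layers typically just scales the stored sample's amplitude based on the incoming 0 to 127 velocity value . . .

Code:
import numpy as np

def apply_velocity(sample, velocity, curve=2.0):
    # 'curve' is a made-up shaping exponent; real instruments use their own
    # velocity curves and often switch between separately recorded layers
    gain = (velocity / 127.0) ** curve
    return sample * gain

# one second of a 440 Hz sine standing in for a recorded sample
sr = 44100
t = np.arange(sr) / sr
recorded = np.sin(2 * np.pi * 440 * t)

soft = apply_velocity(recorded, 40)   # quieter, but still the same performance
loud = apply_velocity(recorded, 120)  # louder, still the same performance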

Yet another aspect involves sampled sound libraries which are not chromatically sampled, which in simple terms means that only a subset of notes is actually sampled and all the intermediate or non-sampled notes are computed by various algorithms that have logarithmic components. For example, in a sampled sound library where C4 (a.k.a. "Middle C" in scientific pitch notation) and D4 are sampled but C#4 is not, the sound for C#4 typically will be computed either (a) by using the sampled sound for C4 and increasing its pitch via computation or (b) by using the sampled sound for D4 and decreasing its pitch via computation. This is not necessary for chromatically sampled sound libraries where every note in the 12-tone universe is sampled, although if a chromatically sampled sound library is not sampled for 24-tone purposes, then computations still are required, as is the case when certain types of fluctuating articulations like tremolo and vibrato are specified but there is no corresponding set of sampled sounds . . .
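
The pitch-shift computation for a non-sampled note basically comes down to the equal-tempered ratio 2^(semitones/12); a minimal sketch (again hypothetical, since real libraries do much more sophisticated resampling and interpolation) . . .

Code:
def pitch_shift_ratio(semitones):
    # playback-rate ratio needed to move a sample by the given number of semitones
    return 2.0 ** (semitones / 12.0)

# C#4 computed from the C4 sample: play the C4 sample about 5.9 percent faster
print(pitch_shift_ratio(+1))   # ~1.0595
# C#4 computed from the D4 sample: play the D4 sample about 5.6 percent slower
print(pitch_shift_ratio(-1))   # ~0.9439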

~ ~ ~ Continued in the next post ~ ~ ~
Last edited by Surfwhammy on Fri May 16, 2014 8:26 am, edited 7 times in total.
The Surf Whammys

Sinkhorn's Dilemma: Every paradox has at least one non-trivial solution!
Surfwhammy
 
Posts: 1137
Joined: Thu Oct 14, 2010 4:45 am

Re: Lock-Track Feature Request & velocity adjustment

Postby Surfwhammy » Fri May 16, 2014 3:49 am

~ ~ ~ Continued from the previous post ~ ~ ~

There also is the matter of the computing required to compute the MIDI for each note when the music is played in real-time, and the more articulations, playing styles, dynamics, and so forth that are specified, the more computing is required to create the MIDI for each note and then to send it to the VSTi virtual instrument engine, which also needs to do more computing than for a simple note, especially when this requires switching among subsets of sampled sounds . . .

I think it can be productive to provide dynamic marks for phrases, which can include ramping (for example, crescendo and diminuendo), but when one is focusing on producing and mixing a little bit goes a long way, and too much creates problems, since two of the rules in the digital music production universe are (a) that it is relatively trivial to add signal processing when one switches to producing and mixing in the respective DAW application but (b) that it is very difficult, if not impossible, to remove signal processing when it is embedded in the raw audio of a soundbite or recorded track . . .

For this reason, the strategy I prefer is (a) to use native velocity, as is the case with nearly everything else; (b) to peg the NOTION 4 volume sliders at 0 dB; and (c) to set the panning controls to full panning, although for some techniques I specifically set the panning control values to position instruments in specific locations, since in some producing scenarios it is easier to control the locations of instruments in the DAW application when they are panned in very specific ways in the NOTION 4 Mixer, which causes the panning location information to be embedded in the raw audio of the corresponding soundbites. For reference, "soundbite" is the terminology used by Digital Performer (MOTU) for the NOTION 4 generated audio as it is recorded in a ReWire 2 session where Digital Performer is the ReWire 2 host controller and NOTION 4 is the ReWire 2 slave application . . .

If I need an instrument played with a particular articulation, dynamic, or style, then I use the corresponding set of samples rather than attempt to do this indirectly via music notation marks or corresponding MIDI parameters, if there are any, since doing it indirectly maps to using computer algorithms to simulate the articulation, dynamic, or style. It is more realistic when the note was played by the musician in the specific way, hence the focus on minimizing additional after-the-fact computing . . .

[NOTE: The NOTION 4 Mixer has what I consider to be true stereo panning controls, as contrasted with the simpler and less desirable "stereo balance controls" that most DAW applications provide. Monaural panning controls are true panning controls, but for stereo tracks Digital Performer has stereo balance controls; and stereo balance controls really are volume controls that only change the respective volume levels of the left and right channels rather than do any panning. However, there is a MOTU Trim control that, when added to a stereo track, makes it a true stereo panning control, which is fine with me. The easy way to understand this is that with a monaural track, the panning control determines where the sound is heard on what I call the "Rainbow Panning Arc", such that for example you can position the sound at far-left, top-center, or far-right; but this is not the case with a stereo balance control, since the left channel always is on the left and the right channel always is on the right, where all you can do is control how loud they are, which is not the same as actually moving the audio in the left channel to far-right or moving the audio in the right channel to far-left, and so forth. Yet another important bit of information is that the rules of panning are in part logarithmic, which makes panning a bit more complex than one might imagine . . . ]

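As a rough illustration of the difference (a hypothetical sketch; actual pan laws vary by DAW), a stereo balance control only scales the two existing channels, while a true pan of a monaural source redistributes the signal with an equal-power (trigonometric, hence logarithmic in dB terms) law . . .

Code:
import numpy as np

def balance(left, right, position):
    # stereo balance: position in [-1, +1] merely attenuates one channel
    l_gain = 1.0 if position <= 0 else 1.0 - position
    r_gain = 1.0 if position >= 0 else 1.0 + position
    return left * l_gain, right * r_gain

def equal_power_pan(mono, position):
    # true pan of a mono source: constant-power sine/cosine law
    theta = (position + 1.0) * np.pi / 4.0   # map -1..+1 to 0..pi/2
    return mono * np.cos(theta), mono * np.sin(theta)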

And then there is the matter of perceived loudness, which is very different from volume . . .

As explained in the previous post, the general rule is that for a sound to be perceived as being twice as loud its power needs to be increased about 10 times, but there are other ways to achieve this goal, and one of them is to use the Haas Effect, which is a perceptual effect in which two identical sounds occurring in very rapid sequence are merged into a single perceived sound by the perceptual apparatus of the brain, and this single sound is perceived as being louder than either of the two individual sounds from which it is constructed . . .

[NOTE: There are two things happening. The first is that the location of the first-arriving sound determines the perceived location when the two identical sounds are separated by 1 to 5 milliseconds; from 5 milliseconds to 30 milliseconds the two sounds merge and are perceived as a single sound, although the first-arriving sound continues to determine the overall location of the single merged sound. Depending on the amount of the delay, both the perceived location and the perceived loudness will vary, so it is not absolute for every delay time, but there are delay values that produce very distinct results. And there are at least two ways to have a bit of FUN with the Haas Effect: one way is to duplicate a track (monaural or stereo) and then delay the duplicate by 1 to 30 milliseconds while keeping the panning or balance control the same for the original and duplicate tracks; another way is to use a delay effect on the original track without making a duplicate, in which case the mix between "dry" and "wet" for the delay effect is used. Note that in the first way only the "wet" part of the delay effect is used for the duplicated track, since the "wet" part is the delayed part and the "dry" part is the original part; there also are other parameters on most delay effects plug-ins or signal processors, where "signal processor" is the general name for something that processes an audio signal . . . ]

Haas Effect (wikipedia)
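
Here is a minimal sketch of the first approach (duplicate the track and delay the copy by a few milliseconds), purely to show the arithmetic involved; the delay value is an assumption, not a recipe . . .

Code:
import numpy as np

def haas_double(mono, sample_rate, delay_ms=15.0):
    # return a stereo pair: original on the left, delayed copy on the right
    delay_samples = int(sample_rate * delay_ms / 1000.0)
    delayed = np.concatenate([np.zeros(delay_samples), mono])
    original = np.concatenate([mono, np.zeros(delay_samples)])
    return np.stack([original, delayed])   # shape (2, n): left, right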

Another way to manipulate the perceived loudness of an instrument or voice is to use a signal processing technique called "ducking", which is done by one or more advanced compressor limiters that compress or limit an instrument or voice by an amount determined (a) by the volume level of another instrument or voice and (b) by the various parameter settings of the controls for the compressor limiter, where the amount or level of "ducking" is controlled by the latter; and this also is primarily an auditory illusion like the Haas Effect . . .

[NOTE: The best digital signal processor (a.k.a., "effects plug-in") for "ducking" is Pro-C (FabFilter Software Instruments), and the way it works is that you send the audio for the controlling track (typically the lead vocal for a song that has singing) via a bus to the Pro-C "external sidechain" for the instrument that you want to "duck". What happens is that when the lead vocalist is singing, the volume level of the singing provides the high-level control for the respective Pro-C effects plug-in, which in turn via the various Pro-C parameter settings determines how much Pro-C lowers the volume level of the respective track. You can control how rapidly the "ducking" is initiated; the intensity or level of "ducking"; and how quickly or slowly the "ducking" ends when the lead singer stops singing. "Ducking" is explained in the "Expert Mode" tutorial video linked below . . . ]

Pro-C (FabFilter Software Instruments)

Pro-C Expert Mode ~ "Ducking" (FabFilter Software Instruments) ~ YouTube video
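
Conceptually, sidechain ducking is just gain reduction on one track driven by the level of another; here is a toy envelope-follower sketch (this has nothing to do with Pro-C's actual algorithm, which is far more sophisticated, with smooth attack and release behavior) . . .

Code:
import numpy as np

def duck(music, vocal, threshold=0.1, reduction=0.5, smooth=0.999):
    # lower 'music' whenever the 'vocal' envelope exceeds the threshold
    out = np.empty_like(music)
    env = 0.0
    for i, (m, v) in enumerate(zip(music, vocal)):
        env = max(abs(v), env * smooth)          # crude peak envelope follower
        gain = reduction if env > threshold else 1.0
        out[i] = m * gain
    return out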

In the same way that in your roles as producer and audio mixing engineer you can control panning, "ducking", and create various loudness perception auditory illusions, so can you use different types of signal processors to control the perceived velocities of instruments and voices, where the general rule is that the more simple and pristine the audio generated by NOTION 4, the more control you have when you switch to the producing and mixing roles, where yet another key bit of information is that the focus in the producing and mixing roles is on the Gestalt (or the entire song, if you prefer); and this is where the logarithmic and geometric aspects of sound and acoustic physics come into play and at the individual note level essentially make adjusting velocity a bit frivolous, at least when the velocity adjustments are small and there is not just a single instrument or voice . . .

Explained another way, when you are focusing on the Gestalt, all the instruments and voices tend to blend, and from my perspective this makes ensuring that everything is heard the most important overall activity, where one of the key rules is that if you cannot hear it, then either (a) it needs signal processing or (b) it does not need to be there, because either way if it is there but you cannot hear it, then in the digital music production universe for all practical purposes it is noise, and noise is best avoided whenever possible, which is fabulous . . .

Fabulous! :D
Last edited by Surfwhammy on Fri May 16, 2014 8:39 am, edited 1 time in total.
The Surf Whammys

Sinkhorn's Dilemma: Every paradox has at least one non-trivial solution!
Surfwhammy
 
Posts: 1137
Joined: Thu Oct 14, 2010 4:45 am

Re: Lock-Track Feature Request & velocity adjustment

Postby Surfwhammy » Fri May 16, 2014 5:37 am

SouLcRusaDer_kA wrote:OK, I can't find a lock-track feature in Notion.
It's important for composing and is a real time-saver,
so I hope the Notion team can make it happen soon.


I am not certain that this will accomplish anything useful, since nothing changes unless you specifically change it; and you can make copies of a NOTION 4 score via the "Save As . . ." option . . .

SouLcRusaDer_kA wrote:Say I have several tracks in Notion:
what is the easiest way to adjust the velocity?
I find that using the sequencer overlay to handle velocity is pretty slow compared to using a DAW
(since I prefer writing music in notation style, I don't consider composing directly in a DAW right now).
Can I adjust it in a DAW through ReWire (with Notion as slave)?
Any alternatives would be appreciated :)


I agree with wcreed51 on this one . . .

wcreed51 wrote:Hairpins and dynamic markings would be the standard way


HOW TO USE NOTION 4 EXTERNAL MIDI STAVES TO PLAY DAW-HOSTED VIRTUAL INSTRUMENTS

Nevertheless, depending on the DAW application and its functionality, there is a way to do what you want to do . . .

It depends on whether the DAW application can host VSTi virtual instruments, which on the Mac can include Audio Unit (AU) virtual instruments when the virtual instrument plug-in also provides an AU interface, so there are two types of instruments (VSTi and AU) and native DAW instruments; and in some scenarios it might be possible to do this with standalone virtual instruments, provided their audio output can be routed to the DAW application, which also is the case with external MIDI synthesizers and other types of MIDI instruments (physical units connected via USB or standard MIDI cables) . . .

And there are several ways to do this, with one way being to use a NOTION 4 External MIDI staff and a "virtual MIDI cable" which causes the music notation on the NOTION 4 External MIDI staff to be converted to MIDI messages and sent to the DAW application where the MIDI messages then play the virtual instrument hosted by the DAW application . . .

Once you record in the DAW application the incoming MIDI sent from NOTION 4, you can edit the recorded MIDI in the DAW application; and if the DAW application has functionality for editing velocity, which most DAW applications have, then there you are. You also can use this technique to do other MIDI actions which currently cannot be done with NOTION 4 . . .
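
Just to make the idea concrete, once the MIDI exists as data in the DAW (or as an exported .mid file), editing velocity is nothing more than rewriting the velocity byte of the note-on messages; here is a small Python sketch using the mido library, with hypothetical file names . . .

Code:
import mido

mid = mido.MidiFile("from_notion.mid")   # hypothetical file exported from NOTION
for track in mid.tracks:
    for msg in track:
        if msg.type == "note_on" and msg.velocity > 0:
            # scale every velocity down by 20 percent, clamped to the 0-127 range
            msg.velocity = min(127, int(msg.velocity * 0.8))
mid.save("from_notion_softer.mid")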

Another way is to export the MIDI from NOTION 4 and then to import the exported MIDI to your DAW application, with the difference being that in the first strategy you are working in a ReWire 2 session where the DAW application is the ReWire 2 host controller and NOTION 4 is the ReWire 2 slave, which makes it possible to continue to do editing and other activities in NOTION 4, while the export version typically is done separately and is not so flexible . . .

This is easy to do on the Mac, because Mac OS X includes "virtual MIDI cable" functionality, which is done via the "Audio MIDI Setup" application . . .

If you are doing digital music production on a Windows computer, then you will need a "virtual MIDI cable" utility program, and the setup and configuration might be more difficult, although all I can do is guess, because I do everything on the Mac . . .

[NOTE: This YouTube video tutorial shows a ReWire 2 session where Digital Performer (MOTU) is the ReWire 2 host controller and NOTION 4 is a ReWire 2 slave. MIDI is played on an external MIDI keyboard and is recorded both to Digital Performer and to NOTION 4. This is part of what you need to know, but the MIDI notes also can be done with music notation on a NOTION 4 External MIDI staff, which is shown in the second YouTube video tutorial . . . ]

DP8 N4 ReWire2 Real-time MIDI Recording ~ YouTube video tutorial

[NOTE: This YouTube video tutorial does not have a voice-over, but it shows a ReWire 2 session where Logic Pro X (Apple) is the ReWire 2 host controller and NOTION 4 is a ReWire 2 slave. The NOTION 4 score has a NOTION 4 External MIDI staff which is playing the Logic Pro X acoustic string bass, and the MIDI is sent from the NOTION 4 External MIDI staff to Logic Pro X, to a track to which the Logic Pro X acoustic string bass is assigned. In other words, the music notation on the NOTION 4 External MIDI staff is playing the Logic Pro X acoustic bass in real-time, on the fly. This is not shown with Logic Pro X recording the MIDI, but it is not difficult to record the MIDI in Logic Pro X, and I have other YouTube video tutorials that show how to do this . . . ]

N4 LPX ReWire External MIDI ~ "Ice Crystals" ~ YouTube video tutorial

As you can see, this is a bit on the advanced side of things, but (a) it works and (b) after you do it a few times, it tends to become a lot easier . . .

The key is to use NOTION 4 External MIDI staves and VSTi virtual instruments or the native instruments that come with the DAW application . . .

This also works for PreSonus Studio One 2.6.2 Producer/Professional, and it comes with a lot of native virtual instruments, as does Logic Pro X. Digital Performer 8 has a nice set of virtual instrument engines with presets, which is different from the native instruments that come with Logic Pro X and Studio One . . .

Summarizing, you can do this, and I can tell you exactly how to do it on the Mac, but if you are doing digital music production on a Windows computer I cannot be much help, because I do everything on the Mac . . .

There is a lot of stuff that needs to be set up and configured, including a "virtual MIDI cable", and the way this is done will be very specific to the operating system platform (Mac or Windows), but so long as the operating system and associated software support it, it is possible; you need to understand MIDI and a lot of other stuff, though, so this is what I consider to be an advanced activity . . .

In this respect, the important bit of information is that NOTION 4 can do it, and there are DAW applications and VSTi/AU virtual instruments that can do it, as well as external MIDI keyboard synthesizers and other types of physical MIDI instruments like MIDI guitars and MIDI drums . . .

Lots of FUN! :)
Last edited by Surfwhammy on Fri May 16, 2014 7:24 am, edited 1 time in total.
The Surf Whammys

Sinkhorn's Dilemma: Every paradox has at least one non-trivial solution!
Surfwhammy
 
Posts: 1137
Joined: Thu Oct 14, 2010 4:45 am

Re: Lock-Track Feature Request & velocity adjustment

Postby SouLcRusaDer_kA » Fri May 16, 2014 7:22 am

Surfwhammy wrote:I am not certain that this will accomplish anything useful, since nothing changes unless you specifically change it; and you can make copies of a NOTION 4 score via the "Save As . . ." option . . .

Sometimes we might click or drag notes accidentally without knowing it.

Without lock-track, if we are working on a song seriously,
we will probably want to check all the tracks for that several times in the final stage.
That's pretty time-consuming and not at all efficient;
lock-track would save a lot of time there.

This feature has been adopted by some DAWs,
which shows that lock-track is not useless.

idiotSavant wrote:This is a long-requested feature. Some way to graphically draw velocities would be the bomb.


Sad to know this is an old request ~~
Hey Notion team,
Sibelius is doing better at velocity adjustment.
Please listen to your loyal users and make some changes.
SouLcRusaDer_kA
 
Posts: 60
Joined: Sat Nov 09, 2013 10:14 am

Re: Lock-Track Feature Request & velocity adjustment

Postby Surfwhammy » Fri May 16, 2014 8:12 am

SouLcRusaDer_kA wrote:
Surfwhammy wrote:I am not certain that this will accomplish anything useful, since nothing changes unless you specifically change it; and you can make copies of a NOTION 4 score via the "Save As . . ." option . . .

Sometimes we might click or drag notes accidentally without knowing it.

Without lock-track, if we are working on a song seriously,
we will probably want to check all the tracks for that several times in the final stage.
That's pretty time-consuming and not at all efficient;
lock-track would save a lot of time there.

This feature has been adopted by some DAWs,
which shows that lock-track is not useless.


I did not intend to imply that being able to lock a staff is useless, but it is not something I would think of doing intuitively, primarily because I tend to do everything "by ear", and so long as everything sounds good, I know that nothing needs to be changed, regardless of whether I inadvertently changed it without being aware that I changed it . . .

Nevertheless, from the perspective of software engineering I think adding "lock" and "unlock" at the staff and corresponding NOTION 4 Mixer track level is not a particularly difficult or time-consuming task, and after pondering it for a while, I think there is value to it . . .

The only caveat is: what happens if you inadvertently change something and then lock the staff, all the while thinking it is golden?

So, there is a "Catch 22" aspect to it . . .

SouLcRusaDer_kA wrote:
idiotSavant wrote:This is a long-requested feature. Some way to graphically draw velocities would be the bomb.

Sad to know this is an old request ~~
Hey Notion team,
Sibelius is doing better at velocity adjustment.
Please listen to your loyal users and make some changes.


My perspective on this primarily is based on the realities of software engineering, which is one of my areas of expertise, although I am more focused on music these days . . .

In great contrast to adding staff level and corresponding NOTION 4 Mixer track level "lock" and "unlock", this is what I consider to be a major software engineering activity, and more specifically it is part of generally enhancing the MIDI editing capabilities of NOTION, which includes providing support for a full range of pedals and other types of MIDI control change (CC) messages, and so forth . . .

Velocity is part of MIDI note events, but the concept is the same as providing support for a full range of MIDI control change (CC) messages . . .
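
For concreteness, a note-on message carries its velocity inside the note event itself, while a pedal or other controller arrives as a separate control change message; a quick sketch using the mido library (CC 64 is the standard sustain pedal controller) . . .

Code:
import mido

note = mido.Message("note_on", note=60, velocity=96)            # velocity lives inside the note event
pedal = mido.Message("control_change", control=64, value=127)   # a CC message, e.g. sustain pedal down
print(note)
print(pedal)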

Doing anything requires time and resources, which are limited in one way or another, so I think it is helpful to put feature requests into perspective, especially for digital music production applications like NOTION, which are as much art as science . . .

It might be interesting to ponder the idea that Stradivarius could have made more violins if he had hired more workers, or that Michelangelo could have painted a lot more ceilings if he had hired more workers, but how practical is that?

Applications like NOTION are so time-sensitive and need to do so much intense computing that the only practical way to do it is with a handful of software engineers, which also is the case with operating systems for the core activities . . .

As explained in my previous post, there is a way to have full MIDI editing capabilities via a Digital Audio Workstation (DAW) application which provides full MIDI editing, and if you need to do detailed velocity editing, this is possible and actually is not so difficult to do beyond what would be required in terms of graphic user interface (GUI) activities if NOTION supported it . . .

Doing it in the DAW application has the advantage of the computing being done in the DAW application rather than in NOTION, which is the reason I do all the signal processing (or "effects plug-ins work") in the DAW application rather than in NOTION: (a) this reduces the amount of computing that NOTION needs to do and (b) since the DAW application does not need to do what NOTION does, the DAW application typically has more available computing resources and time in which to do the required computing. Plus, as noted in my previous posts, it is easy to add stuff in the DAW application, but it is virtually impossible to remove stuff that is embedded in a soundbite . . .

Lots of FUN! :ugeek:

P. S. The only practical example of velocity editing that comes to mind at the moment is based on something the KORG Triton Music Workstation (88-Keys) does with certain presets, which specifically is that depending on the way a note on the keyboard is pressed, other stuff happens, where an example is all the various stuff that happens with one of the Trance presets, as heard in the following YouTube music video, which is fabulous . . .

[NOTE: This is played with two fingers, and there are a few times when it would be nice to be able to edit the velocity for specific notes, but if I do it for a while, I usually am very precise in the way I press the notes on the keyboard. As I recall, the music for this YouTube music video is just a few minutes taken from what actually was an hour of doing this over and over, and it was audio recorded in Digital Performer rather than being a MIDI sequence, since at the time I had no idea how to do anything with MIDI, which in some respects also was the case with music notation, except for being able to sight-sing music notation, of course. I have been able to sight-sing music notation since I was in elementary school, mostly because I was in a liturgical boys choir, which among other things is the reason I do everything on soprano treble clef regardless of the actual register or range of the instrument, since I can map how the notes are played via the Transposition option for each staff in NOTION Score Setup, which works nicely since here in the sound isolation studio there are 12 notes and 8 or so octaves, excluding the in-between notes for 24-tone stuff . . . ]

"Trance Star Pt. 1" (The Surf Whammys) ~ YouTube music video

Fabulous! :P
The Surf Whammys

Sinkhorn's Dilemma: Every paradox has at least one non-trivial solution!
Surfwhammy
 
Posts: 1137
Joined: Thu Oct 14, 2010 4:45 am

Re: Lock-Track Feature Request & velocity adjustment

Postby SouLcRusaDer_kA » Fri May 16, 2014 11:39 pm

OK, I just want to say it's the 21st century, not the Renaissance.
In this age, it's a fact that if a program evolves too slowly,
most people won't wait for it; they just switch sides.

OK, let's return to the original topic.
So in other words, if I import MIDI from Notion into a DAW,
there's no way to avoid re-loading/re-opening
the VSTs that are already loaded in Notion...?
(Say I loaded a Kontakt piano on track 1; after importing that track from Notion into the DAW, I have to load the Kontakt piano into the DAW again for that track, right?
So far my experience tells me: right.
But I'm not sure whether there's any trick that can save me time.)
SouLcRusaDer_kA
 
Posts: 60
Joined: Sat Nov 09, 2013 10:14 am
