
New VST host: Gig Performer


johnchop


@EscapeRocks

 

That looks gorgeous! I hope it runs great for you as well.

 

------------------------------------

Nebojsa Djogo

Co-founder, Deskew Technologies, LLC

www.gigperformer.com

 

Thank you. So far so good. I am a long-time Mainstage user, so creating widgets for controls is second nature to me.

 

The controllers I am using are a Casio PX5S and an Arturia KeyLab 61 Black Edition.

 

In rackspace one, I have the 8 sliders assigned to the 8 ADSR controls in the Arturia Jup-8V software.

 

One thing I really like is that when editing the rackspace widgets, you select which plugin the widget is going to be used with, and then GP populates that window with all of the assignable parameters. Very easy!

 

I made a second rackspace, which I haven't pictured here, for an instance of NI The Grandeur hosted in Kontakt 5 Player.

 

Similar setup to the above. In this case, I have the volume knob, 4 other knobs and one switch:

 

Knob 1 controls the Lid's tone (hard to soft) parameter.

 

The switch turns the "space" (concert hall) parameter on or off.

Knobs 2-4 are "amount," "room size," and "distance."

 

These 6 controls are assigned to the sliders on the PX5S for easy adjustment while playing.

 

Granted, I come from Mainstage and I understand these workflows, so maybe that's why this is so easy. However, I really do think the GP interface is easy, especially for your target audience, who probably also have experience with VST/AU setups.

 

As far as performance, this part really intrigues me:

 

My MacBook Pro is a late 2013 13" Retina, 2.4GHz, 4GB RAM, 256GB SSD. Not exactly an overly robust system for live music. I bought it before I even considered doing this.

 

I have some very nice libraries that really tax my system inside Mainstage, namely NI The Grandeur, Alicia's Keys, and Pianoteq 5.

 

Through a lot of tweaking I can get them to run "OK," but I usually leave them off my list of instruments and go with the built-in sounds, since I can easily spike the CPU on heavy passages with a lot of sustain pedaling.

 

I set up a test last night.

 

I created a new Mainstage concert that mimics the setup I have so far in GP.

 

2 instances of Arturia Jup-8V and one instance of a Kontakt Factory Sounds string ensemble in one patch.

 

1 instance of Kontakt The Grandeur in the other.

 

No plug-in effects.

 

Using my PX5S as the controller, I played The Grandeur.

 

With Wi-Fi, Bluetooth, and other things turned off (including Gig Performer), I was seeing about 10% CPU at rest, with "Perform in Full Screen" enabled.

 

I played a couple minutes of the piano outro of "Layla."

 

CPU hovered around 35%, with many spikes up to 75-80%.

I finished by doing a 3 octave gliss with sustain pedal.

It hit 100% CPU with attendant audio glitches.

 

I was using a Steinberg UR22 at a 512-sample buffer.

 

Then I closed Mainstage, and opened Gig Performer and loaded up the gig.

 

Identical instrument plug-ins, and no effect plug-ins.

 

Resting state showed 3% CPU.

 

I then played the exact same "Layla" passage and ended with the same 3 octave gliss.

 

CPU stayed around 8% with an occasional jump to 12%

 

The final 3 octave gliss with sustain maxed out at 14%

 

No audio drop outs, no note loss.

 

I monitored this with Apple's Activity Monitor instead of each program's CPU meter.

 

One thing I found is that GP does not add much, if any, CPU load on top of what the AU plug-ins themselves use (Kontakt 5 Player and Arturia). The CPU shown by those plug-ins' own meters is essentially what GP and Activity Monitor are showing.

 

Mainstage adds quite a bit.
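If anyone wants to double-check numbers like these the same way, here's a rough Python sketch (just my illustration, assuming the psutil package is installed; adjust the process names to whatever hosts you're comparing) that samples per-process CPU much like Activity Monitor does:

# pip install psutil
import time
import psutil

PROCESS_NAMES = ("Gig Performer", "MainStage")  # substrings of the apps being compared

def sample(interval=1.0, duration=30):
    """Print CPU% for the watched processes once per interval, for `duration` seconds."""
    end = time.time() + duration
    while time.time() < end:
        for proc in psutil.process_iter(["name", "cpu_percent"]):
            name = proc.info["name"] or ""
            if any(target in name for target in PROCESS_NAMES):
                # cpu_percent is measured since the previous call,
                # much like Activity Monitor's refresh interval
                print(f"{name}: {proc.info['cpu_percent']:.1f}% CPU")
        time.sleep(interval)

if __name__ == "__main__":
    sample()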

 

Anyway, I just wanted to share my findings after just a couple days messing around with the free trial of GP.

 

I am not sure, yet, if I will go with it, as I have a very extensive Mainstage setup.

 

The fact I am able to run my good piano libraries (I just re-downloaded Alicia's Keys) without the usual glitches on my system is very tempting.

 

I can also see how GP would be a great visual and easy host on a PC system.

 

 

David

Gig Rig: Roland Fantom 08 | Roland Jupiter 80

 

 

 

 

 

I created a new Mainstage concert

...

I finished by doing a 3 octave gliss with sustain pedal.

It hit 100% CPU with attendant audio glitches.

...

Then I closed Mainstage, and opened Gig Performer and loaded up the gig.

...

The final 3 octave gliss with sustain maxed out at 14%

 

No audio drop outs, no note loss.

Well there's something that can justify the premium price compared to Mainstage.

 

Tangentially, there's another thread going on where people have been talking about the feasibility of Apple putting Mainstage on an iPad. Even forgetting about iOS's lack of plug-in functionality, streaming limitations, and greater hunger for "real" memory, just in terms of processing power, your model has a Geekbench 3 score of 6253 (64-bit, multi-core). The top-of-the-line iPad Pro 12.9 scores 5411 (the Air 2 is 4403). Not that some kind of "Mainstage Light" might not be viable, but it seems you'd at least need lower expectations!

Maybe this is the best place for a shameless plug! Our now not-so-new new video at https://youtu.be/3ZRC3b4p4EI is a 40 minute adaptation of T. S. Eliot's "Prufrock" - check it out! And hopefully I'll have something new here this year. ;-)


 

I set up a test last night.

...

Then I closed Mainstage, and opened Gig Performer and loaded up the gig.

...

CPU stayed around 8% with an occasional jump to 12%

...

No audio drop outs, no note loss.

...

Mainstage adds quite a bit.

 

 

 

Thanks for this report. Entirely different kettle of fish, but I experienced similar differences in efficiency when comparing Mainstage to VSTLord earlier this year. I couldn't get Omnisphere to reliably play on my old MacBook Air in Mainstage, but it ran smoothly in VSTLord.

 

VSTLord is a test application and only hosts one plugin at a time, so the comparison is meaningless to some extent, but if GigPerformer offers similar improvements over Mainstage, it may well be worth the extra $$$

 

 

local: Korg Nautilus 61 AT | Yamaha MODX8

away: GigPerformer | 16" MBP M1 Max

home: Kawai RX-2 | Korg D1 | Roland Fantom X7

 


Thanks for this report. Entirely different kettle of fish, but I experienced similar differences in efficiency when comparing Mainstage to VSTLord earlier this year. I couldn't get Omnisphere to reliably play on my old MacBook Air in Mainstage, but it ran smoothly in VSTLord.

 

VSTLord is a test application and only hosts one plugin at a time, so the comparison is meaningless to some extent, but if GigPerformer offers similar improvements over Mainstage, it may well be worth the extra $$$

 

I am going to download a trial to run on my PC laptop to see how it runs in that environment.

 

I can download my VSTs in that format as well.

 

As much as I'd like a new MBP built for live use, I can't afford one right now. If GP works, I may think about going the PC route; I've been pricing out robust systems for far less $$$ than a Mac, with $$$ left over for more plug-ins.

 

All in all, I am happy to see another player in the game to give us all options for how we each want to run a system.

David

Gig Rig: Roland Fantom 08 | Roland Jupiter 80

 

 

 

 

 


@EscapeRocks

 

Thank you so much for this unbiased and incredibly detailed review!

 

This is exactly what we've measured in our environment and what we have been hearing from our users. I personally run my live gig on an even lower-powered 2011 MacBook Pro, where this difference becomes even more noticeable.

I'm glad that the workflow in setting up your rackspaces and controls worked for you. Our goal is to try to keep it as simple and as obvious as possible.

 

New features are coming soon as well. Internally we're already testing things like "Patch Remain" functionality, MIDI filtering, a MIDI file player, etc. If you have any suggestions, just shoot us an email at info@gigperformer.com and we'll listen.

 

As a thank-you to everyone on the Keyboard Magazine Forums, I would like to offer you a special coupon that will knock the price of Gig Performer down to $99 should any of you decide to purchase it. Obviously, feel free to share this coupon with anyone you'd like. The coupon will be valid from now until Jan 11th, 2017.

 

Enter the coupon code KMF99 at checkout and your price will drop accordingly.

 

 

------------------------------------

Nebojsa Djogo

Co-founder, Deskew Technologies, LLC

www.gigperformer.com


Nebojsa, a quick question if I may: how does GP manage assigning different threads in a multi-core processor environment? In Bidule, if you want to assign a module to not use the main thread running on core 0, you have to do it manually, and once assigned, all modules upstream of the assigned module also move to this same core. You cannot "cross wires" in Bidule by having connected modules that run on different cores without risking audio glitching. The exception is that you can connect them together at MIDI and audio inputs & outputs. In reality, getting a complicated Bidule setup to run well on multiple cores is almost a black art!

 

I'm wondering if a setup that uses Bidule to do my midi routing and processing, connected via virtual midi ports to GP, which would host the actual VI plugins, might be a good way to get glitch-free multi-core functionality. I'm almost there with Bidule but I still get very random and very infrequent glitching. So, does GP manage multiple cores as part of its design? Thanks for showing up here and participating, BTW!


@Reezekeys

 

Hi there ... today's processors are extremely good at distributing the load of multiple threads to cores within the processor. Trying to "mess" with that process is somewhat a "black art" as you say and in most cases, when it comes to audio, it could result in glitches.

 

Having said that - take a look at almost any plugin host and see how many threads it's running. The threads will be spread out over the cores. Many modern plugins will actually take advantage of that and use multiple cores to process the incoming audio. "Diva", for example, has a switch that allows multiple cores to be used while processing audio. It makes sense for such an intense plugin to try to spread the processing over the cores and then combine it back when it's done.

 

What I'm trying to say is that you will most likely not gain anything trying to do it yourself since all your cores will be busy anyway if the currently running processes are CPU hungry.

 

If you use virtual midi outputs - sure - those will automatically show up in GP even if it's running and you will be able to use them immediately as any other MIDI port. I'd be interested to see that working actually and I would also be interested in hearing what kind of MIDI processing you are doing. We are working on making some advanced midi filters/mappings and we will at some point have an entire language that you will be able to use for very complicated stuff.

 

Almost forgot an important part ... you can also have multiple instances of GP running at the same time and hosting different rackspaces. This will basically again be spread out across the cores automatically by the system and each instance could respond to different things coming from the outside via MIDI or OSC.

Running multiple instances of GP will almost guarantee that they will be "preferred" by different cores, but we don't mess with that process and let the OS and the processor do what they do best.

 

------------------------------------

Nebojsa Djogo

Co-founder, Deskew Technologies, LLC

www.gigperformer.com

 

 

 


This is David, I'm the other guy at Deskew. I figured I'd respond to this one because I do almost the same thing as you're suggesting except

1) I'm using Max rather than Bidule and consequently

2) I'm sending MIDI messages over OSC rather than directly through virtual MIDI ports

 

You can certainly do the latter (and it works just fine), but if you ARE using a system that supports OSC rather than just using GP standalone, and you have a lot of plugins (I tend to have anywhere between 5 and 8 synth plugins, as well as numerous effects, per rackspace), it's extremely convenient to use OSC messages instead of creating virtual ports. I'm pretty certain Bidule allows OSC messages to be sent out just like Max.

 

So instead of creating a MIDI In block that's associated with a physical MIDI device, you create a MIDI In block and give it an OSC name, say remote1

 

Now, you can send OSC messages of the form

 

/remote1/NoteOn number velocity channel

 

and of course using a velocity of 0 is effectively a NoteOff event.
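Just to make that concrete, here's a tiny sketch of sending such a message from Python using the python-osc package (this is only an illustration; the address assumes a MIDI In block that was given the OSC name remote1 as described above, and the host/port are placeholders for wherever Gig Performer is listening for OSC):

# pip install python-osc
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 54344)   # placeholder: GP's OSC listening address/port

# /remote1/NoteOn number velocity channel
client.send_message("/remote1/NoteOn", [60, 100, 1])   # middle C, velocity 100, channel 1

# A velocity of 0 acts as the NoteOff for that same note.
client.send_message("/remote1/NoteOn", [60, 0, 1])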

 

 

Further, since you can associate widgets with OSC names as well, you can assign widgets to plugin parameters and then send OSC messages to the widgets to change plugin parameter values.

 

Of course you can also bypass widgets and send OSC messages directly to plugins to change parameters so you can do a ton of automation using Max or Bidule and let Gig Performer be responsible for managing the audio. That was one of our design goals by the way. GP will also send OSC data back to you as things change, which is also useful.

 

Having said all that, we don't do anything special to associate plugins with particular cores. You just instantiate the plugins you want and let the OS do whatever thread/core allocations it wants.

 

(Edit: didn't realize Nebojsa had already responded but I figure the OSC information will be of interest anyway)


Thanks to the Gig Performer devs for the info (and the coupon). Just curious - how have you (or others) handled trying to incorporate Logic's instruments into your GP setups? Are you hosting the Logic instruments in MainStage and using IAC MIDI to trigger?

 

I'm giving GP a test run (especially since my go-to piano (Pianoteq) just doesn't seem to want to play nice with the latest versions of MainStage). I'm not mega-tied to the Logic instruments, but it would be nice to throw everything into one app.

 

Thanks again - I'm excited to see a non-MainStage option!

 

-Don

"Inspiration is not a choice, it's got to search you out..." - Jason Falkner

Kurzweils, some oldie but goodie stuff from Yamaha/Korg, and soft-synths that I've barely explored.


@midiotlv

 

Logic Pro X and MainStage instruments use proprietary formats and cannot be hosted outside those applications.

 

You certainly could create a virtual MIDI device, then create a MIDI out block in GP and connect whatever MIDI in device you wanted to it. You can then set the properties in the MIDI out block in GP so that it sends a Program Change number out when you switch your rackspaces so that you change the patch in MainStage for example.

 

This should definitely work (haven't really tried it myself), but remember that you would need to run both GP and MainStage/Logic. Also - if you use GP as the master controller - make sure that in MainStage you block PC messages from external devices.
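If you want to experiment with the virtual MIDI device side of this outside of GP itself, here's a rough sketch in Python using the mido library with the python-rtmidi backend (just an illustration of the general idea, not something GP requires; the port name and program number are arbitrary). It creates a virtual output port and sends a Program Change, which is essentially what the MIDI out block would be doing when you switch rackspaces, with MainStage set to listen on that port.

# pip install mido python-rtmidi
import mido

# Create a virtual output port that other apps (e.g. MainStage) can choose as a MIDI input.
port = mido.open_output("GP to MainStage", virtual=True)

# Send Program Change 5 on MIDI channel 1 (mido channels are 0-based).
port.send(mido.Message("program_change", program=5, channel=0))

port.close()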

 

------------------------------------

Nebojsa Djogo

Co-founder, Deskew Technologies, LLC

www.gigperformer.com


@midiotlv

You certainly could create a virtual MIDI device, then create a MIDI out block in GP and connect whatever MIDI in device you wanted to it. You can then set the properties in the MIDI out block in GP so that it sends a Program Change number out when you switch your rackspaces so that you change the patch in MainStage for example.

 

I am going to try this this weekend. I have some Logic Pro X/Mainstage sounds that I really like.

I figure I could make a bare-bones concert without all kinds of screen controls, and create the few patches with the built in sounds I use.

 

Thanks for that response, as I had not thought of it before.

 

Also, GP is still running just fine on my setup. I did re-download my NI "Alicia's Keys" and am getting the same great CPU results I outlined in a previous post.

 

 

David

Gig Rig: Roland Fantom 08 | Roland Jupiter 80

 

 

 

 

 


@Reezekeys

 

Hi there ... today's processors are extremely good at distributing the load of multiple threads to cores within the processor. Trying to "mess" with that process is somewhat a "black art" as you say and in most cases, when it comes to audio, it could result in glitches.

 

Having said that - take a look at almost any plugin host and see how many threads it's running. The threads will be spread out over the cores. Many modern plugins will actually take advantage of that and use multiple cores to process the incoming audio. "Diva", for example, has a switch that allows multiple cores to be used while processing audio. It makes sense for such an intense plugin to try to spread the processing over the cores and then combine it back when it's done.

 

What I'm trying to say is that you will most likely not gain anything trying to do it yourself since all your cores will be busy anyway if the currently running processes are CPU hungry.

I hear what you're saying, but unfortunately Bidule forces one to "do it yourself." It won't spread the threads over multiple cores automatically. I don't blame them, really; since the end user is free to "wire" his or her own layout, there is no way Plogue can predict the paths that audio and MIDI will take. So everything gets put on core 0 by default. I actually used a quite involved layout on my older Core 2 Duo MacBook Pro, with everything on one core, at a 128-sample buffer, and it worked quite well. Now I'm looking to expand my layout, so I'd like to get some processing headroom by spreading the load among all the cores. It's been a slightly bumpy ride so far.

 

If you use virtual midi outputs - sure - those will automatically show up in GP even if it's running and you will be able to use them immediately as any other MIDI port. I'd be interested to see that working actually and I would also be interested in hearing what kind of MIDI processing you are doing. We are working on making some advanced midi filters/mappings and we will at some point have an entire language that you will be able to use for very complicated stuff.

Without making this post real long (it's already long, lol), I'll just say that Bidule lets you extract MIDI data and use it as a modulation source for doing things within the program. You can also scale this data or do other math on it. Since my focus is live performance, I use Bidule's MIDI processing to do things like 1) turn VIs on & off to save CPU, 2) change presets on individual Bidule modules or groups of modules, and 3) add new capabilities not usually found in MIDI keyboards or workstations, like my "multitouch" group, where I can use a single button on my keyboard controller to create four separate MIDI CC #s, or a "MIDI round-robin" group I created that transposes a single MIDI note # by half-steps each time it's received, then wraps around. This is what I really like about Bidule: I can customize my live setup to work exactly the way I want. My $350 Roland MIDI controller and Bidule can do the same things that $2K+ workstation keyboards do, and I get to design the "firmware"! Sounds like you're moving in that direction with GP. I have some (amateur) programming background, so this stuff is challenging and even fun for me.
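Just to show the "MIDI round-robin" idea in code terms, here's a minimal Python sketch of that logic (the wrap-around range of 12 half-steps is only an assumption for the example):

class RoundRobinTransposer:
    """Each time the trigger note arrives, transpose it by one more half-step, then wrap."""

    def __init__(self, trigger_note, wrap=12):    # wrap range of 12 is an assumption
        self.trigger_note = trigger_note
        self.wrap = wrap
        self.step = 0

    def process(self, note, velocity):
        """Return the (note, velocity) that should actually be sent on."""
        if note != self.trigger_note:
            return note, velocity                 # pass everything else through untouched
        out = note + self.step
        self.step = (self.step + 1) % self.wrap   # advance and wrap around
        return out, velocity


# Example: the trigger note 60 played four times comes out as 60, 61, 62, 63.
rr = RoundRobinTransposer(trigger_note=60)
print([rr.process(60, 100)[0] for _ in range(4)])   # [60, 61, 62, 63]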

 

Thanks for taking the time to respond in such detail. I'm definitely interested in investigating whether a hybrid setup with Bidule and GP could give me 100% glitch-free audio.


Re glitch free audio. If you are running Yosemite or El Cap you might like to consider upgrading to Sierra. It is my understanding that Core Audio in Sierra was substantially revised.

 

There are numerous reports from people who were experiencing intermittent audio glitches in earlier OS X versions that the glitching went away after upgrading to Sierra.

A misguided plumber attempting to entertain | MainStage 3 | Axiom 61 2nd Gen | Pianoteq | B5 | XK3c | EV ZLX 12P


Re glitch free audio. If you are running Yosemite or El Cap you might like to consider upgrading to Sierra. It is my understanding that Core Audio in Sierra was substantially revised.

 

There are numerous reports from people who were experiencing intermittent audio glitches in earlier OS X versions that the glitching went away after upgrading to Sierra.

 

 

I concur with this. I upgraded to Sierra when it came out.

My random glitches stopped, and the CPU spike with "Perform in Full screen" on my 2013 Retina stopped as well.

David

Gig Rig: Roland Fantom 08 | Roland Jupiter 80

 

 

 

 

 


I wonder if this glitching you both say was fixed with Sierra had to do with issues around Mainstage or Logic, which I can imagine Apple might have more of an interest in addressing. I'm always a little nervous upgrading OSes on my "gigging partition," but I'll look into this. I usually make a fresh partition somewhere & clean install a new OS, then add music apps & plugins one at a time to ensure everything works well together. Thanks for the tip, maybe this will help me.

Nope, the current version of MainStage, 3.2.4, came out 6 months before Sierra. NI users reported that the 'bug' had been fixed in a beta version of Sierra, and NI has acknowledged the existence of the bug in El Cap.

 

Hope it works for you.

 

A misguided plumber attempting to entertain | MainStage 3 | Axiom 61 2nd Gen | Pianoteq | B5 | XK3c | EV ZLX 12P


Check that all your plugins are Sierra compatible before upgrading.

 

Anything that tries to install into system folders will not install, and disabling SIP in Sierra does not work for many legacy plugins.

A misguided plumber attempting to entertain | MainStage 3 | Axiom 61 2nd Gen | Pianoteq | B5 | XK3c | EV ZLX 12P


I'm going to do some more scientific testing this weekend, but here is what I found from some playing around last night.

 

I loaded up both MainStage (MS) and Gig Performer (GP) with Pianoteq 5, Falcon (a DX EP), Kontakt (orchestral strings), and the OPX-Pro II plug-in (Van Halen's "Jump" sound). [Note: this isn't a typical setup for me, but I wanted to try to stress the software packages a bit.]

 

Once I had everything going, I saved both projects and tried each one on its own. In GP, the maximum CPU I could force was 60% (on a mid-2015 MacBook Pro). This was holding the sustain pedal down and playing glissandos up and down the keyboard (basically just mashing as many keys as possible). I never had a single crackle or pop, and the CPU was hitting the 40s most of the time. The 'resting' CPU was much lower (between 2 and 7%).

 

Over to MainStage... It did better than I thought it would (no true system overloads reported). However, the 'resting' CPU was in the 30-40% range, and when I started playing, the CPU shot up to the 90-112% range (I love using > 100%, haha). I did have one or two crackles, but that seemed to go away once I disabled Wi-Fi.

 

So it appears that I saw a significant reduction in CPU when using GP over MS (on my most processor-intensive plug-ins!). Kudos to GP for being that much more efficient!

 

Some other findings to balance this (otherwise very positive) review:

 

1. I had lots of issues with the HALion Symphony Orchestra plug-in inside of GP (I had just downloaded the trial version to kick the tires). It would let me add it to the canvas, but it was 'spotty' -- sometimes it wouldn't play, sometimes it was off by a semitone (my guess is a difference in sample rates), and sometimes it would play just fine. However, I could never switch sounds in the plug-in. When I clicked on HALion's drop-down menu to change sounds, the menu actually appeared in the upper right-hand corner of my screen (not where it should be) and I couldn't actually change anything [as soon as I clicked on anything, the menu vanished and did not affect the plug-in]. I restarted GP many times but could never get it to work. I know this plug-in is older, but if you are a heavy HALion library user, be sure to test your stuff with the trial.

 

2. The lack of MIDI filtering/CC mapping hurts GP. I never realized how much I used that stuff in MS until I couldn't. I went to turn off the sustain pedal for a Falcon pad and realized I couldn't. I think I read in an earlier post that this functionality is coming soon.

 

3. While layering was a breeze, creating splits wasn't as straightforward as in MS (I ended up creating multiple instances of my keyboard with different key ranges). I didn't see how to easily create velocity layers (i.e., triggering sounds only in a certain velocity range) -- but I don't know if that is a limitation of GP or I'm just not seeing where to do it.

 

4. I tried to use Apple's file player as a substitute for MS's playback engine, but it won't work unless the plug-in is in edit mode. GP hasn't advertised file playback, but it would be cool for those of us who sometimes use a backing track.

 

So, in summary, I was very pleased with my initial testing. GP was rock solid and the CPU really stayed low (especially when I ran a more typical setup like Pianoteq and a Falcon pad). It was impressive enough that I think I will end up using it for most of my piano-centered gigs. I'm going to do some more testing before I pull the trigger, but so far it lives up to the marketing. It really needs some MIDI CC filtering/mapping ability (which, again, is reported to be coming soon) and support for velocity switching, if that isn't already there and I just missed it. The UI is very clean and easy to use, as well.

 

 

 

 

"Inspiration is not a choice, it's got to search you out..." - Jason Falkner

Kurzweils, some oldie but goodie stuff from Yamaha/Korg, and soft-synths that I've barely explored.


Hey, thanks for the interesting feedback on your experience with Gig Performer. Regarding MIDI Filtering, I'm probably not going to get into trouble for posting this image of our Midi Filter plugin which will indeed be in the next update, probably within a week or two ('cos I'm going skiing for the holiday weekend tomorrow!) along with a couple of other nice new features which we just need to finish testing.

 

http://i.imgur.com/14ah2gz.png


posting this image of our Midi Filter plugin which will indeed be in the next update

Looks nice.

 

Suggestion: Scrolling through all 127 might be a nuisance sometimes. Maybe you could have a View pop-up with "show/hide" checkboxes, where you could filter the list itself, i.e. specify that you do (or do not) want to see Allowed, Blocked, or Redirected parameters. Then the user could probably easily view, for example, all his blocked and redirected parameters at once (without scrolling).

Maybe this is the best place for a shameless plug! Our now not-so-new new video at https://youtu.be/3ZRC3b4p4EI is a 40 minute adaptation of T. S. Eliot's "Prufrock" - check it out! And hopefully I'll have something new here this year. ;-)


We need to get some user experience with things like this. Using my trackpad, for example, it's pretty easy to scroll through the list fast and the color scheme lets modified (blocked or mapped) parameters be noticed pretty quickly.

 

But I understand the benefit of filtering lists - we do that already with the quick plugin selector where instead of scrolling through hundreds of plugins (yeah, I'm one of those who never met a plugin I didn't want!) you can just type a few characters and just see plugins matching your typed string.

 

I'll put your suggestion on our list. When it comes to workflow, we have many things we want to do to make the creation and editing of rackspaces much faster but as I keep telling people, if we waited until we had everything we wanted, Gig Performer wouldn't come out for another 5 years (and of course at that point, we'd still want more)


@reezekeys I do the identical kinds of MIDI transformations with Max. I used to host my plugins in it as well, but I really wanted a standalone plugin host environment for audio. So it's not that we're "moving" in that direction; we're actually there!

 

I've already been on tour with Max as my MIDI processor and Gig Performer to handle all audio chores. I connect them using OSC (although you can certainly use virtual MIDI) and the whole setup "feels" much more robust than when I was just using Max for everything.

 

 


Are you only sending MIDI between Max and GP? In that case I'm curious as to why you'd choose OSC over plain MIDI. Creating virtual MIDI ports is so easy (a one-time command in Bidule; you can make up to 64 ports), and they automatically show up in all other MIDI apps. With OSC you have to keep track of addresses for everything, right? It seems like a lot more work to me, but maybe I'm not understanding how you're using it.

 

I am using OSC (in Bidule), but not for anything midi. I use MOTU's CueMix FX software to control my audio interface's built-in DSP mixer (it's a MOTU Microbook IIc). CueMix FX responds to OSC commands. I have a few knobs on my Roland keyboard controller mapped to Bidule "variable" objects that send their values over OSC to CueMix FX. That's how I control my in-ears mix while I play.


Yes, I could trivially create MIDI ports in Max, or indeed directly in OS X via the Audio MIDI Setup app, but OSC is way more powerful, and with proper abstractions (Bidule calls them groups but I don't know how they compare with Max abstractions) it's much easier to manage. For example, consider the following short Max patcher that I created in about 30 seconds

 

[Image: Max patcher screenshot (Aw581Zd.png)]

 

I've got three plugins which I can address directly (and I can see what they are: piano, organ, and moog). The actual OSC addressing is handled automatically by the abstraction; I don't have to keep track of that stuff at all, and I don't have to remember which MIDI port/channel is going to which plugin. BottomLeftCore represents the bottom keyboard on my left in my live rig. It receives MIDI messages from that keyboard and sends them to the GigRackSynthRx objects, where they're internally converted to OSC messages on the fly and sent to Gig Performer.

 

But not only can I send standard MIDI messages to those plugins over OSC, I can also send VST parameter changes to them directly, with no need to deal with mapping MIDI to plugin parameters. You can also associate Gig Performer widgets directly with plugin parameters and then drive them from both physical MIDI devices AND from OSC. The nice thing about the latter is that this all synchronizes with Lemur or TouchOSC.

 

Further, our own plugins can also receive OSC messages directly. For example, that new Midi Filter plugin can accept OSC messages to ALLOW/BLOCK/MAP parameters programmatically.

 

Of course Gig Performer itself can change rackspaces or variations through OSC messages. A message like /GigPerformer/SwitchToRackSpace "Lead A Perfect Life" is way better than having to remember that the song "Lead A Perfect Life" is Program Change 42 in Bank 3 etc.
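As a concrete illustration of that last point, sending such a message from any OSC-capable tool is a one-liner. Here it is again with the python-osc package, the host and port being placeholders for wherever Gig Performer is listening:

# pip install python-osc
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 54344)   # placeholder address/port

# Switch rackspaces by name instead of remembering bank/program numbers.
client.send_message("/GigPerformer/SwitchToRackSpace", "Lead A Perfect Life")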

 

Now, before I put anyone off by making them think that this is getting too complicated, let me point out that you don't HAVE to do this. As I'm sure you've already discovered, you can just fire up Gig Performer, connect a few plugins together, associate some knobs and buttons with your MIDI controllers, and off you go. Most users will just do this kind of thing and will be up and running in a few minutes; making the product really easy for non-techy musicians who just want to play was a core goal.

 

But if you happen to be technical, you CAN do some very interesting things by using such tools as Max or Bidule and in my case, doing everything with OSC was way more convenient than creating lots of MIDI ports.


Hey, thanks for the interesting feedback on your experience with Gig Performer. Regarding MIDI Filtering, I'm probably not going to get into trouble for posting this image of our Midi Filter plugin which will indeed be in the next update, probably within a week or two ('cos I'm going skiing for the holiday weekend tomorrow!) along with a couple of other nice new features which we just need to finish testing.

 

http://i.imgur.com/14ah2gz.png

 

I'd suggest adding to the pitch bend filter a slider from 0-100 that thins out pitch bend data, rather than just an on/off. Pitch bend sends a lot of superfluous messages that just clog the MIDI stream.
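For what it's worth, here's a rough sketch (mine, just to illustrate the idea, not anything in GP) of one common way such a 0-100 thinning control could work: only pass a bend message through if it has moved by at least some threshold since the last one that was sent.

class PitchBendThinner:
    """Pass a pitch bend value only if it moved far enough since the last one sent."""

    def __init__(self, amount):
        # Map the 0-100 "amount" to a minimum-change threshold over the 14-bit bend range.
        # The mapping (amount * 512 / 100) is an arbitrary choice for this sketch.
        self.threshold = int(amount / 100 * 512)
        self.last_sent = None

    def should_send(self, bend_value):            # bend_value: 0..16383
        if self.last_sent is None or abs(bend_value - self.last_sent) >= self.threshold:
            self.last_sent = bend_value
            return True
        return False


thinner = PitchBendThinner(amount=50)             # threshold of 256 steps
print([thinner.should_send(v) for v in (8192, 8300, 8500, 8510)])   # [True, False, True, False]

A real implementation would also want to make sure the final resting value always gets through.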

The fact there's a Highway To Hell and only a Stairway To Heaven says a lot about anticipated traffic numbers

 

People only say "It's a free country" when they're doing something shitty-Demetri Martin

 


Have you actually run into a problem with the MIDI stream being clogged on a modern computer in a live situation due to too many pitchbend messages? I get the concern on older machines with slower clocks and less RAM, running sequencers and sending MIDI data to multiple synths on different channels over a single MIDI cable at 31.25 kbaud, but when you're using a modern computer, where you can trivially print MIDI messages to a terminal window (that's a slow operation!) faster than the rate they arrive, I'm not sure it's a real issue; it may be more of a "premature optimization".

 

On a physical controller, when you move a pitchbend wheel or a knob sending CC data, it's not sending every single value out anyway.

 

So thinning continuous data is in our tracking system (every suggestion/idea we have ever had or heard about is in it!) but I'm not sure it needs to be high priority at this time.

 

 



For example, consider the following short Max patcher that I created in about 30 seconds

Thanks, I stand corrected: it seems like working with OSC in Max is a hell of a lot easier than in Bidule! There, you need to create "groups" and rename them to the OSC address. Not hard to do once you understand things, but it is time consuming; often you're nesting groups inside of groups to do it.

 

I assume you're actually playing these plugins in real time by sending MIDI notes via OSC, right? I'll also assume that the latency vs. using virtual MIDI ports is the same, or maybe better? (Not that I notice any issues with my current Bidule setup, but I am curious.)


As a competitor in the music software business, wearing my Deskew Technologies' hat, it's unclear how appropriate (or not) it is for me to comment on other products.

 

However, if I wear my educator's hat (among other things, I teach computer science at SUNY Purchase including the new "Introduction to Programming with Max" course that I created there) I would tell you that Max is a programming language aimed at a much larger audience than just music (people do real time video processing, dance, museum installations, robotics, software radios, internet apps and so forth) and has a lot of support for important comp sci principles such as proper abstraction, parameter passing, messages and so forth. People also create Max objects in C, Javascript, Java, Python and other traditional languages so there is huge support out there.

 

My guess is that Max abstractions are probably more general than Bidule groups but I'm open to being corrected on that.

 

There are numerous books available for Max users as well, one excellent one is Electronic Music and Sound Design - Theory and Practice with Max 7 and it's well worth reading.

 

I am indeed playing the plugins in real time by sending MIDI notes via OSC messages. However I should point out that both Gig Performer and Max are running on the same computer. OSC itself uses UDP packets to send data over the network and it is the nature of UDP that packets can be lost. This is fine for such things as continuous data but I have seen single packets get lost when using OSC wirelessly so I wouldn't use OSC for MIDI over a wireless connection.

 

I don't know the latency of OSC vs. virtual MIDI ports; all I can say is that as a keyboard player, even when playing very percussive sounds, I have never "felt" any softness or delay. I generally run GP at 44.1 kHz with a 64-sample buffer size.
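For reference, a 64-sample buffer at 44.1 kHz works out to 64 / 44100 ≈ 1.45 ms per buffer, so the buffer itself adds only a millisecond or two of latency (driver and converter latency comes on top of that).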


A few quick, basic questions. Should I be using a separate rackspace for each sound and its effects? I use NI pianos as well as Pianoteq pianos, plus Pianoteq Rhodes and Scarbee Rhodes. So should I have a different rackspace for each of the different pianos and EPs? Additionally, so far my CPU is resting fairly high; however, I am using AmpliTube and PSP L'Otary, which are very CPU intensive. Finally, is there a way to use one instance of an amp model or one instance of a reverb effect for every sound, or do I need to have a separate one in each rack for each sound I use the effects on?
Korg SV2, Nord Electro 5D, Gigperformer/lots of VSTs

As a competitor in the music software business, wearing my Deskew Technologies' hat, it's unclear how appropriate (or not) it is for me to comment on other products.

It's not my call, but I don't see an issue here: you've stated your affiliation plainly (how much plainer could your handle be, lol), all you're doing is answering my innocent question of why you use OSC instead of MIDI, and you didn't use that as an excuse to tout your program. It was actually pretty enlightening, since while I knew OSC could transfer MIDI data, I never imagined it being used to play notes in real time; I always thought of it as a way to send CCs or other control information.

 

However, if I wear my educator's hat (among other things, I teach computer science at SUNY Purchase including the new "Introduction to Programming with Max" course that I created there)

Small world: I lived a few minutes down the street from there in Port Chester (now I'm about 30 minutes north), and I know a few of the jazz faculty; one of them recorded a CD at my home studio a few years ago.

 

I would tell you that Max is a programming language aimed at a much larger audience than just music (people do real time video processing, dance, museum installations, robotics, software radios, internet apps and so forth) and has a lot of support for important comp sci principles such as proper abstraction, parameter passing, messages and so forth. People also create Max objects in C, Javascript, Java, Python and other traditional languages so there is huge support out there.

I wish there were more than 24 hours in a day (or I was young and able to take time off to take your course!). Programming experience helps with Bidule but it sounds like you can take things much farther with Max. Bidule has enough generality that I can usually put together anything I need, but I'm only a piano player so I don't need too much! :)

 

My guess is that Max abstractions are probably more general than Bidule groups but I'm open to being corrected on that.

Your guess is probably correct. Bidule groups are a convenience feature with a few frills. If you use the same arrangement of Bidule modules you can save them as a "group", available anytime you need. You can also construct simple GUIs for them that open when you double-click on the group icon.

 

There are numerous books available for Max users as well, one excellent one is Electronic Music and Sound Design - Theory and Practice with Max 7 and it's well worth reading.

I'll look for that book, thanks!

 

[edit: ouch, that book is a little pricey - but I guess that's par for the course with textbooks!]


Archived

This topic is now archived and is closed to further replies.
