
Synths - All Grown Up and Nowhere to Go?


felix.


Has the synthesizer, as a musical instrument, reached maturity?

 

It seems to me that musical instruments undergo an early period of fast-paced change and innovation, followed by a maturation of sorts, where some tried-and-tested (and economically feasible) configurations of the instrument prevail. This happened long ago with pianos and traditional instruments (brass, woodwinds, etc.). More recently, it happened with guitars.

 

Are we going to see any big changes for synths, or just refinement and progression of what's already there? It's been a long time since the first half of the '80s, when I think the most innovation was happening.





I think we are entering the first real golden age since the early-'80s programmable polyphonic analogs. Instruments like the Nord Modular Rack, Hartmann Neuron, and Roland V-Synth are really opening things up in the hardware world. Even the cheapest ROMpler has a great modulation matrix. Knobs and sliders are back in. It seems like they are giving keyboards back to the players and programmers.

 

Robert

This post edited for speling.

My Sweetwater Gear Exchange Page


VAs have been around for a while, as (of course) have the analogs they emulate. And the Neuron & V-Synth do open up new avenues of sound-mangling for fun and profit. :)

 

Sample-based synths have indeed reached maturity. How much further can they go? They could sample at more and more volume levels; perhaps one day we'll see Yamaha's 'octuple-strike' piano. But I'm finding the limitations of sample-based emulation in the intro to "Cross-Eyed Mary". A real flute has an initial 'breath' sound on the first note of a legato run (and on repeated notes), while that breath does not exist in any following notes of the run. In order to recreate that, I have to assign a breath flute to one velocity layer and a non-breath flute to another layer - and then hope my playing can be that precise. And in the process, dynamics become inextricably linked to the layers. I cannot for instance play a loud legato passage; it will sound like a machine-gun flute run.
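The layer workaround described above can be sketched in a few lines (a hypothetical illustration in Python - the layer names and velocity threshold are invented, not from any real sample library - but it shows why dynamics and articulation end up welded together):

```python
# Hypothetical velocity-layer switch: the breath sample occupies the top
# velocity layer, the plain sustain the bottom one.

def pick_flute_sample(velocity, breath_threshold=100):
    """Map a MIDI velocity (1-127) to a sample layer name."""
    if not 1 <= velocity <= 127:
        raise ValueError("MIDI velocity must be 1-127")
    if velocity >= breath_threshold:
        return "flute_breath_attack"   # only reachable by striking hard
    return "flute_plain_sustain"       # only reachable by striking soft

# The coupling problem: every note of a loud legato run clears the
# threshold, so every note gets the breath - the machine-gun flute.
loud_run = [112, 115, 110, 118]
print([pick_flute_sample(v) for v in loud_run])
```

Decoupling articulation from loudness would take a second control source rather than velocity alone, which is exactly the limitation complained about here.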

 

So in order to better emulate acoustic instruments (and let's face it, this is one of our tasks), what do we do? At what point do manufacturers (and players) begin to realize that the next big evolutionary step is physical modeling? THAT is where we need to go.

I used to think I was Libertarian. Until I saw their platform; now I know I'm no more Libertarian than I am RepubliCrat or neoCON or Liberal or Socialist.

 

This ain't no track meet; this is football.


There's still a long way to go. Increases in processor power and, in the case of sample-based synths, storage will continue to create new possibilities: bigger and better sample sets; new types of synthesis that haven't been discovered yet; unusual types of synthesis such as additive, vector, and formant; real analog circuitry; and combinations of these things in one synth. The guitar is lower-tech, so it could be expected to mature sooner, but even it has seen some dramatic advances with amp modeling and the VG8, and there's no saying there isn't more in store.

I agree physical modeling is the way to go. Just look at all the hardware and software doing VA modeling, and even physical modeling of acoustic instruments (e.g. the Korg OASYS engine, or the Yamaha VL engine). I'm always hoping Korg comes out with a "Z2", basically an update of the Z1 using the improved and expanded algorithms found on their OASYS soundcard! And even FM is making a big comeback; now if only Yamaha would see the light and re-issue a DX7 with modern specifications. It would be sort of like an FS1R, but with higher specs, a keyboard, and a *much* improved interface.

I would be disappointed if advancements in modeling were just to facilitate synthesizers modeling acoustic instruments.

 

I'd be hopeful that future progress for synthesizers would be to advance them as their own special kind of instrument instead.

 

But better modelers and more powerful ROMplers are just variations on a theme IMO. Are we really going to see anything completely different?


Looking into my crystal ball, my prediction is for more of a synergy between computer software and today's hardware.

 

This might include: interchangeable synth engines a la Plugzilla or Noah; fully user-customizable synth options and samples; easier editing and better user interfaces; new and improved algorithms for sound generation; easier upgradability; the ability to run several different types of synthesis simultaneously, each sound with its own effects; and the ability to capture live performance data for later "tweaking"... all at a competitive price.

 

Wouldn't that be nice..

Tom F.

"It is what it is."


Ummm... what exactly would you 'model' to generate an original instrument?

 

Modeling, by definition, requires a subject to model. Who knows, perhaps technologies developed for modeling may be applied elsewhere. But just as sampling is an existing low-rent way to poorly emulate acoustic instruments, modeling could be a better way. Anyone who's ever heard a velocity switch knows sampling has fallen far short of that particular goal.

 

So modeling could become the means to the realistic emulation goal which, as byproduct, also creates something new. For lack of something revolutionary, something evolutionary will have to do. And IMO that evolution is modeling, since it's not been applied to most instruments as of yet - sampling is truly "all grown up with nowhere to go".

 

Originally posted by felix the scary black cat:

I would be disappointed if advancements in modeling were just to facilitate synthesizers modeling acoustic instruments. I'd be hopeful that future progress for synthesizers would be to advance them as their own special kind of instrument instead. But better modelers and more powerful ROMplers are just variations on a theme IMO. Are we really going to see anything completely different?

I used to think I was Libertarian. Until I saw their platform; now I know I'm no more Libertarian than I am RepubliCrat or neoCON or Liberal or Socialist.

 

This ain't no track meet; this is football.


Lots of things other than acoustic instruments can be modeled, and I hope people find innovative ways to use modeling technologies toward innovative sound design techniques.

 

So we have Samplers, ROMplers, and Modelers all at an evolutionary, rather than revolutionary stage. So our best bet for revolution is in new forms of synthesis.


Play around with Tassman 3 for a while and you will begin to understand the possibilities of modeling. It is nice to mix and match physical elements into instruments that do not exist. The tricky part is finding a sweet spot that gives you control to vary a sound without hitting unusable extremes.

 

Robert

This post edited for speling.

My Sweetwater Gear Exchange Page


Additive, ladies and gents, additive.

 

When CPU horsepower can handle the load (I'm predicting 5 years), true additive synthesis, with a touch of modeling, should dominate emulative synthesis. But we have a ways to go.

"For instance" is not proof.

 


Many of the responses in this thread have addressed the engine inside the synth, but not the physicality of the synth itself, and that is what came to my mind when I read the question:

 

Has the synthesizer, as a musical instrument, reached maturity?
Musical instrument...hmm...on that level, I think synthesizers are by-and-large still in their infancy, but part of that is because synth players are not expecting more.

 

There have been some wonderful control devices over the years, and keyboards with wonderful feel and response...but the public at large does not seem willing to buy, explore or master an instrument that is more than a row of velocity-sensitive switches with a complement of real-time controllers in the forms of knobs, sliders, wheels, joystick, pad, etc.

 

Until this happens, all the improvements to the engine inside are going to be hampered to a large degree.

Go tell someone you love that you love them.

The morphology of the synthesizer is intrinsically elaborated out of the principle of mutability. In this, it is unique among instruments.

 

Unless that changes, synthesizers will continue to be ever-changing.

 

I don't see us as anywhere _near_ done with interface, performative expressive control, synthesis science and theory implementation, or with recombinant possibilities derived from the atomic elements of sound and music.

 

The crest we appear to be hung up on for the moment is sample-based synthesis, and subtractive synthesis. The limitation there is not so much in the ability to implement alternatives, but in the fact that we still haven't solved the "DX7 patch programming" problem for a wide variety of synthesis techniques. A level of understanding is needed to make many of the synthesis techniques that haven't yet gotten full play more available to the musician for performative engagement. There's a general lack of understanding of the fundamentals of sound that should be part of anyone's musical education. It's a rare bird, even here, who can attack the problem of sound design from a detailed, knowledgeable perspective on, and mastery of, such fundamentals.

 

I'd say education is the biggest lack at this point, and ignorance the pervasive result.

 

But then again, that's the case across the board for the past thirty years, not just with synthesis. ;)

 

rt


A couple of quick points:

 

- I'll tend to consider the synthesizer a mature instrument when I hear a good implementation of additive resynthesis, complete with separation of the noise components and full editing, like access to the pitch envelope of every harmonic, etc.

 

- Wendy Carlos said that physical modeling is a 'cul-de-sac'. I couldn't agree more. It can sound very good, but conceptually and creatively, I look forward to additive. Folks, please, I'm getting older! :D
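The kind of per-harmonic control being asked for here is simple to state in code, even if it was expensive to compute in real time at the time of writing. A minimal Python sketch of one additive voice, with an independent amplitude and pitch value per partial (names and numbers are illustrative; a full resynthesis engine would also handle the separated noise component):

```python
import math

def additive_sample(partial_amps, partial_detunes, f0, t):
    """One output sample of an additive voice at time t (seconds).

    partial_amps[k] and partial_detunes[k] stand in for the per-harmonic
    amplitude and pitch envelopes, already evaluated at time t.
    """
    out = 0.0
    for k, (amp, detune) in enumerate(zip(partial_amps, partial_detunes), start=1):
        freq = f0 * k * detune  # each partial carries its own pitch offset
        out += amp * math.sin(2 * math.pi * freq * t)
    return out

# Fundamental only, a quarter of the way through a 1 Hz cycle: the peak.
peak = additive_sample([1.0], [1.0], 1.0, 0.25)
```

The cost is clear from the loop: editing (and computing) a separate amplitude and pitch trajectory for every harmonic of every voice is exactly the CPU load the thread keeps returning to.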


I'm perceiving two (or more) different threads here.

 

First is the synth interface thread. There are many different interfaces generally based on existing instruments such as drums, sax, guitar, and of course mostly keyboard. I'd imagine there could be many others. One possibility is to expand the D-beam idea so that it works like a Theremin, or even so it envelops you in a total 3D performance space. Every body movement, every twitch and eyeblink would result in sound production.

 

That aside, there's a good reason synth interfaces mostly look like other, long-existing instruments. Those other instruments allow our fingers, our arms, our legs/feet, our mouths/breath to express ourselves musically. They are OUR interface to the world. Frankly, it'll be very difficult to create brand-new interfaces or even to push existing interfaces much beyond where they are - the limitation being not machinery but our own dexterity. I'm a keyboardist who happens to play synthesizer. There are guitarists who play synth, drummers, sax players - all on something that approaches the interface they are familiar with. Perhaps one day there will be the 'synthesist' on a unique interface....

 

Next discussion is the actual technology on the other side of the interface. It is not necessary for a sax player to understand the physics behind the sound he produces. All he need know are the techniques to express himself. So he may remain 'ignorant' yet be a great player.

 

I consider synth sound design and performance two different arenas. They naturally have some overlap, but the player/performer need not have full understanding of the physics involved in order to perform. This might even tend to limit the technologies which emerge, and the use of those technologies. Remember, the vast majority of DX7 users wanted nothing more than a portable electronic piano-type thing. It was capable of much more, but how many of the sounds created on it were considered musically useful? Now look at neural, additive, resynthesis, etc. All cool ideas, and I'd encourage their development. But will they enable the average musician to produce something useful with minimal training? Subtractive synthesis didn't become a common tool until the Minimoog stripped it down to essentials, and made it easy to use. Any synth technology which cannot do that will likely remain "in the lab".

I used to think I was Libertarian. Until I saw their platform; now I know I'm no more Libertarian than I am RepubliCrat or neoCON or Liberal or Socialist.

 

This ain't no track meet; this is football.


Originally posted by coyote:

That aside, there's a good reason synth interfaces mostly look like other, long-existing instruments. Those other instruments allow our fingers, our arms, our legs/feet, our mouths/breath to express ourselves musically. They are OUR interface to the world. Frankly, it'll be very difficult to create brand-new interfaces or even to push existing interfaces much beyond where they are - the limitation being not machinery but our own dexterity. I'm a keyboardist who happens to play synthesizer. There are guitarists who play synth, drummers, sax players - all on something that approaches the interface they are familiar with. Perhaps one day there will be the 'synthesist' on a unique interface....

Yes! I wholeheartedly wish for a "new" interface for playing synths.

 

My take on it is that the keyboard interface is a great one, but I don't see it as "connected" to the sound of synthesizers the way, say, a guitar is to its sound.

 

For instance, a guitarist can bend notes, pick in different areas of the strings, and use numerous other playing techniques to vary his performance (and hence "musicality") at will, and immediately. It's a very intimate playing interface, as are many other acoustic instruments. These instruments invite an individual's creativity to shape what is being played - in real time. The guitar, for instance (again), is such a powerful physical interface that if you had different guitarists play the same piece of music on the same exact setup, each player would still probably sound very different. It may even be possible to pick out who the guitarist is without seeing them, just by hearing the expression they add via their interaction with the instrument. The same is true with keyboards, but to a much lesser degree, I think.

 

With keyboard synths, the musician is "most often" locked into a non-moveable instrument. This forces a person to sit or stand in one spot to play it. That one aspect has limitations in sparking creativity imo. Not that I want to run around my house when I play, but I think you know what I mean. ;) Music can be very moving, both emotionally and physically. I feel that being able to move around can have positive impact on a performance.

 

Keyboard synth performance interfaces have improved massively over the past 20 years though, I will say that. From keyboard velocity, to the various pads, wheels, etc. They are nice additions, yet I still don't feel that the instrument lends itself to being manipulated as expressively as a guitar ("...again with the guitar!" ;) ). Yes, it can be (and is) done, but not as easily, imo.

 

As an example, aftertouch is very nice in theory, but in real life it proves largely (not totally) unusable for precise control. I mostly use it for adding some vague amount of vibrato while playing monophonically, or for coarse changes in the tonality of certain patches. With some synths that I own, the aftertouch response mostly acts as an on/off control. It's impossible to play the keys and control the aftertouch in any fashion other than min/max, so I don't even use it on those synths.

 

Other performance controllers are nice, like various ribbons and x/y pads; however, their usability is sometimes limited by the location/placement of the controls. The pad on the Minimoog Voyager is an example of this.

 

I think it would be neat to have some type of synth controller like the SynthAxe that Allan Holdsworth used to use. It had guitar-like strings that could be picked, tapped, strummed, and bent. There were also "keys" that would allow you to play other sounds (apart from the sounds triggered by the strings). It also had controls to modify the sound as you played, some of which were pressure-sensitive. Cool stuff!

 

That's just one example though. I can envision an electronic instrument that would let you hold it and articulate every note independently with ease. Maybe even something that would let you apply pressure to various parts of the body of the instrument, and would also sense your hand's orientation above the point of contact. Press on it in one place to vary, say, an osc sync effect. Then, while still maintaining the pressure for that effect, slightly rotate the same hand to introduce proportional vibrato or any other effect on the sound you can think of. Maybe even have an area on the body of this instrument that you could hit or tap with your palm or fingertips, which would respond like a drum head but induce a percussive variation of any synth parameter(s). That way you could just hit the thing to add accents to a synth line or chords, or whatever. Maybe use it to kick in a formant filter and sweep the formants with an AR envelope that responds in proportion to how hard you hit the virtual pad. To take the idea further (possibly too far), how about having the whole instrument respond to a performer trying to bend the instrument itself? The whole thing could be encased in one huge piezo-electric sensor! ;)

 

Heck, I can remember when I was first learning how to play synths/keyboards as a young teen. In the very first band I was in (playing a Moog Satellite, SH-1, RS-505), I would sometimes instinctively move my fingers left to right on a key while playing a melodic line. In my unlearned brain, I felt this was a natural movement of my hand to impart vibrato. Later in life, I read that some keyboard instruments have had features like this. Where the heck are they? Bring 'em back, I say!!!! :D

 

Many of these ideas could probably be done using various controllers on the market today, but I think if an instrument were designed that was more integrated and more responsive to the way a person would NATURALLY interact with an acoustic instrument, it would be pretty cool....

 

Oh...and if any manufacturers are reading this, I accept royalties! My email addy is in my profile. :D


I do think the interface needs to improve. As does the brain. Synths have certain limitations due to their history.

 

Take Coyote's example of sample retriggering on each key press. I understand why a ROMpler replays the entire sample (the attack is sampled in), but analog, VA, and FM synths have little control over this too. We typically have a single- versus multi-triggering switch. Instead, we should be able to trigger each envelope via a key press OR a controller such as a ribbon. Imagine being able to articulate a "wah" filter envelope each time you depress a ribbon, while holding down a note. Most synths don't allow this, even though it would be useful. The reason: when synths were young, this kind of flexibility required huge amounts of hardware, and we have never revisited the issue in the age of software control.
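A minimal sketch of that idea (hypothetical Python; the class name, the linear attack/decay shape, and the times are all invented): the envelope's trigger is just a method call, so a ribbon press can restart it mid-note exactly as a note-on would.

```python
class RetriggerableEnvelope:
    """Attack/decay envelope that any event source may retrigger."""

    def __init__(self, attack=0.01, decay=0.2):
        self.attack, self.decay = attack, decay
        self.t0 = None                       # time of the last trigger

    def trigger(self, now):
        """Restart the envelope - from a note-on OR a ribbon press."""
        self.t0 = now

    def value(self, now):
        """Current envelope level in [0, 1]."""
        if self.t0 is None:
            return 0.0
        t = now - self.t0
        if t < self.attack:                  # linear rise to 1.0
            return t / self.attack
        return max(0.0, 1.0 - (t - self.attack) / self.decay)

env = RetriggerableEnvelope()
env.trigger(0.0)    # key goes down: normal filter-envelope attack
env.trigger(0.5)    # ribbon pressed while the note is held: fresh "wah"
```

Nothing in the class knows or cares which physical gesture called `trigger`; that decoupling of trigger source from envelope is the whole point of the post.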

 

Similarly: the dedication of portamento (lag) to pitch, the unique treatment of pitch-bend (MIDI) information, the discrete treatment of oscillator pitch (a stepped coarse tune and a smooth fine tune), the 128-step MIDI controllers, the dedication of velocity to the note being played, the routing of effects to the end of the signal chain, etc.

 

I realize that the software environment that includes more flexibility in these areas is like a modular synth. But synth interfaces need to be more customizable if we are to have more expressive things to say.

 

Jerry


This is not necessarily true. Look back at the flute example I posed earlier in the thread. A "breath" on the first note of a legato run, but on none following. The same happens on all other wind instruments, and something similar happens on plucked and bowed strings. If we model that feature, we don't need to assign it to a ribbon - all we need do is pause briefly, as the instrumentalist would. And that feature could then be brought to ANY synthesis technology.

 

(It already exists somewhat in legato mono synth playing, except that the envelope continues to decay as the notes are played. Strangely the device which comes closest in spirit is the Hammond organ with its multi-triggered yet monophonic percussion. And what technology has gotten closest to the Hammond? You guessed it: modeling.)
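Under a modeling approach, the rule reads like a one-line decision: breath on a detached or repeated note, never mid-slur. A hypothetical sketch (the 30 ms gap threshold is an invented value):

```python
LEGATO_GAP = 0.03   # seconds; shorter gaps are treated as slurred

def needs_breath(note_on_time, prev_note_off_time, same_pitch):
    """True when the modeled flute should add its breath transient."""
    if prev_note_off_time is None:          # first note of the phrase
        return True
    gap = note_on_time - prev_note_off_time
    return same_pitch or gap >= LEGATO_GAP  # repeated note, or a real pause
```

The point of the post holds: because this rule lives inside the engine, the player invokes it purely by phrasing, with no extra controller and no velocity-layer trickery.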

 

Originally posted by Jerry Aiyathurai:

But synth interfaces need to be more customizable if we are to have more expressive things to say.

Jerry

If you could wiggle the keys back and forth to produce controlled vibrato... nah, use a guitar midi interface if you want that.

I used to think I was Libertarian. Until I saw their platform; now I know I'm no more Libertarian than I am RepubliCrat or neoCON or Liberal or Socialist.

 

This ain't no track meet; this is football.


Ok, Now I'm confused. :confused:

 

Originally posted by coyote:

This is not necessarily true. Look back at the flute example I posed earlier in the thread. A "breath" on the first note of a legato run, but on none following.

Do you mean a breathy attack (or chiff)? Or do you mean a pause (breath)? If you mean a chiff, and the keyboard's legato-staccato touch is insufficient, then your situation appears similar to that of someone who just wants to insert a wah (or chiff or grunt or other articulation) based on a trigger that is separate from the keyboard.

 

Or maybe I'm missing something. :freak:

 

Originally posted by coyote:

If you could wiggle the keys back and forth to produce controlled vibrato ... nah, use a guitar midi interface if you want that.

Actually I really love the ribbons we have for vibrato. :thu: That's not my quibble. Though I understand what you mean about it not being the same as a guitar. Personally it's the limited interface we have to control timbre that's occasionally frustrating to me. I understand that it may not be so for you.

 

Cheers,

 

Jerry


I suppose I'm rehashing what the others have said above, but I'm always happy to hear myself type. ;)

 

There is a problem with demanding more synthesis types: the more arcane they are, the more difficult they are to program to get a musically useful sound, or the sound you're aiming to create. As a synthesist I'm always looking for more soundmaking capacity to broaden my palette, but the further you get from traditional synthesis structures (i.e. the Minimoog's osc-filter-amp), or the more arcane the elements you include, as with the Kurzweil, Absynth, or Nord Modular, the more lost we become wrangling with them. I do agree with everyone that this "arcanery" should continue, even if the programmer has only the foggiest idea what they're doing. The engineers driving the continuing evolution of today's synthesizers are edging forward cautiously, and they have a daunting task: they have to give us wild new modes of sound generation without losing us along the way. In the case of FM or additive synthesis, there's not much they can do besides slap a resonant filter on it to give us some familiar control. With physical modelling, they give us cut-and-paste simplified modules with familiar elements of known instruments, and I'm not sure what else they can do in that regard.

 

I don't think there's reason to mutter discontentedly. Each synth maker is advancing the state of the art gradually. Emu is still putting their 60-some-odd filter types in their instruments. The Korg Karma has perhaps the most advanced algorithmic music generation and modification system available. Roland gave us the V-Synth. Nord is giving us the next generation of their Modular, with some enticing synthesis elements that I'm lusting after rather badly. Kurzweil is inching forward the slowest, perhaps, but their synths are already so advanced that most people, myself included, have barely touched the more convoluted modules they give us to work with. In fact, most synth programmers still stick to the well-travelled road of the basic osc-filter-amp, and I'm as guilty of that as anyone. I do think synth manufacturers should follow the Kurzweil and Nord model and offer thousands of sounds online and on CDs with each instrument. Kurzweil even has webspace available for uploading your own sound creations. The best way to learn these tools may be to see what someone else has done with them: play a patch, mess with the controllers, open up the engine to see what the elements do, and fiddle about.

 

We should also tell the manufacturers what we want, all of them. In particular, I think a synth is lacking if it doesn't offer some way to turn voices on and off while we play, as with the Kurzweils and old Ensoniqs with Patch Select buttons and pedals. That opened up some marvelous expressive possibilities, as you can change the sound of a patch subtly or outrageously as you play, and it would allow some additional control over things such as Coyote's flute patch. I also think it's about time polyphonic pressure-sensitive keyboards returned. Configurable modules such as Creamware's Noah and Roland's VariOS, into which you can load synthesizer instruments and effects, are growing in popularity, and more instruments are coming. Roland appears to be producing a VariOS keyboard, based on the venerable Jupiter 8, with a huge control panel laden with sliders and knobs - hurray! I also think it's high time a new incarnation of the Fairlight and Synclavier emerged. Those instruments still do things no synthesizer does, and with 1980s-level technology. We have almost infinitely faster processors, RAM, and DSP chips, and with little expense we can load up an instrument with a 2 GHz CPU, hard drive, CD-R drive, a gigabyte of ROM and RAM, and include monitor, QWERTY keyboard, pen, and mouse interfaces. Or do the same thing with a computer: just come up with a software package and peripherals to create one on your PC or Mac. I would prefer a dedicated instrument, as PCs and Macs are still rather delicate, but for studio or home use it would be ideal. The manufacturers have been giving us much of what we want, so let's keep asking. In time, we receive.

This keyboard solo has obviously been tampered with!

Sir Basil,

 

Thanks for summarizing some of the thoughts here, and more, so eloquently.

 

Re: that patch switch on/off thing, I'd go even further (if you can call it that) and suggest that it'd be cool if you could "sequence" a set of predetermined waveforms in a sample-based synth so that you could morph the sound in real-time, with a modulator, by traversing a series of sampled sounds seamlessly.

 

Wait: that's the Microwave! Sort of... :) -- which is to say, I hope Waldorf comes up with an up-to-date iteration of this concept some day. In hardware.

 

rt


I am obliged.

 

And yes, it's also the Korg Wavestation, which I had for a while and then sold like an idiot, but I love those Waldorf synths, and will have to get one of those too someday. What if we could sequence entire patches, with adjustable fade times between each patch slot?

 

Another trend I hope becomes universal is for each synth to include several controller types - or for the master controller to, if we get to the point where the synth is a module loaded with soft instruments. Kurzweil offers the player an incredible wealth of control, with two ribbons - one large and programmable - two wheels, four footswitches, two expression pedals, eight sliders, ten switches, and a breath controller! *gasp* Now THAT'S control options! And I shouldn't neglect to mention the Andromeda, with its full control panel and its own large ribbon. Another synth of note is the forthcoming eKo, although it's a PC in a keyboard case and plays VST instruments. It does, however, include a nice selection of controllers.

 

http://namm.harmony-central.com/WNAMM03/Content/Open_Labs/PR/eKo-lg.jpg

 

I did neglect to mention my opinion: that synthesizers are fairly mature in terms of the engines they use and the controllers and output options they're equipped with, while at the same time having no discernible limits on how far they can be taken. I do wish the industry would adopt mLAN - the full spec, of course. Unless someone has a better idea, go with it for pity's sake! :P

This keyboard solo has obviously been tampered with!

Originally posted by coyote:

....

If you could wiggle the keys back and forth to produce controlled vibrato... nah, use a guitar midi interface if you want that.

I like to use aftertouch for that. For more extreme cases I wiggle my joystick from side to side.

 

Robert

This post edited for speling.

My Sweetwater Gear Exchange Page


Originally posted by Sir Basil:

I don't think there's reason to mutter discontentedly. Each synth maker is advancing the state of the art gradually.

Just to clarify, when I started this thread, it was my observation that Synths have matured, much like many other instruments have matured in history. I believe they'll continue to progress, but the progression looks like it will be along paths that have been established, moreso than forging new paths. I don't think this is a bad thing - in fact, I don't even want alternate controllers (I like keyboards).

 

I look forward to the new and innovative ways that manufacturers find to push synthesis and synthesizers forward.


Originally posted by Rabid:

Originally posted by coyote:

....

If you could wiggle the keys back and forth to produce controlled vibrato... nah, use a guitar midi interface if you want that.

I like to use aftertouch for that. For more extreme cases I wiggle my joystick from side to side.

 

Robert

Robert, just curious, are you using aftertouch to control LFO modulation of pitch, or do you have pitch bend routed to aftertouch?

 

I've done the latter, but on my synths it's not responsive enough for vibrato. Does good uni-directional pitch bends though. :thu:

 

Cheers,

 

Jerry


"If you could wiggle the keys back and forth to produce controlled vibrato....."

 

This was a feature on the high-end Yamaha Electone organs of the early '70's.

It worked exactly as you've imagined.

Simple and effective, it was called "Touch Vibrato".

It was also found on at least one of their portable organs, along with a "portamento" strip, whose function I can't find the words for at the moment.


Jeep, that's exactly right, and somewhere online I saw that information years ago; I'd like to track it down again. I thought that was a rather cool feature, and if those organs still exist, maybe I could score one on eBay.

 

And felix, I agree with you. I guess that line was making everyone sound grouchy, wasn't it? But this future we're in is very exciting. I'm expecting some absolutely huge developments in synthesizers soon, such as my baby Synclavier, the Korg Oasys, and keyboards well endowed with bountiful synthesis engines and lots of control options. I would like to see at least one synth maker come up with a monster module and associated controllers, so you could choose the number of keys, weighted or unweighted, wheels or joystick, channel or poly pressure, lots of knobs/sliders or few. And once again, fully implemented mLAN, please. ;)

This keyboard solo has obviously been tampered with!

I've played those organs regularly. We purchased the one just below that in the range. :(

 

My mum was taking organ lessons from the Yamaha school. And one of my friends taught there. For recitals, these organs would be ones on stage.

 

It was only the third manual that had the feature. It had slightly smaller keys and lead type (trumpet, oboe) tones with some envelope to them instead of organ stops. These organs also had a cool ribbon a la the CS80.

 

IIRC there were two or three models with traditional wood and the level above that was the mighty GX1 in space age white. The GX1 was toured around and I saw people perform on it, but I never got to play it.

 

Jerry


Originally posted by Jerry Aiyathurai:

Originally posted by Rabid:

Originally posted by coyote:

....

If you could wiggle the keys back and forth to produce controlled vibrato... nah, use a guitar midi interface if you want that.

I like to use aftertouch for that. For more extreme cases I wiggle my joystick from side to side.

 

Robert

Robert, just curious, are you using aftertouch to control LFO modulation of pitch, or do you have pitch bend routed to aftertouch?

 

I've done the latter, but on my synths it's not responsive enough for vibrato. Does good uni-directional pitch bends though. :thu:

 

Cheers,

 

Jerry

I route aftertouch to the LFO amount. Routing directly to pitch is dangerous for my playing. If I get excited and press on the keys while playing I get a bit out of tune. :)
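Robert's routing can be sketched as follows (a hypothetical Python illustration; all parameter values are invented, not from any particular synth). Scaling the LFO *depth* by aftertouch means zero pressure gives an offset of exactly 0 cents, whereas routing pressure straight to pitch detunes the note whenever you lean on a key:

```python
import math

def vibrato_offset_cents(aftertouch, t, max_depth_cents=50.0, rate_hz=5.5):
    """Aftertouch (0-127) scales LFO depth; the note stays centred in tune."""
    depth = max_depth_cents * (aftertouch / 127.0)
    return depth * math.sin(2 * math.pi * rate_hz * t)

def direct_bend_cents(aftertouch, bend_range_cents=100.0):
    """The risky routing Robert avoids: pressure bends pitch directly,
    so any accidental key pressure pushes the note sharp."""
    return bend_range_cents * (aftertouch / 127.0)
```

With the first routing, excited playing just deepens the vibrato around the true pitch; with the second, it pulls the whole note out of tune, which is exactly the failure mode described above.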

 

Robert

This post edited for speling.

My Sweetwater Gear Exchange Page


Archived

This topic is now archived and is closed to further replies.
