
How much MIDI do you use live?


frostbyte

Recommended Posts



I would use (for practicing, and not live performance) an Alesis MMT8. It was a great little piece of hardware. It finally died and I gave it away.

No guitarists were harmed during the making of this message.

 

In general, harmonic complexity is inversely proportional to the ratio between chording and non-chording instruments.

 


There was a time in the late 90's when I used MIDI for stage sequencing (multiple modules and drum machines) plus multi-vocalist harmonizing, controlling MIDI mixers, and controlling lights, all live at the same time for a vocal group/stage show. That was probably when I used MIDI the most, although I've used it in many ways since the early 80's.

 

These days in my solo act I use either an arranger with an 88-note controller live (some MIDI) or a piano and an MP3 player (no MIDI on stage). I still create sequences a lot and that requires a healthy MIDI setup in my home studio. So I guess I use MIDI less but still couldn't live without it.

 

I still keep my MIDI mixers and light controllers handy in case another show group needs me.


Back in the 90s my MIDIBoard was the master controller for my modules, boards, and effects. It was great to have everything preprogrammed - patch numbers, volume, CC controllers, layers, everything.

 

The last band I was in was a blues/R&B band. I didn't need MIDI (blues/R&B mostly requires Hammond and piano) and I left the MIDIBoard at home.

 

I plan on using MIDI when I get a small format band going - not only will I preprogram all my patches but I will sequence bass and drum parts. The Andromeda will be my master controller and the Alesis Datadisk SQ will slave to MIDI clock from the Andy. I'd use the MIDIBoard but it has no facility to transmit MIDI clock.


One wire connecting my keytar to my Triton. I used to use a lot more, but it's so overly complicated that I haven't messed with it. I think for our next show I will be using both the AX1 and a Korg T2 as MIDI controllers, but we'll see.

"...Keytar in a heavy metal band is nothing more than window dressing" - Sven Golly

 

Cursed Eternity - My Band

Dick Ward - My Me


Originally posted by soundscape:

Depends on whether the 'internal' data representation (i.e., from the keyboard to the tone generator) is actually MIDI, doesn't it?

MIDI data only "exists" when you connect two instruments to one another. There is no MIDI between a digital piano or workstation's keys and its tone generator.

 

MIDI is both the standardized hardware interface and the serialized data stream that the processor in your keyboard creates, based on various input sources (key velocity, aftertouch, pitch/mod wheel position, button presses, knob turns, etc), and then transmits to the MIDI (or sometimes USB/Firewire) port. Between your keys and various controllers and the internal CPU on your synth/keyboard, it's merely discrete signals from various components.
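To make that concrete, here's a minimal sketch (Python, with made-up key-scan values) of what that serialization amounts to once the CPU has read a key number and velocity from the scanning matrix:

def note_on_bytes(key_number, velocity, channel=0):
    # Pack a key event into the three bytes of a MIDI Note On message.
    # key_number and velocity are whatever the scanning hardware reported;
    # they only become "MIDI" once packed like this.
    status = 0x90 | (channel & 0x0F)   # Note On status, channels 0-15
    return bytes([status, key_number & 0x7F, velocity & 0x7F])

# Middle C (note 60) at velocity 100 on channel 1:
print(note_on_bytes(60, 100).hex())    # -> '903c64'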


Originally posted by Sven Golly:

Originally posted by soundscape:

Depends on whether the 'internal' data representation (i.e., from the keyboard to the tone generator) is actually MIDI, doesn't it?

MIDI data only "exists" when you connect two instruments to one another. There is no MIDI between a digital piano or workstation's keys and its tone generator.

 

MIDI is both the standardized hardware interface and the serial data stream that the processor in your keyboard creates, based on various input sources (key velocity, aftertouch, pitch/mod wheel position, button presses, knob turns, etc), and then transmits to the MIDI (or sometimes USB/Firewire) port.

Yep, that's exactly what I was suggesting.

Originally posted by Sven Golly:

MIDI data only "exists" when you connect two instruments to one another.

Actually, I don't think that's 100% accurate. Certain control data can be (and is) transmitted/manipulated intra-instrument as MIDI data. For example, consider how the K2600 handles programming for aftertouch, its sliders, layers, and splits.

Originally posted by dp2:

Originally posted by Sven Golly:

MIDI data only "exists" when you connect two instruments to one another.

Actually, I don't think that's 100% accurate. Certain control data can be (and is) transmitted/manipulated intra-instrument as MIDI data. For example, consider how the K2600 handles programming for aftertouch, its sliders, layers, and splits.
Based on your statement, it would appear to me that you are misunderstanding what is meant by MIDI data. Are you familiar with the structure of the MIDI data stream?

 

If you are, then please cite a reference from Kurzweil that supports your statement.


Originally posted by soundscape:

Originally posted by dp2:

an Atari ST computer (running Creator and Notator for those of you who remember it) =(

Talking of which... how have sequencers improved (or not) over the years...?
Let me take a raincheck on my reply to this one.

 

Initially, I can already tell that there have been some great improvements, but I'd like to take some time to "explore the limits" a bit.

 

Nevertheless, I can already tell you for certain that Korg's Karma technology is a disruptive technology for sequencing. It takes more of a parametric approach, which is ideal for me because it lets me tweak various aspects of the sequence in real time and afterwards too. In the past, sequencing felt more like recording to tape to me--except usually more restrictive.


Originally posted by Sven Golly:

Originally posted by dp2:

Originally posted by Sven Golly:

MIDI data only "exists" when you connect two instruments to one another.

Actually, I don't think that's 100% accurate. Certain control data can be (and is) transmitted/manipulated intra-instrument as MIDI data. For example, consider how the K2600 handles programming for aftertouch, its sliders, layers, and splits.
Based on your statement, it would appear to me that you are misunderstanding what is meant by MIDI data. Are you familiar with the structure of the MIDI data stream?

 

If you are, then please cite a reference from Kurzweil that supports your statement.

I'd like to see that reference as well. Because of the serial nature of MIDI, using it for internal information can introduce unwanted latency in the data stream. It makes much more sense to use ribbon cables carrying a parallel data stream for any data transmitted within the instrument.
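The latency point is easy to quantify: MIDI DIN runs at 31,250 baud with one start and one stop bit per byte, so each byte takes 320 microseconds on the wire. A quick sketch of the arithmetic:

BAUD = 31250          # MIDI DIN bit rate
BITS_PER_BYTE = 10    # 8 data bits + start bit + stop bit

def midi_transfer_ms(num_bytes):
    # Time to clock num_bytes over a MIDI DIN serial link, in milliseconds.
    return num_bytes * BITS_PER_BYTE / BAUD * 1000

print(midi_transfer_ms(3))    # one 3-byte Note On: 0.96 ms
print(midi_transfer_ms(30))   # a ten-note chord: 9.6 ms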

 

I've had my N364 apart on several occasions, and every board that uses the data stream is connected to the motherboard with a ribbon cable, in much the same way the various components of a computer are connected.

"In the beginning, Adam had the blues, 'cause he was lonesome.

So God helped him and created woman.

 

Now everybody's got the blues."

 

Willie Dixon

 

 

 

 

 

Link to comment
Share on other sites

From http://www.midi.org/about-midi/abtmidi.shtml

 

'The original Musical Instrument Digital Interface (MIDI) specification defined a physical connector and message format for connecting devices and controlling them in "real time". A few years later Standard MIDI Files were developed as a storage format so performance information could be recalled at a later date. The three parts of MIDI are often just referred to as "MIDI", even though they are distinctly different parts with different characteristics.

 

'The MIDI Message specification (or "MIDI Protocol") is probably the most important part of MIDI. Though originally intended just for use with the MIDI DIN transport (see Part 2) as a means to connect two keyboards, MIDI messages are now used inside computers and cell phones to generate music, and transported over any number of professional and consumer interfaces (USB, FireWire, etc.) to a wide variety of MIDI-equipped devices. There are different message groups for different applications, only some of which we are able to explain here.

 

'There are also many different cables/connectors that are used to transport MIDI data between devices. The "MIDI DIN" transport causes a lot of confusion because it has specific characteristics which some people associate as characteristics of "MIDI" -- forgetting that the MIDI-DIN characteristics go away when using MIDI over other transports (and inside a computer). With computers a High Speed Serial, USB or FireWire connection is more common. Each transport has its own performance characteristics which might make some difference in specific applications, but in general the transport is the least important part of MIDI, as long as it allows you to connect all the devices you want to use!

 

'The final part of MIDI is the Standard MIDI File (and variants), which is used to distribute music playable on MIDI players of both the hardware and software variety. All popular computer platforms can play MIDI files (*.mid) and there are thousands of web sites offering files for sale or even for free. Anyone can make a MIDI file using commercial (or free) software that is readily available, and many people do, with a wide variety of results. Whether or not you like a specific MIDI file can depend on how well it was created, and how accurately your synthesizer plays the file... not all synthesizers are the same, and unless yours is similar to that of the file composer, what you hear may not be at all what he or she intended.'
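To ground the Standard MIDI File part of that quote: every .mid file begins with a fixed 14-byte "MThd" header chunk giving the file format, track count, and timing division. A minimal sketch for reading it (the file name here is just a placeholder):

import struct

def read_smf_header(path):
    # Read the header chunk of a Standard MIDI File: "MThd", a 4-byte
    # big-endian length (6), then format, track count, and division.
    with open(path, "rb") as f:
        chunk_id, length = struct.unpack(">4sI", f.read(8))
        if chunk_id != b"MThd" or length != 6:
            raise ValueError("not a Standard MIDI File")
        fmt, ntrks, division = struct.unpack(">HHH", f.read(6))
    return fmt, ntrks, division

# e.g. (1, 16, 480): format 1, 16 tracks, 480 ticks per quarter note
# print(read_smf_header("song.mid"))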


Originally posted by Joe P:

Eric,

I'm curious why you would use an S90 to control an Electro. Can you elaborate?

Regards,

Joe

Sorry, I missed this question the first time around and I think the forum has been down for a few days or something.

 

I use the S90 and the Electro together for many purposes. I use the S90 as the lower manual for traditional Hammond, with the Electro as upper manual. I use the S90 to control the Electro's EP sounds sometimes. I have Master setups in the S90 for different songs that trigger a program change and possibly a zone setting for the Electro. For example, I may need to play several S90 sounds together, with Electro EP somewhere in the mix. The S90 Master sets the right program on the Electro and maps the sound to a particular range of keys.
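In raw MIDI terms, a Master setup like that boils down to sending a Program Change on the Electro's channel and then only passing notes that fall inside the zone. A rough sketch (the channel, program number, and key range below are just placeholders, not actual S90 or Electro settings):

def program_change(channel, program):
    # Two-byte Program Change message: status 0xC0 + channel, then program.
    return bytes([0xC0 | (channel & 0x0F), program & 0x7F])

def zone_filter(msg, low=48, high=72):
    # Pass Note On/Off messages only if the note number is inside the zone.
    status = msg[0] & 0xF0
    if status in (0x80, 0x90) and not (low <= msg[1] <= high):
        return None
    return msg

# On song change: select EP program 5 on channel 2, then gate the keys.
# send_to_electro(program_change(1, 4))   # send_to_electro is hypothetical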

 

Regards,

Eric


Eric,

Thanks for the clarification; I was thinking too simplistically. I can see the lower organ manual usage (although I didn't think of that). I just thought, "why would you use the action of the S90 for organ, and wouldn't the S90 have a good Yamaha EP?" But your reply clarified things and kinda blew me out of the water! :)

Regards,

Joe


I think it's sort of ironic how advancements in technology have made MIDI more powerful while simultaneously diminishing the need for it. Just my $.02 anyway. The only time I use it is to play a drum machine from an actual keyboard rather than those annoying pads. That's just me, though. Take what I say with a grain of salt, as my music could probably be taught to a chimp in a long afternoon.

My journey through the MIDI maze has taken me from 5 keys/3 modules/2 drum machines/sequencer in the late 80's to 3 keys/3 modules now. My goal, like many here, is to get it down to 2 or 3 keys, no MIDI. I MAY have one of the keys control my laptop (which I have spent quite a bit of time getting ready for the road). It has Traktion2 (Sonar in the studio, but I've been playing with Traktion and like it - and it's inexpensive!), Minimonsta, Atmosphere, B4, CS80v, MoogModular, ImpOscar and TimeWarp2600. That should blow my 80's setup out of the water sonically. Now, just controlling it will be the challenge.

 

My biggest headaches in the 80's were songs like Europe's "Final Countdown" (which has tons of backing keys parts), John Farnham's "You're The Voice", and Saga's "On The Loose". Those took a lot of sequencing, and getting our drummer to lock to a click was a real headache ("Bartender, can I have another one please?"). I have learned to simplify the keys parts and arrangements, and, of course, I play better now, so I can actually play 2 different parts at the same time. Our band leader wants me to use Reason-type stuff on some of our tunes; the jury is still out on that one. I am playing with the arpeggiators on some of the softies, and having fun, but I would rather concentrate on playing (which can be a challenge all on its own).

 

Jay


Originally posted by Sven Golly:

If you are, then please cite a reference from Kurzweil that supports your statement.

Sorry for the delay in my reply; my wife and I moved a week ago--literally.

 

Anyway, I'd like to assume that you understood soundscape's explanation of MIDI data, so that you wouldn't need the reference. However, I'll supply a few references anyway just in case.

 

Page 6-7 (of the Musician's Guide), paragraph 1:

"The local keyboard channel enables the K2600 to receive MIDI information on a single channel, then rechannelize that information. . . . When you're in Program mode, the local keyboard channel remaps incoming information to the K2600's current channel . . ."

 

Page 6-8, paragraph 2:

"All MIDI information the the K2600 receives on the local keyboard channel gets remapped to the channels and control destinations used by the zones in the setup."

 

In case you didn't realize this, remapping and rechannelizing operations are internal forms of manipulation of MIDI data. I could have provided a lot more examples, but I think these should be sufficient.
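For what it's worth, "rechannelizing" is a simple operation on the data itself: the low nibble of a channel-voice status byte is the channel, so remapping just rewrites that nibble. A sketch of the idea (not Kurzweil's firmware):

def rechannelize(msg, new_channel):
    # Channel-voice status bytes run 0x80-0xEF; the low nibble is the
    # channel. System messages (0xF0 and up) carry no channel and pass
    # through unchanged.
    status = msg[0]
    if 0x80 <= status <= 0xEF:
        return bytes([(status & 0xF0) | (new_channel & 0x0F)]) + msg[1:]
    return msg

# A Note On received on channel 5, remapped to channel 1:
print(rechannelize(bytes([0x94, 60, 100]), 0).hex())   # -> '903c64'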


Most MIDI "commands" are stored in a microprocessor's ROM. The input from a keyboard is serialized 1s and 0s. A shift register is a simple equivalent of this.

 

It's the processor/software that creates MIDI from this simple data.

 

The original Prophet 600 used a Z-80 processor (which doesn't respond to MIDI). When, for example, the C4 key was pressed, the processor sent a digital "value" to the DAC, which sent a voltage to the sound generator to produce the corresponding pitch.

 

In parallel with this, the processor sent a set of MIDI data, stored in ROM, to the MIDI Out port. MIDI itself isn't used internally in the keyboard at all.

 

All of the chips, including the processor, were already developed before MIDI, and thus do not speak MIDI.
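Those two parallel paths can be sketched like this (hypothetical helper names; the real Prophet 600 firmware is Z-80 machine code, not Python):

NOTE_ON = 0x90

def handle_key_press(key_number, set_dac, uart_send):
    # Internal path: key number -> DAC value -> control voltage for the
    # analog sound generator. No MIDI involved.
    set_dac(key_number)

    # External path: the same event, packed as MIDI bytes and handed to
    # the UART feeding the MIDI Out port (fixed velocity for the sketch).
    uart_send(bytes([NOTE_ON, key_number & 0x7F, 0x40]))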


Originally posted by dp2:

Anyway, I'd like to assume that you understood soundscape's explanation of MIDI data, so that you'd not need the reference.

dp2, you're missing the point. MIDI is not used internally in the Kurzweil, between the physical keyboard and the CPU.

 

The examples you cited discuss how data received at the MIDI In port is interpreted by the keyboard... they do not refer to the internal communication between the physical keyboard and the CPU.

 

Re-read my post, then re-read what you cited. You'll see that we're not talking about the same thing.

 

Soundscape, nothing that you posted suggests that the internal, intracomponent communication in the Kurzweil is done as structured, serialized MIDI data. Please see Prague's post above this.

 

I'm amazed that a concept as simple as MIDI can be so misunderstood. :(


Originally posted by Sven Golly:

The examples you cited discuss how data received at the MIDI In port is interpreted by the keyboard... they do not refer to the internal communication between the physical keyboard and the CPU.

 

Re-read my post, then re-read what you cited. You'll see that we're not talking about the same thing.

Sven, either you misread or you misunderstood what I wrote. I wrote that MIDI data also gets manipulated internally.

 

No offense: the line you just wrote, ". . . is interpreted by the keyboard . . .", is utter nonsense. Keyboards don't interpret data--period. Something in the keyboard has to process that inbound data. That something has to be one (or more) of the following: an I/O channel device, a microcontroller, or a CPU. Technically, if I wanted to dissect your attempted rebuttal, I would have pointed out that the moment the inbound MIDI data gets latched by the first register within the first port to receive it, that data is already being manipulated. Furthermore, each time that data gets propagated--whether passed through various gates or something else--that data is manipulated.

 

If you fully understood what I wrote from both a hardware and software perspective, then you should have been able to connect the dots.

 

Perhaps, in the initial statement (to which I replied), you weren't thinking of internal data interactions. Besides being a musician, I work in IT, and I specialize in systems programming. For me it's all about the internal data. I simply pointed out that MIDI data can't only "exist"--using your words--when you connect 2 (or more) keyboards. Something (or some things) within the keyboard has to generate the MIDI data, and that data has to be routed through several other components before it gets sent out of the keyboard.

 

Re-read. Think. Perceive. And then reply if you'd like. :)


Originally posted by Prague:

Most MIDI "commands" are stored in a microprocessor's ROM. The input from a keyboard is serialized 1s and 0s. A shift register is a simple equivalent of this.

 

It's the processor/software that creates MIDI from this simple data.

 

The original Prophet 600 used a Z-80 processor (which doesn't respond to MIDI). When, for example, the C4 key was pressed, the processor sent a digital "value" to the DAC, which sent a voltage to the sound generator to produce the corresponding pitch.

 

In parallel with this, the processor sent a set of MIDI data, stored in ROM, to the MIDI Out port. MIDI itself isn't used internally in the keyboard at all.

 

All of the chips, including the processor, were already developed before MIDI, and thus do not speak MIDI.

I realize this, and again my point was simply that the MIDI data couldn't only "exist" when 2 (or more) keyboards are connected. If part of the MIDI data is stored in ROM, then it still has to be manipulated--not necessarily understood--to get it out of the keyboard. Additionally, inbound MIDI data has to be manipulated inside the keyboard as a function of routing the data where it needs to go.

 

I wasn't trying to explain how or why it was done. Rather, I focused narrowly on its existence and internal manipulation.


In a way, I think Sven is correct. MIDI is a Digital Interface.

 

It leaves the processor ROM and goes directly to the MIDI Out (perhaps via a UART). From the MIDI In it goes directly to the processor (perhaps via a UART).

 

Unless the ports are connected to something (interfacing), MIDI doesn't "exist". It merely goes to an unused port, hanging in space.

 

I would easily hazard the guess that no keyboard needs MIDI to operate, since it is an interface, not an Operating System.

 

A computer doesn't need Ethernet, either. It is simply an interface...

 

 

... maybe. ;)


Originally posted by Prague:

In a way, I think Sven is correct. MIDI is a Digital Interface.

 

It leaves the processor ROM and goes directly to the MIDI Out (perhaps via a UART). From the MIDI In it goes directly to the processor (perhaps via a UART).

 

Unless the ports are connected to something (interfacing), MIDI doesn't "exist". It merely goes to an unused port, hanging in space.

 

I would easily hazard the guess that no keyboard needs MIDI to operate, since it is an interface, not an Operating System.

 

A computer doesn't need Ethernet, either. It is simply an interface...

 

 

... maybe. ;)

Again, not to be picky, but I didn't state that a keyboard needs MIDI. I only stated that MIDI data can't only exist when 2 (or more) keyboards are connected. Whether the outbound data is stored in ROM, gets generated by a JIT compiler implemented for a virtual machine, or whatever else, that data has to get routed (hence manipulated) through 1 or more components before it's sent along "the wire". Similarly, any inbound data has to be manipulated--if only as a function of the routing--to get it where it needs to go.

 

Interfacing is irrelevant in this case. Either the data exists or it doesn't. If it exists, then something had to create it. Once it exists, it can--but doesn't have to--be sent or received. If the data gets sent or received, then it has to be manipulated along the way.


It's like, "When a tree falls in a forest, and no one hears it, does it make a sound?". No, it makes no sound. It produces sound waves, but an ear converts these into sound.

 

Does an unconnected battery "produce" electricity? No. There is no current, so there is no electricity.

 

Since a MIDI In port provides the load (interpretation) for a MIDI Out port, the actual data is not produced until it is received. So, it doesn't exist until received.

 

 

... maybe. ;) Heavy, man ....


As I posted above:

 

''The final part of MIDI is the Standard MIDI File (and variants), which is used to distribute music playable on MIDI players of both the hardware and software variety. All popular computer platforms can play MIDI files (*.mid) and there are thousands of web sites offering files for sale or even for free. Anyone can make a MIDI file using commercial (or free) software that is readily available, and many people do, with a wide variety of results. Whether or not you like a specific MIDI file can depend on how well it was created, and how accurately your synthesizer plays the file... not all synthesizers are the same, and unless yours is similar to that of the file composer, what you hear may not be at all what he or she intended.''

 

(Taken from http://www.midi.org/about-midi/abtmidi.shtml)

 

I guess SMF files don't "exist," then?


Originally posted by soundscape:

I think we're getting in a bit of a mess here with the different "layers"... application, physical layer, etc.

You're probably right.

 

Nevertheless, whether one deals with the data via the physical layer (usually as a signal) or via one of the "higher" software layers (usually as some kind of message following an arbitrary protocol), if it exists, then it has to be manipulated to be transmitted/received.

 

Yet the data doesn't have to be transmitted or received.


Archived

This topic is now archived and is closed to further replies.
