Posts posted by JohnG11

  1. If they want to call themselves slutz, why on earth not?

    It's not as if they're pointing a finger at anybody else.

     

    SO much fuss and bother over nine little letters!

    Good grief, who CARES!

     

    Time to get a life.

     

    I'm more concerned about what may happen tomorrow in a country I don't even live in, and constantly thank goodness I don't live there.

     

    Good luck sane American people on inauguration day.

  2. An allusion?

    Is that what it is?

     

    And there was me thinking the adjectives he used were anthropomorphisms!

     

    Oh well! I stand corrected!

     

    Please don't take me too seriously, Chip. I'm really not worth it. Just an old codger on lockdown having a bit of fun.

    I don't have a lot else to do and the Missus is busy teaching French or German or is it English on Zoom?

    Langweilig! (Boring!) Yawn.

     

    JohnG.

  3. This thing is stubborn, mean, and smart.

     

    Sorry Craig, I'm just going to have to take issue with that statement. (In the nicest possible way, of course.)

     

    A virus has no brain. It has no feelings, it has no motives. It just mutates within a changed environment or, as the article suggests, some human being changes its structure.

     

    The mutations that allow it to spread more effectively survive; the other mutations perish. It's just survival of the fittest.

    Usually viruses mutate to become less deadly but more infectious.

    Then they can produce lots more of themselves.

    That's the purpose behind all living creatures.

     

    Sorry Craig, it just tickled my funny bone.

     

    And, much kudos to you Greg. The world could do with a few more like you.

    Correction, many more.

     

    JohnG.

  4. Craig, It wouldn't surprise me at all.

    Certainly, what I saw the two times I visited Tokyo, albeit many years ago, was a great respect for other people in the general populace.

    Everyone I met was almost excessively polite.

     

    I came across this interesting article yesterday. It's rather long, but I thought I'd share it. It is just speculation, though.

     

    https://nymag.com/intelligencer/article/coronavirus-lab-escape-theory.html?utm_source=pocket-newtab-global-en-GB

     

    JohnG.

  5. July 19 for me. Or so they say -- I don't really remember.

     

    Well, that makes me nearly two weeks older than you ... youngster! ;-)

    Strange you say that, I have no recollection either. Perhaps we both have amnesia?

     

    Yes, I intend to still wear my mask and keep other safety precautions up after I get my second shot. If not for me, for the 'greater good' of society so as not to infect others. I understand I can get it and pass it on before my immune system beats it.

     

    I believe that's the correct thing to do until it all dies down.

     

    Oxford says they will distribute the vaccine at cost - for zero profit. In the US we have the best politicians money can buy. Without profit, they didn't buy. Perhaps the new administration has different priorities - time will tell. 'Nuff said before I cross the line here.

     

    I think I read somewhere in the 'better' UK press (if there is such a thing) that our government had pre-ordered something like 100 million doses of the Oxford vaccine, less of the Pfizer one.

    Did the US pre-order any of the Oxford vaccine, I wonder? Maybe that's why the FDA isn't hurrying: there's limited availability and it's going first to the countries that ordered in advance.

    No point rushing to approve it if it won't be available in the US until Feb or March, or even later, who knows.

     

    The Pfizer vaccine, I'm told, uses a completely new process, probably contributing to its higher cost, and the need for deep refrigeration.

    The Oxford one, according to what I've heard on the BBC, is a 'conventional' anti-virus vaccine, not dissimilar to the way the flu jab is produced. I'm guessing that using older technology makes it cheaper.

    I recall seeing UK prices of around 2 pounds for the Oxford vaccine and around 15 pounds for the Pfizer one. Quite a difference.

     

    It's strange how cultures differ.

    Many years ago, back in the late eighties as I recall, we had a Japanese gentleman from Fujitsu come to visit our office to learn what we were doing with some networking technology.

    A few days into his visit he came in wearing a face mask, and I asked him if he was scared of catching something.

    "Oh, no," he replied, "I think I'm catching a cold and I don't want to pass it on to anyone."

    Apparently it's normal in their culture.

    I found it was true when a reciprocal visit occurred and I spent a few weeks in Tokyo.

    (I also learnt to bow and to say "Hai!")

     

    JohnG.

  6. Thanks for the compliment. I'm 74 and a half. Our governor has given priority to everyone over 65.

     

    People often tell me that I look young for my age.

     

    That makes two of us, Notes.

     

    Me? July '46. They say it was a good year, if very cold in the UK.

     

    I should get my first shot in Feb. A strict rollout here with the real oldies and vulnerables first, then key workers, etc.

    So far about 2.5 million people vaccinated.

     

    You do understand that even when you've had your two shots you can still catch the virus but you have a 90% chance of not being very ill.

    You can still, having caught it, pass it on, as with flu.

     

    Take care.

    JohnG.

  7. Sorry for the delay, other things to deal with.

     

    Here we go with some background.

     

    Before I begin to describe how the data transmission mechanism works, I thought I'd give a quick overview of how a current MIDI controller keyboard works. It will help those not familiar, I believe, to understand the rest of the process.

    Naturally, those who know can skip this section.

    And, it's not meant to be all inclusive, just an overview.

    ______________________________

     

    In a modern computer we will typically have a QWERTY keyboard attached, and the way this works is by an electronic mechanism scanning the keys very regularly to see if one has been pressed.

    If one has, its identity is forwarded to the relevant program. Each key has a simple, spring loaded on/off switch.

     

    A MIDI keyboard works in the same way, that is by being scanned (probably thousands of times a second), but each key has two switches (at least) underneath it.

    One triggers at the beginning of the down stroke; a second triggers at the end of the down stroke.

     

    Let's say that we have an 88 note controller and we've told it to transmit all notes on MIDI channel 1.

     

    So, say we play a C major chord on our keyboard based on middle C.

    As the scanning mechanism surveys the switches it will detect that, e.g. the C key has just begun its downward movement and the first switch has triggered.

    (we may think we play all the notes simultaneously, but in terms of microsecond timing, we don't. Hand shape, length of fingers, rotated wrist, etc.)

     

    So the scanning mechanism now knows a MIDI Note On has occurred, it's on Channel 1, and it's a middle C or MIDI Note #60, but it doesn't yet know how strongly the key has been played.

    It starts a note timer. From the timer it will derive how quickly the key travels from top to bottom.

     

    Meanwhile scanning continues and G (top switch) is detected, so it can do the same as above, but G above middle C is MIDI Note #67. Start another note timer.

     

    Scanning continues and E (top switch) is detected, so MIDI Note #64 is put in a third location with the other information, and a third note timer is started.

     

    Scanning.

     

    Middle C's second switch occurs, so the MIDI Note Velocity can be derived.

    We now have the entire contents of a MIDI message created, so it can be sent to the output buffer for the UART (a piece of electronics) to begin its transmission through the MIDI DIN Out socket.
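
    (A quick aside before we carry on scanning: the velocity derivation, as a toy Python sketch. The time limits and the straight-line mapping below are made up purely for illustration; real firmware uses whatever numbers and curve the manufacturer fancies.)

        def velocity_from_travel_time(travel_us, fastest_us=1_000, slowest_us=40_000):
            """Map key travel time (top switch to bottom switch, in microseconds)
            to a MIDI velocity of 1..127. The limits here are invented."""
            travel_us = max(fastest_us, min(slowest_us, travel_us))
            fraction = (slowest_us - travel_us) / (slowest_us - fastest_us)  # fast strike -> 1.0
            return max(1, round(1 + fraction * 126))

        print(velocity_from_travel_time(2_000))    # hard strike -> 124
        print(velocity_from_travel_time(30_000))   # gentle press -> 33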

     

    Scanning.

    As above but with E.

    Scanning.

    As above but with G.

     

    So, we've now started to play our chord but we haven't released the notes yet.

     

    Simply put, our MIDI Note On message looks like this:

    Note On indicator, - 4 bits

    Channel number, - 4 bits

    Note number, - 7 bits

    Velocity. - 7 bits
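
    If you like to see things in code, here's a toy Python sketch of how those four fields pack into the three bytes actually sent (the function name is just mine, nothing official):

        def note_on_bytes(channel, note, velocity):
            """Pack a MIDI 1 Note On into its three bytes.
            channel is 1..16 (sent as 0..15); note and velocity are 0..127."""
            status = 0x90 | (channel - 1)          # 1001nnnn: Note On on channel nnnn
            return bytes([status, note & 0x7F, velocity & 0x7F])

        print(note_on_bytes(1, 60, 100).hex(' '))  # middle C, velocity 100 -> 90 3c 64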

     

    The scanning continues and the release of the second switch for each note will be detected, note by note, and then ditto for the first switch of each note.

    These will generate MIDI Note Off messages for each key.

     

    (More of this process, perhaps, later. There's more to Note Offs than meets the eye.)

    ____________________________

     

    So a MIDI 'Note On' message arrives at the output buffer.

    The UART sends a 'start bit' to wake the other end and establish accurate timing, and then sends, one by one, the bits in the first byte to the other end.

    Then a stop bit, followed by a start bit for the second byte, the byte and a stop bit, then a third start bit, the last byte (bit by bit) and the final stop bit.

    Job done.
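
    In code terms, each of those bytes goes onto the wire as a ten-bit frame, least significant bit first. A toy Python sketch of plain 8-N-1 serial framing (nothing MIDI-specific invented here):

        def uart_frame(byte):
            """One serial frame: start bit (0), the 8 data bits LSB first, stop bit (1)."""
            return [0] + [(byte >> i) & 1 for i in range(8)] + [1]

        print(uart_frame(0x90))   # the Note On status byte, ten bits on the wire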

     

    The actual transmission is the waggling of an electrical signal up and down between pins four and five of the Out interface.

     

    With this mechanism the transmission can start just as soon as the MIDI Note On message arrives in the output buffer, providing that the UART is not in the process of transmitting another message.

    If one or more messages are already queueing, it just joins the queue.

     

    In this case it's highly likely that the three messages will be sent out just as soon as they reach the buffer.

    Three three-byte messages (that's thirty bits each, including start and stop bits) at 31,250 bits/sec (roughly a thousand messages per second) will take about three thousandths of a second, or 3 ms, for all the Note Ons.

    (That's if I can still do mental arithmetic, and not make mistakes!)
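
    Checking myself with a few lines of Python rather than mental arithmetic (strictly it comes to about 2.9 ms, so call it 3 ms):

        BITS_PER_BYTE_ON_WIRE = 10                  # 8 data bits + start bit + stop bit
        BAUD = 31_250                               # MIDI 1 DIN rate, bits per second

        per_message_ms = 3 * BITS_PER_BYTE_ON_WIRE / BAUD * 1000
        print(per_message_ms)                       # ~0.96 ms per three-byte message
        print(3 * per_message_ms)                   # ~2.88 ms for the three Note Ons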

     

    There's a clever wrinkle in the MIDI 1 spec ('running status') which says that if succeeding messages are of the same type and channel, e.g. all Note On messages on channel 1, then there's no need to resend the first (status) byte, just the next two.

    So our 9 bytes for 3 messages can be reduced to 7, thus reducing the overall transmission time for the second two messages. Clever eh?

     

    And there's yet another clever wrinkle.

    We can't have a Note On with a velocity of zero; it simply doesn't make sense.

    So we can use Note On velocity zero to signify Note Off.

     

    With our C chord it now means that the three Note Offs can be transmitted as Note Ons for C, E & G with velocity zero, so running status continues and we've saved another three transmission bytes.
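
    Putting both wrinkles together, here's a toy Python sketch of the byte stream for our C chord (channel 1, an arbitrary velocity of 100; the helper is mine, not from the spec):

        def chord_stream(notes, velocity, channel=1):
            """Note Ons followed by velocity-zero 'Note Offs', using running
            status (the status byte is only sent when it changes)."""
            stream, running_status = [], None
            for vel in (velocity, 0):                    # note-ons first, then vel-0 offs
                for note in notes:
                    status = 0x90 | (channel - 1)
                    if status != running_status:
                        stream.append(status)
                        running_status = status
                    stream += [note & 0x7F, vel & 0x7F]
            return bytes(stream)

        print(chord_stream([60, 64, 67], 100).hex(' '))
        # 90 3c 64 40 64 43 64 3c 00 40 00 43 00 -- thirteen bytes instead of eighteen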

     

    Not clear? Try reading it again.

     

    And that's it for part one.

    In part two I'll look at the receive end of the transmission, or what happens at the MIDI In Din interface.

     

    E&OE of course.

    See ya later.

    JohnG.

     

    P.S. Please let me know if you spot any typos, etc.

    Or GLARING errors.

  8. Thanks very much for your rapid response, SynMike.

     

    We were not specifically targeting a match to the 1 bit at 31,250 bps. It just turned out that way.

    We had a certain number of time bits available and practical timing accuracy goals to meet.

    It's a convenient time value for devices with a 1 MHz clock.

     

    Sorry, I didn't mean to imply that you were copying the DIN interface transmission rate; it's just odd how, when we do things like this, we turn up with the same number, isn't it?

    A 1 MHz clock subdivided will often give the convenient figure of 31,250.

    And, of course, it existed before MIDI, because it happens to be a clock frequency we can feed to the UART.

     

    Removing jitter always requires adding latency. When the connection is first made there is a burst of JR Timestamps data so the receiver can make some determination about jitter on the connection, then add an appropriate amount of latency. That might typically be up to a few milliseconds. From there the Sender puts JR Timestamps on every message. The receiver is delaying everything coming in, reassembling it and assigning timing according to the JR Timestamps of each event, before performing the data.

     

    Okay, yes a trade off. Thanks for that, I'll have to think about the implications. It certainly makes more sense now. (Thinking cap goes on.)
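
    If I've understood it, the receive end behaves something like this little toy model (my own simplification in Python, ignoring the 16-bit timestamp wrap and everything else in the spec):

        JR_TICK = 32e-6        # one JR Timestamp tick = 32 microseconds

        def schedule(events, latency_s=0.002):
            """events: (jr_timestamp_ticks, message) pairs in arrival order.
            Returns (play_time_seconds, message) pairs on the receiver's clock:
            everything is delayed by a fixed latency, then spaced exactly as the
            sender stamped it, however lumpily the transport delivered it."""
            return [(ticks * JR_TICK + latency_s, msg) for ticks, msg in events]

        print(schedule([(1000, "Note On C4"), (1001, "Note On E4")]))
        # the two notes play 32 us apart, 2 ms late, whatever the USB timing did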

     

    MIDI Association members are working on defining the SMF2 format now. It will have notable differences from the original formats. I predict it will take another year to complete.

     

    Mike.

     

    Okay, yes, thought it might take a while.

     

    Do we have any insights to completed parts yet?

    Header, obviously, but still three file types: 0, 1 & 2?

    Similar chunks perhaps with more in them, but maybe new group chunks?

    Any changes to delta timing?

    TPQN still the same?

     

    Curious minds are seeking early insights.

    (Especially this one.)

     

    Thanks again,

    JohnG.

  9. The Jitter Reduction part of MIDI 2.0 is pretty cool. The resolution is 32 microseconds (!), and multiple events can have the same time stamp to make sure things really do happen at exactly the same time.

     

    Another cool aspect is that it is possible for MIDI 1.0 devices to take advantage of this. I'm not sure whether manufacturers will go back and update firmware or whatever for MIDI 1.0 devices, although some probably will. However even when MIDI 2.0 comes out, there will still be situations where new MIDI 1.0 devices are manufactured, because they won't need the extra resolution and such. These can incorporate Jitter Reduction time stamps.

     

    The bottom line is that timing will be much tighter. Another twist is that timing doesn't deteriorate as data streams get more complex.

     

     

    32 microseconds?

     

    Isn't that 1,000,000 / 32?

    or

    1 bit at 31,250 bps?
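
    Checking the arithmetic:

        print(1 / 31_250)        # 3.2e-05 s, i.e. 32 microseconds per bit at 31,250 bps
        print(1_000_000 / 32)    # 31250.0 ticks of 32 us in one second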

     

    Taken from "M2-104-UM_v1-0_UMP_and_MIDI_2-0_Protocol_Specification.pdf", sections 4.8.2 and 4.8.3.

     

    Strange number that, now where have I come across it before?

     

    Oh! Yes.

    One bit at the old MIDI 1 DIN interface transmission rate.

    Isn't it odd that the same things keep turning up again and again?

    Nothing new under the sun, eh?

     

    To be honest, I've been reading and re-reading that part of the protocol specification (section 4.8) and haven't yet been able to relate it to real life.

     

    I expect it's me.

     

    Say a two handed chord (six notes) played on a MIDI 2 compliant keyboard (sending either MIDI1 or MIDI2 messages, UMP encoded) has the timestamp prepended to each Note On,

    but the six messages happen to get split across two USB packets because of USB timing, just how do they get reassembled at the receiving end (also MIDI2 compliant),

    assuming it's a computer with VSTi's or some kind of new sound module?

     

    Does the first batch of notes get delayed in the input buffer awaiting the second, or what?

    Or, is it completely irrelevant in this situation?

     

    Sorry, I just can't figure it out.

     

    Can't wait for the MIDI 2 SMF specification to arrive.

    MIDI 2 files are going to get quite a bit bigger, aren't they?

     

    Oh, and will a MIDI 2 file name suffix be different from a type 1?

    e.g. filename.md2

    And I suppose the file header information will change too?

     

    Enquiring 74 year old minds need to know. ;-)

     

    JohnG.

  10. But that's no reason to piss on it.

    Is that what I was doing?

    It certainly wasn't what I thought I was doing.

    (Strange how one can be misunderstood, isn't it?)

     

    Dr. Mike got it when he said that I made some fair points, which is what I intended.

     

    Before I go on to answer some of the points you make, Mike, let me make an analogy.

    I'm using an aircraft analogy because my first job, as a teenager, was as an electronics engineering apprentice at B.A.C., formerly Vickers Armstrong, where I helped to build VC10's and BAC1-11's.

    Cockpit and radio bay electronics. But machines of past decades.

     

    You may have heard of an aircraft designer called R.J. Mitchell; he designed a world-beating aircraft called the Supermarine S.6B, which won the Schneider Trophy in 1931.

    This incredible (for its time) aircraft led on to the iconic machine of the '40s, the Supermarine Spitfire.

    Still seen as the iconic shape of WWII today.

     

    And what I hear you ask has this got to do with MIDI 2?

     

    Well, this airframe would be nothing, absolutely nothing without the Rolls Royce Merlin engine that powered it.

    1,296 cubic inches of capacity, initially delivering 1,100 horsepower, allowed this plane, and the pilots who flew it, successfully to defend the UK.

     

    The point?

    The airframe is brilliant, but without the engine it would never have defeated the Me 109, another brilliant design; simply put, the Spitfire would not have been up to the job.

     

    MIDI 2 is an exceedingly fine protocol specification, but without the appropriate connectivity mechanisms it becomes like a Spitfire with the engine of a Bugatti Veyron.

    Great engine, but not up to the job.

    USB 3 is the Bugatti Veyron: fantastic new technology, just not the right one for MIDI 2. Or not if you want to fly.

     

    Am I still, as you say, urinating on MIDI 2?

    I hope not.

    The point I'm trying to make is that USB, whilst offering some connectivity, does not give us the flexibility that the current DIN interface allows in certain more demanding environments.

    Many professionals need that connectivity.

     

    In another post, which will take a little while to prepare, I'll try to explain the difference between the way MIDI 1 DIN and MIDI over USB work, and why it's considerably more complex with MIDI 2.

    Okay?

     

    JohnG.

     

    Oh! And whilst I remember you said "But nobody ever said that MIDI was simple."

    But I did, back in 1988 when I first started messing about with it.

    It is simple, incredibly simple, that's why it's been SO successful.

    Simple interface, simple command structure, designed to create excellent sounding music. BRILLIANT!

    That's why it's lasted nearly forty years, and will go on maybe for another forty.

     

    You should compare the MIDI command structure with ISDN Basic Rate Interface Delta channel signalling protocol and the 2B1Q line code of the PSTN line.

    Now there's a specification to make even the most expert protocol analyst go white at the gills. Aaaaargh!

    Good, in its time, before ADSL, for those who needed it. And still available, I believe, as a satellite data comms link via a BGAN terminal.

  11. Sorry Mike, I got the product name wrong; it's a MindBurner 20-way MIDI thru splitter, studio edition. It has 20 assignable MIDI DIN sockets. Just Google MindBurner MIDI.

    The price? An eminently affordable 84 pounds 99p.

    Put simply, it allows you to direct MIDI channels to various devices.

     

    Yes, a box with a processor and a number of USB ports could do it.

    The problem is that unpacking potentially multiple MIDI commands from within USB packets and repackaging them into other USB packets for onward transmission is a LOT more complex than simply rerouting MIDI messages.

    With MIDI 1 it's a doddle. I wrote some code for an Atari way, way back that did something similar. Very straightforward.

     

    To add to the confusion, the MIDI 2 messages can now include a 4 bit "group" number as part of the message, as well as the channel number.

    A MIDI 2 format message can include a MIDI 1 channel message OR a MIDI 2 channel message, which further complicates routing.

    See "M2-104-UM_v1-0_UMP_and_MIDI_2-0_Protocol_Specification", sections 3 and 4.

    And even when one has solved the routing issues, the cable length for USB is totally unsuited to studio use, not to mention the earthing issues.

     

    As you said "if there's sufficient market for it", and there I think you have hit the nail on the head.

    How many people need 16 bit velocity or other 16 bit CC's? A few synth users certainly, but elsewhere?

    If major manufacturers can't be bothered any more to put even 10 bit pitch bend in their market leading products, what hope for all this extra complexity?

    I'm guessing they'll all go for Property Exchange (so they can claim MIDI 2 conformance) and the message they'll send is "we send MIDI 1 messages".

     

    I can hear you saying "you old cynic you!" And I am.

     

    Dr Mike.

     

    To be honest I can see some of the problems being ironed out over time; the question is "how long?"

    Will I still be around?

    Well, I'm 74 now, so maybe, maybe not.

    The brain still seems to function okay (apart from the moaning); I'm not so sure about the physical side tho'.

     

    A question. How many turns of an encoder do I have to make to go from zero to sixty five thousand five hundred and thirty five? I.e. 16 bits.
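
    Just to put a number on it, assuming a bog-standard 24-detent encoder sending one increment per detent (figures picked purely for illustration; real gear would obviously have to scale or accelerate):

        detents_per_turn = 24                # made-up but typical figure
        print(65_535 / detents_per_turn)     # roughly 2,730 full turns to sweep 0..65535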

     

    And, to be brutally frank, I can't see anything in MIDI 2 that I yearn for.

     

    There are several other issues which I could expand upon, but I think you've all probably had enough of this moaning, cantankerous(?) old git.

    I'll go back to sleep for a while.

  12. What is the "MIDI Burner" that you pictured? What's its function? There may not be a drop-in replacement but there may be a functional replacement, or could be if anyone wants to buy it.

     

    There's always some evolution with new standards. Give it time to grow. In the mean time, you can continue to use what you know.

     

    This one is a twenty something port MIDI 1 thru/expander.

    Take a look here where Rara found a used(?) one.

    https://yamahamusicians.com/forum/viewtopic.php?f=4&t=16942

  13. Well, Dr Mike, I dare say it does, I dare say it does.

    Call me old and disillusioned, for that's what I am.

     

    I've been heavily involved in data communications from the early seventies and have a number of 'firsts' to my name in that field, which I won't go into here as it's not appropriate.

    Suffice it to say that I've worked on and off on various national and international standards committees to do with LAN's, WAN's and telephony over the years, and the implementation of networks on an international basis, both terrestrial and via satellite.

    This on prototype Ethernet, ISDN BRI, ADSL, GAN and BGAN networks.

     

    To date, my experience has been that the development of protocols and line codes typically goes hand in hand with the development or designation of the physical network over which they will run.

     

    It is what made MIDI 1 great: the DIN interface and the MIDI command structure.

    It dismays me that the same, at least apparently, is not happening with MIDI 2.

    USB, whilst possibly answering the home studio and simple set up requirement, simply doesn't cut it, for example, in a studio.

     

    How, for instance, does MIDI 2 plan to replace the piece of kit shown below?

    How, using USB, could this be done and allow the length of connection that this MindBurner allows?

     

    [Image: Mindburner_midi_expander_setup.jpg]

     

    I look at MIDI 1 equipment and what do I find?

    Manufacturers of modern, fairly expensive devices not using the 14 bits available for pitch bend, in many cases not even 10 bits, a few not even 7 bits, in an area where the human ear is most sensitive.

    And do we honestly expect them suddenly to implement even 16 bits for note velocity?

    Somehow I have my doubts.

     

    I hope I'm wrong, Dr Mike, I really do, but I'll remain pessimistic until I see some signs of reality emerge from the MMA.

    We need an open Ethernet solution.

    We need some signs of a MIDI 2 file specification.

     

    They could, so easily, have solved the enharmonic note differences instead of thinking only of fixed pitch instruments.

     

    I'm just disappointed. Very disappointed.

    Protocols aren't everything.

    (I can't believe I'm saying that, when most of my life I've worked with IBM 2780, ICLC03, TCP/IP, Q.921/Q.931, 2B1Q, etc.)

     

    [/rant] Sorry.

     

    JohnG.


  14.  

    There's no reason why a MIDI patch bay can't be electro-mechanical like an analog patch bay. That always works.

     

    I'm afraid, Mike, that with only USB defined as the connecting mechanism for MIDI 2, there's every reason why a simple mechanical interface won't work. USB is a polled protocol. Some 'intelligence' has to do the polling for the USB data packets and then extract the MIDI command(s) from them, rerouting them to the appropriate port or channel. That's why in the demos there is a Raspberry Pi between devices; it's acting as the USB host to the keyboard slaves, constantly polling them both for data and forwarding it to the other device. Unless, that is, you implement both host and slave functions in every device, and then, in effect, we've gone back to an 'in' and 'out' configuration. And more expense. And MUCH shorter cable lengths.

     

    And why will we need a MIDI patchbay anyway? We can already assign any MIDI output to any MIDI input through software. I had (still have, in a rack somewhere) a Digital Music Corp MX-8 MIDI routing switcher that let me send keyboard data to any one or combination of sound modules or the computer, or from the computer to sound modules. Today we do things differently, without all the hardware. You can send a MIDI track (or copies of that track) in a DAW to any number of virtual instrument plug-ins, and that same setup - or maybe a simpler software program - can be used in live performance.

     

    Your MX-8 won't handle MIDI 2 commands encapsulated in USB packets and won't poll USB devices. It's MIDI 1.

    Your DAW (at least I haven't seen any yet) doesn't support MIDI 2 commands. So you can't route them anywhere.

    At the current time we can ONLY use USB as the transport mechanism. USB is a polled protocol interface, totally unlike MIDI 1 DIN. IMV USB poses significant interconnection problems.

     

    I recognize that there are other applications for MIDI (despite the original meaning of "MI" in the name) than playing musical instruments, and some of those applications may indeed require Ethernet-like routing techniques. But music is by far the greatest application for MIDI, so I believe that's going to be supported first, and also be first to say "OK, that's enough work for now, let's move on to other applications."

     

    No, I'm only thinking about music applications.

    For example, currently I create a few split points on my keyboard: I have channels one and two going to a Yamaha MU1000 equipped with three PLG cards (DX, AN and VL), and another channel going to an ancient Roland SC8850 sound module, all via MIDI DIN.

    I can either cascade via MIDI DIN, or go via a MIDI router such as yours.

    Assuming I can get MIDI 2 replacements for the sound modules, and my keyboard firmware is updateable to MIDI 2 (however unlikely), how do I do that over one USB out from the keyboard?

    I need a Raspberry Pi in the middle that examines the incoming USB data, extracts the MIDI 2 commands from the packets, then re-encapsulates the MIDI data into new USB packets and reroutes them to the appropriate USB port.

    It can be done (in fact I could probably code it myself), but when will it?

    Just how much latency and/or jitter will that introduce?
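
    In rough pseudo-Python, the routing part of that box boils down to the sketch below. The read_ump_packets and send_to_port functions are stand-ins for the real USB host code (which is the hard part, and not shown), and the routing table is made up; only the group extraction comes from the UMP spec, where the group is the low nibble of the first word's top byte:

        ROUTING = {0: "MU1000 port", 1: "SC8850 port"}      # UMP group -> destination (made up)

        def route(read_ump_packets, send_to_port):
            """Forward each UMP to an output chosen by its group nibble."""
            for packet in read_ump_packets():               # poll the keyboard (USB host role)
                for ump in packet:                          # each UMP is 1 to 4 32-bit words
                    group = (ump[0] >> 24) & 0x0F           # group nibble from word 0
                    send_to_port(ROUTING.get(group), ump)   # re-encapsulate and send onward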

     

    I haven't heard of any DAW makers making any predictions or promises about MIDI 2 yet. Have you?

     

    But more along the line that you're thinking, Dante and a few other audio over IP protocols are getting more popular and less expensive, and that's just the sort of configuration that you're describing. You might want to look at the possibility and practicality of MIDI over Dante. It seems like a natural path.

    I must admit I don't know Dante. I'll do some research.

    It needs to be an open standard, like Ethernet or USB, not proprietary.

     

    Having said all that, MIDI 2 promises to offer a lot more flexibility, especially to synth users.

    It's just that it's an awfully long way from being implemented in anything that is halfway usable. IN MY VIEW.

     

    I'm not trying to put it down, although it may appear that way, I admit; I just would like people to have realistic expectations about it.

     

    In my seventy something years I've been promised the earth and been disappointed too many times to take the marketing hype at face value any more.

     

    Back in the day, Dave Smith and Ikutaro Kakehashi got it right. (IMHO)

     

    P.S. It seems Dante is audio and video not MIDI (either 1 or 2).

  15. Thanks Joe,

     

    Yes, I am aware of the MMA web site.

    In fact, when they revamped the site some years ago, because of my contributions to the original MIDI forum and my own forum, I was made a moderator of the MMA forum.

    I've been working with MIDI since the late eighties.

     

    My background is as a data communications protocol analyst since the early seventies, from modems in the early days to satellite in the early noughties, working for mainframe manufacturers, telcos and global satellite companies, before I retired a few years back.

    It's this background that has taught me that just having a very detailed MIDI protocol is totally insufficient; we also need the physical network over which the protocol will flow to provide the connectivity that people will need.

    For example, how will I implement a MIDI 2 patch bay? The only solution I see is MIDI over Ethernet, with an Ethernet switch and structured cabling; that would provide sufficient multi-connectivity and cable lengths suitable for running around a studio.

    I'm probably in the dark about what is happening here, but in my view the protocols and the interwiring need to happen hand in hand.

    USB provides a host to slave architecture, and whilst that is okay for a simple link, it is more problematic when we're talking slave to slave. It needs an intervening processor to forward data from one terminal to another.

     

    I've been following the development of MIDI 2 for some time (downloaded and read and hopefully understood all the published specifications) and whilst I applaud the efforts made so far in terms of protocol development, I am disappointed with the lack of apparent progress in terms of connectivity.

    Perhaps I need to learn a little more patience?

     

    I also find it extremely short-sighted not to have addressed what I consider to be a relatively straightforward issue, namely the ability within the protocol to address the fact that enharmonic notes are only the same on a fixed-pitch instrument, e.g. a piano.

    They're not, for instance, on a violin. That is, A# is not the same, or needn't be the same, as Bb. This could be quite easily resolved by utilising the extra bit available in the MIDI note number, now that we aren't restricted to 7 bit data fields.
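
    Just to illustrate the sort of thing I mean (a completely made-up encoding, not anything in any spec): the note byte's spare eighth bit could carry the spelling, something like:

        def spelled_note(midi_note, prefer_flat):
            """Toy idea only: pitch in the low 7 bits, spelling flag in the 8th."""
            return (midi_note & 0x7F) | (0x80 if prefer_flat else 0x00)

        a_sharp = spelled_note(70, prefer_flat=False)   # 0x46: spelt as A#
        b_flat  = spelled_note(70, prefer_flat=True)    # 0xc6: spelt as Bb, same piano key
        print(hex(a_sharp), hex(b_flat))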

     

    A much easier tuning system could have been introduced too based on the widely implemented Scala temperament files.

    See http://huygens-fokker.org/scala/

     

    But there you go, I'm just a lone voice crying in the wilderness, not a member of some acknowledged music instrument company or a journalist.

  16. Just realised, I'm wrong about the DAWs.

     

    Since there's no MIDI 2.0 file format specified yet, there's no standardised way of recording all these wonderful new solutions that MIDI 2.0 is going to give us. ... Oh well!

     

    Unless, that is you know differently!

     

    And ... as the only connection specification available so far is USB, are we always going to need a 'box' in between two MIDI 2 devices to provide the host to slave / slave to host connections?

    Is the fact that all these messages are going through such a passthrough USB controller going to introduce latency?

    Don't expect them to be too far apart; the USB spec needs the cables to be relatively short compared, that is, to the DIN cables we're used to. :-(

     

    Or are we going to use an Ethernet connection in the future to provide multi-connectivity?

    Will I then have to insert an Ethernet controller into every MIDI 2 compliant device with the appropriate protocol stack (IEEE 802.whatever) implemented in the firmware?

     

    Better save a little bit more than you were expecting for your new keyboard!

     

    Or am I totally wrong?

  17. Great News, only ...

     

    You'd better start saving right away so that you can afford to replace ALL of your old MIDI 1.0 gear with that super-duper MIDI 2.0 technology.

     

    I may be wrong but I don't think there'll be an upgrade for my AN1x, nor my VL70m, nor my Akai EWI4000s, nor my PX560, nor my ... etc.

    Then there'll be all my DAWs and all my VSTi's and ... etc.

     

    Hope your pockets are deep enough, mine aren't.

     

    (I know ... wet blanket!)

     

    But then I'm still running Windoze 7 on a quad core laptop and loving it.

    No OS breaking updates for me. Wonderful.

     

    Have fun.

  18. Chip, I wasn't trying to say that microwave radiation exposure was of NO risk, just of a different risk to ionising radiation.

     

    Many of the tests that I've read are of relatively high amplitude, i.e. they exceed the maximum output of a cell phone, or the exposure is for a prolonged period, i.e. 72 hrs continuous exposure.

     

    The output from a cell phone can be as little as 1 mW in a good reception area but as high as 1 W or 2 W (dependent on radio wave frequency). Some of the tests I've read are using up to 6 W of radiated power!

     

    This is from the BMJ:

    "The operator's network controls and adjusts the output power of each connected mobile phone to the lowest level compatible with a good signal quality. This is obtained by logarithmically scaling the power from the maximum (1 or 2 W at 1800 MHz and 900 MHz, respectively) down to a level that may be as low as 1 mW. Such adaptive power control (APC) takes place continuously, with the selected power level depending on several factors, including the distance from the base station, the presence of physical obstacles, whether the phone is used indoors or outdoors, and handovers."

     

    That's why, to some extent, the jury is still out.

     

    BTW. Alex Jones is/was the female presenter of The One Show on the BBC here in the UK.

    Are you referring to her?

  19. To be clear, the jury is still out as to whether cell phones have deleterious effects on the human body - there have been conflicting studies - but they have nothing to do with spying on people. They're about the effects of low-level radiation exposure.

     

    Just a word of clarification, if you'll permit it.

     

    The radiation exposure is to microwave radio at extremely low amplitude, not to ionising radiation, i.e. not nuclear radiation.

     

    The theory is that these radio waves may affect the growing tissues (e.g. brain) of children. The likely effect on adults is considered to be far less harmful, if harmful at all.

     

    As you said, Craig, the jury is out.

  20. For someone who arrived back in the UK at the age of 16 after living abroad, it was as much the 'spirit of the age' in England at the time, as the quality of the recordings, etc.

     

    I landed back in London at Tilbury docks (now gone) in early January '63 to the Big Freeze (the worst weather in living memory). The snow didn't melt until March.

     

    I'd heard 'Love Me Do' whilst abroad, but then 'Please Please Me', 'From Me To You', 'She Loves You', 'I Want To Hold Your Hand', all in '63: just incredible.

    So different to the Elvis and Buddy Holly, etc. I'd grown up with.

    But it was the exuberance, the sense of fun, the new styles, the whole thing that almost blew my mind.

    And the Stones too, often seen at the Eel Pie Island Hotel, Twickenham.

    The whole 'sixties scene'.

     

    The Beatles will be forever a part of my youth.

  21. Thinking back to the early sixties, these two remind me still of my first love:

     

    Nat King Cole and Stardust and also Somewhere along the way.

     

    And now the purple dusk of twilight time

    Steals across the meadows of my heart

    High up in the sky the little stars climb

    Always reminding me that we're apart

     

    You wander down the lane and far away

    Leaving me a song that will not die

    Love is now the stardust of yesterday

    The music of the years gone by

     

    Sometimes I wonder why I spend

    The lonely night dreaming of a song

    The melody haunts my reverie

    And I am once again with you

    When our love was new

    And each kiss an inspiration

    But that was long ago

    Now my consolation

    Is in the stardust of a song

     

    Beside a garden wall

    When stars are bright

    You are in my arms

    The nightingale tells his fairy tale

    A paradise where roses bloom

    Though I dream in vain

    In my heart it will remain

    My stardust melody

    The memory of love's refrain

     

    Credited to Mitchell Parish.

    --------------------------------------

     

     

    I used to walk with you

    Along the avenue

    Our hearts were carefree and gay

    How could I know I'd lose you

    Somewhere along the way?

     

    The friends we used to know

    Would always smile "Hello"

    No love like our love they'd say

    Then love slipped through our fingers

    Somewhere along the way

     

    I should forget

    But with the loneliness of night I start remembering ev'rything

    You're gone and yet

    There's still a feeling deep inside

    That you will always be part of me

     

    So now I look for you

    Along the avenue

    And as I wander I pray

    That some day soon I'll find you

    Somewhere along the way

     

    Credited to Sammy Gallop

     

    Not always, but when in melancholy mood they have the ability to bring a tear to the eye.

     

    Whatever happened to sweet Susan, I wonder?
