
Logic gets MIDI 2.0 on Big Sur+ with 10.7.2


Recommended Posts

https://support.apple.com/en-us/HT203718

 

Huge list of new features and fixes.

 

 

New in Logic Pro 10.7.2

Stability and Reliability

Resolves an issue where Logic could quit unexpectedly when performing Flex Pitch analysis.

Logic no longer quits unexpectedly when tabbing to the end of a region after inserting a chord symbol into the Score.

Fixes an issue where Logic could quit unexpectedly when loading presets in Audio Units plug-ins, or copying tracks that contain Audio Units plug-ins.

Fixes an issue where Logic could quit unexpectedly when converting imported REX2 files.

Resolves various issues with Audio Units plug-ins that might cause Logic to quit unexpectedly.

Resolves an issue where Logic could quit unexpectedly when selecting a new patch while a Sampler window is open.

Performance

Improves performance and responsiveness when performing Smart Tempo analysis while the project is playing.

Performance is improved when selecting files in the Audio Import window on macOS Monterey.

The Logic interface now remains fully responsive when track level meters are displayed.

Fixes an issue where MIDI notes could stick when played through a Summing Stack using Logic's built-in instruments on Apple silicon Macs.

Notes created using the Pencil tool in GarageBand for iOS no longer get stuck when played back in Logic Pro.

Logic no longer hangs when zooming in to the maximum zoom level with the Movie track open.

Resolves an issue in Autosampler where playback to USB audio interfaces could become distorted.

Spatial Audio

Muting channels in the Surround Panner now behaves as expected in Spatial Audio.

Accessibility

VoiceOver now consistently announces the selection state of items in the Project Settings windows.

The blue highlight now follows selected items in the Preferences windows in VoiceOver mode.

VoiceOver can now be used to control a channel strip's 3D Object Panner in Spatial Audio projects.

Live Loops

An alert is now displayed when a Step Sequencer pattern cell is converted to MIDI, warning if the contents require that they be aligned to a single pitch.

Resolves an issue where Software Instrument Live Loop cells could appear to be empty immediately after recording.

In full screen view, the Live Loops grid now reliably updates when toggling Ultrabeat cells.

Triggering and recording into Live Loops cells from control surfaces and MIDI controllers now works reliably.

Step Sequencer

Fixes an issue where playback can pause unexpectedly when recording large amounts of data into an unquantized Step Sequencer pattern in which step 1 has a negative offset.

Playback of a Step Sequencer region now continues as expected if the MIDI In port of the track is changed while it is playing.

Flex Time and Flex Pitch

Resolves an issue where audio files might not play after Flex Pitch analysis was performed.

Analyzing an audio file for Flex Pitch no longer resets existing Flex Pitch edits in the file.

Flex Pitch curves in the Audio Track Editor now consistently display as expected after an audio file is re-analyzed for flex pitch.

The Strength slider for Flex Pitch no longer jumps back to 100% after a lower value is set in the Audio Track Editor.

Flex Pitch data is now displayed correctly immediately after an audio file is analyzed for Flex Pitch.

Mixer

The Mixer now immediately shows the effect of changing from Post-fader mode to Pre-fader mode.

The Channel Input Mode button for Software Instrument tracks now displays reliably when the Mixer is set to Tracks view.

Deselecting multiple selected channel strips in the Mixer now leaves only the currently focused channel selected in the Track List.

Groups

All regions of grouped tracks are now selected when selecting a track that's a member of the group.

MIDI 2.0 Support

The Arpeggiator plug-in now sends the correct note-off messages to third-party Software Instruments and MIDI FX plug-ins when Logic is running in MIDI 2.0 mode.

Plug-ins

An enabled EQ thumbnail now consistently displays as expected.

Logic's instruments now consistently respond as expected when playing quarter-tone tunings in Legato mode.

Adding a second instance of a third-party MIDI FX plug-in to a project no longer causes the track with the first instance to stop playing.

Sampler and Quick Sampler

Changes to the modulation visualization of controls in Quick Sampler are now immediately visible.

Automation

Automation for the Tape Stop parameter in RemixFX now remains functional after the play head is manually dragged during playback.

Resolves an issue where the RemixFX Gate effect did not respond properly to automation.

Logic Remote

Changes to the length of a pattern region are now immediately visible in Logic Remote on iPhone.

Logic Remote now displays the Filter On/Off switch for Quick Sampler.

It is now possible to activate the Filters in the Gate plug-in using Logic Remote.

Left Delay and Right Delay controls for the Sample Delay plug-in are now available in Logic Remote.

The Gain and Q-Factor controls in the Single-Band EQ plug-in are now available in Logic Remote.

Control surface and MIDI controller support

If a newer LUA script for an installed control surface is available, Logic will now use that instead of the built-in settings.

Logic now retains changes made to the display mode of control surfaces running in Logic Control mode.

Control surfaces supported by LUA script now send and receive feedback when using MIDI 2.0.

The Control Surface setup window now displays the correct group number for a selected device.

Fixes an issue where Novation Launchpad could unexpectedly show closed Track Stacks as being empty.

Setting a control surface to move the play head by ticks no longer causes the play head to only move backwards.

Control surfaces now update to select newly created tracks in Logic.

Changing an assignment control name in the Controller Assignments window now updates the name in the Key Commands window and Smart Controls assignments inspector.

It is now possible to edit an assignment in the Controller Assignments window if a Smart Controls inspector is also open.

Export and bounce

Fixes an issue where canceling bounce-in-place of a pattern region could cause the region to become corrupted and uneditable.

Sound from Remix FX is now included in bounced projects.

Content

It is now possible to load a channel strip CST file by dragging it from the Finder to the channel strip header.

Impulse Response Utility

All rows of the level meter in the Impulse Response Utility now update correctly.

Undo

Fixes an issue where a triple click could prevent additional undo steps from being added.

Score

The strum up and strum down markings are now displayed as expected in the Tablature Settings window.

Editing

Notes created with the Brush tool now reliably use the quantize values chosen by key commands.

Scale and Quantize settings in the Piano Roll no longer reset to defaults when the Piano Roll window is closed and then re-opened.

The value scrubber in the Position > Operations column of the Transform window now works correctly.

Selecting a region in the Event List editor now deactivates an active Marquee selection.

General

Resolves an issue in which MIDI might sometimes be recorded onto an Audio track.

Fixes an issue where edits to the currently selected and zoomed track could affect another track.

Auto-punch using Marquee selection now works consistently.

Fixes an issue where very short MIDI note events could sustain longer than expected when the MIDI 2.0 preference is enabled.

Previously selected MIDI regions are now deselected after recording a new MIDI region with Overlap mode enabled.

The Dynamics control in the MIDI Region Inspector now properly limits MIDI values that would exceed the top range of possible MIDI values.

Musical Typing now remains active if a record-enabled MIDI track is not the focused track.

Resolves an issue where the Region Delay sometimes could not be set to 'ms.'

Resolves an issue in which using Option + Shift to copy a Marquee-selected area unexpectedly divided regions covered by the original Marquee selection.

Yamaha CP88, Casio PX-560



I have no clue what midi 2.0 is, guess I need to read up.

 

"Resolves various issues with Audio Units plug-ins that might cause Logic to quit unexpectedly."

 

I wonder if that fixed the issues I had with several plugins (that passed validation) after I upgraded to Monterey. Most of these I put down to the plugins being old and unsupported (chiefly iZotope's Exponential Audio reverb plugins), but Kazrog True Iron has been bombing Logic if I use it on stereo tracks.

 

Always amazes me when I see a huge list of bug fixes that I (mostly) haven't encountered :)


I'll guess that most musicians have yet to use 5% of the capabilities of midi 1.0. I'm still glad they came up with 2.0 though. The neat thing is that it's supposed to be backward-compatible with 1.0 so in theory you can continue to do everything the way you've been doing it, with the v1.0 gear you've been doing it with, without issues. If I can't hook up my Casio CZ-101 I'm gonna be pissed! :)

Overview of MIDI 2.0...

 

 

I think the biggest advantages for keyboard players will apply to those who are using computers (e.g. connecting to a DAW or triggering VSTs). I'm not sure there's any apparent advantage yet to, for example, connecting one MIDI 2.0 keyboard to another. (And as before, MIDI per se generally isn't relevant when just using a board to play its internal sounds, that doesn't change with 1.0 vs. 2.0.)

 

Some of what 2.0 does is standardize ways to do some of what can already be done. For example, there are already things you can do if you need more than 16 channels, but now there will be a single standardized way of addressing more than 16 channels that won't require jumping through hoops (certain hardware/software or complicated configuring) or have restrictions on what does and doesn't work.

Or, a cool feature of Camelot Pro is that, without having to do any tedious data entry, it already knows the names of the patches in many keyboards you might attach; but now any MIDI 2.0 device will have a standard way of knowing all the patch names of any attached MIDI 2.0 device.

Or, Omnisphere maps its synth functions to lots of hardware, but instead of your VST synth having to be Omnisphere and your controller having to be a board that Omnisphere has been programmed to work with, any 2.0 app would have a standard way of addressing (or being addressed by) the controls of any 2.0 hardware. That kind of thing. Even the MPE stuff already exists to a good extent.

So I don't see 2.0 so much as being able to do something new as there being more flexibility and consistency in what things can be controlled and how, as well as less manual configuration of things. But then, it's all new to me, too.
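To make the "more than 16 channels" point concrete: the Universal MIDI Packet adds a 4-bit group field alongside the familiar 4-bit channel, so a 2.0 connection carries 16 groups of 16 channels each. A minimal sketch of that addressing (the helper names are my own, not from any MIDI library):

```python
# Sketch: how UMP's group field extends MIDI 1.0's 16 channels to 256.
# MIDI 1.0 packs the channel (0-15) into the low nibble of the status
# byte; UMP adds a 4-bit group field, giving 16 groups x 16 channels.

def flat_channel(group: int, channel: int) -> int:
    """Map a (group, channel) pair to a single 0-255 index."""
    if not (0 <= group < 16 and 0 <= channel < 16):
        raise ValueError("group and channel must each be 0-15")
    return group * 16 + channel

def group_and_channel(flat: int) -> tuple:
    """Inverse mapping: recover the group and channel from a flat index."""
    return divmod(flat, 16)

# A note on group 2, channel 5 addresses "flat" channel 37:
assert flat_channel(2, 5) == 37
assert group_and_channel(37) == (2, 5)
```

That's the whole trick: no special routing hardware or per-vendor port tricks, just one extra field in the packet.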

 

Ideally, sysex/NRPN headaches will be largely eliminated, because with nearly limitless CCs to work with, there should be little call for using these for functions that could easily be handled by a CC. (Not that this explains why some stuff is sysex today on boards that still have plenty of unused CC locations available.)

 

The optional jitter reduction when playing a VST is an interesting feature. It basically "delays" some things for consistency. The idea being that, for example, a constant latency of 10 ms is preferable to a latency that is varying between 5 ms and 10 ms. (Numbers for illustration purposes only.) The theory at least is that we can more easily psychologically adjust to something that is always off by the same amount than we can adjust to things that are off by varying amounts even if things are usually off by less. I'm curious to see whether I'll notice a difference.
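The trade described above (constant latency instead of variable latency) can be sketched in a few lines. This is a toy illustration of the timestamping idea, not Apple's or anyone's actual implementation, and the 10 ms figure is carried over from the illustrative numbers in the post:

```python
# Sketch of the jitter-reduction idea: rather than sounding each event
# the moment it arrives (arrival time varies with transport jitter),
# the receiver releases every event a fixed interval after the sender's
# timestamp, so all events are "off" by the same constant amount.
# FIXED_LATENCY_MS must be at least the worst-case transport delay.

FIXED_LATENCY_MS = 10.0  # illustrative, as in the post above

def release_time(timestamp_ms: float) -> float:
    """When the receiver should actually sound an event stamped at timestamp_ms."""
    return timestamp_ms + FIXED_LATENCY_MS

# Notes stamped at 0, 100, and 200 ms that arrived with 5-10 ms of
# jitter all sound exactly 10 ms late instead of 5-10 ms late:
timestamps = [0.0, 100.0, 200.0]
assert [release_time(t) for t in timestamps] == [10.0, 110.0, 210.0]
```

The cost is that everything is always 10 ms late; the benefit is that the *spacing* between notes is preserved exactly, which is what the ear notices.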

Maybe this is the best place for a shameless plug! Our now not-so-new new video at https://youtu.be/3ZRC3b4p4EI is a 40 minute adaptation of T. S. Eliot's "Prufrock" - check it out! And hopefully I'll have something new here this year. ;-)


Thanks, Scott.

 

Also, a nice "why it matters" article from Sweetwater. A lot of the long-time leadership that we are familiar with at Yamaha, Roland, Korg, etc. are members of the midi.org board, and I suspect a few of our moderators and posters here are members of midi.org. You can join if you have an interest.

 

https://www.sweetwater.com/insync/midi-2-0-what-actually-matters-for-musicians/



I suspect a few of our moderators and posters here are members of midi.org. You can join if you have an interest.

 

Well, I'm the President of the MIDI Association until January 2022...so it's true they'll let anyone join :) FWIW I did the "Introduction to MIDI 2.0" video above. I've also done MIDI videos for other conferences, Music China, etc., and liaise with AMEI (the Japanese equivalent of the MIDI Association) and music organizations in China.

 

The neat thing is that it's supposed to be backward-compatible with 1.0 so in theory you can continue to do everything the way you've been doing it, with the v1.0 gear you've been doing it with, without issues. If I can't hook up my Casio CZ-101 I'm gonna be pissed!

 

Your CZ-101 will be fine. The reason why MIDI 2.0 can pull off backward compatibility is because it's a language. The analogy I give is that if I learned to speak Italian tomorrow, it wouldn't mean I had forgotten how to speak English. It simply means I could converse in English or Italian. If a MIDI 2.0 device sends a capability inquiry message to a MIDI 1.0 device asking if it speaks MIDI 2.0 and doesn't get an answer, the MIDI 2.0 device defaults to talking to it using the MIDI 1.0 language.
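The fallback logic described above is simple enough to sketch. The function and its callback are hypothetical stand-ins, not a real MIDI library's API; the inquiry stands in for a MIDI-CI discovery message:

```python
# Sketch (hypothetical API) of the backward-compatibility handshake
# described above: a MIDI 2.0 device asks its peer whether it speaks
# 2.0, and falls back to the MIDI 1.0 "language" when no reply comes.

def negotiate_protocol(send_inquiry, timeout_s: float = 0.3) -> str:
    """Return the protocol to use with a peer device.

    send_inquiry stands in for sending a capability inquiry; it returns
    the peer's reply, or None if the peer never answers (i.e. it is a
    MIDI 1.0-only device such as a CZ-101).
    """
    reply = send_inquiry(timeout_s)
    return "MIDI 2.0" if reply is not None else "MIDI 1.0"

# A 1.0-only device never answers the inquiry, so we talk 1.0 to it:
assert negotiate_protocol(lambda timeout: None) == "MIDI 1.0"
# A 2.0 peer replies, so both sides can switch to the 2.0 protocol:
assert negotiate_protocol(lambda timeout: {"supports": "2.0"}) == "MIDI 2.0"
```

The key property is that the 1.0 device never has to know the negotiation happened at all; silence is a valid answer.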

 

Waiting for someone to post here that they've done something interesting with MIDI 2.0 before I'm going to start paying attention. Not sure why I need it yet.

 

MIDI 2.0 gear hasn't arrived yet, so you won't hear about what people are doing with it. Roland has a MIDI 2.0-ready keyboard, but there's not much it can talk with at the moment. One of the big roadblocks was the lack of MIDI 2.0 in macOS and Windows, but with Apple incorporating it, more will be happening. Apple, Microsoft, and Google are all active MIDI.org members.

 

A lot of MIDI 2.0 announcements were expected at the 2022 Winter NAMM, but that's been postponed to June 2022.

 

The biggest deal about MIDI 2.0, aside from more resolution, more controllers, and easier controller implementation, is that MIDI is now a dialogue instead of a monologue. Gear can talk to each other. I've seen a prototype piece of software that asked a synth about what parameters it had, and the equivalent of an editor/librarian came on-screen...even though no one had written an editor/librarian for it. It was just MIDI saying "thanks, I'll put that info onscreen now, then send the edited versions back to you if you want."


(And as before, MIDI per se generally isn't relevant when just using a board to play its internal sounds, that doesn't change with 1.0 vs. 2.0.)

 

I'm not sure that will be true, although only time will tell. MIDI is such a key technology, many synths use MIDI internally for many properties. Manufacturers want their devices to respond equally when being played live or when being recorded and played back from a sequencer. MIDI 2.0 availability might encourage developers to take advantage of higher resolutions and new expressive capabilities, both internally and externally.

Mike Kent

- Chairman of MIDI 2.0 Working Group

- MIDI Association Executive Board

- Co-Author of USB Device Class Definition for MIDI Devices 1.0 and 2.0

 


I love this! Thanks Elmer.

 

MIDI 2.0 will develop use cases, but many will be invisible, as with all foundational technologies. Mostly, when it works, we will take it for granted, like electricity. We don't get up in the morning and say electricity is great. We just brew the coffee and check the internet.

 

My personal wish is for more resolution. High resolution and characterful sound reproduction (e.g. a Leslie speaker versus a powered PA speaker) are often what separate more expressive instruments from less expressive ones. As increases in resolution appear, we may not notice that they were enabled by MIDI 2.0.

 

Good for Apple! Good for MMA! :cheers:


(And as before, MIDI per se generally isn't relevant when just using a board to play its internal sounds, that doesn't change with 1.0 vs. 2.0.)

 

I'm not sure that will be true, although only time will tell. MIDI is such a key technology, many synths use MIDI internally for many properties. Manufacturers want their devices to respond equally when being played live or when being recorded and played back from a sequencer. MIDI 2.0 availability might encourage developers to take advantage of higher resolutions and new expressive capabilities, both internally and externally.

Good point. One example would be that we occasionally see complaints about people hearing stepping when sweeping through some parameters on some boards; the move to 2.0 could eliminate that.



Osmose is 2.0, so if it ever shows up I might have to try the new Logic with it. Thanks for the heads up :)

RT-3/U-121/Leslie 21H and 760/Saltarelle Nuage/MOXF6/MIDIhub, 

SL-880/Nektar T4/Numa Cx2/Deepmind12/Virus TI 61/SL61 mk2

Stylophone R8/Behringer RD-8/Proteus 1/MP-7/Zynthian 4

MPC1k/JV1010/Unitor 8/Model D & 2600/WX-5&7/VL70m/DMP-18 Pedals

Natal drums/congas etc & misc bowed/plucked/blown instruments. 


Thanks for sharing that vid, Scott. For me, the killer new feature will be the peer discovery and utilization, ideally using Wi-Fi or something else low-latency. But I think it's going to take around 5 years for the ecosystem to develop. All I need is a direct brain-to-MIDI 2.0 adapter, and I'll be all set, thanks.

Want to make your band better?  Check out "A Guide To Starting (Or Improving!) Your Own Local Band"

 


Idle related thought: if one was building, say, a hardware stage keyboard, MIDI 2.0 would make a great internal protocol to coordinate the various subsystems. I mean, it's all there in the spec. That could lead to some fascinating interconnects, for example maybe exposing, say, the LFO modulations from keyboard #1 and having filter section #2 and osc #3 consume it.

 

A connected collection of MIDI 2.0 synths would give you a meta-synth, where any component could control any other. Mind = blown.


 


(And as before, MIDI per se generally isn't relevant when just using a board to play its internal sounds, that doesn't change with 1.0 vs. 2.0.)

 

I'm not sure that will be true, although only time will tell. MIDI is such a key technology, many synths use MIDI internally for many properties. Manufacturers want their devices to respond equally when being played live or when being recorded and played back from a sequencer. MIDI 2.0 availability might encourage developers to take advantage of higher resolutions and new expressive capabilities, both internally and externally.

 

Mike has been instrumental in MIDI 2.0's development, on multiple levels, and any speculation from him is grounded in reality. Aside from resolution, the opportunity for controllers beyond the usual mod wheel / pedal / ribbon controller might provide new and more expressive ways to access internal sounds.

 

There's also a competitive advantage for manufacturers. Suppose two synths are sitting side-by-side. With one, you can sense the stair-steps from quantization, while the other's controls have the feel and "liquid" sound quality of an analog synth like a Moog Voyager or whatever. I know which one I would want :)


There's also a competitive advantage for manufacturers. Suppose two synths are sitting side-by-side. With one, you can sense the stair-steps from quantization, while the other's controls have the feel and "liquid" sound quality of an analog synth like a Moog Voyager or whatever. I know which one I would want :)

Yes, I alluded to that as well, in the post four above yours, and it's probably even more of an issue in the "synthier" boards than the "acoustic instrument replicating" stuff that probably makes up more of what we typically talk about here. (Or is that just me??)

 

But here's one thing I'm curious about, that I'm hoping MIDI 2.0 might address... One thing that has been absent in MIDI is increment/decrement commands. One example is when you'd like to send a "next patch" command... but you can't; you need to send the absolute number of the desired patch. Or when you're using an external controller with endless encoders on it. One advantage of endless encoders internally is that you can start moving them and they will change (increment/decrement) relative to the current value of the assigned parameter, whatever that might be. But when using them to control parameters on an external device, it's not so straightforward, because there's no standard MIDI command to "increase filter cutoff frequency by value x" or whatever.

So getting controls in sync in terms of their initial values is still an issue, and it would be nice to eliminate that, whether through supporting increment/decrement or, via the two-way communication, by having device A (the controller) query device B (the one being controlled) as to the current value of some parameter. That certainly sounds like something MIDI 2.0 could do, which doesn't necessarily mean it's something vendors will do, I suppose. Though some kind of standard language for increment/decrement might permit a lot of this without having to query each parameter. I admit this entire conversation is at the edge of my understanding, so I could be missing something entirely. I haven't done any MIDI programming of substance since the 80s. ;-)



... But here's one thing I'm curious about, that I'm hoping MIDI 2.0 might address... One thing that has been absent in MIDI are increment/decrement commands...

 

MIDI 1.0 has had Increment and Decrement commands since the earliest days: Control Change #96 and #97. I think the DX7 was the first MIDI product to use them. But it is not easy to implement these relative controllers in a way that is interoperable with other products. The original MIDI 1.0 specification did not clearly define the implementation; the value of a relative control was not defined. The "Recommended Practice 18" specification adds some information, defining the target as the current RPN or NRPN. But still, implementation is rare and interoperability between devices is not common.

 

The Universal MIDI Packet (UMP) format and MIDI 2.0 protocol define new Relative Registered Controller (RPN) and Relative Assignable Controller (NRPN) messages. But only the core message format and data values are defined. Predictable implementation might be defined in further MIDI 2.0 specifications, such as Profile definitions. For example, a DAW Controller Profile (there is no such thing yet) might define how rotary encoders send relative values.
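The MIDI 1.0 mechanism Mike describes can be sketched as raw bytes. Per RP-18, CC#96 (Data Increment) and CC#97 (Data Decrement) act on whichever RPN/NRPN was most recently selected via CC#101/#100; the data byte of the increment message was left loosely defined in the original spec (part of the ambiguity he mentions), so 0 is used here. The helper names are illustrative, not from any library:

```python
# Sketch of MIDI 1.0 Increment/Decrement as defined by RP-18:
# first select an RPN with CC#101 (MSB) and CC#100 (LSB), then send
# CC#96 (Data Increment) to nudge that parameter up one step.

def select_rpn(channel: int, rpn: int) -> list:
    """Control Change bytes that select an RPN (e.g. 0 = pitch-bend range)."""
    status = 0xB0 | (channel & 0x0F)          # CC status byte for this channel
    return [status, 101, (rpn >> 7) & 0x7F,   # RPN MSB
            status, 100, rpn & 0x7F]          # RPN LSB

def data_increment(channel: int) -> list:
    """CC#96: increment the currently selected RPN/NRPN by one step.
    The data byte was left undefined by the original spec; 0 is common."""
    return [0xB0 | (channel & 0x0F), 96, 0]

# Nudge pitch-bend range (RPN 0) up by one step on channel 1:
msg = select_rpn(0, 0) + data_increment(0)
assert msg == [0xB0, 101, 0, 0xB0, 100, 0, 0xB0, 96, 0]
```

The awkwardness is visible even in this sketch: three messages just to move one parameter one step, with the step size itself unspecified, which is exactly the interoperability gap the relative-controller messages in UMP aim to close.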

 

The first MIDI 2.0 products will not have these features. It will take some years for widespread implementation of a wide range of the potential enhancements that will come via MIDI 2.0. I'm guessing that these Relative Controllers will not be among the first features to be delivered by products. But in the long run, I do agree that the kinds of features you describe will be useful.

 

Mike.


 


(And as before, MIDI per se generally isn't relevant when just using a board to play its internal sounds, that doesn't change with 1.0 vs. 2.0.)

 

I'm not sure that will be true, although only time will tell. MIDI is such a key technology, many synths use MIDI internally for many properties. Manufacturers want their devices to respond equally when being played live or when being recorded and played back from a sequencer. MIDI 2.0 availability might encourage developers to take advantage of higher resolutions and new expressive capabilities, both internally and externally.

Good point. One example would be that we occasionally see complaints about people hearing stepping when sweeping through some parameters on some boards; the move to 2.0 could eliminate that.

 

That stepping is usually from the resolution of the hardware control itself. Or perhaps the update speed of the scanning of the panel controls. If you look at the output on a MIDI monitor you'll often see devices skipping values even within the "limits" of MIDI 1.0. Turn the knob really quickly and it gets even worse.

 

Jerry


That stepping is usually from the resolution of the hardware control itself. Or perhaps the update speed of the scanning of the panel controls.

 

The filter cutoff frequency is often stepped intentionally in a chromatic fashion to ensure that resonance is in tune. So there are a lot of potential reasons behind the scenes that aren't always evident to someone not involved in the instrument's design process where these kinds of trade-offs are debated. Engineering is always a series of compromises rather than a simple question of what "correct" behavior looks like.

 

Sadly, I suspect that MIDI 2.0 will suffer from the same malaise as many prior advanced MIDI features: inconsistent implementation leading to manufacturers looking elsewhere for differentiation and getting by with a relatively anemic MIDI feature set. If you can choose between making your instrument a stellar stand-alone product, and hoping that integration with some other product will be a compelling feature, chances are a product manager will choose what's in their control and benefits all their users the vast majority of the time.

 

I do hope we see more granular resolution, tighter timing, and use of higher bandwidth USB connections at the very least. Working well with modern DAWs seems like the place you could see the most benefit initially, so perhaps we'll see some wins there. Using the bidirectional communication features of MIDI 2.0 to at least have an instrument identify itself without having to manually name a channel or preset would be a nice convenience there.

Acoustic: Shigeru Kawai SK-7 ~ Breedlove C2/R

MIDI: Kurzweil Forte ~ Sequential Prophet X ~ Yamaha CP88 ~ Expressive E Osmose

Electric: Schecter Solo Custom Exotic ~ Chapman MLB1 Signature Bass


Coming from a programming background, and having worked for two software publishers of compilers, the sad part of standards is how everyone implements them. I remember when I was at Symantec (Lightspeed) and C++ was just rolling out; we used to joke that we sold a "C+-" compiler because of how much of the language we didn't implement, or because Mike didn't like what the spec said. Later I worked at Borland, and some of our developers were part of the C++ standards group. It was cool when they held one of the standards group meetings in our area, so we got to hang out with some of the group. As with all standards, people come from different points of view, so interpretations of the standard lead to different implementations.

That stepping is usually from the resolution of the hardware control itself. Or perhaps the update speed of the scanning of the panel controls. If you look at the output on a MIDI monitor you'll often see devices skipping values even within the "limits" of MIDI 1.0. Turn the knob really quickly and it gets even worse.

 

I recall that the Panasonic DA7 mixer got around the stair-stepping problem by interpolating between values, so there were actually 1,024 steps instead of 128. So when you sent a MIDI message that went from a value of, say, 56 to 57 the amplifier itself went through 8 steps instead of 1. That was enough not to hear stair-stepping when you moved the faders.
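The interpolation trick described above is easy to sketch: each 7-bit step maps to 8 internal 10-bit sub-steps (128 × 8 = 1,024), and the gain stage walks through them instead of jumping. This is an illustration of the general technique, not the DA7's actual firmware:

```python
# Sketch of DA7-style fader interpolation: expand each 7-bit CC step
# (0-127) into 8 internal sub-steps (0-1023) so the gain stage glides
# through 8 small steps instead of one audible 7-bit jump.

def interpolate_step(old_cc: int, new_cc: int) -> list:
    """Internal 10-bit values to traverse when a CC value changes."""
    old_q, new_q = old_cc * 8, new_cc * 8   # 7-bit -> 10-bit
    direction = 1 if new_q >= old_q else -1
    return list(range(old_q + direction, new_q + direction, direction))

# Moving from CC value 56 to 57 passes through 8 internal steps,
# ending exactly at 57's 10-bit equivalent:
steps = interpolate_step(56, 57)
assert len(steps) == 8
assert steps[-1] == 57 * 8
```

With MIDI 2.0's 32-bit controller values, this kind of receiver-side smoothing becomes largely unnecessary, since the sender can transmit the fine-grained values directly.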

 

And in the immortal words of Herman Cain, "I don't have facts to back this up," but if the control was an analog potentiometer connected to a voltage source, and an A/D converter was reading that voltage, wouldn't the potential resolution be limited only by the A/D converter's resolution? As to whether to stair-step intentionally, doesn't keyboard tracking let you create a 1:1 correspondence between the frequency of the note and the filter's resonant frequency? It seems that would work for a chromatic or microtonal scale, as long as it was equal-tempered.

