I’ve been working on a next-gen DAW for a few years, and this article misses all of my priorities. AI will encourage mindless replication; modern tooling should instead be analytical, precise, deliberate, flexible, and layered in complexity.
My priorities are more in line with turning music production into an integrated, interactive development environment built on modern design principles: sub-modular capability, A/B testing, git integration, non-local collaboration, scientific visualization, notebook-style experimentation, integrated synth build/play, an optional web-embedded interface, social sharing and tutorials, a polyglot open-source interface (primarily Rust), programmable behavior/macros, higher-order signal-dependency optimization, algorithmic mastering, targeted oversampling, creative-process reusability, etc. You can solve the plugin issue by simply synchronizing the output audio rendered by the user who has the plugin installed.
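That plugin-synchronization idea could be sketched roughly as follows: the peer who has the plugin publishes its rendered output keyed by a hash of the plugin's identity and state, and collaborators without the plugin play back the cached render. All names and types here are hypothetical, a minimal sketch rather than any real DAW's API:

```rust
use std::collections::hash_map::DefaultHasher;
use std::collections::HashMap;
use std::hash::{Hash, Hasher};

/// Hypothetical cache: peers who lack a plugin play back the
/// audio rendered and published by the peer who has it installed.
struct RenderCache {
    // key = hash of (plugin id, serialized plugin state)
    rendered: HashMap<u64, Vec<f32>>,
}

impl RenderCache {
    fn new() -> Self {
        Self { rendered: HashMap::new() }
    }

    // Derive a cache key from the plugin id and its serialized state,
    // so a parameter change invalidates the shared render.
    fn key(plugin_id: &str, state: &[u8]) -> u64 {
        let mut h = DefaultHasher::new();
        plugin_id.hash(&mut h);
        state.hash(&mut h);
        h.finish()
    }

    /// Called on the machine that has the plugin installed.
    fn publish(&mut self, plugin_id: &str, state: &[u8], audio: Vec<f32>) {
        self.rendered.insert(Self::key(plugin_id, state), audio);
    }

    /// Called on machines that do not have the plugin; `None` means
    /// the render for this exact state has not been synced yet.
    fn fetch(&self, plugin_id: &str, state: &[u8]) -> Option<&Vec<f32>> {
        self.rendered.get(&Self::key(plugin_id, state))
    }
}
```

A real implementation would key on a content hash of the serialized state and stream renders over the collaboration transport, but the lookup shape stays the same.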
Quality music production is an opaque art, and everything is far more daunting than it needs to be. Most producers just mess around until it sounds good, which gets people stuck in a local maximum of clarity. If experimenting takes too long, you won’t push through the trial-and-error needed for understanding. I have spent 15 years building tools as a research quant dev, and I am also a DJ (Extrn). There is a huge unaddressed gap in the audio space, with huge barriers to entry in accessibility and cognitive burden.
Spot on. It's telling that most DAW UIs are based on archaic ideas (like mixers/tracks and MIDI/step-sequencer grids). Even BuzzTracker (2009) outshines many of the current DAWs with DAG instrument/effects-chaining and arrow-key/cursor navigation.
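To make the contrast concrete: in a DAG chain, a node can feed several downstream effects and receive from several upstream ones, rather than sitting in one linear track. A toy sketch (node ids, gains, and the memoized single-sample evaluation are all my own invention, not any tracker's actual model):

```rust
use std::collections::HashMap;

/// Toy DAG of effect nodes: each node scales the sum of its
/// inputs by a gain (a stand-in for a real DSP process).
struct Graph {
    gains: Vec<f32>,
    // inputs[i] lists the upstream nodes feeding node i
    inputs: Vec<Vec<usize>>,
}

impl Graph {
    /// Evaluate node `id` for one sample, assuming the graph is
    /// acyclic; `source` drives nodes that have no inputs. The memo
    /// ensures shared upstream nodes are computed only once.
    fn eval(&self, id: usize, source: f32, memo: &mut HashMap<usize, f32>) -> f32 {
        if let Some(&v) = memo.get(&id) {
            return v;
        }
        let ins = &self.inputs[id];
        let x = if ins.is_empty() {
            source
        } else {
            ins.iter().map(|&u| self.eval(u, source, memo)).sum()
        };
        let y = self.gains[id] * x;
        memo.insert(id, y);
        y
    }
}
```

A mixer/track UI is the special case where every `inputs[i]` is a single predecessor; the DAG form is what makes sidechains and shared sends first-class instead of workarounds.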
My post-graduate research concerned signal-rather-than-event-based generation/transformation of compositional data, integrated with textural/timbral synthesis.
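For readers unfamiliar with the distinction: event-based systems store a list of discrete notes, while signal-based generation treats compositional parameters like pitch as continuous functions of time. A deliberately tiny, hypothetical illustration:

```rust
/// Toy signal-based pitch: pitch is a continuous function of time,
/// not a list of note events. Hypothetical example: a linear glide
/// from 220 Hz to 440 Hz over one second.
fn gliss(t: f32) -> f32 {
    220.0 + 220.0 * t.clamp(0.0, 1.0)
}
```

An event-based representation can only approximate this glide with a stair-step of discrete pitch events, which is the gap the signal-based approach addresses.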
My current focus is building a DSP framework for this purpose in C++20 [1].
In any case I'm interested in following your progress, and happy to contribute code/ideas if you feel like collaborating (links in profile).