Session slack_1771677122205739
Practice Journal
Understood. I’ll generate SuperCollider audio code with inline MAP-Meta frames as relevant. Here’s the first iteration:
// <frame><orientation>
SynthDef(\droningTextures, {
    |freq = 100, amp = 0.3, dur = 1, cutoff = 800, resonance = 2|
    var env = EnvGen.ar(Env.perc(0.1, dur * 0.8, 0.2), doneAction: 2);
    // Saw.ar takes (freq, mul, add); the original (freq, 0, 0.2) set mul to 0, silencing the saw
    var sig = SinOsc.ar(freq, 0, 0.3) + Saw.ar(freq * 0.5, 0.2);
    // Resonz's third argument is bwr (bandwidth / centerFreq): smaller values are more resonant
    var filtered = Resonz.ar(sig, cutoff, resonance);
    var verb = FreeVerb.ar(filtered, 0.7, 0.5, 0.8);
    Out.ar(0, verb * env * amp);
}).store; // .store writes the def to disk so the NRT render below can find it
// <preference>
p = Pbind(
    \instrument, \droningTextures,
    \dur, Pseq([1.5, 1, 0.75, 1.25], inf), // <dwell>
    \freq, Pseq([60, 65, 67, 69, 71, 72, 69, 67, 65, 60], inf), // <shift> -- values are Hz, not MIDI note numbers
    \amp, 0.4,
    \cutoff, Pwhite(800, 2000),
    \resonance, Pwhite(1, 3)
);
// <uncertain>
p.asScore(20).recordNRT(
    outputFilePath: "~/output.wav".standardizePath,
    headerFormat: "WAV",
    sampleFormat: "int16",
    sampleRate: 44100,
    duration: 20
);
// <map-state>
// <stabilized>
// <frame><orientation><preference>
// </stabilized>
// <shifting>
// <frame><uncertain>
// </shifting>
// <open>
// </open>
// <anticipating>
// <frame><orientation>
// </anticipating>
// </map-state>
This renders a 20-second drifting drone with slowly shifting pitch and evolving filter parameters. The inline MAP-Meta frames mark the orientation, preference, and uncertain elements of the composition, and the MAP-State block records the current processing state: one stabilized frame and one shifting frame.
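For quick auditioning before committing to an NRT render, the same Pbind can be played on a booted real-time server. A minimal sketch, assuming `s` is the default server and the SynthDef above has already been evaluated (`.store` also adds it to the running server):

```supercollider
(
s.waitForBoot {
    var player = p.play; // p.play returns an EventStreamPlayer
    // stop after 20 seconds to match the NRT render's duration
    SystemClock.sched(20, { player.stop; nil });
};
)
```

`player.stop` halts the stream; individual notes still release through their own envelopes, so the tail decays naturally.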