Interactive Swarm Space

Events

Table of contents

What Are Events and When Should I Use Them?
Score Events
Real-time Events
Audio/Event Conversion Units



What Are Events and When Should I Use Them?

An event can be described as an action message sent to a unit at a specific time. A few examples should make the picture clearer: some events simply switch a unit on or off, some set a control port (e.g. "frequency") to a certain value, and some generate a smooth interpolated ramp between two control values. In short, events control the performance of a unit by sending it commands. That said, it is important to recognize that not all interaction fitting this profile is actually done by events. Usually, when two units interact, there are no events passed from one to the other. Take "frequency modulation" for instance: a very simple version of FM synthesis consists of two oscillators, a carrier and a modulator. The modulator's output is connected to the frequency of the carrier, hence the name "frequency modulation". No events are needed for this connection, because you can directly connect the output of the modulator to the frequency control port of the carrier.

modulator->connect(carrier, "frequency");

However, once the two units are connected, they keep on doing whatever they are connected to do until the synth is stopped. This interaction clearly lacks the timing criterion common to all event-driven cases. Events are therefore appropriate in all situations where either something should happen at a pre-defined time (we shall call this a "score event") or an external control source like your computer keyboard or a MIDI fader controls a unit in real time (a "real-time event").



Score Events

Generating score events is rather simple. As the following example syntax shows, you specify which unit will receive the event, on which port, at what time, what the event's target value is, and finally how long it should take to reach this target value - all values in between are linearly interpolated. All arguments of schedule() are floats, and time values are given in milliseconds.

Schedule an Event on a Unit

UNIT_NAME->schedule(START_TIME,"PORT_NAME", TARGET_VALUE, INTERPOLATION_TIME); 

An example:

oscillatorUnit->schedule(5000.0f, "frequency", 4156.33f, 3500.0f);

"Change the frequency of oscillatorUnit to 4156.33 Hz, starting 5.0 seconds after synth start, with an interpolation time of 3.5 seconds."
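The timing semantics of such an event can be sketched library-independently. Assuming the port held some start value when the event fires, the scheduled value at time t is a linear ramp from that value to the target over the interpolation time (a minimal sketch with illustrative names; not the actual ISO implementation):

```cpp
#include <cassert>

// Value of a port at time t (ms), for an event scheduled at startTime (ms)
// that ramps linearly from startValue to targetValue over interpTime (ms).
// Illustrative sketch of the schedule() semantics only.
float scheduledValue(float t, float startTime, float startValue,
                     float targetValue, float interpTime)
{
    if (t <= startTime) return startValue;           // event not yet due
    if (interpTime <= 0.0f || t >= startTime + interpTime)
        return targetValue;                          // ramp finished (or a jump)
    float phase = (t - startTime) / interpTime;      // 0..1 along the ramp
    return startValue + phase * (targetValue - startValue);
}
```

With the example above (start 5000 ms, interpolation 3500 ms), the port still holds its old value before 5.0 s and reaches 4156.33 Hz at 8.5 s.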

Schedule an Event on a ControlPort or SwitchPort

SWITCH_PORT->schedule(START_TIME, TARGET_VALUE, INTERPOLATION_TIME); //SwitchPort
CONTROL_PORT->schedule(START_TIME, TARGET_VALUE, INTERPOLATION_TIME); //ControlPort

An example:

oscillatorUnit->switchPort("active")->schedule(2000.0f, 0.0f, 0.0f);
oscillatorUnit->controlPort("frequency")->schedule(2500.0f, 112.5f, 1000.0f);

"2.0 s after synth start, set the SwitchPort to inactive (0.0); don't interpolate (the second 0.0)."
"2.5 s after synth start, start interpolating the frequency ControlPort from its previous value to 112.5; interpolation time = 1 second."

Remark:

For SwitchPorts, interpolation times greater than 0 almost never make sense, as the example above illustrates.

Schedule an Event on a Generic EventTarget (Rare)

Synth::get().eventManager().createEvent(START_TIME,
                                        EVENT_TARGET,
                                        TARGET_VALUE,
                                        INTERPOLATION_TIME); 

A real-life example:

Synth::get().eventManager().createEvent( 10000.0f,
                                         someUnit->switchPort("active"),
                                         0.0f,
                                         0.0f);

This approach is needed in cases where you don't know in advance whether the EventTarget is a SwitchPort or a ControlPort.
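The reason this works is the class hierarchy: both SwitchPort and ControlPort inherit from EventTarget, so createEvent() can accept either. A minimal, library-independent sketch of that relationship (the class names mirror the docs, but the bodies are illustrative, not ISO code):

```cpp
#include <string>

// Illustrative sketch of the EventTarget hierarchy; not the ISO implementation.
struct EventTarget {
    virtual ~EventTarget() {}
    virtual std::string kind() const = 0;
};

struct ControlPort : EventTarget {
    std::string kind() const override { return "ControlPort"; }
};

struct SwitchPort : EventTarget {
    std::string kind() const override { return "SwitchPort"; }
};

// A createEvent()-style function only needs the EventTarget interface,
// so it works for both port types without knowing which one it received.
std::string describeTarget(const EventTarget& target) {
    return "event scheduled on a " + target.kind();
}
```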



Real-time Events

You can use the above syntax to create real-time events: simply set START_TIME to 0.0f or less, and they will be processed as soon as possible. However, this approach is mainly relevant when you create your own units that send events.

In most cases when talking about real-time events, you probably mean the use of real-time controls like MIDI or OSC in your synth. This has been documented in our External Control section.

Another juicy feature of ISO is of course the possibility to control your synthesis patch using real-time data from your ISOFlock. The Networking chapter of the ISOFlock documentation describes how to get there.



Audio/Event Conversion Units

ATEUnit(s) - Audio To Event Conversion

This is the base class of all units concerned with observing an audio stream and sending out events to another unit's EventTarget (ControlPort or SwitchPort), either at regular intervals or when specific conditions occur in the audio stream. Remember that you can also use these units with ControlPort data streams, because those are performed at audio rate as well!

EventTargets are initially set when calling the constructor. In fact, all ATEUnits (and subclasses thereof) share the same constructor syntax: the last constructor argument is always a QVector of EventTargets:

PinkNoise* noise = new PinkNoise(); 
WaveTableOscil* carrier = new WaveTableOscil("sinewave"); //sine wave oscil

QVector<event::EventTarget*> targets = QVector<event::EventTarget*>();
targets.append(carrier->controlPort("frequency")); // get ControlPort* pointer
                                                   // (which is by inheritance
                                                   // an EventTarget)

ATEUnit* audio2event = new ATEUnit(targets); // connects audio2event with carrier's
                                             // "frequency" ControlPort
noise->connect(audio2event); // feed noise into audio2event
carrier->connect(outputUnit);

Notice that there is no connect() link between the ATEUnit and the carrier. The event-based connection is established by passing the carrier's EventTarget to the constructor of ATEUnit - here: the ControlPort "frequency" of the sine wave oscillator.

ATEUnit (base class)
    Purpose: sends a sample value as an event at regular intervals.
    SwitchPorts: interval (sampling interval in ms),
                 interpolation (event interpolation time in ms)

ATEAverageUnit
    Purpose: sends the average of all samples in an interval.
    SwitchPorts: interval, interpolation

ATEMaxUnit
    Purpose: sends the maximum value of an interval.
    SwitchPorts: interval, interpolation

ATEMinUnit
    Purpose: sends the minimum value of an interval.
    SwitchPorts: interval, interpolation

ATEMaxAmpUnit
    Purpose: sends the maximum absolute value (maximum amplitude) of an interval.
    SwitchPorts: interval, interpolation

ATEMinAmpUnit
    Purpose: sends the minimum absolute value (minimum amplitude) of an interval.
    SwitchPorts: interval, interpolation

ATEConditionalUnit
    Purpose: checks the audio stream at regular intervals, but only sends out a
    sample if it meets the condition settings (see the remark below).
    SwitchPorts: interval, interpolation,
                 operator (logical operator index 1), argument (conditional argument 1),
                 operator2 (logical operator index 2), argument2 (conditional argument 2)
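The per-interval statistics these units send can be sketched in plain C++ (library-independent; `window` stands for the samples collected during one sampling interval, and the function names are illustrative):

```cpp
#include <vector>
#include <cmath>
#include <algorithm>
#include <cassert>

// Illustrative sketches of the per-interval values the ATE units send.

// ATEAverageUnit: average of all samples in the interval.
float ateAverage(const std::vector<float>& window) {
    float sum = 0.0f;
    for (float s : window) sum += s;
    return sum / static_cast<float>(window.size());
}

// ATEMaxUnit: maximum sample value in the interval.
float ateMax(const std::vector<float>& window) {
    return *std::max_element(window.begin(), window.end());
}

// ATEMaxAmpUnit: maximum absolute value (amplitude) in the interval.
float ateMaxAmp(const std::vector<float>& window) {
    float maxAmp = 0.0f;
    for (float s : window) maxAmp = std::max(maxAmp, std::fabs(s));
    return maxAmp;
}
```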

Remark on ATEConditionalUnit:

You set the conditional statement via the operator/argument SwitchPort settings. The operator index represents the conditional operator that will be used:

Index   Conditional Operator
0       ≠ (NOT EQUALS)
1       = (EQUALS)
2       < (LESS THAN)
3       ≤ (LESS OR EQUAL)
4       > (GREATER THAN)
5       ≥ (GREATER OR EQUAL)

The argument SwitchPorts describe the right-hand side of the conditional statement. So if you wanted to check at regular intervals whether your control data stream is greater than 150.0, you would set the operator and argument SwitchPorts as follows:

myConditionalUnit->set("operator", 4.0); // ">"
myConditionalUnit->set("argument", 150.0); 

As you probably noticed, there is a second operator/argument set. This second condition is evaluated only if the first condition evaluates to true. In pseudo code:

if ( (audio_data operator argument) AND
     (audio_data operator2 argument2) )
      { send audio_data }

This is useful, for example, if you would like to check whether your audio stream data lies within a certain range - around zero, perhaps:

myConditionalUnit->set("operator", 4.0); // ">"
myConditionalUnit->set("argument", -0.001); // lower boundary
myConditionalUnit->set("operator2", 2.0); // "<"
myConditionalUnit->set("argument2", 0.001); // upper boundary

As already stated, these two operator/argument conditions are connected via a logical AND (&&). In cases where you need a logical OR, we suggest you simply create a second unit using the same audio input and sending events to the same EventTarget.
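The full test, including the decoding of the operator indices and the implicit AND, can be sketched library-independently (illustrative names; not the actual ISO code):

```cpp
#include <cassert>

// Decode an operator index as set via the "operator"/"operator2" SwitchPorts.
// Illustrative sketch of ATEConditionalUnit's test, not the ISO implementation.
bool compare(int opIndex, float lhs, float rhs) {
    switch (opIndex) {
        case 0: return lhs != rhs; // NOT EQUALS
        case 1: return lhs == rhs; // EQUALS
        case 2: return lhs <  rhs; // LESS THAN
        case 3: return lhs <= rhs; // LESS OR EQUAL
        case 4: return lhs >  rhs; // GREATER THAN
        case 5: return lhs >= rhs; // GREATER OR EQUAL
        default: return false;     // unknown index: never send
    }
}

// Both conditions must hold: (sample op arg) AND (sample op2 arg2).
bool conditionMet(float sample, int op, float arg, int op2, float arg2) {
    return compare(op, sample, arg) && compare(op2, sample, arg2);
}
```

With the range example above (operator 4 / argument -0.001, operator2 2 / argument2 0.001), conditionMet() is true only for samples strictly between -0.001 and 0.001.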

Pitfall: the interval SwitchPort setting:

You might expect an event to be sent whenever the condition is met. This is however only true when interval is set to 0, meaning your audio stream is evaluated on every sample. That is of course somewhat expensive and might, in the worst case, lead to your ATEConditionalUnit sending out an event on every audio sample!

ETAUnit(s) - Event To Audio Conversion

As in the ATEUnit family, ETAUnit is the base class for all event-to-audio converter units. The base class's job is to set its output audio stream to the event value received on its SwitchPort "event".

ETAUnit (base class)
    Purpose: sets the audio output stream to the received event value.
    SwitchPorts: event (received event value)

ETAFadeUnit
    Purpose: creates linear fades between received events.
    SwitchPorts: event, fadesize (fade length in milliseconds)

ETAFadeToValueUnit
    Purpose: jumps to the received value, then fades back to a base value.
    SwitchPorts: event, decay (decay/fade length in ms), base (base value)

ETACustomFadeUnit
    Purpose: like ETAFadeUnit, but with a custom fade shape (stored as a
    Channel* pointer). Default shape: sigmoid (see the remark below).
    SwitchPorts: event, fadesize (fade length in milliseconds)
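For example, ETAFadeToValueUnit's behavior can be sketched as: jump to the event value when the event arrives, then fade linearly back to the base value over the decay time (a library-independent sketch with illustrative names, not the ISO implementation):

```cpp
#include <cassert>

// Output of an ETAFadeToValueUnit-style envelope at time t (ms):
// jumps to eventValue at eventTime, then fades linearly back to
// baseValue over decay (ms). Illustrative sketch only.
float fadeToValueOutput(float t, float eventTime, float eventValue,
                        float baseValue, float decay)
{
    if (t < eventTime) return baseValue;                  // event not yet received
    if (decay <= 0.0f || t >= eventTime + decay)
        return baseValue;                                 // fade finished
    float phase = (t - eventTime) / decay;                // 0..1 through the fade
    return eventValue + phase * (baseValue - eventValue); // linear decay
}
```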

Remark on ETACustomFadeUnits:

You can either create your own shape by instantiating a Channel object and filling it with data points that represent the shape of your fade, or select one of the default shapes. In order to create smooth fades, your fade shape must start at 0.0 and end at 1.0. These fade shapes are scaled and stretched according to your SwitchPort settings and the received event values.

ETACustomFadeUnit* fadeUnit = new ETACustomFadeUnit();

... uses the default sigmoid shape. The following statement is equivalent:

Channel* fadeShape = WaveTableManager::get().getWaveTable("sigmoidwave");
ETACustomFadeUnit* fadeUnit = new ETACustomFadeUnit(fadeShape);
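The scale-and-stretch step described above can be sketched as follows: a shape sample running from 0.0 to 1.0 is mapped onto the segment between the previous output value and the new event value. This is a library-independent sketch (the table lookup is simplified to nearest-index, and all names are illustrative, not the actual ISO Channel API):

```cpp
#include <vector>
#include <cassert>

// Map a normalized fade position (0..1) through a stored shape table
// (whose samples run from 0.0 to 1.0) and scale the result onto the
// segment between `from` (previous value) and `to` (new event value).
// Illustrative sketch only; not the actual ISO Channel lookup.
float shapedFade(const std::vector<float>& shape, float position,
                 float from, float to)
{
    if (shape.empty() || position <= 0.0f) return from;
    if (position >= 1.0f) return to;
    // nearest-index lookup into the shape table (no interpolation)
    std::size_t idx =
        static_cast<std::size_t>(position * (shape.size() - 1) + 0.5f);
    return from + shape[idx] * (to - from);
}
```

Because the shape starts at 0.0 and ends at 1.0, the fade begins exactly at the previous value and lands exactly on the new event value, whatever the shape does in between.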