Quadvector Gestural Fractal Synthesis, Current Implementation: Part I
in an attempt to not unload streams of incomprehensible information all at once, every 2-3 months, I will try to unload more bite-sized streams of incomprehensible information at a slightly more regular pace…you’re welcome. as we near the event horizon of the singularity referred to as “releasing the project to the public”, little things become big things, as if their mass increases by virtue of time compression.
one such thing is the synthesis engine concept, currently referred to as a quad-vector gestural synthesizer. this is one of a few “fractal” synth concepts that I have been working on for the last 2 years. the idea with this one is of a single synthesizer, tentacled with branching gestural parameter interfacing, that can, through various combinatory means such as key sequencing, hand position and mode switching, mutate into any sound I can imagine. this is accomplished using the multimodal gestural capabilities of the exo-voice prosthesis, whose interaction matrix is fractal. by “fractal”, i mean systems that evolve to higher states of “resolution” over time, from simple origin states. this applies to everything from design to programming to performative interaction.
in the design of the hand units, following pure function, I needed “keys” that were spaced to be comfortable for my fingers while also being attached to my hands unobtrusively, which was accomplished in the first, cardboard, version. over time, the design iterated toward things like better ergonomics, better light distribution, ease of printability, and so on, so the design and function evolved while still being based on the idea of the first version. since all aspects of the exo-voice share this design concept, the entire design evolves fractally; as new states are investigated, the resolution of each state increases toward a state of “perfection”, at which point it forms the basis of a new state of investigation.
each hand unit has one 3-axis accelerometer. the x axis measures tilt side to side, like turning a door knob. the y axis measures tilt forward to backward, like the “paint the fence” motion from karate kid (hope that is not too old of a reference for some of you). the easiest way to picture it mentally is to think about the range of motion your hand would have if your wrist were stuck in a hole in a thin glass wall; you could turn your wrist, and you could move your hand, from the wrist, up and down, as well as combinations of those two motions. this forms the basis of the gestural kinematics the system is built on. the x range is one parameter and the y range is a second one. the z axis is unused right now. it’s the functional equivalent of using a touchpad in 3d space, with x and y acting as two semi-independent control vectors.
there are a number of ways to use these vectors.
- Individually (per hand): x is one “knob” and y is a separate “knob”.
- Together (per hand): the combined x/y position is treated as one control point, similar to how a computer mouse works.
- Together (both hands): two x/y control grids.
- Quadvector: x/y-left and x/y-right are combined into a 4-dimensional control structure. think of this as video game mode, or similar to how flying a quadcopter works; the left hand is, in this metaphor, forward, backward, left, and right, and the right hand is up, down, turn left, and turn right. no one control is enough; all vectors must be interacted with simultaneously to control the position of the quadcopter in 3d space.
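to make these four modes concrete, here is a minimal python sketch of how the same two x/y readings could be grouped; the variable names and example values are my own illustration, not the actual exo-voice code:

```python
# two tilt readings per hand, each normalized to -1..1 (illustrative values)
left = {"x": 0.2, "y": -0.5}
right = {"x": 0.8, "y": 0.1}

# individually (per hand): four independent "knobs"
knobs = [left["x"], left["y"], right["x"], right["y"]]

# together (per hand): each hand becomes one 2d control point, like a mouse
left_point = (left["x"], left["y"])
right_point = (right["x"], right["y"])

# quadvector: both hands fused into a single 4d control vector,
# like the two sticks of a quadcopter transmitter
quad = (left["x"], left["y"], right["x"], right["y"])
```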
the exo-voice system works by routing accelerometer data to parameters which are determined by which “keys” are pressed while in edit mode. (a key in this system is a function initiated by pressing a sensor with the finger, like how wind instruments work, but here it is more symbolic, since the same function could be achieved in any number of ways. since I was trained as a sax player, my mind gravitates toward that modality.) so in “play” mode, the keys are interacted with as saxophone fingerings on a basic level. when the edit mode button is toggled, the entire system switches to edit mode and all the keys become function latches, i.e., if I press key 1, the accelerometer controls parameter 1 and parameter 2; if I press key 2, the accelerometer controls parameter 3 and parameter 4, and so on. there are 8 keys, so that is 16 base-level parameters that can be controlled.
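a hedged sketch of that key-to-parameter latching, assuming each key simply latches the accelerometer onto the next pair of parameters; the function and variable names are mine, not the project's:

```python
# each key latches the accelerometer onto one pair of parameters:
# key 1 -> params 1 & 2, key 2 -> params 3 & 4, ... key 8 -> params 15 & 16
def latched_params(key):
    return (2 * key - 1, 2 * key)

params = {}  # parameter number -> current value

def on_accel(key, edit_mode, x_tilt, y_tilt):
    """in edit mode, route x/y tilt to the parameter pair latched by `key`."""
    if not edit_mode:
        return  # in play mode the keys act as fingerings instead
    p_x, p_y = latched_params(key)
    params[p_x] = x_tilt
    params[p_y] = y_tilt

on_accel(2, True, 0.3, -0.7)  # key 2 now steers parameters 3 and 4
```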
in previous iterated versions of the software system, each key could instantiate a separate synthesizer when pressed hard (HP1-8), each with an edit matrix that was specific to that synth, which meant there were 16 parameters to control for each of the 8 synths. now, though, there is one synthesizer with 4 oscillators, each assigned to its own accelerometer axis. the HP modes are for “timbral routing”, which means that the synth is routed through a filter which makes it sound timbrally unique in ways such as being plucked or being blown. some parameters are already set, such as pan position, gain, delay and pitch offset, each taking one key, so that leaves 4 keys (8 parameters) to paint with, which is not a lot. this is where the fractal concept begins to pay off.
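the current layout can be summarized in a short sketch; the axis routing and the specific key numbers below are illustrative assumptions based on the description, not the project's actual assignments:

```python
# one synth, four oscillators, each driven by one accelerometer axis
osc_axis = {
    "osc1": "left_x",
    "osc2": "left_y",
    "osc3": "right_x",
    "osc4": "right_y",
}

# four keys already carry fixed parameters (hypothetical key numbers)...
fixed_keys = {"pan": 1, "gain": 2, "delay": 3, "pitch_offset": 4}

# ...which leaves 4 keys (2 parameters each, 8 total) free to "paint" with
free_keys = [k for k in range(1, 9) if k not in fixed_keys.values()]
print(free_keys)  # -> [5, 6, 7, 8]
```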
each key sends either a 1 or a 0 when pressed. in my system each key is assigned a power of two, i.e., key 1 = 1, key 2 = 2, key 3 = 4, key 4 = 8, key 5 = 16, key 6 = 32, key 7 = 64, and key 8 = 128. this is an 8-bit number sequence. one useful aspect of this is that each combination of key presses creates a completely unique number that cannot be achieved by any other combination of key presses. so, key 1, key 2 and key 4 (1, 2 and 8) equal 11. there is no other combination of key presses that will generate 11. in addition to 4 keys per hand, there are two button arrays: 3 buttons on top, controlled by the index finger, and 4 buttons on the bottom, controlled by the thumb. the upper buttons are global, so they can not be mode-switched (change function in edit mode), but the bottom button array on the left hand is used to control octave position in play mode. in edit mode all keys become parameter latches, so octaves aren’t necessary, but in edit mode, if you press key 1 while at the same time pressing lower button 1 (LB1), you get the number 257 and can control 2 new parameters not achievable with either button alone. this is how I was able to create relatable sax fingerings. in play mode, I use the octave button presses to add or subtract 12 or 24 from the midi note number that is generated from the key presses, but in edit mode, I simply gave each of the 4 octave buttons its own power-of-two value (256, 512, 1024, 2048), extending the sequence beyond 8 bits, which now means that when I press key 1, I have the original parameter position as well as 4 new ones, totaling 10 parameters per key press and 80 parameters overall. this is stage 2 fractal resolution.
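the encoding above fits in a few lines of python; `chord_code` is my own hypothetical name for it, not something from the exo-voice source:

```python
KEY_VALUES = [1, 2, 4, 8, 16, 32, 64, 128]  # keys 1-8, one bit each
OCTAVE_VALUES = [256, 512, 1024, 2048]      # lower buttons 1-4 (edit mode)

def chord_code(keys_pressed, octave_button=None):
    """sum the power-of-two values of the pressed keys.

    every combination sums to a unique number: keys 1, 2 and 4
    (1 + 2 + 8) give 11, and no other combination can produce 11.
    """
    code = sum(KEY_VALUES[k - 1] for k in keys_pressed)
    if octave_button is not None:
        code += OCTAVE_VALUES[octave_button - 1]
    return code

print(chord_code([1, 2, 4]))             # -> 11
print(chord_code([1], octave_button=1))  # -> 257 (key 1 + LB1)
```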
so, as you can imagine, there is ample opportunity for this to become a class A headfuck. it is not enough to simply throw parameters in just because one can. functions have vectors as well; otherwise they become unwieldy. the question that must be asked is “what function naturally emerges as ‘next’?” it is at this point that I use a mental construct called a “perceptual matrix”. this is the mental projection of the flow of parameters within the system. I discovered it by accident after I completed the first working version and found that the act of building it had created a mental model in my mind. this freed me from having to reference any visualizations. now I design the functions to make sense within this perceptual matrix. so far, I have not found a limit at which I can no longer interact with its growing complexity. in fact, by following a fractal design concept, the perceptual matrix guides the design evolution, since the mind is the origin fractal of the matrix itself. as the name exo-voice implies, it is a fractal prosthesis for the mind, both in creation and interaction. the job of the visualizations is to convey this information to observers and collaborators, as well as to provide a means of interacting with the data after the fact.
I’m having to investigate these issues because fractal synthesis makes sense now. 2 years ago, it was two words spoken one after the other. now it is a protocol. every parameter vector can be mapped and iterated through a continuously evolving key sequencing modality (every key sequence can have 5 positions of 10 parameters, which could end up being over 200 parameter vectors per hand and 600 between both hands, without adding any new hardware). but what goes where becomes the grasp-worthy straw.
my percussive concept is begging for evolutionary adaptation, but it’s not as easy as just throwing something in there because, since it is one synth with parameter vectors as sort of tentacles, or synapses, one connection has to lead to the next connection and the next, fluidly. my mind is pulled in a dozen directions.
I needed to share this modality with whoever is interested in knowing it, because a lot of what I will be expressing in the coming months will be based on this, and without this information there is no way to know what the hell I am talking about. in addition, I expect that within the next year, these parameter trajectories will evolve to more resemble biological expressions like nervous systems, which are fractal as well.