The Synth Gauntlet is a gesture-controlled MIDI device that connects to your DAW, such as Ableton Live. Assign up to three parameters to the device and use intuitive hand gestures to control filters, effects, and other elements in real time during your performance.
The Synth Gauntlet uses simple hand gestures to control MIDI parameters in real time. To activate it, performers press their last three fingers to their palm, which prevents accidental triggers. The system recognizes three gestures, each linked to a different effect category. Rotating the hand clockwise adjusts the effect's intensity, while switching gestures locks the current value so multiple layers can be combined.
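To make the locking behaviour concrete, here is a minimal Arduino-style sketch of the activation gate and value lock. It is not our production firmware: the pin number is a placeholder, and the gesture and rotation readers are stubbed out.

```cpp
// Minimal sketch of the activation + value-locking behaviour described above.
const int TOUCH_PIN = 7;            // capacitive activation sensor (placeholder pin)

int lastGesture = -1;               // -1 = no gesture seen yet
byte lastValue = 0;                 // most recent rotation value
byte heldValue[3] = {0, 0, 0};      // locked value per gesture slot

int detectGesture() { return 0; }   // stub: replace with flex-sensor logic (0..2)
byte readRotation() { return 64; }  // stub: replace with gyroscope mapping (0..127)

void setup() {
  pinMode(TOUCH_PIN, INPUT);
}

void loop() {
  // Ignore all movement unless the activation gesture is held
  if (digitalRead(TOUCH_PIN) == LOW) return;

  int gesture = detectGesture();
  byte value = readRotation();

  if (gesture != lastGesture && lastGesture >= 0) {
    // Switching gestures freezes the previous effect at its last value,
    // which is what lets performers stack several layers
    heldValue[lastGesture] = lastValue;
  }
  lastGesture = gesture;
  lastValue = value;
}
```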
Open Source: all code and design files are freely available, encouraging DIY building, modification, and community-driven improvements;
Affordable: significantly lower cost compared to existing commercial alternatives;
Digitally Fabricated: 3D-printed construction allows rapid prototyping, easy repair, and tailored customization;
Direct MIDI Integration: communicates directly with any DAW or MIDI-compatible hardware without extra adapters;
Customizable Design: adjustable sizing, materials, and aesthetics to fit each performer’s style and needs.
The Synth Gauntlet is designed for musicians who perform with DAWs, synthesizers, and digital effects in live settings, with a particular focus on darker genres such as dark wave, minimalist, and dungeon synth. These often DIY-oriented genres align closely with our product's aesthetic and ethos: the medieval-inspired design resonates with the atmospheric, otherworldly character of these musical styles, while the affordable, digitally fabricated approach fits the independent, do-it-yourself culture that defines these communities.
We initially explored two primary directions for the project: developing a sound visualization tool for live performers (incorporating lights and audio-reactive visuals) or creating a sound manipulation interface through unconventional gesture-based controls, similar to existing products like Visco and Random controllers. The central research question that emerged was:
How can we manipulate sound through its own physical visualization, and potentially reverse that relationship?
Through this exploration, we established several core design requirements that would guide the project development. The solution needed to be a physical product with unconventional, playful design characteristics that departed from traditional control interfaces. We prioritized affordability and replicability to align with the budget constraints common in DIY music communities. Critical exclusions included graphic-only interfaces and flashy visual elements typical of commercial rave environments, as we aimed to create something more intentional and artistically focused.
At the same time, we created a moodboard that served as a tool for our artistic direction; in particular, it helped us identify the specific music genres that would inform our target audience and design language.
On this basis, we conducted desktop research on the two identified directions: sound visualization and unconventional ways to manipulate sound.
By mapping these references, we identified not only existing technological possibilities but also gaps and opportunities to design something innovative and aligned with our initial ideas.
As part of our initial desktop research, we conducted a series of interviews with electronic music performers to better understand their workflows, tools, and unmet needs in live performance contexts.
Since these conversations took place early in the process, they also helped us map the broader sound visualization landscape and industry trends. Our participants included two visual artists, a light technician, and a performer (singer and guitar player), giving us perspectives from different parts of the live performance ecosystem.
The interviews combined general questions about background and performance style with more detailed explorations of technical setups, tool usage, and creative processes. We looked at how performers prepare and execute their shows, what tools they rely on for sound manipulation and visuals, and how they adapt to the limitations of current equipment. Each session closed with speculative questions about what tools they wished they had, giving us valuable insight into gaps in the current market and inspiration for our own design directions.
Following our research, we identified the following question as being key to our concept:
How might we enable live performers to fluidly manipulate sound through intuitive physical interactions with an innovative controller?
We used Dungeon Synth, known for its medieval-inspired, dark ambient sound, as a starting point. Its community is known for being DIY-centered, very niche, and tight-knit.
Drawing inspiration from this genre, our Synth Gauntlet is aimed at DIY electronic music artists who want to enhance their live performances with greater flexibility.
The Synth Gauntlet solves the fundamental problem of controlling effects without breaking your performance flow or stage presence, at a fraction of the cost of commercial gesture controllers. Unlike proprietary systems, our open-source approach provides complete creative control: full access to 3D printing files, Arduino code, and customization guides means you can modify sensitivity, housing design, or add visual elements to match your artistic vision. The medieval-inspired aesthetic integrates perfectly with darker electronic genres, turning the technology into part of the theatrical performance, while the included Ableton Live template ensures the artist can go from unboxing to performing in minutes.
After defining our core concept, we needed to determine the specific gestures and their corresponding effects to create an intuitive yet distinct gestural language.
For the musical implementation, we identified key parameters that transform synthesized sounds into the atmospheric textures characteristic of dungeon synth: vibrato, low-pass filtering, and tape hiss distortion. These became the foundation for our pre-mapped Ableton Live template using only free stock plugins, creating a plug-and-play experience for artists entering the genre. We also explored integration with specialized tools like Topos, though its limited parameter range (filter, amp, speaker) made it less suitable for our broader creative vision.
For the gesture selection, we researched electronic music workflows to understand how gesture control could enhance performance fluidity, drawing inspiration from diverse sources including choir conducting techniques, witchcraft symbolism, and musical shaping gestures used by performers with hearing impairments.
To validate our gesture design, we conducted hands-on testing sessions with a piano player, allowing us to directly observe the ergonomics and usability of each proposed gesture in a musical context. This practical testing phase was crucial for refining our gestural vocabulary before moving into technical implementation.
Based on our research and testing, we finalized a three-gesture system designed for reliable performance use.
Our final gesture vocabulary consists of three distinct hand positions: an open palm, a fist, and a pointing index finger, each tied to its own effect category.
This combination provides performers with an intuitive mapping between gesture expressiveness and sonic character: aggressive gestures control intense effects, while subtle positions manage atmospheric parameters, creating a natural relationship between physical movement and musical expression.
One of the Synth Gauntlet's core features is its open-source design. While similar devices already exist, our target audience is a niche community with a strong DIY culture, so we aimed to stand out by making the project fully accessible and affordable. From the outset, we developed the device in two forms: a downloadable DIY kit for makers who want to build and customize their own glove (including 3D files to print the glove in the right size at home), and a ready-to-use version for those who are less tech-savvy but still want to perform with it straight away.
To make this possible, we first explored different methods of gesture recognition, ranging from AI-powered, camera-based software to flex sensors, gyroscopes, accelerometers, and motion- or distance-tracking systems. Looking at reference projects such as sign-language gloves, Arduino-based systems, and wearable keyboards, we quickly realized that camera setups were too bulky and unreliable for live performance, while commercial wearables didn't offer the customizability we needed. This led us to a DIY, sensor-based approach that could be embedded seamlessly into a wearable glove.
When it came to the microcontroller, we tested several options and eventually narrowed it down to two: the Arduino Leonardo, which we used during prototyping, and the Teensy 4.0, which provided higher performance for the final version. Both offered the right balance of size, processing power, and MIDI compatibility. To keep the design flexible, we envisioned two modes of access: users could either download the open-source files and build the Synth Gauntlet with the board of their choice, or purchase a pre-assembled version through our website, built with the Teensy-based configuration.
To capture gestures, we combined flex sensors (to detect finger bends) with a gyroscope/accelerometer module (MPU6050) for rotation and tilt detection. A momentary capacitive touch sensor was added to activate the system and prevent accidental triggering. Finally, a NeoPixel LED ring provided live feedback, both for debugging and for stage presence. Together, these components gave us a system that was compact, affordable, and easy for other makers to replicate.
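As a sense of how little glue code these parts need, a hedged sketch of the activation sensor plus LED-ring feedback might look like the following. The pin numbers, ring size, and colors are assumptions; the sketch uses the standard Adafruit_NeoPixel library.

```cpp
// Illustrative wiring check for the touch sensor and NeoPixel ring.
#include <Adafruit_NeoPixel.h>

const int TOUCH_PIN = 7;   // capacitive touch module output (placeholder pin)
const int RING_PIN  = 6;   // NeoPixel data line (placeholder pin)

Adafruit_NeoPixel ring(12, RING_PIN, NEO_GRB + NEO_KHZ800);  // 12-pixel ring assumed

void setup() {
  pinMode(TOUCH_PIN, INPUT);
  ring.begin();
  ring.show();             // start with all pixels off
}

void loop() {
  // Most capacitive touch breakouts pull their output HIGH while touched
  bool active = digitalRead(TOUCH_PIN) == HIGH;
  ring.fill(active ? ring.Color(0, 80, 0) : 0);  // green when armed, off otherwise
  ring.show();
  delay(20);
}
```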
To simplify our work, we developed two prototypes: a functional prototype with embedded electronics and a connection to the computer, and an aesthetic prototype to demonstrate the 3D printing technique and the intended appearance of the Gauntlet. These two prototypes would later be combined into the final Gauntlet.
The electronics followed a straightforward flow:
sensors collect movement data → Arduino reads and processes it → recognized gestures are mapped into MIDI Control Change messages → these are sent to the DAW.
Flex sensors connect through voltage divider circuits with pull-up resistors, providing analog readings that are continuously sampled and filtered to reduce noise. The Arduino processes this sensor data through our custom gesture recognition algorithm, which identifies specific finger combinations and translates them into the corresponding MIDI control change messages. A dedicated LED driver circuit (not implemented in our final prototype) would ensure consistent visual feedback without interfering with sensor readings or MIDI communication timing.
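As an illustration of this conditioning stage (the pin, divider values, and filter constant are placeholders rather than our exact calibration), a single flex sensor can be sampled and smoothed like this:

```cpp
// Conditioning a single flex sensor on a voltage divider.
const int FLEX_PIN = A0;   // junction of the flex sensor and fixed resistor

float filtered = 0.0;      // exponentially smoothed reading

void setup() {
  Serial.begin(9600);
}

void loop() {
  int raw = analogRead(FLEX_PIN);           // 0-1023; shifts as the finger bends
  filtered = 0.85 * filtered + 0.15 * raw;  // simple low-pass filter against noise
  Serial.println(filtered);                 // watch this while picking thresholds
  delay(5);
}
```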
Initially, the components of our functional prototype were glued and taped to a glove, while the breadboard and Arduino Leonardo lay loose on the table. This setup required long wires that frequently came off, so we switched to a breadboard attached to the back of the hand to connect the sensors, wired to an Arduino Leonardo secured to the wrist with two straps. The remaining sensors were glued and sewn directly onto the glove.
On the software side, our code followed a simple loop:
read sensors → process data → map to MIDI → send out control changes.
We used threshold detection and some smoothing to prevent jitter, since the flex sensors in particular were noisy. The gyroscope handled the continuous control: rotating your hand would sweep a parameter from 0 to 127.
Different finger positions (open palm, fist, or pointing) acted as gesture 'modes', letting us map multiple effects to the same glove. Each gesture selected a different control number, which in turn drove a different effect. The MIDI messages were sent directly over USB, allowing plug-and-play use with Ableton Live and other DAWs.
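The sketch below condenses this loop in the spirit of our Leonardo firmware, using the MIDIUSB library and reading the MPU6050 directly over I2C. It is an illustration rather than the actual code: the pins, the bend threshold, the +/-90-degree rotation range, and CC numbers 20-22 are all assumptions.

```cpp
#include <Wire.h>
#include <MIDIUSB.h>

const int MPU_ADDR = 0x68;          // MPU6050 default I2C address
const int FLEX_INDEX  = A0;         // index-finger flex sensor
const int FLEX_MIDDLE = A1;         // middle-finger flex sensor
const int BEND_THRESHOLD = 600;     // raw ADC value separating bent from straight

// One control number per gesture "mode" (placeholder assignments)
const byte CC_OPEN_PALM = 20;
const byte CC_POINTING  = 21;
const byte CC_FIST      = 22;

float smoothedRoll = 0.0;

void setup() {
  Wire.begin();
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x6B);                 // PWR_MGMT_1: wake the MPU6050 from sleep
  Wire.write(0);
  Wire.endTransmission(true);
}

int16_t readWord() {                // combine two I2C bytes into one reading
  int16_t high = Wire.read();
  int16_t low  = Wire.read();
  return (high << 8) | low;
}

float readRollDegrees() {           // estimate hand roll from the accelerometer
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x3B);                 // ACCEL_XOUT_H register
  Wire.endTransmission(false);
  Wire.requestFrom(MPU_ADDR, 6, true);
  readWord();                       // skip X axis; only Y/Z are needed for roll
  int16_t ay = readWord();
  int16_t az = readWord();
  return atan2((float)ay, (float)az) * 180.0 / PI;
}

byte currentGestureCC() {
  bool indexBent  = analogRead(FLEX_INDEX)  > BEND_THRESHOLD;
  bool middleBent = analogRead(FLEX_MIDDLE) > BEND_THRESHOLD;
  if (indexBent && middleBent) return CC_FIST;       // all fingers curled
  if (!indexBent && middleBent) return CC_POINTING;  // index extended only
  return CC_OPEN_PALM;                               // hand open
}

void sendControlChange(byte control, byte value) {
  midiEventPacket_t event = {0x0B, 0xB0 | 0, control, value};  // CC on channel 1
  MidiUSB.sendMIDI(event);
  MidiUSB.flush();
}

void loop() {
  smoothedRoll = 0.9 * smoothedRoll + 0.1 * readRollDegrees();  // tame jitter
  // Sweep the full 0-127 range across +/-90 degrees of hand rotation
  int clamped = constrain((int)smoothedRoll, -90, 90);
  byte value = map(clamped, -90, 90, 0, 127);
  sendControlChange(currentGestureCC(), value);
  delay(10);
}
```

In Ableton Live, each of the three CC numbers can then be MIDI-mapped to its own effect parameter or macro, which is how the pre-mapped template ties gestures to effects.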
Needless to say, building a wearable instrument came with its fair share of challenges. Low-quality flex sensors often gave inconsistent readings, so we had to compensate in software with filtering and calibration. Solder joints sometimes broke due to the stress of moving the glove, which meant re-soldering wires more times than we’d like to admit. Cable management also became an issue, since too much tension interfered with comfort and accuracy. Our solution was partly technical (stabilizing sensors in code) and partly mechanical (rethinking how wires were stitched and protected in the glove).
To decide which methodology to use for creating the glove, we investigated digital manufacturing techniques and available materials to choose the most accessible option in terms of skills and cost. We identified several 3D textile fabrication methods: chainmail (modular elements joined to create interconnected flexible structures); G-code and infill (ultra-thin sheets whose flexibility is achieved through the filament pattern); and 3D printing on mesh-like textile (printing geometric patterns that sandwich mesh fabric). For the material, we selected TPU, a rubber-like polymer best suited for durability and comfort during extended use.
Considering the equipment that both we and users would have available (FDM printers rather than powder-based printers like MJF) and the material we deemed optimal for the glove, we chose the infill method, as it was best suited for TPU printing.
To develop the glove prototype, we started by building a pattern from scratch for a classic-cut glove: we needed to understand what components make up a glove and the logic behind how they are joined together. This step was fundamental because, aware of the limited dimensions of the print bed, we immediately realized that the panel division proposed by traditional tailoring would not be feasible.
Keeping user comfort in mind, we decided to use TPU as the main material for the glove, a choice that also constrained us to this textile typology and assembly technique, considering that FDM printers are the most widespread. However, selecting the material type was not sufficient: TPU is an elastomer that comes in various degrees of elasticity, each with its ideal use case but also different printing requirements. After several tests, we chose TPU with 82A Shore hardness; a filament with lower hardness would probably have been more pleasant to the touch, but would have caused too many printing issues.
Another factor that influences the flexibility of 3D textiles created with the infill method is the type of infill selected. This fabric consists of very thin shapes (approximately 0.6mm thick) printed without top and bottom surfaces, so the visible pattern is the infill assigned by the slicer. To select the best pattern shape and density, we conducted various tests, choosing the sample that maintained acceptable pattern density without compromising flexibility: Zig-Zag at 40%.
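For orientation, the corresponding slicer settings expressed in Cura's terms would look roughly like this (keys follow Cura's naming; exact values should be re-tuned per printer and filament):

```
infill_pattern = zigzag        # "Zig Zag" infill
infill_sparse_density = 40     # 40% density
top_layers = 0                 # no top surfaces
bottom_layers = 0              # no bottom surfaces
```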
As mentioned, the limited dimensions of the print bed prevented us from printing the glove as a single piece; instead, this pushed us to divide it strategically at the areas of greatest tension, such as the knuckles and the anatomical joints of the hand.
With numerous pieces to assemble, we needed a simple method that wouldn't require additional tools beyond 3D printing. We drew inspiration from the garments of Variable Seams, a designer specializing in modular fashion, and created dual-entry modular connections: each element features two connection points (male and female) on opposite sides, allowing two different pieces to be connected simultaneously.
This assembly system remains elastic for three reasons: the connections are made from TPU like the fabric panels; the divisions follow anatomical points that require greater freedom of movement; and the connections are oriented parallel to the direction of the applied traction forces.
The print file is organized with pieces already grouped by section and optimally arranged on the print bed. Regarding sizing, this production method allows users to scale the pieces by a percentage corresponding to the different glove sizes available on the market.
Users of our product-service system have two purchasing options: order the set of electronic components necessary for operation and make the glove themselves from the downloadable file, or purchase the complete glove from the website by entering their specific measurements.
As a standalone business, it would make sense to invest in an MJF (Multi Jet Fusion) printer to produce gloves consisting of fewer pieces and with more complex geometries; powder-based printers make it possible to avoid the intricate supports that an FDM printer would require to keep the glove from collapsing during printing.
For the FDM printing production method, users will be asked to scale the shapes in the file by a factor proportional to the glove size they usually wear. If they don't know their size, they will need to measure their palm circumference and the distance from the tip of the middle finger to the wrist.
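For example (numbers purely illustrative): if the base pattern was drawn around a 22 cm palm circumference and the user's hand measures 24 cm, every piece would be scaled by 24 / 22, roughly 109%, before slicing.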
For made-to-order production, instead, they will be required to send the previously mentioned measurements plus the forearm circumference. Unlike the previous method, the glove will be produced to their exact measurements and will fit „like a glove.“
Users who purchase only the electronic components kit will install it on the designated patches using the PLA frames. Those who purchase the complete glove will receive the electronics already embedded within the print: since TPU is an elastic material, by using proper tolerances, it's possible to fit the electronic components inside the printed piece.