Technology in Live Performance:
Redefining the Role of the Controller

by Misty Jones
Music Technology Innovation Program

1. Intro
2. Description of the Work
3. Innovative Aspects of the Work
4. New Skills Acquired
5. Challenges, Both Anticipated and Unexpected
6. Future Ramifications and/or Plans for the Work
7. Conclusion

1. INTRO
Technology is well represented in live performance settings; however, the way it is used still often appears segregated. Technological elements are usually set apart from the rest of the instruments, contributing more as standalone devices, and performers can still be seen at laptops in their respective stations, serving independent functions. By redefining the use of controllers in an innovative way, we can make it easier for technologists to integrate and interact with other band members.

Earlier this year Numark released a MIDI controller called the Orbit. For a class assignment we were asked to demonstrate an innovative use for the Orbit, and I explored the capabilities of the device while putting the demonstration together. After finishing that project, I wanted to continue working with the Orbit in my Culminating Experience to discover other creative possibilities.

2. DESCRIPTION OF THE WORK
The use of technology in performance has become increasingly prominent, along with advances in the development of hardware controllers. For my Culminating Experience, I am performing with current technology that uses these devices in a new and innovative way, and presenting my findings on the challenges and discoveries that came out of the experience.

I am performing my song “Redline” with a suit I created from four Numark Orbit controllers. Until this point in my career I had only manipulated pre-produced elements in performance; for this project I loop and launch elements in real time while also manipulating them with effects. I am also performing with an electric guitarist and “remixing” the guitar live during the song using some modified Stutter Edit presets. The line input from the guitar is fed directly into Ableton Live, which allows me to capture and remix those sounds using the Orbits and MIDI mapping.

3. INNOVATIVE ASPECTS OF THE WORK
The Orbit was initially designed as a sort of remote control for a DJ. By embracing both the capabilities and the limitations of the Orbit, I was able to maximize what it could do. There is the obvious innovation of performing with a suit made of four Numark Orbits, which to my knowledge hasn’t been done before, although I’d say that’s just a surface observation.

I believe my attempt to be as mobile as possible with technology, while still needing to be connected to a laptop, is innovative. The great lengths I went to, from designing a USB hub system worn on my back to creatively consolidating cables, were my way of saying, “even though I might still need to be connected to my laptop, I won’t let it restrict my movement or creativity!” I hope it inspires hardware developers to consider building controllers like the Orbit around something more reliable than IR technology, so that in the future these types of controllers can be truly wireless without worries of interference or complications.

I believe remixing a guitar live in real time is innovative. I love the fact that no two performances will ever be the same. The current debate over what is “live” and what is simply played back is one that I hope prompts a resurgence of technologists considering new ways to be creative in real time on stage.

4. NEW SKILLS ACQUIRED
MIDI Light Programming
I had not originally planned to make this part of my project, but the more I performed, the more I realized that the suit had the potential to be the primary “visual” during the performance. I had never tried MIDI light programming, but I knew a classmate who had done some experimenting, so I had him show me the basics. I learned that pad colors could be programmed using velocity numbers; however, my friend warned me that he hadn’t figured out a way to play on the same pad bank as the lighting sequence without lights dropping out as a result of the MIDI messages. With some tinkering and further experimentation on my part, I discovered a way to play on the same pad bank as the lighting sequence, using some channel and remote settings.
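
To make the velocity-to-color idea concrete, here is a minimal sketch in Python using the mido library. The port name, channel, note range, and the exact velocity-to-color relationship are assumptions for illustration; the Orbit’s real color map isn’t documented in this report.

```python
# A minimal sketch of velocity-based pad lighting using python-mido.
# Port name, channel, and the idea that velocity 0-127 selects a pad
# color are illustrative assumptions.
import time

import mido

out = mido.open_output('Numark ORBIT')  # hypothetical port name

def light_pad(note, color_velocity, channel=0):
    """Set one pad's color by sending a note-on whose velocity picks the color."""
    out.send(mido.Message('note_on', channel=channel,
                          note=note, velocity=color_velocity))

# Step a simple color chase across one hypothetical 16-pad bank (notes 36-51).
for step, note in enumerate(range(36, 52)):
    light_pad(note, color_velocity=(step * 8) % 128)
    time.sleep(0.125)
```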

MIDI-Driven Video
Just a week before Sonar, I was inspired to attempt MIDI-driven video after seeing a performance by visiting artists. First, I took the composite rhythm from my song and translated that audio into MIDI in Ableton. Then I took the color background layer of the video I was using for my performance and used a Max for Live patch that let those MIDI notes drive the video’s frame changes, so the video stayed essentially in sync with the music. The video was triggered live, and the base content for the color background was created using the Orbits and VIZZable plugins.
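
I can’t reproduce the Max for Live patch here, but the idea it implements can be sketched outside of Max. This Python sketch (again using mido) treats each incoming note from the composite-rhythm track as a cue to advance the video one frame; the port name and frame count are hypothetical.

```python
# A rough model of the patch's behavior: each note-on derived from the
# song's composite rhythm advances the background video by one frame,
# keeping the visuals locked to the music. Port name and frame count
# are placeholders, not the actual patch.
import mido

FRAME_COUNT = 240  # hypothetical length of the color-background clip

def next_frame(msg, current):
    """Advance one frame per note-on; ignore everything else."""
    if msg.type == 'note_on' and msg.velocity > 0:
        return (current + 1) % FRAME_COUNT
    return current

frame = 0
with mido.open_input('From Ableton') as port:  # hypothetical virtual port
    for msg in port:
        frame = next_frame(msg, frame)
        # a real patch would now tell the video engine to display `frame`
```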

“Mass” Mapping
Before my studies here at Berklee, I had never done any MIDI mapping. Trying to take my vision for how I wanted the song to sound and look, and boil it down to strategic mappings spread across a palette of four devices, forced me to think through channel and note numbers in a detailed way. Our Orbit competition was the first time I had mapped a single device, so mapping four devices for not only sound but also light was entirely new territory for me.
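
One way to keep a mapping of this size manageable is a single table keyed by (channel, note), with each Orbit on its own MIDI channel, sketched below in Python. None of these specific assignments come from my actual set; they are placeholders.

```python
# A sketch of a mapping table for four devices: each Orbit transmits on
# its own MIDI channel, and every (channel, note) pair names one action.
# All assignments below are illustrative placeholders.
ORBIT_CHANNELS = {'left_arm': 0, 'right_arm': 1, 'chest': 2, 'back': 3}

MAPPING = {
    (ORBIT_CHANNELS['left_arm'], 36): 'launch: drum loop',
    (ORBIT_CHANNELS['left_arm'], 37): 'record: guitar into looper',
    (ORBIT_CHANNELS['right_arm'], 36): 'toggle: Stutter Edit preset 1',
    (ORBIT_CHANNELS['chest'], 36): 'scene: chorus lights',
}

def describe(channel, note):
    """Look up what a pad press is mapped to do."""
    return MAPPING.get((channel, note), 'unmapped')

print(describe(0, 36))  # -> launch: drum loop
```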

Live Sound Aspect
I had also never used an audio interface before my time here at Berklee. At my previous job a tech team always set up my interface, so I was never in a situation that forced me to learn even the basics of my hardware. My studies here catapulted me into a world where I had to figure out every aspect of how the interface worked and how to make it work well for my project, which I’m thankful for. I had the added challenge of figuring out how to deal with two live guitar sends, one of which needed to be muted at certain times during the performance. I’ll discuss that in further detail in the “Challenges” section.

Ableton Skills
Before my time here at Berklee I had only used Ableton for its sound bank, ReWired into Logic. I knew early on that I wanted to use Ableton for this project, simply because I wanted to be a highly proficient user by the time my studies finished. I can confidently say that I’ve reached a high level of proficiency as a result of Ableton classes, plus time spent under the hood working through the mapping, effects, and MIDI challenges of this project.

5. CHALLENGES, BOTH ANTICIPATED AND UNEXPECTED
Infrared (IR) Interference
Earlier in the year an ensemble of classmates performed with 12 Orbits and encountered problems that forced them to perform with all of the controllers plugged into USB hubs. I hadn’t considered that it might be an IR interference issue, and assumed it was because they were using such a large number of devices. When I was selected to perform at Sonar, one of my classmates pointed out that IR interference would be a real issue. I had to find a way for the Orbits to be plugged in while appearing as “wireless” as possible. I developed a “tail” consisting of the USB hub’s DC power cable, an XLR cable, and a USB male-to-male extender. I concealed the four USB cables under my suit while wiring them to the hub attached to my back, and then connected the hub to my computer via this “tail,” which ran down my pants leg. I was still tethered, of course, but I had found a way to make it look less obvious, and I minimized the cabling to essentially one mega-cable at floor level, which seemed less distracting to the viewer.

MIDI Overload
I hadn’t encountered many issues with CPU or MIDI overload until I started working with the light programming. I wanted a different light sequence for each of the four Orbits, and when I initially tried four separate tracks of MIDI information, Ableton crashed. I solved the issue by mapping every pad to a different note and condensing the entire lighting sequence into one track of MIDI information containing all of the notes needed to reach all four Orbits. I then routed this one MIDI track to four new empty MIDI tracks dedicated to just light routing, and the problem was solved. I also noticed that even after this fix, the lighting information overloaded the Orbits when they were wireless, which was yet another reason to keep them plugged into the hub.
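
The consolidation can be modeled as a fan-out by note range: one stream of lighting notes, where every pad owns a unique note, split across four device outputs. The ranges and port names in this sketch are assumptions, not my actual assignments.

```python
# A sketch of the one-track fix: a single lighting stream in which every
# pad has a unique note, fanned out to four Orbits by note range.
# Ranges and port names are illustrative assumptions.
import mido

RANGES = [(0, 15, 'Orbit 1'), (16, 31, 'Orbit 2'),
          (32, 47, 'Orbit 3'), (48, 63, 'Orbit 4')]

outs = {name: mido.open_output(name) for _, _, name in RANGES}

def route_light_note(msg):
    """Send a lighting note to whichever Orbit owns its note range."""
    if msg.type != 'note_on':
        return
    for lo, hi, name in RANGES:
        if lo <= msg.note <= hi:
            # re-base the note into the device's local pad numbering
            outs[name].send(msg.copy(note=msg.note - lo))
            break
```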

Data Amplification
During a performance on campus I had complications with the USB extender in my suit tail, which I quickly dismissed as a broken cable. After the concert I did some troubleshooting with an identical new cable and had the same issues. After a visit to an electronics store and a talk with the owner, he explained that the passive extender had pushed the run past USB’s cable-length limit, degrading the data signal, and he recommended an active USB extender, which regenerates the signal along the way. I dismantled the suit tail and replaced the passive USB extender with the new active one, and the problem was solved.

Live Signal Routing
Something else that became apparent during a school concert was that I hadn’t thought through the live signal routing. In order to “remix” the guitar live, I had to send one discrete live guitar signal to the house, but I also had to route the same signal to the track that would record it. The problem was that two live guitar signals were now being sent to the house. With some automation and mapping, I set things up so that hitting a pad to record the guitar signal also triggered a mute for that same channel, and when the remixed signal was played back, the mute was lifted.
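
The mute logic itself is simple enough to model in a few lines. This toy sketch assumes the doubled signal came from the record track’s send; the class and method names are purely illustrative, since the real version was built from Ableton mappings and automation rather than code.

```python
# A toy model of the automation built in Ableton: the pad that starts
# recording the guitar also mutes the doubled channel, and playing the
# remixed clip back lifts the mute. Names are illustrative only.
class GuitarRouting:
    def __init__(self):
        self.recording = False
        self.channel_muted = False

    def press_record_pad(self):
        """Capture the guitar while muting its doubled send to the house."""
        self.recording = True
        self.channel_muted = True

    def play_remix(self):
        """Stop recording and lift the mute so the remix is heard."""
        self.recording = False
        self.channel_muted = False

r = GuitarRouting()
r.press_record_pad()
assert r.channel_muted       # only one guitar signal reaches the house
r.play_remix()
assert not r.channel_muted   # the remixed signal now plays back
```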

Playing on the Light Sequence Pads
My classmate who showed me the basics of light programming warned me that the lighting sequence would have to run on a separate bank of pads from the ones I intended to play on, because hitting the pads would cause the lights to go out. I had seen YouTube videos in which people managed to play APC40s with lighting sequences without lights dropping out, but I couldn’t find detailed instructions, so I knew there had to be a way to make this possible. After testing out some channel settings and remote configurations in Ableton’s preferences, I found a way to play on the same pad bank as the lighting sequences without sending “off” messages. This was a breakthrough discovery, because up until this point I had been switching back and forth between banks during the performance.
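
My actual fix lived in Ableton’s MIDI preferences, but the underlying idea can be sketched as a message filter: keep performance notes and lighting feedback on separate channels, and never let the note-offs generated by playing reach the lights. The channel numbers and port name below are assumptions.

```python
# A sketch of the channel-separation idea behind the fix: pad hits live
# on one channel, lighting feedback on another, and the note-offs
# produced by playing are swallowed before they can darken lit pads.
# Channels and port name are illustrative assumptions.
import mido

PERFORMANCE_CH = 0  # channel the pads transmit on
LIGHTING_CH = 1     # channel the lighting sequence is echoed back on

with mido.open_input('Numark ORBIT') as port:  # hypothetical port name
    for msg in port:
        if msg.type == 'note_off' and msg.channel == PERFORMANCE_CH:
            continue  # drop note-offs so they never reach the lights
        # remaining performance messages go on to trigger clips/effects
```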

Accessing Accelerometers
After deciding where the Orbits would be placed on the suit, it quickly became apparent that I would have a hard time using the accelerometers, since the controllers were attached to my body. I determined that the Orbits attached to my arms would be the easiest to access, and found effects that worked well with the specific arm motions needed to trigger the accelerometers. I had to be creative with the mapping, since I had only one hand available to tilt the accelerometer and trigger the pad at the same time.
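
These accelerometer mappings amount to scaling a continuous controller into a parameter range. Here is a minimal sketch of that scaling, assuming the tilt arrives as a MIDI CC; the CC number, port name, and filter-cutoff target are illustrative.

```python
# A minimal sketch of an accelerometer mapping, assuming the tilt arrives
# as a MIDI control change. CC number, port name, and the cutoff target
# are illustrative assumptions.
import mido

ACCEL_CC = 12  # hypothetical CC number for one accelerometer axis

def cc_to_cutoff(value, lo=200.0, hi=8000.0):
    """Scale a 0-127 CC value into a filter-cutoff range in Hz."""
    return lo + (value / 127.0) * (hi - lo)

with mido.open_input('Numark ORBIT') as port:  # hypothetical port name
    for msg in port:
        if msg.type == 'control_change' and msg.control == ACCEL_CC:
            # a real rig would forward this to the mapped effect parameter
            print(f'tilt -> cutoff {cc_to_cutoff(msg.value):.0f} Hz')
```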

6. FUTURE RAMIFICATIONS AND/OR PLANS FOR THE WORK
My work with the Orbit Suit caught Numark’s attention early on, and even led to a job interview for a possible position as a product specialist. If Numark were able to hire me, I would definitely continue my work with the Orbit on a bigger scale, with a more full-time level of dedication.

Regardless, to continue performing with the Orbit Suit I would need a designer to make major modifications so it is more comfortable to wear and easier to assemble. If there were opportunities for Orbit Suit performances, I would consider a Kickstarter campaign to raise the money needed to upgrade the suit. At this point, however, I have to ask myself a hard question: is it worth continuing to build a suit out of an already name-branded product, or is it finally time to learn how to build controllers myself?

I reached an identifiable crossroads when I discovered that IR interference would be an issue. Instead of asking, “how can I still have all the Orbits plugged into the computer?” I could have asked, “how can I still be truly wireless?” I would like to continue exploring the possibility of using a Wi-Fi Arduino so that the hub could communicate wirelessly with the computer. Near the end of my project I collaborated with a classmate who successfully hacked the Orbit, but we are still searching for the right piece of code to capture MIDI information from the controller. This would eliminate the need for a “tail,” since the hub could be converted to run on battery power.
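
We haven’t found the capture code yet, but the shape of the wireless link we’re after can be sketched: read MIDI on the hub side and relay each message to the laptop over Wi-Fi. The sketch below uses plain UDP datagrams; the address, port, and device name are placeholders, and the real solution would run on the Arduino rather than in Python.

```python
# Not the actual hack, just the shape of the idea: capture MIDI from the
# hub and relay each message to the laptop over Wi-Fi as a UDP datagram.
# Address, port, and device name are placeholders.
import socket

import mido

LAPTOP = ('192.168.1.20', 9000)  # hypothetical laptop address
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

with mido.open_input('Numark ORBIT') as port:
    for msg in port:
        sock.sendto(bytes(msg.bytes()), LAPTOP)  # one message per packet
```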

7. CONCLUSION
In conclusion, my only regret is that I had to devote so much time to the physical aspects of the suit instead of investing it in the more technology-driven aspects of the project. As much time as I’ve poured into the suit, it still doesn’t look as polished as it could, since I was constantly balancing time spent on suit aesthetics against time spent on the actual music technology. I would have loved to spend more time solving the “tethering” issue and investigating the possible Arduino solution.

I think this experience has been extremely valuable on so many levels. As a performer, it has
challenged me to use technology on stage in a way that seemed scary and unknown in the past.
As a technologist, it further expanded my capabilities with both software and hardware, and
developed critical troubleshooting skills under great pressure. And as a person, it forced me to
collaborate and work with others to solve problems, since there weren’t a lot of answers that
could be found by a simple Google search. I am thankful to Berklee College of Music for
providing me with such an incredible experience, and I look forward to seeing the role it plays in
my future.
