“What we’re working on is not just the performance. What we believe is that the next revolution is to have every instrument smart and connected…”
Michele Benincaso is the CEO of Mind Music Labs, the Sweden-based startup behind the Sensus Smart Guitar.
You may know it from the guitar’s unveiling at the Web Summit conference in November 2015; from its success in the Midemlab and Sonar+D startup contests in 2016; or from the instrument’s first stage performance at the Slush conference that November.
Having raised $220k of seed funding in January this year – following a previous round for the same amount – Mind Music Labs is one of a number of companies trying to bring tech smarts to the guitar and other musical instruments. See also: Jammy, Magic Instruments, Jamstik+ and FretX.
That said, Benincaso hopes his company stands out, partly because the Sensus is a proper standalone guitar – “it’s not just a controller for your computer!” – and partly because Mind Music Labs has wider ambitions.
“The smart guitar is the first instrument that is using our technology. But what we have actually developed is a system that can make every music instrument smart. It’s a full embedded system designed for ultra low-latency audio performance,” says Benincaso.
That provides potential for Mind Music Labs’ technology to be used for a range of instruments and electronic-music devices: not just guitar, and not just instruments made by the company. For now, the company is keeping any plans on this front close to its chest.
Benincaso is happy to talk about the potential that connected instruments have for music education, however.
“Music education is a huge market. The last figures we saw were that it’s $9bn in the US alone, although that includes a lot of music lessons. But people are wanting to learn in a completely new way too,” he says.
A lot of this comes down to data, he continues, noting that a human teacher giving a lesson in person can watch a student play, then give them real-time feedback on what they are doing wrong and how to fix it.
“But when it comes to digital learning, you don’t have much data. Apps like Yousician and Uberchord – which I think are doing great work – use the microphone of the phone to get the audio coming from the guitar, it’s true, but the app doesn’t know where I am putting my fingers, and if I am holding the guitar right or wrong,” he says.
This is where Mind Music Labs believes that connected instruments can help music learners: their sensors and connectivity could provide music-learning apps and services with more (and more-useful) data on how they’re being played, rather than just the sound that they’re making.
“It will open up a completely new world when it comes to education. The instrument will be able to send over wirelessly not only the audio, but all the data that an app like Yousician needs to give the user feedback and a much more complete experience,” says Benincaso.
“That’s the biggest challenge today when it comes to music education: we don’t have enough data on how people play. If we have a connected instrument, we can have access to that data.”
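Mind Music Labs has not published a data format for this, but the idea Benincaso describes – an instrument streaming its sensor readings alongside its audio – can be sketched with a purely hypothetical telemetry message. Every field name below is invented for illustration; it is not the Sensus protocol:

```python
import json

# Hypothetical telemetry frame a connected guitar might stream to a
# learning app alongside its audio. All field names are invented for
# illustration; MIND Music Labs publishes no such schema.
def make_frame(timestamp_ms, fretting, orientation_deg, rms_level):
    """Bundle one snapshot of sensor readings into a JSON message."""
    return json.dumps({
        "t": timestamp_ms,               # capture time in milliseconds
        "fretting": fretting,            # (string, fret) pairs currently held down
        "orientation": orientation_deg,  # guitar tilt from an IMU, in degrees
        "audio_rms": rms_level,          # loudness of the current audio buffer
    })

# Instrument side: two fingers down, guitar tilted slightly forward.
frame = make_frame(1024, [(6, 3), (5, 2)], {"pitch": 12.5, "roll": -3.0}, 0.41)

# App side: decode and compare finger placement against the lesson.
decoded = json.loads(frame)
print(decoded["fretting"])
```

The point of the sketch is the contrast Benincaso draws: a microphone-only app sees just `audio_rms`, while a connected instrument could also report `fretting` and `orientation` – exactly the information an in-person teacher uses.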
He also sees the potential for smart instruments to be used together with other devices: smart speakers like Amazon's Echo, for example, with its Alexa voice assistant.
“Think about your Amazon Echo, and Alexa telling you that you need to move the guitar another way and move that finger. That’s very complex to do today,” says Benincaso.
“The only way to get information from the instrument is via the microphone and the audio coming from the instrument. But so much other data is lost. When we change that, music education technology will have a big jump.”
For now, Mind Music Labs is continuing to work on the Sensus and the platform that underpins it, while considering what kind of partnerships could take the latter forward. Benincaso is confident about the potential.
“In 10-15 years from now, every new musical instrument will have its own system and be connected to each other. And it will be like the mobile app world, with SDKs and APIs for people to write all kinds of applications on top of these systems, and will give access to the musicians to interact with all the newest technology like VR, wearable devices and the whole Internet of Things world,” he says.
That could fuel everything from educational apps to games and virtual-reality experiences, where the audio and associated data from smart instruments is available to developers.
Benincaso draws a parallel with a previous shift in instruments, when the acoustic guitar went electric, bringing amps and other new technologies with it.
“When you went from the acoustic to the electric guitar, you went from 30 people in the room to thousands of people. Or even Woodstock, where you had just a bunch of guitar players on stage and 400,000 people getting crazy!” he says.
“Musicians’ needs are a new way of expression using sound, and also the need to engage people and to reach more people. And today, if you want to reach more people, you can’t just get a bigger amplifier. You need to be connected.”