In this dBs Tutorial, DC Breaks test drives ACE Studio's impressive AI vocal platform. Learn how you can generate INSANE AI vocals in this guide.
Artificial intelligence, or AI, is finding its way into all corners of our lives. From AI tools on your phone and laptop to popular text and image generation tools like ChatGPT and Midjourney, working with, rather than against, AI is something we all have to acclimatise to.
Artificial intelligence in the creative industries, and in the music industry in particular, could be transformative for the democratisation of music production and sound engineering, breaking down barriers to entry. Of course, many people worry about its impact on jobs, and that's something we all need to be mindful of.
Chris Page, one half of drum and bass duo DC Breaks and a module leader at dBs Institute, is fascinated by new AI tools and has been test-driving them for us! In this dBs Tutorial, he explores ACE Studio and how it can be used to generate some pretty insane vocals.
How to generate AI vocals in ACE Studio
We've broken down the main points of our dBs Tutorial on ACE Studio here, but you can watch the full video below!
Choosing an AI voice in ACE Studio
When choosing an AI voice in ACE Studio, you can refine your choice of AI vocalist based on the language, gender and style of the singer, with choices including:
- Chinese language
- Japanese language
- English language
- Male/Female
- 'Pop'
- 'Bright'
You can also create your own custom singer, which requires you to upload around 30 songs' worth of vocals to get an effective model.
Generating the AI vocals
In brief, you generate your AI vocals in ACE Studio by 'drawing' in some MIDI notes in the software and then applying lyrics to those notes. The software then maps the MIDI and lyric information onto the 'singer' you have chosen (see the sketch after this list for a rough illustration of the idea). Some important things to consider during this process are:
- You can only use one syllable per note
- Notes cannot overlap (as real singers can't sing more than one note simultaneously)
- Each syllable is split into its constituent phonemes, the smallest units of sound in a word
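To make that mapping a little more concrete, here's a minimal Python sketch of the same idea. This isn't ACE Studio code, and the `VocalNote` class and `PHONEMES` dictionary are purely hypothetical stand-ins, but it shows how each MIDI note carries exactly one syllable, why notes can't overlap, and how a syllable breaks down into phonemes.

```python
# Illustrative sketch only: ACE Studio handles all of this in its GUI.
# The data layout and phoneme dictionary below are hypothetical, purely
# to show how MIDI notes, syllables and phonemes relate to one another.

from dataclasses import dataclass

@dataclass
class VocalNote:
    pitch: int        # MIDI note number, e.g. 60 = middle C
    start: float      # start time in beats
    length: float     # duration in beats
    syllable: str     # exactly one syllable per note

# A toy phoneme lookup standing in for the real text-to-phoneme step.
PHONEMES = {
    "ne": ["n", "eh"],
    "ver": ["v", "er"],
    "let": ["l", "eh", "t"],
    "go": ["g", "ow"],
}

melody = [
    VocalNote(60, 0.0, 1.0, "ne"),
    VocalNote(62, 1.0, 1.0, "ver"),
    VocalNote(64, 2.0, 1.0, "let"),
    VocalNote(67, 3.0, 2.0, "go"),
]

# Notes must not overlap: each note has to end before the next one starts.
for a, b in zip(melody, melody[1:]):
    assert a.start + a.length <= b.start, f"'{a.syllable}' overlaps '{b.syllable}'"

# Each syllable breaks down into its constituent phonemes.
for note in melody:
    print(note.syllable, "->", PHONEMES[note.syllable])
```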
Once the vocals are generated, you can use ACE Bridge to 'bridge' them straight into a DAW and apply reverb and other processing.
Modulating your generated AI vocals in ACE Studio
One of the amazing things about ACE Studio is the flexibility with which you can edit and modulate the vocals you've generated. In the software, you can shape the vocals using controls for pitch, vibrato, falsetto, breath, air, tension, energy and formant.
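To give a feel for what that means in practice, the snippet below sketches the general idea of per-parameter automation curves. The parameter names and data layout here are hypothetical and don't reflect ACE Studio's internals; the point is simply that each control is a curve of values over time that you can draw and edit.

```python
# Hypothetical sketch: ACE Studio exposes these controls as curves you draw
# in its editor; this just illustrates the idea of per-parameter automation.

from bisect import bisect_right

# Each parameter holds (time_in_beats, value) breakpoints, values 0.0-1.0.
automation = {
    "breath":  [(0.0, 0.1), (2.0, 0.6), (4.0, 0.2)],
    "tension": [(0.0, 0.3), (4.0, 0.8)],
    "energy":  [(0.0, 0.5), (1.0, 0.9), (4.0, 0.4)],
}

def value_at(points, time):
    """Linearly interpolate a parameter value at a given time."""
    times = [t for t, _ in points]
    i = bisect_right(times, time)
    if i == 0:
        return points[0][1]
    if i == len(points):
        return points[-1][1]
    (t0, v0), (t1, v1) = points[i - 1], points[i]
    return v0 + (v1 - v0) * (time - t0) / (t1 - t0)

for name, points in automation.items():
    print(name, "at beat 2.0 =", round(value_at(points, 2.0), 2))
```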
We'll let Chris take it from here! You can watch his full dBs Tutorial below.
Want to learn from Chris Page at dBs Institute? Find out more about our Music Production Courses on our website or at our next Open Day, and make sure to subscribe to our YouTube channel for more tutorials!