The Battle for Your Voice: Cissy Jones & Larissa Gallagher Take on AI | The Pro Audio Suite
The Pro Audio Suite | May 13, 2025
16
00:44:05 | 80.6 MB


Award-winning voice actors Cissy Jones and Larissa Gallagher have seen their voices stolen by AI. Literally. In this hard-hitting episode of The Pro Audio Suite, they join us to share the shocking reality of finding their voices on AI platforms — being sold, shared, and synthesized without consent or compensation. But they’re not just here to sound the alarm — they’re here with solutions. We dive into:
  • How they discovered their voices were being used
  • The legal and ethical grey areas of AI voice cloning
  • Their fight for control through advocacy with NAVA (National Association of Voice Actors)
  • The groundbreaking tech tools being developed to protect voice actors
  • What YOU can do to secure your voice and future
Whether you're a voice actor, producer, or just someone who cares about consent in the age of AI, this is a must-listen.

A big shout out to our sponsors, Austrian Audio and Tri Booth. Both these companies are providers of QUALITY audio gear (we wouldn't partner with them unless they were), so please, if you're in the market for some new kit, do us a solid and check out their products, and be sure to tell 'em "Robbo, George, Robert, and AP sent you"...

As a part of their generous support of our show, Tri Booth is offering $200 off a brand-new booth when you use the code TRIPAP200. So get onto their website now and secure your new booth: https://tribooth.com/

And if you're in the market for a new mic or killer pair of headphones, check out Austrian Audio. They've got a great range of top-shelf gear: https://austrian.audio/

We have launched a Patreon page in the hopes of being able to pay someone to help us get the show to more people, and in turn help them with the same info we're sharing with you. If you aren't familiar with Patreon, it's an easy way for those interested in our show to get exclusive content and updates before anyone else, along with a whole bunch of other "perks", just by contributing as little as $1 per month. Find out more here: https://www.patreon.com/proaudiosuite

George has created a page that is strictly for Pro Audio Suite listeners, so check it out for the latest discounts and offers for TPAS listeners: https://georgethe.tech/tpas

If you haven't filled out our survey on what you'd like to hear on the show, you can do it here: https://www.surveymonkey.com/r/ZWT5BTD

Join our Facebook page here: https://www.facebook.com/proaudiopodcast

And the FB group here: https://www.facebook.com/groups/357898255543203

For everything else (including joining our mailing list for exclusive previews and other goodies), check out our website: https://www.theproaudiosuite.com/

"When the going gets weird, the weird turn professional." (Hunter S. Thompson)


00:00:00

Get ready.




00:00:00

Be history.




00:00:01

Get started.




00:00:01

Welcome.




00:00:02

Hi. Hi. Hi.




00:00:03

Hello, everyone. Welcome to the Pro Audio Suite.




00:00:06

These guys are professional
and motivated.




00:00:08

Please take the video.




00:00:09

Stars: George Whittam, founder of Source




00:00:11

Elements Robert Marshall, international audio




00:00:14

engineer Darren "Robbo" Robertson, and global voice, Andrew Peters.




00:00:17

Thanks to Tri Booth, Austrian Audio, making passion heard, Source Elements,




00:00:21

George the Tech Whittam, and Robbo and AP's international demos




00:00:25

to find out more about us.




00:00:26

Check theproaudiosuite.com. Line up.




00:00:29

Ready? Here we go. I'm ready.




00:00:32

And welcome to another pro
audio suite




00:00:34

thanks to Austrian Audio, making passion heard. And




00:00:38

Don't forget the code TRIPAP




00:00:40

200, which will get you 200 USD off your Tri Booth.




00:00:46

AI is a hot topic,
and we have a couple of guests




00:00:49

to talk about trying to beat AI
or at least control it a little.




00:00:53

We have Larissa Gallagher and Cissy Jones.




00:00:57

You. Thanks for having us.




00:01:00

Thanks for having us.
Oh, you've been had.




00:01:03

If you believe you're really on the
podcast, thousands of admiring fans.




00:01:07

Womp womp.




00:01:11

Who wants to fire off first?




00:01:13

How are you going to save
the voiceover industry from AI?




00:01:16

We are.




00:01:16

Thank you so much for asking.




00:01:18

We are, you know.




00:01:21

So here's the here's the thing.




00:01:22

Larissa and I have both been voice
actors for many, many years.




00:01:25

I've been a voice actress for 17, 18 years.




00:01:29

I've worked in all the mediums
that there are here.




00:01:32

And when I started finding my voice




00:01:34

on multiple websites
without my consent,




00:01:36

and when I reached out
to the companies that were,




00:01:39

offering access to my voice for money
without paying me anything.




00:01:43

And I said to them, hey, that's me.




00:01:45

You don't have my permission to have




00:01:47

my replica in the first place,
and you're not paying me for it.




00:01:50

Please remove it.
They said no, really?




00:01:52

They said you don't own your voice.




00:01:54

And, they were right.




00:01:57

So this approach has been multi-fold. Number one:




00:02:01

I sit on the board of the National
Association of Voice Actors.




00:02:04

Anybody in the voiceover industry, I would hope, is familiar with NAVA




00:02:07

and the work that they've been doing.




00:02:09

They also spawned Ava
in Australia and Canada and,




00:02:13

multiple other,
advocacy groups around the globe.




00:02:16

We've been working with legislators
to get meaningful laws




00:02:19

passed around the protection
of our voices.




00:02:23

Adding the definition of voice
to biometric data




00:02:25

and copyrightable data,
which is currently not a thing.




00:02:29

So in addition
to doing the legislative path




00:02:32

and really
working to get laws passed,




00:02:35

you know, Larissa and I and




00:02:36

our co-founder Julian started talking
and saying, you know,




00:02:40

even if we get these laws passed,
there is nobody that is going to care




00:02:43

more about a voice actor's voice than voice actors.




00:02:47

Why are we not starting
our own company?




00:02:49

Why are we not trying to beat them
at their own game?




00:02:51

Why are we not planting a stake
in the ground and saying, great,




00:02:55

we understand this technology is
here.




00:02:56

It's not moving backwards.
It's only getting better.




00:02:59

This is how we expect to be treated
as human beings and as professionals.




00:03:03

And I think there's a dignity to that that other companies




00:03:07

just aren't offering, because they are scraping every...




00:03:10

So so I just want to peel
a few things back.




00:03:13

Just the first thing you said,
which is you don't actually




00:03:16

have rights to your own voice.




00:03:19

Can you elaborate on currently?




00:03:21

Yeah. Yeah. So we've never had to.




00:03:23

So if you look at biometric data and rights of publicity,




00:03:27

it always refers to name,
image, likeness.




00:03:29

So for example, a celebrity can say
that's my picture on a billboard.




00:03:34

You don't have my permission
to use my picture on a billboard.




00:03:36

That is an infringement
of my rights of publicity.




00:03:40

And you have to take it down.




00:03:40

Great. Voice was never...




00:03:42

they never needed to add it
to any of those definitions,




00:03:45

because we never had technology
that could believably copy it until,




00:03:52

several years ago.




00:03:52

Well, they didn't pass any laws
about this, but several years ago,




00:03:56

I believe it was a Toyota dealership.




00:03:58

Hired
someone to sound like Bette Midler




00:04:00

because they went to Bette Midler
and said,




00:04:02

hey, we really want you to do our ad.




00:04:04

And she said, no, go pound rocks.




00:04:06

And so they hired a sound-alike to come do it and pass it off




00:04:09

as Bette Midler.




00:04:10

She was able to sue and get it
taken down because she's a celebrity.




00:04:14

Right.




00:04:15

So she is able to get some,




00:04:18

you know, rights of publicity




00:04:19

associated to her voice
because she is a well known quantity.




00:04:23

The rest of us are not.




00:04:25

So right now
you're having a spate of,




00:04:28

you know, people getting fake kidnapping




00:04:30

phone calls from, like, a loved one who's been kidnapped, and send $10




00:04:34

to this Nigerian prince
to get your, kid back or whatever.




00:04:37

And they're very believable
and they're very convincing,




00:04:39

and they're not illegal yet
because there are no laws.




00:04:43

So comment on this one.




00:04:45

I can't name names,
but a friend of mine is the voice




00:04:48

of an international reality show, doing it for years.




00:04:52

It's a good gig.




00:04:54

And over the years, they've beaten them up:




00:04:58

We can't now pay
you residuals for international play.




00:05:00

We're going to pay you a flat rate
for this.




00:05:03

We're going to lower your contract
for that.




00:05:05

Every year they sign it again
because they need the gig.




00:05:08

And then one year they say we're
only going to hire you for pickups.




00:05:13

And, and and they're like for what?




00:05:15

They're like, well, the pickups we can't get the AI to do correctly.




00:05:18

And then they say, well,
I didn't give you permission.




00:05:21

And then they pull out
whatever contract they signed




00:05:24

and said, well,




00:05:25

you gave us the rights to own
everything and therefore we own it.




00:05:29

We can pump it into the theft machine




00:05:31

and you have no right to say that
we don't and kick rocks.




00:05:36

And then they said, well, I guess
that's the end of our relationship.




00:05:38

I'm never going to do a pickup
with you.




00:05:40

So good luck using the AI.




00:05:43

And you have what you have, I guess.




00:05:45

And it just destroyed the whole thing, or possibly, I don't know, it's




00:05:48

just an awful situation and it seems,
you know, obviously all driven




00:05:52

by how low can we drive the budgets and drive profit.




00:05:56

And it didn't need to happen.




00:05:58

Laws are currently being crafted
to combat exactly that




00:06:02

because we as voice actors,
you all know, have to typically




00:06:05

sign contracts, especially for games
and especially for, you know,




00:06:09

this type of job
that says in perpetuity




00:06:11

throughout the known universe
and any technology




00:06:13

currently existing
or to be developed.




00:06:15

Well, I'm sorry, that cannot
hold up in a court of law.




00:06:19

And there are laws being crafted
right now that specify exactly




00:06:23

that any contract
that was signed with that language




00:06:26

cannot possibly
be held up in a court of law, because




00:06:30

that's ridiculous and egregious.




00:06:32

So there is legislation coming,




00:06:33

but the government takes its sweet
time to get it done.




00:06:37

Yeah.




00:06:38

If I can interject, that was
that was kind of part of that.




00:06:40

Everyone. That's part of the other reason that,




00:06:44

you know, kind
of we wanted to set this standard




00:06:47

because the current companies out
there that are providing




00:06:50

AI and providing these solutions
for everyone,




00:06:53

are kind of throwing their hands up in the air and going:




00:06:56

But it's impossible.




00:06:57

We can't do this.




00:06:58

It's and it's not true.




00:07:00

It's very difficult because they did the old




00:07:04

Silicon Valley, you know,




00:07:07

move fast, break things, ask forgiveness later.




00:07:11

And so we were like, well, look, part
of this is about setting a standard




00:07:15

and proving that it can be done,
proving that it's not impossible,




00:07:18

and that we're going to try
and do it.




00:07:20

Part of our going to SAG-AFTRA was about saying, we want to create




00:07:24

a training foundational model
that isn't scraped.




00:07:29

Is it going to cost money? Yes.




00:07:30

Is it going to be difficult? Yes.




00:07:32

But if we can prove it can be done
and people are trying to do it,




00:07:37

it just makes it a little bit harder
for the bad actors to go.




00:07:40

But no, you can't, can't, can't
I don't know.




00:07:44

So that's, that's kind of the,
the whole journey is as long as




00:07:47

there are people




00:07:48

who are doing the bad things
and there's




00:07:50

no one proving that you can do it
otherwise,




00:07:52

then why wouldn't they keep doing it?




00:07:54

So we're we're we're on the journey
to kind of prove that it can be done.




00:07:58

And we're making huge headway
with the companies coming to us




00:08:02

going, please, this is all we want
because these guys are great




00:08:05

and they're amazing.




00:08:07

But the ethical authenticated
contract




00:08:10

pipeline back to the beginning doesn't exist.




00:08:14

And that's what we want.




00:08:15

And that's what we kind of came about to show.




00:08:18

We can do that
and we're going to make it happen.




00:08:20

If I may also real quickly
because we come from this industry,




00:08:24

we're not some tech bros coming in
and being like, yeah,




00:08:27

but I got this tech and it's super,
bro dude.




00:08:29

Right?




00:08:30

We're actually




00:08:30

coming in saying, listen,
we understand the existing ecosystem.




00:08:34

Ecosystem.
We understand the infrastructure.




00:08:36

We understand the agency landscape.




00:08:38

This is the thing that we want
to include in this conversation.




00:08:41

We're not trying to be
a casting place.




00:08:43

We're not trying to be a marketplace.




00:08:45

We want to make this a thing
that works for this industry, right?




00:08:50

With a technology
that is getting insanely good.




00:08:53

And if we don't step up and say,
cool, cool, cool, that's great.




00:08:57

But I'm a human being and
this is how you work with my replica.




00:09:00

This is my agent.




00:09:00

She'll negotiate my contract right?




00:09:03

We have to put our foot down
to say how we expect




00:09:06

to be treated
in this situation. Right?




00:09:08

I mean, there's




00:09:09

just no difference to me between this and musicians getting,




00:09:13

you know, as soon as you replicate
more than 2 or 3 notes in a melody.




00:09:17

Tenant.




00:09:18

No no no no no no no no no
no no no no no.




00:09:21

Anybody remember that? Ice Ice Baby?




00:09:24

Oh sorry.




00:09:25

It's melody.




00:09:26

It's it's rhythm




00:09:27

I mean what
what makes Christopher Walken




00:09:29

Christopher Walken
is his delivery right.




00:09:31

So you can't just,
you know, this translates directly




00:09:35

to what voices and likeness are, in my opinion.




00:09:39

But of course, these are prior
art things




00:09:40

that you have to call against
and say, yo, yeah,




00:09:44

I think the problem is
that the companies that did




00:09:46

this went racing to an obvious fact,
and then they're like,




00:09:49

well, you didn't define theft before.




00:09:51

And it's like you knew absolutely
what it was before you started.




00:09:55

And just because you have like,




00:09:58

and the best parallel is,
is your image likeness.




00:10:01

If this whole paradigm exists
with your image,




00:10:03

how can it not exist with you?




00:10:04

Well, and for Scarlett Johansson




00:10:06

to get the traction she did
when OpenAI released her.




00:10:09

Right.




00:10:10

That's basically saying




00:10:12

even if it was not Scarlett
Johansson at the end,




00:10:15

if it was a different workflow,




00:10:17

they clearly hired her to sound like Scarlett




00:10:20

and then released it with a tweet




00:10:21

that said "her". Hi, here's my right of publicity.




00:10:25

You cannot make money off of my name.




00:10:28

But but everyone deserves




00:10:29

some some level of right,
of publicity, of their own voice.




00:10:32

And and the other thing I think,
who is it like?




00:10:35

Siri, not Siri? TikTok.




00:10:37

The one voice actress, Bev Standing. Yeah.




00:10:40

Oh, she settled out of court.




00:10:42

I think she settled so
no one knows what the settlement was.




00:10:44

And I think that was a
bit of possibly a disservice,




00:10:48

to, to not




00:10:50

really rake them over the coals
in public and.




00:10:54

Sure,
but she's a one person voice actress




00:10:57

going up against a multi




00:10:58

national conglomerate
with billions of dollars.




00:11:01

And she was.




00:11:02

But if she had gone into open court,




00:11:03

they would have dragged it out
for years.




00:11:05

I mean, listen,




00:11:06

we've had a lot of conversations
about like court cases, right?




00:11:09

And oftentimes in a court case,
it doesn't matter if you're right, it




00:11:12

matters if you have the money
to withstand the other guy.




00:11:15

Yeah. Yeah. Right. Right.




00:11:15

And here's something I was going to ask you. Right.




00:11:18

Because, you know, with what
you're suggesting, how does it work?




00:11:22

Like it's okay for voice actors
and all that sort of stuff.




00:11:25

It's going to be clear. Right.




00:11:26

But say, you know, little Freddy
who has a podcast with his mate,




00:11:31

they record in their bedroom.




00:11:33

If someone
decided that they liked his voice




00:11:36

and they were going to clone him,
how does he look after himself?




00:11:40

Because little Freddy
can't even afford a court case.




00:11:42

Let me tell you, you can't.




00:11:44

Even voice actors can't do this
because I have clients that I know




00:11:48

have loaded sub mixes into what
have you.




00:11:52

AI machine to get a scratch track out, and there's no concept that,




00:11:58

oh, I've loaded this until just like
the open end of it, that takes it.




00:12:02

And the closest example I have
is like the fly




00:12:04

where like you go into the pod
and the fly flew in the pod




00:12:06

too, and now your voice comes out




00:12:08

with like a fly
sticking out the side of it.




00:12:10

And, and you're in it,
you're in it, you're not in it.




00:12:14

It's arguable that it's all you.




00:12:15

It's half you because your data
has been put into this




00:12:19

giant mill without you
having any input about that.




00:12:23

And the fact that anybody
can put anybody's voice into it




00:12:27

is a problem in and of itself.




00:12:30

Agreed.




00:12:30

There's only one way I can see that you could have ever controlled this




00:12:34

properly: control the input,




00:12:36

and I can't see it actually happening
anyway.




00:12:38

Is that there?




00:12:38

If there was some way
and we talked about this




00:12:40

with Tim Friedman. Was that correct?




00:12:42

Friedlander.




00:12:44

Friedlander. Friedlander.




00:12:45

Sorry, who's the president of NAVA, by the way,




00:12:47

And just an all around
amazing person.




00:12:49

And just thinking of a musician,
I am.




00:12:51

I'm thinking of The Whitlams. Yeah.




00:12:54

So, if
you could actually get some kind of,




00:12:57

like, a fingerprint of your voice
that did it.




00:13:00

We have actually partnered
with a group




00:13:03

that has created
one of the most robust




00:13:05

watermarking, tracing, tracking
technologies that we've seen.




00:13:08

And we've evaluated a lot.




00:13:10

But we found this company out of,




00:13:13

a group of graduate students
out of USC that have created this




00:13:16

really incredible watermarking
technology that is, lossless.




00:13:21

It's a visual token on




00:13:25

an audio or visual,
you know, video, thing.




00:13:28

It's a label. So you can have it.




00:13:30

If I record from home,
I can put a token on it from home.




00:13:33

And then you send it
to the production studio,




00:13:34

they can watermark it
there. It's got a different token.




00:13:37

Go to your distribution
partners. It's got a different token.




00:13:39

So you can see where any leaks
may happen.




00:13:40

And also




00:13:42

if it does end up on a non-authorized, non-sanctioned, like, YouTube page




00:13:46

number one it will call home.
And so they




00:13:49

we can see when it's being
unauthorized and used elsewhere.




00:13:52

But the metadata
that's included in the watermark




00:13:56

will point
anybody who clicks on that original




00:13:58

or that YouTube link,
that unauthorized YouTube link




00:14:01

will point them
back to the original data owner.




00:14:03

So it actually generates
more sales for people.




00:14:06

If it's put up on a, on
an unauthorized YouTube page,




00:14:08

which is super, super cool,




00:14:10

and it's also looking at data
poisoning technology that can,




00:14:13

basically function like nightshade
for, visual art, where




00:14:18

if you have
this technology on your visual art




00:14:21

and somebody tries to upload it
through glaze or whatever,




00:14:23

it will corrupt the file
and give you a useless piece of art.




00:14:26

And so we've been talking
with other companies




00:14:28

that are creating a data
poisoning software




00:14:31

that will do the same thing to your voice going through ElevenLabs.




00:14:33

So if they try to upload your voice




00:14:34

through ElevenLabs, it will garble it and make it... Wow.




00:14:36

Yeah.
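The per-stage watermark-token chain described a moment ago (a token stamped at the home booth, another at the production studio, another at the distributor, each carrying metadata that resolves back to the original data owner) can be sketched roughly like this. This is a minimal, hypothetical illustration, not the actual USC-developed watermarking technology; all names (`WatermarkToken`, `stamp`, `trace_leak`) are made up for the example.

```python
# Hypothetical sketch of chained per-stage watermark tokens. A token
# recovered from a leaked copy identifies which hand-off it escaped from,
# and every token's metadata points back to the original data owner.
import hashlib
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class WatermarkToken:
    stage: str             # e.g. "home", "studio", "distributor"
    owner: str             # original data owner the metadata resolves to
    parent: Optional[str]  # token id of the previous stage, forming a chain

    @property
    def token_id(self) -> str:
        raw = f"{self.stage}|{self.owner}|{self.parent}"
        return hashlib.sha256(raw.encode()).hexdigest()[:12]

def stamp(chain, stage, owner):
    """Append a new stage token whose parent is the last token in the chain."""
    parent = chain[-1].token_id if chain else None
    return chain + [WatermarkToken(stage, owner, parent)]

def trace_leak(chain, leaked_token_id):
    """Given a token recovered from a leaked copy, name the hand-off stage."""
    for tok in chain:
        if tok.token_id == leaked_token_id:
            return tok.stage
    return None

chain = stamp([], "home", "voice_actor")
chain = stamp(chain, "studio", "voice_actor")
chain = stamp(chain, "distributor", "voice_actor")

print(trace_leak(chain, chain[1].token_id))  # the studio copy leaked
print(chain[-1].owner)                       # metadata resolves to the owner
```

The design point is the chaining: because each token records its parent, a single recovered token places the leak at one specific hand-off without needing the whole distribution history in the file.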




00:14:36

ElevenLabs is the big interest here, isn't it?




00:14:39

Like, well, I mean there's
a lot of different companies.




00:14:42

ElevenLabs is the 800-pound gorilla for sure, because they have the most money




00:14:45

and they have, you
know, venture capital backing.




00:14:49

And they've been around, you know,
to be fair, they're not illegal.




00:14:51

They're not doing anything
illegally yet.




00:14:53

So yeah, it's. Yeah.




00:14:55

And for the uninitiated like me, ElevenLabs is a voice AI platform.




00:15:00

Okay.




00:15:01

It's probably the most prolific
at this point.




00:15:03

And they charge ten bucks a month
and you can have access




00:15:06

to thousands of voices
in their marketplace.




00:15:08

That's them calling.




00:15:08

They're threatening a lawsuit
because we're talking about them.




00:15:11

Sorry. And they are extraordinary.




00:15:13

I do like they, they
they have really, you know,




00:15:16

gotten some great technology
and they, they have offered a service




00:15:19

to people who needed it at the time.




00:15:22

What we're trying to do is say, hey,
there's a whole lot of people who,




00:15:26

you know, who want to use something
different because,




00:15:30

privacy is mandatory
or they just feel uncomfortable




00:15:34

being in this kind of general
marketplace environment.




00:15:36

They want a more B2B environment
or the training data.




00:15:40

They want to be secured.




00:15:42

And, Andrew Peters
gave me the finger in the best way.




00:15:45

Oh, Andrew.




00:15:46

Yeah, I do, I just, you know,
I decided just to for Robert




00:15:48

because you do know ElevenLabs.




00:15:49

Robert. Well, I know them as soon as you said ElevenLabs.




00:15:52

But I was sitting here going,
what the fuck's 11.




00:15:54

Yeah.




00:15:54

Because of the whole scandal
in Australia at the moment




00:15:56

is there's a radio station using it.




00:15:58

I just it has nothing to do with
Spinal Tap 11.




00:16:01

Yeah. That scandal is outrageous,
by the way.




00:16:03

But exactly.




00:16:04

That's the point




00:16:05

is that we're all about, you know,
we don't want to be the marketplace.




00:16:08

We don't want to be.
Come and find voices.




00:16:10

And you can make these voices
say whatever they want.




00:16:13

The actors that have already signed with us




00:16:17

to be digital replicas or any company




00:16:19

that comes to us who says
we want to work with these people or,




00:16:23

you know, find us some people, whatever. Our whole point




00:16:26

is to try and help with onboarding
through agents and actors.




00:16:29

But every actor,




00:16:31

whilst they may not be able
to approve the script due to privacy




00:16:34

or whatever secrecy IP protection,




00:16:37

they will be able
to approve the general content.




00:16:40

So it will be the actor saying yay
or nay




00:16:44

rather than the tech platform going
ahead and saying we're fine with it.




00:16:48

Because again, we're all about
we want it to be about AI




00:16:52

complementing.




00:16:53

Yes, the human voice actor's work, not about replacing it,




00:16:57

so we don't want to offer it
to everyone




00:16:59

because we want to make sure
that the companies we work with




00:17:01

are companies that are looking to complement the actor's experience,




00:17:05

as opposed to just do a quick, dirty
fix and rock the AI in.




00:17:08

And thanks very much.




00:17:09

Give me ten bucks and go. Can I ask you a quick question?




00:17:11

If we look forward from here,




00:17:13

let's look forward, say, another five years even. Right? But go back




00:17:20

25 years.




00:17:22

Right. Pretty much.




00:17:23

And my we're all working at macaroni.




00:17:25

Yeah. We're all what? Mac OS?
I wasn't born then. Sorry.




00:17:28

My best mate's




00:17:29

Old man was the chairman
of the board of a company




00:17:31

that was one of the first developers
in voice recognition.




00:17:35

Right.




00:17:35

He was the chairman of a company
in the UK that basically pioneered




00:17:38

that stuff. Awesome.




00:17:40

And I remember being on a car ride
with Roy, who was my mate's dad,




00:17:44

and we were talking about voiceover and all this stuff, and he said,




00:17:47

we were talking about
how voiceover works.




00:17:48

And he said, there will come a day
where a voiceover artist




00:17:52

won't be selling time, going into a studio




00:17:56

to do a session,
they will be selling.




00:17:59

And back in those days
he said CD right, but they will be




00:18:02

selling a file of their voice
to someone to use for the purpose.




00:18:07

That's been described.




00:18:08

Very Nostradamus of him. Fascinating.




00:18:10

So that's how long this stuff's been
in the pipeline, is my point, right?




00:18:13

That means the CIA has had it.




00:18:15

For how long?




00:18:16

Yeah, yeah, that's my point though.




00:18:18

They were talking about this back then. At the time




00:18:21

they were developing new stuff, they were actually talking about:




00:18:23

Okay, how do we do this.




00:18:25

Well, and I would say even more stark
is this when I first




00:18:29

really started paying attention
to this in 2020, it took about




00:18:34

7 to 10




00:18:35

hours of solid recording to get
a really good replica of a person.




00:18:38

And you can do it one shot.




00:18:39

And then in the beginning of,




00:18:41

I was going to say, at the beginning of 2022, it took about six hours,




00:18:44

2024 took maybe an hour.




00:18:48

And today you can literally do
what's called a zero shot for 3 to 5.




00:18:52

I'll say something
about the zero shot.




00:18:54

I think the zero shot
relies on insane




00:18:57

amounts of previously collected
data on everybody else's voice




00:19:00

from before
to understand the entire process.




00:19:05

And so all of that is still based
on theft




00:19:08

of everybody, because they've gone
and sampled the whole world.




00:19:11

It's the same way that




00:19:12

GPT could never have learned and figured out everything




00:19:17

without having the whole internet
to suck up.




00:19:20

And, they've collected so many voices
now that they can do a one shot.




00:19:25

You know that that is based
on having insane amounts of data,




00:19:30

essentially. And that goes back to what we're,




00:19:32

you know,
I mean, all voice platforms.




00:19:35

I mean, you know,
you can't build a house on, a swamp




00:19:40

unless you kind of bring




00:19:41

in the materials to support it,
but you have to live in another house




00:19:45

until you can build the proper,
strong foundation.




00:19:49

So I think, you know, the craziness of zero shot




00:19:53

is incredibly valuable to audio,
audio production houses, etc..




00:19:57

So along the way of of getting there,
if we can create the foundation




00:20:01

that it's not stolen voices,
but it's voices




00:20:04

that who have signed a contract,
who get a licensing fee




00:20:08

when that foundational model
is, is sold and licensed,




00:20:12

then those actors have contributed
and are getting paid.




00:20:16

And it's not scraped,
it's not stolen, it's not theft.




00:20:19

But until we build it,
we have to live next door.




00:20:21

Years and years ago,
I went to a meeting with the MEAA,




00:20:25

which is the Australian Union
for actors,




00:20:28

and they were talking about,




00:20:31

kind of the way residuals work,
which we don't have here necessarily.




00:20:34

We have rollovers
and stuff like that,




00:20:36

but it was all about like basically
the amount of times an ad gets




00:20:40

played, you get paid for it.




00:20:43

And I said to them,
well, we have a system here already




00:20:45

with the music industry,
which is APRA,




00:20:48

where
because everything was computerized




00:20:49

in radio, every time a song gets played,




00:20:51

it actually goes into a database




00:20:53

and people know




00:20:53

what song's being played, and they get their cut of the royalty.




00:20:57

I said, why don't you do a key number with the same system




00:21:01

which has all the information in the key number,




00:21:04

including the talent.




00:21:05

So every time it gets played,
you get money.




00:21:08

Like if you do a deal where you go,
okay, for three months only




00:21:12

they could play that 10 times in
three months, or you do a 12 months.




00:21:15

It could get played 500 times.




00:21:17

So the whole thing
about the duration of the usage of




00:21:21

that ad is irrelevant.




00:21:23

When you look at
how many times it gets played.




00:21:25

But talking of that system,
that's the kind of idea we need




00:21:28

for our voices.
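The APRA-style idea floated here — every air play is logged against a key number whose metadata names the talent, so payment tracks play count rather than a fixed usage window — can be sketched as follows. This is an illustrative toy, not APRA's actual system; the key numbers, names, and per-play rate are all made up.

```python
# Hypothetical play-count royalty ledger: each broadcast logs the ad's
# key number, and royalties owed are computed from the logged plays.
from collections import Counter

# Key-number metadata naming the talent (illustrative values).
TALENT_BY_KEY = {"AD-1234": "voice_actor_a", "AD-5678": "voice_actor_b"}
RATE_PER_PLAY = 2.50  # assumed flat payout per logged play

play_log = Counter()

def log_play(key_number: str) -> None:
    """Called each time a station airs the spot with this key number."""
    play_log[key_number] += 1

def royalties_owed() -> dict:
    """Sum payouts per talent from the play log."""
    owed = {}
    for key, plays in play_log.items():
        talent = TALENT_BY_KEY[key]
        owed[talent] = owed.get(talent, 0.0) + plays * RATE_PER_PLAY
    return owed

for _ in range(10):
    log_play("AD-1234")   # e.g. ten plays inside a three-month deal
log_play("AD-5678")

print(royalties_owed())   # {'voice_actor_a': 25.0, 'voice_actor_b': 2.5}
```

As the speaker notes, once payment is driven by the logged play count, the negotiated duration of the usage window stops mattering: ten plays in three months and five hundred plays in twelve months are both settled by the same ledger.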




00:21:29

So we have a key number
in everything that goes out with us.




00:21:34

Our information
inside that key number,




00:21:36

which is basically what you were
talking about before. But




00:21:39

I thought I'd try




00:21:40

and yeah, by the way, I got pooh-poohed about that.




00:21:43

We have the MPAA. Yes.




00:21:46

Is NAVA the MPAA of the voiceover industry?




00:21:50

No. NAVA is an advocacy group.




00:21:53

So NAVA is not a union.




00:21:55

NAVA is not trying to track residuals




00:21:57

or any of that stuff, because that's a whole other set of work.




00:22:00

But, but I mean, what's going to be
the equivalent of the MPAA




00:22:03

and the RIAA for voiceover?




00:22:05

Someone's
going to have to step up and do it.




00:22:06

I mean, we are trying to get to who's




00:22:08

motivated, who's motivated to do it,
who has the time and the money,




00:22:12

and who wants to paint a target
on their backs.




00:22:15

Yeah, sure. Yeah.




00:22:17

I mean,




00:22:17

I mean, how much of
this is SAG-AFTRA




00:22:20

sitting back
and being reactionary instead of




00:22:24

being on top of it, actually?




00:22:26

Yeah,
there's a lot of really good things.




00:22:29

And, you know, listen,
I think SAG




00:22:33

could improve on a lot of things.




00:22:34

And I think voice over
is one of those things. Right.




00:22:36

We are often very overlooked.




00:22:39

In the greater sphere of things,
which is incredibly frustrating.




00:22:42

But, you know,




00:22:43

thanks to SAG, I have a pension
and I have health care.




00:22:47

Right.




00:22:48

So is it perfect? No,
it is certainly not.




00:22:50

But are we better than we would be
without it?




00:22:52

Yes, I believe so. Yeah.




00:22:53

I think it's
just the size of the contracts.




00:22:55

Like a record label is raking in
ginormous sums of money.




00:22:59

The film industry
is raking in big sums of money, but




00:23:02

voice actors can be or not.




00:23:05

Yeah. Sorry. Game companies.




00:23:09

But the




00:23:09

actors themselves don't have that.




00:23:13

I don't know.




00:23:14

That's what's interesting.




00:23:15

Like recording artists have these,




00:23:17

have these copyright things
and all this stuff that protects




00:23:20

them, mechanical licenses
and all these things. Voice




00:23:22

actors just never have had that.




00:23:24

But also, like, a recording artist
doesn't actually make money




00:23:27

until they go on tour, right? Well,
now they don't.




00:23:29

They used to. That's true now. Yeah.




00:23:30

They used to make a lot of money
before. Yeah.




00:23:32

We also used to make a lot of money
on a nationwide union commercial.




00:23:36

Right, right.




00:23:36

And those have gone the way of the dodo.




00:23:37

So like,




00:23:39

is there room for improvement
all around?




00:23:41

Yes, absolutely.




00:23:42

Which is again,
why we have to figure out




00:23:45

how to put a stake in the ground
and say: this, not that.




00:23:50

And I




00:23:50

and I think,




00:23:51

I think one of the things to realize




00:23:52

is that there's always going to be
the Wild West.




00:23:54

There's going to be technology
companies from whatever country




00:23:58

that are going to be like,




00:23:59

we don't care about the rules,
and go sign up and clone a voice.




00:24:03

But as long as there's some sort of




00:24:06

rules-based framework, like
if you want to be a legit company




00:24:10

in the US, in Europe, etc.,
selling product, making product,




00:24:14

having a decent name,




00:24:16

then you're going to play
by these rules




00:24:17

and you're not going to steal voices.




00:24:18

And then if you want to have, like,
some small-time YouTube channel, that




00:24:22

or even YouTube shouldn't
allow any of it to be trafficked




00:24:25

within it. But,




00:24:27

you know, you can't put
all the toothpaste back in the tube,




00:24:30

but I feel like you can get people
to voluntarily




00:24:33

take their little bit of toothpaste
and put it back in the tube




00:24:36

until there's
a significant amount of it.




00:24:38

I've tried that, Robert,
and I've actually managed




00:24:40

to get most of the toothpaste
back in the tube




00:24:42

to squeeze all the air out
and then suck it back in.




00:24:44

Yeah, yeah. Did
you use it first?




00:24:47

Did you brush your teeth
and then somehow like,




00:24:50

wow, absolutely.




00:24:51

Waste
not want not recycling, man. Come on.




00:24:54

Well, it's
clearly going to be a carrot




00:24:56

and/or a stick kind of approach
to this change.




00:25:01

It's going to be some of both.




00:25:03

I don't know
if anybody in the voiceover world




00:25:05

has a big enough stick.




00:25:06

But you know, there's
a lot of carrots out there.




00:25:09

I think one of the other things is
that it's not just the actors,




00:25:13

it is everything that revolves around
the actors.




00:25:16

There are entire layers of industry,
like peeling an onion.




00:25:21

And this is what I do.




00:25:22

I have a terrible voice,
but I edit a lot of voice.




00:25:26

You know, like there's a




00:25:28

whole bunch of jobs that this
affects directly in a big way.




00:25:32

I kind of feel like the way forward
here is to, to sort of




00:25:35

not just focus on voice actors,
because as I was kind of trying




00:25:38

to allude to before, I mean, Jocko
Willink, right?




00:25:42

One of the hottest podcasts
in the world.




00:25:44

How much money could a company




00:25:45

make by AI-ing his voice and using it
wherever?




00:25:48

If he's got no right over it




00:25:50

and all that sort of stuff,
the guy's got millions of followers.




00:25:53

Yeah, well, I'm probably
giving away the secret sauce, right?




00:25:56

But I mean,
and this is kind of where I wonder




00:25:58

whether, rather than voice actors
trying to do their own thing,




00:26:03

whether everybody who's got a stake
here should come together.




00:26:06

I know there's no podcasters' union
and that's not a clear way forward,




00:26:09

but surely if you got everyone together,
it would be a great way to get critical mass?




00:26:12

Absolutely.




00:26:13

You know, if everyone is calling
their congressman saying, hey,




00:26:16

I don't want someone,
I don't know who,




00:26:20

calling my grandmother in my voice
and telling her that I've been...




00:26:22

That's one of the reasons
why with our technology, we




00:26:25

we absolutely must have watermarking




00:26:28

right, with tracing and tracking,
and hopefully soon a poison pill.




00:26:32

But that's exactly the reason is
because there's too much at stake




00:26:36

for too many people.




00:26:39

Right.




00:26:40

Yeah.




00:26:40

For Tracy in the audience
can you explain to her




00:26:42

what poison pilling is.




00:26:43

Yeah. So, hi, Tracy.




00:26:46

Hi, Tracy. Let me tell you something.




00:26:48

I thought you said for tracing.




00:26:49

So, the
poison pill is basically to




00:26:53

put some code
in our watermark technology




00:26:56

that would corrupt a file
if someone tries to upload it




00:27:00

to an AI voice platform and generate
a new voice with our audio.




00:27:04

Right.




00:27:04

So if we could put in a watermark
from home that says, this is mine,




00:27:08

I own it, this is where it's going,
please don't post it online.




00:27:12

If it shows up on YouTube, we,




00:27:14

you know, it
pops up on our tracing and tracking site.




00:27:17

And then if somebody tries
to take that audio




00:27:19

and upload it to ElevenLabs
without your consent,




00:27:21

it would corrupt the file
and spit out an unusable audio.
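The watermark half of what's being described (the poison pill is a separate, proprietary layer) can be illustrated with a textbook spread-spectrum toy: mix in a key-derived pseudo-random sequence far below the signal, then recover it by correlating with the same key. This is a minimal sketch for intuition, not Ethovox's actual technology; the strength and threshold values are arbitrary assumptions:

```python
import random

def keyed_mark(key, n):
    """A +/-1 pseudo-random sequence derived from a secret key."""
    rng = random.Random(key)
    return [rng.choice((-1.0, 1.0)) for _ in range(n)]

def embed_watermark(audio, key, strength=0.005):
    """Mix the keyed sequence in well below the programme level."""
    return [a + strength * m for a, m in zip(audio, keyed_mark(key, len(audio)))]

def detect_watermark(audio, key, threshold=0.0015):
    """Correlate with the keyed sequence; only the right key scores high."""
    mark = keyed_mark(key, len(audio))
    score = sum(a * m for a, m in zip(audio, mark)) / len(audio)
    return score > threshold
```

With the wrong key the correlation averages out to roughly zero, which is why this kind of mark doubles as a proof of ownership rather than just a flag.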




00:27:25

Thank you.




00:27:26

So just just curious.




00:27:28

But you pass that through an analog
pass and the poison pill is gone.




00:27:32

Supposedly it stays.




00:27:34

I mean, that




00:27:35

that is one of the things
that we have been working on




00:27:37

a lot
is making sure the watermark stays.




00:27:39

Now, the poison pill
could be different.




00:27:42

That's a separate technology
and one that's going to be




00:27:44

a bit more in-depth,
but the watermark stays.




00:27:48

This is like you're saying, because
you're thinking of it like HDCP.




00:27:51

Robert.




00:27:51

Yeah, I mean,




00:27:52

I mean, any of those things like rely
on somewhat of a faithful transfer.




00:27:56

So there's always a way to transfer
it unfaithfully enough




00:27:59

but good enough to strip it
of whatever protection




00:28:02

that's been going on
since the days of SCMS.




00:28:05

The DAT tapes and the CDs,




00:28:07

you know,




00:28:08

the logical way of stripping it,




00:28:09

the logical way
of doing it is the one that




00:28:12

we shouldn't be giving away.




00:28:14

But I would imagine
you run it through analog, too.




00:28:17

That's that's what I just said.




00:28:18

You know,




00:28:18

you just dub a tape to an analog
device and you probably will strip it.




00:28:22

Yeah, but a whole bunch




00:28:23

we have to have
some element of relying




00:28:26

on the goodwill of people
because not everyone's an enemy




00:28:29

You're not allowed to kill people,
but people still kill people.




00:28:32

So that's one of the things




00:28:33

if you want to find a way to do
it, you can. Sorry, roll back.




00:28:37

Even a bank is a fortress,
and banks still get robbed. That was my point.




00:28:40

I mean, you know,
you can only do so much, right?




00:28:43

But at least you're doing something.
Yeah.




00:28:44

And we've got it to the point
where




00:28:48

it lasts through,




00:28:50

a ton of mastering, kind




00:28:52

of, you know, reverb, compression,
ripping it apart.




00:28:56

It's got very, very short
gaps all the way through.




00:28:58

So you could really literally




00:28:59

just take two seconds of it,
drop it into something else,




00:29:02

master it, put music on it,
pull it somewhere else,




00:29:05

and it'll follow it
all the way around.
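The "two seconds survives mastering" behaviour described above is the hard part of real watermarking. A crude way to see the shape of it: search for the keyed sequence at a grid of offsets within a short excerpt. The sequence, grid step, and threshold here are illustrative assumptions; a production detector would use FFT-based cross-correlation plus synchronisation patterns, and would also survive resampling and lossy codecs, which this sketch does not:

```python
import random

def keyed_mark(key, n):
    """A +/-1 pseudo-random sequence derived from a secret key."""
    rng = random.Random(key)
    return [rng.choice((-1.0, 1.0)) for _ in range(n)]

def detect_in_excerpt(excerpt, key, full_len, threshold=0.0015, step=24000):
    """Slide the excerpt across a coarse grid of offsets in the full sequence."""
    mark = keyed_mark(key, full_len)
    n = len(excerpt)
    best = max(
        sum(e * m for e, m in zip(excerpt, mark[i:i + n])) / n
        for i in range(0, full_len - n + 1, step)
    )
    return best > threshold
```

Gain changes and added noise scale or dilute the correlation but do not remove it, which is why ordinary mastering moves do not strip this kind of mark; a deliberate analog bounce is a different story, as the conversation notes.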




00:29:07

So, is there any sort of,




00:29:08

anything that I have to do
as an engineer to make that happen?




00:29:13

Or does that happen
the moment you put that on?




00:29:15

Well, actually. Okay, how about this?




00:29:16

How about you come in to the studio
and I'm recording you?




00:29:21

What...
is there a legal obligation on me




00:29:24

to put that watermark on there,
or how does that work?




00:29:26

It's like a thing
that she puts on her tooth.




00:29:29

And then when she speaks,
it all comes out in code.




00:29:32

It's just,




00:29:33

I'm sorry, James Bond,
but it makes a little bit




00:29:36

of everything that's the problem.




00:29:37

It's like having a dog
in the back of your throat.




00:29:41

Yeah. No, I'm just interested in.




00:29:42

How does that work when currently
there's no law, right?




00:29:45

Currently it's a handshake.




00:29:46

No, but how do you see it, though?
How do we do it, say?




00:29:49

Well, I would just say it.




00:29:51

Listen, I would love




00:29:52

to have it in the hands of everybody.
So if I'm recording from home,




00:29:55

I watermark all of my stuff
that leaves here.




00:29:58

You know,
you get it at your production studio,




00:29:59

you add a different watermark




00:30:01

on a different layer
and send it off to whomever. Right?




00:30:03

I would love to see that.




00:30:06

And I would love for that to be,
you know,




00:30:08

kind of the quote
unquote gentleman's handshake




00:30:10

of the industry of like,
hey, you like to work?




00:30:12

I like to work. Oh my God,
do you like money? I like money.




00:30:15

Let's do this together
and make sure that we don't,




00:30:17

you know, kill our industry.




00:30:19

For the ease of uploading your voice
to ElevenLabs.




00:30:21

Right.




00:30:23

The other day,
I saw my first ad on Facebook saying,




00:30:25

take the stress out of mixing.




00:30:27

And it was, you know,
an AI doing your mixing for you.




00:30:30

So, you know, it's like,
take yourself out of a job.




00:30:32

I was at the Blackmagic booth at NAB.
You just bring in your




00:30:37

DaVinci Resolve project, you bring
the audio into Fairlight,




00:30:42

and you say, Fairlight,




00:30:43

Just mix it, please.




00:30:45

And about three minutes later,
the video is mixed. Well,




00:30:49

so obviously they don't want
any audio engineers as their clients.




00:30:54

Blackmagic is video.




00:30:55

I mean,
no one's using Fairlight




00:30:56

to mix their audio anyways,
so I guess that goes to what video




00:30:59

people are using.




00:31:00

The whole point is the video
people will not need to use as many




00:31:03

audio folks.




00:31:04

Yeah, I guess that's like yeah,
yeah, yeah.




00:31:07

But what we're finding, though,




00:31:08

is exactly that,
that people are coming to us




00:31:12

because we don't like, I think
at heart, the majority of people,




00:31:15

or at least the majority,
everyone that's coming to us




00:31:18

is because they don't
want to just give in to the system.




00:31:22

And yes, these technologies
are amazing at the moment




00:31:25

because they have all this money
and investor money behind it,




00:31:29

but they're coming to us
because we're offering them




00:31:31

something that
these other companies don't.




00:31:33

And I think the more, you know,
not that I'm inviting competition,




00:31:37

but the whole point is
make this better for the industry,




00:31:40

protect voice actors, protect
audio engineers, protect the,




00:31:44

you know, agents.




00:31:45

And I mean, when we started,
we were even kind of going,




00:31:48

oh, we should come up with like the
contracts for the actors and decide




00:31:52

how much money are you going
to get paid for your AI thing?




00:31:54

And we were like, whoa, whoa, whoa,
hang on a second.




00:31:57

There are agents that already do that
that are superb at doing it.




00:32:01

And then talking to our agent,
she was like, hey, you know,




00:32:04

and we all kind of said, well,
what about if it's like,




00:32:07

you know, in a year's time
or even now, really, to be fair,




00:32:10

but as the agents, they look after




00:32:13

the human-being Larissa actor
that goes out and does whatever, or,




00:32:17

sorry,
the digital-replica Larissa actor,




00:32:21

and then you can put the two together
for certain jobs.




00:32:24

So if Larissa
is doing a Toyota campaign




00:32:27

because I'm amazing, as you know.
Then she can




00:32:30

she can say that to, you know,
the Toyota production house.




00:32:33

Well, you know, would you like her
to do your internal messaging




00:32:37

so you keep a brand-consistent voice.
So you have a human-being




00:32:40

voice working




00:32:41

and you have the complementary
digital replica,




00:32:45

or, you know, a system
that keeps the industry




00:32:48

the same industry where people
can still keep their jobs




00:32:51

and still be creative beings,
but it gives a lift




00:32:55

and a support to an area
that is in some places.




00:32:59

I mean, it can do it better, faster,




00:33:01

but only as long as it's
also doing it authentically.




00:33:05

That's what we care about
as long as people are getting paid




00:33:07

when it happens
from everyone down the chain.




00:33:12

Then I think that's
what we're striving for.




00:33:14

I was going to say,
how are the agents?




00:33:16

Amazing, amazing.




00:33:18

It's been amazing because it really
is, you know, our job, the way we see




00:33:24

this: our job is to augment
their ability to make money.




00:33:29

Right.




00:33:29

So I see a future where agents
have an animation department,




00:33:33

a commercial department,
a trailer department




00:33:35

and a digital replica
department. Right.




00:33:38

So we make audio demos of the talent
that record with us and send them to




00:33:43

their agent so their agent can use it
as a marketing tool.




00:33:47

And they've been amazing.




00:33:48

We've spoken
with every agency in town,




00:33:50

and several agencies
around the globe.




00:33:51

And again,
the fact that we're actually coming




00:33:54

to them as people in the industry
not going like, yeah,




00:33:57

we made some tech, bro, like,
we actually want to work with you.




00:34:01

Like,
we are going to protect your actors.




00:34:04

We are not going to add




00:34:05

their information to training data
unless they have a separate contract




00:34:08

that says, yes,
I would like to be added to your




00:34:11

training data
for an additional fee. Right.




00:34:14

We're
not opening this up as a marketplace.




00:34:17

The people that contract with us




00:34:18

have very specific access
to very specific models.




00:34:21

It's not just a free for all.




00:34:22

So agents feel comfortable with us
because we actually give a damn,




00:34:26

you know,
the biggest winner out of this?




00:34:28

If this flies
the way you want it to fly,




00:34:30

the biggest winner
is going to be the agents.




00:34:33

Because without this, they're gone.




00:34:35

Yeah, yeah.




00:34:36

Can I come back
to one question




00:34:39

I started asking before,
but we never got to the end of this.




00:34:42

Before you go,
if we get to this nirvana where,




00:34:46

it ticks all the boxes,
everybody's aligned, and voice




00:34:49

actors are selling their samples
to go off




00:34:51

and have commercials
made out of them and stuff.




00:34:54

What does that do to
the craft of voice acting in general?




00:34:57

Listen, I don't think main characters
are going to be replaced




00:35:01

anytime soon, right?




00:35:02

I think there
will always be a place for,




00:35:06

you know, the Courtney




00:35:06

Taylors and the, you know, Nolan




00:35:10

Norths to come in and bring life
to a character beautifully.




00:35:14

I think there is going
to be some hardship




00:35:17

in different parts of the industry,
and it's hard to see it coming




00:35:20

and know that there's very little
any of us can do to stop it.




00:35:24

That scares me. A lot.




00:35:26

And entry level jobs
I'm worried about. Right.




00:35:28

So again, like, our
whole vision has been to




00:35:31

not replace people but to give them
additional sources of income.




00:35:35

So we've actually turned away
a ton of companies




00:35:37

that just want to come in and start
replacing people willy nilly.




00:35:40

And we really want to make sure that
that is not the path that we take.




00:35:44

There are others
who don't care. Right.




00:35:46

And they will take those jobs.




00:35:48

Other
AI platforms will take those jobs.




00:35:51

And that's very difficult
to know that it's coming.




00:35:53

I think the industry will shrink.




00:35:55

I think the people who are able
to make a living doing this




00:35:59

will become a smaller circle.




00:36:01

I think
it's going to be a really tough




00:36:04

couple of years until the pendulum
kind of shifts back.




00:36:07

I liken it to this: around 2000,




00:36:10

if you'll recall, 3D-generated




00:36:13

graphics came out and suddenly
everything looked like Shrek, right?




00:36:16

It was like this big, bulbous
look.




00:36:18

Everything already
walking around, right.




00:36:20

And everything looked like that.




00:36:21

And it was ugly, right.




00:36:23

And it was really kitschy
for a minute. It was kind of neat.




00:36:26

And then it was like,
oh God, this all looks the same.




00:36:28

And people started complaining,
right?




00:36:30

Like, oh,
it's already that way with AI voices.




00:36:32

And I think the only thing
that's even better




00:36:34

is if AI starts feeding on itself




00:36:36

and starts
getting worse and worse and worse.




00:36:37

So now we have more 2D animation,
main characters, and 3D animation




00:36:41

background, right?




00:36:42

I can see the same thing happening




00:36:44

with AI,
because a lot of the companies




00:36:46

that we've spoken to have been like,
look, man,




00:36:48

I don't know, my boss just told me
I got to implement.




00:36:50

I don't know how,
I don't know why, but I got to do it.




00:36:53

Even if it's more expensive
than hiring a human.




00:36:55

I got to fit AI in here somehow.




00:36:58

So there's this massive
pendulum swing




00:36:59

because it's the buzzword
and everybody's




00:37:01

hair is on fire to get AI going
and make it the thing.




00:37:04

And there's a backlash.




00:37:06

Games right now
are very cautious about using AI.




00:37:09

A lot of the major game
studios are like,




00:37:12

I'm not going to ship
anything with AI.




00:37:14

I don't want to upset my fans.




00:37:15

Great. That's great for actors.




00:37:18

And so I think there is a place
that that it will work.




00:37:21

And there is a place
that it will even help.




00:37:24

Can I add two things?




00:37:25

Yeah.




00:37:25

One is, what we are hearing a lot
is that all of you




00:37:30

audio production
people are all creatives as well.




00:37:33

And creatives
want to work with creatives.




00:37:36

So that kind of
synergy together will always stay.




00:37:40

That's what a lot of game companies
are doing. It's like, you know,




00:37:43

we may use it for scratch or for R&D
and to speed up the process.




00:37:47

Or maybe in commercials,
we might do it




00:37:50

so that we can get a feel and hear
how the commercial sounds.




00:37:53

But we want to go to the actor




00:37:54

for the real thing,
because we understand that,




00:37:58

that that kind of magic




00:37:59

that comes out of a human being,
the soul that comes out of




00:38:02

the human being.




00:38:03

What is it? "AI doesn't have
trauma" is kind of the catchphrase.




00:38:06

is that that's
where the magic happens.




00:38:08

And that there have been studies
specifically, SiriusXM




00:38:12

did an extraordinary study
where they were playing like,




00:38:16

polling a whole lot of people,




00:38:17

playing them AI, playing
them human beings, blah, blah, blah.




00:38:20

What they discovered
is that 100% across the board,




00:38:24

if AI was trying to sell them




00:38:27

magic or persuade them
or anything with them,




00:38:30

you know, bring passion or emotion,
anything with emotion, terrible.




00:38:35

People hated it, don't want it.
They could pick it.




00:38:37

They didn't like it.




00:38:38

Even if they didn't know if it was,
they just didn't like the thing.




00:38:41

Now, if AI was giving them a list
of car parts, if it was giving them




00:38:46

these are the features of the hotel,
people were all about it.




00:38:48

If AI was saying, come to our beautiful
hotel, they don't want to hear it.




00:38:52

So there is.




00:38:53

I think human beings
do recognize the difference




00:38:56

Will that change in 20 years' time,
when this generation comes up




00:39:00

Having heard it so much,
I don't know.




00:39:01

I'm not a fortune teller,
but I do think there really is hope




00:39:05

because creative artists
want to be artists.




00:39:08

I think there's another aspect to it.




00:39:10

Also, if you're a creative
leaning on AI technology,




00:39:15

what you use will eventually come
right back around and hit you.




00:39:19

And if you're a writer
using an AI voice,




00:39:22

you're
the next one to be replaced as well.




00:39:25

And I think that in the back
of everyone's head,




00:39:26

they know that it's called
mutually assured destruction.




00:39:29

Robert.




00:39:29

Yeah, I have a friend
who was doing an ADR session,




00:39:34

and they couldn't
get the actor to do it properly




00:39:37

and they ended up,




00:39:39

you know, at the end of the session
at the end of the hour with




00:39:41

this is good enough. And, you know,
a half hour later, my,




00:39:45

my buddy who's an engineer emails
the director back and says like, hey,




00:39:48

I've got something
that I think you're going to like.




00:39:52

It solves the problem.




00:39:53

I don't think you're going
to want to know how I did it.




00:39:55

And the director said,
I don't even want to hear it.




00:39:57

Don't I don't want to like it.




00:39:58

I don't, I just
I want nothing to do with it.




00:40:01

And that was that, and that was cool.




00:40:03

It's like, I just




00:40:04

rather have an organic production
and not have any of this junk in it.




00:40:07

And that way, you know,
we're all in it together and,




00:40:11

so what are our listeners
supposed to do now?




00:40:14

Join us.




00:40:15

Action item.




00:40:15

Where do they go to reclaim their voice? Number one.




00:40:18

Number one, if you are a voice actor,
please join NAVA.




00:40:22

It's navavoices.org.




00:40:24

They are doing the Lord's work.




00:40:26

They're out there every day.




00:40:27

Yeah.




00:40:28

Or NAVA, or AVA, or CAVA, or,
you know,




00:40:31

look at the association of voice
actors in whatever area you live in.




00:40:35

They are the best of us.




00:40:37

Truly doing incredible.




00:40:39

We're truly doing incredible work.




00:40:40

Also, NAVA, and I'm sure all
the other associations,




00:40:44

have it on their website
for free right now.




00:40:47

An AI rider
that you can attach to any contract.




00:40:50

It is free for anybody
under the sun to put on any contract




00:40:54

that you work on that says, you know,
you don't have the right




00:40:56

to take my recordings
to generate an AI voice.




00:40:59

So that is for free right now.




00:41:01

At navavoices.org,
I believe, on their AI subpage.




00:41:06

So that's number one.




00:41:06

Number two,
if you want to hear more about what




00:41:08

we're doing, you can come to Ethovox
AI. That's ethovox.ai.




00:41:13

We have a sign-up where you can
just, you know, get our newsletters




00:41:18

when we have five minutes
to put them together,




00:41:21

which is not as often
as I would like.




00:41:23

You should just have AI do that?




00:41:25

Hey, AI will make wonderful
newsletters.




00:41:27

Yeah.




00:41:28

No, because we're humans.




00:41:29

We want the humans to answer.
I mean, I know it's easier, but.




00:41:32

But there is a real, real value
to that. Sorry.




00:41:35

I would also say legitimately,
and I know




00:41:37

this feels like lip service,
but it's not.




00:41:39

If you live in a country,




00:41:43

learn what




00:41:44

laws are going through your system
and how you can be involved.




00:41:47

If it means you live in the States




00:41:49

and you're calling your reps,
call your reps.




00:41:51

I don't know what the process is
in Australia, but, get involved.




00:41:54

I mean, they they need to know
that this matters, right?




00:41:57

They need to know that,




00:41:58

it's not just the rich and famous
that are losing their voices.




00:42:00

It's literally all of us.




00:42:03

I think those are my.




00:42:04

Those are my big three asks.
You got any others?




00:42:06

No, I think that's it.




00:42:07

Like, just,
you know, just read your contracts.




00:42:10

Read, read.




00:42:11

Your contract is almighty. Production
houses, read your contracts




00:42:15

for the fine... The fine...




00:42:18

What is it? The fine, teeny-tiny language
at the bottom of that thing.




00:42:20

Thank you. Fine print. Thank you.




00:42:22

And all of those contracts
about training data,




00:42:24

and what you're submitting to
when you want to get your quick




00:42:27

little pickup from zero-shot
or whatever. It can be really helpful




00:42:30

if you're in the studio late at night




00:42:32

doing that,
but you are giving your actor's




00:42:33

voice away and you are giving
your studio's IP away.




00:42:36

So read your contract.




00:42:38

Please just have a conversation
with us. No, not us.




00:42:41

Have a conversation
with your actors.




00:42:43

Like I'm sure an actor would rather
just do like a five minute




00:42:46

pickup than have their stuff uploaded
to a site without their knowledge.




00:42:49

Please don't do that. Absolutely.
Please don't do that.




00:42:51

Yeah, I think that's a huge thing.




00:42:53

If you're a writer, if
you're an editor, if you're anybody,




00:42:57

you got to realize that
you do not have the right to upload




00:43:00

any willy-nilly
audio file into these systems.




00:43:03

These systems are currently,
I think, pariahs.




00:43:06

Yeah.




00:43:07

And the other thing, on
the closing note,




00:43:09

AI cannot replicate the imperfections
of the human.




00:43:14

That's right.




00:43:14

I mean, how would it possibly do
Ted Kennedy's voice?




00:43:17

All right, on that note.




00:43:20

What we should say.




00:43:21

Thank you, Cissy and Larissa, for being here.




00:43:23

But were they really?




00:43:25

Oh, I don't know.
Dun dun dun dun.




00:43:30

Well, that was fun.




00:43:31

Is it over? The Pro Audio Suite.




00:43:34

Thanks to Tribooth and Austrian Audio.
Recorded using Source Connect,




00:43:39

edited by Andrew Peters
and mixed by Robbo.




00:43:42

Got your own audio issues?




00:43:43

Just ask Robert.




00:43:44

Don't call him tech support;
that's George the Tech's wisdom.




00:43:47

Don't forget to subscribe to the show
and join the conversation




00:43:50

on our Facebook group
to leave a comment,




00:43:52

suggest a topic, or just say g'day.
Drop us a note at our website:




00:43:56

theproaudiosuite.com.