Transcript
WEBVTT
1
00:00:04.679 --> 00:00:11.910
It's about persuasion, not coercion or
deception. It's a simple formula. Most
2
00:00:12.070 --> 00:00:18.910
consumers are willing to be persuaded.
As we move into a time when our
3
00:00:18.949 --> 00:00:25.500
business relationships are increasingly driven by and
influenced by artificial intelligence and machine learning,
4
00:00:25.780 --> 00:00:31.539
it's important to understand the consequences.
It's important to deploy tech in a way
5
00:00:31.579 --> 00:00:36.090
that puts our people in their best
position to win. My name is Ethan Beute,
6
00:00:36.450 --> 00:00:42.090
co-author of the book Rehumanize Your Business
and I also host the CX series here
7
00:00:42.170 --> 00:00:46.729
on B2B Growth, and our guest, William Ammerman, wrote an excellent book
8
00:00:47.170 --> 00:00:52.119
on the ethics and impacts of artificial
intelligence. He breaks it down into common
9
00:00:52.159 --> 00:00:57.039
language and he'll get you thinking about
what AI means for you as a customer
10
00:00:57.479 --> 00:01:03.000
and as an operator inside a business. Here we go, all right,
11
00:01:03.000 --> 00:01:07.150
I'm opening this episode of the Customer
Experience Podcast with a quote, and here
12
00:01:07.189 --> 00:01:11.950
it is. As our interactions with
artificially intelligent agents become more human like through
13
00:01:11.989 --> 00:01:18.430
natural language algorithms, we will begin
to have conversations with machines. We will
14
00:01:18.469 --> 00:01:23.540
build empathic relationships with those machines in
which we are even more vulnerable to their
15
00:01:23.700 --> 00:01:26.620
influence. Those words come from a
book that you should read. It's called
16
00:01:26.659 --> 00:01:32.060
The Invisible Brand: Marketing in the Age
of Automation, Big Data and Machine Learning,
17
00:01:32.140 --> 00:01:36.810
and obviously has implications for the entire
customer life cycle: marketing, sales and
18
00:01:36.890 --> 00:01:41.329
customer success. But its author,
and our guest today, also covers its
19
00:01:41.370 --> 00:01:46.689
implications for finance and investing, policy
and politics, health and medicine, insurance,
20
00:01:46.730 --> 00:01:49.879
education, religion, even sex.
It's a book I highly recommend and
21
00:01:49.920 --> 00:01:55.200
it draws on his master's work at
UNC Chapel Hill about persuasion as a science
22
00:01:55.480 --> 00:01:59.640
and machine learning's ability to keep our
faces glued to our screens. It also
23
00:01:59.680 --> 00:02:04.510
draws on his work at MIT on
natural language processing and humans' empathic responses
24
00:02:04.670 --> 00:02:08.229
to our devices. He has years
of experience as a VP and EVP in
25
00:02:08.270 --> 00:02:14.389
digital media and digital advertising. He's
currently the Executive Vice President of digital media
26
00:02:14.430 --> 00:02:19.699
at the Engaged Media portfolio of brands. William Ammerman, welcome to The Customer
27
00:02:19.740 --> 00:02:24.340
Experience Podcast here. I'm glad to
have the opportunity to talk about The Invisible
28
00:02:24.379 --> 00:02:28.340
Brand with you. As I said, I really, really enjoyed it.
29
00:02:28.379 --> 00:02:31.330
I recommend it to anyone who's listening.
There's a great deal of tension in there,
30
00:02:31.610 --> 00:02:36.530
and by the tension I mean,
you know, the human experience with machines.
31
00:02:36.569 --> 00:02:39.770
As they get smarter and they know
more about us, they can leverage
32
00:02:39.810 --> 00:02:45.840
that against us, potentially in this
vulnerability of this empathic response that we have
33
00:02:45.960 --> 00:02:49.120
and all these things, just so rich. So, before we get going,
34
00:02:49.319 --> 00:02:55.039
let's cut right to it: is
human domination by superintelligence inevitable? No,
35
00:02:55.240 --> 00:03:00.830
I don't think so. I think
a partnership between humans and machines is probably
36
00:03:00.870 --> 00:03:05.870
a better way of expressing what I
view as the future. But I do
37
00:03:06.030 --> 00:03:09.430
think that machines get smarter and smarter
and I think that machines will continue to
38
00:03:09.469 --> 00:03:15.699
do things that we associate with humanlike
intelligence and that we are just really
39
00:03:16.020 --> 00:03:20.699
starting to experience it, so that
we've just kind of seen the tip of
40
00:03:20.740 --> 00:03:23.740
the iceberg. What are your thoughts when I say customer experience? You were
41
00:03:23.780 --> 00:03:28.889
preparing to come on to The Customer
Experience Podcast. What does customer experience mean
42
00:03:28.969 --> 00:03:31.409
to you? I don't have a
pat definition, but when I
43
00:03:31.490 --> 00:03:37.810
think of the customer experience, I
think of the interaction between brands and consumers
44
00:03:38.289 --> 00:03:44.360
and I think about whether or not
the consumer has a positive or negative impression
45
00:03:44.400 --> 00:03:50.439
of the brand and I also think
about whether or not the relationship feels ethical
46
00:03:50.639 --> 00:03:53.840
and transparent and trusted. So when
I think of, you know, kind
47
00:03:53.879 --> 00:03:57.949
of the customer decision journey, which
is a phrase that we use a lot
48
00:03:58.069 --> 00:04:02.310
in marketing, I think the customer
experience is kind of informing the customer decision
49
00:04:02.349 --> 00:04:08.949
journey throughout the relationship between the brand
and the buyer, the consumer.
50
00:04:09.590 --> 00:04:13.099
I like that you tapped there one
of those kind of tension points that
51
00:04:13.180 --> 00:04:16.540
I felt in reading The Invisible Brand, which is this transparency element,
52
00:04:16.699 --> 00:04:19.899
right, like what's inside the black
box? Why am I getting these particular
53
00:04:19.980 --> 00:04:24.370
recommendations? Why am I seeing these
particular ads and all of that, and
54
00:04:24.490 --> 00:04:29.009
we will get into that. There's
one explicit reference to customer experience in the
55
00:04:29.089 --> 00:04:32.569
book, in those words, and
it was around a negative experience with a
56
00:04:32.610 --> 00:04:35.930
fifty percent off sale. Do you
want to tell that story just quickly?
57
00:04:38.250 --> 00:04:42.639
Oh boy, yeah, so my
family was visiting me. I had an
58
00:04:42.759 --> 00:04:46.199
apartment up in New York City and
around Christmas time my family came up and
59
00:04:46.279 --> 00:04:51.519
we did some shopping in the city
together and my daughter spied a coat in
60
00:04:51.639 --> 00:04:58.829
the window of a Columbia store
in the Meatpacking District, I guess
61
00:04:58.910 --> 00:05:03.550
we were, kind of
near Chelsea Market. Anyway, the window,
62
00:05:03.990 --> 00:05:09.220
you know, clearly showed a fifty
percent off sign with a rack of
63
00:05:09.339 --> 00:05:12.620
coats. We went in, my
daughter tried one on, you know,
64
00:05:12.779 --> 00:05:15.939
retail price on it was like three
hundred and fifty dollars, which was more
65
00:05:15.980 --> 00:05:19.500
than I was planning on spending on
a single present. But she said,
66
00:05:19.540 --> 00:05:23.370
she pointed out it was fifty percent
off and I thought it was still pretty
67
00:05:23.370 --> 00:05:26.170
expensive, but you know, she
needs a winner code. And okay.
68
00:05:26.569 --> 00:05:30.810
So we took it up to the
cash register and they rang it up and
69
00:05:30.730 --> 00:05:34.689
it was not fifty percent off and
I said it's on the rack that says
70
00:05:34.720 --> 00:05:39.240
fifty percent off, and the guy,
you know, said, sorry, it's
71
00:05:39.279 --> 00:05:41.800
not in the
system. I said, you know,
72
00:05:41.879 --> 00:05:45.720
is your manager here? And,
you know, we went through the rigmarole.
73
00:05:46.040 --> 00:05:50.550
Well, I reached out to their
online customer service and I took photographs
74
00:05:50.589 --> 00:05:53.709
of it. I said look,
you know, this is a coat hanging
75
00:05:53.790 --> 00:05:58.430
on a rack that's clearly marked fifty
percent off, and I got to say
76
00:05:58.470 --> 00:06:03.459
their customer service department failed utterly. You know, they were unable to
77
00:06:03.699 --> 00:06:06.459
help me at all and in fact, they said, you know, you
78
00:06:06.579 --> 00:06:10.980
really need to talk to the store
manager at that individual store because we're just
79
00:06:11.100 --> 00:06:14.420
the online store. And I said, well, you know, I talked
80
00:06:14.459 --> 00:06:16.699
to him and he wouldn't honor the
sign that was in his own store that
81
00:06:16.819 --> 00:06:20.170
clearly said fifty percent off,
and they said, well, you
82
00:06:20.290 --> 00:06:24.569
really have to talk to him or
his regional manager. And I said,
83
00:06:24.769 --> 00:06:27.290
well, who's the regional manager?
And they said, I don't know,
84
00:06:28.810 --> 00:06:31.730
I don't know. So,
I'm just, like, talking to Columbia.
85
00:06:31.769 --> 00:06:35.399
I'm talking to the actual corporation.
Who's the regional manager? Who do I
86
00:06:35.480 --> 00:06:41.000
escalate this to? I don't know. They literally failed every step of the
87
00:06:41.079 --> 00:06:46.079
way, and so what could have
been an opportunity to build a positive customer
88
00:06:46.160 --> 00:06:48.550
relationship. Fifty percent off, you
know, drawing a customer in. I
89
00:06:48.629 --> 00:06:51.310
was on the street, I wasn't
planning on going into the store. They,
90
00:06:51.509 --> 00:06:55.029
you know, lured me into the
store with their sign. They could
91
00:06:55.029 --> 00:06:58.829
have gathered information about, you know, who I was and how I was
92
00:06:58.949 --> 00:07:02.339
responding to their sales and they probably
didn't even bother to figure out that I
93
00:07:02.420 --> 00:07:08.819
was angry about the fact that I
had been snubbed and that their advertising had
94
00:07:08.819 --> 00:07:13.579
been deceptive and that the store manager
operating their store was, you know,
95
00:07:13.779 --> 00:07:18.850
clearly using false advertising. And so
it's an opportunity lost for a company like
96
00:07:18.970 --> 00:07:24.769
Columbia. They lost my business and
in fact I have three children and a
97
00:07:24.850 --> 00:07:28.290
wife who no longer shop at Columbia
as a result. It was insulting.
98
00:07:28.329 --> 00:07:30.959
It was a, you know,
very poor decision on their part and the
99
00:07:31.079 --> 00:07:36.240
fact that their, you know,
their organizational customer service was so bad that
100
00:07:36.319 --> 00:07:41.199
they couldn't even recognize that they'd made
a mistake was humiliating for them. I
101
00:07:41.279 --> 00:07:44.750
mean, you know, it's just
a blunder for any corporation to go through
102
00:07:44.790 --> 00:07:48.790
life blind to what it's doing and
how it's treating consumers. So lots of
103
00:07:48.910 --> 00:07:54.509
opportunities missed and it ended up in
a book, right, and on this show,
104
00:07:54.910 --> 00:07:57.990
on this show, all over, and,
I'm sure, I would guess, others,
105
00:07:58.310 --> 00:08:03.540
all over one hundred seventy five dollars, right, right, just so crazy.
106
00:08:03.579 --> 00:08:05.699
When you think about that long term, I just think you just,
107
00:08:05.980 --> 00:08:09.740
you know, your family alone could
purchase ten x that amount over the next
108
00:08:09.740 --> 00:08:13.250
five years. Oh, and in ski
season. Right, there you go.
109
00:08:13.610 --> 00:08:16.370
So, anyway, as soon as
I saw that word, I always read
110
00:08:16.370 --> 00:08:20.410
a book with a pencil and I
underline things and put things in parentheses.
111
00:08:20.490 --> 00:08:24.009
Periodically I'll mark things in the margin. So when I saw customer experience, it was
112
00:08:24.050 --> 00:08:26.160
like, I'm going to ask him about
that. So let's get into the
113
00:08:26.199 --> 00:08:31.079
Invisible Brand very specifically. Let's start
at a high level. What are you
114
00:08:31.199 --> 00:08:35.799
addressing in the book from a brand
or company perspective? Like what's going on
115
00:08:35.960 --> 00:08:39.919
at a high level from a marketer's
or salesperson's or company's or a brand's perspective?
116
00:08:39.960 --> 00:08:45.309
Yeah, so let's first start
with that title, The Invisible Brand.
117
00:08:45.750 --> 00:08:50.669
I was playing off of Adam Smith's
Wealth of Nations, where he coined
118
00:08:50.750 --> 00:08:54.820
the phrase the invisible hand,
this feature of the economy that operates to
119
00:08:56.059 --> 00:09:00.299
create wealth in society, and
I thought, you know, we're really
120
00:09:00.620 --> 00:09:05.460
onto something new, we're experiencing a
brand new force in the economy and I
121
00:09:05.580 --> 00:09:09.330
had to give it a name and
I thought, the invisible advertiser, or the invisible,
122
00:09:09.889 --> 00:09:13.929
you know, marketer, and I
thought what rhymes with hand, and
123
00:09:13.169 --> 00:09:16.970
then it kind of hit me between
the eyes. Invisible brand. But the
124
00:09:18.330 --> 00:09:24.090
reference is very much about revealing something
that is hidden and I wanted to kind
125
00:09:24.090 --> 00:09:28.240
of pull back the curtain and show
people how the technology that they're using through
126
00:09:28.240 --> 00:09:33.399
their phones and their laptops and even
their televisions and cars, is being used
127
00:09:33.440 --> 00:09:41.470
by marketers to gather information, personalize
information delivery back to the consumer in such
128
00:09:41.509 --> 00:09:46.149
a way that they build a relationship
with you so that they can get you
129
00:09:46.710 --> 00:09:52.629
to be more open to their messaging
and ultimately figure out how to use artificial
130
00:09:52.710 --> 00:09:58.259
intelligence to leverage that personal interaction to
persuade you and to change the way you
131
00:09:58.580 --> 00:10:03.419
act and think and what you buy. And so kind of pulling back the
132
00:10:03.580 --> 00:10:07.289
curtain on that invisible force in our
lives was really the purpose of the book.
133
00:10:07.809 --> 00:10:11.129
And it's all right here. I
mean, you're not talking. I mean,
134
00:10:11.129 --> 00:10:15.529
there is some, you know, future look throughout, but I mean,
135
00:10:15.570 --> 00:10:18.649
this is all very, very present. So from a consumers point of
136
00:10:18.690 --> 00:10:22.279
view, because we're all customers
as well, as you already established with
137
00:10:22.399 --> 00:10:26.320
your story there about your Columbia
experience, talk about this a little bit
138
00:10:26.360 --> 00:10:33.759
from the customer's perspective. Yeah,
so from that perspective, most consumers
139
00:10:33.840 --> 00:10:39.029
have this kind of sensation that they're
being watched and followed. They recognize that
140
00:10:39.710 --> 00:10:43.389
they've got this device sitting on their
counter that listens to them, that they
141
00:10:43.509 --> 00:10:50.299
talk to. So consumers are kind
of aware and they increasingly have this awareness
142
00:10:50.340 --> 00:10:56.659
of being surveilled and even spied
upon. I often say there's a fine
143
00:10:56.700 --> 00:11:01.259
line between marketing and espionage, and
so where we are today is that consumers
144
00:11:01.340 --> 00:11:07.169
have this kind of sneaking suspicion that
they're being manipulated. But what I was
145
00:11:07.289 --> 00:11:11.009
trying to do in the book is
to explain exactly how that works and to
146
00:11:11.169 --> 00:11:16.250
equip people with kind of the knowledge
that they can think both about the opportunities
147
00:11:16.330 --> 00:11:20.440
that this technology presents as well
as the risks. And one of the
148
00:11:20.799 --> 00:11:24.960
examples, I don't think I wrote
it exactly this way in the book,
149
00:11:24.960 --> 00:11:30.120
but it's kind of a simple explanation. Imagine you're operating a pet store and
150
00:11:30.720 --> 00:11:33.269
somebody comes to leave their pet with
you. You have the right to,
151
00:11:33.629 --> 00:11:37.230
you know, ask for specific types
of information. I need to know where
152
00:11:37.230 --> 00:11:39.269
you're going, I need to know
how long you're going to be there,
153
00:11:39.309 --> 00:11:43.029
I need to know how to reach
you if, you know, Fido gets
154
00:11:43.110 --> 00:11:48.299
sick. So there's this sharing of
information where the consumer gives up certain data
155
00:11:48.980 --> 00:11:54.139
that the business owner needs in order
for them to have a positive relationship.
156
00:11:54.259 --> 00:12:00.419
And consumers for the most part,
believe in that kind of information exchange.
157
00:12:00.460 --> 00:12:05.330
It's transparent, it's open, they
recognize that there's a value that they're getting
158
00:12:05.370 --> 00:12:07.850
back, and in this case it's
the safety of my pet. But where
159
00:12:09.090 --> 00:12:11.690
we are, you know, ready
to draw the line is when we move
160
00:12:11.929 --> 00:12:16.480
from using the information the way we
expected it to be used to doing something else
161
00:12:16.559 --> 00:12:20.200
with it. You know,
the extreme example here would be, you know,
162
00:12:20.399 --> 00:12:22.360
you know I'm going to be out
of town this weekend, so you
163
00:12:22.480 --> 00:12:26.759
go rob my house. Okay,
now we've crossed the creepy line. I
164
00:12:26.840 --> 00:12:31.629
don't want you roaming around in my
house and eating my food and, you
165
00:12:31.750 --> 00:12:37.350
know, stealing my stuff because you've
been privileged to receive information about my whereabouts.
166
00:12:37.750 --> 00:12:41.750
And I think that most consumers
sense this intuitively, that there's
167
00:12:41.789 --> 00:12:48.779
this line that businesses shouldn't cross.
But I think what we have to understand,
168
00:12:48.899 --> 00:12:52.379
you know, about where the technology
has taken us is that we've gone way
169
00:12:52.419 --> 00:12:58.289
faster than regulatory environments have kept up
with. We are in a realm where
170
00:12:58.330 --> 00:13:03.490
data is being transmitted about who we
are, where we are, what we
171
00:13:03.610 --> 00:13:07.009
do, what we buy, what
we think, and that ownership over that
172
00:13:07.450 --> 00:13:15.279
knowledge is something there's an
active conversation going on today about, you
173
00:13:15.360 --> 00:13:18.000
know, who really owns that data
and knowledge, and some people, Tim
174
00:13:18.080 --> 00:13:22.279
Berners-Lee, the father of the
Internet, or rather the World Wide Web,
175
00:13:22.799 --> 00:13:26.879
has said openly, you know,
we need to have a Magna Carta
176
00:13:26.000 --> 00:13:31.830
for privacy, for personal information, and I think that to a degree
177
00:13:31.909 --> 00:13:37.629
he's right that there needs to be
a real serious rethinking of who owns our
178
00:13:37.750 --> 00:13:41.700
data and who owns information about us. Yeah, and we're starting to see
179
00:13:41.740 --> 00:13:46.500
that roll out a little bit.
Obviously, you hit on GDPR and the
180
00:13:46.980 --> 00:13:50.419
California law that's in flight. My
understanding is that several other states are kind
181
00:13:50.419 --> 00:13:54.659
of in consideration there. But you
know, to your point, there's so
182
00:13:54.740 --> 00:13:58.529
much that hasn't been regulated and honestly, I'm you know, I'm a little
183
00:13:58.570 --> 00:14:01.970
bit skeptical. I wouldn't call myself a glass-half-empty
184
00:14:03.049 --> 00:14:07.690
type of person, but I would
say on average I would not trust companies
185
00:14:07.730 --> 00:14:11.960
with their own specific motives with my
data in a completely unregulated environment. The
186
00:14:13.039 --> 00:14:16.480
other interesting layer here is because,
and I was happy to hear the way
187
00:14:16.480 --> 00:14:20.559
you describe the way you approach the
book, because I think you really delivered
188
00:14:20.639 --> 00:14:22.639
on your own hopes or expectations for
it, at least the way you describe
189
00:14:22.759 --> 00:14:24.909
it here, which is, you know, peel it back and put it in
190
00:14:26.029 --> 00:14:28.149
common language: this is how
this stuff works. And so the other
191
00:14:28.269 --> 00:14:33.909
thing too, is that that data
can then be paired and used and resold
192
00:14:33.990 --> 00:14:37.429
in all these other things and all
of a sudden it's just, like, the
193
00:14:37.549 --> 00:14:41.379
genie's out of the bottle,
Pandora's box is open and all this information's out
194
00:14:41.419 --> 00:14:48.340
there and it's all being matched up, and now machines have developed
195
00:14:48.379 --> 00:14:52.500
or are developing profiles that know me as
well as or better than I know myself, and not
196
00:14:52.700 --> 00:14:54.809
only, I'm getting a little bit
ahead of the conversation here, not only
197
00:14:56.529 --> 00:15:00.129
predict what I'm going to do,
but prescribe it to me. So
198
00:15:00.289 --> 00:15:05.889
it is something to be sensitive to. Just to stay in the spirit of let's
199
00:15:05.889 --> 00:15:09.240
make this all approachable and walk it
down, I'm going to give you
200
00:15:09.360 --> 00:15:13.320
a few terms here. Just give
me some quick, lightweight definitions on them.
201
00:15:13.399 --> 00:15:20.000
Artificial intelligence. I started the book
off with a joke I've heard a
202
00:15:20.039 --> 00:15:24.350
few times, which is artificial intelligence
is the art of making machines act like
203
00:15:24.549 --> 00:15:30.269
they do in the movies. And
of course, you know, the movie-making
204
00:15:30.309 --> 00:15:35.059
industry portrays, you know, robots
that can easily, you know, blend
205
00:15:35.139 --> 00:15:39.539
into society without being detected. I
don't think we're there yet. I think
206
00:15:39.659 --> 00:15:43.820
that, you know, we are
beyond the Turing test and that, you know,
207
00:15:43.980 --> 00:15:48.220
a customer chatbot engine can pull
off being a human being, but
208
00:15:48.299 --> 00:15:52.690
I don't think we've achieved, you
know, what we refer to as general
209
00:15:52.929 --> 00:15:58.570
artificial intelligence. Most of the applications
of AI that are working today are narrow
210
00:15:58.649 --> 00:16:04.120
applications. They offer specific solutions to
problems that we would normally think of as
211
00:16:04.279 --> 00:16:08.840
being solved by a human but they
aren't general in the sense that they don't
212
00:16:08.840 --> 00:16:12.279
solve all problems. They just solve
one narrow type. So when I think
213
00:16:12.320 --> 00:16:18.830
of artificial intelligence, I think of
it as the process of teaching computers to
214
00:16:18.990 --> 00:16:23.269
solve problems that we would normally think
of as being solved by human beings.
215
00:16:23.309 --> 00:16:30.389
And right now the applications of AI
are still narrow. They're very sophisticated and
216
00:16:30.549 --> 00:16:37.019
they're super impressive, but they're still
fairly narrow in scope. Super. Algorithm?
217
00:16:37.340 --> 00:16:44.860
So, unlike a mathematical algorithm, a computer
algorithm is simply a set of instructions that
218
00:16:44.980 --> 00:16:48.610
you follow, and in the
book I talked about how you have an algorithm
219
00:16:48.730 --> 00:16:52.769
for waking up in the morning.
Your alarm clock goes off, you turn
220
00:16:52.809 --> 00:16:56.289
the alarm off, you swing your
feet out of bed and you stand up
221
00:16:56.330 --> 00:16:59.330
and turn on the light or whatever
that is. So there's a sequence of
222
00:16:59.409 --> 00:17:03.120
events that you follow that you could
consider an algorithm. What's interesting
223
00:17:03.159 --> 00:17:07.799
about algorithms is they can be packed
into other algorithms. That little get up
224
00:17:07.799 --> 00:17:11.279
out of bed algorithm can be paired
up with a brush your teeth algorithm and
225
00:17:11.400 --> 00:17:15.710
an eat breakfast algorithm and a go
to work algorithm, and all of those
226
00:17:15.789 --> 00:17:18.670
can be nested inside a larger algorithm
called, you know, my day,
227
00:17:19.190 --> 00:17:23.910
and the my day algorithm could be
nested inside an algorithm called my month and
228
00:17:25.029 --> 00:17:29.509
my year. So you can quickly
see how algorithms can pair with one another.
229
00:17:29.630 --> 00:17:33.099
They can be nested inside of one
another, grouped together to solve problems.
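The nesting being described here, small routines packed inside larger ones, can be sketched in a few lines of Python; the routine names and steps are invented for illustration and aren't from the book:

```python
# Each small routine is an "algorithm": a fixed sequence of steps.
def wake_up():
    return ["alarm off", "swing feet out of bed", "turn on light"]

def brush_teeth():
    return ["load brush", "brush", "rinse"]

def go_to_work():
    return ["grab keys", "commute"]

# Algorithms nest: my_day just runs the smaller algorithms in order,
# and my_day could itself be nested inside a my_month routine, and so on.
def my_day():
    steps = []
    for routine in (wake_up, brush_teeth, go_to_work):
        steps.extend(routine())
    return steps

print(my_day())
```

Each function is itself an algorithm; composing them is what lets small instruction sets group together to solve bigger problems.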
230
00:17:33.220 --> 00:17:37.859
So think of an algorithm. Start
by thinking of an algorithm as a set
231
00:17:37.900 --> 00:17:42.220
of instructions for a computer to follow.
Quick follow-up there on algorithm. Again,
232
00:17:42.259 --> 00:17:47.690
I'm offering this for the listener who
doesn't understand these things, much like myself.
233
00:17:48.250 --> 00:17:52.849
Is there a final rule in a
set that basically observes, learns,
234
00:17:52.650 --> 00:18:02.240
optimizes, so that it's self-perpetuating and
gets more and more effective and efficient? Yes,
235
00:18:02.400 --> 00:18:04.880
when we talk about machine learning,
we're really talking about the ability for
236
00:18:06.000 --> 00:18:12.390
machines, on one hand, to
detect patterns in data and use either supervised
237
00:18:12.390 --> 00:18:22.029
learning or unsupervised learning to more efficiently
recognize those patterns and then ultimately learn how
238
00:18:22.109 --> 00:18:26.390
to change outcomes from those patterns.
So if you see, you know,
239
00:18:26.630 --> 00:18:30.619
a shopping experience and you watch the
customer decision journey from the top of the
240
00:18:30.660 --> 00:18:33.460
funnel all the way through the bottom
of the funnel and you can see where
241
00:18:33.460 --> 00:18:38.259
people drop out, you can gather
tens of millions of data points about the
242
00:18:38.339 --> 00:18:42.730
customer experience and you've got sixty variables
to, you know, play with,
243
00:18:44.049 --> 00:18:47.809
you can start to actually see how
a computer would be much more efficient at
244
00:18:47.849 --> 00:18:53.289
not only detecting patterns in that data
but then learning how to actually change the
245
00:18:53.329 --> 00:18:57.359
outcomes. If I tweak this,
let me watch what happens as a consumer
246
00:18:57.480 --> 00:19:02.160
moves through the funnel. If I
tweak this, let me see what impact
247
00:19:02.240 --> 00:19:07.799
that has and ultimately bend the consumer
experience towards more positive outcomes. Whatever your
248
00:19:07.839 --> 00:19:12.190
KPI is, whether it's a conversion
rate, somebody making a purchase, or total
249
00:19:12.309 --> 00:19:18.430
dollars, the computer can look at
all of those variables that go into the
250
00:19:18.549 --> 00:19:22.789
customer experience and then, at the
end start to see, okay, if
251
00:19:22.869 --> 00:19:26.220
I, you know, do X, this happens. If I do Y,
252
00:19:26.700 --> 00:19:30.940
this other thing happens, and I
can start to actually bend my campaigns,
253
00:19:30.099 --> 00:19:33.980
my marketing, my customer outreach and
all of those touch points with the
254
00:19:34.059 --> 00:19:41.089
consumer to change the outcome. So
instead of just thinking of the modern computer
255
00:19:41.289 --> 00:19:45.410
experience as just plotting points on a
graph and then drawing a line of best
256
00:19:45.490 --> 00:19:48.009
fit so that you can try to
figure out, you know, where the
257
00:19:48.089 --> 00:19:51.490
next point will fall, think of
it as a bunch of points on a
258
00:19:51.569 --> 00:19:56.680
graph that the computer recognizes and says, okay, if I want the next
259
00:19:56.759 --> 00:20:00.119
point to fall on this line,
I have to do X, Y and
260
00:20:00.200 --> 00:20:04.519
Z, and that's prescriptive. The
computer's starting to think about how to actually
261
00:20:04.559 --> 00:20:11.069
change the outcomes rather than merely predict
the outcome. Really good. That was
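The predict-versus-prescribe distinction just described can be made concrete with a toy sketch; the funnel model and every number in it are invented for illustration, not taken from the book:

```python
# Toy funnel model: conversion rate as a function of one variable we control.
# A real system would learn this relationship from millions of data points.
def conversion_rate(discount_pct):
    return min(0.05 + 0.004 * discount_pct, 0.25)

def revenue_per_visitor(discount_pct, price=100.0):
    # Deeper discounts convert more shoppers but earn less per sale.
    return conversion_rate(discount_pct) * price * (1 - discount_pct / 100)

# Predictive: given a planned input, estimate where the next point falls.
predicted = revenue_per_visitor(20)

# Prescriptive: search the input we control for the outcome we want.
best_discount = max(range(0, 71), key=revenue_per_visitor)
```

Same data and same model either way; the only difference is whether the machine reports the next point on the line or chooses the inputs that move it.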
262
00:20:11.109 --> 00:20:15.069
I'm so glad I asked that follow
up question. Another one: neural network.
263
00:20:17.630 --> 00:20:22.259
Well, the reference to neural is
simply organizing a computer network in the way
264
00:20:22.539 --> 00:20:27.940
that a human brain is organized,
and I think that that term is often
265
00:20:29.299 --> 00:20:33.660
a little misleading, maybe. And,
you know, we're not actually building human
266
00:20:33.819 --> 00:20:37.450
brains, but what we're trying to
do is we're trying to suggest that the
267
00:20:37.609 --> 00:20:42.210
interaction of the data through the network
is not following a single linear path,
268
00:20:42.369 --> 00:20:48.769
that there's lots of processes that are
happening simultaneously and it is the culmination of
269
00:20:48.970 --> 00:20:55.839
lots of simultaneous processes that is generating
answers or outcomes, which makes it more
270
00:20:56.480 --> 00:21:00.440
of a neural network. And when
we start stacking neural networks, you know,
271
00:21:00.519 --> 00:21:03.880
we can start to create what we
think of as deep learning. We
272
00:21:03.960 --> 00:21:08.390
start to recognize new benefits from the
complexity of the systems that we're leveraging and
273
00:21:08.829 --> 00:21:15.710
that gives us some really powerful outcomes. Good last one, and this is
274
00:21:15.789 --> 00:21:21.539
a term I believe you created in
constructing the book. It captures the personalized,
275
00:21:21.700 --> 00:21:26.779
persuasive ability to learn, the anthropomorphic
nature of the way the machines
276
00:21:27.259 --> 00:21:32.380
interface with us, and that is
psychotechnology. Feel free to to kind of
277
00:21:32.420 --> 00:21:34.650
run with this one a little bit
talk about the elements of it and how
278
00:21:34.690 --> 00:21:38.609
you arrived at that term. Yeah, in the book I define four key
279
00:21:38.730 --> 00:21:44.049
trends that you just mentioned and I'll
try to illuminate them just briefly. The
280
00:21:44.170 --> 00:21:48.160
first that you mentioned and I previously
mentioned is the personalization of information. If
281
00:21:48.200 --> 00:21:52.039
you and I were sitting next to
each other at the airport and we both
282
00:21:52.079 --> 00:21:55.359
had our laptops open and we
went to the New York Times dot com, we might
283
00:21:55.440 --> 00:21:56.960
see different ads. You might see
ads that are tailored to you, I
284
00:21:57.039 --> 00:22:00.670
might see ads tailored to me.
So we're seeing, you know, different
285
00:22:00.750 --> 00:22:04.309
information. Your Facebook feed is different
than my Facebook feed. The news you
286
00:22:04.430 --> 00:22:08.869
consume is tailored to you, customized
to your, you know, wants, and
287
00:22:10.029 --> 00:22:14.430
increasingly we are seeing information reflect back
to us, almost like an echo chamber,
288
00:22:14.829 --> 00:22:18.579
of what we have previously demonstrated an
interest in, and that's a really
289
00:22:18.779 --> 00:22:25.740
important, fundamental idea: the machine
is now equipped to address you, Ethan, and
290
00:22:25.980 --> 00:22:32.250
you, Bill, in different terms. That's
the personalization of information and that's distinct from
291
00:22:32.289 --> 00:22:37.289
the world I grew up in,
where a broadcast tower delivered a message to
292
00:22:37.410 --> 00:22:41.569
everyone at the same time. Now
computers are developing one-on-one relationships with
293
00:22:41.730 --> 00:22:47.799
individuals. We are even able to
write news copy headlines on the fly,
294
00:22:48.640 --> 00:22:52.359
tailored to you and how you are
motivated and what things you like to read.
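A minimal sketch of that personalization of information, with invented profiles and copy rather than anything from the book: two readers load the same page and get different headlines based on demonstrated interest.

```python
# Invented headline variants keyed to what a reader has shown interest in.
HEADLINES = {
    "deal_seeker": "50% off winter coats, today only",
    "researcher": "How to choose a winter coat that lasts a decade",
    "default": "Winter coats for the whole family",
}

def pick_headline(profile):
    """Return the variant matching this reader's demonstrated interests."""
    interests = profile.get("interests", [])
    if "discounts" in interests:
        return HEADLINES["deal_seeker"]
    if "product_reviews" in interests:
        return HEADLINES["researcher"]
    return HEADLINES["default"]

# Two people at the airport, same page, different messages.
print(pick_headline({"interests": ["discounts"]}))
print(pick_headline({"interests": []}))
```

A production system would learn these mappings from behavior rather than hand-written rules, but the effect is the same: the broadcast tower is replaced by a per-reader message.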
295
00:22:52.880 --> 00:22:59.109
A second is that persuasion has become
a science. We are no longer
296
00:22:59.589 --> 00:23:02.750
six people in a room with a
wet finger in the air trying to feel
297
00:23:02.750 --> 00:23:04.349
out which way the wind blows.
You know, it's no longer kind of
298
00:23:04.430 --> 00:23:08.630
that simple focus group testing of
messages. We are now at a point
299
00:23:08.990 --> 00:23:15.339
where we can A/B test messaging across
thousands and even tens of thousands, and
300
00:23:15.380 --> 00:23:19.859
if you consider Google a giant
focus group: billions of people. And
301
00:23:21.019 --> 00:23:26.779
the ability to persuade really comes down
to defining key performance indicators that can be
302
00:23:26.900 --> 00:23:32.809
tracked, watching what messaging people are
exposed to and how they behave, and
303
00:23:32.930 --> 00:23:37.170
then seeing can I change those behaviors
by changing the messaging? And the answer
304
00:23:37.210 --> 00:23:41.799
is yes, we can. You
know, that is something as simple as
305
00:23:41.319 --> 00:23:45.000
on your Facebook app on your phone, a little number one that pops up
306
00:23:45.400 --> 00:23:51.640
that causes you to think, oh, somebody liked my picture, somebody liked
307
00:23:51.880 --> 00:23:56.150
my post. Well, guess what? We tend to gravitate toward those things
308
00:23:56.430 --> 00:24:00.710
and compulsively click to open that up
and see, well, who liked my
309
00:24:02.109 --> 00:24:06.109
Facebook post? That's
a form of persuasion. You are being
310
00:24:06.349 --> 00:24:11.059
manipulated, you're being changed by that
little number one that pops up and the
311
00:24:11.180 --> 00:24:15.220
reward you get is that little drip
of dopamine in your brain that makes you
312
00:24:15.259 --> 00:24:18.740
feel like, ah, somebody loves
me. Well, that kind of persuasion
313
00:24:18.819 --> 00:24:23.329
is being deployed in marketing. It's
being deployed in video games. You know,
314
00:24:23.410 --> 00:24:26.809
we can keep your kids glued to
their video game for seventy hours without
315
00:24:26.809 --> 00:24:32.490
going to the bathroom just to,
you know, earn the last piece
316
00:24:32.569 --> 00:24:36.690
of armor in there, you know, for their knight in their medieval war
317
00:24:36.849 --> 00:24:41.039
game. So you know, we
are really good at persuasion. The next
318
00:24:41.200 --> 00:24:47.480
is that natural language processing is being
deployed with machine learning so that we can
319
00:24:47.599 --> 00:24:53.670
now talk to machines that learn how
to persuade us using personalized information. So
320
00:24:53.710 --> 00:25:00.390
I kind of blended the last two
together. There's the personalization of information, persuasion
321
00:25:00.430 --> 00:25:04.230
as a science, natural language processing
and machine learning, all taken together,
322
00:25:04.869 --> 00:25:11.019
create technology that we're talking to and
we have to be aware that the machine
323
00:25:11.019 --> 00:25:17.099
that we're talking to is listening.
It's learning about us in order to persuade
324
00:25:17.140 --> 00:25:22.049
us using our own personalized information,
and I thought we need a name for
325
00:25:22.130 --> 00:25:25.250
that. We've got to name it. What do we
call that? And I thought about it
326
00:25:25.329 --> 00:25:30.690
really hard. That's technology that is
operating on us psychologically. What would you
327
00:25:30.769 --> 00:25:36.680
call psychological technology? You call it
psychotechnology, or you could even shorten that
328
00:25:36.720 --> 00:25:42.000
if you wanted to, psychotech.
And I am committed to that idea that
329
00:25:42.160 --> 00:25:48.160
this is truly something new and unique
and I need to be out there explaining
330
00:25:48.279 --> 00:25:52.230
to everybody how it works, because
they deserve to know and understand how it
331
00:25:52.589 --> 00:25:56.869
is impacting them. Already. It's
already at work changing you and me.
332
00:25:57.829 --> 00:26:02.910
It is, and you've already cited
a couple examples of it just in that
333
00:26:03.069 --> 00:26:04.779
explanation, which is really, really
good. I think the way you folded those
334
00:26:04.859 --> 00:26:08.259
together made a lot of sense to
me and I think it'll resonate with folks
335
00:26:08.299 --> 00:26:11.579
that kind of observe these bits and
pieces. But you did such a
336
00:26:11.619 --> 00:26:15.019
nice job pulling it all together into
something that we can kind of wrap our
337
00:26:15.019 --> 00:26:18.289
heads around a little bit. So
let's go just for the sake of time,
338
00:26:18.329 --> 00:26:23.009
let's do a little bit of
a go on kind of the ethical
339
00:26:23.049 --> 00:26:26.730
considerations. You already touched on it
a little bit. Trust, transparency.
340
00:26:27.329 --> 00:26:30.210
Companies need to be open, but
the, you know, their whole MO is,
341
00:26:30.730 --> 00:26:34.160
how can I use AI to get
people to buy more stuff? Right?
342
00:26:34.599 --> 00:26:38.559
And on the other side you have
the customers, who you've
343
00:26:38.599 --> 00:26:41.960
already kind of hinted at. This
is like, I'm comfortable giving you this
344
00:26:42.119 --> 00:26:48.549
information in this context because it's an
appropriate exchange of, you know, value
345
00:26:48.710 --> 00:26:53.069
my valuable information in exchange for
your valuable service or access to this document
346
00:26:53.190 --> 00:26:56.869
or whatever the case may be.
I'm comfortable there, but when I get
347
00:26:56.910 --> 00:27:02.059
that little further peek behind the curtain, I get very uncomfortable. I'm creeped
348
00:27:02.099 --> 00:27:04.940
out, you know, I'm turned
off, I'm kind of shocked. And
349
00:27:06.019 --> 00:27:10.779
so there's this expectation management function that
I think, is around opening
350
00:27:10.819 --> 00:27:15.299
up the black box and being more
transparent about what's going on and why.
351
00:27:15.460 --> 00:27:19.009
From the company perspective, at the
same time, their motivation is to sell
352
00:27:19.089 --> 00:27:22.569
more stuff using AI. And then
at a certain point, I'm getting a
353
00:27:22.569 --> 00:27:26.930
little bit farther ahead, here is, you know, you discussed it at
354
00:27:27.289 --> 00:27:32.599
a couple different points in the book,
this idea that we won't be able to
355
00:27:32.720 --> 00:27:37.079
recognize the machine or even the data
scientists don't totally understand what's going on inside
356
00:27:37.079 --> 00:27:41.799
the black box at times, and
so even that is a little bit beyond
357
00:27:41.000 --> 00:27:45.630
the company's control at some level.
And so just talk about that give and
358
00:27:45.710 --> 00:27:49.750
take and the creepiness factor. But
the you know, we like more targeted
359
00:27:49.789 --> 00:27:55.190
ads. Maybe just talk about that
give and take. Yeah. So, without
360
00:27:55.230 --> 00:27:59.660
getting too far down the kind of
the sci-fi side of this, let's
361
00:27:59.660 --> 00:28:04.339
just talk about the consumer experience. If you are in customer service or
362
00:28:04.660 --> 00:28:10.099
if you're a business owner, if
you're a marketer, I strongly, strongly
363
00:28:10.220 --> 00:28:18.490
encourage you to write this down.
It's about persuasion, not coercion or deception.
364
00:28:18.009 --> 00:28:25.450
So simple formula. Most consumers are
willing to be persuaded. We are
365
00:28:25.569 --> 00:28:30.680
all willing to be convinced. Convince
me, tell me that your product is
366
00:28:30.799 --> 00:28:37.960
better, but don't trick me into
it, don't deceive me and don't do
367
00:28:37.160 --> 00:28:42.670
anything that is coercive, and I
think that keeps you on the right side
368
00:28:42.829 --> 00:28:48.390
of the creepy line. I'll define
the creepy line a little bit more for
369
00:28:48.509 --> 00:28:53.190
you as you're thinking about
this. There was a terrific example of
370
00:28:53.750 --> 00:29:00.140
what I consider the invisible brand at
work. A number of years ago, Target
371
00:29:00.380 --> 00:29:03.779
had a lead scientist who was asked, can you tell us when a woman
372
00:29:03.859 --> 00:29:07.940
is pregnant? Because when
a woman is pregnant, she makes lots
373
00:29:07.980 --> 00:29:14.769
of decisions about brands and things that
she will purchase that last for decades.
374
00:29:15.049 --> 00:29:18.289
She will, you know,
raise her babies with the same kind of
375
00:29:18.769 --> 00:29:22.210
dishwashing detergent, the same
kind of toothpaste, the same kind of
376
00:29:22.849 --> 00:29:26.759
laundry detergent. And in the
process of making those decisions she makes a
377
00:29:26.839 --> 00:29:30.160
lot of new decisions. She tries
a lot of new products during the point
378
00:29:30.400 --> 00:29:36.279
where she's pregnant and then soon after
when she has young children. And so
379
00:29:36.599 --> 00:29:41.029
for marketers it's a very strategically important
part of a woman's life, so they want to figure
380
00:29:41.029 --> 00:29:42.630
out how do I know when a
woman is pregnant? Well, it turns
381
00:29:42.670 --> 00:29:47.549
out that Target has a lot of
data in their system and from your basket
382
00:29:47.549 --> 00:29:49.509
of goods, of what you're purchasing, they can actually do a pretty good
383
00:29:49.549 --> 00:29:53.819
job of determining when you're pregnant. Surprisingly so, and the story goes,
Surprisingly so, and the story goes,
384
00:29:55.019 --> 00:29:57.740
and this is now a famous kind
of Marketing 101 story. They sent
385
00:29:57.740 --> 00:30:02.859
out a mailer to women who were
pregnant and one of those mailers went to
386
00:30:02.940 --> 00:30:07.059
a young woman who was sixteen years
old and her father received it and was
387
00:30:07.250 --> 00:30:12.450
incensed that Target would have the nerve
to send his sixteen year old daughter information
388
00:30:12.529 --> 00:30:17.650
about pregnancy. She's too young,
of course. And you know, surprise,
389
00:30:18.089 --> 00:30:21.839
it turned out that she was actually
in fact pregnant and the algorithms at
390
00:30:21.960 --> 00:30:26.039
Target knew it before Dad did. And that's not the end of the story.
And that's not the end of the story.
391
00:30:26.160 --> 00:30:29.960
The end of the story is the
fact that target recognized that for that
392
00:30:30.200 --> 00:30:33.920
consumer they had crossed the creepy line. That consumer didn't need to know that
393
00:30:34.240 --> 00:30:37.869
his daughter was pregnant before he did, and the fact that a, you
394
00:30:37.950 --> 00:30:44.150
know, corporation could deduce from the
basket of goods that you were purchasing that
395
00:30:44.269 --> 00:30:48.150
your daughter was pregnant was something very
startling to that particular individual. And the
396
00:30:48.309 --> 00:30:55.220
learning that Target took
away is don't be so obvious. So
397
00:30:55.460 --> 00:30:59.660
the learning that they gathered from it
was, okay, we're going to put
398
00:30:59.700 --> 00:31:03.650
gas grills and golf clubs into the
mailer so that it's not all
399
00:31:03.769 --> 00:31:07.970
just baby stuff. And oh by
the way, if you happen to
400
00:31:07.089 --> 00:31:11.849
look past the gas grills and golf
clubs and there's some you know, some
401
00:31:11.970 --> 00:31:15.450
pregnancy stuff in there, that's nice, but please don't think that we
402
00:31:15.690 --> 00:31:19.039
sent you a pregnancy mailer or a
baby mailer. So there was this degree
403
00:31:19.079 --> 00:31:26.200
of obfuscation that they started crafting into
their marketing so that it wasn't so obvious
404
00:31:26.319 --> 00:31:30.759
to consumers when they were crossing the
creepy line. I think where that leaves
405
00:31:30.839 --> 00:31:36.630
us today is that, as consumers, we are suspicious that we are being
406
00:31:36.869 --> 00:31:41.470
stalked and preyed upon, but we
are also suspicious that the corporations are hiding
407
00:31:41.670 --> 00:31:45.630
and masking what they do. It's
something I call Googlenoia, which is
408
00:31:45.670 --> 00:31:51.779
kind of a combination of Google and
paranoia, and this creeping Googlenoia that
409
00:31:51.900 --> 00:31:56.460
we experience suggests to us somehow that, you know, all of these search
410
00:31:56.579 --> 00:32:00.569
engines and all of these recommendation engines
and the ads and everything is really ganging
411
00:32:00.690 --> 00:32:06.170
up on us and stalking us
and persuading us in ways that we don't
412
00:32:06.170 --> 00:32:09.170
understand. And in fact a lot
of that is actually happening. That is
413
00:32:09.289 --> 00:32:14.490
the case, and for consumers the
trick is you better not let me know
414
00:32:14.690 --> 00:32:20.319
it. And so for marketers, the simple rule is persuasion, not
415
00:32:20.519 --> 00:32:23.400
coercion or deception, and I think
that if you stay on that side,
416
00:32:23.440 --> 00:32:29.599
if you know and recognize the persuasion
is fair, consumers are willing to be
417
00:32:29.759 --> 00:32:37.309
persuaded and that you offer a transparent
value exchange, your data in exchange for these
418
00:32:37.470 --> 00:32:40.670
benefits, then you, as a
company, will be more often than not
419
00:32:40.869 --> 00:32:45.380
on the right side of the creepy
line. I like it. It's,
420
00:32:45.579 --> 00:32:50.019
you know, golden rule obviously applies
here. Or could you justify this decision
421
00:32:50.019 --> 00:32:52.500
or this behavior to someone you love
and respect, like a family member or
422
00:32:52.619 --> 00:32:57.299
close friend? You know, it's
easy to sit around and make decisions in
423
00:32:57.420 --> 00:33:00.970
the company's interests inside a closed setting, but to be able to justify it
424
00:33:00.130 --> 00:33:04.329
externally and explain it to people.
If you're willing to do that, then
425
00:33:04.329 --> 00:33:07.089
you're probably on the safe side of
it. I also really loved that
426
00:33:07.250 --> 00:33:12.289
the Target pregnancy story
kind of broke into the mainstream, as
427
00:33:12.369 --> 00:33:17.480
did that Google Duplex demo, the AI
assistant scheduling a haircut appointment, and the interesting
428
00:33:17.559 --> 00:33:20.799
resolve of it, and you did a drive-by
on that in the book. The
429
00:33:20.920 --> 00:33:25.000
interesting resolve there is that Google ended
up, based on customer feedback, or,
430
00:33:25.160 --> 00:33:30.349
you know, consumer feedback, with this
idea that the machine will present itself
431
00:33:30.390 --> 00:33:34.869
as a machine acting on behalf of
a human. Again, like this transparent
432
00:33:34.910 --> 00:33:38.349
step, so as not to act as
if it's a human
433
00:33:38.390 --> 00:33:40.980
when it's not a human. It's
funny. I had a conversation about live
434
00:33:42.059 --> 00:33:45.819
chat and chat bots here on the
show, and what the gentleman observed was
435
00:33:45.980 --> 00:33:52.299
that so often people will ask
directly into the chat exchange, are you
436
00:33:52.420 --> 00:33:55.210
a person? You know, like
where are you? You know? Are
437
00:33:55.569 --> 00:33:59.450
you? Because people want to know
that too. So it's interesting. Even
438
00:33:59.490 --> 00:34:04.809
in the interactions we want to have
that level of transparency. What are the
439
00:34:04.970 --> 00:34:07.650
marketing jobs of the future, and
I ask this because, you know, you
440
00:34:07.769 --> 00:34:13.719
talked about, you already mentioned, news stories
that write themselves, email copy that is
441
00:34:13.800 --> 00:34:19.239
going to optimize itself, ads that
create, run and optimize themselves. Where does
442
00:34:19.320 --> 00:34:22.559
this leave the marketer when we go
more strategic and less tactical, because a
443
00:34:22.599 --> 00:34:27.190
lot of the tactics are knocked down
by the machines, which is sensible,
444
00:34:27.230 --> 00:34:30.389
again, putting humans in position to
do their best work in partnership. Where does
445
00:34:30.429 --> 00:34:36.389
that leave the human marketer in x
number of years? Yeah, so the
446
00:34:36.469 --> 00:34:43.579
definition I typically apply to marketing is
anticipating consumer demand and then finding products that
447
00:34:43.739 --> 00:34:47.739
can deliver on that demand profitably,
and I think when you start to unpack
448
00:34:47.940 --> 00:34:52.449
that, that's a pretty sound definition
of marketing. That isn't going to change.
449
00:34:52.570 --> 00:34:57.849
What's changing are the tools, as
you said, the tactics. I'll
450
00:34:57.889 --> 00:35:04.610
give you one very clear example
that's happening today, and I would strongly encourage
451
00:35:04.769 --> 00:35:09.280
marketers to take this very seriously.
You might remember twenty years ago, you
452
00:35:09.360 --> 00:35:14.760
know, the infancy, the dawn
of kind of search and search engine optimization.
453
00:35:15.039 --> 00:35:19.519
Businesses really faced a challenge
and that was, do I divert
454
00:35:19.679 --> 00:35:24.469
resources into search engine optimization? Is
this important? Is this a fad or
455
00:35:24.550 --> 00:35:28.510
is this something that's going to affect
my business for the next, you know,
456
00:35:28.590 --> 00:35:31.070
multiple decades? Then, as we
know, you know that the end
457
00:35:31.070 --> 00:35:37.059
of that story is that search engine
optimization has been critical and those businesses that
458
00:35:37.260 --> 00:35:40.940
chose to invest succeeded and those that
chose to ignore it did so at their
459
00:35:40.980 --> 00:35:46.539
own peril. I think we are
at a similar inflection point with voice based
460
00:35:46.619 --> 00:35:51.849
interfaces. We are now talking to
our cars, we're talking to our televisions,
461
00:35:51.889 --> 00:35:54.329
we're talking to our cell phones,
we're talking to, you know, Alexa
462
00:35:54.409 --> 00:36:00.010
and Siri, and for businesses today
they're facing a similar challenge to the one they faced
463
00:36:00.050 --> 00:36:05.719
twenty years ago. Is this something
worth investing in? Do I make my
464
00:36:05.960 --> 00:36:10.440
business visible, or audible, if
you will, through voice based interfaces?
465
00:36:10.679 --> 00:36:14.199
And so I would ask you,
if you're a marketer and you're
466
00:36:14.239 --> 00:36:17.789
listening to this, ask this simple
question: can my products and services be
467
00:36:19.110 --> 00:36:24.630
purchased today through voice? Go over
to Alexa and try to find your product
468
00:36:24.989 --> 00:36:29.989
through Alexa. See if Siri can
locate your business, see if Siri can
469
00:36:30.429 --> 00:36:35.659
identify the products that you have available
for sale and, more importantly, can
470
00:36:35.699 --> 00:36:40.099
you actually make the purchase using voice, because there are a lot of products
471
00:36:40.179 --> 00:36:45.380
that are already being sold using voice. I can sit down in my living
472
00:36:45.420 --> 00:36:47.489
room and I can say, you
know, recommend a scary movie, and my
473
00:36:47.650 --> 00:36:52.690
TV will pull up scary movies and
I can select one, all using voice.
474
00:36:52.090 --> 00:36:57.010
That's changing things. You're driving in
your car and you ask for
475
00:36:57.090 --> 00:37:01.760
directions to the nearest Thai restaurant.
That's using voice to purchase things. Increasingly,
476
00:37:02.199 --> 00:37:07.079
we are spending money with our mouths. It's easier to spend money with
477
00:37:07.199 --> 00:37:10.639
our mouths than our fingers at this
point, and for businesses who are really
478
00:37:10.760 --> 00:37:15.150
thinking hard about where things are going, you've got to recognize that this is
479
00:37:15.230 --> 00:37:20.989
an opportunity for you to either capture
market share or get left behind. And
480
00:37:21.429 --> 00:37:27.550
why do I bring up voice?
Voice is the consumer facing edge of artificial
481
00:37:27.630 --> 00:37:31.659
intelligence at some of our world's largest
and most valuable companies. Think about the
482
00:37:31.780 --> 00:37:37.500
companies that you think of as the
world's most valuable brands. Probably on that
483
00:37:37.659 --> 00:37:42.900
list, you're going to say,
it's Apple, it's Microsoft, it's certainly
484
00:37:42.940 --> 00:37:47.489
Amazon. Well, when you think
about those and Google, you're naming companies
485
00:37:47.570 --> 00:37:54.809
that have voice based ecosystems. The
most valuable brands in the world have Siri
486
00:37:55.170 --> 00:38:01.880
and Alexa and Google Assistant and Cortana. These are voice-based systems that human
487
00:38:01.960 --> 00:38:07.119
beings are interacting with, with increasing ease. You know what I think about?
488
00:38:07.119 --> 00:38:10.079
A story that I tell in the
book? My neighbor's four year old son
489
00:38:10.320 --> 00:38:15.190
was able to get Alexa to play
music for him and I was thinking,
490
00:38:15.550 --> 00:38:19.469
you know, this is really truly
amazing, because a four year old hasn't
491
00:38:19.469 --> 00:38:23.150
necessarily learned how to read, they
can't navigate a graphical user interface,
492
00:38:23.269 --> 00:38:31.059
but here he is easily using a
voice based interface to navigate a complex environment,
493
00:38:31.659 --> 00:38:35.579
in this case Alexa, and I
thought, you know, that's really
494
00:38:35.659 --> 00:38:39.219
impressive. Voice is something more innate, it's something deeper in us than reading.
495
00:38:39.300 --> 00:38:42.969
We learn to speak when we're a
year, year and a half,
496
00:38:43.449 --> 00:38:45.889
whereas we don't learn to read until
we're maybe four, five, six years
497
00:38:45.889 --> 00:38:52.090
old. And so that interface,
that voice interface, is something that marketers
498
00:38:52.409 --> 00:38:55.559
today have to start grappling with.
What does it mean to your business to
499
00:38:55.639 --> 00:39:01.719
have consumers shifting their attention to an
interface which is all voice based? So
500
00:39:01.840 --> 00:39:05.840
these are some of the challenges that
marketers face. To bring it back to
501
00:39:05.880 --> 00:39:10.190
your question again, the strategic role
of marketing hasn't changed, but the tools
502
00:39:10.230 --> 00:39:16.030
and tactics that are available are changing
extremely rapidly and marketers need to be on
503
00:39:16.150 --> 00:39:21.630
top of those changes and learn to
adapt. Excellent. I have several
504
00:39:21.710 --> 00:39:23.940
more questions that I will not be
asking you. Again, I found The
505
00:39:23.980 --> 00:39:30.260
Invisible Brand to be fun, easy
and very interesting and informative, and again
506
00:39:30.300 --> 00:39:34.099
I recommend it highly to anyone
that's made it this far into the interview.
507
00:39:34.099 --> 00:39:37.420
You know there's so much more in
there that we can't pack into this
508
00:39:37.579 --> 00:39:43.250
conversation. So we'll end here where
I always end, which is on our
509
00:39:43.449 --> 00:39:46.889
number one core value at BombBomb here
on the show, which is human relationships.
510
00:39:47.650 --> 00:39:51.489
So I would love to give you
the chance to thank or mention someone who's
511
00:39:51.489 --> 00:39:54.840
had a positive impact on your life
or your career. And counter to where
512
00:39:54.840 --> 00:39:59.440
we started with that retail story,
you could maybe give a mention to a
513
00:39:59.559 --> 00:40:02.320
company that you really appreciate or respect
for the type of experience they're delivering for
514
00:40:02.480 --> 00:40:07.630
you as a customer. Well,
I would be a fool not to thank
515
00:40:07.789 --> 00:40:10.110
the folks that helped me write the
book. You know, one of the
516
00:40:10.150 --> 00:40:15.110
things about writing a book is that
it is not a solitary experience. When
517
00:40:15.110 --> 00:40:17.550
I got started, I thought this
is just me and a typewriter and I
518
00:40:19.070 --> 00:40:22.380
thought I had to like lock myself
in a room and not, you know,
519
00:40:22.500 --> 00:40:25.619
peek out. But in the acknowledgments
in the book I listed a
520
00:40:25.659 --> 00:40:30.099
whole Bunch of folks, but very
specifically I got bogged down in the writing
521
00:40:30.139 --> 00:40:35.929
process. I recognize that I don't
have the temperament or personality to be somebody who
522
00:40:35.929 --> 00:40:38.449
can just, you know, knock
out eight hours of writing every day until
523
00:40:38.449 --> 00:40:42.889
a book is finished. Instead, I,
you know, find moments, flashes
524
00:40:42.929 --> 00:40:45.409
of inspiration where I'll write, you know, three, four, five pages and then it
525
00:40:45.409 --> 00:40:50.039
might be a week or two before
I get involved with it again. And
526
00:40:50.840 --> 00:40:53.519
I ended up working with a researcher, a fellow named Darren, who helped me
527
00:40:54.039 --> 00:40:59.199
at a level that, you know, really unlocked my ability to get the
528
00:40:59.320 --> 00:41:02.719
book done because, instead of dreading
digesting a, you know, a thirty
529
00:41:02.760 --> 00:41:06.269
page study, I could hand it
to him and say, you know what,
530
00:41:06.510 --> 00:41:08.590
I think this is important, I
think this is something that we should
531
00:41:08.590 --> 00:41:12.389
include, but I don't want to, you know, spend a whole chapter
532
00:41:12.510 --> 00:41:15.230
on it. Maybe if you could
give me two paragraphs and he would do
533
00:41:15.349 --> 00:41:19.739
the drudgery of digesting that thing and
feeding me back two paragraphs that
534
00:41:19.780 --> 00:41:23.300
I could easily staple in or,
you know, kind of mortar into the
535
00:41:23.420 --> 00:41:29.099
bricks of this larger structure. And
at first I thought this is cheating,
536
00:41:29.219 --> 00:41:31.099
this is too easy, but then
I realized it isn't. You know,
537
00:41:31.219 --> 00:41:36.250
you've got to figure out what your
own faults are, what your own weaknesses
538
00:41:36.369 --> 00:41:38.650
are, and you've got to be
willing to reach out and get help and
539
00:41:38.889 --> 00:41:43.250
so I owe a debt of gratitude
to everybody who helped me with the book.
540
00:41:43.610 --> 00:41:46.159
To the second part of your question, who's doing it well? My default
541
00:41:46.199 --> 00:41:52.159
answer here is Amazon, and I
say that admiringly but also with a degree
542
00:41:52.199 --> 00:41:58.199
of caution. And the admiring is
that Amazon has figured out how to connect
543
00:41:58.519 --> 00:42:01.230
the top of the funnel and the
bottom of the funnel in a unique way
544
00:42:01.309 --> 00:42:07.070
that Facebook and Google can't, and
that is that they plant the seeds at
545
00:42:07.110 --> 00:42:08.590
the top of the funnel. They, you know, have that ability to
546
00:42:08.750 --> 00:42:14.630
plant the ideas and to seed the
market with people who bought this also bought
547
00:42:14.670 --> 00:42:17.539
this. You know you might like
this, and in fact they've made tremendous
548
00:42:17.659 --> 00:42:22.980
inroads in digital marketing in terms
of the dollars that they're bringing in,
549
00:42:23.539 --> 00:42:29.179
which is very impressive. But they
also have something that Google and Facebook don't,
550
00:42:29.420 --> 00:42:31.530
which is the cash register. Of
course, Facebook and Google have to
551
00:42:31.690 --> 00:42:36.969
rely on third party data, but
here we have a company, Amazon,
552
00:42:37.329 --> 00:42:39.570
that is the cash register. You
can actually make the purchase right there,
553
00:42:39.809 --> 00:42:45.360
and so what Amazon is doing is
they're leveraging artificial intelligence to connect, to
554
00:42:45.840 --> 00:42:52.960
create attribution between what you bought
and what you experience through the customer experience,
555
00:42:53.559 --> 00:43:00.550
by understanding algorithmically the steps you took
along your customer experience. Through their
556
00:43:00.670 --> 00:43:07.710
technology, they're getting better at changing
and molding your purchase behavior, and
557
00:43:07.869 --> 00:43:15.059
that's what's making them so successful is
their application of artificial intelligence to the problem
558
00:43:15.539 --> 00:43:21.179
of attribution. And when I say
that, the attribution problem, by definition,
559
00:43:21.260 --> 00:43:24.500
it's figuring out to what do I
attribute this purchase? You know, you
560
00:43:24.619 --> 00:43:28.530
bought a radio ad, you bought
a TV ad, you bought a newspaper
561
00:43:28.610 --> 00:43:31.889
ad and I have no idea which
one of those caused you to walk in
562
00:43:32.090 --> 00:43:38.530
today and buy dog food. But
Amazon can watch that entire experience through their
563
00:43:38.690 --> 00:43:45.639
portal, and I'm talking about through
their movies and through their music and through
564
00:43:45.039 --> 00:43:49.599
all of the things that you're doing
when you interact with the range of you
565
00:43:49.679 --> 00:43:53.000
know Amazon Prime and all of the
you know all the products that they're selling.
566
00:43:53.159 --> 00:43:58.869
You know, they are a marketplace
that includes a lot of things that
567
00:43:59.030 --> 00:44:04.510
you wouldn't traditionally think of as customer
experiences, and they're able to weave all
568
00:44:04.630 --> 00:44:10.019
that together algorithmically to make observations about
your behaviors and what you will buy next,
569
00:44:10.699 --> 00:44:17.579
that companies like Facebook and Google can't
see. So they've really applied artificial intelligence
570
00:44:17.659 --> 00:44:22.420
to better understanding the customer experience and
I would have to cite them as kind
571
00:44:22.420 --> 00:44:27.329
of my you know, kind of
top company to watch in the space.
572
00:44:27.929 --> 00:44:30.409
Great call and a great breakdown there. I mean just folding, and I
573
00:44:30.449 --> 00:44:35.369
didn't think about it this way,
but folding in what music I'm listening to
574
00:44:35.530 --> 00:44:38.599
and how often, what movies and
TV shows I'm watching through prime, what
575
00:44:38.719 --> 00:44:43.000
I'm buying at a Whole Foods Market
through the APP, you know, all
576
00:44:43.039 --> 00:44:46.320
the just a personal profile they can
build on me is really, really interesting.
577
00:44:46.360 --> 00:44:50.960
Besides obviously going to order this,
that or the other thing, you
578
00:44:51.039 --> 00:44:54.110
know, books and whatever else,
through the website directly. There's so much
579
00:44:54.190 --> 00:44:58.349
of a profile they can build.
I'm going to add one shout out.
580
00:44:58.389 --> 00:45:02.269
The gentleman who brought the two of
us together today is Douglas Burdette, Marketing
581
00:45:02.309 --> 00:45:07.619
Artillery and The Marketing Book Podcast. We were both guests on his show as
We were both guests on his show as
582
00:45:07.699 --> 00:45:10.500
authors of books that are relevant to
marketers and marketing. So shout out to
583
00:45:10.539 --> 00:45:14.900
Douglas. Thanks for bringing us together. Thank you so much for your time
584
00:45:15.019 --> 00:45:19.019
here on the show today. I
enjoyed it very much and continued success to
585
00:45:19.059 --> 00:45:21.969
you. If folks want to follow up
on this. They want to obviously order
586
00:45:22.050 --> 00:45:23.809
the book or connect with you.
What are some ways that people can take
587
00:45:23.889 --> 00:45:30.809
this conversation a step farther? The simplest
thing to do is say Alexa, order
588
00:45:30.090 --> 00:45:36.800
The Invisible Brand by William Ammerman and
it will arrive at their doorstep tomorrow morning.
589
00:45:37.159 --> 00:45:39.440
But if they're not willing to do
that, they can simply go to
590
00:45:39.639 --> 00:45:45.119
my website, which is W for
William Ammerman, A-M-M-E-R-M-A-N
591
00:45:45.320 --> 00:45:50.510
dot com, and there they can find
out more about me and about the book.
592
00:45:51.230 --> 00:45:52.989
Excellent. Thank you again so much
for your time. Well done on
593
00:45:53.190 --> 00:45:57.750
the book and I just really appreciate
what you shared here. Thank you so
594
00:45:57.869 --> 00:46:02.309
much, Ethan. Great work.
If you found this conversation interesting and valuable,
595
00:46:02.550 --> 00:46:07.579
I know that you'll enjoy his book
The Invisible Brand. And if you're
596
00:46:07.619 --> 00:46:12.260
thinking about how to put your humans
in the best position to win inside your
597
00:46:12.380 --> 00:46:17.179
company, be sure to check out
Rehumanize Your Business: How Personal Videos Accelerate
598
00:46:17.219 --> 00:46:23.489
Sales and Improve Customer Experience. You
can learn more about it by visiting BombBomb.com
599
00:46:23.969 --> 00:46:31.489
forward slash book. That's BombBomb.com/book, or you can search for Rehumanize Your
600
00:46:31.530 --> 00:46:36.480
Business at Amazon. My name is
Ethan Beute and thank you for listening to
601
00:46:36.599 --> 00:46:43.800
the B2B Growth show. I hate
it when podcasts incessantly ask their listeners for
602
00:46:43.960 --> 00:46:46.869
reviews, but I get why they
do it, because reviews are enormously helpful
603
00:46:46.869 --> 00:46:50.989
when you're trying to grow a podcast
audience. So here's what we decided to
604
00:46:51.030 --> 00:46:53.630
do. If you leave a review
for B2B Growth on Apple Podcasts and
605
00:46:53.829 --> 00:46:59.230
email me a screenshot of the review
to James at SweetFishMedia.com, I'll
606
00:46:59.269 --> 00:47:01.780
send you a signed copy of my
new book, Content-Based Networking, how
607
00:47:01.820 --> 00:47:06.019
to instantly connect with anyone you want
to know. We get a review,
608
00:47:06.059 --> 00:47:07.219
you get a free book. We
both win.