Transcript
WEBVTT
1
00:00:00.080 --> 00:00:04.559
Are you struggling to come up with
original content week in and week out? Start
2
00:00:04.559 --> 00:00:09.189
a podcast, interview your ideal clients, let them talk about what they care
3
00:00:09.230 --> 00:00:14.589
about most and never run out of
content ideas again. Learn more at
4
00:00:14.589 --> 00:00:23.699
sweetfishmedia.com. You're listening to B2B
Growth, a daily podcast for B2B
5
00:00:23.820 --> 00:00:28.059
leaders. We've interviewed names you've
probably heard before, like Gary Vaynerchuk
6
00:00:28.140 --> 00:00:32.020
and Simon Sinek, but you've probably
never heard from the majority of our guests.
7
00:00:32.659 --> 00:00:36.649
That's because the bulk of our interviews
aren't with professional speakers and authors.
8
00:00:37.210 --> 00:00:41.049
Most of our guests are in the
trenches leading sales and marketing teams. They're
9
00:00:41.090 --> 00:00:46.570
implementing strategy, they're experimenting with tactics, they're building the fastest growing B2B companies
10
00:00:46.609 --> 00:00:49.770
in the world. My name is
James Carbary. I'm the founder of Sweet
11
00:00:49.770 --> 00:00:53.399
Fish Media, a podcast agency for
B2B brands, and I'm also one of
12
00:00:53.439 --> 00:00:57.439
the co-hosts of the show.
When we're not interviewing sales and marketing leaders,
13
00:00:57.600 --> 00:01:00.159
you'll hear stories from behind the scenes
of our own business. We'll share
14
00:01:00.240 --> 00:01:03.549
the ups and downs of our journey
as we attempt to take over the world.
15
00:01:04.189 --> 00:01:15.750
Just kidding. Well, maybe. Let's get into
the show. Welcome to the AI
16
00:01:15.950 --> 00:01:19.420
segment of the B2B Growth show. I'm your host, Samantha Stone, and
17
00:01:19.540 --> 00:01:22.900
today I have my friend
Steph Miserra with me, whose last
18
00:01:22.939 --> 00:01:26.379
name, I think, I terribly mispronounced,
and she can correct me in a minute.
19
00:01:26.540 --> 00:01:30.099
So I apologize in advance. She'll say it correctly
for you. As you know,
20
00:01:30.340 --> 00:01:34.250
my loyal listeners, I don't often
pronounce names correctly. It's a little bit
21
00:01:34.290 --> 00:01:38.010
of a disability that I have.
I don't think that disability has a name,
22
00:01:38.049 --> 00:01:42.810
but I wish it did. But
nonetheless, I'm super enthusiastic for her to
23
00:01:42.849 --> 00:01:47.129
be here today. She's the founder
of the Core Connect conference and a professor
24
00:01:47.200 --> 00:01:52.200
at NYU, and the reason we're
chatting today is we had the pleasure of
25
00:01:52.319 --> 00:01:57.480
co-presenting at an event that Mark
Schaefer hosted called The Uprising a few weeks
26
00:01:57.480 --> 00:02:00.640
ago and we totally hit it off. And one of the reasons that we
27
00:02:00.680 --> 00:02:05.750
hit it off is because we both
understand how important artificial intelligence is to marketing
28
00:02:05.790 --> 00:02:09.270
and how much it's impacting us
and how important it is for us to understand
29
00:02:09.430 --> 00:02:15.110
how it can help with human interactions, and so I'm thrilled that she's joining
30
00:02:15.150 --> 00:02:19.900
us today to help share a little
bit of our private conversations with you on
31
00:02:19.979 --> 00:02:23.780
the podcast today. Welcome. Thank
you so much, Samantha. It's just
32
00:02:23.900 --> 00:02:27.060
so exciting. We've been talking about
doing this, and now we're making it happen.
33
00:02:27.659 --> 00:02:31.610
And the last name is Miserra.
Is that right? Thank you.
34
00:02:32.729 --> 00:02:38.530
I beg your pardon. I almost had it. I
was close. I was close.
35
00:02:38.689 --> 00:02:43.689
So, thank you so much for
being here today. I'm really excited.
36
00:02:44.009 --> 00:02:46.680
You have exciting news because you recently
launched your own podcast. Is that not
37
00:02:46.879 --> 00:02:53.159
correct? Yes, it's very,
very exciting. I started my podcast
38
00:02:53.159 --> 00:02:59.030
almost two months ago. It's called
Clarity Connects, and it's all about the human
39
00:02:59.069 --> 00:03:04.349
essence in brands, where I talk
about branding, marketing and business and the
40
00:03:04.430 --> 00:03:08.110
human aspect as it flows through all
those different areas. So I actually did
41
00:03:08.229 --> 00:03:13.460
one episode this morning, so I
guess today is podcast day.
42
00:03:13.780 --> 00:03:16.780
So I'm totally enjoying this, this
new journey that I'm on now. Well,
43
00:03:16.860 --> 00:03:21.740
I thank you for squeezing us in
and I'm really excited. I've had
44
00:03:21.780 --> 00:03:23.580
the chance to listen to
one or two of the episodes and it's
45
00:03:23.659 --> 00:03:28.370
great and I'm really glad that you're
talking about these important issues and I actually
46
00:03:28.409 --> 00:03:32.409
think that's a perfect sort of segue
into our conversation today, because when we
47
00:03:32.610 --> 00:03:38.050
talk about artificial intelligence sometimes we get
lost in the tech component of what that
48
00:03:38.250 --> 00:03:44.879
means, but in our conversations that
we've had, we both recognize that there's
49
00:03:44.879 --> 00:03:49.719
a very human element to how we
use this technology and when we use this
50
00:03:49.879 --> 00:03:55.990
technology. That's instrumental to realizing its
potential and also for heading off
51
00:03:57.150 --> 00:04:02.069
some of the concerns that we might
have about abusing the technology. In the vein
52
00:04:02.150 --> 00:04:08.990
of focusing on the positive and
helping our audience understand how to apply this
53
00:04:09.110 --> 00:04:12.860
and where to apply it, I'd
love to just start by you talking about
54
00:04:12.939 --> 00:04:17.540
the role you think artificial intelligence has
in interfacing with people. So I think
55
00:04:17.779 --> 00:04:21.019
you do bring up a very
good point. I think right now,
56
00:04:21.379 --> 00:04:27.009
with artificial intelligence, it's still in
its infancy, but we can already see
57
00:04:27.170 --> 00:04:30.970
the tremendous power that it has and
with that power, I think there's a
58
00:04:30.009 --> 00:04:34.810
lot of fear that comes with it,
because we don't know how it's actually going
59
00:04:34.889 --> 00:04:40.879
to evolve, and I think it's
a very important time right now to actually
60
00:04:41.680 --> 00:04:45.279
look at what we want to do
with the technology. I mean, we
61
00:04:45.480 --> 00:04:49.480
are the owners of the technology and
I think that it is a tremendous tool,
62
00:04:50.120 --> 00:04:55.230
but we also have to make sure
that this tool is used as
63
00:04:55.269 --> 00:04:58.110
a tool. Right? It feels
like magic, but at the end of
64
00:04:58.149 --> 00:05:01.430
the day it's a tool. We
have to make sure that we're using artificial
65
00:05:01.470 --> 00:05:10.019
intelligence that is helping us in terms
of our humanness and also that it's helping
66
00:05:10.060 --> 00:05:15.339
us flourish, that it's actually complementing
the human experience and not taking over the
67
00:05:15.420 --> 00:05:20.339
human experience. And I think that
this is where we need to really take
68
00:05:20.490 --> 00:05:25.329
the time to make the decisions that
we need to make at this point,
69
00:05:25.649 --> 00:05:29.050
because it's still in
the infancy stage, it's very
70
00:05:29.410 --> 00:05:33.850
important to have these types of conversations,
and it's going to be harder later. It's interesting when
71
00:05:33.930 --> 00:05:36.120
people say well, you know,
yeah, we'll address it. You know,
72
00:05:36.199 --> 00:05:40.600
when we get there. No, no, we're there.
Like, we are literally
73
00:05:40.639 --> 00:05:45.000
there and we need to have these
conversations before it gets much more complicated.
74
00:05:45.079 --> 00:05:49.189
And I think also what's really important
is to always ask ourselves the question,
75
00:05:49.870 --> 00:05:55.949
just because we can use AI,
does it mean that we should? That's
76
00:05:55.990 --> 00:06:00.589
a choice. It doesn't mean that
just because we have any type of technology
77
00:06:00.110 --> 00:06:04.740
that we should use that technology.
And there's an example that I always give,
78
00:06:04.779 --> 00:06:09.459
which is a very simple example,
which is pop-ups. Right,
79
00:06:09.540 --> 00:06:15.540
pop-ups are a very simple piece of
technology that we use on platforms,
80
00:06:15.540 --> 00:06:17.889
that we use on landing pages,
but it's a choice whether you want
81
00:06:17.930 --> 00:06:20.689
to use it or not just because
it's there. A lot of people, what
82
00:06:20.810 --> 00:06:25.370
they do with their brand or their
personal pages, they say, oh,
83
00:06:25.410 --> 00:06:27.730
you know, I can put a
pop-up, but you have to
84
00:06:27.970 --> 00:06:32.279
stop and think, is it actually
bringing home the desired outcome that I want
85
00:06:32.360 --> 00:06:39.240
that technology to bring? And I
mean, I'm very annoyed by pop-ups because
86
00:06:39.279 --> 00:06:44.759
they're actually interfering with what I'm trying
to get at, and so I'm sure
87
00:06:44.879 --> 00:06:47.949
that's not what the person that created
the website wanted me to feel, but
88
00:06:48.029 --> 00:06:50.990
that's how I'm feeling, and, you know, pretty
much everybody
89
00:06:51.029 --> 00:06:56.110
feels that way. So I think
that's the same type of thought that we
90
00:06:56.310 --> 00:07:00.870
need to think about when we talk
about artificial intelligence: is this
91
00:07:00.149 --> 00:07:05.019
really going to be in service of
what I'm trying to reach as my objective?
92
00:07:05.779 --> 00:07:10.819
I'm so glad that you brought that
up, because I think we sometimes
93
00:07:10.899 --> 00:07:15.139
chase shiny new objects. We get
very, very excited about them and we
94
00:07:15.329 --> 00:07:19.649
don't always take the care we need
to understand where is it going to add
95
00:07:19.769 --> 00:07:26.490
value to my interaction with a customer
and where is it going to actually take
96
00:07:26.569 --> 00:07:30.639
away from the interaction with
my customer? And one of the things
97
00:07:30.680 --> 00:07:35.519
that we need to understand is that
sometimes artificial intelligence is used directly in the
98
00:07:35.560 --> 00:07:41.040
interaction, like a chatbot,
for example, but sometimes it's actually used
99
00:07:41.199 --> 00:07:46.430
behind the scenes, in analytics or serving
up segmentation models and helping us, where
100
00:07:46.470 --> 00:07:49.589
it's sort of hidden from the person
we're interacting with. They don't
101
00:07:49.629 --> 00:07:55.629
see that. When you think about
marketers making their plans for 2020,
102
00:07:55.629 --> 00:08:01.019
so very much right
around the corner, short term, where are
103
00:08:01.100 --> 00:08:07.420
some of the places that you think
marketers should begin their journey of experimenting with
104
00:08:07.620 --> 00:08:11.779
artificial intelligence? So, in the
same vein, because I'm the human girl
105
00:08:11.819 --> 00:08:16.290
right, I always try to make
sure that whatever we are creating, that
106
00:08:16.449 --> 00:08:20.250
it needs to resonate on a human
level, because at the end of the
107
00:08:20.290 --> 00:08:24.089
day, thankfully, we are not
robots. We are all human beings.
108
00:08:24.170 --> 00:08:28.639
We are individual human beings. We
work with other human beings and we are
109
00:08:28.800 --> 00:08:31.639
trying to sell products and services to
other human beings. So therefore we have
110
00:08:31.720 --> 00:08:37.919
to connect on an emotional level and
that is an implication that also comes with
111
00:08:37.200 --> 00:08:43.190
AI. I think there are different
ways to look at how you can include
112
00:08:43.190 --> 00:08:48.870
AI in marketing. One way
that we can do that is
113
00:08:48.629 --> 00:08:56.779
actually to better inform, from the data,
how you do creative. So it's really
114
00:08:56.779 --> 00:08:58.139
a little bit like what you were
saying in terms of the back end,
115
00:08:58.700 --> 00:09:03.940
because you know, twenty years ago
we didn't have that much data. Now
116
00:09:05.059 --> 00:09:07.860
we have ridiculous amounts of data and
now we have the opposite problem, right.
117
00:09:09.179 --> 00:09:11.169
So now we have so much data
and we don't really know what to
118
00:09:11.289 --> 00:09:16.929
do with it and we don't really
know how to actually sift through all of
119
00:09:16.049 --> 00:09:20.889
these data points so that we can get
certain insights that will help
120
00:09:22.049 --> 00:09:26.879
us directionalize our decision-making, and I
think this is an area where AI can
121
00:09:28.000 --> 00:09:33.200
be of great use, where you
can literally, and again it's not just
122
00:09:33.360 --> 00:09:35.519
to say, okay, we're just
going to use AI and we're going to
123
00:09:35.559 --> 00:09:39.190
see what comes out, what the
AI is going to tell us in terms
124
00:09:39.230 --> 00:09:43.950
of the output. I think it's very
important that, at the beginning of the
125
00:09:45.149 --> 00:09:54.179
AI program or implementation process,
the individuals are really clear on what
126
00:09:54.379 --> 00:09:58.580
the objective is that we're trying to
reach, and bring forth those different guardrails
127
00:09:58.580 --> 00:10:03.899
and build that into the code
so that when the AI is doing its job,
128
00:10:05.059 --> 00:10:07.610
it knows what to do.
And I think the other thing that's
129
00:10:07.610 --> 00:10:11.289
really important on the front end, before
letting AI do its job in terms of
130
00:10:11.610 --> 00:10:16.330
going through the data and honing in on
what's important and focusing on those areas,
131
00:10:16.850 --> 00:10:22.919
is to make sure that you are
addressing unconscious biases, because that can
132
00:10:22.000 --> 00:10:26.919
clearly skew the output, and it's
not going to be what you're going to
133
00:10:28.039 --> 00:10:31.399
be really looking for, that's not
really going to be useful data. So
134
00:10:31.799 --> 00:10:35.080
it's really about being cognizant of those
two things: number one, to really
135
00:10:35.120 --> 00:10:37.950
know the objective of what it is, and then number two is to make
136
00:10:39.070 --> 00:10:41.909
sure that you're addressing unconscious biases.
And what I mean by that, I
137
00:10:41.990 --> 00:10:46.429
guess, in a more concrete way, is to ask: what is
138
00:10:46.509 --> 00:10:52.580
the content that you're feeding the
AI machine, and also who is feeding this
139
00:10:52.659 --> 00:10:54.740
AI machine? There are human beings who
are going to decide this. So you
140
00:10:54.820 --> 00:11:01.460
need to make sure that that person
has an understanding of the full breadth of
141
00:11:01.700 --> 00:11:09.289
the types of content and data that
we're trying to compile and analyze and so
142
00:11:09.450 --> 00:11:11.889
forth. And then, from the other
end of it, you put all
143
00:11:11.929 --> 00:11:15.850
that stuff into the AI. It
does its magic. It
144
00:11:16.090 --> 00:11:20.039
gives you output, and at that point,
again, it's just the output. I
145
00:11:20.120 --> 00:11:24.240
don't think you should be taking it
just as okay, this is what we're
146
00:11:24.279 --> 00:11:28.840
going to go with. You have
to again bring the human lens and look
147
00:11:30.000 --> 00:11:33.590
at the output and see whether
or not this is actually it. Did
148
00:11:33.669 --> 00:11:39.070
we miss something? Is the
data actually giving us information that now we
149
00:11:39.230 --> 00:11:45.590
need to go back and fine-tune
the algorithm so it can give us
150
00:11:45.750 --> 00:11:48.100
more precise information? So you have
to have the human judgment at the end
151
00:11:48.100 --> 00:11:52.340
of the day as well, and
address any biases that might have fallen through
152
00:11:52.379 --> 00:11:56.740
the cracks in order to make a
final decision. So I think it's the
153
00:11:56.820 --> 00:12:01.100
human at the beginning, the AI
does its job and the human at the
154
00:12:01.139 --> 00:12:03.690
end. And then once you have
that output and it's been vetted,
155
00:12:05.169 --> 00:12:09.169
then you can use that content to
actually really inform you in terms of your
156
00:12:09.570 --> 00:12:13.289
creativity and you have much more specific
data, much more relevant data, much
157
00:12:13.289 --> 00:12:18.080
more potentially juicy data that you can
use to help in the creative process.
158
00:12:18.720 --> 00:12:22.240
I think it's this idea of sort
of, I know you didn't call it
159
00:12:22.320 --> 00:12:24.080
an AI sandwich, but you know, sort of that's the visual that I
160
00:12:24.320 --> 00:12:28.159
create when I think of
this. But I actually think that,
161
00:12:28.320 --> 00:12:31.429
you know, joking aside, that's
a really important concept. We often think
162
00:12:31.429 --> 00:12:37.149
of artificial intelligence as living independent of
human thought, because we can't keep up
163
00:12:37.269 --> 00:12:39.549
with the speed of thought of machines. But in fact there's a lot of
164
00:12:39.590 --> 00:12:43.820
work, as you've described, in
preparing the data that the system will be
165
00:12:43.980 --> 00:12:50.580
using, and in interpreting the recommendations that
the system may be giving us, and choosing
166
00:12:50.740 --> 00:12:54.500
when to use and what to use. And so, for those of you
167
00:12:54.580 --> 00:13:01.889
who are beginning to think about things
like chat bots and segmentation strategies and personalization
168
00:13:01.009 --> 00:13:07.690
on our web pages and better ad
serving and all the places that we as
169
00:13:07.769 --> 00:13:15.120
marketers use artificial intelligence, this point
about keeping the human as a part of
170
00:13:15.320 --> 00:13:20.279
the formula and a part of the
process isn't just something we can do,
171
00:13:20.080 --> 00:13:22.840
it's actually something we have to do.
Right, Steph? I mean,
172
00:13:24.000 --> 00:13:30.629
this is actually a responsibility of ours,
to include that. It is absolutely essential for
173
00:13:31.230 --> 00:13:37.470
our evolution on this planet. I
mean, I love the author Yuval Noah
174
00:13:37.590 --> 00:13:39.950
Harari. He wrote
the book Sapiens and he also wrote the
175
00:13:39.990 --> 00:13:45.700
book 21 Lessons for the 21st
Century, and he has this tremendous
176
00:13:45.860 --> 00:13:50.659
mind that is able to really bring
forth all the different aspects of history and
177
00:13:50.740 --> 00:13:56.370
philosophy and marketing and artificial intelligence and
uses that from a historical perspective to understand
178
00:13:56.409 --> 00:14:00.970
where we are now and where we
are going. And it was really by
179
00:14:01.929 --> 00:14:07.610
by reading his works that I understood
just the implications and the impact of AI
180
00:14:07.889 --> 00:14:11.200
if we don't talk about it in
a very rigorous way in everything
181
00:14:11.360 --> 00:14:18.200
that we're creating. And this
is where artificial intelligence and ethics come into
182
00:14:18.279 --> 00:14:24.669
play, and I actually went to
a conference last year. It's from an
183
00:14:24.669 --> 00:14:28.950
entity that's called AI Now,
Artificial Intelligence Now, and it is
184
00:14:30.710 --> 00:14:35.269
sort of a collective of different people
from NYU that come together and talk about
185
00:14:35.269 --> 00:14:39.899
the social implications of artificial intelligence.
So you have people whose background is in
186
00:14:39.980 --> 00:14:43.500
technology, people in anthropology,
you have lawyers, and so you have really
187
00:14:43.580 --> 00:14:50.340
rich conversations of all the different implications. And one of the things that's really
188
00:14:50.580 --> 00:14:54.370
important to understand is, when we talk
about ethics, what ethics really means in
189
00:14:54.450 --> 00:14:58.929
terms of the definition, because it
can feel a little abstract. It's moral
190
00:15:00.169 --> 00:15:05.210
principles that govern a person's behavior or
the conducting of an activity. So when we're
191
00:15:05.210 --> 00:15:09.879
talking about artificial intelligence, you actually
need to swap the person part to a
192
00:15:11.000 --> 00:15:16.159
machine. So that basically means that
it is moral principles that govern a machine's
193
00:15:16.440 --> 00:15:22.909
behavior or conducting of an activity.
Because, as a very simple way to
194
00:15:22.230 --> 00:15:28.230
talk about artificial intelligence: depending
on what you feed it, it will
195
00:15:28.230 --> 00:15:31.629
learn from that and then it will
grow from there by looking at the environment
196
00:15:31.789 --> 00:15:35.700
that it is exposed to. So
it's literally learning from what you're feeding it
197
00:15:37.059 --> 00:15:41.100
and where it's growing. And when
we talk about you know, I'm going
198
00:15:41.100 --> 00:15:43.940
to stray a little bit
from marketing here, but when you talk
199
00:15:45.019 --> 00:15:50.129
about self-driving cars, that's when everything becomes so critical in
200
00:15:50.250 --> 00:15:56.490
understanding how are we going to make
these decisions, because before you could philosophize
201
00:15:58.210 --> 00:16:00.049
all the time about well, you
know, if I'm sitting in a car
202
00:16:00.330 --> 00:16:04.000
and then you know there's a tree
and there's a person walking in front of
203
00:16:04.120 --> 00:16:07.519
me, and obviously I'm going to
make the right decision and I'm going to
204
00:16:07.600 --> 00:16:11.879
crash into the tree because I don't want
to kill another person. That being said,
205
00:16:11.399 --> 00:16:15.960
if that actually happens in reality and
I'm behind the wheel at that point,
206
00:16:17.039 --> 00:16:18.870
I'm going to make a split-second decision. Right? I'm not going to
207
00:16:18.909 --> 00:16:23.669
go through all my principles right
then and there. I'm actually just going
208
00:16:23.710 --> 00:16:26.549
to go with my gut and I'm
going to make a decision at that point
209
00:16:26.590 --> 00:16:30.190
and maybe my intuition is going to
kick in, you know. Who the hell knows
210
00:16:30.470 --> 00:16:33.139
what's going to kick in and I'm
going to make the decision I'm going to
211
00:16:33.179 --> 00:16:37.980
make. But when we're dealing with
algorithms, that decision needs to be coded
212
00:16:37.059 --> 00:16:44.419
in. It means we actually need to figure
out that decision before this unfortunate
213
00:16:44.460 --> 00:16:48.250
accident happens. We actually have to
tell the computer now: okay, now,
214
00:16:48.289 --> 00:16:51.889
you're actually going to crash into the
tree, and then, you know, the
215
00:16:51.929 --> 00:16:56.289
guy that owns the car, who's
sleeping in the back, might be killed,
216
00:16:56.409 --> 00:17:00.129
or are you going to instead go
towards the four-year-old kid who ran
217
00:17:00.169 --> 00:17:04.000
down the street to grab a ball, and so it's that serious. And
218
00:17:04.160 --> 00:17:08.880
so that's why one of the things
that Mr. Harari says is that we
219
00:17:10.039 --> 00:17:15.519
have to really look at engineers now
as philosophers, like they have to understand,
220
00:17:15.029 --> 00:17:19.109
they have to understand morality, because
they have to embed that in the
221
00:17:19.190 --> 00:17:25.750
code. And that's why it's such
a big deal. And, like any
222
00:17:26.390 --> 00:17:30.380
powerful tool, I think AI can
be used for tremendous good, just as it
223
00:17:30.500 --> 00:17:36.220
can be used for tremendous bad.
It's all a question of are we going
224
00:17:36.259 --> 00:17:41.339
to take that responsibility, to have
full ownership and make the right decisions to
225
00:17:41.539 --> 00:17:45.569
bring it in that direction? And
with any great power comes great responsibility,
226
00:17:45.650 --> 00:17:48.529
and this is another form of that. I love that example because it
227
00:17:48.609 --> 00:17:52.650
seems like a very straightforward example when you
first say it: do I hit a tree
228
00:17:52.650 --> 00:17:53.690
or do I hit a person?
Of course I hit the tree. But what
229
00:17:53.769 --> 00:17:56.289
if hitting the tree kills the passenger? Right? That's just sort of
230
00:17:56.369 --> 00:18:00.480
where you took that example. And
so these are really big, hard decisions
231
00:18:00.519 --> 00:18:03.519
for a human being to make.
So now we want to have a machine
232
00:18:03.640 --> 00:18:07.039
make them, and it's equally hard. And look, that's
233
00:18:07.079 --> 00:18:11.559
a dramatic example because we're talking about
life and death of a person, but
234
00:18:11.720 --> 00:18:15.630
we make these kinds of decisions every
day in our interactions with customers. We
235
00:18:15.789 --> 00:18:22.950
make the decision when the electricity has
been turned off at someone's home and someone needs
236
00:18:22.990 --> 00:18:26.710
to explain why the power is off
at that person's home and they're calling up
237
00:18:26.750 --> 00:18:30.940
our customer support department, or we're
shipping out an item that a customer
238
00:18:32.059 --> 00:18:36.059
has ordered and we are considering what
we put in with it and how do
239
00:18:36.140 --> 00:18:40.059
we pack it. Every decision
we make may not be life-threatening,
240
00:18:40.460 --> 00:18:45.289
but they are brand-threatening, right?
They affect how people perceive who we are
241
00:18:45.329 --> 00:18:51.609
as a company and the relationship and
how much we value them as a customer.
242
00:18:52.049 --> 00:18:56.160
Absolutely, absolutely. Yeah, I
went completely to, like, the crazy world
243
00:18:56.160 --> 00:18:59.599
that we're living in with AI.
So we need to really take this seriously.
244
00:18:59.880 --> 00:19:04.440
But absolutely, it's also completely applicable in
so many small details that are
245
00:19:04.519 --> 00:19:07.480
important as well from a brand perspective. You know, one of the things
246
00:19:07.480 --> 00:19:11.829
that we have chatted a lot ourselves
about, you and I, but also
247
00:19:11.910 --> 00:19:15.109
we see people talking about, is sort
of the rise in the use of chat
248
00:19:15.230 --> 00:19:22.829
bots. So chatbots are an
artificial intelligence engine of some kind interacting with
249
00:19:22.869 --> 00:19:26.460
a real human being who is asking
a question or looking for information, typically
250
00:19:26.500 --> 00:19:30.460
on a website or on their mobile
device, and we have all seen,
251
00:19:30.539 --> 00:19:34.099
as consumers, a dramatic rise in
their availability. Now, let me just
252
00:19:34.420 --> 00:19:37.690
go on record and say chatbots
can be really useful if I'm looking
253
00:19:37.690 --> 00:19:41.289
for what are your hours, or
I'm trying to do something and I don't
254
00:19:41.289 --> 00:19:45.529
necessarily need to talk to a human,
and sometimes talking to a human, frankly,
255
00:19:45.690 --> 00:19:49.490
is overrated, right? And I just want
some information or help finding something. But
256
00:19:49.890 --> 00:19:57.119
there are instances where talking to a machine
can be very frustrating and may actually not
257
00:19:57.440 --> 00:20:02.839
be the right thing to do.
I'm curious, from your perspective, with
258
00:20:02.960 --> 00:20:07.150
this lens of always being human, what are some of the ways and
259
00:20:07.349 --> 00:20:15.390
places we should not depend on technology
but stay true to human to human interactions?
260
00:20:15.869 --> 00:20:19.470
That's a good question. I think
it's okay to experiment.
261
00:20:21.190 --> 00:20:25.299
I think it's okay to try:
you know what, what if I try
262
00:20:25.619 --> 00:20:33.019
this chatbot and see what happens?
And if I realize that people are aggravated,
263
00:20:33.259 --> 00:20:37.329
that people are not engaging, or
maybe it's also the interface of
264
00:20:37.369 --> 00:20:40.089
the chatbot, and it might not
be the chatbot itself, it could be
265
00:20:40.170 --> 00:20:44.170
many things, and then you need
to say, all right, so this
266
00:20:44.369 --> 00:20:48.490
is not working, so how can
we actually make this a little bit
267
00:20:48.529 --> 00:20:52.279
more of a human experience? I
feel that, at the end of the day,
268
00:20:52.839 --> 00:20:59.039
you're trying to create experiences for people
that are not just innovative but that also need
269
00:20:59.079 --> 00:21:03.750
to feel comfortable. And what I
mean by that is that technology is moving
270
00:21:04.190 --> 00:21:11.309
very fast and human beings are not
moving as fast as the technology is
271
00:21:11.349 --> 00:21:15.750
going, and for some people that
might bring a sense of anxiety,
272
00:21:15.190 --> 00:21:22.140
because they're losing control; there's
something that feels really unfamiliar and
273
00:21:22.220 --> 00:21:23.660
they don't know how to handle it
and they don't even want to go there.
274
00:21:23.740 --> 00:21:29.420
So I think that when you are
presenting new technology, new AI,
275
00:21:30.140 --> 00:21:33.569
it's fine to try to see
how much you can push the envelope,
276
00:21:33.930 --> 00:21:37.410
but the user, at the end
of the day, still needs to feel
277
00:21:37.410 --> 00:21:41.410
a certain level of comfort with what
they already know, and you
278
00:21:41.490 --> 00:21:45.410
can sort of build up on that. And there's one thing I was even
279
00:21:45.529 --> 00:21:48.640
talking and thinking about at a certain
point, which was, you know,
280
00:21:48.680 --> 00:21:55.680
the iPhone was a pretty completely novel product that came through. How many years
281
00:21:55.720 --> 00:21:57.160
ago now? I don't know,
like eight years ago, ten years ago
282
00:21:57.200 --> 00:22:03.190
or something like that, and it
came through and people adopted it fairly readily,
283
00:22:03.509 --> 00:22:06.509
and so then I started to think, okay, so what was the
284
00:22:06.589 --> 00:22:10.869
comfort level there if it was a
completely novel product that was adopted like pretty
285
00:22:10.869 --> 00:22:14.230
much right off the bat? But
then, when you start thinking about it,
286
00:22:14.509 --> 00:22:18.940
we were already exposed to the iPod, so we were already exposed to
287
00:22:18.460 --> 00:22:22.460
that interface and that sleekness, and sort
of that screen was a little bit similar
288
00:22:22.500 --> 00:22:26.539
to that, and so we were
already aware of that product. And
289
00:22:26.660 --> 00:22:30.970
then also there was a phone capability, and we all know what a phone
290
00:22:32.130 --> 00:22:34.329
was because we've always been around
phones. It was just the interface that
291
00:22:34.450 --> 00:22:40.089
was different. So also from that
perspective it was already there. The thing
292
00:22:40.130 --> 00:22:45.359
that was extremely novel, that wasn't there
previously, were the apps, and that was an addition
293
00:22:45.720 --> 00:22:49.200
that was completely new. But after
thinking about it some more, I think
294
00:22:49.240 --> 00:22:53.200
that's probably one of the reasons why
it was able to be so successful,
295
00:22:53.880 --> 00:23:00.750
because it still had a foundation of
elements that we were all already comfortable with
296
00:23:00.990 --> 00:23:04.109
and that we trusted. And
then from those two perspectives, you
297
00:23:04.190 --> 00:23:11.670
can build on different types of technological
aspects that are more abstract, but then
298
00:23:11.750 --> 00:23:15.220
we can adopt further from that point. That's such an important guiding principle,
299
00:23:15.299 --> 00:23:18.740
which is, you know, sometimes
things feel like these overnight successes and these
300
00:23:19.059 --> 00:23:22.579
rapid, you know, magic things
happen, but the truth is they're typically
301
00:23:22.579 --> 00:23:29.650
building. It's really evolution, not revolution, and what we can control and
302
00:23:29.769 --> 00:23:33.210
what we can do as business leaders
is spend the time to figure out what
303
00:23:33.250 --> 00:23:37.089
a natural next step is to evolve
from something that we know today, to
304
00:23:37.210 --> 00:23:41.920
get better, to get more intuitive, to improve the relationship we have with
305
00:23:41.039 --> 00:23:48.559
customers. It doesn't have to be
fully formed, completely new. It will feel
306
00:23:48.640 --> 00:23:52.759
like a new experience, but in
fact it actually has some aspect of familiarity
307
00:23:52.880 --> 00:23:59.670
to it, and that is something
that I think we have a big opportunity
308
00:23:59.750 --> 00:24:02.990
around to look at all the places
that we interact and say, oh,
309
00:24:03.069 --> 00:24:06.589
how can I just make these three
things a little bit better? What
310
00:24:06.710 --> 00:24:10.019
might I be able to do, right?
And just say, how can I
311
00:24:10.299 --> 00:24:14.019
take a step back and make this
feel a little bit more human, make
312
00:24:14.140 --> 00:24:18.220
this feel a little bit more normal
to our usual experience? I'll give you
313
00:24:18.259 --> 00:24:23.140
a quick example. I was talking
with a person, and I don't remember
314
00:24:23.140 --> 00:24:26.930
if it was Chase or who it was. It was
315
00:24:26.970 --> 00:24:30.089
at a conference and I was having
this conversation and they were saying that they
316
00:24:30.130 --> 00:24:34.849
were trying to implement into their
ATMs. They wanted to have far fewer
317
00:24:34.930 --> 00:24:41.279
tellers, live tellers, and have
everything in the ATM. And you know,
318
00:24:41.359 --> 00:24:44.480
most of us go to the ATM, but you had a certain segment
319
00:24:44.640 --> 00:24:48.640
of people who were more elderly,
like sixty, seventy and up, that
320
00:24:48.759 --> 00:24:52.710
would just still go see the teller
and they said, you know, we
321
00:24:52.829 --> 00:24:57.750
tried everything. We tried to put
like more explanations on the screen and all
322
00:24:57.789 --> 00:25:03.789
that stuff to try to help them
through the process, but it still didn't
323
00:25:03.789 --> 00:25:06.789
work. And then I said to
them, I said, but why do
324
00:25:06.829 --> 00:25:08.859
you think they go see the teller? Let's take
two steps back to what
325
00:25:10.099 --> 00:25:14.539
the more human experience could be through
the ATM. Why do they go see
326
00:25:14.539 --> 00:25:18.579
the teller? Because they probably
know the teller, right? So maybe it's
327
00:25:18.700 --> 00:25:22.089
Nadia or Tom that they know.
What if you would
328
00:25:22.130 --> 00:25:26.609
put Nadia or Tom's voice in the ATM? Or what if you would have their
329
00:25:26.730 --> 00:25:30.849
face on the screen, and that
helps you through? And that's what I
330
00:25:30.970 --> 00:25:37.480
mean is just to go back like
what is the original human experience that we
331
00:25:37.640 --> 00:25:44.000
want and how can you start integrating
that into a technological platform? And that
332
00:25:44.119 --> 00:25:48.880
way it's innovative but it's also comfortable. And I think you know the big
333
00:25:48.000 --> 00:25:51.670
word, and it's been around a while. It's a big word and
334
00:25:51.750 --> 00:25:53.910
I think it's going to be here
for a while. Disruption. So everybody
335
00:25:53.910 --> 00:25:56.710
wants to disrupt. Everybody wants to
disrupt, but I think you have to
336
00:25:56.789 --> 00:26:00.349
take that with a grain of salt. You can disrupt and at the same
337
00:26:00.390 --> 00:26:03.980
time, how do you still remain
human? That is such a good example,
338
00:26:04.140 --> 00:26:07.460
because I think about it, and
I think you know, I'm going to
339
00:26:07.900 --> 00:26:11.859
believe this bank had goodness in their
heart when they were making the decision, but
340
00:26:11.900 --> 00:26:17.019
the decision to get rid of people,
the tellers, and focus more on machines
341
00:26:17.140 --> 00:26:22.569
really was probably never about the
customers; it was probably about efficiency and collecting data.
342
00:26:22.210 --> 00:26:27.250
And so instead, if we had
taken the frame that said what do
343
00:26:27.369 --> 00:26:33.880
people want to do in our physical
environment, right, and started mapping out the business
344
00:26:34.000 --> 00:26:38.240
around the way people interact,
then having more automated things may have
345
00:26:38.319 --> 00:26:42.160
ended up still being the solution,
but the frame of reference is different,
346
00:26:42.240 --> 00:26:47.390
and so the experience they create in
those automated things might more likely have had
347
00:26:47.470 --> 00:26:52.750
some of the human elements that you're
talking about and that, as leaders and
348
00:26:52.910 --> 00:26:56.589
for those of you who are listening, that's what I want you to do.
349
00:26:56.750 --> 00:26:57.869
That's what we want you to think
about. We want you to think
350
00:26:57.869 --> 00:27:02.180
about the ethics. We want you
to think about confirmation bias. We
351
00:27:02.299 --> 00:27:08.420
want you to think about the applications
of familiarity and do the hard work of
352
00:27:10.619 --> 00:27:15.410
separating what we want from what the
people we're interacting with want, and put that front
353
00:27:15.450 --> 00:27:22.410
and center, and let that lead
to the technology solutions instead of starting with
354
00:27:22.609 --> 00:27:26.890
the technology and trying to figure out
how to apply it. So thank you
355
00:27:26.009 --> 00:27:30.160
so much, Steph. I feel like I
could talk for four hours on this
356
00:27:30.359 --> 00:27:33.240
with you. I just
think you make a very good point.
357
00:27:33.279 --> 00:27:37.519
It's the customer, the customer,
the customer. And, depending on who you're
358
00:27:37.559 --> 00:27:41.680
talking to, the fact that we
are all emotional beings, the fact that still,
359
00:27:42.319 --> 00:27:47.470
with all this technology, fifty
to sixty percent of the effectiveness of an ad
360
00:27:47.589 --> 00:27:51.750
is based on creativity. That is
still who we are, and technology and
361
00:27:51.829 --> 00:27:56.109
artificial intelligence is there to help
propagate that, but it is not
362
00:27:56.230 --> 00:28:00.859
the foundation of it. Thank you
so much to all of you who have been
363
00:28:00.940 --> 00:28:04.220
listening. This has been a really
important conversation and you know, it's not
364
00:28:04.299 --> 00:28:07.259
always intuitive to
talk about people when we talk about artificial
365
00:28:07.299 --> 00:28:12.569
intelligence, but it is imperative that
we do so. Thank you so much
366
00:28:12.569 --> 00:28:17.410
for joining us on our discussion today. I would encourage you to check out
367
00:28:17.410 --> 00:28:22.410
Steph's podcast. It's a really
great continuation of our conversation on humanity and understanding how
368
00:28:22.450 --> 00:28:26.730
we interact with people and how to
think about it and how to map out
369
00:28:26.730 --> 00:28:30.240
our processes. I wish you a
very successful rest of your day and thank
370
00:28:30.279 --> 00:28:33.480
you again for joining us, Steph. For
those folks who are listening who might
371
00:28:33.519 --> 00:28:37.519
want to follow up with you and
learn a little bit more about the podcast,
372
00:28:37.640 --> 00:28:41.319
where can they find it and where
can they find you? First of
373
00:28:41.359 --> 00:28:42.829
all, thank you so much again, Samantha, for having me. This
374
00:28:42.990 --> 00:28:48.750
is great. You can follow me
on Twitter. My
375
00:28:48.950 --> 00:28:56.299
podcast is called Clarity Connects. The
website is Clarity Connects dot co, and you
376
00:28:56.380 --> 00:29:00.420
can find me on iTunes and anywhere
else where there are podcasts. And also,
377
00:29:00.460 --> 00:29:03.019
if you want to learn more about
me, you can go to my
378
00:29:03.380 --> 00:29:07.940
website. Excellent. Thank you so
379
00:29:07.019 --> 00:29:15.289
much for joining us. Thank you. We totally get it. We publish
380
00:29:15.329 --> 00:29:18.849
a ton of content on this podcast
and it can be a lot to keep
381
00:29:18.849 --> 00:29:22.930
up with. That's why we've started
the B2B Growth Big Three, a no-
382
00:29:22.089 --> 00:29:26.880
fluff email that boils down our three
biggest takeaways from an entire week of episodes.
383
00:29:27.319 --> 00:29:33.200
Sign up today at sweetfishmedia.com,
Big Three. That's sweetfishmedia.com,
384
00:29:34.000 --> 00:29:34.759
Big Three.