Transcript
WEBVTT
1
00:00:04.879 --> 00:00:09.429
Welcome back to the new podcast series. It's Kelsey with Sweet Fish Media.
2
00:00:09.869 --> 00:00:13.150
Today I'm going to share with you
a new podcast by our friends over
3
00:00:13.189 --> 00:00:18.589
at Entaros. It's called What
Lies Beneath. This podcast is for anything
4
00:00:18.629 --> 00:00:25.100
and everything about risk. The host
is Jennifer Bisegley and she interviews risk management
5
00:00:25.179 --> 00:00:30.300
professionals from every industry, learning their
ins and outs, ups and downs,
6
00:00:30.339 --> 00:00:35.060
about protecting their companies and themselves from
risk. If you think you'll find the
7
00:00:35.140 --> 00:00:39.729
show valuable after you check out this
quick snippet, just search What Lies Beneath
8
00:00:39.810 --> 00:00:44.570
in Apple Podcasts or your favorite podcast player,
make sure you subscribe, and if you really like
9
00:00:44.729 --> 00:00:48.850
it, don't forget to leave a
review. In this episode snippet she talks
10
00:00:48.929 --> 00:00:55.560
to Nick Beim about the human and
AI partnership. Let's tune in. AI
11
00:00:55.719 --> 00:01:00.799
is the most significant tech advancement that
we've seen in the last fifteen years and
12
00:01:00.840 --> 00:01:03.280
even though it's been around for a
long time, the increase in productivity we've
13
00:01:03.280 --> 00:01:08.310
seen with the advent of big data, of major internet platforms and publicly available data,
14
00:01:08.870 --> 00:01:12.390
has enabled the technology to take
off. And, in addition,
15
00:01:12.709 --> 00:01:19.340
the technology is advancing in many different
dimensions, not just in machine learning but in
16
00:01:19.579 --> 00:01:25.420
computer vision and natural language processing,
and given that the major fuels for AI,
17
00:01:26.219 --> 00:01:30.299
data, processing power and algorithms, are
all growing at a very significant rate,
18
00:01:30.379 --> 00:01:33.930
I think we're at the beginning of
a very long AI boom. So
19
00:01:34.129 --> 00:01:38.969
I think this is the most significant
technology that will likely reshape the economic landscape
20
00:01:40.209 --> 00:01:45.049
in the decade ahead. With regards
to the investment strategy, I'd say
21
00:01:45.090 --> 00:01:49.840
there are two major things
that I'm particularly interested in. The first
22
00:01:49.040 --> 00:01:53.200
is the use of AI to do
something fundamentally new and valuable,
23
00:01:53.560 --> 00:01:57.439
and AI can be used for a
lot of things, and it's certainly a hot
24
00:01:57.560 --> 00:02:01.750
area. It's overmarketed these days, but when AI is combined with significant
25
00:02:01.790 --> 00:02:07.590
and often unique data sets, it
can create fundamentally new capabilities that weren't possible
26
00:02:08.030 --> 00:02:10.590
just, you know, three or
four years ago, and sometimes not even
27
00:02:10.629 --> 00:02:16.620
conceivable four or five years ago,
and those are the types of uses of
28
00:02:16.659 --> 00:02:21.939
AI that I think the biggest companies
will likely be created around. The second
29
00:02:22.020 --> 00:02:29.300
pillar is a distinctive data advantage.
A lot of AI capabilities will be commoditized
30
00:02:29.340 --> 00:02:34.050
and offered by large platforms as free
services in exchange for use of data.
31
00:02:34.729 --> 00:02:38.409
What's interesting to me is when a
company has found a way to get
32
00:02:38.449 --> 00:02:45.360
privileged access, exclusive access, to
very significant data sets and really be able
33
00:02:45.400 --> 00:02:50.919
to get a meaningful head start in
an area and build additional moats on top
34
00:02:51.000 --> 00:02:54.680
of that initial data set access.
And sometimes that comes from additional customer data,
35
00:02:55.520 --> 00:03:00.469
sometimes it comes through large business development
deals with other data providers. But
36
00:03:00.550 --> 00:03:05.030
if a company is able to do
something fundamentally new and valuable and they have
37
00:03:05.189 --> 00:03:08.870
a meaningful data advantage and that leads
to significant moats, I think there's a
38
00:03:08.909 --> 00:03:14.379
chance to build a multi-billion-dollar
company. I think they're all good points
39
00:03:14.580 --> 00:03:16.340
and you know, here at
Entaros we see the same thing, that
40
00:03:16.539 --> 00:03:21.860
technology and AI of the last ten
years has just created for us. It's
41
00:03:21.900 --> 00:03:24.259
been remarkable. I do think
it's a little bit interesting to think about
42
00:03:24.379 --> 00:03:30.129
things that aren't talked about enough when
it comes to artificial intelligence, and two
43
00:03:30.210 --> 00:03:32.930
things that come to mind for me
immediately. One, maybe because we had lawyers
44
00:03:32.969 --> 00:03:38.250
in the office yesterday, is the
business ethics side of AI, or even the
45
00:03:38.330 --> 00:03:43.159
unconscious bias that a lot of
folks are talking about. So you know,
46
00:03:43.280 --> 00:03:46.000
what are your thoughts on those?
Or do you see other areas that
47
00:03:46.080 --> 00:03:49.080
haven't really been addressed or that are
on the horizon that we should be thinking
48
00:03:49.080 --> 00:03:53.919
about? Great question. I think there
are a variety of subjects that aren't talked
49
00:03:53.960 --> 00:03:57.509
about enough. I think the one
that I've put at the top of my
50
00:03:57.669 --> 00:04:02.710
list is that AI is very different
from human intelligence. It's a natural initial
51
00:04:02.750 --> 00:04:09.270
reaction when a new technology comes out
to understand and think creatively about the threats
52
00:04:09.310 --> 00:04:14.300
that it poses. And I think
there is an immediate point that many people
53
00:04:14.460 --> 00:04:17.819
made in AI research labs and that
the press sort of seized upon in the
54
00:04:17.899 --> 00:04:21.620
early days of our most recent AI
boom, which is that this will lead
55
00:04:21.660 --> 00:04:28.449
to a kind of superhuman form
of intelligence, AGI in its strongest form.
56
00:04:28.930 --> 00:04:30.930
I think what we're finding, as
you really dig into the details
57
00:04:30.970 --> 00:04:36.329
of how AI is progressing, is that it's
very different than human intelligence. Human intelligence
58
00:04:36.410 --> 00:04:43.839
is an interesting evolutionary kluge of a lot
of different capabilities that have merged together to
59
00:04:44.000 --> 00:04:50.000
help us optimize our survival and reproduction, and to think that computer intelligence will
60
00:04:50.040 --> 00:04:55.189
follow the same general path, or that it
is representative of the same types of general
61
00:04:55.230 --> 00:05:00.509
understandings, I think is an
interesting leap. We may both be subsets
62
00:05:00.550 --> 00:05:04.350
of a broader whole, but a
lot remains to be proven. And specifically
63
00:05:04.589 --> 00:05:10.699
what AI has been very good at
to date is solving very narrow pattern recognition
64
00:05:10.740 --> 00:05:17.620
problems extremely well; image recognition and language recognition are paradigmatic use cases.
65
00:05:18.339 --> 00:05:21.730
But when it comes to the field
of human judgment, so far, and
66
00:05:21.850 --> 00:05:26.490
we're so early in this revolution,
so there's a long ways to go,
67
00:05:27.529 --> 00:05:30.569
the results so far have been pretty
disappointing. AI has not been able
68
00:05:30.730 --> 00:05:38.279
to combine different understandings of context and
be able to form a kind of reasoning
69
00:05:38.439 --> 00:05:42.360
that we would see as legitimate for
complex human decisions. So I think there's
70
00:05:42.360 --> 00:05:46.959
a lot to be done, and
the immediate anthropomorphism of saying oh, AI is
71
00:05:47.000 --> 00:05:50.110
going to look a lot like us, I think, is giving way to a
72
00:05:50.269 --> 00:05:55.790
more complex reality. And many people
have said, I think in my view,
73
00:05:55.870 --> 00:06:00.790
rightly, that AI has made its
biggest jumps by mimicking the human mind
74
00:06:00.829 --> 00:06:03.779
and it may need to do more
of that, in particular building in more
75
00:06:03.899 --> 00:06:10.579
innate structure, sort of prewiring to
do certain things very well like we do.
76
00:06:10.699 --> 00:06:15.180
We have a predisposition to language that's
extraordinary and it's made us very effective.
77
00:06:15.259 --> 00:06:17.689
There are many other examples as well. So I guess at the top
78
00:06:17.769 --> 00:06:21.930
of my list of things that are
not discussed enough, I would
79
00:06:21.930 --> 00:06:27.730
rank that very high. On ethics, I think it's a critical area for
80
00:06:27.850 --> 00:06:31.370
AI to be broadly used and broadly
trusted, and particularly when you get to
81
00:06:31.449 --> 00:06:36.519
the level of making human-like judgments, where people will ask, why should
82
00:06:36.519 --> 00:06:41.399
I trust that? You know, who
said so, and why? Can you explain
83
00:06:41.439 --> 00:06:45.519
this to me? We need to
be able to trust machines with
84
00:06:46.000 --> 00:06:49.110
the type of logic that would count
in a human explanation, and we
85
00:06:49.269 --> 00:06:53.910
have a ways to go in that. I think part of
86
00:06:53.990 --> 00:06:59.509
that problem is a bias of data
sets. You know, one fascinating
87
00:06:59.670 --> 00:07:02.540
bias of data sets and you see
this in science, you see it in
88
00:07:02.660 --> 00:07:06.060
public safety, you see it in
a lot of other fields, relates to
89
00:07:06.259 --> 00:07:12.339
gender bias. My wife is a genomics
entrepreneur in the field of women's health and
90
00:07:12.500 --> 00:07:18.009
she's really taught me about this and
there are some powerful examples, things like the development
91
00:07:18.050 --> 00:07:26.170
of seat belts. Seat belts were
made by men, optimized for men and
92
00:07:26.290 --> 00:07:32.399
using data relating to men and as
a result, seat belts historically worked very
93
00:07:32.439 --> 00:07:35.360
well for men but worked much less
well for women. And there are a
94
00:07:35.399 --> 00:07:39.879
lot more fatalities in car accidents for
women, just because of, sort of,
95
00:07:40.040 --> 00:07:43.839
the data that they were
built upon. In a lot of fields
96
00:07:43.920 --> 00:07:46.870
in health, the initial, you
know, discoveries and the choice of what
97
00:07:46.990 --> 00:07:53.149
subjects to pursue were made by men, and the theories that followed were made by
98
00:07:53.310 --> 00:07:57.230
men. The, you know, products
that came out of them were developed by
99
00:07:57.269 --> 00:07:59.459
men, and so, as you
can imagine, there was very little focus
100
00:07:59.540 --> 00:08:05.980
on women at all. And I think
an understanding of the significance of data,
101
00:08:07.100 --> 00:08:11.139
that is, it need not be biased
in the narrow sense, it could
102
00:08:11.379 --> 00:08:16.610
just be not representative broadly, but
understanding that when you're developing models, I
103
00:08:16.689 --> 00:08:22.810
think is critical. Such an interesting
topic by Nick over at Venrock, and
104
00:08:22.970 --> 00:08:26.970
wow. Jennifer is going to be
a phenomenal host of this podcast. Again,
105
00:08:26.089 --> 00:08:31.720
to find this show, just search
What Lies Beneath in Apple Podcasts or wherever
106
00:08:31.879 --> 00:08:35.080
you do your listening. Subscribe
and leave a review if you like it.
107
00:08:35.600 --> 00:08:37.519
Maybe this podcast isn't for you,
but if you know someone who would like
108
00:08:37.639 --> 00:08:41.870
it, well, don't forget to
tell them. See you next week for
109
00:08:41.950 --> 00:08:48.190
an extra special episode. Until then, I hate it when podcasts incessantly ask
110
00:08:48.269 --> 00:08:52.710
their listeners for reviews, but I
get why they do it, because reviews
111
00:08:52.750 --> 00:08:56.539
are enormously helpful when you're trying to
grow a podcast audience. So here's what
112
00:08:56.620 --> 00:08:58.460
we decided to do. If you
leave a review for B2B Growth
113
00:08:58.539 --> 00:09:03.019
in Apple Podcasts and email me a
screenshot of the review to james@sweetfishmedia.com,
114
00:09:03.019 --> 00:09:07.779
I'll send you a
signed copy of my new book, Content-Based
115
00:09:07.860 --> 00:09:11.529
Networking: How to Instantly Connect
With Anyone You Want to Know. We
116
00:09:11.649 --> 00:09:13.330
get a review, you get a
free book. We both win.