Transcript

Here's an idea: it's unethical to not develop artificial intelligence. Forgive me my double negative for one second, and just let me explain.

Artificial intelligence, specifically robots which can learn, problem-solve, and be creative, has been a signifier of the future since about the mid-20th century. Sure, in the present we have Siri and Watson and Deep Blue and even the Kinect, all of which are built on artificial intelligence but lack a central feature of that shiny metal future from movies: they are not embodied. They are stuck inside little boxes and left to interact with the world through their computerized voices and not, well, bodies. "Siri, tell me a joke." "I can't. I always forget the punchline." Charming as they may be, these AIs aren't much more than simple information valets, though that might soon change.

If you've yet to have the pleasure, allow me to introduce you to Baxter. At $22,000 a pop, this robot, which has hands and eyes, is trainable, comes pre-programmed with a certain amount of common sense, is cheaper than a year's worth of most repetitive Fordist human labor, and is smarter than its robo-factory ancestors. For some people that is pretty threatening, and rightfully so: if you could have a staff of trainable, multi-taskable Baxters, why would you hire people? People need managing, and health care, and birthday parties, and casual Fridays on which to wear their Hawaiian shirts.

And Baxter is not the only smarty-pants robo-bro out there.
The Google self-driving car is just a predecessor to the Johnny Cab; Pro Joe, the teaching robot, could become CGP Grey's Digital Aristotle.
Watson, yes, Watson from Jeopardy!, is being rejiggered to work in diagnostic medicine; he could be the grandfather to Dr. Perceptron.
There are even some very smart people developing a Perry Mason for all of your legal bot needs. What I'm saying is that robots might someday replace us, professionally. There's an idea.

Now, you probably think you know what I'm gonna say next. You think I'm gonna say: all of these robots doing these jobs? That's bad, putting all those people out of work; it's unethical. Except I'm not gonna say that. I'm actually gonna say that replacing human laborers with automatons is arguably one of the most ethical things you could do. Now, there's an easy line here, saying that we would just make robots do the jobs that human beings shouldn't be doing anyways. It's a well-traveled path, so we're not going to go down it. Besides, we're not talking about artificial intelligence replacing just the dangerous and menial jobs, but also the complex, fast-paced, extremely precise, and knowledge-heavy jobs. That's a lot of jobs, and so that might cause a lot of problems. And the problems that come with large-scale social, economic, and corporate restructuring are many and varied, and some are real scary. But the more complex ethical discussion doesn't involve the relationship between humans and each other, or robots, but between humans and the future.

Is it ethical to stop improvement? Would it be ethical for us to stop making and deploying Baxters and Google cars and warehouse robots, to stop streamlining, simplifying, cost-reducing, and increasing the reliability of any number of processes, because people are currently doing them, because up until this point people is all we had? Philosopher Alain Badiou describes what he calls the ethic of truths. The most important thing is the event, which seizes humanity and breaks it from the norm. That event, and the way people are faithful to it, can contribute to humanity's immortality as a group of beings which create and continue. And there's a lot of hand-waving when you talk about the future, but Badiou writes that there is only one question in the ethic of truths:
how will I, as someone, continue to exceed my being? How will I link the things I know in a consistent fashion via the effects of being seized by the unknown? Not following through, not continuing to make plastic pals that are fun to be with, could be seen as a betrayal of that norm-breaking event. For instance, do we deny future generations the possibility of cheaper, better medical care from robot doctors because we want to maintain the no-robo status quo? I mean, the printing press threatened so much about the status quo, and we're all very glad we saw that through, aren't we? Not to mention the computer and the automobile. But, hmm... because human progress gave us medicine and the internet and cup holders, but it also gave us the atomic bomb. And Furbies.

Any ethics of progress has to account for the fact that on the horizon of that progress lies some terrible atrocity. It has to approach ideas of progress objectively. "You can't stop progress" isn't exactly the most comforting ethic, is it? Progress as an ethic can't unequivocally prioritize that progress before everything else; happiness and physical safety are both pretty important. But an ethics of progress can help us organize what comes next in life. We have to accept the possibility of the bad stuff that comes packaged with progress and focus on what happens after the progress dust has settled. We have to keep going. And why? Because of the greater, grander human experience we'd only be able to achieve with the help of our artificially intelligent robot friends. What do you guys think? Is it unethical to stop the development of artificial intelligence? Let us know in the comments. And I, for one, welcome our new robot overlords. Subscribe, subscribe, subscribe.

I think the saddest music in the world is probably any song written by the Vengaboys. Let's see what you guys had to say about the source of emotion in music. TV joj says that our emotional response to music might be a kind of chicken-or-the-egg problem, in that we are trained in some way, watching movies and other visual media, to associate certain kinds of sounds with certain images, and kind of uses the word "Pavlovian."
Cisco QL asks: if sad music isn't sad, does that mean that a sad picture isn't sad? I think a lot of the same stuff applies, but a picture can be a lot more literal; it's not nearly as encoded as a piece of music. But, you know, personal experience still plays a huge role in determining what your emotional reaction to an artwork, or something, is going to be.

Congratulations to douche masta, who managed to summarize the entire episode rather well in one comment, if you can find it. This conversation between op d-day 2001 and symbiotic ism is really great; I suggest you try to get in on it. Ctrl+F it, happen upon it, seek it out.
Jesse Harris makes the astute point that the only piece of music which contains objective emotion is Yakety Sax, and I think I'd totally agree. Duke Keefe D has a further correction on a previous correction, saying that Mt. Gox actually was Magic: The Gathering and not "mining team Goss," and I... I don't know what to think anymore.

A wolf can makes a really interesting point about the enjoyment of non-mainstream music, and music that doesn't contain standardized emotional content, and wonders whether, when people enjoy that music, it's a kind of reaction, or whether they are comforting themselves because they are responding to the mainstream, which I think is really interesting. It's really good. This week's episode was brought to you by the work of these diligent people, and the tweet of the week comes from a wolfgang smith, who imagines the internet as one building. Which it is.
