How to look inside the brain - Carl Schoonover
This is a thousand-year-old drawing of the brain. It's a diagram of the visual system, and some things look very familiar today: there are two eyes at the bottom, the optic nerve flowing out from the back. There is a very large nose that doesn't seem to be connected to anything in particular. And if we compare this to more recent representations of the visual system, you'll see that things have gotten substantially more complicated over the intervening thousand years, and that's because today we can see what's inside of the brain, rather than just looking at its overall shape.

Imagine you wanted to
understand how a computer works, and all you could see was a keyboard, a mouse, a screen. You really would be kind of out of luck. You want to be able to open it up, crack it open, look at the wiring inside. And up until a little more than a century ago, nobody was able to do that with the brain. Nobody had had a glimpse at the brain's wiring, and that's because if you take a brain out of the skull and you cut a thin slice of it, put it under even a very powerful microscope, there's nothing there. It's gray, formless; there's no structure. It won't tell you anything.

And this all changed in the late 19th century. Suddenly, new chemical stains for brain tissue were developed, and they gave us our first glimpses at brain wiring. The computer was cracked open.
So what really launched modern neuroscience was a stain called the Golgi stain, and it works in a very particular way. Instead of staining all of the cells inside of a tissue, it somehow only stains about 1% of them. It clears the forest, reveals the trees inside. If everything had been labeled, nothing would have been visible, so somehow it shows what's there.

Spanish neuroanatomist Santiago Ramón y Cajal, who's widely considered the father of modern neuroscience, applied this Golgi stain, which yields data that looks like this, and really gave us the modern notion of the nerve cell, the neuron. And if you're thinking of the brain as a computer, this is the transistor. Very quickly, Cajal realized that neurons don't operate alone, but rather make connections with others that form circuits, just like in a computer.

Today, a century later, when researchers want to visualize neurons, they light them up from the inside rather than darkening them. There are several ways of doing this, but one of the most popular ones involves green fluorescent protein.
Now green fluorescent protein, which oddly enough comes from a bioluminescent jellyfish, is very useful, because if you can get the gene for green fluorescent protein and deliver it to a cell, that cell will glow green. Or, with any of the many variants now of green fluorescent protein, you can get a cell to glow many different colors. And so, coming back to the brain, this is from a genetically engineered mouse called Brainbow, and it's so called, of course, because all of these neurons are glowing different colors.

Now sometimes neuroscientists
need to identify individual molecular components of neurons, molecules, rather than the entire cell, and there are several ways of doing this, but one of the most popular ones involves using antibodies. You're familiar, of course, with antibodies as the henchmen of the immune system, but it turns out that they're so useful to the immune system because they can recognize specific molecules, like, for example, the coat protein of a virus that's invading the body. And researchers have used this fact to recognize specific molecules inside of the brain, to recognize specific substructures of the cell and identify them individually.

A lot of the images I've been showing you here are very beautiful, but they're also very powerful; they have great explanatory power. This, for example, is an antibody staining against serotonin transporters
in a slice of mouse brain. You've heard of serotonin, of course, in the context of diseases like depression and anxiety. You've heard of SSRIs, which are drugs that are used to treat these diseases. And in order to understand how serotonin works, it's critical to understand where the serotonin machinery is, and antibody stainings like this one can be used to understand that sort of question.

I'd like to leave you with the following thought: green fluorescent protein and antibodies are both totally natural products at the get-go. They were evolved by nature in order to get a jellyfish to glow green for whatever reason, or to detect the coat protein of an invading virus, for example. And only much later did scientists come onto the scene and say, "Hey, these are tools, these are functions that we could use in our research tool palette." And instead of applying feeble human minds to designing these tools from scratch, there were these ready-made solutions right out there in nature, developed and refined steadily for millions of years by the greatest engineer of all. Thank you.