WEBVTT
|
|
|
|
00:00.000 --> 00:24.880
|
|
That brings in a house!
|
|
|
|
00:24.880 --> 00:29.440
|
|
I'm so excited I can freak.
|
|
|
|
00:29.440 --> 00:35.280
|
|
Not just this muscle car that gets me excited, it's my guest today.
|
|
|
|
00:35.280 --> 00:38.080
|
|
Sooth spider is in the in the chat already.
|
|
|
|
00:38.080 --> 00:40.760
|
|
Look at that, garden variety human.
|
|
|
|
00:40.760 --> 00:41.760
|
|
That should be good.
|
|
|
|
00:41.760 --> 00:45.600
|
|
Just give me a sound check if you're in the chat already, please, for this one.
|
|
|
|
00:45.600 --> 00:49.040
|
|
Just to make sure everybody's lined up, it should be okay, I just started it.
|
|
|
|
00:49.040 --> 00:52.480
|
|
Thank you there, sir.
|
|
|
|
00:52.480 --> 00:56.800
|
|
And yes, we are speaking to William M. Briggs today.
|
|
|
|
00:56.800 --> 01:01.840
|
|
He is a member of this broken science initiative, it's my first BSI guest.
|
|
|
|
01:01.840 --> 01:08.360
|
|
If you didn't follow my directive yesterday and go and overload the website of brokenscience.org
|
|
|
|
01:08.360 --> 01:14.000
|
|
with your signups, please do it today so that Emily and Greg get a little blip in their
|
|
|
|
01:14.000 --> 01:18.640
|
|
website traffic and they see when we mention us, it would be really great to show them
|
|
|
|
01:18.640 --> 01:30.800
|
|
the power of Giga Ohm Biological.
|
|
|
|
01:30.800 --> 01:41.840
|
|
Matt, you may recognize Rodney Mullen here in this little spin.
|
|
|
|
01:41.840 --> 01:44.400
|
|
Oh, yeah.
|
|
|
|
01:44.400 --> 01:45.960
|
|
Guys, it's sweetheart.
|
|
|
|
01:45.960 --> 01:46.960
|
|
Oh, my goodness.
|
|
|
|
01:46.960 --> 01:51.920
|
|
Did you know that he got surgery on his spine and now he can stand up straight like skating
|
|
|
|
01:51.920 --> 01:58.040
|
|
again like differently and he's got to relearn everything now because his body moves again.
|
|
|
|
01:58.040 --> 01:59.040
|
|
It's crazy.
|
|
|
|
01:59.040 --> 02:01.040
|
|
I'll be thought, no, I have no idea.
|
|
|
|
02:01.040 --> 02:04.240
|
|
Apparently he's got like three more inches on him though.
|
|
|
|
02:04.240 --> 02:11.800
|
|
I'm afraid that the latest data tells us that we're dealing with essentially...
|
|
|
|
02:11.800 --> 02:16.760
|
|
I want you to go meet my friend Mark on this show that's now on the screen after this.
|
|
|
|
02:16.760 --> 02:24.000
|
|
I think he's a really good guy and you guys would be good acquaintances, I think.
|
|
|
|
02:24.000 --> 02:25.000
|
|
Sure.
|
|
|
|
02:25.000 --> 02:28.000
|
|
And he's got readers that watch him, you know, so you're a book guy.
|
|
|
|
02:28.000 --> 02:33.240
|
|
I mean, he's got readers.
|
|
|
|
02:33.240 --> 02:38.680
|
|
Not that my viewers don't read, I'm not saying that, but Mark has got some really studious
|
|
|
|
02:38.680 --> 02:39.680
|
|
followers.
|
|
|
|
02:39.680 --> 02:42.320
|
|
I really like Mark.
|
|
|
|
02:42.320 --> 02:44.480
|
|
I think truth is good for kids.
|
|
|
|
02:44.480 --> 02:48.720
|
|
We're so busy lying, we don't even recognize the truth no more in society.
|
|
|
|
02:48.720 --> 02:50.920
|
|
We want everybody to feel good.
|
|
|
|
02:50.920 --> 02:55.240
|
|
That's not the way life is.
|
|
|
|
02:55.240 --> 03:00.960
|
|
But you can tell if someone's lying, you know, you can sort of feel it in people.
|
|
|
|
03:00.960 --> 03:01.960
|
|
And I have lied.
|
|
|
|
03:01.960 --> 03:02.960
|
|
I'm sure I'll lie again.
|
|
|
|
03:02.960 --> 03:05.520
|
|
I don't want to lie, you know, and I don't think I'm a liar.
|
|
|
|
03:05.520 --> 03:06.600
|
|
I try not to be a liar.
|
|
|
|
03:06.600 --> 03:08.000
|
|
I don't want to be a liar.
|
|
|
|
03:08.000 --> 03:14.320
|
|
I think it's like really important not to be a liar.
|
|
|
|
03:14.320 --> 03:30.400
|
|
Ah, ladies and gentlemen, I'm so excited to be here today and have this show running
|
|
|
|
03:30.400 --> 03:37.680
|
|
in the afternoon, working in sync with my family, and it just feels really good to start
|
|
|
|
03:37.680 --> 03:44.280
|
|
the stream in the afternoon and so I'm really happy that we've got Matt on the show.
|
|
|
|
03:44.280 --> 03:45.280
|
|
Just a few more seconds.
|
|
|
|
03:45.280 --> 03:50.720
|
|
I got to get this, the recording in the background running, so I'm going to keep this intro going
|
|
|
|
03:50.720 --> 03:51.720
|
|
as usual.
|
|
|
|
03:51.720 --> 04:01.160
|
|
It's like one more minute or so, I think I'm in it 20 seconds.
|
|
|
|
04:01.160 --> 04:09.160
|
|
If you are joining us for the first time, this is Giga Ohm Biological.
|
|
|
|
04:09.160 --> 04:14.520
|
|
It's a high-resistance low-noise information brief brought to you by a biologist.
|
|
|
|
04:14.520 --> 04:20.000
|
|
We are based in Pittsburgh, Pennsylvania in the back of a rented garage and yeah, that's
|
|
|
|
04:20.000 --> 04:24.120
|
|
the state of knowledge and information in our world.
|
|
|
|
04:24.120 --> 04:27.440
|
|
We're basically working to dispel an enchantment.
|
|
|
|
04:27.440 --> 04:33.880
|
|
It is an enchantment that has been laid down a long time ago and laid down deep and thick.
|
|
|
|
04:33.880 --> 04:38.320
|
|
If you have been following me for a while, you're here at the top of the wave where we
|
|
|
|
04:38.320 --> 04:41.120
|
|
are staying focused on the biology.
|
|
|
|
04:41.120 --> 04:45.680
|
|
We don't take the bait on TV and social media and we love our neighbors.
|
|
|
|
04:45.680 --> 04:52.320
|
|
The way it works is that people share this work and some people even financially contribute
|
|
|
|
04:52.320 --> 04:56.400
|
|
and slowly biology finds new people and more people wake up.
|
|
|
|
04:56.400 --> 04:57.400
|
|
That's my plan anyway.
|
|
|
|
04:57.400 --> 05:03.440
|
|
You can find me at GigaOhm Biological.com and you should be watching at stream.gigohm.vial
|
|
|
|
05:03.440 --> 05:09.000
|
|
if you're not already, which is the new place that I'm trying to store all these videos.
|
|
|
|
05:09.000 --> 05:16.120
|
|
I think it's working out really well thanks to our supporter and follower Ted on the other
|
|
|
|
05:16.120 --> 05:17.280
|
|
side of the planet.
|
|
|
|
05:17.280 --> 05:20.440
|
|
Mike Vandenberg, mixed date.
|
|
|
|
05:20.440 --> 05:24.960
|
|
These are the people that make Giga Ohm Biological possible including all of the subscribers
|
|
|
|
05:24.960 --> 05:31.440
|
|
that are giving some small or large amount monthly or even per year.
|
|
|
|
05:31.440 --> 05:36.720
|
|
There are also Substack subscribers that have recently joined the fight and it's been
|
|
|
|
05:36.720 --> 05:42.760
|
|
really a privilege to see how far we can get with this.
|
|
|
|
05:42.760 --> 05:48.600
|
|
I hope that we can continue to bring this biology to the masses.
|
|
|
|
05:48.600 --> 05:50.120
|
|
This is the independent bright web.
|
|
|
|
05:50.120 --> 05:55.960
|
|
It's the opposite of the intellectual dark web which is this loosely organized group
|
|
|
|
05:55.960 --> 06:01.560
|
|
of influencers that was put in place before the pandemic to control the narrative, if
|
|
|
|
06:01.560 --> 06:07.440
|
|
you will, to set this limited spectrum of debate that we are all trapped within.
|
|
|
|
06:07.440 --> 06:11.280
|
|
But it's actually an illusion that is sustained only through your active participation.
|
|
|
|
06:11.280 --> 06:14.840
|
|
You can drop your own hand from your own eyes and escape.
|
|
|
|
06:14.840 --> 06:16.800
|
|
It is just a question of non-compliance.
|
|
|
|
06:16.800 --> 06:20.640
|
|
You've got to actually accept that you've been lied to.
|
|
|
|
06:20.640 --> 06:26.400
|
|
And I think one of the most exciting things about broken science is that it really, although
|
|
|
|
06:26.400 --> 06:33.600
|
|
to steal a phrase from Greg, it may not be for everybody, but it is for anybody to understand
|
|
|
|
06:33.600 --> 06:42.200
|
|
how science has been distorted, has been essentially co-opted into something that no longer is
|
|
|
|
06:42.200 --> 06:50.400
|
|
a real pursuit of knowledge, but it's instead a pursuit of noise or the creation of noise
|
|
|
|
06:50.400 --> 06:58.640
|
|
and I hope that somebody as sophisticated as William M. Briggs, unlike me, is going to
|
|
|
|
06:58.640 --> 07:05.080
|
|
be able to help us get through this because the distortion of science and the pursuit
|
|
|
|
07:05.080 --> 07:10.160
|
|
of knowledge has been an integral part of the conscious and intelligent manipulation
|
|
|
|
07:10.160 --> 07:14.840
|
|
of the organized habits and opinions of the masses, and it is an important element of
|
|
|
|
07:14.840 --> 07:15.840
|
|
how we are governed.
|
|
|
|
07:15.840 --> 07:22.760
|
|
In fact, I would argue it is a primary way by which we are governed, and as a recovering
|
|
|
|
07:22.760 --> 07:30.660
|
|
academic biologist, I really feel as though I have a unique, sad insight into how distorted
|
|
|
|
07:30.660 --> 07:38.560
|
|
one's mind can become if you pursue questions based on this reiterative, Popperian questioning
|
|
|
|
07:38.560 --> 07:44.760
|
|
mechanism, and don't realize how easily that can spin out of control and stop producing
|
|
|
|
07:44.760 --> 07:49.040
|
|
knowledge and instead produce noise.
|
|
|
|
07:49.040 --> 07:52.840
|
|
I believe that's how the intellectual dark web works, and I believe that's how they've
|
|
|
|
07:52.840 --> 07:58.800
|
|
essentially removed our ability to exercise informed consent with almost anything with
|
|
|
|
07:58.800 --> 08:03.600
|
|
regard to socioeconomic status and also with regard to public health, with regard to the
|
|
|
|
08:03.600 --> 08:04.840
|
|
education of our children.
|
|
|
|
08:04.840 --> 08:11.280
|
|
We're just so clueless that we just don't have the ability to exercise informed consent
|
|
|
|
08:11.280 --> 08:12.280
|
|
anymore.
|
|
|
|
08:12.280 --> 08:20.400
|
|
And one of the primary bats with barbed wire wrapped around it is this use of academic
|
|
|
|
08:20.400 --> 08:26.160
|
|
science to create the illusion of knowledge that can then be used against us in a technocratic
|
|
|
|
08:26.160 --> 08:27.160
|
|
sort of way.
|
|
|
|
08:27.160 --> 08:30.240
|
|
Oh my gosh, I'm still talking.
|
|
|
|
08:30.240 --> 08:31.240
|
|
Let me see.
|
|
|
|
08:31.240 --> 08:32.440
|
|
I got him over here.
|
|
|
|
08:32.440 --> 08:33.440
|
|
Yes.
|
|
|
|
08:33.440 --> 08:34.440
|
|
Good.
|
|
|
|
08:34.440 --> 08:38.120
|
|
I'm going to put this down and I'm going to shut this music off and I'm going to say hello
|
|
|
|
08:38.120 --> 08:46.040
|
|
to my good friend, William M. Briggs, actually Matt and I met on a beach and it was very
|
|
|
|
08:46.040 --> 08:47.040
|
|
romantic.
|
|
|
|
08:47.040 --> 08:54.200
|
|
Matt was overdressed for the beach and I was flying my kite and I don't know.
|
|
|
|
08:54.200 --> 08:57.320
|
|
We later met at another meeting and so it was wonderful.
|
|
|
|
08:57.320 --> 09:00.280
|
|
We've been friends since.
|
|
|
|
09:00.280 --> 09:01.280
|
|
I don't know.
|
|
|
|
09:01.280 --> 09:02.280
|
|
Go ahead and introduce yourself.
|
|
|
|
09:02.280 --> 09:08.080
|
|
Tell us what you're up to and then we'll just, I don't know, randomly discuss our clean
|
|
|
|
09:08.080 --> 09:09.080
|
|
eyes.
|
|
|
|
09:09.080 --> 09:13.400
|
|
You know, Greg Glassman and Emily Kaplan, like you said, the broken science thing.
|
|
|
|
09:13.400 --> 09:19.680
|
|
They found me and I've been delighted to be able to have found them in return and people
|
|
|
|
09:19.680 --> 09:21.600
|
|
like yourself.
|
|
|
|
09:21.600 --> 09:26.240
|
|
There's a core group of people who understand what's going on and what's going wrong with
|
|
|
|
09:26.240 --> 09:28.080
|
|
science, I should say.
|
|
|
|
09:28.080 --> 09:32.960
|
|
It's not everybody like you say, like Greg says, but it is, it is some people anyway
|
|
|
|
09:32.960 --> 09:41.400
|
|
and we, we are trying to gather together and at least speak to those willing to listen
|
|
|
|
09:41.400 --> 09:45.520
|
|
about what we think is wrong with science and how we could go about fixing it.
|
|
|
|
09:45.520 --> 09:50.800
|
|
And my background is like yourself, recovering academic, you know, I started off in academia.
|
|
|
|
09:50.800 --> 09:59.440
|
|
I was in the, I was in the Department of Medicine at Cornell University, you know, the med school.
|
|
|
|
09:59.440 --> 10:02.000
|
|
I was a statistician there, bio statistician there.
|
|
|
|
10:02.000 --> 10:05.880
|
|
I'm sorry, but my favorite part is the fact that you were a meteorologist because when
|
|
|
|
10:05.880 --> 10:10.640
|
|
I was a child, when I was a child, I thought that the weather was also very fascinating.
|
|
|
|
10:10.640 --> 10:13.640
|
|
And I sat outside in, in thunderstorms and stuff.
|
|
|
|
10:13.640 --> 10:19.560
|
|
I thought that tornado warnings were kind of cool and my parents never understood it.
|
|
|
|
10:19.560 --> 10:22.040
|
|
But I just, I was never afraid of these things.
|
|
|
|
10:22.040 --> 10:25.360
|
|
I just thought they were interesting.
|
|
|
|
10:25.360 --> 10:31.640
|
|
The reason I got involved, I was in the Air Force a long time ago doing cryptography.
|
|
|
|
10:31.680 --> 10:37.360
|
|
And I was wondering, should I go get out of the Air Force and do something more useful
|
|
|
|
10:37.360 --> 10:38.920
|
|
with my life?
|
|
|
|
10:38.920 --> 10:44.760
|
|
And at that time, this was in the late 80s, early 90s, global warming as it was called
|
|
|
|
10:44.760 --> 10:46.360
|
|
then was the thing.
|
|
|
|
10:46.360 --> 10:48.400
|
|
And I thought, hey, this is, this is interesting.
|
|
|
|
10:48.400 --> 10:53.280
|
|
And I sort of, and I wrote some of the people in the original IPCC, the Intergovernmental
|
|
|
|
10:53.280 --> 10:58.360
|
|
Panel on Climate Change, some of the scientists involved in that.
|
|
|
|
10:58.360 --> 11:03.360
|
|
And I was just this nobody sergeant at Kadena Air Force Base in Okinawa.
|
|
|
|
11:03.360 --> 11:06.600
|
|
I said, hey, can I read this, this in that paper?
|
|
|
|
11:06.600 --> 11:12.920
|
|
And this one gentleman sent it to me and to my surprise, I could read it.
|
|
|
|
11:12.920 --> 11:15.920
|
|
So I thought, you know what, maybe I better get out and do something about this.
|
|
|
|
11:15.920 --> 11:16.920
|
|
It seems so important.
|
|
|
|
11:16.920 --> 11:18.480
|
|
And I, I bought all this stuff.
|
|
|
|
11:18.480 --> 11:24.560
|
|
I mean, I thought it was just as dire and, and calamitous as was being predicted and
|
|
|
|
11:24.560 --> 11:27.560
|
|
so forth because I was young and naive.
|
|
|
|
11:27.600 --> 11:33.920
|
|
But I, I progressed through my studies and my specialty actually, I got my Ph.D.
|
|
|
|
11:33.920 --> 11:36.120
|
|
and not, not in atmospheric physics.
|
|
|
|
11:36.120 --> 11:41.120
|
|
I got, that was my masters, but I got, I did mathematical statistics.
|
|
|
|
11:41.120 --> 11:44.960
|
|
And the reason is, I was interested in models.
|
|
|
|
11:44.960 --> 11:45.960
|
|
What makes a good model?
|
|
|
|
11:45.960 --> 11:50.920
|
|
What makes a good climate model, medical model, biological model, any of these kind of models?
|
|
|
|
11:50.920 --> 11:53.280
|
|
And how do you tell if it's good or bad?
|
|
|
|
11:53.280 --> 11:57.400
|
|
And that involves probability and uncertainty and all this kind of a thing.
|
|
|
|
11:57.400 --> 12:03.480
|
|
So that's how I slowly became a skeptic, the more I learned about how uncertainty really
|
|
|
|
12:03.480 --> 12:12.520
|
|
works, and about how overcertain everything that they're pushing at us is, and not everything
|
|
|
|
12:12.520 --> 12:17.040
|
|
I should say, you know, you know, we have a pretty good understanding of the way capacitors
|
|
|
|
12:17.040 --> 12:20.120
|
|
work, for instance, and so forth.
|
|
|
|
12:20.120 --> 12:21.880
|
|
But these are not political things.
|
|
|
|
12:21.880 --> 12:28.160
|
|
It's all the stuff that turned political where the overcertainty is vast, as we've learned
|
|
|
|
12:28.160 --> 12:31.960
|
|
to our, you know, sadness this last four years.
|
|
|
|
12:31.960 --> 12:33.200
|
|
Right, right.
|
|
|
|
12:33.200 --> 12:37.680
|
|
I think overcertain is a, is a wonderful way to put it; it really overlaps with how I've been saying
|
|
|
|
12:37.680 --> 12:42.640
|
|
it is that you can almost, you should immediately be skeptical of anyone who's certain and then
|
|
|
|
12:42.640 --> 12:48.920
|
|
even more if they're certain about a very simple explanation for a complex thing.
|
|
|
|
12:48.920 --> 12:54.040
|
|
And oftentimes I've started with that basis of interpreting the pandemic, for example,
|
|
|
|
12:54.040 --> 13:00.760
|
|
is it just can't be this simple that somebody spilled something in Wuhan and now it's everywhere.
|
|
|
|
13:00.760 --> 13:10.640
|
|
And so anyway, what do you think from the perspective of the broad picture, how can we,
|
|
|
|
13:10.640 --> 13:14.160
|
|
in your experience, you know, you're selling, you're selling a couple books, which I just
|
|
|
|
13:14.160 --> 13:16.400
|
|
put up before it, I don't mean it to say it in a bad way.
|
|
|
|
13:16.400 --> 13:22.760
|
|
I mean it to say it in a good way that, that he is capable of putting down his thoughts
|
|
|
|
13:22.760 --> 13:25.480
|
|
in a way that I can't and has done it several times.
|
|
|
|
13:25.480 --> 13:30.240
|
|
I actually, I don't, I, somewhere I can't find the Everything You Believe Is Wrong book.
|
|
|
|
13:30.240 --> 13:35.360
|
|
I know I have it, but for some reason I couldn't find it to put on the screen, but I do have
|
|
|
|
13:35.360 --> 13:36.720
|
|
the book on uncertainty.
|
|
|
|
13:36.720 --> 13:43.000
|
|
And I think because that's the one I have, that's the only one I've read.
|
|
|
|
13:43.000 --> 13:45.440
|
|
That's this one here.
|
|
|
|
13:45.440 --> 13:53.840
|
|
How do we get people interested in this philosophical level of science in a way that, you know,
|
|
|
|
13:53.840 --> 14:01.920
|
|
most people won't be, you know, I mean, they'll be interested in the more practical matters
|
|
|
|
14:01.920 --> 14:06.080
|
|
of here's what these people are trying to sell you and here's why it's wrong.
|
|
|
|
14:06.080 --> 14:10.640
|
|
I mean, we could clearly see that they're trying to manipulate you into taking certain
|
|
|
|
14:10.640 --> 14:15.560
|
|
medications and so forth or they're trying to, they're trying to tell you a certain
|
|
|
|
14:15.560 --> 14:16.560
|
|
thing.
|
|
|
|
14:16.560 --> 14:20.160
|
|
Tomorrow I have a thing on gas stoves, now they're going after gas stoves even as they
|
|
|
|
14:20.160 --> 14:21.960
|
|
tell you they're not going to.
|
|
|
|
14:21.960 --> 14:25.080
|
|
They want to tell you that they're bad for the, the environment or bad for your
|
|
|
|
14:25.080 --> 14:26.760
|
|
health and that kind of stuff.
|
|
|
|
14:26.760 --> 14:31.920
|
|
So we could show that those kinds of claims are wrong and that's enough for most people
|
|
|
|
14:31.920 --> 14:33.320
|
|
and that's fine.
|
|
|
|
14:33.320 --> 14:38.960
|
|
But if you want to know why those things are wrong, how it is that the scientists themselves
|
|
|
|
14:39.960 --> 14:46.880
|
|
beyond simple matters like fraud and lying and so forth, which we see a lot from top experts
|
|
|
|
14:46.880 --> 14:51.720
|
|
and rulers and the like, but most scientists are sincere in their beliefs.
|
|
|
|
14:51.720 --> 14:57.040
|
|
They're just vastly over certain or they're wrong, but they don't know how they're being
|
|
|
|
14:57.040 --> 14:58.200
|
|
wrong.
|
|
|
|
14:58.200 --> 15:01.720
|
|
And that's the level at which we need to fix science.
|
|
|
|
15:01.720 --> 15:06.160
|
|
Now a lot of people can just, you know, follow along, they can gain entry to, you know, podcasts
|
|
|
|
15:06.160 --> 15:07.960
|
|
like yours and so forth.
|
|
|
|
15:07.960 --> 15:08.960
|
|
The woman found it.
|
|
|
|
15:08.960 --> 15:10.760
|
|
Others that show what things are doing wrong.
|
|
|
|
15:10.760 --> 15:11.760
|
|
That's the other one.
|
|
|
|
15:11.760 --> 15:13.760
|
|
The woman found it.
|
|
|
|
15:13.760 --> 15:14.760
|
|
Thank you.
|
|
|
|
15:14.760 --> 15:19.760
|
|
And then we can get people to sort of grasp these more fundamental things, but they are
|
|
|
|
15:19.760 --> 15:21.160
|
|
more difficult.
|
|
|
|
15:21.160 --> 15:25.280
|
|
You know, I mean, it's, it's, you were there, we were all there last week or a week and
|
|
|
|
15:25.280 --> 15:26.280
|
|
a half ago.
|
|
|
|
15:26.280 --> 15:31.680
|
|
I guess we were out in back in Arizona for another epistemological boot camp or I guess
|
|
|
|
15:31.680 --> 15:36.200
|
|
we called it and they were difficult subjects being broached.
|
|
|
|
15:36.200 --> 15:41.200
|
|
You know, what, what is the nature of probability if we're speaking of uncertainty, we must
|
|
|
|
15:41.200 --> 15:42.200
|
|
use probability.
|
|
|
|
15:42.200 --> 15:46.360
|
|
How do we, how do we then understand what probability is and so forth and how does that
|
|
|
|
15:46.360 --> 15:51.040
|
|
relate across a broad range of sciences, everything from, you know, medicine biology all the way
|
|
|
|
15:51.040 --> 15:56.920
|
|
to astrophysics, fundamental quantum probability and all those kind of a thing.
|
|
|
|
15:56.920 --> 15:58.200
|
|
So it's difficult.
|
|
|
|
15:58.200 --> 16:02.440
|
|
We're not going to get everybody along and that kind of a thing, but we can get people
|
|
|
|
16:02.440 --> 16:09.280
|
|
to understand that because people are using these procedures that they think is guaranteeing
|
|
|
|
16:09.280 --> 16:16.760
|
|
them certainty or not, we could just show that they're wrong on that kind of a thing.
|
|
|
|
16:16.760 --> 16:22.560
|
|
Whether they delve deeper into the mechanics of why exactly it's wrong, that's, that's,
|
|
|
|
16:22.560 --> 16:25.280
|
|
that's going to be only for a fraction of people, I think.
|
|
|
|
16:25.280 --> 16:31.920
|
|
Do you think that, um, one of the things that struck me years ago already was that, um,
|
|
|
|
16:31.920 --> 16:36.560
|
|
in neuroscience, there was this time, I, I, I've been in neuroscience, I was in neuroscience
|
|
|
|
16:36.560 --> 16:38.520
|
|
for around 18 years.
|
|
|
|
16:38.520 --> 16:46.120
|
|
And when I started in neuroscience, I noticed that there was this momentum in the acquisition
|
|
|
|
16:46.120 --> 16:47.480
|
|
of physicists.
|
|
|
|
16:47.480 --> 16:52.680
|
|
And the reason why was because physicists could make simple models and models were what
|
|
|
|
16:52.680 --> 17:00.520
|
|
every, every, uh, every biology journal wanted you to have a model with a little neuronal
|
|
|
|
17:00.520 --> 17:05.680
|
|
network that showed how your data was relevant to brain function.
|
|
|
|
17:05.680 --> 17:12.720
|
|
And because these physicists had this toolbox, you could just bring them in to a, to a department
|
|
|
|
17:12.720 --> 17:18.560
|
|
and they could be on every paper that needed a model and every paper needs a model in this,
|
|
|
|
17:18.560 --> 17:21.040
|
|
in this, in this kind of scheme.
|
|
|
|
17:21.040 --> 17:25.120
|
|
And I was struck by how, what are they doing like transport models and the like that kind
|
|
|
|
17:25.120 --> 17:26.120
|
|
of thing.
|
|
|
|
17:26.120 --> 17:31.360
|
|
So just simple network models that show that, you know, something, some connection could
|
|
|
|
17:31.360 --> 17:32.360
|
|
be relevant.
|
|
|
|
17:32.360 --> 17:40.000
|
|
Um, it, to me, it's really just a question of again, how are we using them and are we
|
|
|
|
17:40.000 --> 17:45.440
|
|
using them to advance our understanding or are we using them to, to, I don't know,
|
|
|
|
17:45.440 --> 17:50.800
|
|
obfuscate obfuscate or, or, or, or to add complexity for the sake of, for the sake of
|
|
|
|
17:50.800 --> 17:53.080
|
|
complexity, which turns it into science.
|
|
|
|
17:53.080 --> 17:54.080
|
|
It's funny.
|
|
|
|
17:54.080 --> 17:58.080
|
|
You mentioned it's just a week ago or two weeks ago that somebody had a paper out that
|
|
|
|
17:58.080 --> 18:00.440
|
|
looked at literature papers of all things.
|
|
|
|
18:00.440 --> 18:02.440
|
|
And I guess these are spoof papers.
|
|
|
|
18:02.440 --> 18:06.840
|
|
They had regular papers and then they had the same papers, but they had equations in them.
|
|
|
|
18:06.840 --> 18:10.960
|
|
And the ones that had equations in them just for no reason were the ones that were viewed
|
|
|
|
18:10.960 --> 18:12.400
|
|
as a much more scientific.
|
|
|
|
18:12.400 --> 18:13.400
|
|
Oh, yeah.
|
|
|
|
18:13.400 --> 18:14.400
|
|
Wow.
|
|
|
|
18:14.400 --> 18:15.400
|
|
That, that's the level of stuff.
|
|
|
|
18:15.400 --> 18:16.400
|
|
I mean, it's easy to create a lot.
|
|
|
|
18:16.400 --> 18:20.800
|
|
I mean, not, not everybody can do it, but once you, once you learn how it's, it's trivial
|
|
|
|
18:20.800 --> 18:23.000
|
|
to create models.
|
|
|
|
18:23.000 --> 18:24.640
|
|
That's one of the problems.
|
|
|
|
18:24.640 --> 18:25.640
|
|
It's, it's too easy.
|
|
|
|
18:25.640 --> 18:26.640
|
|
Science is too easy.
|
|
|
|
18:26.640 --> 18:33.200
|
|
It's, it's difficult to understand that maybe, but because science is so easy to do now in
|
|
|
|
18:33.200 --> 18:36.200
|
|
a lot.
|
|
|
|
18:36.200 --> 18:42.600
|
|
Hi, what happened there?
|
|
|
|
18:42.600 --> 18:46.680
|
|
His garage is like my garage?
|
|
|
|
18:46.680 --> 18:47.680
|
|
Come on back now.
|
|
|
|
18:47.680 --> 18:49.680
|
|
You hear?
|
|
|
|
18:49.680 --> 18:50.680
|
|
I don't know.
|
|
|
|
18:50.680 --> 19:02.440
|
|
Maybe you should snap to, that might help. Oh, he'll call back.
|
|
|
|
19:02.440 --> 19:08.760
|
|
I want very much to see if there's too many people doing it and numbers of people just
|
|
|
|
19:08.760 --> 19:12.360
|
|
because there's going to be a lot more error, but it's something, something.
|
|
|
|
19:12.360 --> 19:13.360
|
|
Here you go.
|
|
|
|
19:13.360 --> 19:14.360
|
|
There we go.
|
|
|
|
19:14.360 --> 19:15.360
|
|
Yeah.
|
|
|
|
19:15.360 --> 19:21.160
|
|
It's probably me, the, the camera, it doesn't, it does a visual acuity check on me and sort
|
|
|
|
19:21.160 --> 19:22.160
|
|
of balks.
|
|
|
|
19:22.160 --> 19:23.160
|
|
Yeah, it's all good.
|
|
|
|
19:23.160 --> 19:26.520
|
|
So, we were talking about models and it just cut out for a couple of seconds.
|
|
|
|
19:26.520 --> 19:28.280
|
|
So it's okay.
|
|
|
|
19:28.280 --> 19:30.040
|
|
What is different?
|
|
|
|
19:30.040 --> 19:34.360
|
|
I mean, for the, for example, you were in meteorology, they could only get away with,
|
|
|
|
19:34.360 --> 19:36.560
|
|
they can't get away with crappy models, right?
|
|
|
|
19:36.560 --> 19:40.480
|
|
If they use a crappy model, then they're, they're prediction for even tomorrow would
|
|
|
|
19:40.480 --> 19:41.480
|
|
be off, right?
|
|
|
|
19:41.480 --> 19:46.040
|
|
But to a certain, this is, this is, this is a, you bring up a good point, an excellent
|
|
|
|
19:46.040 --> 19:49.440
|
|
point, as a matter of fact, because it's one of the few fields.
|
|
|
|
19:49.440 --> 19:56.680
|
|
In fact, one of the very, very few fields that model goodness, the idea of what we call
|
|
|
|
19:56.680 --> 20:00.880
|
|
it, the technical term is verification, model verification.
|
|
|
|
20:00.880 --> 20:02.120
|
|
That's where it was really born.
|
|
|
|
20:02.120 --> 20:06.320
|
|
That's where all the statistics and the probability and the understanding of the philosophy of
|
|
|
|
20:06.320 --> 20:12.120
|
|
model goodness arose from meteorology, because it's for obvious reasons.
|
|
|
|
20:12.120 --> 20:16.220
|
|
They make weather forecasts, they come out at least twice a day, and they want to verify
|
|
|
|
20:16.220 --> 20:18.000
|
|
how good they're doing.
|
|
|
|
20:18.000 --> 20:26.280
|
|
And because they did, they developed all kinds of ideas, all kinds of measures, like numerical
|
|
|
|
20:26.280 --> 20:30.840
|
|
measures that show you how good the predictions are behaving and the like.
|
|
|
|
20:30.840 --> 20:34.440
|
|
And they use those to sort of feedback and improve the models.
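One concrete example of such a measure, not one named here, just a standard one from forecast verification, is the Brier score; a minimal Python sketch with invented forecasts and outcomes:

```python
# A minimal sketch of the kind of verification score that grew out of weather
# forecasting: the Brier score for probability-of-rain forecasts, with "skill"
# judged against a naive reference that always forecasts the base rate.
# The forecasts and outcomes below are made-up numbers for illustration.

forecasts = [0.9, 0.8, 0.1, 0.3, 0.7, 0.2, 0.6, 0.1]   # forecast P(rain)
observed  = [1,   1,   0,   0,   1,   0,   0,   0]     # 1 = it rained

def brier(probs, outcomes):
    # Mean squared error between forecast probabilities and what actually happened.
    return sum((p - o) ** 2 for p, o in zip(probs, outcomes)) / len(probs)

base_rate = sum(observed) / len(observed)              # the "climatology" reference
bs_model = brier(forecasts, observed)
bs_ref = brier([base_rate] * len(observed), observed)

skill = 1 - bs_model / bs_ref   # positive means the forecasts beat the naive reference
print(f"Brier score {bs_model:.3f}, reference {bs_ref:.3f}, skill {skill:.2f}")
```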
|
|
|
|
20:34.440 --> 20:38.800
|
|
And they have meteorological models over the last 50 years or so, they've increased
|
|
|
|
20:38.800 --> 20:40.040
|
|
dramatically.
|
|
|
|
20:40.040 --> 20:46.120
|
|
They're now very accurate out to a short, short period of time, out to like four or five
|
|
|
|
20:46.120 --> 20:51.560
|
|
days, they're great, out to beyond that, they diminish in quality fairly rapidly.
|
|
|
|
20:51.560 --> 20:56.360
|
|
And you know, they, for technical reasons, you know, the atmosphere is just chaotic and
|
|
|
|
20:56.360 --> 21:00.280
|
|
it's, it's not cooperative at, at times past that.
|
|
|
|
21:00.320 --> 21:05.240
|
|
And let me ask you this, just in this context: if they had more measuring devices, would these
|
|
|
|
21:05.240 --> 21:09.320
|
|
models become better, or are the models already at their peak, sort of, in that sense?
|
|
|
|
21:09.320 --> 21:10.320
|
|
No.
|
|
|
|
21:10.320 --> 21:11.960
|
|
So not really.
|
|
|
|
21:11.960 --> 21:12.960
|
|
Okay.
|
|
|
|
21:12.960 --> 21:18.520
|
|
Part of the problem, they have, we have satellite data and they can sort of make satellite data
|
|
|
|
21:18.520 --> 21:22.360
|
|
for a lot more points and so forth.
|
|
|
|
21:22.360 --> 21:27.120
|
|
But their computational power becomes one of the, one of the bottlenecks.
|
|
|
|
21:27.120 --> 21:29.360
|
|
The other is that the atmosphere is chaotic.
|
|
|
|
21:29.360 --> 21:30.360
|
|
Ed Lorenz.
|
|
|
|
21:30.360 --> 21:34.440
|
|
He was one of the, a meteorologist at the time using early versions of computers.
|
|
|
|
21:34.440 --> 21:39.480
|
|
He was running a program, very simple dynamical model, very simple dynamical model.
|
|
|
|
21:39.480 --> 21:42.240
|
|
I think he coded it in Fortran.
|
|
|
|
21:42.240 --> 21:46.720
|
|
And he was running the thing and then he had to step away or something like this and he
|
|
|
|
21:46.720 --> 21:51.680
|
|
came back and somebody else had used the time on the computer.
|
|
|
|
21:51.680 --> 21:55.400
|
|
Back then you had to get time on a computer, you didn't have personal use of it.
|
|
|
|
21:55.400 --> 21:58.520
|
|
You had to sort of subscribe for time.
|
|
|
|
21:58.520 --> 22:01.680
|
|
And he wanted to restart the computer exactly where it left off.
|
|
|
|
22:01.680 --> 22:07.440
|
|
So he reprogrammed in the, you know, using punch cards, the constants that the model
|
|
|
|
22:07.440 --> 22:11.920
|
|
spit out at a certain point, but he didn't get them to the exact decimal level all the
|
|
|
|
22:11.920 --> 22:14.640
|
|
way out to however many digits the computer did.
|
|
|
|
22:14.640 --> 22:18.800
|
|
And he found to his surprise that it diverged wildly.
|
|
|
|
22:18.800 --> 22:25.680
|
|
And chaos as a science was born from his paper.
|
|
|
|
22:25.680 --> 22:27.120
|
|
And it's amazing.
|
|
|
|
22:27.600 --> 22:33.200
|
|
And it turns out that sensitivity to initial conditions is sort of the defining characteristic
|
|
|
|
22:33.200 --> 22:34.200
|
|
of chaos.
|
|
|
|
22:34.200 --> 22:41.640
|
|
In other words, you could have a perfectly defined rigorously analytic dynamical system
|
|
|
|
22:41.640 --> 22:46.440
|
|
that, because it starts off at a slightly different initial point, just the tiniest little
|
|
|
|
22:46.440 --> 22:54.320
|
|
difference that you would think couldn't matter, can diverge wildly from a model that starts off slightly
|
|
|
|
22:54.320 --> 22:55.720
|
|
different from it.
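A minimal sketch of that divergence, using the textbook Lorenz '63 parameters and a crude Euler stepper (illustrative only, not Lorenz's original program):

```python
# A rough sketch of sensitivity to initial conditions, using the classic
# Lorenz '63 equations (sigma=10, rho=28, beta=8/3) and a crude Euler stepper.
# The 1e-8 perturbation stands in for the rounded-off decimals in the story.

def lorenz_step(x, y, z, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

a = (1.0, 1.0, 1.0)          # one run
b = (1.0 + 1e-8, 1.0, 1.0)   # same run, restarted with one truncated constant

for step in range(1, 6001):
    a = lorenz_step(*a)
    b = lorenz_step(*b)
    if step % 1000 == 0:
        print(f"t = {step * 0.005:4.1f}   |x_a - x_b| = {abs(a[0] - b[0]):.6f}")
# The gap grows from about 1e-8 to the size of the attractor itself: the two
# runs diverge wildly even though the equations are perfectly deterministic.
```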
|
|
|
|
22:55.760 --> 22:59.240
|
|
So that's the way the atmosphere sort of behaves, people are discovering that you can
|
|
|
|
22:59.240 --> 23:03.840
|
|
kind of get away with that by making your predictions coarser.
|
|
|
|
23:03.840 --> 23:10.280
|
|
And only speaking of averages and the like and do a little bit better in that respect.
|
|
|
|
23:10.280 --> 23:14.840
|
|
But it's still meteorology that led the way.
|
|
|
|
23:14.840 --> 23:18.880
|
|
And that's how I learned of all this kind of stuff, because that's how I started doing
|
|
|
|
23:18.880 --> 23:22.600
|
|
and doing model goodness, how do you check?
|
|
|
|
23:22.640 --> 23:26.520
|
|
So you could apply this, these same tools to all kinds of models, like the ones being
|
|
|
|
23:26.520 --> 23:28.520
|
|
foisted upon us all the time.
|
|
|
|
23:28.520 --> 23:31.080
|
|
And it turns out a lot of the models just stink.
|
|
|
|
23:31.080 --> 23:32.080
|
|
They're terrible.
|
|
|
|
23:32.080 --> 23:33.080
|
|
They're never verified.
|
|
|
|
23:33.080 --> 23:39.000
|
|
In fact, the leading practice of most science, the leading practice of most science and most
|
|
|
|
23:39.000 --> 23:50.160
|
|
science papers is to fit a model to data, announce the quality of the fit and stop.
|
|
|
|
23:50.160 --> 23:51.160
|
|
That's it.
|
|
|
|
23:51.160 --> 23:53.560
|
|
And then they act as if they've found something.
|
|
|
|
23:53.560 --> 23:59.840
|
|
But as we already discussed, anybody could fit a model to data, anybody could fit any
|
|
|
|
23:59.840 --> 24:01.400
|
|
kind of model to any kind of data.
|
|
|
|
24:01.400 --> 24:04.280
|
|
It just doesn't take that much ingenuity anymore.
|
|
|
|
24:04.280 --> 24:10.920
|
|
And therefore, we're flooded with, you know, just acres and acres of this stuff of bad
|
|
|
|
24:10.920 --> 24:14.040
|
|
science of just good model fits.
|
|
|
|
24:14.040 --> 24:15.120
|
|
Well, so what?
|
|
|
|
24:15.120 --> 24:17.760
|
|
These models are never checked against reality after that.
|
|
|
|
24:17.760 --> 24:20.920
|
|
But everybody takes it as if it was.
|
|
|
|
24:20.920 --> 24:21.920
|
|
It's a huge problem.
|
|
|
|
24:21.920 --> 24:22.920
|
|
Right.
|
|
|
|
24:22.920 --> 24:26.800
|
|
And I think you could formulate that in the reverse and say that if your scientific paper
|
|
|
|
24:26.800 --> 24:29.800
|
|
doesn't start out with, my model predicts the following.
|
|
|
|
24:29.800 --> 24:36.400
|
|
So we checked or measured the following, then you're really not, you're really not verifying
|
|
|
|
24:36.400 --> 24:37.400
|
|
a model.
|
|
|
|
24:37.400 --> 24:42.600
|
|
You're actually, again, as you said, making observations and then fitting a model to it,
|
|
|
|
24:42.600 --> 24:43.920
|
|
which is big deal.
|
|
|
|
24:43.920 --> 24:46.520
|
|
That's a big difference.
|
|
|
|
24:46.520 --> 24:49.680
|
|
You can almost say anything you want if you use that strategy.
|
|
|
|
24:49.680 --> 24:56.280
|
|
But in the vast majority of papers, in biology and medicine and all of the soft sciences, almost
|
|
|
|
24:56.280 --> 24:58.120
|
|
all their models are like this.
|
|
|
|
24:58.120 --> 25:04.720
|
|
They just fit a model and they're all ad hoc models, for the most part, in the soft sciences.
|
|
|
|
25:04.720 --> 25:07.480
|
|
They fit the model to the data.
|
|
|
|
25:07.480 --> 25:13.600
|
|
They announced after they manipulate the data and the model that the model fits well.
|
|
|
|
25:13.600 --> 25:14.600
|
|
And that's it.
|
|
|
|
25:14.600 --> 25:18.520
|
|
And now we're supposed to take their word that they have found some causative agent in
|
|
|
|
25:18.520 --> 25:19.520
|
|
nature.
|
|
|
|
25:19.520 --> 25:21.960
|
|
And it's just not the case.
|
|
|
|
25:21.960 --> 25:24.240
|
|
They may have happened upon something.
|
|
|
|
25:24.240 --> 25:26.320
|
|
They may have gotten lucky.
|
|
|
|
25:26.320 --> 25:30.160
|
|
But it's not because of their procedures that they're correct.
|
|
|
|
25:30.160 --> 25:35.560
|
|
It's just by chance, by luck, that they're often right.
|
|
|
|
25:35.560 --> 25:41.720
|
|
Or by some intuition too, a lot of scientists are smart guys, but it doesn't mean that
|
|
|
|
25:41.720 --> 25:47.840
|
|
the procedures they're using are guaranteeing them the kind of veracity that they tell us
|
|
|
|
25:47.840 --> 25:49.240
|
|
that they have.
|
|
|
|
25:49.240 --> 25:53.280
|
|
That's why when people say, follow the science, no, it isn't like that.
|
|
|
|
25:53.280 --> 26:00.440
|
|
We should not follow the science until that science has been independently, by disinterested
|
|
|
|
26:00.440 --> 26:08.640
|
|
parties, checked for its accuracy, its predictive accuracy; otherwise we can never know.
|
|
|
|
26:08.640 --> 26:13.800
|
|
One of the things that I've been thinking about is how can, because broken science
|
|
|
|
26:13.800 --> 26:19.000
|
|
brings together this really weird group of people where there can be lots of owners
|
|
|
|
26:19.000 --> 26:24.200
|
|
of CrossFit gyms or athletes or martial artists or people interested in nutrition
|
|
|
|
26:24.200 --> 26:31.080
|
|
or training, and then also, you know, physicists and people like yourself with a million different
|
|
|
|
26:31.080 --> 26:33.040
|
|
things in their background.
|
|
|
|
26:33.040 --> 26:39.040
|
|
And what I'm wondering is a lot of times we get so deep into that discussion that we can't,
|
|
|
|
26:39.040 --> 26:45.240
|
|
we need, I love the list that Greg has made of all of the pseudo sciences.
|
|
|
|
26:45.240 --> 26:53.040
|
|
But I felt like the last meeting was missing a little session about concrete examples,
|
|
|
|
26:53.040 --> 26:54.800
|
|
which is what you said earlier.
|
|
|
|
26:54.800 --> 26:59.760
|
|
And I almost want to suggest that somebody should build a website of concrete examples
|
|
|
|
26:59.760 --> 27:03.800
|
|
of what the model is and how it's not ever tested.
|
|
|
|
27:03.800 --> 27:09.000
|
|
I mean, the ones that I've been plugging for a long time are the model of antibodies
|
|
|
|
27:09.000 --> 27:11.880
|
|
equals immunity, but they never really test that.
|
|
|
|
27:12.880 --> 27:17.160
|
|
And when they do, they don't get the answer that their model predicts they should get,
|
|
|
|
27:17.160 --> 27:19.080
|
|
but they don't care.
|
|
|
|
27:19.080 --> 27:24.640
|
|
And so I wonder if it wouldn't be useful for us to make a list like that.
|
|
|
|
27:24.640 --> 27:25.640
|
|
That was my first thing.
|
|
|
|
27:25.640 --> 27:28.120
|
|
And I also have a thing about funding, but what do you think about that?
|
|
|
|
27:28.120 --> 27:29.120
|
|
Is there a...
|
|
|
|
27:29.120 --> 27:33.000
|
|
I think it's a fantastic idea, of course, absolutely.
|
|
|
|
27:33.000 --> 27:39.880
|
|
I do these things all the time, you know, maybe once or twice a week, I publish a piece daily,
|
|
|
|
27:39.880 --> 27:45.680
|
|
but maybe once or twice a week it's looking at bad papers from all ranges of science
|
|
|
|
27:45.680 --> 27:47.440
|
|
that use these techniques.
|
|
|
|
27:47.440 --> 27:49.760
|
|
So yeah, it would be very worthwhile.
|
|
|
|
27:49.760 --> 27:50.760
|
|
That's a good project.
|
|
|
|
27:50.760 --> 27:57.880
|
|
I'm going to consider that and how to bring it about putting it on a more systematic basis.
|
|
|
|
27:57.880 --> 27:59.520
|
|
I think that's a fantastic idea.
|
|
|
|
27:59.520 --> 28:04.120
|
|
I mean, neuroscience is just littered with models of how the brain works that have like
|
|
|
|
28:04.120 --> 28:06.120
|
|
two variables.
|
|
|
|
28:06.120 --> 28:07.920
|
|
And it's extraordinary.
|
|
|
|
28:07.920 --> 28:11.760
|
|
I mean, and then they would take an fMRI measurement and look, there it is.
|
|
|
|
28:11.760 --> 28:17.840
|
|
We got some difference between these two relative signals and it looks a lot like our model.
|
|
|
|
28:17.840 --> 28:20.920
|
|
That's a perfect example of the type of thing where models...
|
|
|
|
28:20.920 --> 28:22.840
|
|
It's model upon model.
|
|
|
|
28:22.840 --> 28:23.840
|
|
The fMRI...
|
|
|
|
28:23.840 --> 28:27.320
|
|
You'd know this better than I do, but for your listeners probably know it better than
|
|
|
|
28:27.320 --> 28:28.320
|
|
I do too.
|
|
|
|
28:28.320 --> 28:29.320
|
|
But the fMRI...
|
|
|
|
28:29.320 --> 28:32.120
|
|
The output from that is already a model.
|
|
|
|
28:32.120 --> 28:33.200
|
|
It's already a model.
|
|
|
|
28:33.200 --> 28:36.640
|
|
It's not like they've gone into the brain and dissected, taking a look at this mirror
|
|
|
|
28:36.640 --> 28:39.560
|
|
and saying, oh, yeah, this is a bigger fashion over here.
|
|
|
|
28:39.560 --> 28:43.160
|
|
No, this thing is an inverse problem and it's already a model.
|
|
|
|
28:43.160 --> 28:46.720
|
|
So there's uncertainty already in the data that you have out of it.
|
|
|
|
28:46.720 --> 28:53.920
|
|
And then they use this uncertain data as if it were certain downstream and further models.
|
|
|
|
28:53.920 --> 28:58.560
|
|
And to some of the most preposterous things, I mean, Sam Harris was...
|
|
|
|
28:58.560 --> 29:01.600
|
|
He has one I picked apart a long time ago.
|
|
|
|
29:01.600 --> 29:03.240
|
|
Ridiculous things.
|
|
|
|
29:03.240 --> 29:07.520
|
|
People are more likely to believe in Santa Claus if the sublocutist area of the brain
|
|
|
|
29:07.520 --> 29:09.240
|
|
lights up in their effort.
|
|
|
|
29:09.240 --> 29:15.960
|
|
All kind of statistical nonsense that they try to pull off because it shows how well their
|
|
|
|
29:15.960 --> 29:19.720
|
|
model could fit, the model of the model.
|
|
|
|
29:19.720 --> 29:25.840
|
|
So yes, all these areas, they all suffer from the same problems of over-modeling.
|
|
|
|
29:25.840 --> 29:29.640
|
|
You do have to have models in science, but they're not verified.
|
|
|
|
29:29.640 --> 29:35.440
|
|
Another place that models probably play a more reasonable role in describing reality
|
|
|
|
29:35.440 --> 29:40.520
|
|
is in business where uncertainty equals risk.
|
|
|
|
29:40.520 --> 29:47.400
|
|
Is there any way we could use the idea that businesses don't use models that don't work?
|
|
|
|
29:47.400 --> 29:49.720
|
|
Insurance companies don't use models that don't work.
|
|
|
|
29:49.720 --> 29:55.160
|
|
So there have to be people that understand that model verification is at the heart of
|
|
|
|
29:55.160 --> 29:56.160
|
|
knowledge generation.
|
|
|
|
29:57.160 --> 29:58.160
|
|
Oh, yeah, sorry.
|
|
|
|
29:58.160 --> 29:59.160
|
|
Did you hear what I just said?
|
|
|
|
29:59.160 --> 30:01.640
|
|
I'm going to let that come back.
|
|
|
|
30:01.640 --> 30:09.320
|
|
Yeah, business models of verification and business models use uncertainty as risk.
|
|
|
|
30:09.320 --> 30:16.040
|
|
And so there's a lot of sort of feedback in business for their models in actuarial science
|
|
|
|
30:16.040 --> 30:17.040
|
|
and whatever.
|
|
|
|
30:17.040 --> 30:23.680
|
|
So there have to be people that from a professional perspective understand how if they do it like
|
|
|
|
30:23.680 --> 30:28.920
|
|
this at work, but academic science doesn't do it like that, then there ought to be lots of
|
|
|
|
30:28.920 --> 30:33.960
|
|
people that could be woken up very easily who just because of their professional background
|
|
|
|
30:33.960 --> 30:38.320
|
|
are, this will just snap into their head, don't you think or?
|
|
|
|
30:38.320 --> 30:42.720
|
|
I think you're absolutely right, actuaries are an excellent point because if the insurance
|
|
|
|
30:42.720 --> 30:49.160
|
|
companies didn't model accurately, they would stand to lose a lot of money.
|
|
|
|
30:49.160 --> 30:55.520
|
|
Of course, the other problem after that is not scientific, but sociological, the government
|
|
|
|
30:55.520 --> 31:00.240
|
|
comes in and begins to regulate them because their models are sometimes too accurate and
|
|
|
|
31:00.240 --> 31:05.760
|
|
they're not politically desirable accuracies, that's so that's a whole other problem.
|
|
|
|
31:05.760 --> 31:06.760
|
|
Oh, wow.
|
|
|
|
31:06.760 --> 31:08.360
|
|
I didn't know that that happened.
|
|
|
|
31:08.360 --> 31:09.360
|
|
Okay.
|
|
|
|
31:09.360 --> 31:13.600
|
|
Well, you know, you ever you know, we don't want to talk about all this idea, but you
|
|
|
|
31:13.600 --> 31:21.440
|
|
know, there's official, you know, there's some groups need more help from
|
|
|
|
31:21.440 --> 31:23.000
|
|
the government dollars.
|
|
|
|
31:23.000 --> 31:26.120
|
|
And so the government steps in and and and modifies it.
|
|
|
|
31:26.120 --> 31:30.440
|
|
Well, you know, you're not you're not allowed to have noticing of certain things.
|
|
|
|
31:30.440 --> 31:35.120
|
|
And so now I see what you mean, yeah, there is that, but but actuaries do do a good job
|
|
|
|
31:35.120 --> 31:38.520
|
|
of looking at how well their models portray.
|
|
|
|
31:38.520 --> 31:44.880
|
|
And because, you know, they're not the most complex models in the world, they're more
|
|
|
|
31:44.880 --> 31:49.040
|
|
or less just, you know, counting.
|
|
|
|
31:49.040 --> 31:50.040
|
|
Here's what happened.
|
|
|
|
31:50.040 --> 31:53.320
|
|
And here's what we can expect to happen again in the future, which is a very good way to
|
|
|
|
31:53.320 --> 31:54.320
|
|
do things.
|
|
|
|
31:54.320 --> 31:55.320
|
|
Right.
|
|
|
|
31:55.320 --> 31:56.320
|
|
Not that the math is simple.
|
|
|
|
31:56.320 --> 31:57.320
|
|
I don't mean that.
|
|
|
|
31:57.320 --> 32:03.640
|
|
But I mean, to say that as far as model complexity goes, it's not that not that difficult.
|
|
|
|
32:03.640 --> 32:10.080
|
|
Would you would you want to, um, for my viewers, just take a crack at a brief explanation
|
|
|
|
32:10.080 --> 32:18.920
|
|
of how the null hypothesis falsification thing isn't actually verifying models and,
|
|
|
|
32:18.920 --> 32:22.400
|
|
you know, pull an example out of your out of your brain at random?
|
|
|
|
32:22.400 --> 32:23.400
|
|
Is that a?
|
|
|
|
32:23.400 --> 32:26.280
|
|
Yeah, it's a it's a very good question.
|
|
|
|
32:26.280 --> 32:31.360
|
|
Now, the null hypothesis in statistics is kind of the straw man of statistics.
|
|
|
|
32:31.360 --> 32:34.640
|
|
And everybody knows the straw man is a sort of informal fallacy.
|
|
|
|
32:34.640 --> 32:38.640
|
|
It's it's set up to be able to knock down and knock down easily.
|
|
|
|
32:38.640 --> 32:45.280
|
|
Normally, what happens is you set up an ad hoc model, something like a regression, very common
|
|
|
|
32:45.280 --> 32:52.560
|
|
thing, and you relate some X to some Y, some, some measure to some observable thing that
|
|
|
|
32:52.560 --> 32:54.760
|
|
you're interested in.
|
|
|
|
32:55.760 --> 33:01.480
|
|
The null hypothesis is that whatever your X is has nothing to do with this Y.
|
|
|
|
33:01.480 --> 33:07.920
|
|
And you encapsulate that into a probability model, a parameterized probability model,
|
|
|
|
33:07.920 --> 33:12.960
|
|
and you set the value of that parameter at some level, typically zero, you say, if this
|
|
|
|
33:12.960 --> 33:19.440
|
|
parameter exactly equals zero precisely exactly equals zero, why that's my null hypothesis.
|
|
|
|
33:19.440 --> 33:26.240
|
|
And then you calculate all kinds of numbers, namely, like a P value, a bit of arcane magic
|
|
|
|
33:26.240 --> 33:30.560
|
|
that if it's small enough, you can go waving your wee P value in front of people saying,
|
|
|
|
33:30.560 --> 33:37.640
|
|
look how small my P value is, I have disproved my null hypothesis and therefore the opposite
|
|
|
|
33:37.640 --> 33:40.880
|
|
of the contrary of the null hypothesis must be true.
|
|
|
|
33:40.880 --> 33:47.920
|
|
If we've said this parameter is exactly precisely equal to zero, and the parameter is a very
|
|
|
|
33:47.920 --> 33:55.960
|
|
sort of measure of strength, therefore it's not zero, the null hypothesis has been shown
|
|
|
|
33:55.960 --> 34:02.000
|
|
to be false, and so not zero, therefore it must be important, this X must be important
|
|
|
|
34:02.000 --> 34:09.160
|
|
in describing this Y. Of course, it's not true. For one thing, you've never disproved
|
|
|
|
34:09.160 --> 34:13.280
|
|
the null hypothesis, you've just made an act of will, you said, you know what, I'm going
|
|
|
|
34:13.280 --> 34:19.560
|
|
to decide that this null hypothesis is not significant, and what does significant mean?
|
|
|
|
34:19.560 --> 34:23.640
|
|
Significant means the P value is small. And what does the P value being small mean? Well,
|
|
|
|
34:23.640 --> 34:30.040
|
|
it means significant. It's a circular definition. It's just an act of will to say, I think this
|
|
|
|
34:30.040 --> 34:36.760
|
|
correlation is causation. And that's it. I think this correlation that I have found
|
|
|
|
34:36.760 --> 34:42.480
|
|
is causation. And you have some mathematical apparatus that helps you in this ritual. One
|
|
|
|
34:42.480 --> 34:50.920
|
|
of the guys who was at the thing last week, Gerd Gigerenzer, it's his phrase, the ritual
|
|
|
|
34:50.920 --> 34:57.320
|
|
of statistics or ritualized statistics or something. Just following these things to sort of bless
|
|
|
|
34:57.320 --> 35:02.800
|
|
your results, you run these typical procedures over them, and if things work out, well then
|
|
|
|
35:02.800 --> 35:10.160
|
|
you claim that you have found something. So null hypotheses, we can eliminate them entirely.
|
|
|
|
35:10.160 --> 35:17.320
|
|
We don't need them in science at all. We can instead form our models in this predictive
|
|
|
|
35:17.320 --> 35:23.280
|
|
sense. If you say that X, knowing what X is, tells you something about Y. So one of my
|
|
|
|
35:23.280 --> 35:31.040
|
|
favorite examples is, I've used this example many times, attendance at a Fourth of July
|
|
|
|
35:31.040 --> 35:37.000
|
|
parade when you're young. That's our X. The attendance at a Fourth of July parade when
|
|
|
|
35:37.040 --> 35:41.960
|
|
you're young, predicts whether or not you're going to turn out to be a Republican. That's
|
|
|
|
35:41.960 --> 35:52.120
|
|
our Y. You're frozen again, just hold on one second. Darn it. He's got to fix that
|
|
|
|
35:52.120 --> 35:58.520
|
|
camera. At least he stopped at a place where I can accurately characterize where he needs
|
|
|
|
35:58.520 --> 36:04.320
|
|
hypothesis is catching up again, your camera parade attendance has. Yeah, sorry, your camera
|
|
|
|
36:04.320 --> 36:08.280
|
|
just cut you off. So I don't know what the deal is. It's a it's okay. You're at
|
|
|
|
36:08.280 --> 36:13.640
|
|
perfect. It's right at the part where you said, okay, parade attendance predicts whether
|
|
|
|
36:13.640 --> 36:17.800
|
|
or not you're going to be a Republican and then it stopped. So all right. So the null
|
|
|
|
36:17.800 --> 36:21.680
|
|
hypothesis is parade attendance has nothing to do with it. Whether you're going to be
|
|
|
|
36:21.680 --> 36:27.080
|
|
a Republican or Democrat or something like this. So you form this model, you put it
|
|
|
|
36:27.080 --> 36:31.520
|
|
in a regression, you get a wee P value, which is what these researchers from Harvard did.
|
|
|
|
36:31.520 --> 36:40.000
|
|
And they said, yes, attendance at a Fourth of July parade, when you're young, predicts
|
|
|
|
36:40.000 --> 36:44.360
|
|
whether or not you're going to be a Republican. And that's the that was what the headlines
|
|
|
|
36:44.360 --> 36:49.360
|
|
read. So, you know, these young patriotic little kids you're imagining are being radicalized
|
|
|
|
36:49.360 --> 36:55.640
|
|
into being far-right extremist flag wavers by attending a Fourth of July parade,
|
|
|
|
36:55.640 --> 37:01.560
|
|
but they cheated. They didn't use Fourth of July parade attendance. What they did was
|
|
|
|
37:01.560 --> 37:07.600
|
|
they looked at people's residences when they were young, the address that they thought
|
|
|
|
37:07.600 --> 37:12.360
|
|
they lived at when they were young. And then they went back into the meteorological records
|
|
|
|
37:12.360 --> 37:18.920
|
|
and looked whether or not it rained on the Fourth of July. And if there was any precipitation
|
|
|
|
37:18.920 --> 37:26.120
|
|
on the Fourth of July, they said there was no parade. And the child could not have gone.
|
|
|
|
37:26.120 --> 37:31.680
|
|
And if it did not rain, if it was any kind of cloudy or clear day, they said, well, there
|
|
|
|
37:31.680 --> 37:36.680
|
|
must have been a Fourth of July parade and there must have been the kid attending it.
|
|
|
|
37:36.680 --> 37:42.920
|
|
That's the level of science. It's absolutely absurd because I always use this joke, but
|
|
|
|
37:42.920 --> 37:47.840
|
|
in San Francisco, of course, it never rains on the Fourth of July, almost never. And so
|
|
|
|
37:47.840 --> 37:54.520
|
|
the city should be teeming with Republicans because they would all go to these parades
|
|
|
|
37:54.520 --> 37:59.720
|
|
and so forth. But because we have this null hypothesis that's so easy to swat down. I
|
|
|
|
37:59.720 --> 38:04.040
|
|
mean, if you're not getting, if you're not rejecting your null hypothesis, you're not
|
|
|
|
38:04.040 --> 38:09.880
|
|
trying hard enough. It's always possible to do, it just requires a little bit of ingenuity
|
|
|
|
38:09.880 --> 38:13.360
|
|
and you could do it. That's why we have so much bad science.
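As a toy illustration of how little ingenuity it takes, the sketch below screens pure-noise predictors against a pure-noise outcome and still finds "significant" results; the data and the normal approximation are assumptions made just for the example:

```python
# A toy version of "if you're not rejecting your null hypothesis, you're not
# trying hard enough": the outcome Y is pure noise, every candidate X is pure
# noise, yet screening enough X's still turns up "significant" wee P values.
# Standard library only; the normal approximation to the correlation test is
# just to keep the sketch self-contained.
import math
import random

random.seed(1)
n, candidates = 100, 200
y = [random.gauss(0, 1) for _ in range(n)]

def p_value(xs, ys):
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    sxy = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    sxx = sum((a - mx) ** 2 for a in xs)
    syy = sum((b - my) ** 2 for b in ys)
    r = sxy / math.sqrt(sxx * syy)
    z = r * math.sqrt(m - 2) / math.sqrt(1 - r * r)  # ~ standard normal under the null
    return math.erfc(abs(z) / math.sqrt(2))          # two-sided P value

hits = sum(
    1
    for _ in range(candidates)
    if p_value([random.gauss(0, 1) for _ in range(n)], y) < 0.05
)
print(f"{hits} of {candidates} pure-noise predictors came out 'significant'")
# Expect roughly 5 percent "discoveries" even though no X has anything to do with Y.
```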
|
|
|
|
38:13.720 --> 38:21.600
|
|
Yeah, wow, that's a really good example because there you're really actually testing for
|
|
|
|
38:21.600 --> 38:28.200
|
|
whether it even rained on the Fourth of July. It's not even the parade, right? So that's one thing. But
|
|
|
|
38:28.200 --> 38:33.760
|
|
I really feel like that's a very good analogy for how far we are from actual understanding
|
|
|
|
38:33.760 --> 38:37.640
|
|
of what happened during the pandemic. They want you to believe it's as simple as some
|
|
|
|
38:37.640 --> 38:43.000
|
|
RNA molecules spilled in a puddle in Wuhan. And now it's everywhere. But there's so much
|
|
|
|
38:43.000 --> 38:53.640
|
|
involved in that it's a massive pile of assumptions in the end. And I think it's useful to think
|
|
|
|
38:53.640 --> 39:00.040
|
|
of that as a model of reality that makes predictions. If you say that that's the way the pandemic
|
|
|
|
39:00.040 --> 39:04.720
|
|
worked, then that makes all kinds of predictions about how future RNA should transmit around
|
|
|
|
39:04.720 --> 39:10.120
|
|
the world and how these future signals could be found. And I think those that's where this
|
|
|
|
39:10.160 --> 39:15.360
|
|
whole thing would really fall apart like a like it should. I mean, because again, if
|
|
|
|
39:15.360 --> 39:22.600
|
|
you challenge them to make predictions, they can't. And that's I feel another side of the
|
|
|
|
39:22.600 --> 39:27.720
|
|
coin with regard to models being fitted to data versus, you know, making a model and
|
|
|
|
39:27.720 --> 39:32.440
|
|
then going out and seeking validation of it. I don't know. Absolutely. I don't know where
|
|
|
|
39:32.440 --> 39:36.120
|
|
I was going with that question, but no, no, it's absolutely right. You're absolutely right.
|
|
|
|
39:36.120 --> 39:41.080
|
|
If you can't make predictions of observables, I mean, the whole point is these
|
|
|
|
39:41.080 --> 39:45.880
|
|
observables are the things that are supposed to matter to us. And if you can't make, you
|
|
|
|
39:45.880 --> 39:52.080
|
|
can't make skillful predictions, it's a technical term, skillful predictions of these things,
|
|
|
|
39:52.080 --> 39:56.520
|
|
then you have no business foisting it off onto the public and making us follow the science
|
|
|
|
39:56.520 --> 39:59.200
|
|
is just, it's absurd, it's absurd.
|
|
|
|
39:59.200 --> 40:05.480
|
|
Well, that's the reason why I think it's important for people to understand then that if if done
|
|
|
|
40:05.480 --> 40:12.000
|
|
well, and even with malevolence, for example, if they want to create the illusion of pandemic
|
|
|
|
40:12.000 --> 40:17.360
|
|
potential using academic biology, they can do it. And they can even put p values behind
|
|
|
|
40:17.360 --> 40:23.000
|
|
it. And they can get peer review behind it. And so this whole system, right, has been
|
|
|
|
40:23.000 --> 40:29.600
|
|
distorted into, they call it knowledge, but it's not knowledge, it's not generating knowledge.
|
|
|
|
40:29.600 --> 40:35.440
|
|
And I really feel that big picture is something that a lot of people can get a grasp on. And
|
|
|
|
40:35.440 --> 40:41.680
|
|
if they do, it might put them in a place of healthy skepticism that gives them also a
|
|
|
|
40:41.680 --> 40:48.000
|
|
short, rather a small toolbox. You know, if you approach these problems as okay, what
|
|
|
|
40:48.000 --> 40:53.240
|
|
is the model that they are presenting to me and try to understand the cartoon version
|
|
|
|
40:53.240 --> 40:58.160
|
|
of reality that they are foisting upon you with the story that they tell and then evaluate
|
|
|
|
40:58.160 --> 41:02.200
|
|
whether or not that model makes any predictions that have value. I don't think that's hard
|
|
|
|
41:02.240 --> 41:06.680
|
|
for well, it could be hard for some people, but I think there are a lot of people who
|
|
|
|
41:06.680 --> 41:07.520
|
|
are capable of doing it.
|
|
|
|
41:07.520 --> 41:13.800
|
|
Couldn't be. It shouldn't be difficult. I mean, the best science we have, all science
|
|
|
|
41:13.800 --> 41:18.440
|
|
uses models. It's not using models that's the problem, and we should emphasize that. You
|
|
|
|
41:18.440 --> 41:24.000
|
|
have to have a model, because you can't encompass all of reality all at once. You have to have
|
|
|
|
41:24.000 --> 41:30.680
|
|
a slice of it. You have to have some abstraction and some kind of model; that's not wrong.
|
|
|
|
41:30.680 --> 41:38.080
|
|
But the model should at least have proved itself. And fitting past data is nothing.
|
|
|
|
41:38.080 --> 41:44.800
|
|
It's absolutely zero. It's necessary, but it's far from sufficient, because any set
|
|
|
|
41:44.800 --> 41:51.600
|
|
of past data that you give me, in any field, I can fit not just one, but any number of
|
|
|
|
41:51.600 --> 41:59.720
|
|
models to it, to any arbitrary precision. I can fit them perfectly. I can always find
|
|
|
|
41:59.760 --> 42:06.320
|
|
at least one model that will fit that past data perfectly. And this is a trivial, absolutely
|
|
|
|
42:06.320 --> 42:12.880
|
|
trivial mathematical exercise. So the fact that we have sophisticated models done on a computer
|
|
|
|
42:12.880 --> 42:17.160
|
|
and they're done by experts and the model has been peer reviewed by people who also
|
|
|
|
42:17.160 --> 42:22.760
|
|
create their own models means nothing. It's absolutely nothing. It adds nothing to it.
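The claim that any past data set can be fit perfectly, by any number of models, is easy to demonstrate. A small sketch, with deliberately meaningless data so there is nothing real to recover:

import numpy as np

rng = np.random.default_rng(1)
t_past = np.arange(6.0)
y_past = rng.normal(size=6)                  # arbitrary "past data", pure noise

coeffs = np.polyfit(t_past, y_past, deg=5)   # degree-5 polynomial through 6 points: an exact fit
print("max error on the past:", np.max(np.abs(np.polyval(coeffs, t_past) - y_past)))

t_future = np.arange(6.0, 10.0)
print("its 'forecasts':", np.polyval(coeffs, t_future))   # typically wild, useless numbers

The fit to the past is essentially perfect, and it is only one of infinitely many such fits; none of that says anything about what comes next.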
|
|
|
|
42:22.760 --> 42:27.360
|
|
Like you say, it's not generating knowledge. It's generating noise. It's only when these
|
|
|
|
42:27.400 --> 42:34.080
|
|
models are verified and found to work. And again, it has to be by disinterested people.
|
|
|
|
42:34.080 --> 42:39.280
|
|
It can't be the same people who create the model and then make predictions to say, yeah,
|
|
|
|
42:39.280 --> 42:45.360
|
|
trust us, everything's fine. No, no, we should know enough by now about how the world works
|
|
|
|
42:45.360 --> 42:51.200
|
|
that we cannot have people who have an extreme interest in their own project be their
|
|
|
|
42:51.200 --> 42:56.360
|
|
own judge. There are legal maxims that guide this kind of stuff. We need those same
|
|
|
|
42:56.360 --> 43:00.680
|
|
maxims to guide science. That was one of the useful discussions I think we had, too, over
|
|
|
|
43:00.680 --> 43:06.600
|
|
the last week. Right. Right. I think a concrete example for people that are listening or watching
|
|
|
|
43:06.600 --> 43:12.520
|
|
for what Matt just said would be a mechanical model of the solar system. And you could crank
|
|
|
|
43:12.520 --> 43:16.840
|
|
it backwards and you can see that, Oh, wow, last month, it's okay. And the month before that,
|
|
|
|
43:16.840 --> 43:21.800
|
|
it's okay. And wow, it works for that eclipse from two months ago. But then you crank it
|
|
|
|
43:21.880 --> 43:28.040
|
|
forward and you go to tomorrow and the moon is backwards. And then you go a week farther than
|
|
|
|
43:28.040 --> 43:32.680
|
|
that. And it's like the earth turns upside down. Then something inside of that box is not working
|
|
|
|
43:32.680 --> 43:37.320
|
|
correctly. Because even though when you crank it backwards, it seems to recapitulate the past
|
|
|
|
43:38.200 --> 43:42.840
|
|
astronomical data, when you crank it forward, if it doesn't make a prediction that happens in
|
|
|
|
43:42.840 --> 43:48.120
|
|
the future, then those gears are not working. And I think if you apply that to some of
|
|
|
|
43:48.120 --> 43:53.320
|
|
these biological models, or these nutritional models, or, you know, a lot of these pandemic
|
|
|
|
43:53.320 --> 43:57.880
|
|
models, you get to the point where when you crank that thing forward, it's useless. It's
|
|
|
|
43:57.880 --> 44:05.400
|
|
absolutely useless. That's precisely it. And it's even worse than the picture you're painting.
|
|
|
|
44:06.280 --> 44:10.120
|
|
Because think about this, think about that model we have now, we do have a model of the solar
|
|
|
|
44:10.120 --> 44:15.560
|
|
system, like a Copernican type model, we do crank it forward. And we do see that it does work rather
|
|
|
|
44:15.560 --> 44:22.440
|
|
well. However, there was an older model, a Ptolemaic model, that had all these epicycles of things
|
|
|
|
44:22.440 --> 44:28.280
|
|
spinning this way and that way. That model also predicted very well. It predicted beautifully.
|
|
|
|
44:29.000 --> 44:36.760
|
|
So even then, this is why it's a bare minimum to get the model to make good predictions;
|
|
|
|
44:36.760 --> 44:42.120
|
|
that's the first entry. After that, it still doesn't mean that you have proved you found a real
|
|
|
|
44:42.120 --> 44:47.400
|
|
cause of something; you have a lot more to do. I love that. Oh, I love that.
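A toy version of the epicycles point: two structurally different models can both reproduce everything that has been checked so far, so good performance there cannot, by itself, tell you which of them, if either, describes the real cause. The "true" process and the two candidate forms below are arbitrary choices made only for illustration:

import numpy as np

t = np.linspace(0.0, 1.0, 20)
y = np.sin(t)                       # the "true" process, unknown to the modelers

line = np.polyfit(t, y, 1)          # candidate model A
cubic = np.polyfit(t, y, 3)         # candidate model B
for name, c in [("line", line), ("cubic", cubic)]:
    print(name, "worst in-sample error:", np.max(np.abs(np.polyval(c, t) - y)))

t_new = 4.0                         # well outside what either model has been tested on
print("line says:", np.polyval(line, t_new))
print("cubic says:", np.polyval(cubic, t_new))
print("truth:", np.sin(t_new))

Both candidates match the observed window closely; they only come apart when pushed into territory neither was tuned on, and even the one that extrapolates better need not be describing the real mechanism.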
|
|
|
|
44:47.960 --> 44:52.360
|
|
I love that you said that. Wow, that's really spectacular. I love that you said that. And that's
|
|
|
|
44:52.360 --> 44:57.960
|
|
really where the grinding of science happens, when there are two models that work pretty well.
|
|
|
|
44:57.960 --> 45:03.400
|
|
And now you need to come up with experiments that are able to dissect their predictive
|
|
|
|
45:03.960 --> 45:10.120
|
|
value like that. That, I imagined as a kid, was what all scientists were doing: working at
|
|
|
|
45:10.120 --> 45:14.680
|
|
the cutting edge of knowledge, deciding between the two best models
|
|
|
|
45:15.560 --> 45:19.880
|
|
with a clever experiment that got them a lab. Like, obviously, isn't that what they're all doing?
|
|
|
|
45:21.160 --> 45:26.040
|
|
I don't know. Sometimes we can't always do experiments, you know, like in cosmology and so
|
|
|
|
45:26.040 --> 45:33.000
|
|
forth, or they're very limited. But yeah, that's it. That's it. We have to find, like I'm always
|
|
|
|
45:33.000 --> 45:38.440
|
|
saying, we have to... and this brings us back to broken science. One of the things that's broken
|
|
|
|
45:38.520 --> 45:46.200
|
|
about science is the metaphysics. In order to do physics, you have to first have a metaphysics.
|
|
|
|
45:46.200 --> 45:54.280
|
|
So we need to understand the limits of the metaphysics that we currently use and of
|
|
|
|
45:54.280 --> 45:58.520
|
|
possible replacements and the superiority of replacements and so forth. But that's a whole
|
|
|
|
45:58.520 --> 46:05.640
|
|
other discussion. That's why it's very difficult to explain these kinds of things, because
|
|
|
|
46:06.520 --> 46:12.440
|
|
we have to first get past the point where we recognize we've had a broken model and then we
|
|
|
|
46:12.440 --> 46:17.160
|
|
have to get to the point where we realize that models are not being used in a predictive sense.
|
|
|
|
46:17.160 --> 46:22.840
|
|
And then we have to realize that we have to have a firm understanding of what it means to cause
|
|
|
|
46:22.840 --> 46:29.000
|
|
and what does it mean to cause something, and all that kind of thing. So we really
|
|
|
|
46:29.000 --> 46:33.480
|
|
have to fix the foundations of science. And that's kind of what this broken science
|
|
|
|
46:33.480 --> 46:39.640
|
|
thing is about. I heard you leading a discussion, or having a discussion, randomly between sessions.
|
|
|
|
46:40.920 --> 46:46.680
|
|
And it had to do a little bit with this, the idea that the table doesn't exist because actually
|
|
|
|
46:46.680 --> 46:50.040
|
|
it's mostly space. And then if we go farther and farther and farther, all of a sudden there's
|
|
|
|
46:50.040 --> 46:56.200
|
|
no causes at all or something like this. And that's really kind of where in a sort of,
|
|
|
|
46:56.920 --> 47:02.040
|
|
I'm terrible at these kinds of discussions, but that's sort of where Sam Harris is with
|
|
|
|
47:02.040 --> 47:06.200
|
|
regard to, you know, free will and everything else is just we're all particles moving, right?
|
|
|
|
47:06.200 --> 47:10.600
|
|
And if we just had enough data, we would be able to predict the future and everything.
|
|
|
|
47:12.040 --> 47:16.520
|
|
Of course, I think that's false. And he does think that. He's the kind of guy that says the
|
|
|
|
47:16.520 --> 47:21.720
|
|
table's not really there because it's mostly empty space. And my metaphysics says the
|
|
|
|
47:21.720 --> 47:27.160
|
|
table's here. It's real. And in fact, the particles don't exist in it anymore. They only exist virtually
|
|
|
|
47:27.160 --> 47:32.920
|
|
in potentia. They're now part of the thing that is the table. There's no separation of
|
|
|
|
47:32.920 --> 47:38.040
|
|
particles and atoms anymore that makes the table. It just is a table, just as common sense
|
|
|
|
47:38.600 --> 47:43.320
|
|
shows it to you. And of course, about free will, I always make the joke. It's a standard joke,
|
|
|
|
47:43.320 --> 47:48.760
|
|
but it's true that you read these guys' papers, where they try to prove to you that free will doesn't
|
|
|
|
47:48.760 --> 47:54.360
|
|
exist. And every one of these papers, they always say, if only people understood they couldn't make
|
|
|
|
47:54.360 --> 48:01.080
|
|
choices, they would make better choices. They always want to save the world by telling us that
|
|
|
|
48:01.080 --> 48:05.800
|
|
we really can't make choices. That's absurd. It's absurd. Here, what are they doing? They're
|
|
|
|
48:05.800 --> 48:11.960
|
|
using their free will to write it. That contradiction, when you call them on it, they'll just say,
|
|
|
|
48:11.960 --> 48:16.840
|
|
Oh, no, no, it doesn't work. It's an illusion. Okay. Fine. It's an illusion. Who is having the
|
|
|
|
48:16.840 --> 48:23.960
|
|
illusion? Right. Wow. Yeah. Who is having the illusion? It's a bluff. It's always a bluff. There's
|
|
|
|
48:23.960 --> 48:29.000
|
|
a lot of bluffs in science. And that's one of them. Yeah, that's a wonderful way to put it. That's
|
|
|
|
48:29.000 --> 48:37.160
|
|
great. I'm just sort of humbled by the fact that the first couple of these meetings were so
|
|
|
|
48:37.160 --> 48:44.360
|
|
overwhelming to me, that the big picture of how to use this as a way to bring people to the light
|
|
|
|
48:44.360 --> 48:50.440
|
|
of things. It really felt like this was the first time where Greg nailed it, where other people nailed
|
|
|
|
48:50.520 --> 48:57.400
|
|
it and where I was comfortable enough in my seat to absorb it. And so I'm very, very excited about
|
|
|
|
48:58.120 --> 49:05.400
|
|
being able to apply these ideas, as you do now, to biological papers, and to, like, think about
|
|
|
|
49:06.040 --> 49:11.320
|
|
what I'm really calling out as far as what's broken with how they're doing it. And the other
|
|
|
|
49:11.320 --> 49:15.960
|
|
thing that was missing from my descriptions was understanding that there were lots of people
|
|
|
|
49:16.040 --> 49:20.360
|
|
who've thought about this, right? It's not some kind of secret. I mean, people have written books
|
|
|
|
49:20.360 --> 49:27.000
|
|
like you, but before you as well. Part of the reason why we're here is because people
|
|
|
|
49:27.000 --> 49:33.800
|
|
like Popper wrote stuff. And yeah, and so it is important. I was pretty naive to all of that,
|
|
|
|
49:33.800 --> 49:38.920
|
|
that literature. And so it's been a big eye opening experience for me to realize that in the
|
|
|
|
49:38.920 --> 49:43.400
|
|
best. Well, that's just it. You could go through as a scientist. I did. I went through the whole
|
|
|
|
49:43.400 --> 49:50.280
|
|
thing. Cornell, I got my master's in atmospheric, you know, PhD, whatever. Many, many people,
|
|
|
|
49:50.280 --> 49:56.680
|
|
you never once have to be confronted by any of these philosophical matters. Never.
|
|
|
|
49:57.400 --> 50:02.520
|
|
Never. You can go through and finish. And at the end, you're going to ask, what is science?
|
|
|
|
50:02.520 --> 50:08.040
|
|
Well, science is what I do. Yeah, right. You know, that's an answer, but it's not, it's not a very
|
|
|
|
50:08.040 --> 50:12.360
|
|
good answer. But that's what most people's answer is. And if they've absorbed anything,
|
|
|
|
50:12.360 --> 50:18.280
|
|
they've absorbed this idea of the Popperian, you know, falsifiability, and you now see the limits of
|
|
|
|
50:18.280 --> 50:25.000
|
|
that. But most scientists don't because they sort of dismiss a lot of philosophers.
|
|
|
|
50:25.560 --> 50:30.040
|
|
Well, I think they should dismiss a lot of them. There's a lot of these academic philosophers,
|
|
|
|
50:30.040 --> 50:35.400
|
|
you know, kind of postmodernistic kind of nonsense that they preach. It doesn't seem to have any
|
|
|
|
50:35.400 --> 50:41.800
|
|
bearing on what a scientist does. And they're right, it doesn't. But still, you
|
|
|
|
50:41.800 --> 50:47.800
|
|
can't operate and be a scientist without some philosophy. You might not recognize that it's a
|
|
|
|
50:47.800 --> 50:53.640
|
|
philosophy, but you have one. You absolutely must have a philosophy in order to apply
|
|
|
|
50:55.320 --> 51:01.240
|
|
the skills that you've learned to build your models, to understand your models,
|
|
|
|
51:01.240 --> 51:04.120
|
|
to know what a model is, make predictions, all these kinds of things.
|
|
|
|
51:04.120 --> 51:10.840
|
|
I mean, I'll give you an example of how, I mean, how bad I'm going to burn myself here, but I
|
|
|
|
51:10.840 --> 51:18.840
|
|
think it's an important example. I think you'll be able to see this. So if you had a
|
|
|
|
51:18.840 --> 51:24.920
|
|
neuron in a slice of brain, where I used to work, it would be surrounded by other neurons,
|
|
|
|
51:24.920 --> 51:30.920
|
|
of course. And in that slice, there would also be axons, which were coming in from other brain
|
|
|
|
51:30.920 --> 51:36.440
|
|
regions that were cut off, but were still transmitting neurotransmitter to the neuron
|
|
|
|
51:36.440 --> 51:41.560
|
|
that you're going to record from. And so one of the measurements that you would get from this
|
|
|
|
51:41.560 --> 51:48.840
|
|
recording would be these spontaneous events, which would look like this. And they're basically
|
|
|
|
51:49.960 --> 51:55.480
|
|
signals that are coming from these synapses, which have been cut, but are still active.
|
|
|
|
51:55.480 --> 52:01.720
|
|
And you can see the synaptic events in real time as this neuron receives them. Now, interestingly,
|
|
|
|
52:01.720 --> 52:07.640
|
|
one of the main... not one of the main, one of the ways that you create this scenario where
|
|
|
|
52:07.640 --> 52:15.480
|
|
you can write a paper about it would be to apply a drug here. And if the baseline amount of this
|
|
|
|
52:15.480 --> 52:24.920
|
|
signal were to change, then you would say that somehow or another, that drug is interacting with
|
|
|
|
52:25.800 --> 52:30.440
|
|
these receptors, or the proteins on them, and changing the release probability
|
|
|
|
52:30.440 --> 52:35.640
|
|
of the glutamate there. And so then you start to understand how the drug works or doesn't work.
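A hypothetical sketch of the kind of comparison being described: count spontaneous synaptic events per sweep at baseline and after the drug, then test whether the rate changed. The event rates, sweep counts, and the choice of a simple t-test are assumptions for illustration, not a reconstruction of any particular experiment:

import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
baseline = rng.poisson(lam=12, size=30)   # spontaneous events per sweep, before the drug
drug = rng.poisson(lam=15, size=30)       # spontaneous events per sweep, after the drug

t_stat, p = stats.ttest_ind(baseline, drug)
print(f"baseline {baseline.mean():.1f}/sweep, drug {drug.mean():.1f}/sweep, p = {p:.3f}")
# A small p is routinely read as "the drug changed release probability", but that reading
# leans on a pile of background assumptions about the preparation (stable recording,
# no rundown, no other pathway affected, and so on).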
|
|
|
|
52:35.640 --> 52:41.160
|
|
Now, of course, this assumption would be backed up by pharmacological blocking here and all kinds
|
|
|
|
52:41.160 --> 52:48.280
|
|
of other things like that. But my point is more abstract in the sense of if you imagine how complex
|
|
|
|
52:48.280 --> 52:55.960
|
|
the brain is, it's pretty easy to get here and have a significant result when you pharmacologically
|
|
|
|
52:55.960 --> 53:05.320
|
|
alter the circuit. Now, the reason why I'm saying this is because I feel as though we're at the
|
|
|
|
53:05.320 --> 53:11.640
|
|
verge of being able to explain to people how it is. And where I was... and I'm going to stop
|
|
|
|
53:11.640 --> 53:19.560
|
|
talking very soon. When I was in grad school, no one ever used the word straw man to describe
|
|
|
|
53:19.560 --> 53:27.080
|
|
the null hypothesis. And that was partly because it was kind of a sacred ritual. You would never
|
|
|
|
53:27.080 --> 53:33.880
|
|
be so, how would you say, coy, or... sorry, it's almost like sarcastic, no,
|
|
|
|
53:33.880 --> 53:39.560
|
|
it's not the right word. But you'd be being so honest if you said that, that it was a straw man,
|
|
|
|
53:39.640 --> 53:45.160
|
|
but it would be so useful from the perspective of teaching a graduate student exactly what this
|
|
|
|
53:45.160 --> 53:50.360
|
|
game is. Quite frankly, I feel as though knowing what I know now in broken science, I could go
|
|
|
|
53:50.360 --> 53:56.280
|
|
back to academia and be much more successful, because I understand what the formula is asking
|
|
|
|
53:56.280 --> 54:03.400
|
|
for. And it's not clouded by this ideal of what I imagine I'm doing. Can you relate to that
|
|
|
|
54:03.400 --> 54:09.160
|
|
illusion that I created for myself? That's absolutely it. That's how people,
|
|
|
|
54:09.880 --> 54:15.160
|
|
like you described, wake up from the rituals that they're practicing, and come to
|
|
|
|
54:15.160 --> 54:22.200
|
|
understand that this method just doesn't give what they wanted it to give. I mean,
|
|
|
|
54:22.200 --> 54:25.880
|
|
what you're trying to do there is to fundamentally
|
|
|
|
54:25.880 --> 54:31.400
|
|
understand the causes at play. And that's exactly the right way to do it. That's absolutely the
|
|
|
|
54:31.400 --> 54:37.960
|
|
right way to do it. But it's brutal hard work, right? I mean, there's all kinds of... you realize
|
|
|
|
54:37.960 --> 54:42.440
|
|
after you set up a little simple experiment that you just drew right there. And you saw the
|
|
|
|
54:42.440 --> 54:46.680
|
|
signals and suddenly you realize or you have other evidence that comes in later, you know,
|
|
|
|
54:46.680 --> 54:50.920
|
|
there are other things that could account for that change in signal, too. And you have to keep
|
|
|
|
54:50.920 --> 54:57.000
|
|
eliminating them one by one by one. And it takes a very long time. It's very difficult to do the
|
|
|
|
54:57.080 --> 55:02.440
|
|
very best. Brutal, brutal, brutal hard work. Absolutely. That's what science is really like.
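A minimal sketch of how the formula can pop out a publishable number even when nothing is going on: test enough unrelated variables against a random outcome and, at the usual 0.05 cutoff, something will look significant by chance. Everything here is synthetic:

import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
outcome = rng.normal(size=50)
p_values = [stats.pearsonr(rng.normal(size=50), outcome)[1] for _ in range(100)]

print("smallest p value:", round(min(p_values), 4))
print("'significant' at 0.05:", sum(p < 0.05 for p in p_values), "out of 100 unrelated variables")

On average roughly one in twenty of these pointless comparisons clears the bar, which is exactly the straw-man game described earlier in the conversation.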
|
|
|
|
55:02.440 --> 55:08.760
|
|
But we've tried to make it too easy, and it is now far too easy: just apply this formula. And if
|
|
|
|
55:08.760 --> 55:13.880
|
|
this number pops out, well, then you've proved what you wanted. It's very sweet
|
|
|
|
55:13.880 --> 55:19.560
|
|
the way you say it, because actually I felt, when I was stepping out of neuroscience, or when
|
|
|
|
55:19.560 --> 55:26.600
|
|
my career was waning before I got let go. It was because I was still having to put in
|
|
|
|
55:27.320 --> 55:32.600
|
|
hundreds of hours to get my data set. I couldn't just roll someone into an fMRI machine and then
|
|
|
|
55:32.600 --> 55:38.760
|
|
go back to my computer for a couple weeks and then publish a paper. I had to find something that
|
|
|
|
55:38.760 --> 55:43.720
|
|
was an effect and then do some other experiments that showed that the effect wasn't from something
|
|
|
|
55:43.720 --> 55:48.840
|
|
else. And that was the pharmacological blocking and this kind of thing. And so I did feel
|
|
|
|
55:49.640 --> 55:55.480
|
|
a little bit like I was part of a small group of people that was still... and I'm not trying to
|
|
|
|
55:55.480 --> 56:00.120
|
|
toot my own horn, I'm trying to say how I felt: I didn't fit in. I didn't have this
|
|
|
|
56:00.760 --> 56:05.240
|
|
five-papers-a-year machine going. I couldn't figure it out, I couldn't see it. I didn't
|
|
|
|
56:05.240 --> 56:09.640
|
|
have an answer to that problem, which was, you know, you're going to need eight papers in the next
|
|
|
|
56:09.640 --> 56:14.520
|
|
three years or you're not going to make it. And it's like, well, if you tell me that, I could say
|
|
|
|
56:14.520 --> 56:18.440
|
|
immediately in that meeting, well, then I'm not going to make it because what I do doesn't work
|
|
|
|
56:18.440 --> 56:24.280
|
|
like that. And so, do I need more students? I don't know what the answer was. But I was already
|
|
|
|
56:25.240 --> 56:29.240
|
|
finding that wheel unable to grind. I just couldn't do it anymore.
|
|
|
|
56:29.240 --> 56:34.200
|
|
Oh, yeah. For everybody who hasn't been exposed to academia, it's toxic in that way.
|
|
|
|
56:34.200 --> 56:38.920
|
|
It really is publish or perish. It's terrible. It's terrible. That accounts for the flood of bad
|
|
|
|
56:38.920 --> 56:43.320
|
|
papers. I mean, they're pushing out a lot of nonsense just because they have to keep their
|
|
|
|
56:43.320 --> 56:53.240
|
|
positions, or to bring in grants to pay for themselves and their lab, or, most importantly, to provide
|
|
|
|
56:53.560 --> 57:00.280
|
|
kickbacks to the deans in the form of overhead. You could write one paper a year, or even
|
|
|
|
57:00.280 --> 57:05.480
|
|
none if you're bringing in good overhead. If you're bringing in that, which is, you know,
|
|
|
|
57:05.480 --> 57:11.720
|
|
it was at Cornell at one point, I think it was up to like 60%, 59 and a half percent overhead.
|
|
|
|
57:12.440 --> 57:17.640
|
|
I mean, so you get the grant, the grant's a million dollars, and then they tack on another $590,000 in
|
|
|
|
57:18.600 --> 57:23.480
|
|
overhead, and that just goes to the college, which becomes part of the slush funds for the
|
|
|
|
57:23.480 --> 57:29.640
|
|
deans. Yeah, absolutely. My grant application, or my grant, an R01, for example...
|
|
|
|
57:31.400 --> 57:40.200
|
|
when my supervisor got an R01, he paid 155% of my salary to the university. So 55% of my salary
|
|
|
|
57:40.200 --> 57:45.800
|
|
was rent. And he did that for every student and himself. So that's quite a load of
|
|
|
|
57:45.800 --> 57:50.200
|
|
money. If you think there are 130 faculty members just in that department that are all
|
|
|
|
57:50.200 --> 57:54.200
|
|
working under the same model and all those grad students walking around are all paying rent
|
|
|
|
57:54.760 --> 58:01.080
|
|
to the tune of thousands of dollars a year. I mean, it's an extraordinary amount of money. That's why
|
|
|
|
58:01.080 --> 58:05.240
|
|
they can pay deans millions of dollars a year. They pay deans as much as they pay basketball
|
|
|
|
58:05.240 --> 58:12.760
|
|
coaches these days. Yep. It's extraordinary. So how about this? How about this? What if we got
|
|
|
|
58:12.760 --> 58:21.000
|
|
a bunch of weaponized money and started funding science that was, you know... We do have that, in
|
|
|
|
58:21.000 --> 58:28.360
|
|
a way though: private companies that are not tied up
|
|
|
|
58:28.360 --> 58:34.200
|
|
with the government. That proviso is the strongest one I could possibly make. Once they have
|
|
|
|
58:34.200 --> 58:38.200
|
|
an interest with the government and it becomes political, well, that's just the
|
|
|
|
58:38.200 --> 58:44.680
|
|
same as being the government. That's where things can be done. So it's exactly that. It's private
|
|
|
|
58:44.680 --> 58:52.040
|
|
money in a patronage-like system trying to discover things for the sake of discovering them.
|
|
|
|
58:52.840 --> 58:58.920
|
|
They could be towards the desire of having a useful product or something. There's nothing
|
|
|
|
58:58.920 --> 59:07.640
|
|
per se wrong with that. It's only when it becomes, you know, as we saw with COVID, when it becomes,
|
|
|
|
59:07.640 --> 59:13.960
|
|
you know, a very political thing that the whole thing becomes somewhat difficult. And it becomes
|
|
|
|
59:13.960 --> 59:18.760
|
|
no different than regular academia. And you know as well as I do that the same people go in and out of
|
|
|
|
59:18.760 --> 59:23.320
|
|
these corporations into the government that judges the granting agencies and all this kind of stuff.
|
|
|
|
59:23.320 --> 59:32.120
|
|
It's the same set of people. Yeah. It's a very difficult problem to get your head around. I
|
|
|
|
59:32.120 --> 59:37.240
|
|
think one of the things that I liked that I heard somebody say at the meeting, which I also think
|
|
|
|
59:37.240 --> 59:41.160
|
|
is something that can get people's wheels turning if they haven't thought about this before, is that
|
|
|
|
59:41.160 --> 59:48.440
|
|
all the real good science is top secret. All the... I don't know to what extent that's really true
|
|
|
|
59:48.440 --> 59:55.000
|
|
or if it's just funny, but I do think that identifying places like, you know, meteorology,
|
|
|
|
59:55.000 --> 01:00:04.040
|
|
or actuarial science, where models are required to be verified, we can find examples where, you know,
|
|
|
|
01:00:04.920 --> 01:00:10.680
|
|
we can push science in that direction over time. We can teach, you know, if we can break
|
|
|
|
01:00:10.680 --> 01:00:15.640
|
|
into a university and teach this to undergraduates, that would be great. We'd probably
|
|
|
|
01:00:15.640 --> 01:00:23.960
|
|
ruin their graduate experience, but I don't know. I'm still trying to do it. We can do it. The tools
|
|
|
|
01:00:23.960 --> 01:00:29.960
|
|
are there. Like I say, the tools we have, it's not unknown. I mean, there's all
|
|
|
|
01:00:29.960 --> 01:00:35.560
|
|
kinds of investigations that can go on to improve these tools and so forth, but the basic set
|
|
|
|
01:00:36.440 --> 01:00:43.080
|
|
is there, waiting to be used. Can I ask you a personal question now? Hey, can I ask you
|
|
|
|
01:00:43.080 --> 01:00:48.920
|
|
a personal question? What do you do, you know, to make ends meet? Are you teaching
|
|
|
|
01:00:48.920 --> 01:00:54.440
|
|
anywhere? Do you have an online course? Or, I mean, why not do it online? I should do an online
|
|
|
|
01:00:54.520 --> 01:01:01.240
|
|
course. No, I basically have private clients. Okay. Would you want...
|
|
|
|
01:01:01.800 --> 01:01:06.840
|
|
would you want any help with an online course? Like, I could be your technical guy, or I could...
|
|
|
|
01:01:06.840 --> 01:01:11.800
|
|
I do need to figure it out. I do need to do it. I keep swearing I'm going to, but video...
|
|
|
|
01:01:11.800 --> 01:01:18.040
|
|
I'm very naive about video. Oh, but you could, for example, you could record it in your own home.
|
|
|
|
01:01:18.040 --> 01:01:23.160
|
|
I have a chalkboard. Right. But then you send me the video and I edit it for you and
|
|
|
|
01:01:23.160 --> 01:01:28.680
|
|
add subtitles and all that stuff. I can do all that. I would be happy to do it. And we'll chat
|
|
|
|
01:01:28.680 --> 01:01:33.960
|
|
about that. Okay. I would think this would be really a good idea. And also, I would obviously
|
|
|
|
01:01:33.960 --> 01:01:39.160
|
|
learn from it a lot too. I think that's where we need to go. I think we need people like you
|
|
|
|
01:01:40.360 --> 01:01:45.320
|
|
taking over this thing. You know, I mean, if you could do it... well, how awesome would it be if you
|
|
|
|
01:01:45.320 --> 01:01:52.200
|
|
meet somebody randomly? Oh, yeah, I took Matt Briggs's course last year. Bang. It should
|
|
|
|
01:01:52.200 --> 01:01:57.080
|
|
be something that people put on their website or on their social media. I think
|
|
|
|
01:01:57.080 --> 01:02:02.600
|
|
you're a guy who, you know, you have the kind of charisma that can do it. You're a good
|
|
|
|
01:02:02.600 --> 01:02:07.640
|
|
speaker. And you've got all of the work done already in your books. You've already thought
|
|
|
|
01:02:07.640 --> 01:02:12.440
|
|
about this and developed these ideas. And so there's nothing... I would have colored chalk. Well, there
|
|
|
|
01:02:12.440 --> 01:02:17.960
|
|
you go. I mean, is it the fancy stuff from Japan, or is it just good? No, that's the one I got...
|
|
|
|
01:02:17.960 --> 01:02:24.600
|
|
I got this Korean one. Yeah, yeah, that's good too. Yeah, no dust.
|
|
|
|
01:02:27.480 --> 01:02:31.080
|
|
That's fabulous. Oh my gosh, that's great. I love writing. That's awesome.
|
|
|
|
01:02:32.520 --> 01:02:36.920
|
|
We are at like 59 minutes, and I don't know what we said for time, but it feels like a very
|
|
|
|
01:02:36.920 --> 01:02:41.560
|
|
good time to wrap up and you just promised to come back because I'll have you on with more material
|
|
|
|
01:02:41.560 --> 01:02:46.440
|
|
and more questions. Next time we can go into something specific that I get out of Jaynes as I
|
|
|
|
01:02:46.440 --> 01:02:52.600
|
|
read it. Oh, absolutely. Okay, cool. So I'm going to put this link up one more time.
|
|
|
|
01:02:52.600 --> 01:02:57.000
|
|
This is where you can find him. Unfortunately, it's an Amazon link, but that's the easiest for me.
|
|
|
|
01:02:57.000 --> 01:03:01.800
|
|
You can see his full name, William M. Briggs. You can see the books that he's got available.
|
|
|
|
01:03:01.800 --> 01:03:06.440
|
|
I would recommend the yellow one and Everything You Believe Is Wrong, although he has also contributed
|
|
|
|
01:03:06.440 --> 01:03:13.480
|
|
to this one, The Price of Panic, which is specifically about the COVID thing. And you can also find
|
|
|
|
01:03:13.480 --> 01:03:18.120
|
|
him at, I think, William M Briggs dot substack, something like that. It's
|
|
|
|
01:03:18.120 --> 01:03:26.440
|
|
WM Briggs at Substack. So thank you very much, Matt, for joining me.
|
|
|
|
01:03:26.440 --> 01:03:32.280
|
|
Can you remind me again? Well, maybe I'll do that offline. But I wanted to say hello to your wife
|
|
|
|
01:03:32.280 --> 01:03:37.560
|
|
as well. And for whatever reason, my mind is blanking on her name. So please give her a hug and say hello
|
|
|
|
01:03:37.560 --> 01:03:43.000
|
|
for me. And we will have you on within a month. I guarantee it, unless you really object
|
|
|
|
01:03:43.640 --> 01:03:47.720
|
|
anytime. Cool. Very good. Very good. Thank you, Matt, for joining me. And I will talk to you again soon.
|
|
|
|
01:03:51.720 --> 01:03:57.080
|
|
Bao, Zawi. Okay, that went well. Thank you very much, guys, for joining me. I'm going to put this
|
|
|
|
01:03:57.080 --> 01:04:02.600
|
|
little slide back up again. This is, of course, the broken science meeting that happened a couple
|
|
|
|
01:04:02.600 --> 01:04:07.400
|
|
weeks ago. I hope Matt wasn't surprised that I let him go there so quick. I'll call him back on
|
|
|
|
01:04:07.400 --> 01:04:16.360
|
|
the phone. We are a group of people loosely organized around two people named Emily Kaplan
|
|
|
|
01:04:16.360 --> 01:04:22.520
|
|
and Greg Glassman. Greg Glassman is the originator, the creator of CrossFit, a very smart guy who
|
|
|
|
01:04:22.520 --> 01:04:28.440
|
|
actually had a very smart Papa named Jeff Glassman who worked for Hughes Aviation. And it was his
|
|
|
|
01:04:28.440 --> 01:04:33.720
|
|
upbringing with his very intelligent dad that has brought him all the way to understanding this
|
|
|
|
01:04:33.720 --> 01:04:40.520
|
|
stuff, because CrossFit ran into legal problems with fake science. Basically, fake science
|
|
|
|
01:04:40.520 --> 01:04:45.960
|
|
was published to try and say that CrossFit injures people. And over the course of investigating this,
|
|
|
|
01:04:45.960 --> 01:04:50.600
|
|
he found out that his dad's been right about science his whole life. And it has put him on this
|
|
|
|
01:04:50.600 --> 01:04:57.960
|
|
crusade to try and, you know, leave a legacy for his children and for our children of his dad and
|
|
|
|
01:04:58.920 --> 01:05:06.760
|
|
these philosophers like Stove, these thinkers like Stove, and also our contemporaries like Matt
|
|
|
|
01:05:06.760 --> 01:05:13.560
|
|
Briggs. And so I really think this is one of the most worthwhile sort of initiatives and I'm so
|
|
|
|
01:05:14.120 --> 01:05:19.560
|
|
humbled to be considered a small part of it. So thank you, Emily. Thank you, Greg. And thank
|
|
|
|
01:05:19.560 --> 01:05:25.720
|
|
you, Matt, for coming on the show. And I think we're going to have a lot of interesting conversations
|
|
|
|
01:05:25.800 --> 01:05:30.120
|
|
as we go forward. At some point in time, of course, we're going to have Greg on the show. But I don't
|
|
|
|
01:05:30.120 --> 01:05:34.920
|
|
want to... I don't want to play that card too early because, number one, he's a busy
|
|
|
|
01:05:34.920 --> 01:05:39.560
|
|
guy. And so that'll be hard to organize. But number two, I want to have all of my ducks in a row and
|
|
|
|
01:05:39.560 --> 01:05:44.360
|
|
already have paid back to him a little bit of the stuff that I've learned, through a series of
|
|
|
|
01:05:44.360 --> 01:05:49.240
|
|
streams that this is the first one of. So thank you very much, guys, for joining me. This has been
|
|
|
|
01:05:49.240 --> 01:05:55.320
|
|
Giga Ohm Biological, a high-resistance, low-noise information brief brought to you by a biologist.
|
|
|
|
01:05:55.320 --> 01:06:02.840
|
|
19th of March, 2024. The afternoon shows are going to be a street because I really like this time frame.
|
|
|
|
01:06:05.080 --> 01:06:11.400
|
|
Thanks for joining me, guys. 73 people in the chat. Very understandable conversation. I hope
|
|
|
|
01:06:13.000 --> 01:06:17.240
|
|
good to see you, Pamela. Thanks for the link in the chat. She's always good for that.
|
|
|
|
01:06:17.240 --> 01:06:22.680
|
|
I can see Janet Reno in the audience. Awesome. I didn't see you there earlier. I'm sorry, Janet.
|
|
|
|
01:06:22.680 --> 01:06:30.360
|
|
Janet's one of my favorite anonymous viewers because obviously it's not Janet Reno. It might be,
|
|
|
|
01:06:30.360 --> 01:06:39.000
|
|
it might be Ford. For some reason I think it might be Ford, or something like that. Anyway, thanks very
|
|
|
|
01:06:39.000 --> 01:06:47.080
|
|
much for joining me, guys.
|
|
|
|
01:07:17.240 --> 01:07:23.240
|
|
Thanks for joining me, and I'll see you in the next video.
|
|
|
|
|