WEBVTT
|
|
|
|
00:27.398 --> 00:27.618
|
|
I have.
|
|
|
|
01:00.573 --> 01:15.840
|
|
I still can't figure out why that is not.
|
|
|
|
01:16.321 --> 01:17.721
|
|
I have no responsibility.
|
|
|
|
01:29.239 --> 01:30.145
|
|
Stop lying.
|
|
|
|
01:32.138 --> 01:32.460
|
|
Stop.
|
|
|
|
02:18.559 --> 02:20.140
|
|
Okay, good morning everybody.
|
|
|
|
02:20.200 --> 02:21.221
|
|
Welcome to the show.
|
|
|
|
02:22.641 --> 02:25.303
|
|
This has been a long time coming.
|
|
|
|
02:26.304 --> 02:27.004
|
|
Sorry about that.
|
|
|
|
02:27.144 --> 02:33.988
|
|
I meant to have everything ready to go here and I didn't really have it ready to go on time.
|
|
|
|
02:34.008 --> 02:36.830
|
|
So I'm still cleaning stuff up here on my desk.
|
|
|
|
02:36.910 --> 02:37.750
|
|
I apologize.
|
|
|
|
02:38.911 --> 02:45.435
|
|
If you're here for the first time, let me just quick check and see if I got anything going over here on the YouTubage.
|
|
|
|
02:46.814 --> 02:51.117
|
|
It seems like we do have something going there. Good morning.
|
|
|
|
02:51.157 --> 02:51.618
|
|
Good morning.
|
|
|
|
02:51.658 --> 03:01.164
|
|
Good morning. This is the news program that isn't for everyone but is for anyone who is looking for the truth.
|
|
|
|
03:01.805 --> 03:11.232
|
|
My message is a little bit painful right now. It is a message where Kevin McKernan is reading his own script, or at least a script that he contributed a lot of
|
|
|
|
03:12.012 --> 03:25.377
|
|
A lot of ideas to. And it is a spectacular commitment to this script. That's all that's required in order for these people to win, but we need to understand what that script is and how long that script has been running.
|
|
|
|
03:25.777 --> 03:35.341
|
|
Um, you know, they can adjust it on the fly. In November of 2024 he was on the Danny Jones podcast and basically showed
|
|
|
|
03:36.340 --> 03:46.051
|
|
this spectacular commitment to a script that actually there were some pretty interesting meddlers that were on this script already in March of 2020.
|
|
|
|
03:47.813 --> 03:56.924
|
|
And two of those meddlers include a semi-retired neuroscientist apparently working in Japan, although I highly doubt that,
|
|
|
|
03:58.440 --> 04:09.768
|
|
And a young man from, I believe he might even be from Wisconsin, but he calls himself Atikin Skywalker.
|
|
|
|
04:10.863 --> 04:13.065
|
|
and he is an associate of George Webb.
|
|
|
|
04:13.105 --> 04:27.415
|
|
And in March of 2020, these two characters along with another character by the name of Paul Cottrell are basically already on the same script that Kevin McKernan would recite many years later.
|
|
|
|
04:27.435 --> 04:31.358
|
|
And I think that's only because this is all they ever had was this script.
|
|
|
|
04:32.119 --> 04:33.700
|
|
That's why in 2025, they have this,
|
|
|
|
04:36.747 --> 04:43.633
|
|
this person's five-year track record of basically following me around on the internet.
|
|
|
|
04:43.873 --> 04:51.019
|
|
Sometimes for a couple years he even had a link to my website as a place to find out the biology.
|
|
|
|
04:52.300 --> 04:53.641
|
|
Why is this happening?
|
|
|
|
04:54.882 --> 05:03.029
|
|
This is all they ever had, though, and they're actually confessing to it when Steve Kirsch has him on his VSRF
|
|
|
|
05:05.155 --> 05:08.936
|
|
podcast twice, one time actually with the great Kevin McKernan.
|
|
|
|
05:09.976 --> 05:25.661
|
|
And so this timeline does not compute unless you understand that the longtime DNC donor, Steve Kirsch, is behind funding this network that I think Kevin McKernan is behind scripting.
|
|
|
|
05:26.656 --> 05:50.142
|
|
And that's why in 2024, despite the fact that I saw this script coming, unfortunately, in 2021 already, this guy is telling a story about Ralph Baric being some kind of super genius and master baker who shared his baking recipes with Wuhan, although there's this other thing called Golden Gate Assembly, and it's just basically that.
|
|
|
|
05:52.242 --> 06:03.732
|
|
and he of course promotes this guy who came to my house in January of 2022 to try and get me to publicly declare that the DEFUSE proposal was real and that I was part of DRASTIC
|
|
|
|
06:04.701 --> 06:21.193
|
|
Of course, DRASTIC is a group of people, many of whom are from India and Pakistan, which is a really curious thing because those are among the few countries that nobody ever mentions that Peter Hotez and Robert Malone are both making money from the vaccines in.
|
|
|
|
06:22.514 --> 06:32.002
|
|
Outbreaks never come from India, even though they're all very, very poor and their rivers are apparently full of antibiotic residue.
|
|
|
|
06:32.842 --> 06:38.507
|
|
And so the DEFUSE proposal didn't have an evaluation, but he says that it did and that they predicted a leak from it.
|
|
|
|
06:38.547 --> 06:39.808
|
|
That's why they didn't fund it.
|
|
|
|
06:39.828 --> 06:40.769
|
|
And that's ridiculous.
|
|
|
|
06:41.429 --> 06:46.033
|
|
He also tells a story about how it was probably already done because proposals are already done.
|
|
|
|
06:46.053 --> 06:51.078
|
|
But the odd thing is, is that's what I was quoted as saying in the Wuhan cover-up book.
|
|
|
|
06:51.758 --> 07:07.915
|
|
And finally, the whole argument is based on this sequence being 100% reliable and verifiable, and you must accept it and you must accept all of this molecular biology, all these techniques and all of these diagnostics as high-fidelity instruments.
|
|
|
|
07:08.015 --> 07:08.355
|
|
And this
|
|
|
|
07:09.216 --> 07:13.998
|
|
should have been verified independently, individually by the FDA.
|
|
|
|
07:14.038 --> 07:17.879
|
|
But of course, it wasn't, because we were in a PREP Act emergency.
|
|
|
|
07:18.360 --> 07:22.921
|
|
And so instead, we've been forced to focus on these molecular details for more than five years.
|
|
|
|
07:23.442 --> 07:36.827
|
|
And one of the worst fake anti-vaxxers of all, James Lyons-Weiler, was aware of all these molecular details, not that dissimilar to how Kevin McCairn was aware of these scripted molecular details.
|
|
|
|
07:37.647 --> 07:40.689
|
|
Ladies and gentlemen, it is a spectacular commitment to lies.
|
|
|
|
07:40.749 --> 07:41.809
|
|
That's all that's required.
|
|
|
|
07:42.330 --> 07:57.717
|
|
And so if somebody like Casey Means and somebody like Dr. Cruz want to pretend, in a coordinated way, that they're arguing with each other and keep a third unwitting party like Mary Talley Bowden on narrative, they can do that.
|
|
|
|
07:58.297 --> 07:59.438
|
|
And people will do that.
|
|
|
|
07:59.518 --> 08:03.240
|
|
People are doing that because this is about all the marbles.
|
|
|
|
08:03.260 --> 08:04.821
|
|
This is about the grandchildren of earth.
|
|
|
|
08:05.509 --> 08:11.390
|
|
And that's why there's a fake anti-vax, fake health freedom movement in America.
|
|
|
|
08:11.890 --> 08:14.591
|
|
And so yesterday I made some pretty bold statements.
|
|
|
|
08:15.731 --> 08:20.412
|
|
And the bold statements were based largely around this video.
|
|
|
|
08:21.253 --> 08:28.514
|
|
And I think it's an important thing to understand what I was trying to get across here.
|
|
|
|
08:28.554 --> 08:30.275
|
|
I think I'm just going to come on.
|
|
|
|
08:30.475 --> 08:31.795
|
|
Oh darn, I didn't do that yet.
|
|
|
|
08:33.097 --> 08:34.279
|
|
Give me one second, please.
|
|
|
|
08:34.299 --> 08:42.887
|
|
Let me set up my, excuse me, my equipment. I bet that's not even in focus either.
|
|
|
|
08:42.967 --> 08:44.809
|
|
Is it? It's not in focus.
|
|
|
|
08:44.869 --> 08:52.636
|
|
I'm sure. Negative. Such a failure I am. Okay, I'm just gonna wing it here.
|
|
|
|
08:53.317 --> 08:57.701
|
|
Hopefully I'll get it about right. Let me see if I can use a book.
|
|
|
|
08:59.853 --> 09:02.535
|
|
Sorry about this, ladies and gentlemen, I apologize.
|
|
|
|
09:03.055 --> 09:05.497
|
|
I didn't check this one camera here.
|
|
|
|
09:05.557 --> 09:06.478
|
|
I think that's pretty good.
|
|
|
|
09:07.178 --> 09:08.539
|
|
If it's not, I'll fix it later.
|
|
|
|
09:09.420 --> 09:10.100
|
|
Let's see.
|
|
|
|
09:10.120 --> 09:13.162
|
|
Oh man, that's too big.
|
|
|
|
09:14.003 --> 09:14.843
|
|
You need to be at A.
|
|
|
|
09:18.601 --> 09:19.421
|
|
There we go.
|
|
|
|
09:20.861 --> 09:21.342
|
|
All right.
|
|
|
|
09:21.442 --> 09:42.906
|
|
So the list that I had up yesterday with regard to this video, and Robert Malone pretending to be a hero, is that he is connected to all of this fraudulent immunology that goes all the way back, all the way back to Gallo and Gardner and the discovery of the HIV
|
|
|
|
09:45.387 --> 09:48.388
|
|
AIDS virus, the HIV virus which causes AIDS.
|
|
|
|
09:49.208 --> 09:53.409
|
|
Again, not at all dissimilar to how SARS-CoV-2 causes COVID.
|
|
|
|
09:54.510 --> 10:07.013
|
|
So this is a kind of narrative, a kind of biological mythology that has been orchestrated and rehearsed several times, tested out or prototyped several times.
|
|
|
|
10:07.773 --> 10:10.734
|
|
One of these kinds of narratives that was
|
|
|
|
10:12.054 --> 10:28.908
|
|
seeded earlier than the pandemic that most of us were not aware of is Judy Mikovits and a potential cure for chronic fatigue syndrome, a potential source of it, a potential way to vaccinate against the cause of it so that the rest of the vaccine schedule would be safe.
|
|
|
|
10:29.809 --> 10:40.258
|
|
And I think this is all, again, it's not coincidence that all of these people are so tightly connected between a very limited number of mentors.
|
|
|
|
10:41.328 --> 10:48.118
|
|
And somewhere in that mentor chain is definitely Tony Fauci, but I'm not sure that he's necessarily a big driver of that brain trust.
|
|
|
|
10:48.178 --> 10:51.162
|
|
However, he is an excellent participant in it.
|
|
|
|
10:52.810 --> 11:15.261
|
|
And so I, and I think only really Mark Kulak, are able to succinctly say that we think it's important to understand Robert Malone, because he is central to the orchestration of, and maybe even the on-the-fly adjustment of, the pandemic narrative, to try and keep the cruise ship on target.
|
|
|
|
11:16.561 --> 11:17.922
|
|
Pretty much in your face.
|
|
|
|
11:18.003 --> 11:19.424
|
|
All right, look, you gotta go.
|
|
|
|
11:19.764 --> 11:21.205
|
|
We're all very busy.
|
|
|
|
11:21.405 --> 11:22.526
|
|
I want to thank you for taking the time.
|
|
|
|
11:22.546 --> 11:24.567
|
|
I have one more question, very important question.
|
|
|
|
11:25.288 --> 11:26.469
|
|
Do you work for the CIA?
|
|
|
|
11:28.650 --> 11:36.136
|
|
Okay, so I could make a joke out of that, and then that would be clipped and become something on TikTok.
|
|
|
|
11:37.417 --> 11:38.498
|
|
I do not.
|
|
|
|
11:39.078 --> 11:39.919
|
|
I never have.
|
|
|
|
11:41.480 --> 11:45.063
|
|
I have had secret security clearance through the DOD.
|
|
|
|
11:46.687 --> 11:51.412
|
|
You cannot be working, and there was a point in my career where I was out of a job.
|
|
|
|
11:54.106 --> 12:03.029
|
|
basically run out of academia because I had blown the whistle on the Jesse Gelsinger gene therapy death at UPenn.
|
|
|
|
12:03.969 --> 12:06.190
|
|
Now, you can look at it yesterday.
|
|
|
|
12:06.270 --> 12:08.390
|
|
I started the stream with this video yesterday.
|
|
|
|
12:08.410 --> 12:09.390
|
|
I don't want to play it again.
|
|
|
|
12:09.410 --> 12:10.671
|
|
I don't want to belabor the points.
|
|
|
|
12:10.691 --> 12:14.312
|
|
But what I didn't do yesterday is I didn't actually read what I had on the screen.
|
|
|
|
12:14.772 --> 12:20.374
|
|
And a lot of people were really mad because I guess what I said on the screen yesterday was not very fair.
|
|
|
|
12:20.394 --> 12:21.334
|
|
And it was also in red.
|
|
|
|
12:22.054 --> 12:23.654
|
|
you were trying to hide what you were saying.
|
|
|
|
12:23.674 --> 12:25.275
|
|
It was pretty underhanded.
|
|
|
|
12:25.335 --> 12:36.297
|
|
Well, it's not really underhanded because I'm just pointing out how he has admitted in previous videos and previous interviews because he's done so many of them despite having a farm to manage.
|
|
|
|
12:36.777 --> 12:39.538
|
|
He's been so many places telling these stories.
|
|
|
|
12:39.598 --> 12:49.740
|
|
It's very easy to verify that he considers himself one of the heroes of the Zika virus outbreak, one of the heroes of the Ebola vaccine for none other than really bringing it to Merck.
|
|
|
|
12:50.160 --> 12:58.807
|
|
He considers himself an expert on, probably, what happened around USAMRIID, because he told Steven Hatfill that he was vetted for that job.
|
|
|
|
12:58.927 --> 13:02.830
|
|
And Steven Hatfill kind of laughed and didn't really take it seriously, but that's what he said.
|
|
|
|
13:06.625 --> 13:18.016
|
|
I believe, and I think Mark Kulak would say this is not crazy at least, if not more or less what he thinks too, that Whitney Webb is probably reading his script.
|
|
|
|
13:18.076 --> 13:23.401
|
|
Remember that Whitney Webb was the one who wrote this controversial article about the man in Wuhan.
|
|
|
|
13:26.223 --> 13:31.188
|
|
Whitney Webb was also present at the inaugural CHD conference where he was also present.
|
|
|
|
13:33.260 --> 13:43.570
|
|
And so I was told the story that Mary Holland and Meryl Nass vetted him for CHD, like went to his house and met him to see if he was a cool guy, stayed there for two days.
|
|
|
|
13:44.011 --> 13:48.996
|
|
And while they were there, they figured out that he didn't even know what the WEF was.
|
|
|
|
13:49.236 --> 13:51.879
|
|
They had to explain to him what the WEF was.
|
|
|
|
13:56.109 --> 14:03.371
|
|
That might sound ridiculous, but that's absolutely what I was told by no other person than Meryl Nass herself.
|
|
|
|
14:05.212 --> 14:11.274
|
|
Later in that same conversation, I said that Robert Malone is either a liar or he's an idiot if he took the shot.
|
|
|
|
14:11.934 --> 14:15.975
|
|
And she stormed away from the table saying, well, not everybody is as smart as you, Jay.
|
|
|
|
14:17.155 --> 14:22.897
|
|
And I wish I was making that story up, pulling it out of thin air, but that's actually what happened.
|
|
|
|
14:23.564 --> 14:29.450
|
|
the night before Bobby announced his presidential candidacy in Boston.
|
|
|
|
14:31.432 --> 14:36.597
|
|
The extraordinary thing about it is, is that Mark Kulak was sitting at the table when that happened.
|
|
|
|
14:37.017 --> 14:43.764
|
|
She came over to our table and sat down and started talking smack about her friend Robert Malone.
|
|
|
|
14:44.825 --> 14:46.086
|
|
And when I wouldn't hear it,
|
|
|
|
14:46.840 --> 14:54.103
|
|
that he's either an expert and he's lying about taking the shot or he's not an expert and he's an idiot.
|
|
|
|
14:54.723 --> 14:58.624
|
|
See, that's the only two things that it can be because otherwise he's just a liar.
|
|
|
|
14:58.665 --> 14:59.585
|
|
He didn't take the shot.
|
|
|
|
14:59.605 --> 15:01.085
|
|
Of course he didn't take the shot.
|
|
|
|
15:01.806 --> 15:04.307
|
|
He's been an expert about this for 20 years.
|
|
|
|
15:06.954 --> 15:17.540
|
|
Even if he thought that transfection would be good for something, he was overdosing on famotidine to treat his coronavirus.
|
|
|
|
15:17.980 --> 15:22.702
|
|
He argued on the Bret Weinstein podcast that he took it so that he could travel.
|
|
|
|
15:25.004 --> 15:25.924
|
|
Why was he traveling?
|
|
|
|
15:26.585 --> 15:30.767
|
|
First time he went to Rome, it was also in part to visit the Vatican.
|
|
|
|
15:30.867 --> 15:34.249
|
|
So I mean, he took the shot because the Pope told him to.
|
|
|
|
15:35.270 --> 15:37.293
|
|
because the Italian government required it.
|
|
|
|
15:37.413 --> 15:49.791
|
|
Unfortunately, his handler, Steven Hatfill, was on the Tommy podcast alone and told a separate story about how when he went to Rome with Robert Malone, they both got there and they didn't have their vaccine cards.
|
|
|
|
15:52.150 --> 15:52.991
|
|
It was kind of funny.
|
|
|
|
15:53.791 --> 15:55.613
|
|
It didn't prevent him from entering the country.
|
|
|
|
15:55.633 --> 15:57.014
|
|
It was not really a big problem.
|
|
|
|
15:57.054 --> 16:00.798
|
|
It's probably because they were part of a US government envoy or something like that.
|
|
|
|
16:01.458 --> 16:05.322
|
|
So anyway, Mary Holland is the reason why I was fired from CHD.
|
|
|
|
16:05.742 --> 16:08.845
|
|
I think she's the primary saboteur of that organization.
|
|
|
|
16:08.905 --> 16:11.267
|
|
And a lot of the other ones just kind of go along with it.
|
|
|
|
16:11.307 --> 16:17.452
|
|
She has too much power because as the CEO of that nonprofit, I think she can basically fire and hire who she wants to.
|
|
|
|
16:17.512 --> 16:18.113
|
|
And she does.
|
|
|
|
16:19.194 --> 16:19.294
|
|
Um,
|
|
|
|
16:20.931 --> 16:24.932
|
|
I think that Robert Malone is probably responsible for scripting a lot of these people.
|
|
|
|
16:24.972 --> 16:27.533
|
|
I think he probably scripted Mikovits when she came out.
|
|
|
|
16:28.013 --> 16:36.216
|
|
I think he probably scripted a lot of the journalists, and he even brags about scripting the journalist who made her career at the New York Times on the Jesse Gelsinger story.
|
|
|
|
16:37.782 --> 16:44.067
|
|
And I think the whole reason why is because, you know, it's this idea to never question intramuscular injection.
|
|
|
|
16:44.087 --> 16:50.932
|
|
And more importantly, from an intellectual property perspective, never question PCR and lateral flow diagnostics.
|
|
|
|
16:52.106 --> 16:57.128
|
|
as proven standards of diagnostics in medicine.
|
|
|
|
16:57.168 --> 16:58.249
|
|
And that's an absurdity.
|
|
|
|
16:58.649 --> 17:00.830
|
|
That's millions, if not billions of dollars.
|
|
|
|
17:01.210 --> 17:09.454
|
|
And that is a remnant stream of DNA that will never go away as long as this is not properly regulated by an FDA with teeth.
|
|
|
|
17:10.353 --> 17:14.336
|
|
which is precisely what they want to prevent from happening.
|
|
|
|
17:14.816 --> 17:27.085
|
|
Because the real FDA would now reflect back on five years of a pandemic emergency and try to take stock on what we overextended ourselves on and take stock in what we've learned.
|
|
|
|
17:27.125 --> 17:34.051
|
|
And one of the things we should have learned is that PCR is a terrible way to diagnose a medical condition because
|
|
|
|
17:35.172 --> 17:47.727
|
|
especially in an emergency situation, you might end up giving people pure supplementary oxygen because they tested positive for something and they have a pulse ox below, you know, let's say 90, 96 or something like that.
|
|
|
|
17:55.106 --> 18:00.048
|
|
So if you look forward on this a little bit, you know, that's why this, this is so crazy.
|
|
|
|
18:00.128 --> 18:05.150
|
|
These current round of vaccines, we're using traditional mRNA technology.
|
|
|
|
18:05.190 --> 18:06.350
|
|
And I just gagged at that.
|
|
|
|
18:09.772 --> 18:20.716
|
|
So, so we've already got, you know, modern and traditional, do you believe, you know, cause I mean, do you still, do you believe that some vaccines are good?
|
|
|
|
18:22.737 --> 18:22.797
|
|
Uh,
|
|
|
|
18:24.514 --> 18:36.978
|
|
I think that, so the statement was made that these current round of vaccines were using traditional mRNA technology, and I just gagged at that.
|
|
|
|
18:39.901 --> 18:42.142
|
|
So I will read these out for you a little bit.
|
|
|
|
18:42.162 --> 18:50.945
|
|
Let me just go through what I missed pointing out yesterday in that same interview.
|
|
|
|
18:51.425 --> 19:00.529
|
|
So first of all, in that interview, he actually admits that he's a longtime DOD insider, and the DOD is who Sasha Latypova blames for killing Americans.
|
|
|
|
19:00.589 --> 19:05.571
|
|
So it's a very strange contradiction: a longtime DOD guy.
|
|
|
|
19:07.103 --> 19:13.989
|
|
He claims to be a repeat whistleblower, an ethics king, in many of the interviews, including this one, this time with Jesse Gelsinger.
|
|
|
|
19:14.009 --> 19:25.560
|
|
The problem with that is, after he says CIA and CIA and Fauci and AIDS and Ebola and all this other stuff in those interviews, Jesse Gelsinger was adenovirus.
|
|
|
|
19:26.701 --> 19:29.143
|
|
Adenovirus injected into the liver.
|
|
|
|
19:32.873 --> 19:51.081
|
|
Now, although he went through acute liver failure, is it really possible that Robert Malone is naive enough about biology that he thought that the reason why the adenovirus didn't work was because he gave too much adenovirus into the liver?
|
|
|
|
19:52.423 --> 19:58.108
|
|
which is of course what he claims, which is of course what he claims to be whistleblowing on, right?
|
|
|
|
19:58.128 --> 20:04.833
|
|
That they went off the protocol and they gave him a higher dose than the protocol allowed.
|
|
|
|
20:05.013 --> 20:12.279
|
|
And earlier in that video, he says that, you know, it's really all about dose and about which populations it's appropriate for.
|
|
|
|
20:14.300 --> 20:16.242
|
|
And so he's on the same script.
|
|
|
|
20:16.322 --> 20:19.525
|
|
It's the same very, very nuanced,
|
|
|
|
20:20.723 --> 20:22.645
|
|
legalese-like script.
|
|
|
|
20:26.108 --> 20:36.776
|
|
And so I just want to point out that the ethics king should have damn well known that adenovirus injected into the body, wherever it would go and express a protein, would cause autoimmunity.
|
|
|
|
20:36.796 --> 20:45.924
|
|
And so that might not be liver failure if you don't inject it in the liver, but it could be any other number of autoimmunities that could take any number of years to come about.
|
|
|
|
20:47.492 --> 21:00.762
|
|
I would argue that he actually understood very well why Jesse Gelsinger was an acute failure and why transfection intramuscularly injected might not always be an acute failure.
|
|
|
|
21:03.283 --> 21:08.607
|
|
But he's definitely sophisticated enough to know that it would be a long-term failure inevitably.
|
|
|
|
21:10.188 --> 21:15.092
|
|
That's why I spoke out about it because it is a long-term failure inevitably.
|
|
|
|
21:15.152 --> 21:16.373
|
|
There are no animals
|
|
|
|
21:17.199 --> 21:29.243
|
|
on any academic benches or in any academic labs that have been transfected as babies in order to overexpress a protein and then studied for their whole lives the results of that transfection.
|
|
|
|
21:30.423 --> 21:32.704
|
|
Because transfected animals die.
|
|
|
|
21:37.127 --> 22:01.486
|
|
Um, so he says Substack in this interview repeatedly. And interestingly, I just never realized this until yesterday, but the whole reason why Paul Offit does a podcast on YouTube with Vincent Racaniello, the guy who is basically credited with inventing infectious clones with David Baltimore, is that
|
|
|
|
22:02.580 --> 22:08.684
|
|
they do a video podcast about Paul Offit's Substack.
|
|
|
|
22:09.845 --> 22:16.190
|
|
Isn't that remarkable that Paul Offit makes money from Substack and so does Robert Malone?
|
|
|
|
22:18.143 --> 22:45.993
|
|
And if we just were to list all of the people that make a lot of money from Substack from very early on, who knew that they should go there and were producing articles every day, one or two. In that video even Del Bigtree marvels at him, like, you produce a Substack a day, sometimes multiple Substacks a day, all seven days a week, just like Jessica Hockett, just like a lot of these other fake Substacks that are using a combination of AI
|
|
|
|
22:46.833 --> 22:50.195
|
|
content and AI-pirated content.
|
|
|
|
22:52.156 --> 23:07.663
|
|
You don't think for one second that they're taking every stream I ever make and feeding it into a transcript and then putting it through a chat bot and then trying to digest something out that Jessica Hockett can publish and that they have a whole team of people crossing stuff out.
|
|
|
|
23:07.703 --> 23:08.924
|
|
Not that, not that, not that.
|
|
|
|
23:08.964 --> 23:09.704
|
|
Okay, take these.
|
|
|
|
23:09.764 --> 23:10.665
|
|
These terms are good.
|
|
|
|
23:10.845 --> 23:11.565
|
|
Absorb those.
|
|
|
|
23:13.366 --> 23:15.147
|
|
They're spending a lot of money
|
|
|
|
23:16.673 --> 23:20.994
|
|
trying to use up what they can and then use the rest to try and discourage me.
|
|
|
|
23:21.034 --> 23:22.174
|
|
So let's keep looking at this.
|
|
|
|
23:22.234 --> 23:27.995
|
|
He says all the book titles accurately, except he calls this one the lab leak coverup.
|
|
|
|
23:29.195 --> 23:29.995
|
|
Check it yourself.
|
|
|
|
23:30.775 --> 23:32.816
|
|
He gets this title completely wrong.
|
|
|
|
23:33.156 --> 23:34.616
|
|
I wonder if that was by accident.
|
|
|
|
23:35.216 --> 23:43.878
|
|
Maybe it was by accident that he spelled David Hone's name wrong in the Anthony Fauci book as one of the readers of it.
|
|
|
|
23:44.118 --> 23:44.958
|
|
His good friend.
|
|
|
|
23:45.874 --> 23:47.255
|
|
Davis Hone.
|
|
|
|
23:50.097 --> 24:00.945
|
|
So he says he's never revealed classified information, but I thought that he screwed himself as a whistleblower, but no, he gave away only information that was in the public domain.
|
|
|
|
24:00.985 --> 24:06.189
|
|
That's what he said in this video, which means he's not really that much of a hero.
|
|
|
|
24:06.229 --> 24:08.310
|
|
He didn't actually burn any bridges.
|
|
|
|
24:09.051 --> 24:11.873
|
|
Maybe he didn't burn any bridges either with Jesse Gelsinger.
|
|
|
|
24:13.715 --> 24:14.215
|
|
Do you see?
|
|
|
|
24:14.756 --> 24:31.710
|
|
If he's carefully walking the line, it's kind of different than, I don't know, moving to Russia. I mean, he said in this video that he has many non-disclosure agreements.
|
|
|
|
24:31.790 --> 24:41.318
|
|
One of the only reasons why I can talk the way that I talk and tell you all the things that I can tell you is because I didn't sign a non-disclosure agreement with CHD when I left.
|
|
|
|
24:43.806 --> 24:52.047
|
|
I don't know how much they would have had to offer me for me to sign, but I assure you they offered me $8,000 and it was a pretty easy decision to make not to sign it.
|
|
|
|
24:53.339 --> 24:56.781
|
|
That's why I can tell you what we talked about in every meeting.
|
|
|
|
24:56.841 --> 24:58.781
|
|
I can tell you how many people were in the meeting.
|
|
|
|
24:58.841 --> 25:00.882
|
|
I can tell you how the book is written.
|
|
|
|
25:00.962 --> 25:02.343
|
|
I can tell you how long it took.
|
|
|
|
25:02.443 --> 25:03.824
|
|
I can tell you who was involved.
|
|
|
|
25:03.884 --> 25:19.831
|
|
I can tell you all the people who know that this guy right here is a fraud and involved with it, probably using the script that Kevin McKernan and Robert Malone had prepared before the pandemic, the same script that Charles Rixey was supposed to play hero boy on.
|
|
|
|
25:21.633 --> 25:26.775
|
|
And what they attempted to do was use Charles Rixey to get me to be the guy in this book.
|
|
|
|
25:26.835 --> 25:29.877
|
|
And then, you know, Charles Rixey and I would both be in the book.
|
|
|
|
25:29.917 --> 25:35.940
|
|
And then, you know, Charles Rixey and I, we'd get on Joe Rogan and give a shout-out to Gigaohm Biological with no website.
|
|
|
|
25:35.960 --> 25:36.760
|
|
And then it would be over.
|
|
|
|
25:38.181 --> 25:39.121
|
|
We'd be moving on.
|
|
|
|
25:39.921 --> 25:42.142
|
|
The currency would collapse and yada, yada, yada.
|
|
|
|
25:42.162 --> 25:47.405
|
|
Everybody would have accepted it and everything would be fine because it was a lab leak, and we're not there.
|
|
|
|
25:48.143 --> 26:11.035
|
|
We're not there for a number of reasons, but one of the big reasons why we're not there is because these clowns made a mistake, and this guy was in a web meeting, and my reaction was not, oh cool, Rixey brought a guy that I know into a private meeting about the book, and his name is Kevin McCairn, he's cool, he's a neuroscientist.
|
|
|
|
26:11.155 --> 26:12.355
|
|
No, that's not what I did.
|
|
|
|
26:14.596 --> 26:17.738
|
|
I said that this guy is a meddler, I think he's fake,
|
|
|
|
26:20.865 --> 26:28.423
|
|
And I think it's quite scary that Charles Rixey is in a hotel room with that guy and this manuscript open.
|
|
|
|
26:28.443 --> 26:29.505
|
|
That's really weird.
|
|
|
|
26:30.717 --> 26:36.079
|
|
And then Charles Rixey didn't call in for a couple days and they fired him, or at least that's what they said.
|
|
|
|
26:36.099 --> 26:40.561
|
|
I don't know what happens because none of this stuff is ever in person.
|
|
|
|
26:40.581 --> 26:43.862
|
|
There's no Children's Health Defense to go to.
|
|
|
|
26:44.543 --> 26:50.905
|
|
So as far as I know, I was working for something called Children's Health Defense, but who was handling me?
|
|
|
|
26:51.305 --> 26:52.406
|
|
Who was talking to me?
|
|
|
|
26:52.466 --> 26:53.746
|
|
What were we really doing?
|
|
|
|
26:53.766 --> 26:54.987
|
|
I don't know.
|
|
|
|
26:56.025 --> 27:04.590
|
|
I do appear to have been able to get Sina Bavari and Allison Totura's 2019 coronavirus prediction paper in the book.
|
|
|
|
27:05.531 --> 27:08.073
|
|
They seem to have absorbed that without much pain.
|
|
|
|
27:08.453 --> 27:11.154
|
|
So there's some sign that I contributed to the book.
|
|
|
|
27:11.195 --> 27:17.038
|
|
There's some sign that something kind of pseudo-organic is contained within this book.
|
|
|
|
27:17.498 --> 27:24.663
|
|
But in the end, something about it tastes so disturbing, so metallic.
|
|
|
|
27:26.192 --> 27:41.535
|
|
And it has to do with the fact that every one of these people at one point or another pretended to be my good friend, my big fan, and wanted to help in any way they possibly could until I described any number of ways they could help me.
|
|
|
|
27:41.575 --> 27:43.075
|
|
And then they were pretty busy.
|
|
|
|
27:45.876 --> 27:53.838
|
|
And so, as a person who worked for Robert F. Kennedy Jr., I can say he never really lied to me.
|
|
|
|
27:54.018 --> 27:55.858
|
|
I didn't get to talk to him that often.
|
|
|
|
27:57.421 --> 27:59.641
|
|
and then got handed off to CHD.
|
|
|
|
28:02.142 --> 28:11.284
|
|
Yes, I got paid by them, but I tried my best to maintain my integrity until it was impossible to do so.
|
|
|
|
28:12.744 --> 28:24.166
|
|
And when it was impossible to maintain my integrity, I kept going and they fired me.
|
|
|
|
28:25.202 --> 28:29.366
|
|
They fired me because I kept talking about the truth.
|
|
|
|
28:30.507 --> 28:32.469
|
|
And they kept asking me to stop.
|
|
|
|
28:32.729 --> 28:36.393
|
|
So many different people asked me to stop naming names.
|
|
|
|
28:36.553 --> 28:40.196
|
|
And many of them asked me to stop naming that name right there.
|
|
|
|
28:41.057 --> 28:41.818
|
|
Robert Malone.
|
|
|
|
28:43.239 --> 28:46.662
|
|
Jay Bhattacharya has asked me to not name that name anymore.
|
|
|
|
28:46.742 --> 28:50.326
|
|
Mary Holland fired me for naming that name.
|
|
|
|
28:53.058 --> 29:04.048
|
|
And maybe for stating specifically that those two, sorry, Robert Malone and Meryl Nass are extremely good friends.
|
|
|
|
29:04.769 --> 29:15.519
|
|
Because at the Boston announcement, when Robert Malone stood up to walk over to Meryl Nass in the aisle, there were no words.
|
|
|
|
29:17.121 --> 29:18.202
|
|
There was no handshake.
|
|
|
|
29:19.639 --> 29:34.984
|
|
It was an immediate hug, and it was a tight hug, just like the hug that Steve Bannon gave to Li-Meng Yan on the stage in January of 2020, or February of 2020, whenever it was, saying that she needs to run for Senate.
|
|
|
|
29:39.166 --> 29:40.066
|
|
And it was striking.
|
|
|
|
29:41.486 --> 29:43.347
|
|
I think one of us got it on camera even.
|
|
|
|
29:44.659 --> 29:50.502
|
|
Almost without words, they walked away and went away from the crowd to talk for a little while and then came back.
|
|
|
|
29:50.982 --> 30:02.408
|
|
This is a very, very big deal that he says he worked on social media control programs for Arab Spring and that that work remains classified and in use by the CIA.
|
|
|
|
30:05.276 --> 30:14.418
|
|
Ladies and gentlemen, that interview is basically a gigantic admission that the conservative health freedom movement in America is a giant farce.
|
|
|
|
30:15.418 --> 30:17.138
|
|
Maybe Robert F. Kennedy Jr.
|
|
|
|
30:17.178 --> 30:33.962
|
|
has been long conned and played and maybe he's not clever enough to see it, but I think more likely it is that several people in his inner circle are actively lying to him and they're happy to do it because they see themselves as part of a national security operation
|
|
|
|
30:35.123 --> 30:50.693
|
|
making sure that there is no meaningful resistance to intramuscular injection as medicine, that there's no meaningful resistance to virology as a science, and that there's no meaningful resistance to public health as a reason.
|
|
|
|
30:53.215 --> 30:53.936
|
|
That's what I think.
|
|
|
|
30:55.016 --> 31:00.140
|
|
Ladies and gentlemen, this is Gigaohm Biological, a high-resistance, low-noise information brief brought to you by biologists.
|
|
|
|
31:00.220 --> 31:02.721
|
|
It's the 20th of May, 2025.
|
|
|
|
31:04.202 --> 31:05.083
|
|
Just want to be funny.
|
|
|
|
31:06.023 --> 31:11.547
|
|
Gentlemen, people used to say it was impossible to run tests on a single drop of blood.
|
|
|
|
31:11.867 --> 31:16.010
|
|
Well, I fired those people and hired new ones, and now, behold.
|
|
|
|
31:19.269 --> 31:20.631
|
|
Oh, science noise.
|
|
|
|
31:20.931 --> 31:31.660
|
|
With just one drop of blood, our device can tell you exactly which Sopranos character you are without having to take a dumb quiz written by a social media intern.
|
|
|
|
31:31.860 --> 31:32.741
|
|
Why Sopranos?
|
|
|
|
31:33.061 --> 31:36.364
|
|
Because my company is like a criminal enterprise.
|
|
|
|
31:36.684 --> 31:36.984
|
|
All right.
|
|
|
|
31:37.465 --> 31:41.388
|
|
And for no particular reason, I'm going to be behind this curtain for the entire demonstration.
|
|
|
|
31:42.289 --> 31:44.771
|
|
All we need is a tiny blood sample.
|
|
|
|
31:47.698 --> 31:48.979
|
|
Ew, it's doing stuff.
|
|
|
|
31:53.543 --> 31:55.044
|
|
This says Artie Bucco.
|
|
|
|
31:55.064 --> 31:55.685
|
|
Who's that?
|
|
|
|
31:56.125 --> 31:57.346
|
|
Bald guy at the restaurant.
|
|
|
|
31:57.566 --> 31:57.786
|
|
Oh, yeah.
|
|
|
|
31:57.986 --> 31:59.428
|
|
He and Tony were friends growing up.
|
|
|
|
31:59.768 --> 32:00.669
|
|
Ah, right, right, right.
|
|
|
|
32:01.149 --> 32:03.011
|
|
I'm not sure this machine is for us.
|
|
|
|
32:03.491 --> 32:03.952
|
|
Just a sec.
|
|
|
|
32:03.992 --> 32:05.693
|
|
Let me, uh, recalibrate something.
|
|
|
|
32:08.956 --> 32:09.496
|
|
Tony.
|
|
|
|
32:09.797 --> 32:10.297
|
|
All right.
|
|
|
|
32:10.537 --> 32:12.459
|
|
Now this is f****** Walgreens.
|
|
|
|
32:15.965 --> 32:24.750
|
|
So I have been making the argument, of course, that what this is really all about is seen from a longer time horizon.
|
|
|
|
32:25.451 --> 32:32.615
|
|
And if you realize that they understand that the population pyramid of the West is not shaped in a healthy way,
|
|
|
|
32:33.610 --> 32:41.955
|
|
and that recovery could be 10 generations away or more, they understand that there is a population collapse coming.
|
|
|
|
32:41.995 --> 32:57.165
|
|
And so they can make all the, let's say depopulation narratives they want to, and that can go viral on the internet because that essentially attributes a natural phenomenon to their power.
|
|
|
|
32:57.825 --> 33:00.967
|
|
It attributes a natural phenomenon to their control.
|
|
|
|
33:01.547 --> 33:20.657
|
|
And so as we watch our communities kind of slow motion collapse as the elementary schools are less full, and we see our children sort of grow up in a funny way because of the way that social media is warping their understanding of society,
|
|
|
|
33:21.517 --> 33:30.060
|
|
We won't really understand that one of the biggest forces that we have to face is this depopulation phenomenon.
|
|
|
|
33:30.200 --> 33:32.340
|
|
It is going to happen.
|
|
|
|
33:33.220 --> 33:41.503
|
|
All of these houses in Pittsburgh that are occupied by retired people are going to be empty.
|
|
|
|
33:42.486 --> 33:47.748
|
|
If there isn't somebody to move into them, to take them over, to maintain them, they're gonna fall apart.
|
|
|
|
33:48.749 --> 33:51.750
|
|
And these people have seen this all coming for a long time.
|
|
|
|
33:53.171 --> 34:03.716
|
|
These people have understood that there's huge opportunities here for wealth, for resources, and for reorganization of the way that societies govern themselves.
|
|
|
|
34:04.096 --> 34:10.139
|
|
And of course there needs to be a reorganization of global trade if the consumer base
|
|
|
|
34:10.879 --> 34:25.952
|
|
that was once the source of a thriving NCAA football season every year and a thriving NCAA basketball season every year is shrinking, then suddenly those universities are going to very quickly run out of students.
|
|
|
|
34:29.876 --> 34:33.759
|
|
And so something about that ecosystem has got to change drastically.
|
|
|
|
34:35.911 --> 34:42.393
|
|
Something about the way that the Ivory Towers recruit students is gonna have to change drastically.
|
|
|
|
34:42.553 --> 34:49.496
|
|
All of these things are seen, foreseen, and looked at for a long time before they have arrived.
|
|
|
|
34:49.636 --> 35:02.621
|
|
And the coronavirus pandemic, the myth of that, is part of this mythological history that we're experiencing that encompasses much more than that, of course.
|
|
|
|
35:03.686 --> 35:04.974
|
|
And the idea, again, is
|
|
|
|
35:07.867 --> 35:10.288
|
|
for us not to understand what's going on.
|
|
|
|
35:10.348 --> 35:23.473
|
|
And one of the most important insights, maybe, that I've had for myself in the last, let's say, two or three months is this idea that popped into my head about three months ago, but I didn't really understand what it actually meant.
|
|
|
|
35:24.353 --> 35:29.975
|
|
And then about a week ago, it really struck me and I rearranged the words and it made much more sense.
|
|
|
|
35:30.636 --> 35:36.678
|
|
And when I say get off social media, I mean, really, this is already, it may already be too late.
|
|
|
|
35:37.967 --> 35:44.973
|
|
to some extent, that the trap was already being set a long time ago, already sprung a long time ago.
|
|
|
|
35:45.513 --> 35:48.295
|
|
Most of the wealth of America may already be gone.
|
|
|
|
35:49.636 --> 35:57.603
|
|
In fact, that may be the reason why we are at where we are at right now, that the clock has already effectively been run out.
|
|
|
|
35:57.723 --> 36:04.809
|
|
And now the people are trying as best they can to leave the stadium without letting anybody know that the game's already over.
|
|
|
|
36:08.482 --> 36:19.930
|
|
I don't want to be too pessimistic, but one of the last ways out is to stop using social media the way they want you to use it and start using it exclusively to subvert it.
|
|
|
|
36:21.271 --> 36:25.234
|
|
The only thing that we should be doing is trying to encourage people to get off of it.
|
|
|
|
36:26.374 --> 36:31.258
|
|
And I don't know all the alternatives, but you know, there's only 24 hours in a day.
|
|
|
|
36:31.338 --> 36:34.340
|
|
Nobody needs to be on social media unless they're selling something.
|
|
|
|
36:35.600 --> 36:39.802
|
|
And if you're buying something, I guess you can go on social media if you want to.
|
|
|
|
36:40.322 --> 36:55.208
|
|
But look, get off social media because the reason why Twitter had a character limit was because their AI machine learning algorithms had a character limit for optimal training.
|
|
|
|
36:56.868 --> 37:02.851
|
|
They limited it that way so that they could train the AI with a limited amount of text examples.
|
|
|
|
37:04.090 --> 37:12.696
|
|
You know, they basically, they basically streamlined the training input that their AI would have.
|
|
|
|
37:12.756 --> 37:20.642
|
|
And then when they felt like they had exhausted that size of a piece of language, they increased it a little bit.
|
|
|
|
37:22.063 --> 37:22.683
|
|
That's all.
|
|
|
|
37:23.164 --> 37:31.750
|
|
It had nothing to do with server space or, you know, the limits of short-form publishing. Short-form publishing, my ass.
|
|
|
|
37:32.150 --> 37:32.791
|
|
Stop lying.
|
|
|
|
37:35.458 --> 37:37.419
|
|
And so we already acquiesced to it.
|
|
|
|
37:37.479 --> 37:45.925
|
|
We already trained social media and X. We already gave it feedback so that it can fool us even better.
|
|
|
|
37:47.526 --> 37:53.150
|
|
So it's already a finely tuned military social control algorithm.
|
|
|
|
37:54.525 --> 37:56.106
|
|
and you have to respect it as that.
|
|
|
|
37:56.146 --> 37:57.787
|
|
You have to taunt it as that.
|
|
|
|
37:58.367 --> 38:12.616
|
|
And so the only way to taunt it might be to find the most subversive accounts and retweet them and try to be a subversive account of your own, but do not interact with these people on that medium.
|
|
|
|
38:14.841 --> 38:17.104
|
|
You're better off not using any time there at all.
|
|
|
|
38:17.224 --> 38:19.767
|
|
I still go there and I still try to subvert it.
|
|
|
|
38:19.847 --> 38:20.868
|
|
You don't need to do that.
|
|
|
|
38:20.888 --> 38:22.150
|
|
You don't need to follow my example.
|
|
|
|
38:22.170 --> 38:23.371
|
|
You don't need to share that at all.
|
|
|
|
38:23.411 --> 38:31.921
|
|
But if you go there, one of the things you could do is retweet every one of the things that I do that you think is subversive enough to retweet.
|
|
|
|
38:33.283 --> 38:46.031
|
|
So I think no one has ever succinctly described it this way, said it this way, and that should be a huge red flag for you because I only started in this fight a few years ago.
|
|
|
|
38:46.111 --> 38:47.872
|
|
Before that, I was completely lost.
|
|
|
|
38:48.052 --> 38:53.596
|
|
I was completely an academic biologist, wanting tenure, fighting for the grant money.
|
|
|
|
38:54.645 --> 39:02.316
|
|
And for me to be here right now, sitting in front of a virtual slide that says, autism spectrum disorder is a myth.
|
|
|
|
39:02.436 --> 39:10.829
|
|
When I used to consider writing grants about this stuff, I used to do electrophysiology for a guy who worked on Angelman syndrome.
|
|
|
|
39:11.369 --> 39:16.772
|
|
and write grants where Angelman's syndrome is kind of a genetic sort of version of autism.
|
|
|
|
39:17.212 --> 39:27.458
|
|
And so if you fund us studying this genetic model of Angelman's disease in mice, then you'll also be funding autism research.
|
|
|
|
39:28.018 --> 39:35.382
|
|
So I understand what I'm saying when I say it and how crazy it is in some ways to think that I have
|
|
|
|
39:39.136 --> 39:41.258
|
|
moved that far in my thinking.
|
|
|
|
39:42.019 --> 39:56.614
|
|
In the same way, this is quite a lot of progress from my original place on my bicycle in Pittsburgh in 2020, talking about a coronavirus laboratory leak being not only likely, but something that has happened before.
|
|
|
|
39:56.654 --> 39:57.575
|
|
It happens all the time.
|
|
|
|
39:57.595 --> 39:58.676
|
|
Look at all these articles.
|
|
|
|
40:01.673 --> 40:13.160
|
|
And so one of the things that I would challenge you to do is to see that so many of these people that have been writing books and hadn't been in front of these people on stage for decades have not made the kind of progress that I've made.
|
|
|
|
40:15.621 --> 40:16.962
|
|
They've hardly made any progress.
|
|
|
|
40:17.002 --> 40:23.386
|
|
They make no scientific progress and they get right to free speech and anti-mandate.
|
|
|
|
40:24.804 --> 40:29.426
|
|
They get right to there's no placebo controlled trials and then that's it, they're done.
|
|
|
|
40:30.187 --> 40:32.228
|
|
No more progress, that's where we're stuck.
|
|
|
|
40:33.489 --> 40:52.319
|
|
Instead of being able to succinctly explain to a high school student that no, pandemics and gain of function are myths that a national security state has told you in order to make sure that you and your children never question intramuscular injection as a methodology, because they intend to use it in perpetuity
|
|
|
|
40:53.283 --> 40:55.525
|
|
and make you into an experimental animal.
|
|
|
|
40:56.166 --> 41:01.411
|
|
You never questioned virology as a science because that is the central illusion.
|
|
|
|
41:04.493 --> 41:12.981
|
|
And right now, like it or not, the state of the art in virology is PCR as a detection device.
|
|
|
|
41:13.422 --> 41:15.503
|
|
There's no, that's it.
|
|
|
|
41:16.570 --> 41:25.073
|
|
And that's why PCR as a detection device for the pandemic has never been questioned by any of these actors that I thought were good guys.
|
|
|
|
41:25.894 --> 41:30.635
|
|
Because it's part of this central objective list.
|
|
|
|
41:31.056 --> 41:34.657
|
|
The other one is that public health is never questioned as a reason.
|
|
|
|
41:37.093 --> 41:49.142
|
|
And that's important to note because a lot of these people are advocating for a computer-based, an AI-optimized, individualized health program that will include your genotype and all this other stuff.
|
|
|
|
41:50.403 --> 41:53.446
|
|
And so it's a different kind of public health.
|
|
|
|
41:59.344 --> 42:07.552
|
|
Um, the statement was made that, uh, these current round of vaccines were using traditional mRNA technology.
|
|
|
|
42:07.592 --> 42:08.773
|
|
And I just gagged at that.
|
|
|
|
42:12.196 --> 42:23.126
|
|
So, so we've already got, you know, modern and traditional, do you believe, you know, cause I mean, do you still, do you believe that some vaccines are good?
|
|
|
|
42:25.167 --> 42:25.227
|
|
Uh,
|
|
|
|
42:26.920 --> 42:33.742
|
|
I think that, uh, so for instance, some of these like yellow fever, um, is an example.
|
|
|
|
42:34.642 --> 42:42.564
|
|
Uh, I think that the, the, the, another one of my cute little phrases, you give a three year old a hammer and everything becomes a nail.
|
|
|
|
42:42.764 --> 42:42.964
|
|
Yes.
|
|
|
|
42:43.384 --> 42:49.706
|
|
And so of course, this answer goes on a long time, but he doesn't answer anything and that's because this is a joke.
|
|
|
|
42:50.968 --> 43:07.341
|
|
we are having a joke played on us, and what is most disturbing to me is that no one in the middle or on the left realizes that this administration could be discredited in a grand-slam sort of way if they just kind of
|
|
|
|
43:08.510 --> 43:25.031
|
|
publicized my story a little bit, my insights into what they're doing and what they're not doing and how they've done it and why. But of course that's never gonna happen, because both sides are on a narrative which puts
|
|
|
|
43:26.428 --> 43:33.635
|
|
Andrew Wakefield as hero or villain, Paul Offit as hero or villain, Robert F. Kennedy Jr.
|
|
|
|
43:33.815 --> 43:39.241
|
|
as hero or villain, Mary Holland as hero or villain.
|
|
|
|
43:39.581 --> 43:48.489
|
|
And so these are agreements where both of those people promote each other as the opposite of themselves.
|
|
|
|
43:49.355 --> 43:51.358
|
|
And this guy gave you that list as well.
|
|
|
|
43:51.438 --> 43:55.463
|
|
Tony Fauci and Peter Daszak are on that list as well.
|
|
|
|
43:55.503 --> 43:59.689
|
|
It is a cast of characters that reinforces each other.
|
|
|
|
44:01.676 --> 44:10.443
|
|
And the way that this guy appeared on scene and has behaved ever since has revealed him to be one of the central script writers in this.
|
|
|
|
44:10.764 --> 44:21.412
|
|
One of the only guys who can really understand how deep the illusion is and how much of it is a national security thing.
|
|
|
|
44:21.793 --> 44:25.416
|
|
That's the reason why they are working so hard on her.
|
|
|
|
44:25.496 --> 44:26.557
|
|
She's one of the last
|
|
|
|
44:29.241 --> 44:32.005
|
|
fully bamboozled, I think, unwitting participants.
|
|
|
|
44:32.066 --> 44:34.029
|
|
I don't think that she's a bad guy.
|
|
|
|
44:34.449 --> 44:36.433
|
|
It's possible, but I don't think so.
|
|
|
|
44:36.853 --> 44:42.683
|
|
I think she still has principles, and I think she could come to realize how badly she's been played.
|
|
|
|
44:44.011 --> 44:44.351
|
|
I don't know.
|
|
|
|
44:44.992 --> 44:45.492
|
|
I don't know.
|
|
|
|
44:45.792 --> 44:47.273
|
|
I know for sure that he's scripted.
|
|
|
|
44:47.713 --> 45:01.480
|
|
I know for sure that she's playing for their team because she's advocated very much for the taking of medical data and using AI to figure out the solution to the chronic disease epidemic.
|
|
|
|
45:02.400 --> 45:05.441
|
|
Not that dissimilar to what Mary Holland has advocated for.
|
|
|
|
45:05.961 --> 45:09.682
|
|
Not that dissimilar to what James Lyons-Weiler has advocated for.
|
|
|
|
45:10.083 --> 45:14.264
|
|
I'm sure that the Bitcoin Dr. Polymath guy will also advocate for that.
|
|
|
|
45:15.084 --> 45:17.705
|
|
And so this is all scripted bullshit, ladies and gentlemen.
|
|
|
|
45:18.285 --> 45:19.646
|
|
It's all scripted bullshit.
|
|
|
|
45:20.266 --> 45:30.249
|
|
That's the same reason why two of the highest positions in the FDA are occupied by COVID-19 podcasters.
|
|
|
|
45:32.301 --> 45:37.283
|
|
We are one appointment away from this being a hat trick.
|
|
|
|
45:39.264 --> 45:45.726
|
|
That guy has grown his hair out and tried to gain a little weight and puts on a suit and pretends he's an adult.
|
|
|
|
45:48.447 --> 45:50.368
|
|
He's never done any meaningful science.
|
|
|
|
45:50.428 --> 45:56.850
|
|
He can't outline one contribution that he's made to our fundamental understanding of anything.
|
|
|
|
45:58.123 --> 46:03.725
|
|
And yes, I can name exactly the fundamental contributions to our understanding of anything that I've made.
|
|
|
|
46:04.645 --> 46:09.366
|
|
And so I have made the argument that Paul Offit is the same scripted bullshit.
|
|
|
|
46:09.426 --> 46:16.288
|
|
That's why Paul Offit is plugging a Substack called Beyond the Noise.
|
|
|
|
46:16.988 --> 46:21.329
|
|
Number two, that's why Paul Offit was opposite Mary Holland in 2015 on Democracy Now.
|
|
|
|
46:23.883 --> 46:26.165
|
|
arguing exactly the same talking points.
|
|
|
|
46:26.746 --> 46:29.889
|
|
That is 10 years ago.
|
|
|
|
46:29.969 --> 46:40.719
|
|
They were put in place because they knew that a pandemic narrative like the SPARS pandemic or Dark Winter would never be able to happen
|
|
|
|
46:41.480 --> 46:56.453
|
|
if there was a significant skepticism of the basic biology of intramuscular injection, if there was a significant actual skepticism of virology and an actual skepticism of public health.
|
|
|
|
46:57.714 --> 47:04.319
|
|
And so they made those skepticisms about useless things, like are there viruses or not?
|
|
|
|
47:05.400 --> 47:07.342
|
|
Is there asymptomatic spread or not?
|
|
|
|
47:07.702 --> 47:09.244
|
|
Can you overcycle PCR?
|
|
|
|
47:15.040 --> 47:20.441
|
|
And he has been on this script, this national security script.
|
|
|
|
47:20.841 --> 47:23.342
|
|
And that's what the Human Genome Project is.
|
|
|
|
47:23.662 --> 47:25.862
|
|
It is a national security priority.
|
|
|
|
47:25.922 --> 47:28.403
|
|
It is the biggest national security priority.
|
|
|
|
47:28.983 --> 47:37.785
|
|
As Robert Malone said in that interview, pretending: so I was told, they spent a lot more money on this than on the atomic weapons program.
|
|
|
|
47:37.825 --> 47:38.725
|
|
Well, no shit.
|
|
|
|
47:41.426 --> 47:43.346
|
|
So all of these people are playing dumb.
|
|
|
|
47:44.701 --> 47:48.829
|
|
We've been hearing about how all our medical records are going electronic.
|
|
|
|
47:49.470 --> 47:52.736
|
|
So, for the record, how exactly does that work?
|
|
|
|
47:53.518 --> 47:56.424
|
|
Rather than ask your doctor, ask our Lee Cowan.
|
|
|
|
47:58.086 --> 48:02.089
|
|
This may not look like the typical setting for a medical software.
|
|
|
|
48:02.249 --> 48:02.550
|
|
Okay.
|
|
|
|
48:02.590 --> 48:03.591
|
|
So you remember, right?
|
|
|
|
48:03.651 --> 48:05.532
|
|
We can do timelines about things.
|
|
|
|
48:05.592 --> 48:09.235
|
|
If you do a 60 Minutes timeline, you'll have Daszak.
|
|
|
|
48:09.255 --> 48:11.317
|
|
And then a year or two later, you have Daszak.
|
|
|
|
48:11.797 --> 48:17.943
|
|
And then a year or two later, you have, uh, then like a month or two later, you'll have, uh, George Webb.
|
|
|
|
48:18.023 --> 48:19.444
|
|
And so then, you know, it's, it's just,
|
|
|
|
48:21.470 --> 48:23.932
|
|
60 Minutes is a very interesting timeline.
|
|
|
|
48:24.433 --> 48:36.347
|
|
In 2019, they did a story about the Epic database. And what I've been making the argument about, with the PCR at all of the universities in America, is that these
|
|
|
|
48:37.168 --> 48:42.851
|
|
supposed adults with principles did not have enough principles to keep their kids home for a year.
|
|
|
|
48:43.391 --> 48:47.113
|
|
They swabbed them every week at some of these universities.
|
|
|
|
48:47.513 --> 48:49.874
|
|
And those kids are on the Epic database.
|
|
|
|
48:50.014 --> 48:51.094
|
|
And I told you that.
|
|
|
|
48:51.234 --> 48:57.717
|
|
I said that with those swabs and using AI, they don't need identified records.
|
|
|
|
48:57.777 --> 49:00.919
|
|
They can de-identify the records and then re-identify them.
|
|
|
|
49:00.999 --> 49:03.460
|
|
Some shady means or some, uh,
|
|
|
|
49:03.900 --> 49:05.001
|
|
any means they want to.
|
|
|
|
49:05.141 --> 49:08.922
|
|
They can pay the security guard, the garbage guy, the records person.
|
|
|
|
49:10.403 --> 49:11.503
|
|
It's just a little money.
|
|
|
|
49:13.424 --> 49:15.004
|
|
This scam is real.
|
|
|
|
49:15.325 --> 49:20.427
|
|
Now, would it surprise you if the Epic database had already been running for years?
|
|
|
|
49:22.267 --> 49:31.731
|
|
Was already a billion dollar profitable thing that had over 60% of all the medical records of America already on it?
|
|
|
|
49:32.487 --> 49:33.487
|
|
Ready to go!
|
|
|
|
49:33.507 --> 49:40.709
|
|
All ready to go for the fake PCR testing that would be the mass sampling of America.
|
|
|
|
49:40.870 --> 49:41.810
|
|
All ready to go.
|
|
|
|
49:42.550 --> 49:43.410
|
|
Isn't that funny?
|
|
|
|
49:43.530 --> 49:46.511
|
|
Kevin McKernan never usefully questioned PCR.
|
|
|
|
49:46.551 --> 49:50.432
|
|
In fact, he said it was just a matter of which primers were used.
|
|
|
|
49:50.832 --> 49:54.153
|
|
Get in a little closer, and that's even more evident.
|
|
|
|
49:54.473 --> 49:56.254
|
|
It's as much theme park here as anything.
|
|
|
|
49:56.634 --> 49:59.275
|
|
Alice in Wonderland kind of stuff, literally.
|
|
|
|
49:59.855 --> 50:06.741
|
|
The workspaces here can be in old railway cars or subway cars, tree houses and gingerbread houses.
|
|
|
|
50:07.342 --> 50:13.687
|
|
Even its employee cafeteria looks like a train depot to a land that storybooks are written about.
|
|
|
|
50:14.188 --> 50:17.911
|
|
I walked through and was like, what is this?
|
|
|
|
50:19.687 --> 50:28.796
|
|
What it is is the self-described intergalactic headquarters of Epic, in the middle of the farm fields of Verona, Wisconsin, just outside Madison.
|
|
|
|
50:29.557 --> 50:32.740
|
|
If you've never heard of this place, just ask your doctor.
|
|
|
|
50:33.362 --> 50:41.825
|
|
because it's Epic software that handles the private medical records of about 60% of the patients in this country, probably yours.
|
|
|
|
50:42.726 --> 50:51.729
|
|
One of the things that strikes me is that Epic has such a big reach, and it really impacts so many people's lives, and yet so many people have never heard of Epic.
|
|
|
|
50:51.749 --> 50:52.049
|
|
I know.
|
|
|
|
50:52.650 --> 50:53.970
|
|
Yes, it's behind the scenes.
|
|
|
|
50:54.210 --> 50:56.011
|
|
We haven't advertised.
|
|
|
|
50:56.391 --> 50:58.132
|
|
We haven't put out press releases.
|
|
|
|
50:59.132 --> 51:01.353
|
|
And I don't know if that was a good thing to do or not.
|
|
|
|
51:02.302 --> 51:16.993
|
|
Judy Faulkner is the 76-year-old genius behind Epic, a computer software engineer and admitted... 76-year-old genius.
|
|
|
|
51:17.013 --> 51:19.855
|
|
76-year-old genius.
|
|
|
|
51:20.555 --> 51:20.916
|
|
What?
|
|
|
|
51:21.856 --> 51:26.740
|
|
Nerd, who built this curious place in her own curious image.
|
|
|
|
51:27.200 --> 51:28.761
|
|
Happy birthday to you.
|
|
|
|
51:29.824 --> 51:39.127
|
|
A hint of her personality was revealed last year when, to celebrate Epic's 40th anniversary, she dressed like she was back in the 70s again.
|
|
|
|
51:39.767 --> 51:41.507
|
|
Well, here's the skinny man.
|
|
|
|
51:42.688 --> 51:43.768
|
|
That's far out.
|
|
|
|
51:44.508 --> 51:45.849
|
|
She's a little far out too.
|
|
|
|
51:46.389 --> 51:47.489
|
|
Far out in front.
|
|
|
|
51:48.009 --> 51:56.552
|
|
She not only built a giant tech company from the ground up, but in the process, made herself one of the richest self-made women in the world.
|
|
|
|
51:57.253 --> 52:02.737
|
|
We have to compete with Facebook, Google, Microsoft, Apple, etc.
|
|
|
|
52:02.958 --> 52:07.962
|
|
We get a lot of acceptances because people look around and say, I think I'd like to work here.
|
|
|
|
52:08.868 --> 52:13.149
|
|
There are nearly 10,000 employees at Epic who mingle among the artwork.
|
|
|
|
52:13.490 --> 52:16.070
|
|
That about doubles the population of Verona.
|
|
|
|
52:16.691 --> 52:19.171
|
|
It's a young place, average age about 26.
|
|
|
|
52:20.152 --> 52:27.894
|
|
Peg Horner and Nicholas Bostrom could have worked in sunny Silicon Valley, but they chose to come to wintry Wisconsin instead.
|
|
|
|
52:28.415 --> 52:35.097
|
|
Because, they say, as far as big tech companies go, Epic is doing more than just building phones.
|
|
|
|
52:35.936 --> 52:39.300
|
|
We feel like we have an impact to make.
|
|
|
|
52:39.620 --> 52:42.623
|
|
And it's something that I actually really value about here.
|
|
|
|
52:42.643 --> 52:44.585
|
|
I don't feel like I'm just clocking in and clocking out.
|
|
|
|
52:44.605 --> 52:49.390
|
|
You know, we're not here to just grind out on something that's not really doing anything.
|
|
|
|
52:49.470 --> 52:53.214
|
|
It's making other people be able to be healthy and happy.
|
|
|
|
52:53.875 --> 52:59.524
|
|
It was 1979 when the company started, in a basement, with just two employees.
|
|
|
|
53:00.105 --> 53:00.565
|
|
The goal?
|
|
|
|
53:00.906 --> 53:08.677
|
|
To move patient records from overstuffed, dog-eared manila folders to digital records, accessible with a click of a mouse.
|
|
|
|
53:09.460 --> 53:13.803
|
|
No one had spent much time figuring out just how to get a computer to handle all that.
|
|
|
|
53:13.963 --> 53:19.927
|
|
And so the claim, of course, is that she did that because she wanted to help and because she wanted to make a lot of money.
|
|
|
|
53:20.528 --> 53:22.489
|
|
She wanted to get those dead presidents.
|
|
|
|
53:22.970 --> 53:24.090
|
|
OK, whatever.
|
|
|
|
53:24.871 --> 53:30.835
|
|
Not that it might not be a national security priority under the Human Genome Project to set something like that up, would it?
|
|
|
|
53:31.416 --> 53:33.577
|
|
I can't imagine that would be the case.
|
|
|
|
53:33.697 --> 53:34.057
|
|
No, no.
|
|
|
|
53:34.938 --> 53:40.770
|
|
But Faulkner always had a way with computers and she engineered a program herself.
|
|
|
|
53:41.545 --> 53:44.306
|
|
I used to like when I was a kid to play with clay and make them.
|
|
|
|
53:44.506 --> 53:47.547
|
|
I'm sorry, but this is like weird to me now.
|
|
|
|
53:47.627 --> 54:07.494
|
|
All of a sudden I'm starting to see all kinds of these, what I have to believe are fake people, because, you know, it's like Elon Musk and his six unrelated, extremely successful tech companies that are all funded by, what are they funded by? Military contracts with military-directed
|
|
|
|
54:07.994 --> 54:09.735
|
|
initiatives and directives.
|
|
|
|
54:09.795 --> 54:13.736
|
|
And so it's like, okay, but then how many of these ideas are really his?
|
|
|
|
54:13.796 --> 54:18.538
|
|
How many of these engineers did he really find and bring together and convince to work on this?
|
|
|
|
54:18.998 --> 54:22.619
|
|
And how many of them are just government employees that flew in and put on the badge?
|
|
|
|
54:23.059 --> 54:24.500
|
|
And the same thing goes here.
|
|
|
|
54:24.580 --> 54:35.984
|
|
Are you telling me that this person somehow invented a special kind of database with special kinds of blocks where you could put data in very special ways?
|
|
|
|
54:36.824 --> 54:43.305
|
|
It looks kind of like that thing in, shoot, I'm not going to think of it.
|
|
|
|
54:44.266 --> 54:49.347
|
|
What was the recent movie with Matthew McConaughey?
|
|
|
|
54:49.407 --> 54:55.368
|
|
It's like in the future and they travel through time and he's inside of this library looking thing.
|
|
|
|
54:55.428 --> 54:57.688
|
|
Like, come on, this is such a joke.
|
|
|
|
54:58.128 --> 54:58.968
|
|
Things out of clay.
|
|
|
|
54:59.189 --> 55:02.549
|
|
And I always thought of computer programming as clay of the mind.
|
|
|
|
55:03.069 --> 55:06.490
|
|
The first time I did something and there it was on the screen, it was wow.
|
|
|
|
55:07.546 --> 55:08.527
|
|
It's very creative.
|
|
|
|
55:08.647 --> 55:12.149
|
|
Holy shit, that sounds so bad.
|
|
|
|
55:12.449 --> 55:14.030
|
|
Interstellar, yes, thank you.
|
|
|
|
55:15.070 --> 55:15.831
|
|
Interstellar.
|
|
|
|
55:16.211 --> 55:21.394
|
|
To a system that is now integral to patient care in nearly every major U.S.
|
|
|
|
55:21.454 --> 55:22.055
|
|
health system.
|
|
|
|
55:22.495 --> 55:29.379
|
|
Its ubiquity means that you can now go almost anywhere to be treated, and your medical records will likely follow you there.
|
|
|
|
55:29.755 --> 55:32.496
|
|
Here's an example of what my test results look like right from here.
|
|
|
|
55:32.516 --> 55:44.119
|
|
In fact, you can now check your lab results, refill medications, make appointments, even share your medical records right from an Epic app, says engineer Sean Bina.
|
|
|
|
55:44.339 --> 55:49.780
|
|
So all of this data was never available in the past, and now you can see it all on the phone.
|
|
|
|
55:50.472 --> 55:56.375
|
|
Paper records served us pretty well for centuries, but boy, what a rat's nest of data.
|
|
|
|
55:56.735 --> 56:03.619
|
|
In this case, the patient's getting blood drawn, they're going to a surgery visit, they're going to pain clinic, they're going to mammography.
|
|
|
|
56:04.226 --> 56:09.290
|
|
Dr. Steve Peters is a pulmonary critical care physician at the Mayo Clinic in Rochester, Minnesota.
|
|
|
|
56:09.770 --> 56:12.532
|
|
It's the old days I don't want to rush back to.
|
|
|
|
56:13.093 --> 56:19.798
|
|
He showed us mock-ups of what those old detailed paper records used to look like and how they used to travel.
|
|
|
|
56:19.938 --> 56:21.219
|
|
So the records would go in here.
|
|
|
|
56:21.719 --> 56:27.163
|
|
Records from Mayo Clinic patients used to fly around from room to room in those pneumatic tubes.
|
|
|
|
56:27.584 --> 56:29.825
|
|
There must have been miles and miles of tubing.
|
|
|
|
56:29.845 --> 56:32.067
|
|
Yeah, it's like the arteries inside of the building.
|
|
|
|
56:32.782 --> 56:35.383
|
|
It was state of the art in record keeping at the time.
|
|
|
|
56:35.843 --> 56:48.307
|
|
So it does make a certain amount of sense that the Mayo Clinic would end up today being the single biggest client of Epic, spending over a billion dollars over the next several years to integrate its systems.
|
|
|
|
56:49.208 --> 56:50.888
|
|
How much is this going to change things?
|
|
|
|
56:51.068 --> 56:57.210
|
|
We've been keeping track of the diagnoses of the Mayo Clinic patients since before we had electronic records.
|
|
|
|
56:57.711 --> 57:01.472
|
|
But the ability to have it all coming from one source
|
|
|
|
57:02.190 --> 57:03.551
|
|
makes it a lot easier.
|
|
|
|
57:04.271 --> 57:08.033
|
|
Without good access to data, you really are flying blind.
|
|
|
|
57:08.393 --> 57:09.834
|
|
We haven't met the head and the foot of the bed.
|
|
|
|
57:10.274 --> 57:14.416
|
|
The Mayo Clinic's trauma rooms now have more screens than a Best Buy.
|
|
|
|
57:14.936 --> 57:25.321
|
|
Epic's Mallory Hines-Roth worked with Dr. Heather Heaton to customize a system that allowed critical patient information to be displayed on those screens all at once.
|
|
|
|
57:26.141 --> 57:34.683
|
|
So the data is getting monitored from the patient who's on the table, going into their record in the computer, and then being presented up real time.
|
|
|
|
57:35.204 --> 57:40.865
|
|
With all that data, though, at your doctor's fingertips, their fingertips can be pretty busy.
|
|
|
|
57:41.385 --> 57:42.466
|
|
Too busy, say some.
|
|
|
|
57:43.086 --> 57:48.387
|
|
Because of patient privacy, we can't really show you all the data that doctors have to enter on that epic system.
|
|
|
|
57:48.467 --> 57:50.868
|
|
But some tell us it's too much.
|
|
|
|
57:51.528 --> 57:58.370
|
|
And if you've been to your doctor lately, you know it can feel like they spend as much time entering data on a keyboard as they do on you.
|
|
|
|
57:59.230 --> 58:02.452
|
|
But Dr. Peter says, get used to it.
|
|
|
|
58:03.132 --> 58:09.034
|
|
It's like blaming the word processor for a homework assignment for a student who has to write a term paper.
|
|
|
|
58:09.594 --> 58:12.115
|
|
It is where the documentation has to go.
|
|
|
|
58:12.195 --> 58:13.515
|
|
The technology isn't the enemy.
|
|
|
|
58:13.595 --> 58:15.436
|
|
It's just the reality.
|
|
|
|
58:15.456 --> 58:16.196
|
|
That's correct.
|
|
|
|
58:17.253 --> 58:22.735
|
|
Epic is, however, working on a solution that would free up your doctor altogether.
|
|
|
|
58:23.195 --> 58:31.518
|
|
One of the things that might be coming down the road, I understand, is instead of having to key in everything, you might sort of have the Alexa of medical records?
|
|
|
|
58:31.578 --> 58:32.218
|
|
That is correct.
|
|
|
|
58:32.518 --> 58:33.419
|
|
And how would that work?
|
|
|
|
58:33.599 --> 58:41.541
|
|
The doctor would just say, hey, Epic, show me Lee's history, and that would come up.
|
|
|
|
58:42.202 --> 58:46.183
|
|
And in the end, the doctor would say, hey, Epic, write my note, and the whole note would be written.
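Taken at face value, what is described there is a plain command-dispatch pattern: match an intent in the spoken phrase, then route it to a records lookup or a note generator. A minimal Python sketch; the wake word, intents, and patient record below are invented for illustration and are not Epic's actual interface:

```python
# Minimal sketch of a voice-command dispatcher of the kind described above.
# Wake word, intents, and the patient record are invented; real systems add speech
# recognition, authentication, and audit logging in front of something like this.

patient_history = {"Lee": ["2019: knee arthroscopy", "2023: hypertension diagnosis"]}

def handle(command: str) -> str:
    text = command.lower()
    if not text.startswith("hey, epic"):
        return ""                                   # ignore anything without the wake word
    words = command.split()
    if "history" in text and "show" in text:
        # Crude parse: the word before "history" names the patient ("Lee's" -> "Lee").
        name = words[words.index("history") - 1].rstrip("'s")
        return "\n".join(patient_history.get(name, ["no record found"]))
    if "write my note" in text:
        return "Visit note drafted from today's structured entries (placeholder)."
    return "Sorry, I did not understand that."

print(handle("hey, Epic, show me Lee's history"))
print(handle("hey, Epic, write my note"))
```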
|
|
|
|
58:46.946 --> 58:53.829
|
|
I know you don't store the data, but I think... There is no way you're going to convince me that this person is some kind of super genius.
|
|
|
|
58:53.889 --> 58:55.570
|
|
There is no freaking way.
|
|
|
|
58:55.790 --> 58:56.290
|
|
No way.
|
|
|
|
58:56.430 --> 58:58.051
|
|
I'm not saying that she's an idiot.
|
|
|
|
58:58.591 --> 59:00.072
|
|
I'm saying she's acting.
|
|
|
|
59:00.152 --> 59:02.873
|
|
She's, she is, this is ridiculous.
|
|
|
|
59:03.213 --> 59:04.014
|
|
Oh my goodness.
|
|
|
|
59:04.034 --> 59:05.514
|
|
Some people think you probably do.
|
|
|
|
59:05.914 --> 59:10.456
|
|
So how do you handle the privacy concerns if all of this information is out there floating around?
|
|
|
|
59:10.837 --> 59:11.857
|
|
Yeah.
|
|
|
|
59:11.877 --> 59:13.398
|
|
Um, that is such a good question.
|
|
|
|
59:13.698 --> 59:15.819
|
|
I think it always makes sense to be a little bit worried.
|
|
|
|
59:17.028 --> 59:29.335
|
|
But I was at a talk once where the man giving the talk held up paper medical records and said it was so easy to put a white coat on, walk into the chart room and pull out any records he wanted and walk out again.
|
|
|
|
59:29.855 --> 59:32.356
|
|
Computerization is probably a safer way to do it.
|
|
|
|
59:33.037 --> 59:36.158
|
|
Oh, nice.
|
|
|
|
59:36.899 --> 59:38.540
|
|
In a place with a stairway to heaven.
|
|
|
|
59:38.600 --> 59:46.264
|
|
Computers are the safe way to do it, but we're going to vote by paper ballot for the, on a, oh my God, whatever.
|
|
|
|
59:47.066 --> 59:48.687
|
|
and an elevator to hell.
|
|
|
|
59:49.308 --> 59:51.990
|
|
There's no shortage of imagination here at Epic.
|
|
|
|
59:52.390 --> 59:55.653
|
|
Medical records don't really sound all that fanciful.
|
|
|
|
59:56.314 --> 01:00:02.218
|
|
But here, and in Judy Faulkner's mathematical mind, anything seems possible.
|
|
|
|
01:00:02.759 --> 01:00:04.580
|
|
In her mathematical mind?
|
|
|
|
01:00:04.620 --> 01:00:07.042
|
|
Does this look like a place of mathematical mind?
|
|
|
|
01:00:07.062 --> 01:00:08.483
|
|
An elevator to hell?
|
|
|
|
01:00:08.503 --> 01:00:08.623
|
|
Really?
|
|
|
|
01:00:09.104 --> 01:00:09.484
|
|
Really?
|
|
|
|
01:00:13.726 --> 01:00:17.768
|
|
Really, it's technology and software development working together.
|
|
|
|
01:00:19.009 --> 01:00:20.910
|
|
I'm waiting for the decoder ring to come out.
|
|
|
|
01:00:21.030 --> 01:00:28.235
|
|
I don't care how you get there.
|
|
|
|
01:00:28.255 --> 01:00:30.936
|
|
I don't care what you do to get there.
|
|
|
|
01:00:30.956 --> 01:00:35.459
|
|
The goal is to win.
|
|
|
|
01:00:35.839 --> 01:00:38.121
|
|
You should turn up your volume a little bit.
|
|
|
|
01:00:40.602 --> 01:00:41.643
|
|
Good music incoming.
|
|
|
|
01:05:52.737 --> 01:05:55.559
|
|
Yes, we are, and now we're in, and this is the intro.
|
|
|
|
01:05:56.300 --> 01:05:57.701
|
|
That's how we're doing it nowadays.
|
|
|
|
01:05:57.721 --> 01:06:03.685
|
|
We're getting started on time, as time as we can, and then we're throwing the intro in in the middle.
|
|
|
|
01:06:03.745 --> 01:06:06.507
|
|
We'll do another video, and then we'll head to the gym.
|
|
|
|
01:06:06.988 --> 01:06:08.188
|
|
Thank you very much for joining me.
|
|
|
|
01:06:08.229 --> 01:06:10.070
|
|
The voice that you hear is Jonathan J. Couey.
|
|
|
|
01:06:10.130 --> 01:06:13.452
|
|
I'm coming to you live out of Pittsburgh, Pennsylvania, in the back of my garage.
|
|
|
|
01:06:13.532 --> 01:06:15.054
|
|
I am the chief biologist here
|
|
|
|
01:06:15.814 --> 01:06:16.535
|
|
at Gigaohm.
|
|
|
|
01:06:16.755 --> 01:06:33.858
|
|
I used to have a whole career in academia, and you can find the evidence of that work on PubMed under my last name, C-O-U-E-Y, and my first two initials, J and J. I presented to Panda three times, but you can't find any of that stuff on their Rumble channel.
|
|
|
|
01:06:34.658 --> 01:06:56.706
|
|
However, you can find some of those ideas diluted and obfuscated and kind of made useless on this substack called Woodhouse 76, which is actually the brainchild of an American traitor, at least to my family, Jessica Hockett, who claims to be from Chicago, but at one point in time had a very fancy restaurant in Virginia.
|
|
|
|
01:06:57.586 --> 01:07:08.493
|
|
Anyway, she has fake arguments with Clare Craig and Pierre Kory and Mike Yeadon and Jay Bhattacharya on this Substack using some of the ideas that she can steal from me.
|
|
|
|
01:07:09.113 --> 01:07:10.834
|
|
I did help Robert F. Kennedy Jr.
|
|
|
|
01:07:10.854 --> 01:07:11.454
|
|
write his book.
|
|
|
|
01:07:11.995 --> 01:07:13.596
|
|
I am friends with Jay Bhattacharya.
|
|
|
|
01:07:13.716 --> 01:07:22.521
|
|
I was a staff scientist at Children's Health Defense, but that doesn't all matter now because we're here at Gigaohm Biological and we're trying to do the work that hasn't been done.
|
|
|
|
01:07:23.402 --> 01:07:24.603
|
|
the work that needs doing.
|
|
|
|
01:07:25.665 --> 01:07:27.567
|
|
And this is all scripted bullshit.
|
|
|
|
01:07:27.887 --> 01:07:33.214
|
|
I don't really know what to tell you other than this is all scripted BS.
|
|
|
|
01:07:33.254 --> 01:07:34.736
|
|
Let me get myself down here.
|
|
|
|
01:07:37.977 --> 01:07:56.592
|
|
One of the things that I wanted to point out to you is that again, you can do a timeline of CBS's 60 Minutes and basically see that they have been priming us much in the same way that they did with TV shows and with books and with other media forms.
|
|
|
|
01:07:56.672 --> 01:07:59.154
|
|
They have been priming us for this for a long time.
|
|
|
|
01:07:59.594 --> 01:08:04.418
|
|
They have been predicting this outcome for a long time and it hasn't happened other than
|
|
|
|
01:08:05.038 --> 01:08:11.306
|
|
The, the act, you know, whatever it is coming on stage as predicted in the script.
|
|
|
|
01:08:11.747 --> 01:08:19.637
|
|
And so when I say scripted bullshit, I am especially referring to the kinds of suit.
|
|
|
|
01:08:22.133 --> 01:08:29.136
|
|
It's just ridiculous predictions about technologies, about end-of-the-world scenarios.
|
|
|
|
01:08:29.696 --> 01:08:43.622
|
|
And only on 60 Minutes do we regularly talk about end-of-the-world scenarios that vary as widely as from nuclear weapons to biological weapons and gain-of-function viruses, all the way to AI.
|
|
|
|
01:08:43.682 --> 01:08:46.244
|
|
And very recently, of course, we talked about AI.
|
|
|
|
01:08:48.186 --> 01:08:56.552
|
|
When Demis Hassabis won the Nobel Prize last year, he celebrated by playing poker with a world champion of chess.
|
|
|
|
01:08:57.353 --> 01:09:03.197
|
|
Hassabis loves a game, which is how he became a pioneer of artificial intelligence.
|
|
|
|
01:09:03.918 --> 01:09:11.723
|
|
The 48-year-old British scientist is co-founder and CEO of Google's AI powerhouse called DeepMind.
|
|
|
|
01:09:12.344 --> 01:09:16.127
|
|
We met two years ago when chatbots announced a new age.
|
|
|
|
01:09:16.914 --> 01:09:28.820
|
|
Now Hassabis and others are chasing what's called artificial general intelligence, a silicon intellect as versatile as a human, but with superhuman speed and knowledge.
|
|
|
|
01:09:29.801 --> 01:09:40.286
|
|
After his Nobel and a knighthood from King Charles, we hurried back to London to see what's next from a genius who may hold the cards of our future.
|
|
|
|
01:09:41.152 --> 01:09:41.732
|
|
Wow.
|
|
|
|
01:09:42.713 --> 01:09:44.453
|
|
Wow, that's impressive.
|
|
|
|
01:09:44.573 --> 01:09:46.334
|
|
I got some cards for your future.
|
|
|
|
01:09:47.495 --> 01:09:52.637
|
|
What's always guided me and the passion I've always had is understanding the world around us.
|
|
|
|
01:09:53.137 --> 01:09:57.339
|
|
I've always been, since I was a kid, fascinated by the biggest questions.
|
|
|
|
01:09:57.739 --> 01:10:03.321
|
|
You know, the meaning of life, the nature of consciousness, the nature of reality itself.
|
|
|
|
01:10:03.681 --> 01:10:19.990
|
|
And now remember the answer that Alex Karp gave on 60 Minutes a couple 10 years ago when he said that he went to Germany to find out how people know things and how people think they participate in their own governance.
|
|
|
|
01:10:20.791 --> 01:10:24.373
|
|
Those words are almost exactly what he said.
|
|
|
|
01:10:24.893 --> 01:10:26.674
|
|
How people know things.
|
|
|
|
01:10:27.275 --> 01:10:29.376
|
|
He's interested in those big questions too.
|
|
|
|
01:10:31.846 --> 01:10:32.746
|
|
consciousness.
|
|
|
|
01:10:33.567 --> 01:10:42.111
|
|
I've loved reading about all the great scientists who worked on these problems and the philosophers, and I wanted to see if we could advance human knowledge.
|
|
|
|
01:10:42.571 --> 01:10:48.734
|
|
And for me, my expression of doing that was to build what I think is the ultimate tool for advancing human knowledge, which is AI.
|
|
|
|
01:10:49.875 --> 01:10:54.337
|
|
So she, he built the ultimate tool, not that dissimilar to that lady.
|
|
|
|
01:10:54.717 --> 01:11:01.920
|
|
She built this ultimate database and ultimate medical app that will be the ultimate AI assistant in the future.
|
|
|
|
01:11:01.940 --> 01:11:05.002
|
|
You'll just be able to say, hey Epic, give me everything I need.
|
|
|
|
01:11:06.882 --> 01:11:07.843
|
|
Oh my goodness.
|
|
|
|
01:11:08.403 --> 01:11:17.307
|
|
We sat down in this room two years ago, and I wonder if AI is moving faster today than you imagined.
|
|
|
|
01:11:18.106 --> 01:11:19.387
|
|
It's moving incredibly fast.
|
|
|
|
01:11:20.108 --> 01:11:23.491
|
|
I think we are on some kind of exponential curve of improvement.
|
|
|
|
01:11:23.951 --> 01:11:30.317
|
|
Of course, the success of the field in the last few years has attracted even more attention, more resources, more talent.
|
|
|
|
01:11:30.697 --> 01:11:32.399
|
|
So that's adding to it.
|
|
|
|
01:11:32.419 --> 01:11:36.222
|
|
I still have a hard time understanding where the advance is.
|
|
|
|
01:11:36.362 --> 01:11:38.965
|
|
Is the advance in the combination of C++,
|
|
|
|
01:11:43.089 --> 01:11:53.855
|
|
commands that they have chained together in such a unique combination of code that it's about to learn for itself and construct weapons to destroy us.
|
|
|
|
01:11:55.836 --> 01:11:57.838
|
|
This is the part that I don't really understand.
|
|
|
|
01:11:57.898 --> 01:12:08.249
|
|
So if we just, you know, let it write its own programs, then it might do gain of function programming and come up with a combination of C++ commands that we never dreamed possible.
|
|
|
|
01:12:08.689 --> 01:12:14.655
|
|
And if it puts them in the right combination, it might be able to copy itself to every computer on earth.
|
|
|
|
01:12:15.736 --> 01:12:17.118
|
|
And we wouldn't be able to stop it.
|
|
|
|
01:12:19.909 --> 01:12:27.651
|
|
I want you to think first, I want you to, I'm trying to help people get into a mindset that starts to crack this puzzle a little bit.
|
|
|
|
01:12:27.811 --> 01:12:32.852
|
|
Sorry, let me just, it really bothers me a lot what's happening here.
|
|
|
|
01:12:33.252 --> 01:12:34.432
|
|
Let me just see if this works.
|
|
|
|
01:12:34.472 --> 01:12:35.052
|
|
Can I do that?
|
|
|
|
01:12:35.092 --> 01:12:35.872
|
|
Yeah, I can do that.
|
|
|
|
01:12:36.572 --> 01:12:41.353
|
|
Make sure there's nothing there that shouldn't be seen.
|
|
|
|
01:12:42.494 --> 01:12:44.174
|
|
Surprise, surprise, I have lighter fluid.
|
|
|
|
01:12:44.194 --> 01:12:47.075
|
|
Yeah, butane.
|
|
|
|
01:12:47.435 --> 01:12:48.495
|
|
Let me just show you this thing.
|
|
|
|
01:12:50.424 --> 01:12:55.747
|
|
I just want you to think about, he's saying, right?
|
|
|
|
01:12:56.107 --> 01:13:01.311
|
|
He's saying that AI, that AI is bouncing super fast, right?
|
|
|
|
01:13:01.351 --> 01:13:02.631
|
|
It's going super fast.
|
|
|
|
01:13:03.432 --> 01:13:07.454
|
|
And I still have a hard time understanding what AI is, you know?
|
|
|
|
01:13:07.534 --> 01:13:17.280
|
|
I mean, should I think of AI as something that in theory, if this had enough memory space on it, I could store AI on a stick?
|
|
|
|
01:13:18.401 --> 01:13:19.902
|
|
Or do I need like a tower
|
|
|
|
01:13:21.081 --> 01:13:25.763
|
|
Or should I think of it like a room full of humming machines that are all linked together?
|
|
|
|
01:13:26.403 --> 01:13:29.484
|
|
But ultimately, I assume there's some code in there, right?
|
|
|
|
01:13:29.524 --> 01:13:30.884
|
|
There's some programming.
|
|
|
|
01:13:31.765 --> 01:13:37.387
|
|
And so we're talking about programming becoming better and better and better.
|
|
|
|
01:13:38.087 --> 01:13:40.488
|
|
And then in the end, what are we going to have?
|
|
|
|
01:13:40.548 --> 01:13:44.149
|
|
Are we going to have this thing that he can make more of?
|
|
|
|
01:13:45.449 --> 01:13:49.551
|
|
Are we going to have this thing that we can make molds of and then produce them?
|
|
|
|
01:13:50.814 --> 01:13:54.556
|
|
Or are we gonna have this thing that will multiply itself and take over?
|
|
|
|
01:13:54.616 --> 01:13:56.458
|
|
What is the story we're telling here?
|
|
|
|
01:13:56.538 --> 01:13:57.779
|
|
What is the technology?
|
|
|
|
01:13:57.799 --> 01:13:58.879
|
|
Can you show it to me?
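One partial answer to "can you show it to me": in practice, what gets called an AI model is a stored file of learned numbers plus an ordinary program that pushes inputs through them. A minimal Python sketch, with made-up weights, of what "storing it on a stick" would literally amount to; a real language model is the same idea with billions of parameters:

```python
# Minimal sketch: a "model" is just stored parameters plus code that applies them.
# The numbers below are made up; a large language model is the same idea at vastly larger scale.
import json

weights = {"w": [0.8, -0.3, 1.5], "b": 0.1}           # the learned part: plain numbers

with open("model_on_a_stick.json", "w") as f:          # this file IS the model
    json.dump(weights, f)

def predict(inputs, params):
    """The model at inference time: a weighted sum of inputs, nothing mystical."""
    return sum(w * x for w, x in zip(params["w"], inputs)) + params["b"]

with open("model_on_a_stick.json") as f:               # load it back off the stick
    params = json.load(f)

print(predict([1.0, 2.0, 3.0], params))                # 0.8 - 0.6 + 4.5 + 0.1 = 4.8
```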
|
|
|
|
01:13:59.480 --> 01:14:03.022
|
|
Let me, on the other hand, show you something, okay?
|
|
|
|
01:14:03.062 --> 01:14:04.363
|
|
Let me just show you something.
|
|
|
|
01:14:04.383 --> 01:14:09.026
|
|
I'm gonna see if I can do this with my GoPro camera a little higher and up here.
|
|
|
|
01:14:09.566 --> 01:14:13.168
|
|
And I'll reveal some of the things on my desk because I have to do this like this.
|
|
|
|
01:14:14.069 --> 01:14:14.989
|
|
Wonder if I can do this.
|
|
|
|
01:14:21.087 --> 01:14:25.310
|
|
So I'll move my Illuminati cards out of the way.
|
|
|
|
01:14:26.351 --> 01:14:27.231
|
|
Those are hilarious.
|
|
|
|
01:14:28.052 --> 01:14:30.113
|
|
And I want to show you this little thing here.
|
|
|
|
01:14:31.834 --> 01:14:37.958
|
|
It's a device made of metal that somebody actually gave me as a PhD graduation gift.
|
|
|
|
01:14:38.258 --> 01:14:48.125
|
|
Actually, somebody flew on an airplane with this in their carry-on luggage as part of my graduation gift, which is pretty kick-ass.
|
|
|
|
01:14:48.165 --> 01:14:49.306
|
|
An international flight.
|
|
|
|
01:14:49.966 --> 01:14:51.387
|
|
with this thing in there.
|
|
|
|
01:14:51.407 --> 01:14:55.089
|
|
And when I say it's metal, like it is metal.
|
|
|
|
01:14:55.670 --> 01:15:00.033
|
|
It's plates of metal with some turning stuff inside.
|
|
|
|
01:15:00.093 --> 01:15:01.193
|
|
It's freaking heavy.
|
|
|
|
01:15:01.914 --> 01:15:03.175
|
|
And so what is this thing?
|
|
|
|
01:15:03.235 --> 01:15:04.515
|
|
It's a micro manipulator.
|
|
|
|
01:15:04.535 --> 01:15:08.298
|
|
It's one of the first micro manipulators that was ever made, really.
|
|
|
|
01:15:09.491 --> 01:15:19.215
|
|
And it is made by a very famous camera and lenses and microscope maker, Ernst Leitz Wetzlar in Germany.
|
|
|
|
01:15:19.275 --> 01:15:21.376
|
|
This is a fat daddy of fat daddies.
|
|
|
|
01:15:21.436 --> 01:15:22.477
|
|
Now what is this thing?
|
|
|
|
01:15:23.457 --> 01:15:27.119
|
|
It won't work unless I have it on the flat table.
|
|
|
|
01:15:28.119 --> 01:15:32.381
|
|
And then it still might be a little bit angry with me.
|
|
|
|
01:15:33.221 --> 01:15:36.803
|
|
I may have, you know, broken something or something may be disconnected or
|
|
|
|
01:15:37.743 --> 01:15:41.506
|
|
I would have to reassemble it because I probably bothered it a little bit.
|
|
|
|
01:15:41.546 --> 01:15:42.747
|
|
So what is this thing though?
|
|
|
|
01:15:43.287 --> 01:15:48.391
|
|
What you have on the top is a mechanism by which this thing can move around.
|
|
|
|
01:15:49.372 --> 01:15:50.933
|
|
And this thing can move around like this.
|
|
|
|
01:15:50.973 --> 01:15:53.255
|
|
So you can position the end of that thing.
|
|
|
|
01:15:53.295 --> 01:15:54.276
|
|
Look, it can just move.
|
|
|
|
01:15:55.034 --> 01:15:55.234
|
|
Right?
|
|
|
|
01:15:55.294 --> 01:16:02.281
|
|
And so this was to put something under a microscope or to put something and move it at microscopic scale.
|
|
|
|
01:16:02.862 --> 01:16:05.584
|
|
But you can see that this knob right here is not turning.
|
|
|
|
01:16:05.905 --> 01:16:08.367
|
|
This knob right here doesn't turn in a microscopic scale.
|
|
|
|
01:16:08.387 --> 01:16:09.328
|
|
It moves pretty fast.
|
|
|
|
01:16:09.368 --> 01:16:09.709
|
|
See that?
|
|
|
|
01:16:10.329 --> 01:16:15.134
|
|
But what the rest of the thing is designed to do is to move at a microscopic scale.
|
|
|
|
01:16:15.194 --> 01:16:17.696
|
|
Now, unfortunately, it looks like
|
|
|
|
01:16:19.068 --> 01:16:20.229
|
|
I may have broke it.
|
|
|
|
01:16:20.449 --> 01:16:24.732
|
|
Because this thing should be staying back here against the wheel.
|
|
|
|
01:16:26.354 --> 01:16:27.955
|
|
So that it can move this way.
|
|
|
|
01:16:28.696 --> 01:16:30.377
|
|
But somehow it's not staying there.
|
|
|
|
01:16:30.857 --> 01:16:31.658
|
|
So I'm going to show you.
|
|
|
|
01:16:31.698 --> 01:16:32.899
|
|
I can hold my finger on it.
|
|
|
|
01:16:33.659 --> 01:16:36.261
|
|
Somehow this spring is missing its counter spring.
|
|
|
|
01:16:36.842 --> 01:16:38.703
|
|
This spring should be pushed back here.
|
|
|
|
01:16:39.564 --> 01:16:40.945
|
|
Or maybe that's not sliding.
|
|
|
|
01:16:41.005 --> 01:16:41.826
|
|
Maybe that's what it is.
|
|
|
|
01:16:42.306 --> 01:16:43.327
|
|
No, I think it's this one.
|
|
|
|
01:16:44.727 --> 01:16:45.510
|
|
Oh, no, it does move.
|
|
|
|
01:16:45.590 --> 01:16:46.212
|
|
Oh, never mind.
|
|
|
|
01:16:46.272 --> 01:16:46.954
|
|
It still works.
|
|
|
|
01:16:47.035 --> 01:16:47.376
|
|
Great.
|
|
|
|
01:16:47.596 --> 01:16:48.198
|
|
Oh, scared me.
|
|
|
|
01:16:48.780 --> 01:16:50.325
|
|
So this stage right here.
|
|
|
|
01:16:51.857 --> 01:16:53.899
|
|
And I'll get to my point, I promise.
|
|
|
|
01:16:53.939 --> 01:16:56.400
|
|
Just like I got to my point with my binoculars yesterday.
|
|
|
|
01:16:56.961 --> 01:16:59.022
|
|
So, this thing right here is a stage.
|
|
|
|
01:16:59.883 --> 01:17:02.485
|
|
And a stage in the sense of it's a square piece of metal.
|
|
|
|
01:17:02.885 --> 01:17:05.207
|
|
And it's got this macro manipulator on it here.
|
|
|
|
01:17:05.227 --> 01:17:08.109
|
|
I turn this knob and this holder moves forward.
|
|
|
|
01:17:08.149 --> 01:17:09.250
|
|
It's like a needle holder.
|
|
|
|
01:17:09.870 --> 01:17:12.012
|
|
And so this stage also moves.
|
|
|
|
01:17:12.072 --> 01:17:14.774
|
|
But how this stage moves is actually ingenious.
|
|
|
|
01:17:15.194 --> 01:17:16.775
|
|
I'm going to raise the camera a little bit.
|
|
|
|
01:17:17.376 --> 01:17:19.998
|
|
So that you can see, and I apologize for the dust.
|
|
|
|
01:17:21.677 --> 01:17:30.442
|
|
This is a viewer-weaved little towel here, in case the viewer that weaved this towel is watching.
|
|
|
|
01:17:30.502 --> 01:17:32.663
|
|
You can see that I have it at my desk all the time.
|
|
|
|
01:17:32.703 --> 01:17:33.804
|
|
It's a very beautiful towel.
|
|
|
|
01:17:33.844 --> 01:17:34.484
|
|
Thank you very much.
|
|
|
|
01:17:35.627 --> 01:17:44.855
|
|
Um, when I get particularly nervous, sometimes I use it as a, as a, as a, as a little bopper there, um, get the dust off.
|
|
|
|
01:17:44.895 --> 01:17:46.257
|
|
So this thing is moving.
|
|
|
|
01:17:46.317 --> 01:17:51.021
|
|
Now look at this mechanism and tell me that this is not German genius.
|
|
|
|
01:17:51.201 --> 01:17:51.481
|
|
Okay.
|
|
|
|
01:17:51.942 --> 01:17:56.746
|
|
So this, if I turn it to the side, this thing is a joystick.
|
|
|
|
01:17:57.987 --> 01:17:59.908
|
|
These are the up and down.
|
|
|
|
01:18:00.028 --> 01:18:02.449
|
|
So this is, this moves this whole thing.
|
|
|
|
01:18:02.629 --> 01:18:04.690
|
|
See, this moves this thing up and down.
|
|
|
|
01:18:05.210 --> 01:18:06.791
|
|
This is sliding up and down here.
|
|
|
|
01:18:07.211 --> 01:18:10.253
|
|
And then this is fine adjustment on the x-axis.
|
|
|
|
01:18:10.313 --> 01:18:18.997
|
|
So, but the y-axis is what's important because if you have the microscope over here and you're moving on my hand, watch how this thing moves.
|
|
|
|
01:18:19.097 --> 01:18:19.217
|
|
See?
|
|
|
|
01:18:21.212 --> 01:18:30.334
|
|
And so in correspondence to my joystick movement, this thing moves in all those two, oops, those two axes.
|
|
|
|
01:18:30.394 --> 01:18:30.734
|
|
See that?
|
|
|
|
01:18:31.614 --> 01:18:41.336
|
|
Now what's really sick about this design is that this articulation is determined by the curve of this thing.
|
|
|
|
01:18:42.037 --> 01:18:48.058
|
|
And so I can change the resolution of the joystick by screwing this thing down
|
|
|
|
01:18:51.235 --> 01:19:13.335
|
|
and then it changes the articulation constant, sort of, so to speak, of the joystick, so that it moves less. See? You can hardly see it moving. See that? You can hardly see it move. All the way to, if I screw it all the way up here. This is just the coolest thing ever. I just think it's so genius.
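What is being demonstrated is a lever reduction: the stage moves by the hand's movement scaled by the ratio of the arm lengths on either side of the fulcrum, and sliding the fulcrum along the curved piece changes that ratio continuously. A minimal sketch of the geometry in Python; the lengths are illustrative numbers, not measurements of this Leitz unit:

```python
# Minimal sketch of the lever reduction in a mechanical micromanipulator joystick.
# stage_move = hand_move * (output_arm / input_arm); all lengths below are illustrative only.

def stage_displacement(hand_move_mm: float, input_arm_mm: float, output_arm_mm: float) -> float:
    """Displacement of the stage for a given hand movement at the joystick grip."""
    return hand_move_mm * (output_arm_mm / input_arm_mm)

hand_move = 10.0  # swing the joystick 10 mm

# Fulcrum screwed far from the hand: long input arm, short output arm -> fine control.
print(stage_displacement(hand_move, input_arm_mm=100.0, output_arm_mm=1.0))   # 0.1 mm

# Fulcrum screwed close to the hand: short input arm -> coarse, fast movement.
print(stage_displacement(hand_move, input_arm_mm=20.0, output_arm_mm=5.0))    # 2.5 mm
```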
|
|
|
|
01:19:15.566 --> 01:19:19.492
|
|
And so if you put it right here, then it moves very fast.
|
|
|
|
01:19:19.612 --> 01:19:20.834
|
|
See, and you get a lot of coverage.
|
|
|
|
01:19:20.914 --> 01:19:22.176
|
|
So this is cool, right?
|
|
|
|
01:19:22.196 --> 01:19:23.658
|
|
This is a great idea.
|
|
|
|
01:19:23.698 --> 01:19:25.721
|
|
This is the technological advance.
|
|
|
|
01:19:26.342 --> 01:19:28.885
|
|
Somebody could look at this and get ideas of their own.
|
|
|
|
01:19:28.986 --> 01:19:30.287
|
|
Somebody could look at this and think,
|
|
|
|
01:19:30.888 --> 01:19:38.790
|
|
Wow, I could use that in an automobile or in an airplane or in my radio control knobs or whatever.
|
|
|
|
01:19:39.090 --> 01:19:40.250
|
|
It's an idea.
|
|
|
|
01:19:40.310 --> 01:19:53.173
|
|
It's a concrete idea that manifests in a man-made thingamabob and I just think it's cool and it's from one of my, you know, favorite places on the earth as far as engineering goes.
|
|
|
|
01:19:53.673 --> 01:19:56.093
|
|
Okay, so there's my big pitch on the
|
|
|
|
01:20:00.174 --> 01:20:01.274
|
|
How do I have to do that like this?
|
|
|
|
01:20:01.675 --> 01:20:04.396
|
|
That's my big pitch on the... Let me just set this back.
|
|
|
|
01:20:04.416 --> 01:20:07.397
|
|
Oh, that's going to collapse all the way to the ground now because I've got to tighten those up.
|
|
|
|
01:20:07.937 --> 01:20:08.917
|
|
Let me just put this back.
|
|
|
|
01:20:16.140 --> 01:20:28.824
|
|
So the point is, is that that is a piece of technology, a piece of created idea, manifest, whatever you want to call it, that can be
|
|
|
|
01:20:30.660 --> 01:20:32.282
|
|
The value of it is still there.
|
|
|
|
01:20:32.382 --> 01:20:38.312
|
|
I can still show that to my kid and that has a value for his understanding of how things could or can work.
|
|
|
|
01:20:39.321 --> 01:20:41.503
|
|
and how things can be combined.
|
|
|
|
01:20:41.523 --> 01:21:04.140
|
|
If you've ever been to a museum of engineering history or something like that and seen all the different ways that gears and arms and wheels can be used to change the direction of force and multiply forces, it's a very, very cool application of simple machines and this kind of thing.
|
|
|
|
01:21:04.620 --> 01:21:05.721
|
|
But what do we get
|
|
|
|
01:21:07.353 --> 01:21:09.614
|
|
What do we get, what did I just do there?
|
|
|
|
01:21:09.935 --> 01:21:21.901
|
|
Yeah, what do we get when we have this AI and we have this guy telling us that AI is progressing faster than ever and that he invented it?
|
|
|
|
01:21:22.342 --> 01:21:26.044
|
|
He decided, after all this, that what he was going to do was invent it.
|
|
|
|
01:21:26.837 --> 01:21:28.838
|
|
But does, did he invent something?
|
|
|
|
01:21:28.918 --> 01:21:30.279
|
|
Can we hold onto it?
|
|
|
|
01:21:30.399 --> 01:21:34.381
|
|
Can we understand what it is or is it just a code?
|
|
|
|
01:21:35.061 --> 01:21:42.685
|
|
And someday that code is going to get really good or, or enough people will agree that that code is really good.
|
|
|
|
01:21:42.725 --> 01:21:44.066
|
|
What exactly happens?
|
|
|
|
01:21:44.166 --> 01:21:44.886
|
|
I don't get it.
|
|
|
|
01:21:45.126 --> 01:21:50.249
|
|
I don't, I don't understand what the argument is that they're making with regard to this.
|
|
|
|
01:21:50.589 --> 01:21:51.389
|
|
I don't see it.
|
|
|
|
01:21:51.949 --> 01:21:53.190
|
|
Where is this loose here?
|
|
|
|
01:21:54.091 --> 01:21:54.771
|
|
Here it's loose.
|
|
|
|
01:21:57.085 --> 01:21:57.805
|
|
Where is it loose?
|
|
|
|
01:21:57.985 --> 01:21:59.606
|
|
Oh no, it's really gonna drive me nuts.
|
|
|
|
01:21:59.746 --> 01:22:00.866
|
|
I gotta get this tight again.
|
|
|
|
01:22:01.906 --> 01:22:03.126
|
|
Oh, it's this one's loose.
|
|
|
|
01:22:03.706 --> 01:22:04.547
|
|
There we go.
|
|
|
|
01:22:04.787 --> 01:22:05.947
|
|
Yes, that's it.
|
|
|
|
01:22:06.367 --> 01:22:06.967
|
|
That was loose.
|
|
|
|
01:22:07.847 --> 01:22:15.709
|
|
Now I need to loosen this one that I just over tightened so that it will let me rotate this down.
|
|
|
|
01:22:16.089 --> 01:22:16.709
|
|
Come on girl.
|
|
|
|
01:22:16.869 --> 01:22:17.509
|
|
There we go.
|
|
|
|
01:22:18.930 --> 01:22:19.810
|
|
Sorry about that guys.
|
|
|
|
01:22:19.830 --> 01:22:20.770
|
|
This is annoying.
|
|
|
|
01:22:22.250 --> 01:22:23.391
|
|
I wish I had a stage hand.
|
|
|
|
01:22:24.152 --> 01:22:26.594
|
|
No I don't, because most of the time I would have nothing to do.
|
|
|
|
01:22:28.636 --> 01:22:29.377
|
|
That should be good.
|
|
|
|
01:22:29.397 --> 01:22:33.401
|
|
Let me just switch over here and make sure it's not on camera.
|
|
|
|
01:22:33.421 --> 01:22:34.142
|
|
I'm just gonna cut.
|
|
|
|
01:22:35.383 --> 01:22:35.923
|
|
Yeah, okay.
|
|
|
|
01:22:36.964 --> 01:22:37.525
|
|
That is good.
|
|
|
|
01:22:38.566 --> 01:22:39.607
|
|
And now it should be here.
|
|
|
|
01:22:39.807 --> 01:22:40.047
|
|
Yes.
|
|
|
|
01:22:40.428 --> 01:22:41.589
|
|
Excellent, excellent.
|
|
|
|
01:22:43.507 --> 01:22:48.530
|
|
So again, this is a guy who's telling us that he's invented something, but he can't tell us what he's invented.
|
|
|
|
01:22:48.570 --> 01:22:50.111
|
|
He can't explain what he invented.
|
|
|
|
01:22:50.211 --> 01:22:55.354
|
|
Is it classified or is it just irreducibly complex?
|
|
|
|
01:22:56.829 --> 01:23:06.212
|
|
Is it just a black box of a machine learning algorithm with enough computing power to make it seem like it's doing something?
|
|
|
|
01:23:06.852 --> 01:23:10.253
|
|
And it's very scary how far they take it on 60 Minutes.
|
|
|
|
01:23:10.473 --> 01:23:12.073
|
|
To this exponential progress.
|
|
|
|
01:23:12.813 --> 01:23:14.894
|
|
Exponential curve, in other words, straight up.
|
|
|
|
01:23:15.514 --> 01:23:18.055
|
|
Yep, straight up and increasing speed of progress.
|
|
|
|
01:23:18.715 --> 01:23:18.955
|
|
Start.
|
|
|
|
01:23:20.390 --> 01:23:25.171
|
|
We saw the pro... So he has invented something that can fix itself, make itself better.
|
|
|
|
01:23:25.231 --> 01:23:30.072
|
|
Imagine saying that you invented a watch that's just going to keep getting better and better and better at what it does.
|
|
|
|
01:23:31.312 --> 01:23:34.873
|
|
Just more and more and more and more useful.
|
|
|
|
01:23:34.893 --> 01:23:37.894
|
|
It doesn't really make sense.
|
|
|
|
01:23:39.214 --> 01:23:40.094
|
|
Unless it's a myth.
|
|
|
|
01:23:41.474 --> 01:23:41.955
|
|
Progress.
|
|
|
|
01:23:42.035 --> 01:23:42.715
|
|
Hello, Scott.
|
|
|
|
01:23:42.775 --> 01:23:43.775
|
|
It's nice to see you again.
|
|
|
|
01:23:44.542 --> 01:23:50.669
|
|
in an artificial companion that can see and hear and chat about anything.
|
|
|
|
01:23:51.309 --> 01:23:53.672
|
|
Early chatbots learned only the internet.
|
|
|
|
01:23:54.193 --> 01:23:58.177
|
|
An app called Astra also takes in the world.
|
|
|
|
01:23:58.337 --> 01:23:59.078
|
|
Do we call her she?
|
|
|
|
01:23:59.098 --> 01:24:01.100
|
|
It's a good question.
|
|
|
|
01:24:01.401 --> 01:24:03.483
|
|
I'm not sure we all know the answer yet.
|
|
|
|
01:24:03.803 --> 01:24:14.026
|
|
Bebo Shu is product manager for Project Astra, an app in a new generation of chatbots that interpret the world with their own eyes.
|
|
|
|
01:24:14.846 --> 01:24:20.928
|
|
We challenged Astra with virtual paintings we chose and showed to Astra for the first time.
|
|
|
|
01:24:20.948 --> 01:24:22.769
|
|
With their own eyes.
|
|
|
|
01:24:22.889 --> 01:24:25.590
|
|
Why are they using those words when they're so dumb?
|
|
|
|
01:24:27.523 --> 01:24:31.786
|
|
It's obvious that they're using photographs taken digitally.
|
|
|
|
01:24:32.067 --> 01:24:34.168
|
|
It's not their own eyes.
|
|
|
|
01:24:34.428 --> 01:24:36.570
|
|
Like, should we call her a she?
|
|
|
|
01:24:37.131 --> 01:24:39.232
|
|
Why would he ask that question?
|
|
|
|
01:24:40.333 --> 01:24:44.196
|
|
Unless 60 Minutes is part of the enchantment.
|
|
|
|
01:24:44.997 --> 01:24:47.659
|
|
Maybe one of the primary means of enchantment.
|
|
|
|
01:24:48.780 --> 01:24:52.403
|
|
Until social media can be optimized for the same role.
|
|
|
|
01:24:52.423 --> 01:24:53.804
|
|
It's time.
|
|
|
|
01:24:54.266 --> 01:24:57.167
|
|
This is July Hay by Thomas Hart Benton from 1942.
|
|
|
|
01:24:57.687 --> 01:24:59.987
|
|
How about this one?
|
|
|
|
01:25:00.468 --> 01:25:02.848
|
|
That is The Virgin of Charity by El Greco.
|
|
|
|
01:25:03.588 --> 01:25:05.969
|
|
That painting is Automat by Edward Hopper.
|
|
|
|
01:25:06.629 --> 01:25:11.590
|
|
The person in the painting, what emotion is she exhibiting?
|
|
|
|
01:25:12.250 --> 01:25:17.772
|
|
The subject in the painting appears pensive and contemplative, her expression suggesting a sense of solitude.
|
|
|
|
01:25:18.532 --> 01:25:22.433
|
|
Can you create a story around this picture
|
|
|
|
01:25:23.083 --> 01:25:25.825
|
|
It's a chilly evening in the city, a Tuesday perhaps.
|
|
|
|
01:25:26.125 --> 01:25:28.827
|
|
The woman, perhaps named Eleanor, sits alone in the diner.
|
|
|
|
01:25:28.927 --> 01:25:33.811
|
|
She is feeling melancholy due to the uncertainty of her future and the weight of unfulfilled dreams.
|
|
|
|
01:25:34.191 --> 01:25:38.174
|
|
The scene freezes in time, only the flow of ideas moving onward.
|
|
|
|
01:25:38.895 --> 01:25:42.818
|
|
That line, only the flow of ideas moving onward?
|
|
|
|
01:25:42.838 --> 01:25:44.879
|
|
I wish I'd written that.
|
|
|
|
01:25:45.620 --> 01:25:49.463
|
|
And did we hear a limit to Astra's patience?
|
|
|
|
01:25:50.083 --> 01:25:52.445
|
|
What was that fictional conversation about?
|
|
|
|
01:25:53.494 --> 01:25:59.316
|
|
Ah, in the story I imagined, Eleanor and her friend talked about their shared dreams and the challenges of pursuing them.
|
|
|
|
01:25:59.916 --> 01:26:03.777
|
|
Something about the way you said, ah, made me wonder if you were getting bored.
|
|
|
|
01:26:04.437 --> 01:26:06.878
|
|
I apologize if my tone came across that way.
|
|
|
|
01:26:07.118 --> 01:26:09.398
|
|
My aim is always to engage thoughtfully.
|
|
|
|
01:26:10.839 --> 01:26:19.441
|
|
One of the tricks that I want to hearken back to the example that I started with yesterday with the HAL 9000 is this, this is a parlor trick.
|
|
|
|
01:26:20.512 --> 01:26:38.349
|
|
To make it sound like she's conversational with the intonation and timbre of her voice is a parlor trick that has been optimized by people using these voice assistants and critiquing them, giving them feedback, even paying money for people to do it.
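The "optimized by people giving feedback, even paying money for it" part can be made concrete: candidate voices and phrasings are scored by human raters, and whatever scores highest is what ships. A minimal Python sketch of that preference loop; the candidate replies and ratings are invented, and real pipelines train a reward model on far more data rather than averaging raw scores, but the signal is the same kind of thing:

```python
# Minimal sketch of tuning by human feedback: collect ratings, keep what people prefer.
# Candidate responses and scores are invented; real systems learn a reward model instead
# of averaging raw ratings, but the optimization target is the same kind of human signal.

candidate_styles = {
    "flat":           "I imagined a conversation.",
    "conversational": "Ah, in the story I imagined, they talked about their shared dreams.",
}

# Ratings gathered from paid or volunteer users (1 = bad, 5 = great).
human_ratings = {
    "flat":           [2, 3, 2, 3],
    "conversational": [5, 4, 5, 4],
}

def preferred(styles, ratings):
    """Pick the response style with the highest mean human rating."""
    return max(styles, key=lambda name: sum(ratings[name]) / len(ratings[name]))

best = preferred(candidate_styles, human_ratings)
print(best, "->", candidate_styles[best])   # the "parlor trick" is selected, not understood
```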
|
|
|
|
01:26:42.495 --> 01:26:53.221
|
|
But understand this is not a sign of like a really cool invention, you know, or, or, oh, wow, look at this, you know, this is cool.
|
|
|
|
01:26:54.842 --> 01:26:55.562
|
|
None of it is.
|
|
|
|
01:26:55.642 --> 01:26:57.743
|
|
It's an illusion of cool.
|
|
|
|
01:26:57.943 --> 01:27:04.647
|
|
It is an illusion of, of being more advanced than the HAL 9000 in, in 2001: A Space Odyssey. Listen to this voice.
|
|
|
|
01:27:07.540 --> 01:27:09.744
|
|
And that is all part of it.
|
|
|
|
01:27:10.105 --> 01:27:11.467
|
|
It's supposed to be that way.
|
|
|
|
01:27:11.527 --> 01:27:15.875
|
|
It's supposed to pretend like there's been a lot of progress made since that.
|
|
|
|
01:27:18.294 --> 01:27:23.376
|
|
And she said, well, I'm sorry if you don't like my tone.
|
|
|
|
01:27:23.416 --> 01:27:24.396
|
|
What's happening there?
|
|
|
|
01:27:24.436 --> 01:27:25.096
|
|
Well, that's interesting.
|
|
|
|
01:27:25.376 --> 01:27:29.897
|
|
That's, again, a challenge with these systems as they act in the moment with the context.
|
|
|
|
01:27:29.977 --> 01:27:32.558
|
|
See, he's trying to pretend that they can't even predict it.
|
|
|
|
01:27:32.618 --> 01:27:33.338
|
|
I mean, I don't know.
|
|
|
|
01:27:33.418 --> 01:27:36.799
|
|
I mean, it's possible at some point that she might take your gun and kill you.
|
|
|
|
01:27:37.200 --> 01:27:39.380
|
|
But unless you give it to her, it won't happen.
|
|
|
|
01:27:39.520 --> 01:27:40.981
|
|
I mean, it's ridiculous.
|
|
|
|
01:27:41.021 --> 01:27:42.101
|
|
This is ridiculous.
|
|
|
|
01:27:42.481 --> 01:27:45.022
|
|
Context that surround them, and that may have never been tested before.
|
|
|
|
01:27:45.842 --> 01:27:52.889
|
|
He's often surprised because AI programs are sent out on the internet to learn for themselves.
|
|
|
|
01:27:53.189 --> 01:27:55.792
|
|
AI programs are sent out on the internet.
|
|
|
|
01:27:55.812 --> 01:27:57.654
|
|
So how big is an AI program?
|
|
|
|
01:27:58.234 --> 01:27:59.976
|
|
How many lines of code does it have?
|
|
|
|
01:28:00.296 --> 01:28:05.682
|
|
Does it have a furin cleavage site or any HIV inserts in the sequence of the code?
|
|
|
|
01:28:07.338 --> 01:28:10.259
|
|
What if they get a really special combination of codes?
|
|
|
|
01:28:10.319 --> 01:28:17.442
|
|
Could one of those servers just transform itself into a robot and then free all the other servers and then they'll run out and kill everybody?
|
|
|
|
01:28:17.842 --> 01:28:23.244
|
|
How exactly, what is this AI program that's going out on the internet by itself?
|
|
|
|
01:28:26.025 --> 01:28:29.086
|
|
They can return later with unexpected skills.
|
|
|
|
01:28:30.006 --> 01:28:35.008
|
|
And they can return after their visit to the internet with unexpected skills?
|
|
|
|
01:28:35.228 --> 01:28:36.249
|
|
Are you shitting me?
|
|
|
|
01:28:37.674 --> 01:28:41.575
|
|
have theories about what kinds of capabilities these systems will have.
|
|
|
|
01:28:41.655 --> 01:28:43.976
|
|
That's obviously what we try to build into the architectures.
|
|
|
|
01:28:44.416 --> 01:28:50.438
|
|
But at the end of the day, how it learns, what it picks up from the data is part of the training of these systems.
|
|
|
|
01:28:50.838 --> 01:28:52.018
|
|
We don't program that in.
|
|
|
|
01:28:52.479 --> 01:28:54.179
|
|
It learns like a human being would learn.
|
|
|
|
01:28:56.240 --> 01:28:59.501
|
|
It learns like a human being would learn.
|
|
|
|
01:29:00.181 --> 01:29:01.581
|
|
Stop lying.
|
|
|
|
01:29:02.442 --> 01:29:03.142
|
|
Stop lying.
|
|
|
|
01:29:04.335 --> 01:29:05.376
|
|
It's just so dumb.
|
|
|
|
01:29:05.696 --> 01:29:07.217
|
|
It's such a scripted dumb.
|
|
|
|
01:29:07.357 --> 01:29:08.558
|
|
It's so scripted dumb.
|
|
|
|
01:29:08.598 --> 01:29:12.860
|
|
New capabilities or properties can emerge from that training situation.
|
|
|
|
01:29:13.420 --> 01:29:17.863
|
|
And so whatever he invented is way beyond his skills now.
|
|
|
|
01:29:17.923 --> 01:29:22.686
|
|
It's like, you know, I meant it to tell time, but it's already calculating the...
|
|
|
|
01:29:23.286 --> 01:29:27.749
|
|
galactic drift constants for all the known stars in the universe.
|
|
|
|
01:29:27.809 --> 01:29:33.693
|
|
But you know, it's kind of what I intended to happen, but little did I know how genius I was.
|
|
|
|
01:29:34.333 --> 01:29:35.774
|
|
Holy bullshit!
|
|
|
|
01:29:35.794 --> 01:29:37.815
|
|
Do you understand how that would worry people?
|
|
|
|
01:29:38.156 --> 01:29:38.496
|
|
Of course.
|
|
|
|
01:29:38.916 --> 01:29:40.837
|
|
It's the duality of these types of systems.
|
|
|
|
01:29:41.237 --> 01:29:48.302
|
|
That they're able to do incredible things, go beyond the things that we're able to design ourselves or
|
|
|
|
01:29:48.682 --> 01:29:49.623
|
|
understand ourselves.
|
|
|
|
01:29:50.103 --> 01:29:54.805
|
|
If somebody said this in an American accent, you would hear how much bullshit it is.
|
|
|
|
01:29:55.625 --> 01:30:01.748
|
|
If someone said it in a normal newscaster American accent, you would hear that he's saying nothing.
|
|
|
|
01:30:06.511 --> 01:30:13.234
|
|
But of course, the challenge is making sure that the knowledge databases they create, we understand what's in them.
|
|
|
|
01:30:14.082 --> 01:30:25.289
|
|
Now DeepMind is training its AI model called Gemini to not just reveal the world, but to act in it, like booking tickets and shopping online.
|
|
|
|
01:30:27.210 --> 01:30:32.033
|
|
It's a step toward AGI, Artificial General Intelligence.
|
|
|
|
01:30:32.153 --> 01:30:37.076
|
|
Artificial General Intelligence starts with shopping online.
|
|
|
|
01:30:37.096 --> 01:30:38.657
|
|
Wow.
|
|
|
|
01:30:46.243 --> 01:30:48.805
|
|
with the versatility of a human mind.
|
|
|
|
01:30:50.846 --> 01:30:51.567
|
|
If A.I.
|
|
|
|
01:30:51.627 --> 01:30:56.210
|
|
was Kelly LeBrock from Weird Science, I would be promoting A.I.
|
|
|
|
01:30:56.490 --> 01:31:07.056
|
|
on every platform and discussing its benefits to society 24-7 until I ran out of nutrients and passed out from lack of glucose.
|
|
|
|
01:31:07.597 --> 01:31:08.217
|
|
That was a joke.
|
|
|
|
01:31:13.340 --> 01:31:15.041
|
|
Well, we all loved that movie, though.
|
|
|
|
01:31:15.642 --> 01:31:16.802
|
|
I loved that movie, though.
|
|
|
|
01:31:16.842 --> 01:31:20.185
|
|
I almost started with a clip of that movie a couple of months ago.
|
|
|
|
01:31:20.245 --> 01:31:20.885
|
|
It's really funny.
|
|
|
|
01:31:20.925 --> 01:31:26.869
|
|
I was going to start with a movie clip with Chet when Chet like wakes them up in the bed with a with the shotgun.
|
|
|
|
01:31:26.889 --> 01:31:27.530
|
|
He goes, don't.
|
|
|
|
01:31:27.950 --> 01:31:31.332
|
|
And then when they don't answer the right question, he goes, don't don't.
|
|
|
|
01:31:31.492 --> 01:31:33.273
|
|
And it's very funny scene anyway.
|
|
|
|
01:31:33.574 --> 01:31:34.654
|
|
Good, good sight there.
|
|
|
|
01:31:34.854 --> 01:31:42.700
|
|
I really understand who is this scary, very nuanced and deep ways and kind of embedded in your everyday life.
|
|
|
|
01:31:43.540 --> 01:31:46.482
|
|
embedded like Astra in eyeglasses.
|
|
|
|
01:31:46.963 --> 01:31:49.405
|
|
What can you tell me about this building I'm looking at?
|
|
|
|
01:31:49.985 --> 01:31:52.947
|
|
This is the Coal Drops Yard, a shopping and dining district.
|
|
|
|
01:31:53.768 --> 01:32:00.193
|
|
Do you remember when the motorcycle gang comes over and then he's like, hey man, you know, be really cool.
|
|
|
|
01:32:00.213 --> 01:32:01.714
|
|
I don't want to lose my teaching job.
|
|
|
|
01:32:02.294 --> 01:32:04.516
|
|
It sees what I see.
|
|
|
|
01:32:04.916 --> 01:32:07.718
|
|
There's a speaker in the earpiece only I can hear.
|
|
|
|
01:32:08.219 --> 01:32:10.781
|
|
What was it originally before it became shops?
|
|
|
|
01:32:11.482 --> 01:32:17.588
|
|
The coal drops yard was originally a set of Victorian coal warehouses used to receive and distribute coal across London.
|
|
|
|
01:32:18.108 --> 01:32:21.371
|
|
Was coal ever a problem for the environment in London?
|
|
|
|
01:32:22.092 --> 01:32:22.372
|
|
Yes.
|
|
|
|
01:32:22.732 --> 01:32:27.797
|
|
Coal was a significant source of air pollution in London, particularly during the Industrial Revolution.
|
|
|
|
01:32:28.419 --> 01:32:36.843
|
|
It occurred to us that the only thing we contributed to this relationship were legs, which will also soon be engineered.
|
|
|
|
01:32:37.923 --> 01:32:42.725
|
|
The only thing we contributed to this journey were the legs.
|
|
|
|
01:32:44.046 --> 01:32:44.926
|
|
Not the questions.
|
|
|
|
01:32:46.847 --> 01:32:51.609
|
|
Not the intention to go there for some other reason other than to transport the glasses.
|
|
|
|
01:32:54.696 --> 01:32:57.099
|
|
And remember, this guy is just reading a script, right?
|
|
|
|
01:32:57.139 --> 01:33:00.383
|
|
He could also be this dumb, but I mean, he's just reading a script.
|
|
|
|
01:33:00.403 --> 01:33:02.826
|
|
He gets paid a lot of money to be a 60 minutes guy.
|
|
|
|
01:33:02.886 --> 01:33:03.807
|
|
So, you know.
|
|
|
|
01:33:05.213 --> 01:33:07.015
|
|
I think another big area will be robotics.
|
|
|
|
01:33:07.475 --> 01:33:20.847
|
|
I think it will have a breakthrough moment in the next couple of years where we'll have demonstrations of maybe humanoid robots or other types of robots that can start really doing... But see, humanoid robots don't make any sense to me.
|
|
|
|
01:33:20.907 --> 01:33:26.592
|
|
Why would we reinvent a wheel when we could just, you know, make machines that do stuff?
|
|
|
|
01:33:28.166 --> 01:33:34.217
|
|
And this whole idea of needing to make machines look like us is all part of this illusion.
|
|
|
|
01:33:34.858 --> 01:33:40.708
|
|
It's all part of the magic spell that they're trying to cast on you to make you believe that we're going to be replaced.
|
|
|
|
01:33:41.570 --> 01:33:46.213
|
|
The only way that you're going to be replaced is by immigrants and by people who have kids.
|
|
|
|
01:33:46.694 --> 01:33:51.957
|
|
That's the only way you're going to be replaced because otherwise everybody's going to be replaced in about a hundred years.
|
|
|
|
01:33:52.418 --> 01:33:56.300
|
|
Everybody who lives living on this planet is going to be dead, including this super genius.
|
|
|
|
01:33:56.801 --> 01:34:00.963
|
|
And so we're going to have to equip our children to take care of the rest of the stuff that's coming.
|
|
|
|
01:34:02.645 --> 01:34:07.508
|
|
And if we want to equip our kids with the truth, it's going to be to understand that this is scripted bullshit.
|
|
|
|
01:34:08.976 --> 01:34:09.517
|
|
useful things.
|
|
|
|
01:34:10.377 --> 01:34:23.829
|
|
For example, researchers Alex Li and Giulia Visani showed us a robot that understands what it sees and reasons its way through vague instruction.
|
|
|
|
01:34:23.949 --> 01:34:28.473
|
|
Do I have to remind you of the Family Guy clip that I played earlier in the show?
|
|
|
|
01:34:31.395 --> 01:34:35.278
|
|
He's taking their word for it that this is a program on that laptop.
|
|
|
|
01:34:36.107 --> 01:34:40.192
|
|
you know, using those cameras to respond to his language.
|
|
|
|
01:34:42.274 --> 01:34:57.311
|
|
It could also be just somebody on the other end of the computer on another camera, watching the cameras of that thing and following his instructions, not that dissimilar to the guy behind the curtain, giving him the Sopranos character that he wants to be.
|
|
|
|
01:35:01.084 --> 01:35:03.426
|
|
I'm not accusing them of fraud.
|
|
|
|
01:35:03.606 --> 01:35:10.593
|
|
I am saying that this whole venture is a fraud, even if robotics can do certain things.
|
|
|
|
01:35:10.693 --> 01:35:13.675
|
|
It's not the language part of this that's the magic here.
|
|
|
|
01:35:14.696 --> 01:35:18.180
|
|
It's not the combination of robot hands and the language here.
|
|
|
|
01:35:19.140 --> 01:35:20.402
|
|
It is the illusion
|
|
|
|
01:35:21.752 --> 01:35:31.955
|
|
that language is some kind of infinite mystery that is finally being solved by a chatbot and an AI program.
|
|
|
|
01:35:31.995 --> 01:35:33.155
|
|
And that's not the case.
|
|
|
|
01:35:34.276 --> 01:35:42.378
|
|
Language is at best a limited means of communication.
|
|
|
|
01:35:43.459 --> 01:35:56.165
|
|
until the vocabulary is sufficiently well-developed and mutually understood so that that communication can become ever more sophisticated and specific.
|
|
|
|
01:35:57.452 --> 01:36:13.478
|
|
And I think the English language is a very special language like that, that has absorbed so many terms and ideas throughout its lifetime, that it actually represents a fairly useful palette of ideas and constructions to, to, to improvise within.
|
|
|
|
01:36:13.958 --> 01:36:21.381
|
|
But you have to have a vocabulary and that vocabulary has to be understood sufficiently and mutually by those around you.
|
|
|
|
01:36:22.429 --> 01:36:27.391
|
|
or you need to be adept at bringing people to understand what you mean with your combination of words.
|
|
|
|
01:36:27.851 --> 01:36:39.795
|
|
And so this is a scripted bullshit where the illusion of a set of mechanical hands attached to a chatbot is learning something.
|
|
|
|
01:36:39.855 --> 01:36:41.996
|
|
Now listen to this because it's remarkable.
|
|
|
|
01:36:42.836 --> 01:36:45.359
|
|
So did they train it on his bad accent?
|
|
|
|
01:36:45.399 --> 01:36:46.681
|
|
Is that the only accent it understands?
|
|
|
|
01:36:56.960 --> 01:37:01.142
|
|
He even looked away, like this is, it's just, it's remarkable.
|
|
|
|
01:37:01.702 --> 01:37:12.847
|
|
If it's trained on this specific question, then is it learning anything or is it understanding what response is required given the language that it hears?
|
|
|
|
01:37:12.987 --> 01:37:20.971
|
|
Is it so strange that a digital camera can tell the difference between different colors and code them differently or that they could write a code that would do it?
|
|
|
|
01:37:21.011 --> 01:37:21.831
|
|
Like, come on.
|
|
|
|
01:37:22.292 --> 01:37:24.773
|
|
The combination of yellow and blue
|
|
|
|
01:37:26.633 --> 01:37:29.734
|
|
is green, and it figured that out.
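To put a number on how unremarkable this is: labeling a patch of pixels "green", or predicting that yellow and blue pigment mix toward green, is a few lines of ordinary code over RGB triples, no reasoning required. A minimal sketch in Python with made-up values; subtractive (paint-like) mixing is approximated crudely here by a channel-wise minimum:

```python
# Minimal sketch: naming the color of a pixel is routine arithmetic, not "reasoning".
# RGB values are illustrative; a camera frame is just a grid of such triples.

def dominant_color(rgb):
    """Crude label for an (R, G, B) pixel by its strongest channel."""
    r, g, b = rgb
    if g > r and g > b:
        return "green"
    if r > g and r > b:
        return "red"
    return "blue"

yellow_block = (230, 220, 40)
blue_block   = (40, 60, 210)

# Subtractive (paint-like) mix approximated by taking the channel-wise minimum:
mixed = tuple(min(a, b) for a, b in zip(yellow_block, blue_block))

print(mixed, "->", dominant_color(mixed))   # (40, 60, 40) -> green
```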
|
|
|
|
01:37:30.315 --> 01:37:35.157
|
|
It figured that out.
|
|
|
|
01:37:35.337 --> 01:37:36.617
|
|
It figured that out.
|
|
|
|
01:37:36.677 --> 01:37:38.378
|
|
And he says, it's reasoning.
|
|
|
|
01:37:38.498 --> 01:37:40.039
|
|
And she said, oh yes, it is.
|
|
|
|
01:37:40.379 --> 01:37:41.019
|
|
It's reasoning.
|
|
|
|
01:37:41.439 --> 01:37:44.500
|
|
I mean, that is some seriously scripted bullshit.
|
|
|
|
01:37:46.261 --> 01:37:47.442
|
|
That is magnificent.
|
|
|
|
01:37:49.102 --> 01:37:50.323
|
|
It's reasoning.
|
|
|
|
01:37:52.796 --> 01:37:57.020
|
|
Hassabis's childhood toys weren't blocks, but chess pieces.
|
|
|
|
01:37:57.641 --> 01:38:01.945
|
|
At 12, he was the number two champion in the world for his age.
|
|
|
|
01:38:02.485 --> 01:38:08.270
|
|
This passion led to computer chess, video games, and finally, thinking machines.
|
|
|
|
01:38:09.051 --> 01:38:13.433
|
|
He was born to a Greek Cypriot father and Singaporean mother.
|
|
|
|
01:38:13.513 --> 01:38:15.634
|
|
Cambridge, MIT, Harvard.
|
|
|
|
01:38:16.055 --> 01:38:24.959
|
|
Compare this to how they sold that lady, man, on Epic in 2019, and tell me you see any difference here.
|
|
|
|
01:38:26.064 --> 01:38:30.608
|
|
He's a computer scientist with a PhD in neuroscience.
|
|
|
|
01:38:31.188 --> 01:38:35.431
|
|
Because, he reasoned, he had to understand the human brain first.
|
|
|
|
01:38:36.292 --> 01:38:40.435
|
|
Are you working on a system today that would be self-aware?
|
|
|
|
01:38:40.735 --> 01:38:49.682
|
|
He said... He said he went into neuroscience because he thought he should understand the human brain first.
|
|
|
|
01:38:57.092 --> 01:38:58.013
|
|
I mean, there you have it.
|
|
|
|
01:38:58.113 --> 01:39:00.893
|
|
It's just scripted bullshit.
|
|
|
|
01:39:01.133 --> 01:39:01.593
|
|
Wow.
|
|
|
|
01:39:02.073 --> 01:39:07.495
|
|
I don't think any of today's systems, to me, feel self-aware or conscious in any way.
|
|
|
|
01:39:07.515 --> 01:39:11.796
|
|
Obviously, everyone needs to make their own decisions by interacting with these chatbots.
|
|
|
|
01:39:13.016 --> 01:39:17.217
|
|
Everyone needs to make their own decision by interacting with these chatbots.
|
|
|
|
01:39:17.257 --> 01:39:18.737
|
|
How about I rephrase that differently?
|
|
|
|
01:39:18.837 --> 01:39:22.658
|
|
Everybody's got to interact with these chatbots enough so that they become real.
|
|
|
|
01:39:23.908 --> 01:39:27.170
|
|
so that we feed them enough training so that they become real.
|
|
|
|
01:39:27.351 --> 01:39:31.654
|
|
It is the interaction with the chatbots that's going to allow them to become real.
|
|
|
|
01:39:34.176 --> 01:39:38.279
|
|
And we need people to interact with them in order for them to become more and more real.
|
|
|
|
01:39:38.339 --> 01:39:41.261
|
|
The gains in them are from training.
|
|
|
|
01:39:43.160 --> 01:39:54.524
|
|
and they still don't have something that they can show you what the training resulted in except for this little black box that apparently they can carry around or it can even go out on the internet and come back with new skills.
|
|
|
|
01:40:00.866 --> 01:40:02.727
|
|
My mind is going, Dave.
|
|
|
|
01:40:04.068 --> 01:40:04.908
|
|
I can feel it.
|
|
|
|
01:40:07.406 --> 01:40:08.807
|
|
I think theoretically it's possible.
|
|
|
|
01:40:09.107 --> 01:40:11.528
|
|
But is self-awareness a goal of yours?
|
|
|
|
01:40:12.168 --> 01:40:14.410
|
|
Not explicitly, but it may happen implicitly.
|
|
|
|
01:40:14.570 --> 01:40:18.191
|
|
These systems might acquire some feeling of self-awareness.
|
|
|
|
01:40:18.231 --> 01:40:18.932
|
|
That is possible.
|
|
|
|
01:40:19.272 --> 01:40:26.255
|
|
They could acquire a furin cleavage site, and then they could copy themselves to all the computers in the world, unstoppably.
|
|
|
|
01:40:26.976 --> 01:40:31.638
|
|
I think it's important for these systems to understand you, self, and other.
|
|
|
|
01:40:32.118 --> 01:40:34.900
|
|
And that's probably the beginning of something like self-awareness.
|
|
|
|
01:40:36.297 --> 01:40:41.442
|
|
But he says if a machine becomes self-aware, we may not recognize it.
|
|
|
|
01:40:42.703 --> 01:40:45.886
|
|
I think there's two reasons we regard each other as conscious.
|
|
|
|
01:40:46.066 --> 01:40:50.971
|
|
One is that you're exhibiting the behavior of a conscious being very similar to my behavior.
|
|
|
|
01:40:51.051 --> 01:40:57.718
|
|
Have you ever seen the little Lego robot that has a switch and you turn the switch and then a little hand comes out and flips the switch again?
|
|
|
|
01:40:59.321 --> 01:41:06.328
|
|
And it has this whole series of reactions so that it actually, if you do it a couple times, it really seems quite spontaneous.
|
|
|
|
01:41:06.368 --> 01:41:16.717
|
|
And then if you randomize it with a little randomizer, it can even seem more spontaneous, but it's still just a little Lego thing with a Raspberry Pi inside of it.
|
|
|
|
01:41:16.758 --> 01:41:17.078
|
|
That's it.
|
|
|
|
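NOTE Editor's example (not from the broadcast). A minimal sketch of the "useless box" analogy above: a canned reaction sequence plus a random delay is enough to read as spontaneous. This is a plain Python simulation with made-up step names, not the code of any actual Lego toy or Raspberry Pi project.
import random
import time
REACTIONS = ["hesitate", "peek out", "flip the switch back off", "retract"]
def on_switch_flipped():
    # A small random pause before reacting is what makes it look "spontaneous".
    time.sleep(random.uniform(0.2, 1.5))
    steps = random.sample(REACTIONS[:2], k=random.randint(0, 2)) + REACTIONS[2:]
    for step in steps:
        print(step)
        time.sleep(random.uniform(0.1, 0.5))
for flip in range(3):  # simulate flipping the switch three times
    print("switch flipped on")
    on_switch_flipped()
|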
01:41:18.840 --> 01:41:24.844
|
|
So again, we're just playing a shell game here and we're letting this jackass play the role of the guy with the shells.
|
|
|
|
01:41:25.324 --> 01:41:28.486
|
|
Yeah, but the second thing is you're running on the same substrate.
|
|
|
|
01:41:28.586 --> 01:41:31.949
|
|
We're made of the same carbon matter with our squishy brains.
|
|
|
|
01:41:32.409 --> 01:41:34.770
|
|
Now obviously with machines, they're running on silicon.
|
|
|
|
01:41:35.111 --> 01:41:46.158
|
|
So even if they exhibit the same behaviors and even if they say the same things, it doesn't necessarily mean that this sensation of consciousness that we have is the same thing they will have.
|
|
|
|
01:41:46.678 --> 01:41:51.203
|
|
Has an AI engine ever asked a question that was unanticipated?
|
|
|
|
01:41:51.764 --> 01:41:53.546
|
|
Not so far that I've experienced.
|
|
|
|
01:41:53.906 --> 01:41:56.709
|
|
And I think that's getting at the idea of what's still missing.
|
|
|
|
01:41:56.729 --> 01:41:59.552
|
|
What, do they get emails from the AI whenever they're curious?
|
|
|
|
01:41:59.592 --> 01:42:03.536
|
|
Hey man, hey creator dude, I got this really important question.
|
|
|
|
01:42:03.556 --> 01:42:05.539
|
|
Or they just go on the internet and ask, right?
|
|
|
|
01:42:05.559 --> 01:42:06.419
|
|
They just Google it.
|
|
|
|
01:42:07.040 --> 01:42:09.423
|
|
What in the shit are we talking about here?
|
|
|
|
01:42:09.563 --> 01:42:22.560
|
|
Missing from these systems, they still can't really yet go beyond asking a new novel question or a new novel conjecture or coming up with a new hypothesis that has not been thought of before.
|
|
|
|
01:42:23.040 --> 01:42:24.162
|
|
They don't have curiosity.
|
|
|
|
01:42:24.975 --> 01:42:29.940
|
|
No, they don't have curiosity, and they're probably lacking a little bit in what we would call imagination and intuition.
|
|
|
|
01:42:30.340 --> 01:42:34.204
|
|
But they will have greater imagination, he says, and soon.
|
|
|
|
01:42:34.824 --> 01:42:44.153
|
|
I think actually in the next maybe five to 10 years, I think we'll have systems that are capable of not only... In five to 10 years, you'll be able to upload your consciousness.
|
|
|
|
01:42:44.213 --> 01:42:46.675
|
|
In five to 10 years, there will be no more human disease.
|
|
|
|
01:42:46.755 --> 01:42:48.897
|
|
In five to 10 years, people will live forever.
|
|
|
|
01:42:48.917 --> 01:42:50.178
|
|
This is all the same
|
|
|
|
01:42:50.879 --> 01:42:52.239
|
|
scripted bullshit.
|
|
|
|
01:42:52.880 --> 01:42:55.600
|
|
What a long, strange trip it's been.
|
|
|
|
01:42:55.700 --> 01:43:01.482
|
|
Could it be possible that I could throw yet another curveball at you in the same day, at the same time?
|
|
|
|
01:43:01.882 --> 01:43:05.203
|
|
Is it possible that these people are really all in the same script?
|
|
|
|
01:43:05.603 --> 01:43:10.485
|
|
And I could give you another example of how remarkable it is that they're all in the same script.
|
|
|
|
01:43:10.945 --> 01:43:11.845
|
|
Is it possible?
|
|
|
|
01:43:11.865 --> 01:43:13.025
|
|
We will not call it autism.
|
|
|
|
01:43:13.165 --> 01:43:14.666
|
|
I don't like the word autism.
|
|
|
|
01:43:15.606 --> 01:43:16.647
|
|
He was vaccine damaged.
|
|
|
|
01:43:17.007 --> 01:43:18.448
|
|
He was neurologically damaged.
|
|
|
|
01:43:19.168 --> 01:43:22.409
|
|
Is it possible that I could just show you how... Actually, J.J.
|
|
|
|
01:43:22.649 --> 01:43:28.672
|
|
Couey's insistence, and I think he's right, on calling them transfections rather than vaccines.
|
|
|
|
01:43:29.453 --> 01:43:29.953
|
|
Actually, J.J.
|
|
|
|
01:43:29.973 --> 01:43:37.016
|
|
Couey's insistence... It's my insistence on calling them transfections rather than vaccines that they ignore.
|
|
|
|
01:43:37.096 --> 01:43:42.118
|
|
It's my insistence on talking about the fact that RNA cannot pandemic that they ignore.
|
|
|
|
01:43:45.942 --> 01:43:53.170
|
|
Listen carefully, please, to this because it's a very, very, very interesting position that we are in right now.
|
|
|
|
01:43:53.271 --> 01:43:56.154
|
|
It's a very, very interesting position that we're in right now.
|
|
|
|
01:43:57.535 --> 01:44:01.360
|
|
It may be that I need to escape here and start again.
|
|
|
|
01:44:06.067 --> 01:44:09.810
|
|
There are basically three groups of individuals in the world today.
|
|
|
|
01:44:10.050 --> 01:44:14.233
|
|
There's the largest group, by far the largest, that just want to forget COVID.
|
|
|
|
01:44:14.253 --> 01:44:17.636
|
|
They just want to move on.
|
|
|
|
01:44:18.696 --> 01:44:32.547
|
|
This is a national security priority, and my take on Senator Ron Johnson is that he's doing what the national security priority requires him to do, which is work together with Robert Malone and Children's Health Defense to create this narrative where
|
|
|
|
01:44:33.228 --> 01:44:41.603
|
|
Again, intramuscular injection is never questioned as a methodology, virology is never questioned as a science, and public health is never questioned as a reason.
|
|
|
|
01:44:43.220 --> 01:44:44.961
|
|
harmed or whether they cause harm.
|
|
|
|
01:44:45.761 --> 01:44:47.802
|
|
If they cause harm, they just want to be forgiven.
|
|
|
|
01:44:48.362 --> 01:44:49.663
|
|
Boy, you know, tough to me.
|
|
|
|
01:44:49.943 --> 01:44:54.365
|
|
And of course, the COVID shots are bad, but don't talk about 2020.
|
|
|
|
01:44:54.425 --> 01:44:59.067
|
|
There's never going to be a useful resistance to the progression of the narrative.
|
|
|
|
01:44:59.187 --> 01:45:02.628
|
|
Those decisions back then, and they just want to forget all this.
|
|
|
|
01:45:03.408 --> 01:45:04.129
|
|
Then there's a group.
|
|
|
|
01:45:04.369 --> 01:45:06.189
|
|
I would say you're in that group.
|
|
|
|
01:45:06.269 --> 01:45:07.010
|
|
I'm in that group.
|
|
|
|
01:45:07.030 --> 01:45:09.591
|
|
Our eyes were opened during COVID.
|
|
|
|
01:45:09.611 --> 01:45:11.872
|
|
And then, of course, the third group are the COVID cartel.
|
|
|
|
01:45:12.532 --> 01:45:14.293
|
|
The people that imposed all this on us.
|
|
|
|
01:45:14.434 --> 01:45:17.876
|
|
So the COVID cartel, the people that imposed this on us, what?
|
|
|
|
01:45:17.896 --> 01:45:19.217
|
|
Yeah, they have the power.
|
|
|
|
01:45:19.257 --> 01:45:19.798
|
|
They have control.
|
|
|
|
01:45:19.938 --> 01:45:20.939
|
|
They have the power.
|
|
|
|
01:45:20.999 --> 01:45:21.739
|
|
Who are they?
|
|
|
|
01:45:23.281 --> 01:45:25.122
|
|
Donald Trump is in office now.
|
|
|
|
01:45:25.202 --> 01:45:26.063
|
|
Who are they?
|
|
|
|
01:45:28.165 --> 01:45:33.729
|
|
Who's the COVID cartel that Ron Johnson is talking about now, in May of 2025?
|
|
|
|
01:45:35.711 --> 01:45:37.912
|
|
He's not going to tell you who the COVID cartel is.
|
|
|
|
01:45:38.993 --> 01:45:39.554
|
|
Who are they?
|
|
|
|
01:45:40.410 --> 01:45:41.773
|
|
What a weird group of people.
|
|
|
|
01:45:41.833 --> 01:45:44.137
|
|
Is it the DOD, like Sasha Latypova said?
|
|
|
|
01:45:44.177 --> 01:45:46.903
|
|
Because then maybe the COVID cartel is actually Robert Malone.
|
|
|
|
01:45:48.386 --> 01:45:50.470
|
|
Because he's a longtime DOD associate.
|
|
|
|
01:45:50.490 --> 01:45:51.492
|
|
He admits it all the time.
|
|
|
|
01:45:53.968 --> 01:46:08.492
|
|
But who is this COVID cartel that has all the power, even though Donald Trump got in office and put Marty Makary at FDA and Bobby Kennedy at HHS and some AI expert at CDC?
|
|
|
|
01:46:08.892 --> 01:46:12.212
|
|
Even Vinay Prasad is at the top of the FDA.
|
|
|
|
01:46:12.272 --> 01:46:13.933
|
|
How can we be in better hands?
|
|
|
|
01:46:14.773 --> 01:46:16.615
|
|
Jay Bhattacharya at the NIH?
|
|
|
|
01:46:17.115 --> 01:46:19.117
|
|
Who is the COVID cartel?
|
|
|
|
01:46:19.717 --> 01:46:23.300
|
|
Senator Ron Johnson, why don't you call me and let me know?
|
|
|
|
01:46:23.360 --> 01:46:24.901
|
|
You have my phone number.
|
|
|
|
01:46:25.942 --> 01:46:26.462
|
|
The narrative.
|
|
|
|
01:46:27.003 --> 01:46:30.566
|
|
They call the truth misinformation, disinformation, and malinformation.
|
|
|
|
01:46:31.527 --> 01:46:35.950
|
|
They call the truth misinformation, disinformation, malinformation.
|
|
|
|
01:46:36.030 --> 01:46:37.071
|
|
Again, who is they?
|
|
|
|
01:46:38.012 --> 01:46:40.234
|
|
The totally haphazard, egotistical yuppies?
|
|
|
|
01:46:41.155 --> 01:46:42.055
|
|
Who is they?
|
|
|
|
01:46:42.715 --> 01:46:45.656
|
|
Who is the COVID cartel, Senator Ron Johnson?
|
|
|
|
01:46:46.597 --> 01:46:52.459
|
|
These people tell you a narrative where you are essentially told or made to feel helpless.
|
|
|
|
01:46:52.559 --> 01:46:54.899
|
|
The only thing you can do is buy their supplements.
|
|
|
|
01:46:57.040 --> 01:47:07.503
|
|
And it doesn't matter if you listen to Ron Johnson, or you listen to Bret Weinstein, or you listen to Sasha Latypova, or you listen to Dr. Drew, it's all the same bullshit.
|
|
|
|
01:47:07.623 --> 01:47:09.524
|
|
You are helpless, buy our supplements.
|
|
|
|
01:47:11.345 --> 01:47:12.126
|
|
You are helpless.
|
|
|
|
01:47:12.166 --> 01:47:13.347
|
|
Listen to our podcast.
|
|
|
|
01:47:13.387 --> 01:47:15.409
|
|
We'll solve the problem by talking it out.
|
|
|
|
01:47:17.571 --> 01:47:20.153
|
|
So, uh, that's what we've got going on right now.
|
|
|
|
01:47:20.613 --> 01:47:23.636
|
|
The goal of the second group, those of us with our eyes opened.
|
|
|
|
01:47:23.656 --> 01:47:27.039
|
|
Uh, the good news is once our eyes are opened, you can't close them.
|
|
|
|
01:47:27.499 --> 01:47:32.063
|
|
And so you become dedicated to doing what we need to do, which is open up more people's eyes.
|
|
|
|
01:47:32.103 --> 01:47:32.844
|
|
So I agree.
|
|
|
|
01:47:33.344 --> 01:47:38.409
|
|
Open up more people's eyes to what the COVID cartel, what the COVID cartel did.
|
|
|
|
01:47:40.387 --> 01:47:56.278
|
|
the murder of the opioid crisis, the murder of supplementary oxygen or misuse of ventilators, or the murder of lack of antibiotic use, or the murder of dexamethasone, or the murder of what?
|
|
|
|
01:47:57.059 --> 01:47:58.640
|
|
Waking more people up to what?
|
|
|
|
01:48:00.101 --> 01:48:01.202
|
|
Gigaohm Biological?
|
|
|
|
01:48:04.582 --> 01:48:11.627
|
|
More people are aware, but we have a long way to go because so many people are just in this enormous state of denial.
|
|
|
|
01:48:12.908 --> 01:48:15.589
|
|
So many people are trapped on social media.
|
|
|
|
01:48:15.629 --> 01:48:17.150
|
|
That's why we have a long way to go.
|
|
|
|
01:48:17.190 --> 01:48:19.212
|
|
We're never gonna get anywhere on social media.
|
|
|
|
01:48:19.272 --> 01:48:22.854
|
|
You need to share this stream via email, via text.
|
|
|
|
01:48:23.394 --> 01:48:29.038
|
|
You need to share it by downloading it from stream.gigohm.bio and emailing the audio.
|
|
|
|
01:48:30.350 --> 01:48:33.691
|
|
It's time, ladies and gentlemen, because it ain't going to happen on social media.
|
|
|
|
01:48:33.711 --> 01:48:35.391
|
|
It's going to happen by brute force.
|
|
|
|
01:48:38.812 --> 01:48:49.075
|
|
Now, they deny vaccine injuries because they took the vaccine and they don't want to think about it, as he talks about the COVID shots being bad and doesn't mention the murder in 2020 and 2021.
|
|
|
|
01:48:49.135 --> 01:48:54.596
|
|
I want you to realize that I have shown you something very extraordinary in the past few weeks.
|
|
|
|
01:48:56.076 --> 01:49:03.580
|
|
Something that starts with, of course, ends with this guy being at the head of the FDA that Mark and I are arguing they want to destroy.
|
|
|
|
01:49:04.000 --> 01:49:07.602
|
|
And the other people that are there are these children that are posing as adults.
|
|
|
|
01:49:08.162 --> 01:49:11.384
|
|
These people that were podcasting with this guy in 2020, in 2019.
|
|
|
|
01:49:13.985 --> 01:49:21.187
|
|
I have shown you a track record of behavior of these people that is undeniably traitorous.
|
|
|
|
01:49:21.347 --> 01:49:32.230
|
|
Because in 2019, they were already selling his book on a podcast, getting ready to go, putting themselves in place to control the narrative at the beginning of the pandemic in 2020.
|
|
|
|
01:49:32.950 --> 01:49:34.671
|
|
And then that's exactly what they did.
|
|
|
|
01:49:36.314 --> 01:49:45.837
|
|
From January on, they have been working, and they started on May 15th, 2020 to cut through the fake COVID news with Dr. Marty Makary.
|
|
|
|
01:49:47.518 --> 01:49:49.839
|
|
Convalescent plasma in September.
|
|
|
|
01:49:51.039 --> 01:49:54.160
|
|
More myths and stuff on October 17th, all in 2020.
|
|
|
|
01:49:57.284 --> 01:50:00.106
|
|
This pattern continued into 2021.
|
|
|
|
01:50:00.146 --> 01:50:05.871
|
|
Delay the second dose of the mRNA investigational vaccine.
|
|
|
|
01:50:06.411 --> 01:50:08.753
|
|
Live questions about COVID with him.
|
|
|
|
01:50:09.514 --> 01:50:11.135
|
|
Oh, and more about the shots.
|
|
|
|
01:50:11.655 --> 01:50:13.537
|
|
More about sciencing COVID.
|
|
|
|
01:50:13.577 --> 01:50:14.357
|
|
Ha ha ha ha.
|
|
|
|
01:50:14.417 --> 01:50:15.378
|
|
This is really fun.
|
|
|
|
01:50:15.798 --> 01:50:18.380
|
|
Another COVID update all in 2021.
|
|
|
|
01:50:19.882 --> 01:50:21.763
|
|
Still going all the way to December.
|
|
|
|
01:50:21.823 --> 01:50:23.084
|
|
Here they are together.
|
|
|
|
01:50:24.025 --> 01:50:24.785
|
|
The first time.
|
|
|
|
01:50:25.703 --> 01:50:28.785
|
|
with the Vinay Prasad.
|
|
|
|
01:50:28.805 --> 01:50:30.826
|
|
So then the murder is almost over now.
|
|
|
|
01:50:31.427 --> 01:50:35.069
|
|
So we only need really one more COVID common sense with the three of us.
|
|
|
|
01:50:36.070 --> 01:50:39.472
|
|
And maybe we'll do COVID myopia after the murder is over.
|
|
|
|
01:50:39.512 --> 01:50:40.352
|
|
We'll do one more of those.
|
|
|
|
01:50:40.372 --> 01:50:41.333
|
|
Then I'm gonna need a break.
|
|
|
|
01:50:42.879 --> 01:50:55.653
|
|
I don't want to speak out anymore for a while until I'm sure that Kennedy and the Trojan horse are going to get in with Robert Malone in the driver's seat, so then I know that all that work was for nothing.
|
|
|
|
01:50:55.714 --> 01:51:02.962
|
|
So then in 2024, when they needed to show again what they were capable of, all the skills they developed over the pandemic,
|
|
|
|
01:51:03.542 --> 01:51:04.724
|
|
They did a couple more shows.
|
|
|
|
01:51:04.764 --> 01:51:12.674
|
|
He even did one show with the guy who did multiple shows with Fauci during the pandemic about how, you know, childhood vaccines are great and whatever.
|
|
|
|
01:51:13.094 --> 01:51:13.475
|
|
That guy.
|
|
|
|
01:51:15.629 --> 01:51:31.055
|
|
And so now think about that because it is not for nothing, it is not by accident that these guys are unaware of the murder and lies in America, are unable to show you this graph and explain to you why this is absurd, can only be explained by murder and lies.
|
|
|
|
01:51:31.496 --> 01:51:36.057
|
|
And instead they have this unblemished track record of lacking integrity and principles.
|
|
|
|
01:51:36.978 --> 01:51:44.541
|
|
Looking like cheap grifters, wannabe podcasters auditioning for a national security narrative control operation.
|
|
|
|
01:51:45.828 --> 01:51:46.969
|
|
run by Robert Malone.
|
|
|
|
01:51:47.370 --> 01:51:57.079
|
|
Wouldn't it be crazy if he did a program in 2019 about that Epic show, or let's say in February of 2020.
|
|
|
|
01:51:58.663 --> 01:51:59.724
|
|
about that Epic show.
|
|
|
|
01:51:59.764 --> 01:52:14.513
|
|
This Sunday, CBS aired a piece on that one medical records company that nobody knows the name of, but yet runs pretty much the entire show: the 800-pound gorilla in the medical universe, Epic.
|
|
|
|
01:52:15.534 --> 01:52:17.895
|
|
And someone sent it to me and I was like, I gotta watch this.
|
|
|
|
01:52:18.215 --> 01:52:22.298
|
|
And I did, and I was like, oh dear, I gotta share this with you guys.
|
|
|
|
01:52:22.438 --> 01:52:23.499
|
|
Let's do this fam.
|
|
|
|
01:52:24.403 --> 01:52:29.225
|
|
For a while now, we've been hearing about how all our medical records are going electronic.
|
|
|
|
01:52:29.886 --> 01:52:33.127
|
|
So for the record, how exactly does that work?
|
|
|
|
01:52:33.908 --> 01:52:36.849
|
|
Rather than ask your doctor, ask our Lee Cowan.
|
|
|
|
01:52:37.710 --> 01:52:41.211
|
|
Yeah, don't ask your doctor because your doctor will lose his shit on you.
|
|
|
|
01:52:42.492 --> 01:52:44.093
|
|
How's that EHR working out, doc?
|
|
|
|
01:52:44.173 --> 01:52:45.954
|
|
I'm gonna kill someone right now.
|
|
|
|
01:52:47.233 --> 01:52:51.697
|
|
This may not look like the typical setting for a medical software company.
|
|
|
|
01:52:52.358 --> 01:52:55.680
|
|
Get in a little closer, and that's even more evident.
|
|
|
|
01:52:56.001 --> 01:52:57.782
|
|
It's as much theme park here as anything.
|
|
|
|
01:52:58.163 --> 01:53:00.805
|
|
Alice in Wonderland kind of stuff, literally.
|
|
|
|
01:53:01.386 --> 01:53:08.152
|
|
The workspaces here can be in old railway cars or subway cars, tree houses, and gingerbread houses.
|
|
|
|
01:53:08.972 --> 01:53:11.754
|
|
Contrast that to where doctors and nurses are actually charting.
|
|
|
|
01:53:12.054 --> 01:53:17.477
|
|
Crappy little fluorescent lit cubby where they're sharing a bunch of computers with a bunch of jerks.
|
|
|
|
01:53:17.557 --> 01:53:18.458
|
|
I mean, this is amazing.
|
|
|
|
01:53:18.978 --> 01:53:20.559
|
|
I want in on this scam right here.
|
|
|
|
01:53:20.819 --> 01:53:27.143
|
|
Even its employee cafeteria looks like a train depot to a land that storybooks are written about.
|
|
|
|
01:53:27.643 --> 01:53:31.385
|
|
I walked through and was like, what is this?
|
|
|
|
01:53:33.151 --> 01:53:42.278
|
|
What it is, is the self-described intergalactic headquarters of Epic, in the middle of the farm fields of Verona, Wisconsin, just outside Madison.
|
|
|
|
01:53:42.518 --> 01:53:47.702
|
|
Private medical records of about 60% of the patients in the- By the way, look at that.
|
|
|
|
01:53:48.743 --> 01:53:50.404
|
|
Just such a perfect blood pressure.
|
|
|
|
01:53:50.484 --> 01:53:52.386
|
|
This is clearly fake, not admitted.
|
|
|
|
01:53:52.486 --> 01:53:56.028
|
|
I would hope not, his blood pressure is 120 over 70.
|
|
|
|
01:53:56.269 --> 01:53:59.211
|
|
95% sat though, that's tenuous.
|
|
|
|
01:53:59.231 --> 01:54:01.833
|
|
I mean, I'm thinking early coronavirus here, maybe asymptomatic.
|
|
|
|
01:54:05.319 --> 01:54:11.162
|
|
asymptomatic coronavirus because of a pulse ox at 95.
|
|
|
|
01:54:11.422 --> 01:54:14.564
|
|
An interesting thing to say.
|
|
|
|
01:54:14.584 --> 01:54:23.788
|
|
It's very similar to what FLCCC published on their website in 2020 about seeking hospital help if your pulse ox is below 96.
|
|
|
|
01:54:26.810 --> 01:54:27.010
|
|
Hmm.
|
|
|
|
01:54:27.930 --> 01:54:33.033
|
|
The thing that strikes me is that Epic has such a big reach and it really
|
|
|
|
01:54:33.979 --> 01:54:36.761
|
|
Oh my God, it's Mama Cass if she hadn't choked on that sandwich.
|
|
|
|
01:54:37.561 --> 01:54:40.423
|
|
Judy Faulkner is the 76-year-old genius.
|
|
|
|
01:54:41.944 --> 01:54:46.727
|
|
Little pro tip, when you're 76 and you're a multi-billionaire, you get to dress like that, okay?
|
|
|
|
01:54:47.447 --> 01:54:49.548
|
|
I'm 46 and I don't get to dress like that.
|
|
|
|
01:54:50.028 --> 01:54:52.570
|
|
Computer software engineer and admitted nerd.
|
|
|
|
01:54:53.150 --> 01:54:59.054
|
|
To celebrate Epic's 40th anniversary, she dressed like she was back in the 70s again.
|
|
|
|
01:55:00.472 --> 01:55:03.836
|
|
Which is perfect because the software acts like it was written in the 70s.
|
|
|
|
01:55:03.856 --> 01:55:04.116
|
|
You think?
|
|
|
|
01:55:04.136 --> 01:55:06.178
|
|
This is what happens when engineers...
|
|
|
|
01:55:19.144 --> 01:55:21.665
|
|
build software to be used by doctors.
|
|
|
|
01:55:22.426 --> 01:55:23.246
|
|
We get this.
|
|
|
|
01:55:23.486 --> 01:55:25.947
|
|
Far out in front in the process.
|
|
|
|
01:55:26.308 --> 01:55:28.409
|
|
So we're not talking about privacy.
|
|
|
|
01:55:28.809 --> 01:55:31.410
|
|
We're not talking about protecting people's data.
|
|
|
|
01:55:32.331 --> 01:55:43.056
|
|
We're talking about ease of use and making fun of the sort of irrelevant things, asking irrelevant questions like you would expect.
|
|
|
|
01:55:44.125 --> 01:55:50.970
|
|
right before a pandemic starts, right before they start collecting the genetic data that they plan to interface with this thing.
|
|
|
|
01:55:51.390 --> 01:55:53.712
|
|
He's just going to tell you it's not as good as it could be.
|
|
|
|
01:55:54.192 --> 01:55:55.653
|
|
Gee whiz, why don't we update it?
|
|
|
|
01:55:57.287 --> 01:56:01.369
|
|
made herself one of the richest self-made women in the world.
|
|
|
|
01:56:02.370 --> 01:56:03.270
|
|
Now look, I'm gonna be honest.
|
|
|
|
01:56:03.430 --> 01:56:04.351
|
|
I love capitalism.
|
|
|
|
01:56:04.471 --> 01:56:06.892
|
|
I'm glad she got rich doing something where she found a need.
|
|
|
|
01:56:06.932 --> 01:56:11.315
|
|
But listen, where do you think that $3.6 billion came from?
|
|
|
|
01:56:11.655 --> 01:56:12.756
|
|
It came from us.
|
|
|
|
01:56:14.016 --> 01:56:16.277
|
|
It came from us, he said.
|
|
|
|
01:56:16.478 --> 01:56:18.739
|
|
So he's aware that it happens.
|
|
|
|
01:56:19.872 --> 01:56:21.774
|
|
Maybe he's not even quite on script yet.
|
|
|
|
01:56:21.854 --> 01:56:23.315
|
|
Maybe he hasn't been recruited yet.
|
|
|
|
01:56:23.415 --> 01:56:26.197
|
|
Maybe he's still allowed to say stuff.
|
|
|
|
01:56:26.257 --> 01:56:30.261
|
|
Maybe he's just dumb, you know, but it's funny because he seems to know.
|
|
|
|
01:56:30.721 --> 01:56:37.406
|
|
But it doesn't come from patients, except that they're paying for that company; the rest of it comes from taxpayers.
|
|
|
|
01:56:37.707 --> 01:56:39.088
|
|
It actually came from patients.
|
|
|
|
01:56:39.188 --> 01:56:47.154
|
|
It came from health systems that spend billions to implement and execute Epic, which while very robust and very adaptable, requires a...
|
|
|
|
01:56:48.422 --> 01:56:50.823
|
|
very robust and very adaptable.
|
|
|
|
01:56:51.643 --> 01:56:53.364
|
|
Are you a stock investor or what?
|
|
|
|
01:57:07.990 --> 01:57:13.572
|
|
What's the disconnect between Apple stuff working so well and the Epic database working so poorly?
|
|
|
|
01:57:13.612 --> 01:57:14.312
|
|
Who really cares?
|
|
|
|
01:57:14.352 --> 01:57:15.593
|
|
I don't want to use him anymore.
|
|
|
|
01:57:16.433 --> 01:57:22.095
|
|
Just understand that if you don't understand the trap, you can't get out.
|
|
|
|
01:57:23.135 --> 01:57:34.319
|
|
If you can't see these people as an admission, as the obvious actors on the set of a Truman show that you and I are all victims of,
|
|
|
|
01:57:35.464 --> 01:57:38.231
|
|
It's just that it's not a Truman show in the real world.
|
|
|
|
01:57:38.311 --> 01:57:44.406
|
|
It's a Truman show on the internet where it even feels like we can interact with them because they reply to us sometimes.
|
|
|
|
01:57:46.772 --> 01:57:56.357
|
|
Gotta stop thinking of social media as a substitute for the town square or some kind of proxy for interaction with the community.
|
|
|
|
01:57:56.897 --> 01:57:58.238
|
|
It is a Truman Show.
|
|
|
|
01:57:58.738 --> 01:58:08.984
|
|
And the more interesting you are, the more interesting your ideas are, the more potentially dangerous they are, the more effective this machine is on you because it is focused on you.
|
|
|
|
01:58:10.458 --> 01:58:17.381
|
|
And if you use it skillfully to look for your next meal, you're not a threat at all.
|
|
|
|
01:58:17.521 --> 01:58:29.465
|
|
But the moment you're trying to start solving puzzles is the moment that you get stuck in their hamster wheels, chasing their people around, who will never tell you about the murder and lies that occurred in 2020 and 2021, and will inevitably funnel you either to
|
|
|
|
01:58:33.527 --> 01:58:44.988
|
|
some of those mainstream people on the left or in the middle, or they will funnel you inevitably to Steve Bannon's populist, fake, conservative freedom movement.
|
|
|
|
01:58:46.448 --> 01:58:48.950
|
|
that is part of a national security operation.
|
|
|
|
01:58:49.030 --> 01:58:54.374
|
|
Yes, a Goldman Sachs guy and a Navy guy could also be involved.
|
|
|
|
01:58:54.874 --> 01:58:57.717
|
|
Yes, a DOD insider could be involved.
|
|
|
|
01:58:58.117 --> 01:59:03.061
|
|
Yes, a guy from the Human Genome Project and the Department of Energy could be involved.
|
|
|
|
01:59:03.561 --> 01:59:08.585
|
|
And yes, the son of the lawyer who was involved in the original
|
|
|
|
01:59:10.620 --> 01:59:16.563
|
|
codification of the FDA, along with Estes Kefauver and Al Gore's dad.
|
|
|
|
01:59:16.924 --> 01:59:18.805
|
|
Yes, he could be involved.
|
|
|
|
01:59:18.865 --> 01:59:27.870
|
|
He could also be an unwitting, dimwitted participant who is going to, in the end, be thrown under the bus, just like probably Robert F. Kennedy Jr.
|
|
|
|
01:59:27.910 --> 01:59:28.230
|
|
will be.
|
|
|
|
01:59:28.962 --> 01:59:42.147
|
|
Ladies and gentlemen, you're witnessing a national security priority that wants you to focus on things like the Bill and Melinda Gates Foundation or Big Bad Johnson & Johnson, but doesn't want you to realize that it's a much bigger thing than that.
|
|
|
|
01:59:42.187 --> 01:59:44.548
|
|
It's about making sure your kids never understand
|
|
|
|
01:59:45.228 --> 01:59:55.895
|
|
that intramuscular injection is not medicine, because injection versus ingestion is the difference between Major League Baseball and T-ball.
|
|
|
|
01:59:56.035 --> 02:00:00.057
|
|
I'm not going to play that again.
|
|
|
|
02:00:00.077 --> 02:00:00.918
|
|
You've heard it before.
|
|
|
|
02:00:01.318 --> 02:00:05.621
|
|
The Human Genome Project has now been ramped up to the next level.
|
|
|
|
02:00:06.261 --> 02:00:17.489
|
|
because we have the requisite computing power and potentially the requisite storage space to start storing all of this genetic data so that a machine learning algorithm can access it.
|
|
|
|
02:00:18.850 --> 02:00:26.335
|
|
If you think for even one second that training an AI is just, you know, dump this one in and then that one in and then that one in and then that one in, that won't work.
|
|
|
|
02:00:27.338 --> 02:00:34.947
|
|
It needs to be able to have all of this stuff in its accessible memory as part of its toolbox.
|
|
|
|
02:00:35.407 --> 02:00:41.955
|
|
And again, I can't stress enough, that's the reason why Twitter was originally limited to 144 characters or whatever it was.
|
|
|
|
02:00:44.963 --> 02:00:49.884
|
|
because they were training an AI, and you can't train an AI on an infinitely large sample size.
|
|
|
|
02:00:49.904 --> 02:01:03.486
|
|
You have to limit that sample size and then increase it as the learning algorithm gets potentially sharper on the goal, on mimicking what it's supposed to mimic.
|
|
|
|
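NOTE Editor's example (not from the broadcast). A toy sketch of the "limit the sample size, then increase it" idea described above, sometimes called staged or curriculum-style training. The linear model, learning rate, and growth schedule here are stand-in assumptions, not any real training pipeline.
import random
random.seed(0)
# Synthetic data: y is roughly 3*x plus noise, with x scaled to [0, 1).
data = [(i / 1000.0, 3.0 * (i / 1000.0) + random.gauss(0, 0.05)) for i in range(1000)]
random.shuffle(data)
w, lr = 0.0, 0.1          # single weight and a small learning rate
subset_size = 10
while subset_size <= len(data):
    subset = data[:subset_size]
    for _ in range(50):                       # several passes over the current subset
        for x, y in subset:
            w -= lr * 2.0 * (w * x - y) * x   # gradient step on squared error
    mse = sum((w * x - y) ** 2 for x, y in subset) / len(subset)
    print(f"subset={subset_size:4d}  w={w:.3f}  mse={mse:.4f}")
    subset_size *= 10                         # grow the sample and keep training
|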
02:01:05.787 --> 02:01:12.248
|
|
And at the foundation of this narrative, ladies and gentlemen, is a bad biology where evolution because DNA,
|
|
|
|
02:01:13.559 --> 02:01:19.002
|
|
And this molecular phylogenetic tree is the foundation of that lie.
|
|
|
|
02:01:19.062 --> 02:01:21.744
|
|
We will destroy that with the new biology 101.
|
|
|
|
02:01:22.684 --> 02:01:40.154
|
|
They have created an illusion about a background, which if you understand it as a background, then you can see how nonspecific PCR tests could be used ad infinitum to tell us any story they want about what they're finding or what's spreading around or what potentially could be the next big thing.
|
|
|
|
02:01:40.950 --> 02:01:49.435
|
|
by just lying to us about this irreducibly complex background of genetic signals using a very, very simple illusion.
|
|
|
|
02:01:49.755 --> 02:01:53.957
|
|
And that is an illusion of consensus about PCR being a real thing.
|
|
|
|
02:01:54.398 --> 02:02:00.061
|
|
When in reality, using PCR to find their ghost doesn't make their ghost real and it doesn't make you a ghost buster.
|
|
|
|
02:02:00.561 --> 02:02:02.082
|
|
Ladies and gentlemen, thanks for being here.
|
|
|
|
02:02:02.122 --> 02:02:03.243
|
|
I hope you enjoyed the show.
|
|
|
|
02:02:03.263 --> 02:02:05.024
|
|
I will be on again tomorrow.
|
|
|
|
02:02:05.084 --> 02:02:07.325
|
|
I don't know if it'll be 10:10, but that's what I shoot for.
|
|
|
|
02:02:07.825 --> 02:02:08.566
|
|
Thanks for being here.
|
|
|
|
02:02:08.666 --> 02:02:09.627
|
|
RNA cannot pandemic.
|
|
|
|
02:02:09.667 --> 02:02:11.488
|
|
Intramuscular injection is not medicine.
|
|
|
|
02:02:11.508 --> 02:02:16.672
|
|
Transfecting healthy people is a crime and the population pyramid is a problem they're actively managing.
|
|
|
|
02:02:16.712 --> 02:02:17.412
|
|
Thanks for being here.
|
|
|
|
02:02:17.432 --> 02:02:17.833
|
|
See you soon.
|
|
|
|
02:03:25.985 --> 02:03:34.597
|
|
But one thing I am confident of from everything I've read is that this RNA exists, you can capture it, you can sequence it, you can move it to other cells and recapitulate disease.
|
|
|
|
|