WEBVTT
|
|
|
|
02:48.000 --> 02:52.000
|
|
Like I said before, why the ham?
|
|
|
|
02:53.000 --> 02:55.000
|
|
Whoa, whoa, whoa, whoa!
|
|
|
|
03:18.000 --> 03:25.000
|
|
Why would you understand that you're in the city of the A's?
|
|
|
|
03:25.000 --> 03:28.000
|
|
Yes, two days ago. One side state your seat.
|
|
|
|
03:43.000 --> 03:46.000
|
|
Good evening ladies and gentlemen.
|
|
|
|
03:46.000 --> 03:55.000
|
|
This is Gigaohm Biological, a high-resistance, low-noise information brief brought to you by a biologist on the 20th of October, 2023.
|
|
|
|
03:55.000 --> 03:57.000
|
|
The stream is still alive.
|
|
|
|
04:02.000 --> 04:07.000
|
|
And we are still fighting for what we perceive to be the truth.
|
|
|
|
04:07.000 --> 04:17.000
|
|
I think it's funny that you guys are playing the Twib game at the chat. Please do that. That's hilarious.
|
|
|
|
04:17.000 --> 04:24.000
|
|
It is cloudy here in Pennsylvania. I haven't even been outside, so I don't know.
|
|
|
|
04:24.000 --> 04:40.000
|
|
Let's take the power back from the Charlotte's Insights ladies and gentlemen.
|
|
|
|
04:41.000 --> 04:47.000
|
|
Let's show our kids the brick wall at the back of the theater.
|
|
|
|
04:47.000 --> 04:50.000
|
|
That's what I'd like to do.
|
|
|
|
04:50.000 --> 04:53.000
|
|
I want to show everybody what I can see.
|
|
|
|
04:54.000 --> 05:01.000
|
|
I want to show everybody what Jessica has seen. I want to show everybody what Denis Rancourt has seen.
|
|
|
|
05:01.000 --> 05:12.000
|
|
And today I want to share with you what my friend Mark Kulacz has seen as he's been following his nose and doing what he does.
|
|
|
|
05:13.000 --> 05:18.000
|
|
As an analyst and as an archivist.
|
|
|
|
05:18.000 --> 05:29.000
|
|
And quite frankly, if you don't go to his website and start just randomly browsing his archive, you have no idea what you're missing.
|
|
|
|
05:29.000 --> 05:35.000
|
|
And again, it's not about what you know, but it's rather about what you could know if you chose to find out.
|
|
|
|
05:35.000 --> 05:43.000
|
|
And so many people that we know of don't bother to go to Mark's website to find out what you can find out.
|
|
|
|
05:43.000 --> 05:46.000
|
|
Which is pretty sketch house.
|
|
|
|
05:46.000 --> 05:56.000
|
|
It tells you something about who they are, what their priorities are, whose team they're playing on.
|
|
|
|
05:57.000 --> 06:01.000
|
|
I'm on Team Giga Home Biological.
|
|
|
|
06:01.000 --> 06:10.000
|
|
My name is Jonathan Couey. I'm the Chief Biologist here at this little operation in the back of a garage in Pittsburgh, Pennsylvania.
|
|
|
|
06:10.000 --> 06:15.000
|
|
And I guess I can tell you I'm not feeling very well.
|
|
|
|
06:15.000 --> 06:22.000
|
|
I did technically take the day off today, which makes me a loser.
|
|
|
|
06:23.000 --> 06:26.000
|
|
And so I didn't do a lot today that I should have done.
|
|
|
|
06:26.000 --> 06:37.000
|
|
And now I'm on this just because, I don't know, I want to feel like I did something and it's not very hard to just sit here and be here for an hour with you.
|
|
|
|
06:37.000 --> 06:42.000
|
|
So thank you very much for joining me. Don't take the bait on TV and social media, although it's very hard.
|
|
|
|
06:42.000 --> 06:44.000
|
|
It's very hard.
|
|
|
|
06:44.000 --> 06:47.000
|
|
They are trying to fool us with this.
|
|
|
|
06:47.000 --> 06:52.000
|
|
With this Scooby-Doo mystery that they want us so desperately to solve.
|
|
|
|
06:52.000 --> 06:57.000
|
|
They want us to solve the mystery so badly, actually.
|
|
|
|
06:57.000 --> 07:05.000
|
|
That they have really, I mean, they have front loaded the mystery.
|
|
|
|
07:05.000 --> 07:09.000
|
|
And they front loaded it already many, many years ago.
|
|
|
|
07:09.000 --> 07:15.000
|
|
No, I mean, it's 2023. So it was many, many years ago that they front loaded.
|
|
|
|
07:21.000 --> 07:27.000
|
|
That they front loaded this narrative into the TV and into Hollywood.
|
|
|
|
07:27.000 --> 07:37.000
|
|
They've been pushing this for a long time. And I think all of us are kind of aware of it now, but we need to get a little better at being aware of it.
|
|
|
|
07:39.000 --> 07:45.000
|
|
This is the book that's coming out quite soon or maybe it just came out, but this isn't the only book, right?
|
|
|
|
07:45.000 --> 07:53.000
|
|
The first book that came out that we were watching was Andrew Huff's book, and then Robert Malone's book came out.
|
|
|
|
07:53.000 --> 08:01.000
|
|
And before that, there was the Breggins' book and there was this book by this lady from Australia.
|
|
|
|
08:01.000 --> 08:06.000
|
|
I mean, in December, there's going to be another book by Robert F. Kennedy Jr.
|
|
|
|
08:07.000 --> 08:14.000
|
|
And every single one of these books talks about a novel virus that everybody was vulnerable to.
|
|
|
|
08:14.000 --> 08:19.000
|
|
It was likely released from a laboratory. It might have been gain-of-function or a bioweapon.
|
|
|
|
08:19.000 --> 08:24.000
|
|
And it circulated the globe for three years and we had to do something.
|
|
|
|
08:24.000 --> 08:28.000
|
|
Some of these books even say that we didn't do enough early enough.
|
|
|
|
08:28.000 --> 08:33.000
|
|
And that had we done it early enough, we would have avoided a pandemic.
|
|
|
|
08:33.000 --> 08:47.000
|
|
None of these books say that deadly protocols were used to create the illusion of a deadly novel virus spreading around the world.
|
|
|
|
08:47.000 --> 08:59.000
|
|
None of these books talk about the use of infectious clones as the only methodology by which RNA viruses can really be studied.
|
|
|
|
08:59.000 --> 09:16.000
|
|
None of these books talk about the failure of PCR and why it's a failure. None of these books talk about the sketchiness of the sequencing that's being done all behind closed doors, all with proprietary technology.
|
|
|
|
09:16.000 --> 09:26.000
|
|
Nobody's talking about this. All of these books are the same story. Every single one of these books is the same story.
|
|
|
|
09:26.000 --> 09:32.000
|
|
Of all the lies that my government told me, none of them are about vaccines.
|
|
|
|
09:32.000 --> 09:42.000
|
|
The childhood vaccine schedule isn't questioned by Robert Malone, even though that book is published by Children's Health Defense.
|
|
|
|
09:42.000 --> 09:53.000
|
|
I can assure you had I been a part of Children's Health Defense at the beginning of this, Robert Malone would not be having a book with our organization's name on it.
|
|
|
|
09:54.000 --> 10:00.000
|
|
You're either on this team or you're not, and I'm sorry, but a lot of these people are not on our team.
|
|
|
|
10:00.000 --> 10:04.000
|
|
Andrew Huff doesn't care about childhood vaccines.
|
|
|
|
10:04.000 --> 10:10.000
|
|
Andrew Huff doesn't care about the lie of public health.
|
|
|
|
10:10.000 --> 10:20.000
|
|
The Breggins might care about the lie about public health, but they sure as heck don't care about whether the virus is a lie or not.
|
|
|
|
10:20.000 --> 10:24.000
|
|
And if they do, they're certainly not coming to me about it.
|
|
|
|
10:24.000 --> 10:37.000
|
|
They're not making any progress with spreading the idea that infectious clones are the only way a sequence could be found all around the world simultaneously and be made of RNA.
|
|
|
|
10:38.000 --> 10:48.000
|
|
I mean, the Sharri Marksons and girls — she's just a person who got sucked into the narrative like everybody else did, without the technical sophistication to read her way out.
|
|
|
|
10:48.000 --> 10:54.000
|
|
But we've read our way out and we know that this is nonsense.
|
|
|
|
10:54.000 --> 11:09.000
|
|
And I assure you that when this book comes out right here, there's going to be plenty of chance for me to open that book and show you what I put in that book to make sure the clones are the thing we talk about in the future.
|
|
|
|
11:09.000 --> 11:13.000
|
|
Or maybe surprisingly, they won't be in the book.
|
|
|
|
11:13.000 --> 11:16.000
|
|
And then I'll have even more things to complain about.
|
|
|
|
11:17.000 --> 11:28.000
|
|
But I assure you that all of these narratives are being allowed to come out in book form and sold on Amazon because they don't question the faith.
|
|
|
|
11:28.000 --> 11:40.000
|
|
The faith in a novel virus that everybody was vulnerable to, that we had to do something, that the mRNA saved millions, and that it was likely gain-of-function and therefore will come again.
|
|
|
|
11:40.000 --> 11:45.000
|
|
That faith is not being questioned by any of these books, including this one right here.
|
|
|
|
11:45.000 --> 11:50.000
|
|
Just so you know, prepare yourself.
|
|
|
|
11:50.000 --> 12:06.000
|
|
And so I've made it my job over the last year to try and figure out how the hell we got here. How is it that so many people perceive this faith to be true?
|
|
|
|
12:06.000 --> 12:16.000
|
|
It's a very complicated web that people are trapped in that includes liars, liars on all sides.
|
|
|
|
12:16.000 --> 12:21.000
|
|
And most of those liars have not been able to codify this.
|
|
|
|
12:21.000 --> 12:34.000
|
|
Most of those liars have not even approached being able to adequately describe what it is that we are failing with right now.
|
|
|
|
12:34.000 --> 12:38.000
|
|
The only one who's ever done it and I'm going to toot my own horn right now is me.
|
|
|
|
12:38.000 --> 12:44.000
|
|
I'll tell you right now. Children's health defense has never said it like this.
|
|
|
|
12:44.000 --> 13:03.000
|
|
Children's Health Defense, one of the foremost organizations against the vaccine schedule and for children's health, has never been able to so succinctly describe what's so screwed up about the biology of the vaccine schedule until I did it right there.
|
|
|
|
13:03.000 --> 13:18.000
|
|
And every single person that's ever come against me — Robert Malone, Kevin McCairn, Kevin McKernan, the list goes on and on and on — all the people that say I'm a clown, none of them will touch that.
|
|
|
|
13:18.000 --> 13:22.000
|
|
None of them will touch this biology because they know they can't.
|
|
|
|
13:22.000 --> 13:27.000
|
|
So they try to make fun of clones or swarm me or whatever they say.
|
|
|
|
13:28.000 --> 13:34.000
|
|
None of them will touch this. None of them will touch this because they know it's the truth that they are covering up.
|
|
|
|
13:34.000 --> 13:40.000
|
|
They know it's the truth that they are covering up. They need you to ask a different question.
|
|
|
|
13:40.000 --> 13:44.000
|
|
So they'll tell you to ask any other question but this one.
|
|
|
|
13:44.000 --> 13:52.000
|
|
And that's what those books back there are all about. Just ask the wrong question and you never get out.
|
|
|
|
13:53.000 --> 13:59.000
|
|
And that illusion of consensus about what the right question is is exactly how they've kept us here.
|
|
|
|
13:59.000 --> 14:03.000
|
|
That's what the Scooby-Doo mystery is. It's an illusion of consensus that everybody agrees.
|
|
|
|
14:03.000 --> 14:10.000
|
|
This is the thing we got to figure out.
|
|
|
|
14:10.000 --> 14:19.000
|
|
And that's why this damn people map is so important because if you look through the time frame, the time course of this pandemic, the liars become obvious.
|
|
|
|
14:19.000 --> 14:28.000
|
|
Because their stories change, because their details change, because their background becomes more rich.
|
|
|
|
14:28.000 --> 14:35.000
|
|
And because their stories are inconsistent, they're liars, liars.
|
|
|
|
14:35.000 --> 14:37.000
|
|
And we're outing them.
|
|
|
|
14:37.000 --> 14:42.000
|
|
As we move forward, we are outing the liars.
|
|
|
|
14:42.000 --> 14:49.000
|
|
Joseph Lee was on yesterday. I thought it was a really good discussion. There was a lot of stuff in there that I didn't like though.
|
|
|
|
14:49.000 --> 15:01.000
|
|
Joseph Lee was still stuck on there being a virus going around the world that was different than the other ones that started somewhere in 2020.
|
|
|
|
15:02.000 --> 15:14.000
|
|
And I don't know that, until I actually said it out loud, he'd ever considered the possibility. The real-extreme approach, you know, that he talks about all the time — which, Joseph, if you're listening, doesn't always work.
|
|
|
|
15:14.000 --> 15:22.000
|
|
The example that you gave of a million-pound ball and a little BB — because the million-pound ball has momentum.
|
|
|
|
15:22.000 --> 15:27.000
|
|
I mean, sorry, inertia and the little one doesn't have as much inertia, so the little one makes it down the ramp better.
|
|
|
|
15:27.000 --> 15:32.000
|
|
That doesn't help you understand the five pound or the five kilo and the six kilo ball.
|
|
|
|
15:32.000 --> 15:45.000
|
|
It actually doesn't, because in certain scenarios and in certain inclines with certain friction coefficients, the five and the six pound one will behave differently, because their mass is so close.
|
|
|
|
15:45.000 --> 15:58.000
|
|
So by exaggerating the masses, you've exaggerated the inertia that one has and the other one doesn't and made it a more important part of that equation when it probably wasn't when the balls were close to the same weight.
|
|
|
|
15:58.000 --> 16:03.000
|
|
So your strategy works sometimes, but it doesn't work all the time.
|
|
|
|
16:04.000 --> 16:17.000
|
|
And I think that Joe is one of these people like myself, who's, you know, yelling at the wall so much that sometimes you can convince yourself that you're right until you yell at somebody else and then they yell back.
|
|
|
|
16:17.000 --> 16:27.000
|
|
And so I think it was really great to have that discussion with him. I think I made progress with understanding exactly the points that he's been trying to make over the last four years.
|
|
|
|
16:27.000 --> 16:44.000
|
|
And I think I've zeroed in on a few things that I think I disagree with him on in terms of how they need to be articulated and what of the biology needs to be emphasized but immunologically speaking, I think we 100% agree that what we've been told is incorrect.
|
|
|
|
16:45.000 --> 17:03.000
|
|
And I'm intrigued because he pushed the idea that there wasn't a lot of memory involved with this virus. And I've heard other people push that before too, including Rochelle Walensky at the beginning of the pandemic, when she said we weren't sure if we would make any meaningful memory to this
|
|
|
|
17:03.000 --> 17:20.000
|
|
immunity, to this virus. Also, here, Brendan Walsh has said that before, that there's no T-cell memory to this virus. He even yelled at me about that in a Zoom meeting a couple years ago. So I found it interesting, but I definitely overall
|
|
|
|
17:20.000 --> 17:31.000
|
|
loved the conversation. And I definitely think that Joe Lee is still on our team and not trying to derail the freedom movement.
|
|
|
|
17:31.000 --> 17:45.000
|
|
I also, a couple of days ago, talked about Kevin McCairn and his little World Council for Health discussion about the mRNA, or the mRNA contaminated with DNA.
|
|
|
|
17:46.000 --> 18:01.000
|
|
And at the end of that stream, if you didn't see it, I break it down in my notebook: all the things that this guy has been a leader on and then left behind as he's moved along with the slow roll of the narrative.
|
|
|
|
18:01.000 --> 18:22.000
|
|
It's a really important list to understand, because you can see why he's probably not all he says he is and why it's really weird that someone with this background is spending any time at all on a renter like me in Pittsburgh.
|
|
|
|
18:22.000 --> 18:40.000
|
|
Tonight, I want to talk about my friend Mark Kulacz, who lives in the outskirts of Boston, and who has recently adopted his granddaughter after a rather tragic story of losing a son.
|
|
|
|
18:40.000 --> 18:57.000
|
|
I really like Mark a lot. I've had a lot of contact with Mark in the last two and a half years, and I've come to trust him — as my handler, as many people have accused him of being; other people accuse him of being my research wing.
|
|
|
|
18:58.000 --> 19:10.000
|
|
I would accuse him of being a good friend and somebody who has oftentimes told me to chill out, to take a day off, to think about what you say tomorrow, and to be positive.
|
|
|
|
19:10.000 --> 19:17.000
|
|
And often, he reminds me just to pray, which I oftentimes need reminding of.
|
|
|
|
19:17.000 --> 19:40.000
|
|
And Mark has done some really good work in the last few days, somewhat inspired by my random viewing of a Peter Cullis video, but also just because he has been so vigilant and so steadfast in his archiving that he was kind of poised to make this jump, because adding one more piece
|
|
|
|
19:41.000 --> 19:50.000
|
|
to an incomplete puzzle. At some point, you start to see the picture, and Mark has been collecting puzzle pieces for a long time that didn't necessarily fit that well.
|
|
|
|
19:50.000 --> 19:54.000
|
|
And now they're starting to fit.
|
|
|
|
19:54.000 --> 19:57.000
|
|
And so I want to call your attention to this video here.
|
|
|
|
19:57.000 --> 20:07.000
|
|
First of all, I'm going to grab that text and I'm going to put it in the chat because I think it's still up, and I sure would like it if everybody would watch it.
|
|
|
|
20:07.000 --> 20:17.000
|
|
And then I'm going to show a little bit of it now. I don't think we need to watch the whole thing — I want you to watch the whole thing. And then we're going to watch a brief video by this company
|
|
|
|
20:18.000 --> 20:32.000
|
|
that they released, or that was released, right before the pandemic, which kind of describes their business from the pre-pandemic perspective, which I think you will find very insightful, and I think Mark will too.
|
|
|
|
20:32.000 --> 20:37.000
|
|
So again, I can't stress it enough: if you're not following him,
|
|
|
|
20:37.000 --> 20:44.000
|
|
I don't really know what's wrong with you. But anyway, here we go. I'm going to get out of this. And I think I have it right back here. I do.
|
|
|
|
20:45.000 --> 20:47.000
|
|
I'm going to hit play and then I'm going to make it big.
|
|
|
|
20:47.000 --> 20:55.000
|
|
But also, Peter Thiel receives a lot of investing advice, at least for his personal investments, from Eric Weinstein.
|
|
|
|
20:55.000 --> 21:02.000
|
|
Brother of Bret Weinstein and son of Les Weinstein, as I've shown you before.
|
|
|
|
21:03.000 --> 21:16.000
|
|
So there's a lot of insider info on the patent issues regarding proteins, regarding antibodies, et cetera, finding their way to the investors of Peter Thiel.
|
|
|
|
21:16.000 --> 21:28.000
|
|
Presumably, it would be, it would be really, I don't know why because we've seen what his research capabilities have been as far as physics and mathematics go.
|
|
|
|
21:28.000 --> 21:33.000
|
|
I don't know how Eric Weinstein would have made $100 million any other way.
|
|
|
|
21:33.000 --> 21:37.000
|
|
I fail to see how that's possible.
|
|
|
|
21:37.000 --> 21:46.000
|
|
I'd like to be a modest person, but I could probably mop the floor with Eric.
|
|
|
|
21:46.000 --> 21:52.000
|
|
So I don't — I just don't see how... I don't know if that's true, because Eric is a masterful...
|
|
|
|
21:52.000 --> 21:58.000
|
|
It's very useful intelligence to Peter Thiel. That's the only explanation that makes any sense.
|
|
|
|
21:58.000 --> 22:06.000
|
|
And with Eric Weinstein's father, Les Weinstein, one of the top arbitrators in the United States of America and also a senior person.
|
|
|
|
22:06.000 --> 22:21.000
|
|
And I think he provided a lot of legal oversight on the Santa Corps versus Genentech key patent issue, or key patent case.
|
|
|
|
22:21.000 --> 22:23.000
|
|
It would just make all the sense in the world.
|
|
|
|
22:23.000 --> 22:31.000
|
|
Now, that Santa Corps case was 2010, 2011, 2012, and AbCellera was made in 2012.
|
|
|
|
22:31.000 --> 22:50.000
|
|
So again, the timing makes it look like the inspiration for AbCellera might have come from a broader understanding of what was going on, and that the industry needed to take things in a new direction.
|
|
|
|
22:50.000 --> 23:06.000
|
|
As I talked about in the previous video, a lot of the lipid nanoparticle technologies on the delivery side for things like siRNA, which became mRNA, right, have been slowly researched and developed for 20-plus years, 25 years before this.
|
|
|
|
23:06.000 --> 23:21.000
|
|
With respect to something called gene silencing technology, or silencing RNA technology, where RNA was being delivered into cells in all kinds of synthetic ways, and the hope was to upregulate or downregulate genes.
|
|
|
|
23:21.000 --> 23:25.000
|
|
Now, I want to put out something here that I wonder if people really understand.
|
|
|
|
23:25.000 --> 23:38.000
|
|
There are two markets for these kinds of technologies, and I'm not even really sure that Mark understands the other market, just because it's not really on everybody's radar.
|
|
|
|
23:38.000 --> 23:55.000
|
|
But siRNA in particular — small interfering RNA — is where you inject a small RNA molecule and it binds to an endogenous longer molecule, and that double-stranded portion of that RNA triggers its destruction.
|
|
|
|
23:55.000 --> 24:17.000
|
|
So, if you wanted to reduce the expression of a protein in an animal that you were studying — a protein that you were studying — you can use siRNA to knock down the endogenous transcripts of that mRNA so that the expression of that protein would go down.
|
|
|
|
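NOTE
A minimal illustrative sketch (Python) of the siRNA idea described in the two cues above: an injected guide strand base-pairs with a matching site on a longer endogenous mRNA, transcripts carrying that site are degraded, and expression of that protein goes down. All sequences and names below are invented for illustration only; this is not a real siRNA design tool.
NOTE
# Toy sketch of complementarity-based knockdown (hypothetical sequences).
COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}
def reverse_complement(rna: str) -> str:
    """Reverse complement of an RNA sequence (A-U, G-C pairing)."""
    return "".join(COMPLEMENT[base] for base in reversed(rna))
def knock_down(transcripts: dict, guide: str) -> dict:
    """Drop any transcript containing the site the guide strand base-pairs with."""
    target_site = reverse_complement(guide)
    return {name: seq for name, seq in transcripts.items() if target_site not in seq}
# Hypothetical transcript pool; the guide is complementary to a site in protein X's mRNA.
pool = {
    "protein_X_mRNA": "AUGGCUGAUGCCAAGUUUCGAUAA",
    "protein_Y_mRNA": "AUGCCCGGGAAACCCUUUGGGUAA",
}
guide = reverse_complement("GAUGCCAAGUUUCGA")
print(knock_down(pool, guide))  # protein_X_mRNA is degraded, so less protein X is made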
24:17.000 --> 24:37.000
|
|
So, it's important to understand that as these kinds of technologies have been developed, like transfection and transformation, these technologies were developed in laboratory animal experimental settings with mice, rats, and monkeys.
|
|
|
|
24:38.000 --> 24:51.000
|
|
And it's a gigantic market that has a gigantic monetary base, right, because they're all spending grant money from the NIH and the Wellcome Trust and all of that crap.
|
|
|
|
24:53.000 --> 24:59.000
|
|
And they're writing grants with the cost of these reagents already written into the grant, right?
|
|
|
|
25:00.000 --> 25:13.000
|
|
So, these companies, as long as they choose cleverly, the small interfering RNAs that they make, the small interfering RNA products that they make, as long as they choose them correctly,
|
|
|
|
25:15.000 --> 25:21.000
|
|
and choose to look at proteins that are interesting to the academic bench science,
|
|
|
|
25:22.000 --> 25:29.000
|
|
there's potentially huge markets for these very cheap and easy to make products.
|
|
|
|
25:30.000 --> 25:36.000
|
|
But the trick, of course, is, and Mark's not wrong, nobody's wrong when they say this, the trick is to get into the human market.
|
|
|
|
25:37.000 --> 25:40.000
|
|
The trick is to get something to be a medicine.
|
|
|
|
25:41.000 --> 25:50.000
|
|
But I think it's important to see how much of their illusion of consensus that this stuff works is built on the idea
|
|
|
|
25:51.000 --> 25:56.000
|
|
that what's been going on in academia for all these years is actually proof that it works.
|
|
|
|
25:58.000 --> 26:03.000
|
|
Even though we kill all those animals way before we evaluate the health effects of the treatment,
|
|
|
|
26:04.000 --> 26:12.000
|
|
we think that — as academic biologists, we take it on face value that what it says on the tin, it actually does.
|
|
|
|
26:13.000 --> 26:18.000
|
|
So, if what it says on the tin is what you want to do to that patient, makes sense to me.
|
|
|
|
26:19.000 --> 26:29.000
|
|
And I think that's why a lot of academic biologists, when they heard the way that the transfection was going to be used as an immunization,
|
|
|
|
26:30.000 --> 26:32.000
|
|
they were all kind of like, okay, that doesn't sound terrible.
|
|
|
|
26:33.000 --> 26:36.000
|
|
They must have figured out the little, the little nuances of it, right?
|
|
|
|
26:37.000 --> 26:47.000
|
|
And that's exactly what — that's exactly what Robert Malone said on several podcasts as an excuse for why he decided to do this:
|
|
|
|
26:48.000 --> 26:52.000
|
|
he decided to take the shot because he assumed they solved the targeting problem.
|
|
|
|
26:54.000 --> 26:57.000
|
|
That's what he said. He assumed they solved the targeting problem.
|
|
|
|
26:59.000 --> 27:03.000
|
|
Even though we just heard Peter Cullis say it the other night, that they hadn't solved that problem.
|
|
|
|
27:04.000 --> 27:10.000
|
|
He had burned five postdocs on that, and the last postdoc that he burned had threatened to quit if he couldn't change her project.
|
|
|
|
27:11.000 --> 27:19.000
|
|
So, it's a massive web of liars that are spectacularly committed to the lies that they are telling.
|
|
|
|
27:20.000 --> 27:33.000
|
|
And I think that Mark and myself and a few other people have uncovered evidence of what the real play is.
|
|
|
|
27:34.000 --> 27:40.000
|
|
Those people are all out there to make us ask the wrong questions, and it doesn't even matter what the question is as long as it's wrong.
|
|
|
|
27:42.000 --> 27:50.000
|
|
That's why it gets to be so confusing. That's why it gets to be so weird and why it's so fragmented in terms of what's most important to talk about
|
|
|
|
27:50.000 --> 27:57.000
|
|
because there's only one right answer for these people. Don't talk about this stuff.
|
|
|
|
27:58.000 --> 28:06.000
|
|
Don't talk about the antibody paradox. Don't talk about $150 billion a year industry that's about to crash.
|
|
|
|
28:07.000 --> 28:17.000
|
|
And don't talk about how those patents, as they become worthless, are changing the landscape for all of these patent holders.
|
|
|
|
28:18.000 --> 28:46.000
|
|
And there might not be very many ways to get around that except to take those old antibodies and wash them through a new kind of system where you could claim essentially that, well, I used this set of patents to improve my previous technology monoclonal antibody that actually was polyclonal.
|
|
|
|
28:47.000 --> 28:52.000
|
|
And now I made it monoclonal with the help of AbCellera and their unique processes.
|
|
|
|
28:52.000 --> 29:13.000
|
|
So essentially, now that it's under their patents, you have another 20 years of patents that would be defensible from a completely different intellectual property space, because the monoclonal antibody property space was just basically closed by the Supreme Court.
|
|
|
|
29:14.000 --> 29:30.000
|
|
And now the question is whether all those patent holders are going to risk having their patents tested in court or if they're just going to wash their patents through this system and get new ones.
|
|
|
|
29:31.000 --> 29:40.000
|
|
New ones based on a new and improved stack of technology that allows you to select an exact antibody from trillions.
|
|
|
|
29:42.000 --> 29:51.000
|
|
It's extraordinary. Now listen carefully to what Mark says here because what's key about this whole thing is that it's a Canadian company.
|
|
|
|
29:52.000 --> 29:55.000
|
|
It's a Canadian company funded by DARPA.
|
|
|
|
30:00.000 --> 30:13.000
|
|
It's a Canadian company that has links to EpiVax and Aeras and a couple other companies that incidentally feature Robert Malone.
|
|
|
|
30:14.000 --> 30:17.000
|
|
Turn them on and turn them off, right?
|
|
|
|
30:17.000 --> 30:21.000
|
|
And though maybe they had some limited success in technically making it work.
|
|
|
|
30:21.000 --> 30:28.000
|
|
They did not find, I don't believe, a truly successful business market for it.
|
|
|
|
30:28.000 --> 30:37.000
|
|
And that's right. So that's why I broke off into that. The small interfering RNA as a technology works great.
|
|
|
|
30:37.000 --> 30:48.000
|
|
If you want to mess around with an animal model of a disease or you want to create an animal model of a disease or manipulate a system to understand what a protein does.
|
|
|
|
30:49.000 --> 31:03.000
|
|
But there are very few therapeutic cases where one needs to reduce the expression of a protein, and then where repeated dosing of a small interfering RNA would be the way to go.
|
|
|
|
31:04.000 --> 31:07.000
|
|
So there aren't very many therapeutic uses for it.
|
|
|
|
31:08.000 --> 31:21.000
|
|
But what they were developing at that time, if you remember, is not only the siRNA technology, but the delivery — the lipid nanoparticle. That's what Peter Cullis was doing.
|
|
|
|
31:22.000 --> 31:32.000
|
|
And that technology, and that company that was doing that, also had Robert Malone involved, which is strange.
|
|
|
|
31:33.000 --> 31:53.000
|
|
Employers such a technology at scale. And I think a lot of the individuals that were involved in silencing RNA technologies between 2010 and 2012 redirected a lot of their efforts and their energy to say, look, there's a lot of money to be made in antibodies.
|
|
|
|
31:54.000 --> 31:58.000
|
|
So let's deploy this technology in a way where we can just make an antigen.
|
|
|
|
31:59.000 --> 32:09.000
|
|
We can take our technology, call it messenger RNA technology, we'll make synthetic exosomes effectively, and we'll cause cells, after an injection, to express a protein.
|
|
|
|
32:10.000 --> 32:14.000
|
|
And then to that protein, the antigen, the body will make the antibodies.
|
|
|
|
32:14.000 --> 32:22.000
|
|
And before you know it, now we're in the game of making antibodies and getting into that $150 plus billion a year just in the United States market.
|
|
|
|
32:23.000 --> 32:29.000
|
|
And not only that, but you're using your own patents to do it. Not the old patents, but the new patents.
|
|
|
|
32:29.000 --> 32:31.000
|
|
That just makes all the sense in the world. Right.
|
|
|
|
32:32.000 --> 32:33.000
|
|
Okay.
|
|
|
|
32:33.000 --> 32:47.000
|
|
Essentially, you could see this technology as extending the lifespan of the patents, which have indirectly been challenged by the recent Supreme Court ruling.
|
|
|
|
32:48.000 --> 33:02.000
|
|
The one where the antibody paradox paper in the Yale Law Journal — which Gorsuch was citing repeatedly — was, as we have asserted, probably put out there very specifically to be used by the Supreme Court.
|
|
|
|
33:03.000 --> 33:17.000
|
|
So they basically took this huge area of patent law, which as far as we can tell has been curated by Dungeon Master Les Weinstein for many years.
|
|
|
|
33:18.000 --> 33:41.000
|
|
And in 2012, Dungeon Master Les Weinstein would have been one of the first people to realize that this space is going to start to become shaky because the patent law and how patents are supposed to work is not really being followed by monoclonal antibody patents.
|
|
|
|
33:42.000 --> 33:50.000
|
|
When you follow the patent, you can't always make what they say you can make, and that doesn't make it a valid patent.
|
|
|
|
33:50.000 --> 33:55.000
|
|
The biology described in the patents was nonsense.
|
|
|
|
33:55.000 --> 34:02.000
|
|
And once that became clear to the Supreme Court, the Supreme Court is very happy to get rid of that.
|
|
|
|
34:03.000 --> 34:31.000
|
|
And I think Mark is dead on balls accurate when he thinks that this might be the reaction of the market, the reaction of the smart people in the space who could see that, okay, well, if this intellectual property space worth $150 billion based on IP is about to become a free space, how could we leverage that for ourselves?
|
|
|
|
34:31.000 --> 34:48.000
|
|
I've looked at some videos of the CEO of AbCellera here.
|
|
|
|
34:48.000 --> 34:51.000
|
|
And let's see here.
|
|
|
|
34:51.000 --> 34:53.000
|
|
See, Ab.
|
|
|
|
34:53.000 --> 34:54.000
|
|
Cellera.
|
|
|
|
34:54.000 --> 34:57.000
|
|
And I think his name is Carl Hansen.
|
|
|
|
34:57.000 --> 34:59.000
|
|
Okay.
|
|
|
|
34:59.000 --> 35:05.000
|
|
And see here, just a couple of things.
|
|
|
|
35:05.000 --> 35:06.000
|
|
I have Forbes.
|
|
|
|
35:06.000 --> 35:07.000
|
|
You can go there.
|
|
|
|
35:07.000 --> 35:15.000
|
|
The Globe and Mail — meet Carl Hansen, overnight billionaire CEO behind AbCellera's life-saving COVID-19 treatment, et cetera.
|
|
|
|
35:16.000 --> 35:23.000
|
|
And he's like a six-foot-two jock.
|
|
|
|
35:23.000 --> 35:31.000
|
|
I'm not trying to slander him in any way, but I think he was on the college basketball team.
|
|
|
|
35:31.000 --> 35:37.000
|
|
He has a twin brother — always interesting, right? — who was also on the basketball team.
|
|
|
|
35:37.000 --> 35:39.000
|
|
And he does triathlons.
|
|
|
|
35:39.000 --> 35:42.000
|
|
He's a guy somewhere in midlife.
|
|
|
|
35:42.000 --> 35:47.000
|
|
He was a professor at one of the Vancouver schools or colleges.
|
|
|
|
35:47.000 --> 35:54.000
|
|
And then suddenly around 2012, he gets this idea, okay, to start this company.
|
|
|
|
35:54.000 --> 36:02.000
|
|
And sure enough, with the outbreak of coronavirus, he overnight becomes a billionaire.
|
|
|
|
36:02.000 --> 36:03.000
|
|
This is a story.
|
|
|
|
36:03.000 --> 36:05.000
|
|
This is a constructed story.
|
|
|
|
36:05.000 --> 36:06.000
|
|
Okay.
|
|
|
|
36:06.000 --> 36:08.000
|
|
This is what it is.
|
|
|
|
36:09.000 --> 36:12.000
|
|
He could be one of the smartest people in the world.
|
|
|
|
36:12.000 --> 36:16.000
|
|
But the way that worked, we've seen this over and over and over again.
|
|
|
|
36:16.000 --> 36:18.000
|
|
He was picked for the role.
|
|
|
|
36:18.000 --> 36:22.000
|
|
I'm not upset, but let's just be honest.
|
|
|
|
36:22.000 --> 36:23.000
|
|
Okay.
|
|
|
|
36:23.000 --> 36:28.000
|
|
So if you look at the other people in AbCellera, this company that kind of started out of nothing,
|
|
|
|
36:28.000 --> 36:34.000
|
|
and then before you know it is getting all this money, who gave them the idea?
|
|
|
|
36:34.000 --> 36:36.000
|
|
How did they start this idea?
|
|
|
|
36:36.000 --> 36:38.000
|
|
Where did the idea come from?
|
|
|
|
36:38.000 --> 36:42.000
|
|
Who really created AbCellera?
|
|
|
|
36:42.000 --> 36:49.000
|
|
Who really provided the inspiration for AbCellera in 2012?
|
|
|
|
36:49.000 --> 37:02.000
|
|
Well, because it's so hard to find old stuff on AbCellera — and I'm not even going to go through
|
|
|
|
37:02.000 --> 37:06.000
|
|
the variety of searches that I did over the last couple of hours.
|
|
|
|
37:06.000 --> 37:12.000
|
|
Sure enough, a company called EpiVax that we reviewed before, and I'll tell you who they
|
|
|
|
37:12.000 --> 37:21.000
|
|
are in a moment, announced that they were providing immunogenicity screening toolkits to AbCellera,
|
|
|
|
37:21.000 --> 37:26.000
|
|
that they've been providing technologies, that they've been providing intellectual property
|
|
|
|
37:26.000 --> 37:28.000
|
|
to AbCellera.
|
|
|
|
37:28.000 --> 37:38.000
|
|
Now, these relationships seemed to start in 2020, but I highly doubt that they just started
|
|
|
|
37:38.000 --> 37:47.000
|
|
when companies suddenly have several, how would you say, business associations like that,
|
|
|
|
37:47.000 --> 37:54.000
|
|
it's unlikely that they decided to just do it; they've probably been in contact for a while.
|
|
|
|
37:54.000 --> 38:03.000
|
|
So, I'm going to venture out and suggest that EpiVax has been working with AbCellera
|
|
|
|
38:03.000 --> 38:08.000
|
|
in the sharing of intellectual property and things which have not been disclosed in press releases,
|
|
|
|
38:08.000 --> 38:12.000
|
|
and I think that's a pretty safe bet — a very safe bet, as a matter of fact,
|
|
|
|
38:12.000 --> 38:14.000
|
|
even though I don't have the literature yet to prove it.
|
|
|
|
38:14.000 --> 38:18.000
|
|
Every once in a while, you just know how these things are.
|
|
|
|
38:18.000 --> 38:22.000
|
|
EpiVax is located in Providence, Rhode Island, actually not too far from me,
|
|
|
|
38:22.000 --> 38:25.000
|
|
but probably an hour drive from my house.
|
|
|
|
38:25.000 --> 38:31.000
|
|
Rhode Island is also between New York City and Boston for those who are outside of the country.
|
|
|
|
38:31.000 --> 38:38.000
|
|
It's one of our geographically smallest states and only has a population of about a million people, I think.
|
|
|
|
38:38.000 --> 38:45.000
|
|
Anyways, they don't have a lot of big companies in Rhode Island, but this is a little startup
|
|
|
|
38:45.000 --> 38:50.000
|
|
that has made a little bit of a name for itself, and it's computational biology.
|
|
|
|
38:50.000 --> 38:55.000
|
|
It's going through data sets, finding patterns, mining for things.
|
|
|
|
38:55.000 --> 38:58.000
|
|
That's what Epivax does.
|
|
|
|
38:58.000 --> 39:05.000
|
|
There have been papers written — white papers — out of EpiVax about how the future of vaccines
|
|
|
|
39:05.000 --> 39:11.000
|
|
should be made computationally through blood screening and so on and so forth.
|
|
|
|
39:11.000 --> 39:14.000
|
|
Get around, just get away from all the lab stuff.
|
|
|
|
39:14.000 --> 39:19.000
|
|
Things can be so much faster if we just use the computers.
|
|
|
|
39:19.000 --> 39:36.000
|
|
EpiVax — a board member of EpiVax and their chief science officer from 2012 to 2019 was Robert Malone.
|
|
|
|
39:36.000 --> 39:51.000
|
|
I've known this, but I totally forgot about that fact.
|
|
|
|
39:51.000 --> 39:52.000
|
|
Let's see.
|
|
|
|
39:52.000 --> 39:53.000
|
|
Let's go to the Housatonic site.
|
|
|
|
39:53.000 --> 39:58.000
|
|
Let's look up the Malone, and we have two versions of his resume.
|
|
|
|
39:58.000 --> 40:04.000
|
|
The 2021 one that was made around the same time he was on the Bret Weinstein podcast.
|
|
|
|
40:04.000 --> 40:07.000
|
|
Then there's the 2017 resume.
|
|
|
|
40:07.000 --> 40:18.000
|
|
If you look at the 2017 resume from Malone, if you go down here, let's see, 2017, the work experience, projects,
|
|
|
|
40:18.000 --> 40:23.000
|
|
Beardsworth, Aeras, Intradigm.
|
|
|
|
40:23.000 --> 40:24.000
|
|
Let's see here.
|
|
|
|
40:24.000 --> 40:27.000
|
|
Where is the Epivax stuff?
|
|
|
|
40:27.000 --> 40:30.000
|
|
Solve, Beardsworth.
|
|
|
|
40:30.000 --> 40:33.000
|
|
Come on, I know it's on here.
|
|
|
|
40:33.000 --> 40:34.000
|
|
Why?
|
|
|
|
40:34.000 --> 40:37.000
|
|
Because he was actually a consultant there.
|
|
|
|
40:37.000 --> 40:44.000
|
|
He was on the board of directors for Epivax, and he was on the scientific advisory board from 2012 to 2019.
|
|
|
|
40:44.000 --> 40:46.000
|
|
He wasn't a scientific director.
|
|
|
|
40:46.000 --> 40:48.000
|
|
He was on the scientific advisory board.
|
|
|
|
40:48.000 --> 40:51.000
|
|
Still, it's a meaningful thing.
|
|
|
|
40:51.000 --> 40:55.000
|
|
This is where he chose to spend his time providing consulting services.
|
|
|
|
40:55.000 --> 41:00.000
|
|
He actually provided consulting services to Epivax as early as 2005.
|
|
|
|
41:00.000 --> 41:06.000
|
|
For a long time, Malone has always been more of a computational guy than a bench guy.
|
|
|
|
41:06.000 --> 41:22.000
|
|
So, Robert Malone, possibly, and in fact, when you look at that, when you realize that he's the only person that has
|
|
|
|
41:22.000 --> 41:27.000
|
|
seemed to have had both experience and industry connections,
|
|
|
|
41:27.000 --> 41:37.000
|
|
I'm pretty sure played a very significant role in the establishment of AbCellera.
|
|
|
|
41:37.000 --> 41:41.000
|
|
I'm pretty sure he did.
|
|
|
|
41:41.000 --> 41:51.000
|
|
It would appear possible that Epivax provided some type of intellectual property transfer,
|
|
|
|
41:51.000 --> 42:00.000
|
|
and they used Vancouver and Canada as a nice, convenient way to be able to offload some of this stuff over the border.
|
|
|
|
42:00.000 --> 42:09.000
|
|
And possibly, in part, escape at least some of the US-based patent law.
|
|
|
|
42:09.000 --> 42:17.000
|
|
I think Malone has known full well that ultimately — because Malone was also one of the co-founders of Intradigm,
|
|
|
|
42:17.000 --> 42:29.000
|
|
which was one of these one or two companies that was very early in the silencing, siRNA technologies,
|
|
|
|
42:29.000 --> 42:37.000
|
|
which, again, around 2012 seems to have taken a pivot to be redeployed in a way to create messenger RNA,
|
|
|
|
42:37.000 --> 42:46.000
|
|
because they weren't getting anywhere, and because the antibody patent issue was getting so hot.
|
|
|
|
42:46.000 --> 42:51.000
|
|
Now, some of you may have heard of something called the Pfizer org chart, or the perfect Pfizer org chart,
|
|
|
|
42:51.000 --> 42:55.000
|
|
as the most incredible thing ever created and discovered.
|
|
|
|
42:55.000 --> 43:04.000
|
|
It should be noted that a lot of the startup capital for AbCellera came from Pfizer.
|
|
|
|
43:04.000 --> 43:08.000
|
|
The real perfect org chart is not in Pfizer.
|
|
|
|
43:08.000 --> 43:12.000
|
|
The real perfect org chart is out here.
|
|
|
|
43:12.000 --> 43:16.000
|
|
It's AbCellera. It's EpiVax. It's computational biology startup companies.
|
|
|
|
43:16.000 --> 43:21.000
|
|
It's using Canada to get around the United States patent law.
|
|
|
|
43:21.000 --> 43:29.000
|
|
It's repurposing silencing RNA technologies, advancing lipid nanoparticle delivery systems
|
|
|
|
43:29.000 --> 43:36.000
|
|
to create messenger RNA — again, pairing these things together to get around the antibody patent issues,
|
|
|
|
43:36.000 --> 43:44.000
|
|
and then God knows whatever is going to be next, and we have no idea what that is.
|
|
|
|
43:44.000 --> 43:48.000
|
|
It's important for me to point out here that...
|
|
|
|
43:48.000 --> 43:54.000
|
|
...important. I think it's interesting to point out here that one possible permutation of this story
|
|
|
|
43:54.000 --> 43:57.000
|
|
would be a novel virus.
|
|
|
|
43:57.000 --> 44:02.000
|
|
A lie about a novel virus to get out a novel technology.
|
|
|
|
44:02.000 --> 44:04.000
|
|
Thank you very much.
|
|
|
|
44:04.000 --> 44:06.000
|
|
Thank you. Thank you.
|
|
|
|
44:06.000 --> 44:12.000
|
|
To get out a novel technology that they knew would fail on the first outing because that's the way
|
|
|
|
44:12.000 --> 44:15.000
|
|
transfection works.
|
|
|
|
44:15.000 --> 44:22.000
|
|
So they did that on purpose so that they could blame it on the spike protein and DNA contamination
|
|
|
|
44:22.000 --> 44:31.000
|
|
and all this other stuff and then say, but if we roll back and don't do foreign proteins,
|
|
|
|
44:31.000 --> 44:35.000
|
|
but instead we make antibodies with these...
|
|
|
|
44:35.000 --> 44:37.000
|
|
...with this mRNA.
|
|
|
|
44:37.000 --> 44:43.000
|
|
Now we'll just give you the antibody that you need instead of giving you a spike protein
|
|
|
|
44:43.000 --> 44:49.000
|
|
and then counting on your body to develop a response to it.
|
|
|
|
44:49.000 --> 44:55.000
|
|
I don't think that that's necessarily going to work as an immunization,
|
|
|
|
44:55.000 --> 45:05.000
|
|
but I do see very clearly how the technology for finding and locating clonal B cells
|
|
|
|
45:05.000 --> 45:09.000
|
|
and getting them separate and everything,
|
|
|
|
45:09.000 --> 45:14.000
|
|
that that technology, because of its nature, is patentable.
|
|
|
|
45:14.000 --> 45:17.000
|
|
All these processes where you get a single cell and you sort them out,
|
|
|
|
45:17.000 --> 45:22.000
|
|
you find out what proteins they're making, that's all patentable.
|
|
|
|
45:22.000 --> 45:30.000
|
|
And so where you had monoclonal antibodies as a hard-to-patent thing,
|
|
|
|
45:30.000 --> 45:39.000
|
|
now monoclonal antibodies are the product of this chain of patented processes.
|
|
|
|
45:39.000 --> 45:44.000
|
|
So even though at the end of the process you end up with a monoclonal antibody,
|
|
|
|
45:44.000 --> 45:51.000
|
|
that the Supreme Court has said by traditional means is not very easy to patent anymore.
|
|
|
|
45:51.000 --> 46:02.000
|
|
Because you have these 50 processes that lead up to that productive monoclonal antibody,
|
|
|
|
46:02.000 --> 46:07.000
|
|
the actual patenting of the antibody is irrelevant.
|
|
|
|
46:07.000 --> 46:15.000
|
|
And so they've essentially done an end-around on this huge shake-up of the intellectual property space
|
|
|
|
46:15.000 --> 46:21.000
|
|
that was monoclonal antibodies by just simply doing it again,
|
|
|
|
46:21.000 --> 46:29.000
|
|
creating another set of methodologies — or mythologies, for all we know —
|
|
|
|
46:29.000 --> 46:36.000
|
|
that consist of patented processes which allow them to claim intellectual property
|
|
|
|
46:36.000 --> 46:42.000
|
|
over this set of therapeutics to give.
|
|
|
|
46:42.000 --> 46:49.000
|
|
Now the real thing that I want you to watch is this TED Talk, because this TED Talk is extraordinary,
|
|
|
|
46:49.000 --> 46:52.000
|
|
because it's before the pandemic, so you hear things differently.
|
|
|
|
46:52.000 --> 46:56.000
|
|
I'm going to finish my hamburger and watch this.
|
|
|
|
47:06.000 --> 47:08.000
|
|
Good morning.
|
|
|
|
47:08.000 --> 47:15.000
|
|
We have an incredible ability to make molecules that have the potential to make the drugs of the future
|
|
|
|
47:15.000 --> 47:18.000
|
|
and save the lives of so many people.
|
|
|
|
47:18.000 --> 47:23.000
|
|
Some of you in this room right now might be making some of these molecules
|
|
|
|
47:23.000 --> 47:30.000
|
|
and I'm here to tell you how we might be able to harness that potential through science and technology.
|
|
|
|
47:30.000 --> 47:32.000
|
|
My name is Kevin Heyries.
|
|
|
|
47:32.000 --> 47:39.000
|
|
I'm a scientist, and most of all the co-founder of a biotechnology company that works on drug discovery.
|
|
|
|
47:39.000 --> 47:48.000
|
|
And today I really want to give you an idea of how science and technology are enabling new ways of finding drugs
|
|
|
|
47:48.000 --> 47:51.000
|
|
that would impact human health.
|
|
|
|
47:51.000 --> 47:55.000
|
|
So first of all, I had no business to be in science if I look back,
|
|
|
|
47:55.000 --> 47:58.000
|
|
probably even less be here in front of you today.
|
|
|
|
47:58.000 --> 48:01.000
|
|
I was born in a very small village in the French Alps
|
|
|
|
48:01.000 --> 48:04.000
|
|
and was rather destined to be a ski instructor,
|
|
|
|
48:04.000 --> 48:09.000
|
|
but my parents forced me to go to university and get exposed to knowledge.
|
|
|
|
48:09.000 --> 48:17.000
|
|
And through that, I had the luck to meet great mentors who really showed me how big ideas can be achieved.
|
|
|
|
48:17.000 --> 48:24.000
|
|
And with that in mind, I really want to try to communicate
|
|
|
|
48:24.000 --> 48:28.000
|
|
what many of us, scientists and people in the biotechnology industry
|
|
|
|
48:28.000 --> 48:34.000
|
|
are witnessing how science and technology are changing the way we approach human health.
|
|
|
|
48:34.000 --> 48:39.000
|
|
So let's talk about therapeutic molecules.
|
|
|
|
48:39.000 --> 48:47.000
|
|
Many people typically think that drugs or therapeutic molecules are, I don't know, Tylenol or aspirin.
|
|
|
|
48:47.000 --> 48:52.000
|
|
These molecules have been made by chemistry synthesized
|
|
|
|
48:52.000 --> 48:58.000
|
|
and we've made really thousands of molecules that have solved many problems.
|
|
|
|
48:58.000 --> 49:01.000
|
|
We can cure many diseases.
|
|
|
|
49:01.000 --> 49:03.000
|
|
And that's one part of the story.
|
|
|
|
49:03.000 --> 49:13.000
|
|
We all know very well that there are still too many diseases for which we don't have drugs.
|
|
|
|
49:13.000 --> 49:16.000
|
|
It's kind of a weird statement to make, right?
|
|
|
|
49:16.000 --> 49:19.000
|
|
We've made lots of drugs that cured lots of diseases.
|
|
|
|
49:19.000 --> 49:22.000
|
|
But he doesn't give any examples.
|
|
|
|
49:22.000 --> 49:27.000
|
|
Can you think of an example of a drug that cured a disease?
|
|
|
|
49:27.000 --> 49:30.000
|
|
What drug cured measles?
|
|
|
|
49:30.000 --> 49:33.000
|
|
What drug cured polio?
|
|
|
|
49:33.000 --> 49:36.000
|
|
What drug cures the flu?
|
|
|
|
49:36.000 --> 49:38.000
|
|
What drug cured MS?
|
|
|
|
49:38.000 --> 49:40.000
|
|
What drug cures cancer?
|
|
|
|
49:40.000 --> 49:43.000
|
|
What drug cures diabetes?
|
|
|
|
49:43.000 --> 49:47.000
|
|
What drug that we made on a chemical bench has cured any disease?
|
|
|
|
49:47.000 --> 49:51.000
|
|
Yet he just said it like it's a known fact.
|
|
|
|
49:51.000 --> 49:55.000
|
|
Like there's a whole list behind him on the screen.
|
|
|
|
49:55.000 --> 49:58.000
|
|
What has been cured?
|
|
|
|
49:58.000 --> 50:02.000
|
|
Tell me what drug has cured, what disease?
|
|
|
|
50:02.000 --> 50:04.000
|
|
Name one.
|
|
|
|
50:04.000 --> 50:07.000
|
|
Tools that have solved many problems.
|
|
|
|
50:07.000 --> 50:10.000
|
|
We can cure many diseases.
|
|
|
|
50:10.000 --> 50:13.000
|
|
And that's one part of the story.
|
|
|
|
50:13.000 --> 50:22.000
|
|
Antibiotics are something that cure infection and can get rid of a parasite.
|
|
|
|
50:22.000 --> 50:30.000
|
|
But when we talk about disease, what are we talking about?
|
|
|
|
50:30.000 --> 50:33.000
|
|
Excuse me.
|
|
|
|
50:33.000 --> 50:36.000
|
|
I'm confused.
|
|
|
|
50:36.000 --> 50:38.000
|
|
And I think he is too.
|
|
|
|
50:38.000 --> 50:43.000
|
|
This is part of a sales pitch, nothing more than a bullshit sales pitch.
|
|
|
|
50:43.000 --> 50:49.000
|
|
Very well that there are still too many diseases for which we don't have drugs or therapeutic molecules that work.
|
|
|
|
50:49.000 --> 51:02.000
|
|
So today I want to tell you about a specific type of molecules that nature is making, that all of us are making, even animals, as part of a natural defense mechanism.
|
|
|
|
51:02.000 --> 51:10.000
|
|
And how these molecules have actually the potential to give us drugs with unparalleled properties.
|
|
|
|
51:10.000 --> 51:14.000
|
|
These molecules are called antibodies.
|
|
|
|
51:14.000 --> 51:18.000
|
|
I'm sure and I expect many of you will be familiar with this name.
|
|
|
|
51:18.000 --> 51:22.000
|
|
These are the molecules that we make when we get the flu shot.
|
|
|
|
51:22.000 --> 51:30.000
|
|
And I expect you to get your flu shot every year, by the way.
|
|
|
|
51:30.000 --> 51:32.000
|
|
Did you hear that?
|
|
|
|
51:32.000 --> 51:34.000
|
|
Did you hear that?
|
|
|
|
51:34.000 --> 51:39.000
|
|
Antibodies are what you make after you get the flu shot.
|
|
|
|
51:39.000 --> 51:42.000
|
|
This guy doesn't understand immunology.
|
|
|
|
51:42.000 --> 51:45.000
|
|
He understands the mythology.
|
|
|
|
51:45.000 --> 51:49.000
|
|
He understands what he's supposed to sell.
|
|
|
|
51:49.000 --> 51:57.000
|
|
He understands the mythology he's supposed to push and he understands the mythology under which he's going to make a million dollars.
|
|
|
|
51:57.000 --> 51:59.000
|
|
That's what you see here.
|
|
|
|
51:59.000 --> 52:08.000
|
|
A person who's willing to lie or accept a lie because if somebody else doesn't do it, if I don't do it, somebody else will do it.
|
|
|
|
52:08.000 --> 52:12.000
|
|
You, to get your flu shot every year, by the way.
|
|
|
|
52:13.000 --> 52:18.000
|
|
Antibodies are just amazing.
|
|
|
|
52:18.000 --> 52:21.000
|
|
They're molecules made by immune cells.
|
|
|
|
52:21.000 --> 52:28.000
|
|
And their role is really to stick and tag anything that might be harmful for a body.
|
|
|
|
52:28.000 --> 52:30.000
|
|
That can be a bacteria.
|
|
|
|
52:30.000 --> 52:31.000
|
|
That can be a virus.
|
|
|
|
52:31.000 --> 52:33.000
|
|
That can be a cancer cell.
|
|
|
|
52:33.000 --> 52:38.000
|
|
That can even be an allergen or any kind of toxic chemical that comes across.
|
|
|
|
52:38.000 --> 52:46.000
|
|
Now, antibodies have this unique ability to stick and tag anything they bind to.
|
|
|
|
52:46.000 --> 52:56.000
|
|
And when they do this, they give a signal to the immune system, our body, or in the case of animals, their body, to get rid of that molecule and clear it.
|
|
|
|
52:56.000 --> 53:04.000
|
|
So that's very interesting when you start thinking about therapeutic purposes and thinking about maybe making a drug out of it.
|
|
|
|
53:04.000 --> 53:08.000
|
|
Now, the truly amazing thing is that...
|
|
|
|
53:08.000 --> 53:16.000
|
|
So thinking about making a drug out of it, I think somebody in the chat is right when saying that that word is loaded.
|
|
|
|
53:16.000 --> 53:20.000
|
|
And it is a part of the enchantment.
|
|
|
|
53:20.000 --> 53:25.000
|
|
I guess drug is what you call it when it becomes a profitable product.
|
|
|
|
53:25.000 --> 53:28.000
|
|
Drug is what you call it when you start selling it to people.
|
|
|
|
53:28.000 --> 53:38.000
|
|
I don't know. I don't know what it means, but it's a stupid word to describe something that's biologically defined, biologically produced and endogenously found.
|
|
|
|
53:38.000 --> 53:47.000
|
|
We speak — all of us here, we're all making millions and millions of different antibodies as I speak.
|
|
|
|
53:47.000 --> 53:56.000
|
|
And antibodies are molecules that are floating in our body, and they all share a similar structure, but they have slight differences.
|
|
|
|
53:56.000 --> 54:04.000
|
|
And these differences allow them to bind one molecule versus another. It gives them specificity.
|
|
|
|
54:04.000 --> 54:09.000
|
|
And it also means you can make antibodies against anything. That's the way nature worked it out.
|
|
|
|
54:09.000 --> 54:14.000
|
|
So now this is becoming very interesting if you're in the drug industry.
|
|
|
|
54:14.000 --> 54:24.000
|
|
Because if you have a therapeutic program, then it means you need to find the anti-body that has the properties you want so that you can use this as a drug.
|
|
|
|
54:24.000 --> 54:35.000
|
|
But here's the trick. Each antibody is made by a single tiny immune cell. We have billions and billions of cells.
|
|
|
|
54:35.000 --> 54:49.000
|
|
So if you're in the drug industry or if you want to make a new drug using an antibody, the game is not only to find the best antibody, but it's also to find the antibody or the cell that is making that antibody.
|
|
|
|
54:49.000 --> 54:52.000
|
|
And that's not an easy thing to do.
|
|
|
|
54:52.000 --> 55:07.000
|
|
I don't know if you remember, when you were in high school, if you take a drop of your blood and you put it on a glass slide and you put this under the microscope, and you look, what you can see is nearly millions, again, of cells floating.
|
|
|
|
55:07.000 --> 55:12.000
|
|
And you have no way to tell which one is making the antibody you might be interested in.
|
|
|
|
55:12.000 --> 55:33.000
|
|
Now the trick here is to understand that that's where the antibody paradox comes from, is that these companies, probably with the assistance of people like Les Weinstein and other patent lawyers, were crafting patents, describing monoclonal antibodies as a patentable entity.
|
|
|
|
55:33.000 --> 55:49.000
|
|
And as the biology of antibodies, and the biology of the variability region — the variable region of antibodies — was more and more understood, how it was produced, and could be somewhat characterized,
|
|
|
|
55:49.000 --> 56:06.000
|
|
They realized that what people were selling as a uniform product was actually a polyclonal set of antibodies that was aimed at the same target.
|
|
|
|
56:06.000 --> 56:14.000
|
|
And often this was characterized as an antibody that could be sold as a product, as a drug.
|
|
|
|
56:14.000 --> 56:22.000
|
|
But the antibodies that were inside of it were not produced by only one B cell and therefore were not all identical.
|
|
|
|
56:22.000 --> 56:34.000
|
|
And so what he's describing here is this paradox that was created because some monoclonal antibodies are really monoclonal, and there are ways to make monoclonal antibodies.
|
|
|
|
56:35.000 --> 56:42.000
|
|
But as he said, it's not easy, it's very difficult and often fails.
|
|
|
|
56:42.000 --> 56:54.000
|
|
And then the production, the production of the antibodies that are going to be used is still a painstaking process.
|
|
|
|
56:54.000 --> 57:01.000
|
|
Whereas these guys are thinking about using the mRNA technology to make the antibodies
|
|
|
|
57:01.000 --> 57:24.000
|
|
So by using artificial means of creating the monoclonal antibody — patented artificial means — they resurrect this whole market as a patentable market, but only if you use their tech.
|
|
|
|
57:25.000 --> 57:32.000
|
|
Because if you use the traditional way of making monoclonal antibodies, that's really not patentable anymore.
|
|
|
|
57:32.000 --> 57:37.000
|
|
At least you're risking the possibility that your patent will be unenforceable.
|
|
|
|
57:37.000 --> 57:53.000
|
|
Whereas if you go through these guys and partner with them, they only need a small royalty and now you have a partnership with several proprietary patented processes plus your unique monoclonal.
|
|
|
|
57:54.000 --> 58:07.000
|
|
And that's what makes it so interesting, they're looking for clients that have the drugs already, and then they'll improve the drugs through their algorithms.
|
|
|
|
58:07.000 --> 58:21.000
|
|
So if you're a scientist and you're working on a program and you have to find that antibody, you have a really big problem: you don't know, and you can't get, that cell that's making the antibody you need.
|
|
|
|
58:21.000 --> 58:24.000
|
|
So how do you do that?
|
|
|
|
58:24.000 --> 58:36.000
|
|
Well, as very often happens in science, the same way we've been shooting for the stars, we've built rockets and we've built satellites; we've built new tools, new systems.
|
|
|
|
58:36.000 --> 58:47.000
|
|
Well, in this case, we've had to build systems that can allow us to screen these cells that are making antibodies to find the ones that we can turn into a drug.
|
|
|
|
58:48.000 --> 59:03.000
|
|
So luckily, I've been part of a very small group of people a few years ago here at the university, and we were all working on different aspects of biomedical research, but we were essentially tinkering and trying to make miniaturized systems that could move cells around.
|
|
|
|
59:04.000 --> 59:17.000
|
|
And quite rapidly, we realized that by combining the things together that all of us were doing, we could actually make a system that was very efficient at finding those cells that are making antibodies that could be turned into drugs.
|
|
|
|
59:17.000 --> 59:28.000
|
|
It's been working so well that we started a company out of it, and the company's doing quite well, we're working with some of the most innovative companies in the world now, biotech and pharmaceutical companies.
|
|
|
|
59:28.000 --> 59:38.000
|
|
So that's a long story short, but what I'm telling you is that we can find these antibodies, and that's a really great thing.
|
|
|
|
59:38.000 --> 59:48.000
|
|
Now, what you might ask me is, well, that's great, you know, telling us antibodies are important, but is it really relevant?
|
|
|
|
59:49.000 --> 01:00:03.000
|
|
So the context, and just to give you a little bit of perspective, is that 30 years ago, the only drugs or therapeutic molecules that were on the market were molecules like Tylenol or aspirin, these types of chemicals.
|
|
|
|
01:00:03.000 --> 01:00:10.000
|
|
Now we're in 2018, six of the top 10 blockbuster drugs are antibodies.
|
|
|
|
01:00:10.000 --> 01:00:13.000
|
|
This is the fastest growing class of therapeutics.
|
|
|
|
01:00:14.000 --> 01:00:16.000
|
|
This is a hundred billion dollar market.
|
|
|
|
01:00:16.000 --> 01:00:22.000
|
|
We have enabled therapeutic strategies that were thought to be impossible before.
|
|
|
|
01:00:22.000 --> 01:00:25.000
|
|
Cancer is a great example.
|
|
|
|
01:00:25.000 --> 01:00:34.000
|
|
We've been having issues making molecules that can kill the cancer and keep healthy cells of patients alive.
|
|
|
|
01:00:34.000 --> 01:00:41.000
|
|
That's why we have all these problems with toxicity and side effects of molecules or drugs that are on the market.
|
|
|
|
01:00:41.000 --> 01:00:51.000
|
|
With antibodies, people have enabled strategies where they actually target the immune system of the patient so that the immune system of the patient can get rid of the cancer itself.
|
|
|
|
01:00:51.000 --> 01:00:54.000
|
|
It's been a revolution in the field.
|
|
|
|
01:00:54.000 --> 01:01:00.000
|
|
People who were out of solutions are fighting off the cancer themselves and getting into remission.
|
|
|
|
01:01:00.000 --> 01:01:03.000
|
|
This has led to a new field.
|
|
|
|
01:01:03.000 --> 01:01:06.000
|
|
Yeah, that happened to my sister-in-law in the Netherlands.
|
|
|
|
01:01:06.000 --> 01:01:17.000
|
|
She was in remission from ovarian cancer and then they gave her four shots over a year and her cancer came back and killed her in a month.
|
|
|
|
01:01:17.000 --> 01:01:20.000
|
|
That's a true story.
|
|
|
|
01:01:20.000 --> 01:01:32.000
|
|
I've got another friend who went to Hollywood to be an actress when I went to the Netherlands to be a scientist in 2002.
|
|
|
|
01:01:33.000 --> 01:01:49.000
|
|
And in order to be on the set of the Captain Picard show so that she could be the Borg Queen, she took three shots and got super fast brain cancer and died.
|
|
|
|
01:01:50.000 --> 01:02:02.000
|
|
These people are all being killed because of the arrogance that you see on this stage here where it's just RNA and antibodies are important and stuff.
|
|
|
|
01:02:02.000 --> 01:02:10.000
|
|
The lack of respect for the sacred biology of the human is what's going to destroy this world.
|
|
|
|
01:02:10.000 --> 01:02:17.000
|
|
And it's so obvious to me how we got here now when I look at this arrogance before the pandemic.
|
|
|
|
01:02:18.000 --> 01:02:21.000
|
|
It's going to get much worse before it gets better in this video alone.
|
|
|
|
01:02:21.000 --> 01:02:25.000
|
|
It's called immunotherapy.
|
|
|
|
01:02:25.000 --> 01:02:31.000
|
|
We've been able to treat arthritis, inflammation, autoimmune disease and so on.
|
|
|
|
01:02:31.000 --> 01:02:40.000
|
|
Just a few weeks ago, a new antibody was approved to treat migraine, which is a relief for many people.
|
|
|
|
01:02:40.000 --> 01:02:46.000
|
|
And that's what's happening at a pace that is unprecedented.
|
|
|
|
01:02:47.000 --> 01:02:49.000
|
|
An antibody for migraine.
|
|
|
|
01:02:49.000 --> 01:02:51.000
|
|
Boy, that sounds like a tough one.
|
|
|
|
01:02:51.000 --> 01:02:54.000
|
|
What I think is even more amazing is what I'm just about to tell you.
|
|
|
|
01:02:54.000 --> 01:02:56.000
|
|
What's more amazing?
|
|
|
|
01:02:56.000 --> 01:02:58.000
|
|
Even more amazing.
|
|
|
|
01:02:58.000 --> 01:03:05.000
|
|
We know that infectious disease and specifically viruses are a problem.
|
|
|
|
01:03:05.000 --> 01:03:10.000
|
|
Flu and other viruses still kill hundreds of thousands of people every year.
|
|
|
|
01:03:10.000 --> 01:03:18.000
|
|
The Ebola crisis is a sober reminder of the vulnerability of our societies to those types of pandemics.
|
|
|
|
01:03:18.000 --> 01:03:23.000
|
|
They spread rapidly and when they are lethal, we are very exposed.
|
|
|
|
01:03:23.000 --> 01:03:34.000
|
|
Bill Gates, earlier this spring, called this the biggest threat humanity is facing: if a new virus, or a weaponized virus, is launched.
|
|
|
|
01:03:35.000 --> 01:03:42.000
|
|
A couple of days ago, a plane that was flying from Mecca to JFK had a problem.
|
|
|
|
01:03:42.000 --> 01:03:49.000
|
|
Out of 500 passengers, about 100 of them got sick and no one knew what it was.
|
|
|
|
01:03:49.000 --> 01:03:52.000
|
|
So when they landed at JFK, it was almost a crisis.
|
|
|
|
01:03:52.000 --> 01:03:56.000
|
|
No one knew exactly what was making these people sick.
|
|
|
|
01:03:56.000 --> 01:03:58.000
|
|
And we weren't sure we could react.
|
|
|
|
01:03:58.000 --> 01:04:03.000
|
|
If this had been Ebola, I'm not sure we could have handled it.
|
|
|
|
01:04:04.000 --> 01:04:05.000
|
|
What was it?
|
|
|
|
01:04:05.000 --> 01:04:07.000
|
|
Our antibodies can be powerful.
|
|
|
|
01:04:07.000 --> 01:04:09.000
|
|
What was it?
|
|
|
|
01:04:09.000 --> 01:04:11.000
|
|
Food poisoning?
|
|
|
|
01:04:11.000 --> 01:04:15.000
|
|
Not worth mentioning?
|
|
|
|
01:04:15.000 --> 01:04:18.000
|
|
The mythology is thick and this was in 2019.
|
|
|
|
01:04:18.000 --> 01:04:23.000
|
|
They were laying it down thick already, getting ready for the big show.
|
|
|
|
01:04:23.000 --> 01:04:26.000
|
|
We've been very successful so far.
|
|
|
|
01:04:26.000 --> 01:04:32.000
|
|
So the next question is, can we use antibodies to prevent the next pandemic?
|
|
|
|
01:04:32.000 --> 01:04:35.000
|
|
And my answer is yes, yes we can.
|
|
|
|
01:04:35.000 --> 01:04:47.000
|
|
If you or I are getting sick, if we're getting the flu, or if, with bad luck, we get Ebola and we survive it, our body will make great antibodies.
|
|
|
|
01:04:47.000 --> 01:04:52.000
|
|
And now we have technologies to find antibodies that can neutralize the virus.
|
|
|
|
01:04:52.000 --> 01:04:54.000
|
|
That's something we can do.
|
|
|
|
01:04:54.000 --> 01:05:01.000
|
|
But in the case of viruses that are spreading rapidly, we don't have the capabilities to discover the antibodies.
|
|
|
|
01:05:01.000 --> 01:05:05.000
|
|
And turn them into drugs rapidly enough.
|
|
|
|
01:05:05.000 --> 01:05:13.000
|
|
It takes about seven to ten years from the discovery of an antibody before it reaches market.
|
|
|
|
01:05:13.000 --> 01:05:17.000
|
|
That's the status, the situation right now.
|
|
|
|
01:05:17.000 --> 01:05:19.000
|
|
So how are we going to solve this?
|
|
|
|
01:05:19.000 --> 01:05:28.000
|
|
Well, let me tell you about DARPA, the Defense Advanced Research Projects Agency of the Department of Defense in the U.S.
|
|
|
|
01:05:29.000 --> 01:05:39.000
|
|
For those of you who haven't heard about DARPA, this is the agency that created ARPANET, the ancestor of what we now know as the Internet.
|
|
|
|
01:05:39.000 --> 01:05:41.000
|
|
They've created stealth technology.
|
|
|
|
01:05:41.000 --> 01:05:44.000
|
|
They've even created Siri, GPS.
|
|
|
|
01:05:44.000 --> 01:05:49.000
|
|
So this is an agency and these are people who are living in the future.
|
|
|
|
01:05:49.000 --> 01:05:51.000
|
|
This is science fiction.
|
|
|
|
01:05:51.000 --> 01:05:55.000
|
|
And one of their top priorities is to solve that problem.
|
|
|
|
01:05:55.000 --> 01:05:57.000
|
|
How do we prevent the next pandemic?
|
|
|
|
01:05:57.000 --> 01:06:04.000
|
|
So what they want to do is to put in place all the technologies that we need once and for all
|
|
|
|
01:06:04.000 --> 01:06:12.000
|
|
so that we can discover antibodies, vectorize them, have enough to deliver to about 20,000 people
|
|
|
|
01:06:12.000 --> 01:06:14.000
|
|
so that we can put in a fire break,
|
|
|
|
01:06:14.000 --> 01:06:20.000
|
|
should any new virus or any new flu virus, any new mutant, come out.
|
|
|
|
01:06:20.000 --> 01:06:22.000
|
|
A fire break?
|
|
|
|
01:06:22.000 --> 01:06:29.000
|
|
You mean 20,000 people that you want to protect from the clone that you release, and then everybody else is screwed?
|
|
|
|
01:06:29.000 --> 01:06:33.000
|
|
20,000 people will be a fire break?
|
|
|
|
01:06:33.000 --> 01:06:35.000
|
|
What is he talking about?
|
|
|
|
01:06:35.000 --> 01:06:41.000
|
|
You can't give antibodies to 20,000 people that you know are going to be your fire break unless they're already...
|
|
|
|
01:06:41.000 --> 01:06:46.000
|
|
He just told us that you give it to them after they're sick.
|
|
|
|
01:06:47.000 --> 01:06:49.000
|
|
They're already off.
|
|
|
|
01:06:49.000 --> 01:06:52.000
|
|
They're already losing their foundation.
|
|
|
|
01:06:52.000 --> 01:06:58.000
|
|
The biology is already contradictory, because it's dumb simple.
|
|
|
|
01:06:58.000 --> 01:07:02.000
|
|
Antibodies are immunity and we can make antibodies.
|
|
|
|
01:07:02.000 --> 01:07:10.000
|
|
Antibodies, antibodies, antibodies, drugs, $150 billion drugs, antibodies, antibodies, antibodies.
|
|
|
|
01:07:10.000 --> 01:07:13.000
|
|
It's basically all he said.
|
|
|
|
01:07:13.000 --> 01:07:17.000
|
|
And they want to do this in 60 days.
|
|
|
|
01:07:17.000 --> 01:07:18.000
|
|
60 days.
|
|
|
|
01:07:18.000 --> 01:07:23.000
|
|
I told you: right now, the industry is doing this in 7 to 10 years.
|
|
|
|
01:07:23.000 --> 01:07:26.000
|
|
That's the magnitude of the effort.
|
|
|
|
01:07:26.000 --> 01:07:36.000
|
|
And I'm telling you, all the people working on that project led by Colonel Matt Hepburn, it's a public project, it's called P3, are committed.
|
|
|
|
01:07:36.000 --> 01:07:40.000
|
|
They almost feel it's their duty, because we know we have to solve that problem.
|
|
|
|
01:07:40.000 --> 01:07:48.000
|
|
And the only way to get there is through science and technology and pushing the limits of it.
|
|
|
|
01:07:48.000 --> 01:08:00.000
|
|
So with that in mind, if there's one message I'd like you to take home today, it's that we are witnessing amazing progress in the space of human health.
|
|
|
|
01:08:00.000 --> 01:08:09.000
|
|
There's a combination of basic discoveries, investments, new technologies that are all coming together that are providing unique opportunities.
|
|
|
|
01:08:09.000 --> 01:08:21.000
|
|
But at the same time, when science is more than ever questioned, thinking about the anti-vaccine movement, for example, when misinformation is shared so rapidly,
|
|
|
|
01:08:21.000 --> 01:08:32.000
|
|
I truly believe that we must remain mindful and vigilant, mindful and vigilant about what science and technology has brought us.
|
|
|
|
01:08:32.000 --> 01:08:38.000
|
|
We've built most of our societies on science and technology.
|
|
|
|
01:08:38.000 --> 01:08:45.000
|
|
We've been able to better understand our world, but also how we work as humans.
|
|
|
|
01:08:45.000 --> 01:08:54.000
|
|
So we need to keep building on this momentum. We need to keep pushing the limits so that we can make a positive impact on our societies.
|
|
|
|
01:08:54.000 --> 01:08:59.000
|
|
Thank you very much.
|
|
|
|
01:08:59.000 --> 01:09:19.000
|
|
So that's one of the worst TED talks that you've ever seen, I'm sure, but the point is, and I can't stress it enough, the point is that this company represents an obvious response to the antibody paradox,
|
|
|
|
01:09:19.000 --> 01:09:30.000
|
|
and I haven't really known what to do with, have been struggling to put into place, have been struggling to fit these pieces together: in AbCellera,
|
|
|
|
01:09:30.000 --> 01:09:42.000
|
|
and its focus on finding monoclonal antibodies and bringing them to market in record time, funded by DARPA, working with Acuitas on the lipid nanoparticles that are used
|
|
|
|
01:09:42.000 --> 01:09:47.000
|
|
in the COVID shot. It all lines up.
|
|
|
|
01:09:47.000 --> 01:10:04.000
|
|
And what really lines up is the idea that somebody like Bob Malone was brokering these technologies between the United States government and the Canadian government, the United States government and Canadian firms.
|
|
|
|
01:10:04.000 --> 01:10:20.000
|
|
What makes sense is this exchange of American ideas with a foreign company in parallel with a change in the legal intellectual property space of America.
|
|
|
|
01:10:21.000 --> 01:10:26.000
|
|
Why isn't DARPA funding a startup company in America to do this?
|
|
|
|
01:10:26.000 --> 01:10:35.000
|
|
Why isn't Peter Thiel on the board of an American company doing this?
|
|
|
|
01:10:35.000 --> 01:10:42.000
|
|
I would say it might be because of the antibody paradox, the antibody patent paradox.
|
|
|
|
01:10:42.000 --> 01:10:57.000
|
|
It's better to make an antibody company in Canada than it would be to make it here, where the entire intellectual property space of antibodies has been fundamentally changed and remains untested.
|
|
|
|
01:10:57.000 --> 01:11:09.000
|
|
And in fact, they moved up there before the antibody paradox was actually played out, almost as if they knew that it was going to play out.
|
|
|
|
01:11:09.000 --> 01:11:14.000
|
|
Watch that video by Mark. Watch the whole thing.
|
|
|
|
01:11:14.000 --> 01:11:19.000
|
|
Please. Oh, sorry, I was already there. Watch the whole thing.
|
|
|
|
01:11:19.000 --> 01:11:27.000
|
|
Remember, the reason why we're here is because it doesn't matter what's true, it matters what is perceived to be true, and that's how they've gotten us here.
|
|
|
|
01:11:27.000 --> 01:11:31.000
|
|
A spectacular commitment to lies.
|
|
|
|
01:11:31.000 --> 01:11:44.000
|
|
And the lies mostly revolve around gain of function, but not only that, it's also about your immune system and about the value of antibodies and about what vaccination can and can't do.
|
|
|
|
01:11:44.000 --> 01:11:47.000
|
|
They've lied about it all.
|
|
|
|
01:11:47.000 --> 01:11:54.000
|
|
And they've told us a story that just is not mathematically correct about spread of a novel cause of death.
|
|
|
|
01:11:54.000 --> 01:12:09.000
|
|
There's no evidence of that spread, and there's no evidence of a novel cause of death. There is evidence of some serious iatrogenic murder, but not of a novel aerosolized spreading cause of death, not really.
|
|
|
|
01:12:09.000 --> 01:12:17.000
|
|
And I think they're misleading our young so that they can convert our freedoms to fascism. That's what they're doing.
|
|
|
|
01:12:17.000 --> 01:12:23.000
|
|
And all these people need you to do is ask the wrong questions.
|
|
|
|
01:12:23.000 --> 01:12:27.000
|
|
That's what Paul Cottrell has been doing from the very beginning. That's what George Webb does.
|
|
|
|
01:12:27.000 --> 01:12:34.000
|
|
That's what Robert Malone does. That's what Jared Bunded Walsh does. That's what Nurse Aaron did.
|
|
|
|
01:12:34.000 --> 01:12:42.000
|
|
That's what Brett Weinstein did. They make you ask the wrong questions for just long enough so that you're done.
|
|
|
|
01:12:42.000 --> 01:12:45.000
|
|
You've already entered the trap.
|
|
|
|
01:12:45.000 --> 01:12:55.000
|
|
And none of these people will question the novel virus. None of these people will question that we had to lock down. None of these people will question that some people were saved by the mRNA.
|
|
|
|
01:12:55.000 --> 01:13:05.000
|
|
And none of these people will question that it could have been a gain of function virus. And that might be the reason why we're having so much trouble with it. And that might be the reason why it'll come again. None of them will question it.
|
|
|
|
01:13:05.000 --> 01:13:08.000
|
|
And that's what the Scooby Doo is.
|
|
|
|
01:13:08.000 --> 01:13:19.000
|
|
It's this illusion of consensus that we all figured it out. And that's why Fox News viewers and CNN News viewers both believe that it was a lab leak. No.
|
|
|
|
01:13:19.000 --> 01:13:28.000
|
|
And they've all accepted this idea that we didn't know what was going to go on. We don't know what's going on. It's anybody's guess.
|
|
|
|
01:13:28.000 --> 01:13:38.000
|
|
And this change has allowed everybody to ignore the fact that we killed people in the hospital. And we're still doing it.
|
|
|
|
01:13:38.000 --> 01:13:44.000
|
|
And we upped all the other causes of death that we could.
|
|
|
|
01:13:44.000 --> 01:13:52.000
|
|
Including heart attacks, drug overdoses, everything. Everything that could be pumped up was pumped up.
|
|
|
|
01:13:52.000 --> 01:14:02.000
|
|
Drug overdoses was a big one. Heart attacks in certain locales was a big one so that they could say that the virus was causing heart attacks.
|
|
|
|
01:14:02.000 --> 01:14:09.000
|
|
What they didn't tell you is they weren't resuscitating anybody in New York for a month.
|
|
|
|
01:14:09.000 --> 01:14:15.000
|
|
So we've got to break this illusion of consensus one mind at a time, ladies and gentlemen, you can help me.
|
|
|
|
01:14:15.000 --> 01:14:24.000
|
|
You can help me by explaining what the Scooby Doo is, which is an illusion of consensus about a laboratory, bat cave, virus leak.
|
|
|
|
01:14:24.000 --> 01:14:26.000
|
|
And a cover up of it.
|
|
|
|
01:14:26.000 --> 01:14:32.000
|
|
We were all fooled into believing this is what happened and it didn't.
|
|
|
|
01:14:32.000 --> 01:14:37.000
|
|
They have conflated a background signal with that spread.
|
|
|
|
01:14:37.000 --> 01:14:42.000
|
|
And it is the protocols that were murder and transfection is not medicine.
|
|
|
|
01:14:43.000 --> 01:15:03.000
|
|
It was likely an infectious clone release in several places in the world that led to the massive bamboozlement of the entire molecular biology community, assuming that the signals had to be real and not understanding the depth and the breadth of the lie.
|
|
|
|
01:15:03.000 --> 01:15:12.000
|
|
You might call it a transfection agent release if it was just a spike protein and a toxin, but I don't know.
|
|
|
|
01:15:12.000 --> 01:15:20.000
|
|
It's definitely not 'no viruses', because the fidelity of them has been exaggerated.
|
|
|
|
01:15:20.000 --> 01:15:22.000
|
|
Their abundance has been exaggerated.
|
|
|
|
01:15:22.000 --> 01:15:30.000
|
|
Their role in our biology has been grossly understated in some ways and exaggerated in other ways.
|
|
|
|
01:15:31.000 --> 01:15:34.000
|
|
But no viruses is not the answer.
|
|
|
|
01:15:34.000 --> 01:15:36.000
|
|
It is a trap.
|
|
|
|
01:15:36.000 --> 01:15:40.000
|
|
It is a trap like any other trap.
|
|
|
|
01:15:40.000 --> 01:15:44.000
|
|
As long as you don't ask the right questions, you'll never get out.
|
|
|
|
01:15:44.000 --> 01:15:49.000
|
|
And the no virus group is a very easy way to get you to not ask the right questions.
|
|
|
|
01:15:49.000 --> 01:15:52.000
|
|
And you need to ask questions about the swarm.
|
|
|
|
01:15:52.000 --> 01:15:56.000
|
|
You need to ask questions about clones.
|
|
|
|
01:15:57.000 --> 01:15:59.000
|
|
You need to ask questions about PCR.
|
|
|
|
01:15:59.000 --> 01:16:02.000
|
|
You need to ask questions about do-not-resuscitate orders.
|
|
|
|
01:16:02.000 --> 01:16:07.000
|
|
And if you're talking no virus, you're never going to answer any of those questions.
|
|
|
|
01:16:07.000 --> 01:16:09.000
|
|
You're never going to talk about any of this.
|
|
|
|
01:16:09.000 --> 01:16:12.000
|
|
Just like Malone never talks about any of this.
|
|
|
|
01:16:12.000 --> 01:16:16.000
|
|
Just like here at London Ball, she never talks about anything like this.
|
|
|
|
01:16:16.000 --> 01:16:21.000
|
|
Just like Brett Weinstein never covers any of this anymore.
|
|
|
|
01:16:21.000 --> 01:16:23.000
|
|
None of them do.
|
|
|
|
01:16:23.000 --> 01:16:26.000
|
|
Kevin McCairn never covered any of this.
|
|
|
|
01:16:26.000 --> 01:16:29.000
|
|
Kevin McKernan doesn't cover any of this.
|
|
|
|
01:16:29.000 --> 01:16:32.000
|
|
Charles Rixey doesn't cover any of this.
|
|
|
|
01:16:32.000 --> 01:16:35.000
|
|
None of them cover any of this.
|
|
|
|
01:16:35.000 --> 01:16:43.000
|
|
Because they're focused on making sure that you ask the wrong questions and stay trapped in the Scooby-Doo.
|
|
|
|
01:16:43.000 --> 01:16:48.000
|
|
They don't want you to see what's being done to your children and your children's children.
|
|
|
|
01:16:49.000 --> 01:16:56.000
|
|
They don't want you to have a far enough vision to understand that people would lie to you for this, because it's for all the marbles.
|
|
|
|
01:17:00.000 --> 01:17:04.000
|
|
And all of those people that I just listed, they won't address this screen.
|
|
|
|
01:17:04.000 --> 01:17:06.000
|
|
They won't address this statement.
|
|
|
|
01:17:06.000 --> 01:17:12.000
|
|
None of them have been able to codify this truth like Gigaohm Biological has codified it.
|
|
|
|
01:17:13.000 --> 01:17:23.000
|
|
So if you want to know who's playing on the team and who's not, ask people why it took them so long and they still haven't come up with this set of words.
|
|
|
|
01:17:23.000 --> 01:17:28.000
|
|
Ask them because they'll have no excuse other than they get paid not to come up with it.
|
|
|
|
01:17:28.000 --> 01:17:36.000
|
|
Well, I'm now a staff scientist at Children's Health Defense and I'm very much prepared to bring more of this rain.
|
|
|
|
01:17:37.000 --> 01:17:40.000
|
|
I'm ready. So bring it on ladies and gentlemen.
|
|
|
|
01:17:40.000 --> 01:17:42.000
|
|
That's another night in. See you tomorrow.
|
|
|
|
01:18:06.000 --> 01:18:12.000
|
|
Thanks for watching.
|
|
|
|
01:19:36.000 --> 01:19:39.000
|
|
Thank you.
|
|
|
|
|