WEBVTT

00:00.000 --> 00:02.000
You

00:30.000 --> 00:32.000
You

01:00.000 --> 01:02.000
You

01:30.000 --> 01:32.000
You

02:00.000 --> 02:02.000
You

02:30.000 --> 02:32.000
You

03:00.000 --> 03:02.000
You

03:30.000 --> 03:32.000
I

04:00.000 --> 04:02.000
I

04:30.000 --> 04:32.000
You

04:32.000 --> 04:34.000
You

04:34.000 --> 04:36.000
You

04:36.000 --> 04:38.000
You

04:38.000 --> 04:40.000
You

04:40.000 --> 04:42.000
You

04:42.000 --> 04:44.000
You

04:44.000 --> 04:46.000
You

04:46.000 --> 04:48.000
You

04:48.000 --> 04:50.000
You

04:50.000 --> 04:52.000
You

04:52.000 --> 04:54.000
You

04:54.000 --> 04:56.000
You

04:56.000 --> 04:58.000
You

04:58.000 --> 04:59.000
You

04:59.000 --> 05:01.000
You

05:01.000 --> 05:03.000
You

05:03.000 --> 05:05.000
You

05:05.000 --> 05:07.000
You

05:07.000 --> 05:08.000
To

05:08.000 --> 05:11.000
If you want to

05:11.000 --> 05:13.000
Turn

05:13.000 --> 05:17.000
I will

05:17.000 --> 05:19.000
Give

05:19.000 --> 05:21.000
You

05:21.000 --> 05:25.000
Will

05:25.000 --> 05:32.000
Shaking up the water, didn't it cold?

05:32.000 --> 05:37.000
Back from the gym.

05:37.000 --> 05:43.000
My 12 year old son has shot more than

05:43.000 --> 05:49.000
14,000 shots in the last two months, which is pretty incredible.

05:49.000 --> 05:54.000
It's pretty incredible. Rumble good? Excellent. Thank you, Pamela.

05:54.000 --> 05:59.000
All right. I'm really excited that this is working okay.

05:59.000 --> 06:04.000
Well, he just started this year, so he's not anywhere near Michael Jordan yet,

06:04.000 --> 06:11.000
but he is enjoying himself a lot and it's a lot of father-and-son time that wasn't happening before.

06:11.000 --> 06:16.000
So basketball has been a very, very good idea for this family.

06:16.000 --> 06:26.000
Very good idea.

06:26.000 --> 06:30.000
Oh, should I pause and tell you right now?

06:30.000 --> 06:36.000
Should I just pause and tell you right now that I had a two and a half, almost three hour conversation with Ed today?

06:36.000 --> 06:41.000
That's the reason why I didn't make the one o'clock deadline before basketball.

06:41.000 --> 06:44.000
It was one of the most amazing conversations I've ever had.

06:44.000 --> 06:49.000
I'm going to headline the Red Pill Expo. I'm going to headline it.

06:49.000 --> 06:53.000
I'm getting a whole hour. We're going to do it.

06:53.000 --> 06:56.000
I'm almost, I'm so freaking out. I can't tell you.

06:56.000 --> 07:01.000
Don't tell anyone, though. It's just for us. That's just secret.

07:01.000 --> 07:08.000
But yeah, he's going to promote it on the website as soon as I get in my headshot and as soon as I get him a thing in the bobber.

07:08.000 --> 07:15.000
So it's really, really good. Really, really good.

07:15.000 --> 07:20.000
Yeah, it is so freaking awesome.

07:20.000 --> 07:27.000
It is so freaking awesome.

07:27.000 --> 07:37.000
And that's because people have been sharing this work and people have been pushing it around and somebody pushed the right stream in front of Ed.

07:37.000 --> 07:41.000
And the next thing you know, that guy tried four days in a row to call me.

07:41.000 --> 07:47.000
He had some kind of problem with his landline and so I couldn't always get through to him.

07:47.000 --> 07:50.000
Then I wasn't at my desk, but he was at his desk.

07:50.000 --> 08:04.000
The Red Pill Expo, I think, is in Rapid City, South Dakota, which is an incredible metropolis of culture and distraction, of course.

08:04.000 --> 08:06.000
That's the reason why they have it there.

08:06.000 --> 08:10.000
So there's no distraction, to stay focused on the biology, not take the bait on TV or social media.

08:10.000 --> 08:13.000
I love our neighbors. I believe it's in Rapid City.

08:13.000 --> 08:18.000
I don't know that for sure because I didn't find the website yet.

08:18.000 --> 08:26.000
But yes, it's coming and it's going to be very exciting because I have actually been asked to do it.

08:26.000 --> 08:28.000
So it's pretty crazy.

08:28.000 --> 08:43.000
I don't even really know what to say other than we should be very excited and I'm going to work very hard to make sure I don't make a fool of any of us and represent the sacred biology as well as possible.

08:43.000 --> 08:46.000
There's still some reverence up in the house, if you will.

08:46.000 --> 08:48.000
So it's going to be really great.

08:48.000 --> 08:49.000
I'm really excited.

08:49.000 --> 08:50.000
It was really nice to talk to him.

08:50.000 --> 08:52.000
He's 92 years old.

08:52.000 --> 08:55.000
Do you know, 92 years old?

08:55.000 --> 08:58.000
Oh my gosh, we talked about so much stuff.

08:58.000 --> 09:07.000
It was like one of those conversations where I started to think about, okay, now maybe I'm exposing myself to too much radiation on my head here.

09:08.000 --> 09:10.000
So 92, who's 92?

09:10.000 --> 09:12.000
Ed Griffin is 92.

09:12.000 --> 09:15.000
If you can hear me over the music.

09:15.000 --> 09:17.000
Ed Griffin is 92.

09:17.000 --> 09:29.000
Part of what I would, I, a new honorary member at least of the, of the, yeah, of the independent bright web.

09:29.000 --> 09:33.000
He's, uh, asked me to participate in an event that he's organizing.

09:33.000 --> 09:40.000
I assume I can tell you guys in a week or so, it'll probably be up on the website as far as I know.

09:40.000 --> 09:48.000
Let's just, let's just with an I-N, but let's just keep our, Griffin is F-F-I-N.

09:48.000 --> 09:50.000
Let's just keep it under our hats for now.

09:50.000 --> 09:56.000
Let's not start posting it all over social media yet when it's, when we have a website and whatever.

09:56.000 --> 09:59.000
Yeah, he is the OG indeed.

09:59.000 --> 10:03.000
He's the OG in terms of, you know, figuring it out.

10:03.000 --> 10:09.000
A guy who was trying to get us to drop our hands a long time ago to stop, stop hiding.

10:09.000 --> 10:12.000
So it's, it's interesting.

10:12.000 --> 10:15.000
It's interesting where we're going to go here and we'll see.

10:15.000 --> 10:16.000
We'll see.

10:16.000 --> 10:20.000
But I'm going to do it and we'll see what happens.

10:20.000 --> 10:21.000
So I'm excited.

10:21.000 --> 10:24.000
It'll be the first time I've done anything like that.

10:25.000 --> 10:27.000
So I hope it's a good crowd.

10:27.000 --> 10:36.000
A group of people, I'm sure it is, given who Ed Griffin is and what he's done for us as a country to try and bring,

10:36.000 --> 10:44.000
bring awareness to how our, our financial system is run, how our money system is run in such a way that the people who have

10:44.000 --> 10:51.000
control of the bus are actually able to do quite a lot more than to decide just where we're going.

10:52.000 --> 10:59.000
If you can write numbers into books and create money and other people can't, that's a pretty spectacular,

10:59.000 --> 11:03.000
that's a pretty, pretty spectacular power.

11:03.000 --> 11:10.000
And indeed, that's how the United States and many other Western countries are essentially controlled, is that through

11:10.000 --> 11:11.000
fractional reserve banking.

11:11.000 --> 11:13.000
Anyway, that's not the specialty of this place.

11:13.000 --> 11:19.000
This is a place to get some biology in your head or get some clarity about biology in your head.

11:19.000 --> 11:25.000
Today, I'm really hoping we can enjoy ourselves a little bit, relax a little bit.

11:25.000 --> 11:30.000
I had to pull something off of the curated list of curious videos.

11:30.000 --> 11:33.000
I hope you don't mind to study it all like this.

11:33.000 --> 11:41.000
I'm doing a lot of those lately because I don't want to roll out any new slides until it's worth you downloading them.

11:41.000 --> 11:50.000
And so we're working on a slide deck for a prion mini course, and we're working on a slide deck for a couple other mini courses.

11:50.000 --> 11:55.000
And that's how I've started to think about it in my head, as a mini course, because it feels more doable.

11:55.000 --> 12:01.000
If I can do something over the course of a week with a couple breaks in between, then I think a mini course might be three days

12:01.000 --> 12:05.000
or four days where we really try to journal club our way out of some holes.

12:05.000 --> 12:12.000
So that's the kind of plan I've got besides just trying to be online every day.

12:12.000 --> 12:16.000
So this is, you know, a little boost in that direction as well.

12:16.000 --> 12:22.000
I'm not trying to use an excuse that I couldn't do it this afternoon to make it so that I couldn't do it tonight.

12:22.000 --> 12:31.000
There's some nice food upstairs that may or may not come down on a plate and I may or may not turn off the microphone occasionally to have a couple of mouthfuls.

12:32.000 --> 12:38.000
I am a little hungry, but I'm not going to eat on camera. I promise I won't share any food noises with you.

12:38.000 --> 12:43.000
We are still facing this conscious and intelligent manipulation of organized habits.

12:43.000 --> 12:48.000
We repeat this over and over again, more for the people who might be stumbling on us for the first time.

12:48.000 --> 12:52.000
Let me see if I can get this on here.

12:52.000 --> 12:58.000
And so I repeat it a lot, but let me get this music out of the way here.

12:58.000 --> 13:06.000
I repeat it a lot, but the reality is that it's worth repeating that this has been a long standing problem.

13:06.000 --> 13:08.000
It's one we haven't been aware of. I wasn't aware of it.

13:08.000 --> 13:11.000
I was aware of it in a compartmentalized way.

13:11.000 --> 13:13.000
You know, of course they lie about celebrities.

13:13.000 --> 13:19.000
Of course they can put bad stories in the news if they want to and ruin somebody's reputation,

13:19.000 --> 13:25.000
but it never dawned on me that basically the entire accepted history of the United States

13:25.000 --> 13:30.000
might more or less be manipulated to a lesser or greater extent at all times.

13:30.000 --> 13:40.000
It never dawned on me that the history and content of certain biological fields could be manipulated in a very similar way.

13:40.000 --> 13:48.000
And in fact, I think that's the reality where we live, much like Allen Dulles is purported to have said

13:48.000 --> 13:58.000
that when the American public, everything the American public believes is false, then we will have finally succeeded, or something like this.

13:58.000 --> 14:01.000
This is how they have been governing us for quite some time.

14:01.000 --> 14:11.000
And they tell us truths that are not dangerous to their rule and not dangerous to the presuppositions of the system that perpetuate their rule.

14:11.000 --> 14:17.000
And if we look back to a cartoon in the Chicago Tribune from 1934,

14:17.000 --> 14:20.000
you will see a lot of interesting things.

14:20.000 --> 14:29.000
Young pinkies from Columbia, sorry, young pinkies from Columbia and Harvard drunk on power,

14:29.000 --> 14:31.000
riding in some kind of thing here.

14:31.000 --> 14:33.000
I mean, Tugwell ahead.

14:33.000 --> 14:42.000
I guess that probably has something to do with somebody who's driving the car depleting the resources of the soundest government in the world by shoveling money out.

14:42.000 --> 14:47.000
And this is Wallace. I don't know who then likes shoveling money out into the street.

14:47.000 --> 14:52.000
There's Stalin in the background going, how red the sunrise is getting.

14:52.000 --> 15:08.000
And then here on this painting, it says the plan of action for the US is: spend, spend, spend under the guise of recovery, bust the government, blame the capitalists for the failure, junk the Constitution and declare a dictatorship.

15:09.000 --> 15:12.000
1934.

15:12.000 --> 15:17.000
Guess there were conspiracy theorists back then too.

15:17.000 --> 15:26.000
So the principle of informed consent is one way that you can kind of try to find if there's a light on anywhere inside of someone's head.

15:26.000 --> 15:31.000
If we start talking about injections in general.

15:31.000 --> 15:42.000
For example, the guy at the gym the other day, when I told you that I was helping him carry this ladder and he said that he had just gotten a shingles shot in his arm and that's why he couldn't carry the ladder as easily.

15:42.000 --> 15:51.000
I could have said, for example, well, what kind of informed consent do you get with something like that? Do they just tell you you need a shingles shot or do they give you a brochure? What's that all about?

15:51.000 --> 16:07.000
But I didn't have it. And this is a great example of how we all need to more actively have a set of responses ready when we encounter situations like this.

16:07.000 --> 16:23.000
Me, the guy that wants to stream every night about this stuff, dropped the ball when I was at the gym the other day. Carrying that ladder with that guy would have been a perfect time to bring up the principle of informed consent in the context of that shingles shot and just kind of ask him like,

16:23.000 --> 16:33.000
I'm just wondering on behalf of maybe my dad or something. You can bullshit him if you want to. I'm not wondering for my dad, he can take a shingles shot, I guess, if he wants to. He thinks I'm a know-it-all.

16:33.000 --> 16:43.000
So the point would be, though, that with this guy at the gym, I could have said, hey, with regard to, you know, how did that come about? Did they tell you that? Did you ask for it? You know, what's the deal?

16:43.000 --> 16:56.000
And maybe he would tell me, maybe he wouldn't, but my guess is I've known him long enough. I've been, I've been, you know, shooting the shit with him for a while now. I think that that's not too probing of a question because I'm not asking him what he believes.

16:56.000 --> 17:01.000
I'd just ask him, you know, informed consent. And if you say it in that way,

17:01.000 --> 17:17.000
I think it sets a very high bar of objectivity, you know, just saying, by the way, you know, I'm wondering. And that could go any number of places, right? That could go to the informed consent that was provided or not provided in that person's mind with regard to the,

17:17.000 --> 17:22.000
it could go to informed consent with regard to, he doesn't even know what it is anymore.

17:22.000 --> 17:37.000
There's a lot of ways that that could have opened doors, but I blew it. So informed consent I say for a lot of different reasons. It's because of the concept. It's important to understand that concept. It's important for us to teach our children that concept

17:37.000 --> 17:54.000
in lots of different contexts, as both a responsibility and as a kind of barrier or boundary that shouldn't be crossed. And I really think that also, in terms of communicating with other people, it's a really great way to kind of ping

17:54.000 --> 18:05.000
the depth of their understanding, and you can say it in a way that kind of doesn't necessarily belie the side of the equation on which you reside at this time.

18:05.000 --> 18:10.000
Anyway, that's one of the reasons why I keep saying this almost every night.

18:10.000 --> 18:28.000
And of course our hypothesis remains the same. Our hypothesis is that somewhere here in 2020, they created a few mass casualty events and they misconstrued that as the, the evidence of the beginning of a preeminent, you know, worst case scenario

18:28.000 --> 18:31.000
that a million people could die.

18:31.000 --> 18:43.000
And that level of potential for a global disaster or global emergency required them to declare the emergency that would allow them to say, hey, it's spreading.

18:44.000 --> 19:02.000
So we've got to start testing. Then they flubbed the test and made it, you know, seem like, oh my gosh, it's just ineptitude and rushing and, and, you know, the clashing between Trump and, and, and NHS and all this nonsense.

19:03.000 --> 19:20.000
That in reality was one big theater to make sure that the worst case scenario narrative of a gain of function virus having been released from a lab would be something that would permeate and argue and, and bubble in the background enough so that both sides of a limited spectrum

19:21.000 --> 19:35.000
would eventually come to the conclusion that although the PCR tests weren't perfect, although they overcycled them, although there were, there were people that weren't killed directly of COVID but just with COVID,

19:35.000 --> 19:45.000
there definitely is a COVID. There definitely is a SARS-CoV-2, it wasn't there before 2020, and it started spreading in 2020 or late 2019.

19:45.000 --> 19:59.000
The importance of that basic premise allows them to seed the narrative that includes a mythology that's never before been supported in, in biology with real observations.

19:59.000 --> 20:13.000
We've never tracked an RNA molecule from a, from a puddle in a market to all corners of the globe before, to a different species before, and tested zoo animals expecting them to be positive.

20:13.000 --> 20:21.000
Here we're supposed to believe that all of these observations are of equal clarity and equal fidelity, and it's just ridiculous.

20:21.000 --> 20:36.000
There's no question in my mind that this is a combination of bad diagnostics and also a really hot background of RNA and DNA from a wide variety of sources that can't really be adequately quantified.

20:36.000 --> 20:50.000
And so if you pretend that a PCR test is specific for something when there are no symptoms present, you're just misleading entire, basically entire fields.

20:51.000 --> 21:12.000
And it's unfortunate, because what I'm trying to argue, ladies and gentlemen, is that there are academic biologists out there in every major city in the United States who should really know this. They should really know that the way we use PCR in the context of a sample from a patient, with a single pair of primers

21:12.000 --> 21:22.000
and no positive or negative controls, no replication in triplicate, no sequencing of the amplicon that we amplified,

21:22.000 --> 21:28.000
there would, this would pass no muster in any academic setting.

21:28.000 --> 21:45.000
And to decide the fate of people's lives and to decide whether or not they go on crazy protocol A before, or whether they get treated like any other patient would have been treated with these symptoms, is dependent on that very diagnostic being

21:45.000 --> 21:53.000
obviously applied in a way that will not give them the specificity that they would want if it was a medical setting.

21:53.000 --> 22:15.000
We're not talking about academic fraud about some, some obscure G protein coupled receptor subunit that you're sequencing, or that you're saying is present in a brain region and therefore partially responsible for whatever little circuitry you're interested in, and getting a fake paper for it.

22:15.000 --> 22:32.000
We're talking about diagnosing the presence of a pathogen and potentially implying the infectiousness of someone, or their potential for harming others, as we say on television. And so the use of PCR in such a negligent way,

22:32.000 --> 22:42.000
and the way that there's nobody at any university that seems to want to just speak up and say, look, it's not about cycle count.

22:42.000 --> 22:56.000
It's about all these other things that we know at the academic bench are required in order to establish that a PCR amplicon result means what you say it means.

22:56.000 --> 23:05.000
Every one of them that has published a PCR result knows the kinds of controls that are done at minimum.

23:05.000 --> 23:13.000
And yet in none of these situations and none of these diagnostics are any of these controls present.

23:13.000 --> 23:17.000
And it is not dissimilar to what they did with AIDS.

23:17.000 --> 23:21.000
Although with AIDS, a lot of times the test was an antibody,

23:21.000 --> 23:30.000
of an antibody that was indicated, right, by a test that they would make.

23:30.000 --> 23:44.000
Not through sequencing, not through high specificity. This is, this is like pull-down kind of things, you know, false positives are obviously possible.

23:44.000 --> 24:03.000
And so we're in the same place now, where they have used testing and diagnostics, and probably a few mass casualty events created in order to make sure, a few viral videos in order to make sure, that people really thought that worst case scenario was possible.

24:03.000 --> 24:21.000
And yes, I do believe that they could have used an infectious clone in order to achieve this, and it would have been very effective, and the people that would have used the patient samples that had the infectious clone in them would never have known.

24:21.000 --> 24:30.000
Wow, this, this virus grows well, it sure is cytopathic in all these places. It's not what happened.

24:31.000 --> 24:49.000
When they isolated the virus in Wuhan, or at least the paper that says they isolated a virus in Wuhan, it was like two wells out of a 96-well plate that actually showed the cytopathic, or whatever they call them, effects where they said there was a virus, and then they sequenced those two holes.

24:49.000 --> 24:59.000
And what they say they amplified was the whole genome of a coronavirus that they then published, I don't know when.

24:59.000 --> 25:09.000
They did a database and everybody found out that they deleted a database, and so obviously the Chinese must have done it or something.

25:09.000 --> 25:24.000
And this was all a big show to make sure that you would be more concerned about where the virus came from than what they were going to give you to cure it. It was a foregone conclusion that there was going to be a vaccine in that meeting with Donald Trump.

25:24.000 --> 25:29.000
It was going to take a while, but there was definitely going to be a vaccine.

25:29.000 --> 25:41.000
And in the meantime there was this bridge, as Mark pointed out, that would get us there, that was these therapeutics and these monoclonal antibodies and all this other stuff.

25:41.000 --> 25:56.000
I think they wanted another year or so of things, and it might have even been Warp Speed that sort of threw everything off, maybe. That could have been the unplanned part of the plan.

25:56.000 --> 26:06.000
And they barely were able to stretch it to where they got control of the White House again before that stuff already started. I don't know.

26:06.000 --> 26:19.000
It's a very curious set of observations that can be made if one keeps their eyes open now, and it's, it's really important that we do so as we move forward.

26:19.000 --> 26:29.000
Because there is still a united agreement that there was a mystery virus that can account for many or all of the excess deaths over the last three years.

26:29.000 --> 26:44.000
And this idea that there's a mystery virus that can account for these deaths, this is an idea that is shared by all of these people, by the Dark Horse podcast, by Paul Offit and TWiV.

26:44.000 --> 27:00.000
Everybody believes this. And so essentially we've made no progress. As I pointed out last night, these people even had Denis Rancourt fly across the world to Romania, make him feel like he was being heard.

27:00.000 --> 27:05.000
Put him on stage like they may be doing to me in South Dakota.

27:06.000 --> 27:08.000
I don't know.

27:08.000 --> 27:09.000
Say something else.

27:09.000 --> 27:12.000
I don't know what's going to go on, but I know what they did to Denis Rancourt.

27:12.000 --> 27:13.000
They called him over there.

27:13.000 --> 27:14.000
They flew him over there.

27:14.000 --> 27:16.000
They all made friends with him.

27:16.000 --> 27:22.000
They all came back and pretended to promote him, but actually they didn't promote him and his story at all.

27:22.000 --> 27:24.000
They promoted themselves.

27:24.000 --> 27:26.000
And they said, wow, we were right.

27:26.000 --> 27:30.000
Look, Denis Rancourt calculates 17 million people were hurt from the shot.

27:31.000 --> 27:40.000
What they don't say is that Denis Rancourt also says there's no evidence of the spreading pathogen. It looks like people were killed by protocols.

27:40.000 --> 27:42.000
They could say that.

27:42.000 --> 27:51.000
They met him in person in Romania. They listened to his talk in Romania. They said they were really impressed by his talk.

27:52.000 --> 28:02.000
But they must have all been simultaneously, like, texting each other about how good his talk was when he said that there was no evidence of a spreading pathogen.

28:02.000 --> 28:04.000
And that's why they missed it.

28:04.000 --> 28:12.000
I guess that's what happened. Otherwise I can't explain why they wouldn't have said that in at least Robert Malone's substack about meeting him,

28:12.000 --> 28:29.000
nor Jessica Rose's substack about having breakfast with him, and, and, and Brett Weinstein has covered it a couple times on his, on his live podcast, and nope, didn't say it at all.

28:29.000 --> 28:39.000
Didn't even come close to saying it. Never have these people come close to saying that there was really no evidence of spread of a particularly novel.

28:39.000 --> 28:44.000
That's a dangerous pathogen.

28:44.000 --> 28:54.000
Never have they pointed out that the fear, uncertainty and doubt and the confusion that started at the beginning of the pandemic might have already hurt people.

28:54.000 --> 29:07.000
They've never ever acknowledged what we have been talking about for the last month or so, which is that overuse of pulse oximeters and supplementary oxygen could have easily damaged a lot of people.

29:07.000 --> 29:18.000
It's a very easy way to get people on ventilators. It could have actually been the reason why they sent people home when they were kind of sick but not really sick.

29:18.000 --> 29:25.000
Don't come back until your pulse ox is such and such.

29:25.000 --> 29:33.000
Because then when you come back we can give you supplementary oxygen, and then we know where that's going to take us.

29:34.000 --> 29:46.000
Some people said that they were sending people home to save the hospital from being overloaded. Some people said that that was missing the possibility of treating people early.

29:46.000 --> 29:51.000
And so they were trying to make things worse.

29:52.000 --> 30:06.000
So that when people come into the hospital, the first thing they can do is start them with supplementary oxygen, and then all of the symptoms that happen as a result of the supplementary oxygen will necessarily be attributed to COVID, because that's the whole reason they're in there in the first place,

30:06.000 --> 30:10.000
and oxygen doesn't hurt people.

30:10.000 --> 30:13.000
This is a big deal.

30:13.000 --> 30:17.000
All the doctors that listen to this stream haven't said anything yet.

30:17.000 --> 30:21.000
And all the doctors that we've sent emails to haven't said anything yet.

30:21.000 --> 30:27.000
They could all add it to the list of reasons why we should be pissed, and they're not adding it.

30:27.000 --> 30:38.000
No one has added this to their list yet, and it's actually gigantic, because it's a very easy way to explain why anyone would get to the stage where they were ventilated,

30:38.000 --> 30:48.000
why anyone would get to the stage where they got worse when they went to the hospital but they weren't ventilated, if they were given supplementary oxygen even at a normal level,

30:48.000 --> 31:01.000
never mind at a level like Mark has described, with something like 10 or 20 or 30 liters a minute.

31:01.000 --> 31:14.000
The do-not-resuscitate orders in New York City are probably how they created much of this mess, because that's also how they got a lot of people that died of heart attacks and called it COVID, because they still got to bill those deaths.

31:14.000 --> 31:21.000
COVID deaths are COVID deaths, even if they died in someone's house and you just brought them to the morgue.

31:21.000 --> 31:30.000
Ventilators were probably misused, but the question really is how did they get people on the ventilators. I think they got them on the ventilators by hurting them with supplementary oxygen.

31:30.000 --> 31:36.000
I think this is a really great observation by Jessica and Mark and others.

31:36.000 --> 31:41.000
I think it's striking how few people will just say, oh yeah, that's terrible.

31:41.000 --> 31:45.000
Thomas Binder said it immediately in my interview with him.

31:46.000 --> 32:04.000
Is it really noticed? I mean, the only antibiotic that was promoted by, by the American, or America's Frontline Doctors, was azithromycin, a Zithromax, an azithromycin, and that's a product by Pfizer.

32:04.000 --> 32:07.000
So they didn't say antibiotics.

32:07.000 --> 32:13.000
They stated a very specific product plus hydroxychloroquine.

32:13.000 --> 32:19.000
They had to say zinc all the time, but that's really probably the most important thing.

32:19.000 --> 32:34.000
And if you look at one of the last videos of Zev Zelenko, you will find him in his car saying that it's like a national security secret that any zinc ionophore and supplementary zinc is a solution to all RNA viruses.

32:38.000 --> 32:47.000
And so when they said hydroxychloroquine is in, and then they said azithromycin and she yelled it into the mic,

32:47.000 --> 32:53.000
Dr. Immanuel, it's a very, it's not a random choice.

32:53.000 --> 33:09.000
We have been played, ladies and gentlemen, by lots of people. Unfortunately, the poor use of steroids is under-spoken about. And that's also a very easy way to bring people out of the spiral of a, of a cytokine storm.

33:09.000 --> 33:14.000
I told people on the stream that I might eat something just in case.

33:14.000 --> 33:20.000
Thank you.

33:21.000 --> 33:28.000
Remdesivir and midazolam might be one of those things where, if you just focus on those things, it's not that dangerous.

33:28.000 --> 33:38.000
Then those are narratives they can talk their way out of. Not everybody died of remdesivir. So it'll be a way that even doctors can talk their way out of it in their own minds.

33:38.000 --> 33:55.000
It's nearly as dangerous as the overuse of pulse oximeters and oxygen to people who were given it. And if they think back and realize that everybody was given it, we gave lots of people high flow nasal oxygen, even with a mask over their face or something like that, if they're

33:55.000 --> 34:01.000
aware of the acute toxic effects of such a treatment.

34:01.000 --> 34:11.000
When it's not, especially when it's not necessary, they may have treated those acute symptoms that resulted from that as part of the progression of COVID,

34:11.000 --> 34:23.000
just because they were panicked, just because they weren't thinking, they were exhausted, I don't know, because they watch TV skillfully.

34:23.000 --> 34:35.000
When we read the newspaper, that's one of the things that Mark did in his last show, he showed all kinds of newspaper articles that were all talking about pulse oximeters, and pulse ox for some reason as a primary indicator of disease progression.

34:35.000 --> 34:40.000
Why were all these newspaper articles written, for shit's sake?

34:40.000 --> 34:47.000
Why had it suddenly become the primary indicator in all these newspapers? How is that possible?

34:47.000 --> 34:54.000
Have we always used pulse ox to indicate people's progression in pneumonia before?

34:54.000 --> 35:04.000
And if so, why didn't we understand it in that way? Why did we give people oxygen, supplementary oxygen, the moment they went to below 95?

35:04.000 --> 35:13.000
Like it said in one of the articles that Mark showed, that below 95 pulse ox is like some kind of danger.

35:13.000 --> 35:18.000
And still the only person that's talking about opioid deaths, really, is Mark.

35:18.000 --> 35:24.000
I mean, I mention it, it's on my slide all the time, but, you know, besides me and Mark, who's mentioning it?

35:24.000 --> 35:32.000
Who understands that the all-cause mortality just had to increase, the excess deaths just had to increase?

35:32.000 --> 35:45.000
And then they had to be spectacularly committed to the lie that the only new deaths on the block are the deaths caused by the novel virus.

35:45.000 --> 36:02.000
The graph that we saw in Vincent Racaniello's lecture yesterday, where the dotted line of projected life expectancy was going to go down in 2020, that could have easily been from the opioid epidemic.

36:02.000 --> 36:06.000
It's as if Vincent Racaniello is completely unaware

36:07.000 --> 36:18.000
that in the last four years more than 500,000 people have died of opioid deaths. That just conveniently, I didn't realize that, sorry. Wow.

36:18.000 --> 36:27.000
I thought all those people died of COVID too, because there aren't even, that, you know, that's the problem.

36:28.000 --> 36:37.000
These numbers are just magical numbers that are based on what you call excess deaths. If you don't call 500,000 people dying of opioids

36:37.000 --> 36:48.000
excess deaths, or just normal, then the number of excess deaths in that year will be very different than if you call all those people excess deaths every year.

36:48.000 --> 36:52.000
Is there an acceptable number of people that die of opioid deaths?

36:52.000 --> 36:54.000
Overdose.

36:54.000 --> 37:01.000
Is there an acceptable number of people under 40 to die of that, just kind of tough shit?

37:01.000 --> 37:03.000
Some people think so.

37:03.000 --> 37:04.000
I don't.

37:04.000 --> 37:06.000
Some people think so.

37:06.000 --> 37:09.000
Some people think that it is kind of tough shit.

37:09.000 --> 37:13.000
They should know better, had better parents or something like that.

37:13.000 --> 37:19.000
It's not the way that works, especially when it's part of an elaborate plan

37:19.000 --> 37:30.000
to control-demolish America and do it in the guise of a pandemic and its response,

37:30.000 --> 37:32.000
so that we beg for it.

37:32.000 --> 37:40.000
Death certificate fraud, financial incentives, I mean, come on, they were, what is it, $32,000 a body.

37:41.000 --> 37:54.000
PCR fraud, lateral flow test fraud, sequencing fraud, it's all the same thing. We don't talk about this, you know, the Medicare costs that were saved. We don't talk about strict liability, seventh minimum violation. We don't talk about the epidemiological evidence of spread.

37:54.000 --> 38:05.000
So that's why I think the biology is the way out, and the other flip side is to understand the kinds of mythologies that they've been talking about behind the scenes for a really long time.

38:05.000 --> 38:16.000
And what I want you to do today.

38:16.000 --> 38:26.000
What I want you to do today is listen to this man by the name of Ray Kurzweil, and listen to him talk about the singularity and other things.

38:27.000 --> 38:37.000
He claims to have been the guy who, he claims to be, his claim to fame is that he's been working on artificial intelligence longer than anyone else alive.

38:37.000 --> 38:39.000
And he'll explain that.

38:39.000 --> 38:54.000
And he believes at some point very soon that the singularity between biology and machine is going to occur, and that this is going to have, you know, generation-spanning consequences for all of us.

38:54.000 --> 38:58.000
And so there's a lot of really crazy things that come out in this conversation.

38:58.000 --> 39:15.000
But I want to be sure that we are watching this movie with the same mindset, and the way that we should be watching this movie, excuse me, in my humble opinion, is that we should be watching it with the mindset that this is the kind of

39:16.000 --> 39:29.000
Let's say, for example, that a bunch of rich people were going to get together in the wine cellar of some rich person's house and they were going to, you know, have a candlelight dinner in this nice basement of a restaurant that somebody owns off of Sardinia.

39:29.000 --> 39:40.000
And it's, you know, a restaurant from the 1600s built into some old chapel, and there, you know, all these old friends are there and they're all billionaires and they're all hanging out.

39:41.000 --> 39:43.000
How they're going to do this.

39:43.000 --> 39:55.000
And then they have a couple smart people in the room, not just rich people, but also a couple smart people in the room that like to tell the big stories. And they're kind of like

39:55.000 --> 40:01.000
what's that physicist named, Tyson, is that his name?

40:02.000 --> 40:16.000
But, but they're there for the rich people. And I really think Ray Kurzweil is one of these guys who has been telling these stories to the richest of rich for a very long time, and that's why he behaves in the way that he does in the video.

40:17.000 --> 40:22.000
He's extremely confident of his correctness, his rightness, how smart he is.

40:22.000 --> 40:28.000
It's almost laughable that he takes the time to talk to the people in the room and to let this guy interview him.

40:28.000 --> 40:34.000
And in a lot of ways, actually, you might be annoyed by the interviewer or, or hear his skepticism.

40:34.000 --> 40:40.000
I'm going to leave it at 1.5 speed, because otherwise it'll take me way too long to get through it.

40:41.000 --> 40:53.000
And at that 1.5 speed, you might not be able to hear exactly how much skepticism the interviewer has, but actually the interviewer has a very healthy skepticism, and I actually appreciate it.

40:53.000 --> 40:57.000
But Ray Kurzweil is absolutely unfazed.

40:57.000 --> 41:04.000
And at some point in the video, he says that if people at least

41:04.000 --> 41:12.000
live five more years, they will live long enough to live 500 years if they are diligent.

41:12.000 --> 41:23.000
I shit you not. And this is a, I believe it's a talk from no less than, it could be less than three weeks ago.

41:23.000 --> 41:29.000
I guess I could go like this and see that I'll just get back up again.

41:29.000 --> 41:41.000
It's four weeks ago, so one month ago, in Portugal. Yes. Enjoy. Here we go.

41:43.000 --> 41:46.000
All right. I'm so excited to be here with you, Ray.

41:46.000 --> 41:48.000
Thank you. Thank you. I'd like to see everybody together.

41:48.000 --> 41:52.000
I don't know why it's so dim. Hold on a second.

41:52.000 --> 41:58.000
I'm going to figure that out. Let me get my head out of the way, make sure this volume is all the way up. It is.

41:58.000 --> 42:04.000
So that means I guess I got to turn this volume up. I didn't think it was so low before.

42:04.000 --> 42:08.000
Am I doing anything wrong here? It looks okay. I'm just going to jerk.

42:08.000 --> 42:16.000
You've been working in AI longer than any other human alive, which means if you live forever, and we'll get to that, you will always have that distinction.

42:16.000 --> 42:20.000
I think that's right. Marvin Minsky was actually my mentor.

42:20.000 --> 42:24.000
If he were alive today, he would actually be more than 61 years. I'm going to bring him back also.

42:24.000 --> 42:28.000
I'm not sure how we'll count the distinction then.

42:28.000 --> 42:32.000
All right. So we're going to fix the audio. But this is what we're going to do with this conversation.

42:32.000 --> 42:36.000
I'm going to start out asking Ray some questions about where we are today. We'll do that for a few minutes.

42:36.000 --> 42:40.000
Then we'll get into what has to happen to reach the singularity, so the next 20 years.

42:40.000 --> 42:43.000
Then we'll get into a discussion about what the singularity is, what it means, how it changes our lives.

42:43.000 --> 42:47.000
And then we'll talk a little bit about how, if we believe this vision of the future, what it means for us today.

42:47.000 --> 42:51.000
Ask your questions, put them in. I'll ask them as they go in the different sections of the conversation. But let's get cracking.

42:51.000 --> 42:55.000
Can you hear me? You can't hear Ray.

42:55.000 --> 43:01.000
Well, this will be recorded. You guys are going to all live forever. There'll be plenty of time. It will be fine.

43:01.000 --> 43:05.000
Oh, shoot.

43:05.000 --> 43:09.000
Oh, you know what this is going to be? I'm going to stop the YouTube stream.

43:09.000 --> 43:13.000
I don't know. YouTube's not going to like me doing this, so I'm just going to stop it.

43:13.000 --> 43:17.000
And that should help. And then I will...

43:17.000 --> 43:21.000
Too bad for YouTube people. They don't like me anyway.

43:21.000 --> 43:27.000
I'm going to pause it and I'm going to refresh this stream. I apologize for that, guys.

43:27.000 --> 43:31.000
I knew that that was going to cross up a little bit. I should have known better.

43:31.000 --> 43:35.000
I'll refresh this like that.

43:35.000 --> 43:39.000
And then that went back to my old five and five years.

43:39.000 --> 43:43.000
Okay, great. And I imagine all of you will be alive in five years.

43:43.000 --> 43:47.000
But it looks like it's maybe a year or two ahead of schedule.

43:47.000 --> 43:51.000
There we go.

43:51.000 --> 43:55.000
Audio engineers, are we good to go?

43:55.000 --> 43:59.000
We're good to go. All right. First question, Ray.

43:59.000 --> 44:03.000
So, you've been working in AI for 61 years. Can you hear me?

44:04.000 --> 44:07.000
Let's not... So everybody in the front can hear you, but nobody in the back can hear you.

44:07.000 --> 44:11.000
Can you hear me now?

44:11.000 --> 44:15.000
All right. I'll speak louder. First question.

44:15.000 --> 44:19.000
So, you've been living in the AI revolution for a long time. You've made lots of predictions, many of which have been remarkably accurate.

44:19.000 --> 44:23.000
We've all been living in a remarkable two-year transformation

44:23.000 --> 44:27.000
with large language models a year and a half. What has surprised you about the innovations in large language models

44:27.000 --> 44:31.000
and what has happened recently? Well, I did finish this book a year ago.

44:31.000 --> 44:35.000
I didn't really cover a large language model, so I delayed the book

44:35.000 --> 44:39.000
to cover that.

44:39.000 --> 44:43.000
I was expecting that to happen

44:43.000 --> 44:47.000
a couple of years later. I made a prediction in 1999

44:47.000 --> 44:51.000
that would happen by 2029. And we're not quite there yet.

44:51.000 --> 44:55.000
But it looks like it's maybe a year or two ahead of schedule.

44:55.000 --> 44:59.000
So that was maybe a bit of a problem.

44:59.000 --> 45:03.000
You predicted back in 1999 that computer passed the Turing test in 2029.

45:03.000 --> 45:07.000
Are you revising that to something more closer to today?

45:07.000 --> 45:11.000
No, I'm still saying 2029.

45:11.000 --> 45:15.000
The definition of the Turing test is not precise.

45:15.000 --> 45:19.000
We're going to have people claiming that Turing test has been solved.

45:19.000 --> 45:23.000
And people saying that GPT-4 actually passes it. Some people. So it's going to be like maybe two or three years

45:23.000 --> 45:27.000
where people start claiming and then they continue to claim and finally everybody will accept it.

45:27.000 --> 45:31.000
So it's not like it happens in one day. But you have a very specific definition of the Turing test.

45:31.000 --> 45:35.000
When do you think we'll pass that definition? Well, the Turing test is actually not that significant

45:35.000 --> 45:39.000
because that means that you can a computer will pass

45:39.000 --> 45:43.000
for a human being. And what's more important

45:43.000 --> 45:47.000
is AGI, automatic general intelligence, which means it can emulate any human being.

45:47.000 --> 45:51.000
Is it automatic general intelligence or is it

45:51.000 --> 45:55.000
artificial, right? You just made a mistake. A computer and it can do everything

45:55.000 --> 45:59.000
that any human being can do. And that's also 2029. It all happens at the same time.

45:59.000 --> 46:03.000
But nobody can do that. I mean, just take an average large language model today.

46:03.000 --> 46:07.000
You can ask it anything. And it will answer you pretty convincingly.

46:07.000 --> 46:11.000
No human being can do all that. And it does it very quickly. It will write a very nice essay

46:11.000 --> 46:15.000
in 15 seconds. And then you can ask it again and it will write another essay.

46:15.000 --> 46:19.000
And no human being can actually perform at that level. Right. So you have to dumb it down to actually have it

46:19.000 --> 46:23.000
have a convincing Turing test. So have a Turing test you have to dumb it down. Yeah. Let me ask the first question from the audience

46:23.000 --> 46:27.000
right relevant to where we are, which is Brian Daniel. Is the Kurzweil curve still accurate?

46:27.000 --> 46:33.000
Is the Kurzweil curve still accurate? Yes. Let's pull the slides up. First slide.

46:33.000 --> 46:37.000
So this is an 80-year track record. This is an exponential growth.

46:37.000 --> 46:41.000
A straight line on this curve means exponential curvature.

46:41.000 --> 46:45.000
If it was sort of exponential but not quite, it would curve.

46:45.000 --> 46:51.000
This is actually a straight line. It started out with a computer that did 0.000

46:51.000 --> 46:57.000
0.000 0.007 calculations per second per constant dollar. That's the lower left hand corner.

46:57.000 --> 47:03.000
At the upper right hand corner, it's 65 billion calculations per second for the same amount of money.

47:03.000 --> 47:06.000
So that's why large language models have only been feasible for two years.

47:06.000 --> 47:10.000
We actually had large language models before that.

47:10.000 --> 47:15.000
Okay. So did you pick up the first, I didn't see this the first time I watched the beginning.

47:15.000 --> 47:19.000
I only watched about the first 10 minutes. Did you pick up the first bamboozlement here?

47:19.000 --> 47:26.000
So this is per dollar. So this is not computer speed growing at this rate.

47:26.000 --> 47:36.000
This is computer speed per dollar, which is kind of comical given the fact that the dollar has inflated and deflated and changed over time.

47:36.000 --> 47:47.000
It's kind of funny, right? It's comical, actually, that such a graph can be thrown up on a screen and everybody just goes,

47:47.000 --> 47:50.000
oh, yeah, and swallows it whole.

47:50.000 --> 47:57.000
Think about what this really means. Think about what it really means, because it doesn't mean what he says it is, that is,

47:57.000 --> 48:04.000
I predicted this, which is what he says. This is an 80 year track record, he said.

48:04.000 --> 48:08.000
Did you hear that? 80 year track record, he said.

48:08.000 --> 48:12.000
Right, so you have to dumb it down to actually have it have a convincing Turing test.

48:12.000 --> 48:14.000
So to have a Turing test, you have to dumb it down. Yeah.

48:14.000 --> 48:17.000
Let me ask you the first question from the audience since I think it's quite relevant to where we are, which is Brian Daniel.

48:17.000 --> 48:21.000
Is the Kurzweil curve still accurate? Is the Kurzweil curve still accurate?

48:21.000 --> 48:26.000
Yes. Can I see that? Let's pull the slides up. First slide.

48:26.000 --> 48:30.000
So this is an 80 year track record. This is an exponential growth.

48:30.000 --> 48:34.000
A straight line on this curve means exponential curvature.

48:34.000 --> 48:39.000
If it was sort of exponential but not quite, it would curve. This is actually a straight line.

48:39.000 --> 48:48.000
It started out with a computer that did 0.00007 calculations per second per constant dollar.

48:48.000 --> 48:55.000
That's the lower left hand corner. At the upper right hand corner, it's 65 billion calculations per second for the same amount of money.

48:55.000 --> 48:58.000
So that's why large language models have only been feasible for two years.

48:58.000 --> 49:01.000
We actually had large language models before that, but it didn't work very well.

49:01.000 --> 49:08.000
|
||
|
And this is an exponential curve. Technology moves in an exponential curve. We see that, for example.
|
||
|
|
||
|
49:08.000 --> 49:20.000
|
||
|
So now what's going to happen, of course, is that the interviewer is going to say, okay, so it just means that we have to just get more computing power and they have to get faster and then all these things happen.
|
||
|
|
||
|
49:20.000 --> 49:24.000
|
||
|
It has nothing to do with our understanding increasing.
|
||
|
|
||
|
49:24.000 --> 49:31.000
|
||
|
And that's actually a very good insight to how this bamboozlement has been pulled over our eyes.
|
||
|
|
||
|
49:31.000 --> 49:41.000
|
||
|
Having renewable energy come from the sun and wind, that's actually an exponential curve. It's increased. It's gone.
|
||
|
|
||
|
49:41.000 --> 49:49.000
|
||
|
He just said that solar and wind is also an exponential curve. It's not clear to me that wind is doing very good.
|
||
|
|
||
|
49:49.000 --> 49:51.000
|
||
|
Wow, that is spectacular.
|
||
|
|
||
|
49:51.000 --> 49:58.000
|
||
|
9.7%. We multiplied the amount of energy coming from solar energy a million fold.
|
||
|
|
||
|
49:58.000 --> 50:04.000
|
||
|
So this kind of curve really directs all kinds of technology.
|
||
|
|
||
|
50:04.000 --> 50:07.000
|
||
|
And this is the reason that we're making progress.
|
||
|
|
||
|
50:07.000 --> 50:12.000
|
||
|
We knew how to do large language models years ago, but were dependent on this curve.
|
||
|
|
||
|
50:12.000 --> 50:17.000
|
||
|
And it's pretty amazing. It started out increasing relay speeds, then vacuum tubes, then integrated circuits.
|
||
|
|
||
|
50:17.000 --> 50:22.000
|
||
|
And each year it makes the same amount of progress, approximately, regardless of where you are on this curve.
|
||
|
|
||
|
50:22.000 --> 50:29.000
|
||
|
We just added the last point. And it's, again, we basically multiply this by two every 1.4 years.
|
||
|
|
||
|
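NOTE
A rough sanity check of the arithmetic quoted above, sketched in Python. It assumes only the two endpoints read off the slide (0.00007 and 65 billion calculations per second per constant dollar) and the 80-year span; the 1.4-year doubling figure is the talk's own number, not something established here.
import math
start = 7e-5   # calc/s per constant dollar, lower left corner as quoted
end = 65e9     # calc/s per constant dollar, upper right corner as quoted
years = 80     # track record claimed in the talk
doublings = math.log2(end / start)   # about 49.7 doublings
print(years / doublings)             # about 1.6 years per doubling, near the quoted 1.4
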
50:29.000 --> 50:35.000
And this is the reason that computers are exciting, but it actually affects every type of technology.

50:35.000 --> 50:39.000
And we just added the last point like two weeks ago.

50:39.000 --> 50:43.000
All right, so let me ask you a question. You know, you wrote a book about how to build a mind.

50:43.000 --> 50:45.000
You know a lot about how the human mind is constructed.

50:45.000 --> 50:48.000
A lot of the progress in AI, AI systems are being built on what we understand about neural networks, right?

50:48.000 --> 50:51.000
So clearly our understanding of this helps with AI.

50:51.000 --> 50:55.000
In the last two years, by watching these large language models, have we learned anything new about our brains?

50:55.000 --> 50:57.000
Are we learning about the inside of our skulls as we do this?

50:57.000 --> 51:00.000
It really has to do with the amount of connections.

51:00.000 --> 51:03.000
The brain is actually organized fairly differently.

51:03.000 --> 51:06.000
The things near the eye, for example, deal with vision.

51:06.000 --> 51:10.000
And we have different ways of implementing different parts of the brain that remember different things.

51:10.000 --> 51:11.000
We actually don't need that.

51:11.000 --> 51:14.000
In a large language model, all the connections are the same.

51:14.000 --> 51:16.000
We have to get the connections up to a certain point.

51:16.000 --> 51:22.000
If it approximately matches what the brain does with about a trillion connections, it will perform kind of like the brain.

51:22.000 --> 51:24.000
We're kind of almost at that point.

51:24.000 --> 51:26.000
Wait, so you think GPT-4 is 400 billion?

51:26.000 --> 51:28.000
The next ones will be a trillion or more.

51:28.000 --> 51:32.000
So the construction of these models, they are more efficient in their construction than our brains are.

51:32.000 --> 51:37.000
We make them to be as efficient as possible, but it doesn't really matter how they're organized.

51:37.000 --> 51:44.000
And we can actually create certain software that will actually expand the amount of connections more for the same amount of computation.

51:44.000 --> 51:53.000
But it really has to do with how many connections a particular computer is responsible for.

51:53.000 --> 51:58.000
So as we approach AGI, we're not looking for a new understanding of how to make these machines more efficient.

51:58.000 --> 52:00.000
The transformer architecture was clearly very important.

52:00.000 --> 52:02.000
We can really just get there.

52:02.000 --> 52:05.000
Right, but the software and the learning is also important.

52:05.000 --> 52:09.000
You could have a trillion connections, but if you didn't have something to learn from, it wouldn't be very effective.

52:09.000 --> 52:12.000
So we actually have to be able to collect all this data.

52:12.000 --> 52:19.000
All the data, you have to be able to collect all the data. See, so we actually have to be able to collect all this data.

52:19.000 --> 52:20.000
There is a.

52:20.000 --> 52:25.000
A cute video of Jordan Peterson interviewing some.

52:26.000 --> 52:33.000
Entrepreneurial medical guy in Canada who's doing some imaging thing with AI software.

52:33.000 --> 52:39.000
And they wax intellectual about it for about 10 minutes and then the guy on the other side of the table just says to Jordan.

52:39.000 --> 52:42.000
Yeah, but I mean, we just had to teach it.

52:42.000 --> 52:52.000
We had to go basically like capture things, you know, and click on and teach it where the tumors were and what the tumors weren't until.

52:52.000 --> 52:57.000
Circle the tumor and circle the tumor thousands and thousands of times until the AI finally.

52:57.000 --> 53:03.000
Puts together a neural network that seems to recognize the same pattern that you've been teaching it.

53:03.000 --> 53:08.000
There's no rhyme or reason to it. It's brute force every time.

53:08.000 --> 53:15.000
And so the kind of breakthrough that he thinks is magically going to happen is not going to happen.

53:16.000 --> 53:26.000
And it's shocking to me that more people aren't aware of it. I understand that there are a lot of people who are, but it seems to me that even in this.

53:26.000 --> 53:33.000
This rudimentary presentation, there should be more people in the audience getting up and walking out or, you know.

53:33.000 --> 53:38.000
Afterward leaving with extreme skepticism.

53:38.000 --> 53:44.000
So we do it on the web and so on. I mean, we've been collecting stuff on the web for several decades.

53:44.000 --> 53:51.000
That's really what we're depending on to be able to train these large language models.

53:51.000 --> 53:55.000
And we shouldn't actually call them large language models because they deal with much more than language.

53:55.000 --> 54:03.000
I mean, it's language, but you can add pictures. You can add things that affect disease that have nothing to do with language.

54:03.000 --> 54:11.000
In fact, we're using now simulated biology to be able to simulate different ways to affect disease.

54:11.000 --> 54:17.000
And that's nothing to do with language. But they really should be called large event models.

54:17.000 --> 54:22.000
Do you think there's anything that happens inside of our brains that cannot be captured by a computation and by math?

54:22.000 --> 54:25.000
No, I mean, what would that be?

54:25.000 --> 54:30.000
Okay, quick poll of the audience. Raise your hand if you think there's something in your brain that cannot be captured by a computation or math.

54:30.000 --> 54:33.000
Like a soul. All right, so convince them that they're wrong, right?

54:33.000 --> 54:37.000
I mean, consciousness is very important, but it's actually not scientific.

54:37.000 --> 54:42.000
There's no way I could slide somebody in and the light will go on. Oh, this one's conscious. No, this one's not.

54:42.000 --> 54:47.000
It's not scientific, but it's actually extremely important.

54:47.000 --> 54:56.000
I think that that's actually, you know, I'm an armchair neurobiologist at this stage and I'm just going to throw my own personal opinion out there.

54:56.000 --> 55:04.000
For me, it's not very hard to draw the line between conscious and unconscious and not conscious.

55:05.000 --> 55:25.000
Any animal that can make some rudimentary avoidance of pain has a certain level of consciousness, but the consciousness that he's talking about where one tries to understand or empathize with other beings.

55:25.000 --> 55:29.000
That level of consciousness is definitely not unique to us.

55:29.000 --> 55:39.000
Anybody that has a dog knows that dogs are trying to figure out how we feel and responding to it and respond differently depending on how we feel.

55:39.000 --> 55:45.000
If anybody can tell whether I'm angry when I come into the room, it's definitely Ruby.

55:45.000 --> 56:08.000
And so we're dealing with a scenario here where this is purposefully confusing. That's part of the sort of mental gymnastics that are done by somebody like myself who does 20 years of work on mouse lines, where you have to sacrifice an animal every day.

56:08.000 --> 56:16.000
And we even use words like sacrifice instead of kill or dissect.

56:16.000 --> 56:21.000
And even sacrifice is kind of a weird word if you think about it.

56:21.000 --> 56:28.000
But I had to use an animal every day that I wanted to make these recordings to look at the way these neurons are connected to study the synapses live.

56:29.000 --> 56:41.000
I took my work seriously after I did that because there was an animal that I used that day. And I got really pissed if I didn't get data after using an animal like that.

56:41.000 --> 56:54.000
But every day that I did it, it weighed on me and made an impression on me, as something that I had better have a good reason for doing, and I thought I did.

56:54.000 --> 57:02.000
But in retrospect, when I look back on it, I still kind of think I do because, again, the level of consciousness is something we should talk about.

57:02.000 --> 57:10.000
And I think it is worth thinking about. But the idea that a mouse isn't conscious is just ridiculous.

57:10.000 --> 57:21.000
It might have a limited spectrum of consciousness. It might have a limited spectrum of experience that might vary little compared to how ours does.

57:21.000 --> 57:28.000
It might have very few facets relative to what we feel ours has.

57:28.000 --> 57:33.000
But in reality, we can't ever know that.

57:33.000 --> 57:40.000
All this talk that consciousness isn't scientific: investigating it is probably very hard.

57:40.000 --> 57:50.000
But thinking that we don't understand what consciousness is or what it might not be is ridiculous.

57:50.000 --> 57:57.000
I mean, even fish can be trained to respond to light and respond to sounds. And they remember certain people.

57:57.000 --> 58:03.000
It's very hard to say that a fish is not conscious. It's not super smart.

58:03.000 --> 58:13.000
It's not having a terribly traumatic experience if one day versus another day happens because I don't know how to really adequately express it.

58:13.000 --> 58:16.000
But what we're dealing with here is bamboozlement.

58:16.000 --> 58:28.000
It is absolutely discarding the sacred aspect of everything alive on earth and trying to make it easier for you to discount the vast majority of life as being separate from us.

58:28.000 --> 58:41.000
Instead of acknowledging the hard truth that the more you're around animals, if you talk to anybody that loves animals, anybody that works with animals, you're never going to get a dairy farmer to tell you that his cows aren't conscious.

58:41.000 --> 58:55.000
You're never going to get a goat farmer to tell you his goats aren't conscious. You're never going to get a dog enthusiast or a cat enthusiast to tell you that their Maine Coons aren't conscious.

58:55.000 --> 59:05.000
And yet here he is trying to say that we can't really think about it or study it or whatever. It's a light bulb. It's either on or it's not, and it's on in a lot of things on earth.

59:05.000 --> 59:09.000
Let's just face it.

59:09.000 --> 59:19.000
And we have sort of a moral obligation to acknowledge that, and I think that most people do in their daily lives, but we don't as we should.

59:19.000 --> 59:27.000
I'm not arguing for veganism or anything like that. I am arguing for humanely treating animals that we're eventually going to eat, of course.

59:27.000 --> 59:30.000
But they're conscious.

59:30.000 --> 59:33.000
It's extraordinary, these people.

59:33.000 --> 59:40.000
And another question: why am I me? How come what happens to me, I'm conscious of, and I'm not conscious of what happens to you?

59:40.000 --> 59:44.000
These are deeply mysterious things but they're really not scientific.

59:44.000 --> 59:52.000
So Marvin Minsky, who was my mentor for 50 years, he said it's not scientific and therefore we shouldn't bother with it, and any discussion of consciousness he would kind of dismiss.

59:52.000 --> 59:57.000
But he actually did; his reaction to people was totally dependent on whether he felt they were conscious or not.

59:57.000 --> 01:00:05.000
So he actually did use that, but it's not something that we're ignoring because there's no way to tell whether something's conscious.

01:00:05.000 --> 01:00:10.000
And it's so weird because that's ridiculous. There's no way to tell if something's conscious.

01:00:10.000 --> 01:00:16.000
I think what he's saying is there's no way to tell if some particular thought is conscious or not.

01:00:16.000 --> 01:00:31.000
Because now the interviewer is going to say something a little bit ridiculous and that's going to throw everything off, and then actually Kurzweil is going to turn around and make it about AI again and linking people and making.

01:00:31.000 --> 01:00:42.000
And so it's very frustrating. Of course it's very frustrating. I find it also very frustrating because at the heart of consciousness is this sacredness of creation.

01:00:42.000 --> 01:00:49.000
Being able to humbly acknowledge that we aren't the only conscious beings on earth might be a good step in the right direction.

01:00:49.000 --> 01:00:55.000
And it's certainly not a step that these people want us to take.

01:00:55.000 --> 01:01:00.000
Unless it gets us to stop eating meat, I guess; then that would be okay.

01:01:00.000 --> 01:01:04.000
And then we don't know what we'll discover. There's really no way to tell whether or not something's conscious.

01:01:04.000 --> 01:01:07.000
This is not conscious and the jealousy right there is conscious.

01:01:07.000 --> 01:01:11.000
How do you prove that?

01:01:11.000 --> 01:01:17.000
I mean we kind of agree with humans. The humans are conscious. Some humans are conscious.

01:01:17.000 --> 01:01:23.000
But how about animals? We have a big disagreement. Some people say animals are not conscious.

01:01:23.000 --> 01:01:26.000
A lot of people think animals are conscious. Maybe some animals are conscious and others are not.

01:01:26.000 --> 01:01:28.000
There's no way to prove that.

01:01:29.000 --> 01:01:37.000
So we have an infinite spectrum of gender but we can't have a spectrum of consciousness at all. He's that dumb.

01:01:37.000 --> 01:01:40.000
He's that kind of compartmentalized.

01:01:40.000 --> 01:01:45.000
I'm sure he accepts the spectrum of gender, right.

01:01:45.000 --> 01:01:57.000
So in his infinite wisdom, where he's been right for 80 years, this old fart can't come up with the idea that maybe there's a spectrum of consciousness and it includes a lot of different animals.

01:01:57.000 --> 01:02:05.000
And it definitely includes animals that experience fear or are motivated by hunger and pain.

01:02:05.000 --> 01:02:21.000
Like, come on. I mean it's just laughable that he spits these kinds of words out in front of this audience that can't rebut him, in front of a guy who apparently can't quite grasp how bad it is with this guy saying.

01:02:21.000 --> 01:02:30.000
But it's just awful. It's just awful nonsense and it's disrespectful to everything that we should have reverence for.

01:02:30.000 --> 01:02:35.000
I want to run down this consciousness question but before we do that I want to make sure I understood your previous answer correctly.

01:02:35.000 --> 01:02:44.000
So the feeling I get with being in love, or any emotion that I get, could eventually be represented in math in a large language model.

01:02:44.000 --> 01:02:54.000
Certainly the behavior, the feelings that you have if you're with somebody that you love is definitely dependent on what the connections do; you can tell whether or not that's happening.

01:02:54.000 --> 01:02:57.000
All right, I'm back.

01:02:57.000 --> 01:03:13.000
Is everybody here convinced? So this is, of course, no, we're not convinced. Of course he's waving his hand at the fact that we can see a signal in an MRI when somebody sees a picture of their spouse or a picture of somebody they're just sexually attracted to. That may or may not be the case.

01:03:14.000 --> 01:03:21.000
But if that is the case, it's because we train the equipment to be able to measure this person over time.

01:03:21.000 --> 01:03:31.000
We have a zero baseline, we do that, we show them these pictures over and over and over again, and we take those signals and we average them across time and across exposures.

01:03:31.000 --> 01:03:42.000
And then if we see a significant difference we say, see, there's a significant signal in this brain region or that brain region in this person when they see their spouse or when they see someone they're just sexually attracted to.

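NOTE
A minimal sketch of the kind of averaging-and-thresholding procedure the commentary describes, for illustration only. The array names, shapes, and numbers here are made up; real fMRI pipelines are far more involved than this.
import numpy as np
from scipy import stats
rng = np.random.default_rng(0)
# signal: hypothetical responses from one region, time-locked to repeated picture exposures
# baseline: hypothetical matched rest (zero-baseline) periods
signal = rng.normal(0.3, 1.0, size=(200, 20))
baseline = rng.normal(0.0, 1.0, size=(200, 20))
evoked = signal.mean(axis=1)     # average across time within each exposure
rest = baseline.mean(axis=1)
t, p = stats.ttest_rel(evoked, rest)   # significant difference across exposures?
print(t, p)   # "see, there's a significant signal in this brain region"
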
01:03:42.000 --> 01:03:50.000
And then he equates that as, sure, yeah, that's no problem, we could definitely make a model of that.

01:03:50.000 --> 01:04:03.000
And what is that the equivalent of? That's the equivalent of having an infrared camera pointed at your car that's running at full speed in park in your driveway and then saying, yeah, see, it's getting hot right over there.

01:04:04.000 --> 01:04:13.000
And so we can pretty much be sure that, you know, we can make a model of that, we know how the car works now.

01:04:13.000 --> 01:04:23.000
And that's the resolution too, right? Oh yeah, this left quarter panel's getting warmer than the right quarter panel, so most of the power must be coming from this side.

01:04:23.000 --> 01:04:31.000
There's also a lot of heat coming out of the back, so maybe that's thrust. Like, it's just ridiculous.

01:04:31.000 --> 01:04:40.000
And anytime somebody who's not a neuroscientist talks about what we know about the brain, oftentimes I just get irate, so you're going to have to apologize.

01:04:40.000 --> 01:04:44.000
I'm going to have to apologize ahead of time if I do become irate.

01:04:44.000 --> 01:04:45.000
Not entirely.

01:04:45.000 --> 01:04:48.000
All right, well, close enough. So you don't think that it's worth trying to define?

01:04:48.000 --> 01:04:55.000
You spend a fair amount of your book giving different arguments about what consciousness means but it seems like you're arguing on stage that we shouldn't try to define it.

01:04:55.000 --> 01:04:58.000
There's no way to actually prove it. I mean we have certain agreements.

01:04:58.000 --> 01:05:04.000
I agree that all of you are conscious. You actually made it into this room, so that's a pretty good indication that you're conscious.

01:05:04.000 --> 01:05:06.000
But that's not a proof.

01:05:06.000 --> 01:05:10.000
And there may be human beings that don't seem quite conscious at the time.

01:05:10.000 --> 01:05:12.000
Are they conscious or not? And animals?

01:05:12.000 --> 01:05:16.000
What are we talking about? We're not conscious when we're sleeping.

01:05:16.000 --> 01:05:19.000
This is just foolishness.

01:05:19.000 --> 01:05:22.000
It's just foolish talk, it really is.

01:05:22.000 --> 01:05:28.000
To a neuroscientist, consciousness is differentiated from sleep.

01:05:28.000 --> 01:05:32.000
It's just silly talk. This is just silly talk.

01:05:32.000 --> 01:05:40.000
That's part of the reason why I find it so frustrating. It's just silly talk and it's contorting a language that we already understand.

01:05:40.000 --> 01:05:46.000
We already understand what consciousness is. We already understand that it's different than sleep.

01:05:46.000 --> 01:05:52.000
So you can basically draw a line somewhere between animals that sleep and don't sleep in terms of defining consciousness.

01:05:52.000 --> 01:05:56.000
Now, the depth of consciousness, the depth of perception, the depth of understanding.

01:05:56.000 --> 01:06:02.000
These are all things that we can talk about, discuss, philosophize about, maybe even attempt to measure or make a model about.

01:06:02.000 --> 01:06:05.000
But what he's talking about right now is pure bullshit.

01:06:05.000 --> 01:06:09.000
I think elephants and whales are conscious but not everybody agrees with that.

01:06:09.000 --> 01:06:10.000
So at what point can we.

01:06:10.000 --> 01:06:15.000
Who doesn't agree with elephants being conscious when they mourn their dead?

01:06:16.000 --> 01:06:22.000
Who doesn't agree with whales being conscious when they can remember each other?

01:06:22.000 --> 01:06:25.000
I mean.

01:06:31.000 --> 01:06:39.000
It's very frustrating because this is really just desecrating our understanding of biology, insulting our understanding of biology.

01:06:39.000 --> 01:06:42.000
The things that we do know and do understand.

01:06:42.000 --> 01:06:49.000
They aren't very deep and it ain't very much, but we definitely know this.

01:06:49.000 --> 01:06:57.000
And essentially how long will it be until we can essentially download the entire contents of your brain and express it through some kind of a machine?

01:06:57.000 --> 01:07:01.000
That's actually an important question because we're going to talk about longevity.

01:07:01.000 --> 01:07:04.000
We're going to get to a point where we have longevity escape velocity.

01:07:04.000 --> 01:07:05.000
It's not that far away.

01:07:05.000 --> 01:07:08.000
I think if you're diligent you'll be able to achieve that by 2029.

01:07:08.000 --> 01:07:10.000
That's only five or six years from now.

01:07:10.000 --> 01:07:14.000
So right now you go through here, use up a year of your longevity.

01:07:14.000 --> 01:07:17.000
But you get back from scientific progress right now about four months.

01:07:17.000 --> 01:07:19.000
But scientific progress is on an exponential curve.

01:07:19.000 --> 01:07:21.000
It's going to speed up every year.

01:07:21.000 --> 01:07:26.000
And by 2029, if you're diligent, you'll use up a year of your longevity with the year passing, but you'll get back a full year.

01:07:26.000 --> 01:07:28.000
And past 2029 you'll get back more than a year.

01:07:28.000 --> 01:07:30.000
So you'll actually go backwards in time.

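NOTE
The arithmetic behind the "longevity escape velocity" claim being quoted, made concrete. The growth rate below is an assumption chosen only so that the four months returned per year quoted for today reaches a full twelve months around 2029; it is the speaker's premise, not an established fact.
months_back = 4.0        # months of longevity "returned" per year lived, as quoted for today
year = 2024
while months_back < 12.0:    # escape velocity: a full year returned per year lived
    year += 1
    months_back *= 1.25      # assumed ~25%/year growth, tuned to hit 12 months around 2029
print(year, round(months_back, 1))   # prints 2029 and ~12.2 under these assumptions
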
01:07:31.000 --> 01:07:40.000
Now that's not a guarantee of infinite life, because you could have a ten year old and compute his longevity as many, many decades, and he can die tomorrow.

01:07:40.000 --> 01:07:47.000
But what's important about actually capturing everything in your brain, we can't do that today and we won't be able to do that in five years.

01:07:47.000 --> 01:07:50.000
But you will be able to do that by the singularity, which is 2045.

01:07:50.000 --> 01:07:54.000
What does it say about a man if he wears a gold Mickey Mouse watch?

01:08:02.000 --> 01:08:10.000
By 2029 you will be able, you basically won't age anymore.

01:08:10.000 --> 01:08:13.000
I don't even understand what this is.

01:08:13.000 --> 01:08:15.000
It's just so, so dumb.

01:08:15.000 --> 01:08:19.000
And so at that point you can actually go inside the brain and capture everything in there.

01:08:19.000 --> 01:08:24.000
Now you're thinking it's going to be a combination of capture everything in there.

01:08:24.000 --> 01:08:31.000
By 2045 they think that they can just go into your brain and scan through it and capture everything that's in there.

01:08:31.000 --> 01:08:51.000
If the computer is on, do you think there would be a way for you to scan everything that was in a computer and get it all out?

01:08:51.000 --> 01:08:54.000
You know, just how long would that take?

01:08:55.000 --> 01:09:00.000
And with a computer, right, you know, you have a particular interface that's built for that.

01:09:00.000 --> 01:09:02.000
What interface are they going to use for that?

01:09:02.000 --> 01:09:09.000
They're just going to scan through the brain with some kind of non-invasive super camera.

01:09:09.000 --> 01:09:12.000
I mean, these are just, these are just bullshit stories.

01:09:12.000 --> 01:09:18.000
It's just absolutely absurd that he would say something like that, that you're going to be able to capture everything that's in there.

01:09:19.000 --> 01:09:23.000
So do we use all of our brain or only 10% of it?

01:09:23.000 --> 01:09:28.000
How are you going to pull the 10% of the brain that we use out of the 90% that we don't?

01:09:28.000 --> 01:09:36.000
How will you know the difference? Can you see it?

01:09:36.000 --> 01:09:42.000
Don't you understand how just ridiculous it is that these are the kinds of people that have gotten us into the pandemic?

01:09:42.000 --> 01:09:45.000
These are the kinds of people.

01:09:45.000 --> 01:09:58.000
You think Rochelle Walensky doesn't look up to this guy, that Tony Fauci doesn't look up to this guy, that Kevin Esvelt doesn't look up to this guy, that all these transhumanist freaks don't look up to a guy like this?

01:09:58.000 --> 01:10:06.000
I would be willing to bet that Elon Musk would cite this guy as one of the visionaries that he looks up to.

01:10:06.000 --> 01:10:08.000
I bet dollars to donuts on it.

01:10:08.000 --> 01:10:15.000
The amount you get from computation, which will add to your thinking, and that's automatically captured.

01:10:15.000 --> 01:10:20.000
I mean, right now, anything that you have in a computer is automatically captured today.

01:10:20.000 --> 01:10:25.000
And the kind of additional thinking we will have by adding to our brain, that will be captured.

01:10:25.000 --> 01:10:32.000
But the connections that we have in the brain that we start with, we'll still have that.

01:10:32.000 --> 01:10:35.000
That's not captured today, but that will be captured in 2045.

01:10:35.000 --> 01:10:37.000
We'll go inside the brain and capture that as well.

01:10:37.000 --> 01:10:41.000
And therefore, we'll actually capture the entire brain, which will be backed up.

01:10:41.000 --> 01:10:48.000
So even if you get wiped out, you walk into a bomb and it explodes, we can actually recreate everything that was in your brain by 2045.

01:10:48.000 --> 01:10:52.000
It's one of the implications of the singularity.

01:10:52.000 --> 01:11:02.000
Now, that doesn't absolutely guarantee it, because the world could blow up and all the computers, all the things that contain computers, could blow up.

01:11:02.000 --> 01:11:06.000
So you wouldn't be able to recreate that.

01:11:06.000 --> 01:11:10.000
We never actually get to a point where we absolutely guarantee that you live forever.

01:11:10.000 --> 01:11:16.000
But most of the things that right now would upset capturing that will be overcome by that time.

01:11:16.000 --> 01:11:18.000
There's a lot there, right?

01:11:18.000 --> 01:11:22.000
Pamela says there won't be a malware-free version of Windows by 2029.

01:11:22.000 --> 01:11:25.000
Holy balls, is that dead on accurate, man.

01:11:25.000 --> 01:11:27.000
Let's start with escape velocity.

01:11:27.000 --> 01:11:32.000
So do you think that anybody in this audience, in their current biological body, will live to be 500 years old?

01:11:32.000 --> 01:11:33.000
You're missing me?

01:11:33.000 --> 01:11:34.000
Yeah.

01:11:34.000 --> 01:11:35.000
Absolutely.

01:11:35.000 --> 01:11:40.000
I mean, if you're going to be alive in five years, and I imagine all of you will be alive in five years.

01:11:40.000 --> 01:11:44.000
If they're alive for five years, they will likely live to be 500 years old.

01:11:44.000 --> 01:11:47.000
If they're diligent, and I think the people in this audience will be diligent.

01:11:47.000 --> 01:11:48.000
All right.

01:11:48.000 --> 01:11:52.000
Well, you can drink whatever you want as long as you don't get run over tonight because you don't have to worry about decline.

01:11:52.000 --> 01:11:53.000
All right.

01:11:53.000 --> 01:11:54.000
So let me ask you a question.

01:11:54.000 --> 01:11:58.000
Yeah, we're going to spend a lot of time on what the singularity is, what it means, and what it'll be like, but I want to ask some questions that'll lead us up there.

01:11:58.000 --> 01:12:00.000
So I'm going to take this question from Mark Sternberg and modify it slightly.

01:12:00.000 --> 01:12:06.000
In the timeframe where AI will be able to do, or sufficiently sophisticated computers in your argument can do, everything that the human brain can do.

01:12:06.000 --> 01:12:11.000
What will they not be able to do in the next 10 years?

01:12:11.000 --> 01:12:15.000
Well, one thing has to do with being creative.

01:12:15.000 --> 01:12:20.000
Some people say they'll be able to do everything that a human can do, but they're not going to be able to create new knowledge.

01:12:20.000 --> 01:12:24.000
That's actually wrong because we can simulate, for example, biology.

01:12:24.000 --> 01:12:29.000
And the Moderna vaccine, for example, we didn't do it the usual way, which is somebody sits down and thinks, well, I think this might work.

01:12:29.000 --> 01:12:32.000
And then they try it out and it takes years to try it out and multiple people.

01:12:32.000 --> 01:12:35.000
And it's one person's idea about what might work.

01:12:35.000 --> 01:12:37.000
They actually listed everything that might work.

01:12:37.000 --> 01:12:40.000
And there were actually several billion different mRNA sequences.

01:12:40.000 --> 01:12:41.000
And they said, let's try them all.

01:12:41.000 --> 01:12:44.000
And they tried every single one by simulating biology.

01:12:44.000 --> 01:12:45.000
And that took two days.

01:12:45.000 --> 01:12:48.000
So one weekend, they tried out several billion different possibilities.

01:12:48.000 --> 01:12:50.000
And then they picked the one that turned out to be the best.

01:12:50.000 --> 01:12:56.000
And that actually was the Moderna vaccine up until today.

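NOTE
Just to make the quoted numbers concrete: "several billion sequences in two days" implies the sustained evaluation rate computed below. The two billion figure is a stand-in for "several billion", not a number from the talk.
candidates = 2e9              # assumed stand-in for "several billion" sequences
seconds = 2 * 24 * 3600       # "that took two days"
print(candidates / seconds)   # roughly 11,600 candidate evaluations per second, sustained
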
01:12:56.000 --> 01:13:06.000
According to the lady that did the TED talk that I covered over three years ago or four years ago, whatever the hell her name was, Melissa something or other, that's a complete bullshit lie.

01:13:06.000 --> 01:13:10.000
They got the sequence and 45 minutes later, they knew what they were going to do.

01:13:10.000 --> 01:13:14.000
What in the absolute Sam shit hell is he saying right there?

01:13:14.000 --> 01:13:16.000
They tested a billion things.

01:13:16.000 --> 01:13:17.000
Did he really say that?

01:13:17.000 --> 01:13:22.000
What kind of absolute nonsense is that?

01:13:22.000 --> 01:13:27.000
Melissa Moore just said it was 45 minutes.

01:13:27.000 --> 01:13:28.000
And them all.

01:13:28.000 --> 01:13:30.000
And they tried every single one by simulating biology.

01:13:30.000 --> 01:13:32.000
And that took two days.

01:13:32.000 --> 01:13:35.000
So one weekend, they tried out several billion different possibilities.

01:13:35.000 --> 01:13:37.000
And then they picked the one that turned out to be the best.

01:13:37.000 --> 01:13:44.000
And that actually was the several billion possibilities of, I guess, a molecule.

01:13:44.000 --> 01:13:50.000
I mean, a protein, the spike protein or variants of it, variants of different.

01:13:50.000 --> 01:13:53.000
What, what possibly did they test?

01:13:53.000 --> 01:13:56.000
Please tell me that I'm wrong here. What's going on?

01:13:56.000 --> 01:13:59.000
Billion different what?

01:13:59.000 --> 01:14:02.000
Like codon optimization?

01:14:02.000 --> 01:14:12.000
What is he talking about? This is crazy that he would even say these words on stage out loud. What kind of bullshit is this?

01:14:12.000 --> 01:14:17.000
Moderna vaccine up until today.

01:14:17.000 --> 01:14:24.000
Now, they did actually test it on humans. We'll be able to overcome that as well because we'll be able to test using simulated biology as well.

01:14:24.000 --> 01:14:28.000
They actually decided to test it. It's a little bit hard to give up testing on humans. We will do that.

01:14:28.000 --> 01:14:31.000
So you can actually try out every single one, pick the best one.

01:14:31.000 --> 01:14:36.000
And then you can try out that by testing on a million simulated humans and do that in a few days as well.

01:14:36.000 --> 01:14:39.000
And that's actually the future of how we're going to create medications for diseases.

01:14:39.000 --> 01:14:43.000
And there's lots of things going on now with cancer and other diseases that are using that.

01:14:43.000 --> 01:14:47.000
So that's a whole new method. This is actually starting now.

01:14:47.000 --> 01:14:54.000
It started right with a regenerative vaccine. We did another cure for a mental disease.

01:14:54.000 --> 01:14:59.000
It's actually now in stage three trials. That's going to be how we create medications from now on.

01:14:59.000 --> 01:15:01.000
But what are the frontiers? What can we not do?

01:15:01.000 --> 01:15:06.000
So that's where the computer is being creative. And it's not just actually trying something that occurs to it.

01:15:06.000 --> 01:15:09.000
It makes a list of everything that's possible and tries it all.

01:15:09.000 --> 01:15:12.000
Is that creativity or is that just brute force with maximum capability?

01:15:12.000 --> 01:15:18.000
It's much better than any other form of creativity. And yes, it's creative because you're trying out every single possibility.

01:15:18.000 --> 01:15:21.000
And you're doing it very quickly. And you come up with something that we didn't have before.

01:15:21.000 --> 01:15:23.000
I mean, what else would creativity be?

01:15:23.000 --> 01:15:26.000
All right. So we're going to cross the frontier of creativity. What will we not cross?

01:15:26.000 --> 01:15:28.000
What are the challenges that would be outstanding in the next ten years?

01:15:28.000 --> 01:15:30.000
Well, we don't know everything. And we haven't gone through it.

01:15:30.000 --> 01:15:39.000
I mean, what's so bullshit about it is that there isn't an infinite design space that needs to be tested.

01:15:39.000 --> 01:15:45.000
You can't brute force reality like you brute force chess.

01:15:45.000 --> 01:15:55.000
You can't brute force reality like you brute force MRI, and not MRI, but x-rays for tumors.

01:15:55.000 --> 01:15:59.000
You can't brute force reality.

01:15:59.000 --> 01:16:05.000
And basically that's what he's talking about here, right? You just test all the possibilities and find all the good shit.

01:16:05.000 --> 01:16:20.000
Like, how silly this is. This is really the way that they have been pushing and using these weaponized piles of money to convince people to get on board with all this other nonsense like DEI.

01:16:20.000 --> 01:16:28.000
Because the singularity's coming. And if you're not with us, you're going to get left behind.

01:16:28.000 --> 01:16:32.000
You better start climbing that ladder now.

01:16:32.000 --> 01:16:35.000
Who knows how high the ladder goes?

01:16:35.000 --> 01:16:47.000
Who knows how high you can climb? Maybe you can be giving presentations all around the world for a few decades about bullshit that's going to happen in the future and claim that

01:16:47.000 --> 01:16:54.000
you've been right for 80 years because of this curve you draw.

01:16:54.000 --> 01:17:09.000
It's the same nonsense as the cheaper and cheaper and cheaper to sequence DNA somehow being related to us being able to eventually understand it and manipulate it successfully in a healthy human. It's just absurd.

01:17:09.000 --> 01:17:19.000
They're not correlated at all. The two processes aren't related to one another. It's not progress towards an end point.

01:17:19.000 --> 01:17:30.000
And somehow or another he thinks that reality can be brute forced to replace creativity with AI, because AI can just try everything and come up with something new.

01:17:30.000 --> 01:17:45.000
I'm not saying that AI couldn't potentially be pointed in the right direction, directed in the right way, and then it would produce things that we didn't have, that may have taken longer to get to, or that we had to wait for someone to get that idea.

01:17:45.000 --> 01:18:00.000
But the way that he's describing it, and the way that I think the interviewer is absolutely suspicious of, is that brute force alone can achieve all of these things, because it can't.

01:18:00.000 --> 01:18:09.000
This process, it does require some creativity to imagine what might work, and we have to also be able to simulate it in a biochemical simulator.

01:18:09.000 --> 01:18:15.000
So we actually have to figure that out and we'll be using people for a while to do that. So we don't know everything.

01:18:15.000 --> 01:18:23.000
I mean, to be able to do everything a human being can do is one thing, but there's so much we don't know that we want to find out, and that requires creativity.

01:18:23.000 --> 01:18:29.000
That will require some kind of human creativity working with machines.

01:18:29.000 --> 01:18:34.000
Let's go back to what's going to happen to get us to the singularity. So clearly we have the chart that you showed on the power of compute.

01:18:34.000 --> 01:18:37.000
It's been very steady, moving straight up on a logarithmic scale in a straight line.

01:18:37.000 --> 01:18:40.000
There are a couple of other elements that you think are necessary to get to the singularity.

01:18:40.000 --> 01:18:46.000
One is the rise of nanobots and the other is the rise of brain-machine interfaces and both of those have gone more slowly than AI.

01:18:46.000 --> 01:18:55.000
So convince the audience. It would be slow because anytime you affect the human body, a lot of people get concerned about it.

01:18:55.000 --> 01:19:01.000
If we do something with computers, we have a new algorithm or we increase the speed of it.

01:19:01.000 --> 01:19:09.000
Nobody really is concerned about it. You can do that. Nobody cares about any dangers in it.

01:19:09.000 --> 01:19:11.000
I mean that's the reality.

01:19:11.000 --> 01:19:12.000
There's some dangers that people care about.

01:19:12.000 --> 01:19:16.000
Yeah, but it goes very, very quickly. That's one of the reasons it goes so fast.

01:19:16.000 --> 01:19:21.000
But if you're affecting the body, we have all kinds of concerns that it might affect it negatively.

01:19:21.000 --> 01:19:23.000
So we want to actually try it on people.

01:19:24.000 --> 01:19:28.000
Okay, so let's talk about these two things right away here.

01:19:28.000 --> 01:19:31.000
So I didn't really write much down, but I'm going to write this down a little bit.

01:19:31.000 --> 01:19:35.000
So, two things that he says are crucial to the singularity happening.

01:19:35.000 --> 01:19:37.000
Number one are nanobots.

01:19:37.000 --> 01:19:46.000
Please don't forget that Chan and Zuck were talking about this with Huberman.

01:19:47.000 --> 01:20:00.000
And that actually CZI is very interested in being a pioneer in these nanobots that are going to be like cells that wander around and come back and report all kinds of stuff.

01:20:00.000 --> 01:20:06.000
I'm not really sure what they have in mind here other than something that's completely imaginary.

01:20:07.000 --> 01:20:21.000
Something kind of like the movie from the 90s where you get a little spaceship and drive around after you're miniaturized, but then only robots, I guess, that all transmit their little data by 5G, and then now you can measure.

01:20:21.000 --> 01:20:29.000
I guess it's like, you know, temperature readers, but all over your body or something like that, that are reporting all the biochemical activity of that region.

01:20:29.000 --> 01:20:32.000
I don't know what kind of bullshit they're thinking of, but it's not going to happen.

01:20:32.000 --> 01:20:34.000
It can't happen.

01:20:34.000 --> 01:20:46.000
The only thing that can happen is a regular sampling of genetic data and regular tissue sampling from young children and from medical samples and medical remnants.

01:20:46.000 --> 01:20:50.000
That's all that they can do.

01:20:50.000 --> 01:21:01.000
And this idea that nanobots are eventually going to be in your brain and going to be able to interact with your brain and read and write in your brain, either with nanobots or a mind-brain interface.

01:21:01.000 --> 01:21:16.000
Please understand that if you can replace a joystick or a mouse with a brain-mind interface, you are not doing anything special. Okay. That's not special.

01:21:16.000 --> 01:21:28.000
That in no way, shape or form means that you can decode the neuronal signals. It in no way, shape or form means you can read thoughts or intrude upon them.

01:21:28.000 --> 01:21:38.000
And even if you can allow a quadriplegic to replace a joystick and interact with a mouse, even type on a keyboard.

01:21:38.000 --> 01:21:50.000
It doesn't mean that we are anywhere near understanding the way the brain works, the way the brain encodes anything, never mind being able to scan your brain in 2029 and get everything there.

01:21:50.000 --> 01:21:56.000
And then in 2045 do the same thing again and scan everything that came in since.

01:21:56.000 --> 01:21:59.000
But he just said we could.

01:22:00.000 --> 01:22:19.000
And so we are really, really in trouble, ladies and gentlemen, because this religion of transhumanism, this futurist mythology set, is really, really something very dangerous to our grandchildren, extremely dangerous to our grandchildren, and it's being

01:22:19.000 --> 01:22:34.000
hoisted upon them now under the guise of a pandemic and under the guise of a gain-of-function virus and under the guise of prion disease or the guise of a worst case scenario that never goes away.

01:22:34.000 --> 01:22:50.000
But in reality the worst case scenario is that these people are allowed to try and achieve these ridiculous and, essentially, these goals of abomination.

01:22:50.000 --> 01:22:52.000
It's what it is.

01:22:52.000 --> 01:23:04.000
Brain-machine interfaces haven't moved on an exponential curve, and it isn't just because, you know, lots of people are concerned about the risk to humans. I mean, as you explain in the book, they just don't work as well as they could.

01:23:04.000 --> 01:23:11.000
If we could try things out without having to test it, it would go a lot faster. I mean, that's the reason it goes slowly.

01:23:11.000 --> 01:23:23.000
But there's some thought now that we could actually figure out what's going on inside the brain and put things into the brain without actually going inside the brain; we wouldn't need something like Neuralink.

01:23:23.000 --> 01:23:30.000
We could just, I mean, there's some tests where we can actually tell what's going on in the brain without actually putting something inside the brain.

01:23:30.000 --> 01:23:34.000
And that might actually be a way to do this much more quickly.

01:23:34.000 --> 01:23:42.000
But your prediction about the singularity depends, maybe I'm reading it wrong, not just on the continued exponential growth in compute, but on solving this particular problem too, right?

01:23:42.000 --> 01:23:59.000
So on one hand, he believes that the connections in the brain are important, but on the other hand, he doesn't want to admit that. How would you be able to read those connections at a nano scale, at a microscopic scale, and then how would you be able to read those connections in quantity and quality?

01:24:00.000 --> 01:24:09.000
Because that's what would be necessary, right? It's not only the connections that are firing as a result of consciousness. Apparently, it's also the connections that underlie memory.

01:24:09.000 --> 01:24:14.000
So connections that are there, but aren't active electrically.

01:24:14.000 --> 01:24:29.000
So how would you read those connections without penetrating the skull with anything but energy, or whatever? I don't know, it's just such nonsense. It's so beyond possible.

01:24:30.000 --> 01:24:43.000
And there's no technology in current existence that would belie the suggestion that we would ever even have the resolution of a one millimeter by one millimeter resolution in the brain.

01:24:43.000 --> 01:24:55.000
Of reading anything. And one millimeter by one millimeter is an impossible amount of brain material to even really adequately characterize during a preparation.

01:24:55.000 --> 01:25:07.000
The number of synapses that are inside of it, the number of connections that are inside of it, the number of axons and connections from different neurons that go through just one millimeter by one millimeter of parts of the brain.

01:25:07.000 --> 01:25:24.000
And now he's claiming that we're going to have the resolution that's going to be so thorough and so fine that we're going to be able to actually capture the information that's present. What is it, just because we're going to take a really high resolution picture?

01:25:24.000 --> 01:25:36.000
And then what, we're going to take a lot of real high resolution pictures of a lot of different people and eventually the AI will figure out how to read a brain?

01:25:36.000 --> 01:25:45.000
Yes, because we want to increase the amount of intelligence that humans can command. So we have to be able to marry the best computers with our actual brain.

01:25:45.000 --> 01:25:49.000
And why do we have to do that? Because like right now, there you go. I have my phone. In some ways, this augments my intelligence.

01:25:49.000 --> 01:26:00.000
It's very slow. I mean, if I ask you a question, you're going to have to type it in or speak it and it takes a while. I mean, I ask the question and then people fool around with the computer. It might take 15 seconds or 30 seconds.

01:26:00.000 --> 01:26:07.000
It's not like it just goes right into your brain. I mean, these are very useful. These are brain extenders. We didn't have these a little while ago.

01:26:07.000 --> 01:26:15.000
I'm telling you, in my talks I ask people, who here has their phone? I'll bet here maybe there's one or two people, but everybody here has their phone.

01:26:15.000 --> 01:26:18.000
That wasn't true five years ago. Definitely wasn't true ten years ago.

01:26:18.000 --> 01:26:24.000
And it is a brain extender, but it does have some speed problems. So we want to increase that speed.

01:26:25.000 --> 01:26:32.000
The question could just come up where we're talking and the computer would instantly tell you what the answer is without having to fool around with an external device.

01:26:32.000 --> 01:26:38.000
And that's almost feasible today. And something like that would be helpful to do this.

01:26:38.000 --> 01:26:47.000
But that's still not what he's talking about at all. That's just a really fast computer, right? It's still just the joystick and the mouse and the keyboard replaced.

01:26:47.000 --> 01:26:52.000
So, you know, if you're reading those things, it can read that.

01:26:52.000 --> 01:27:07.000
You can wear something that would monitor your speech region with sufficient detail that, over time, by speaking out loud and having confirmation of what was being spoken,

01:27:08.000 --> 01:27:15.000
it could read your brain, understand what you were thinking in a certain context or a certain conscious state, and you could use that to interact with it.

01:27:15.000 --> 01:27:27.000
And so then, yes, you could be talking to somebody and at the same time that device could be reading what you're saying, what you're listening to, what you're hearing, and then bringing up relevant, interesting things, so that when you said,

01:27:27.000 --> 01:27:34.000
hey, what about this? It would already be there. Sure. Sure.

01:27:34.000 --> 01:27:47.000
Is that supremely augmenting the brain? What is that really? Where does that get us? It's still us thinking, we're still directing all the thoughts, it's just that we're served up the answers quicker when we want an answer that we can't come up with.

01:27:47.000 --> 01:27:54.000
Would that be better or worse for us as thinking beings?

01:27:55.000 --> 01:28:06.000
I think a lot of what has happened with technology is really good. I think that's why kids are better at sports than ever, because they can imagine it easier when they can see it over and over again on a video.

01:28:06.000 --> 01:28:13.000
And when we were kids, we had to watch basketball once or twice a week and that's all we could see other than the people that we played with.

01:28:13.000 --> 01:28:23.000
So it'd be pretty hard for all of these fancy moves to go around the earth back in the old days, but now a fancy move can go around the earth in a few weeks or a few months.

01:28:23.000 --> 01:28:32.000
Skateboard moves and skateboard tricks go around the earth that fast, just ask Rodney.

01:28:32.000 --> 01:28:42.000
And so we are dealing with enhancement. We are dealing with the effects of technology, but it's not the singularity. The singularity is something completely different.

01:28:42.000 --> 01:28:56.000
The singularity is very similar to Robert Malone saying that he thought when he was still a postdoc that within 10 years there would be a geneticist at every hospital curing childhood genetic disorders with retroviruses.

01:28:56.000 --> 01:29:02.000
That's why he decided to go that direction and into gene therapy.

01:29:02.000 --> 01:29:16.000
And we're still nowhere closer to using these things as medical tools to cure the rarest of diseases with the most horrible outcomes.

01:29:16.000 --> 01:29:24.000
So nothing about that premonition that Robert Malone had as a young man is true, none of it yet.

01:29:24.000 --> 01:29:35.000
And so you should see the transfection rollout of the pandemic as forcing the inevitable: we need to test this shit on people or we're never going to figure out how this works.

01:29:35.000 --> 01:29:46.000
We need people to accept the fact that they are the only test subjects from which we can answer these questions, which is exactly what he's saying here: if we didn't have to test this stuff it would be great.

01:29:46.000 --> 01:29:57.000
Really, what he really means is, if we could just use everybody as test subjects we'd figure this out very quick.

01:29:57.000 --> 01:30:13.000
But do you not get a lot of the good that you talk about if we just kept... The problem with connecting our brains to the machines is suddenly you're in this whole world of complicated privacy issues where stuff is being injected into my brain, stuff from my brain is going elsewhere. Like, you're opening up a whole host of ethical, moral, existential problems. Can't you just make the phones a lot better?

01:30:13.000 --> 01:30:24.000
Well, that's the idea, that we can do that without having to go inside your brain, but be able to tell what's going on in your brain externally, without going inside the brain with some kind of device.

01:30:24.000 --> 01:30:33.000
All right, well, let's keep moving into the future. So moving into the future, we have exponential growth in compute, we solve a way of, you know, ideally figuring out how to communicate directly with your brain to speed things up. Explain why nanobots are essential to your vision of where we're going.

01:30:34.000 --> 01:30:43.000
Well, if you really want to tell what's going on inside the brain, you've got to be able to go at the level of the particles in the brain so we can actually tell what they're doing.

01:30:43.000 --> 01:30:52.000
And that's feasible. We can't actually do it, but we can show that it's feasible, and that's one possibility.

01:30:52.000 --> 01:30:59.000
We're actually hoping that you could do this without actually affecting the brain at all.

01:31:00.000 --> 01:31:09.000
How in the hell are you going to measure the brain at the molecular level using nanobots? I guess we can show it's possible. What kind of bullshit line is that?

01:31:09.000 --> 01:31:17.000
How can it be possible that we are here right now, where people like this are on stage and able to pontificate so, so without reverence?

01:31:17.000 --> 01:31:21.000
He knows. He's got to know he's full of shit.

01:31:21.000 --> 01:31:32.000
Sorry I'm swearing so much, but I know that Eddie from Canada is not watching this month because of something I said in late March.
|
||
|
|
||
|
01:31:32.000 --> 01:31:39.000
|
||
|
And so he's really the only one ever complains about my swearing so since Eddie's not watching I'm swearing a little bit more than normal I apologize.
|
||
|
|
||
|
01:31:39.000 --> 01:31:43.000
|
||
|
They run around inside the brain their understanding had their extracting thoughts their inputting thoughts.
|
||
|
|
||
|
01:31:43.000 --> 01:31:46.000
|
||
|
Let's go to this nice question which is lovely from Louise Gondreva.
|
||
|
|
||
|
01:31:46.000 --> 01:31:52.000
|
||
|
What are the five main ethical questions that we will face as that happens?
|
||
|
|
||
|
01:31:52.000 --> 01:31:54.000
|
||
|
Is four enough?
|
||
|
|
||
|
01:31:54.000 --> 01:31:56.000
|
||
|
Four is fine.
|
||
|
|
||
|
01:31:56.000 --> 01:32:03.000
|
||
|
There might even be six right but you can give us four.
|
||
|
|
||
|
01:32:03.000 --> 01:32:14.000
|
||
|
I mean we're going to have a lot more power if we can actually with our own brain control computers does that give people too much power.
|
||
|
|
||
|
01:32:14.000 --> 01:32:24.000
|
||
|
Also I mean right now we talk about having a certain amount of value based on your on your talent.
|
||
|
|
||
|
01:32:24.000 --> 01:32:28.000
|
||
|
If you have a friend that can only move his face then neuro link is perfect for him.
|
||
|
|
||
|
01:32:28.000 --> 01:32:33.000
|
||
|
I don't I don't have any problem with that anymore than I have a I would never have a problem with somebody using.
|
||
|
|
||
|
01:32:33.000 --> 01:32:41.000
|
||
|
Transfection to treat cancer and to retrain tea cells or do whatever they do with transfection go for it.
|
||
|
|
||
|
01:32:41.000 --> 01:32:44.000
|
||
|
Especially if it's cancer they might kill you.
|
||
|
|
||
|
01:32:44.000 --> 01:32:47.000
|
||
|
That didn't respond to a metabolic type treatment.
|
||
|
|
||
|
01:32:47.000 --> 01:32:49.000
|
||
|
I mean all these things fine do it.
|
||
|
|
||
|
01:32:50.000 --> 01:33:01.000
|
||
|
And certainly if you're a paraplegic and and this is one of the only ways that you might be able to interact with a device but you don't need to implant it already we can do it without implanting.
|
||
|
|
||
|
01:33:01.000 --> 01:33:18.000
|
||
|
Already they could do it with without implanting I think and we have to be very careful about that because implants or implants right I don't think it is really bad idea to put something in somebody's brain and open the brain to infection.
|
||
|
|
||
|
01:33:18.000 --> 01:33:20.000
|
||
|
It's a very bad idea.
|
||
|
|
||
|
01:33:20.000 --> 01:33:35.000
|
||
|
This will give talent to people who otherwise don't have talent and talent won't be as important because you'll be able to gain talent just by merging with the right kind of large language model or whatever we call them.
|
||
|
|
||
|
01:33:35.000 --> 01:33:44.000
|
||
|
It's also same kind of arbitrary why we would give more power to somebody who has more talent because they didn't create that talent and just happen to have it.
|
||
|
|
||
|
01:33:44.000 --> 01:33:51.000
|
||
|
But everybody says we should give somebody who has talent in an area more power.
|
||
|
|
||
|
01:33:51.000 --> 01:33:54.000
|
||
|
This way you'd be able to gain talent.
|
||
|
|
||
|
01:33:54.000 --> 01:34:03.000
|
||
|
This is in the matrix if you wanted to fly an helicopter just by downloading the right software I suppose to spend a long time doing that.
|
||
|
|
||
|
01:34:03.000 --> 01:34:08.000
|
||
|
Is that fair or unfair.
|
||
|
|
||
|
01:34:08.000 --> 01:34:12.000
|
||
|
Wow that's really bullshit right I mean you heard it right.
|
||
|
|
||
|
01:34:12.000 --> 01:34:20.000
|
||
|
It was he really suggesting that people who have talent shouldn't be rewarded for it that there shouldn't be some kind of meritocracy for people who put in the work.
|
||
|
|
||
|
01:34:20.000 --> 01:34:23.000
|
||
|
The talent is not just born with it.
|
||
|
|
||
|
01:34:23.000 --> 01:34:26.000
|
||
|
What a jackass.
|
||
|
|
||
|
01:34:26.000 --> 01:34:30.000
|
||
|
What a jackass.
|
||
|
|
||
|
01:34:30.000 --> 01:34:36.000
|
||
|
And that's really very very extraordinary right I mean if you see it.
|
||
|
|
||
|
01:34:36.000 --> 01:34:41.000
|
||
|
Everybody says we should give I couldn't I can't believe it.
|
||
|
|
||
|
01:34:41.000 --> 01:34:49.000
|
||
|
Having a certain amount of value based on your on your talent.
|
||
|
|
||
|
01:34:49.000 --> 01:34:59.000
|
||
|
This will give talent to people who otherwise don't have talent and talent won't be as important because you'll be able to gain talent just by merging with the right kind of large language model.
|
||
|
|
||
|
01:34:59.000 --> 01:35:03.000
|
||
|
Or whatever we call them.
|
||
|
|
||
|
01:35:03.000 --> 01:35:10.000
|
||
|
And also same kind of arbitrary why we would give more power to somebody who has more talent because they didn't create that talent.
|
||
|
|
||
|
01:35:10.000 --> 01:35:20.000
|
||
|
Wow that's just disgusting they didn't create that talent so somebody who works their ass off from age eight to become a concert pianist he didn't work for that talent.
|
||
|
|
||
|
01:35:20.000 --> 01:35:34.000
|
||
|
Somebody who works from age four to become an excellent gymnast by the time she's 17 didn't earn that talent somebody who became a professional tennis player a professional pianist a professional whatever.
|
||
|
|
||
|
01:35:34.000 --> 01:35:39.000
|
||
|
A highly skilled person may be born with some talent.
|
||
|
|
||
|
01:35:40.000 --> 01:35:43.000
|
||
|
And they should be rewarded for it.
|
||
|
|
||
|
01:35:43.000 --> 01:35:56.000
|
||
|
They should be successful in contributing to the next generation they should be more influenced an influential on more people around them than people with less talent absolutely they should.
|
||
|
|
||
|
01:35:57.000 --> 01:36:17.000
|
||
|
And the idea that someday AI or something that you could plug into a hole in your head or or where as a hat could equalize talent by giving people that don't have talent talent is the longest con I've ever heard.
|
||
|
|
||
|
01:36:18.000 --> 01:36:23.000
|
||
|
What a boring world they are dreaming about Christy.
|
||
|
|
||
|
01:36:23.000 --> 01:36:30.000
|
||
|
But everybody says we should give somebody who has talent in an area more power.
|
||
|
|
||
|
01:36:30.000 --> 01:36:33.000
|
||
|
This way you'd be able to gain talent.
|
||
|
|
||
|
01:36:33.000 --> 01:36:42.000
|
||
|
This is in the matrix if you'd like to fly an helicopter just by downloading the right software I suppose to spending a lot of time doing that.
|
||
|
|
||
|
01:36:42.000 --> 01:36:46.000
|
||
|
Is that fair or unfair?
|
||
|
|
||
|
01:36:46.000 --> 01:36:55.000
|
||
|
I mean I think that would fall into the ethical challenge area.
|
||
|
|
||
|
01:36:55.000 --> 01:36:58.000
|
||
|
And it's not like we get to the end of this.
|
||
|
|
||
|
01:36:58.000 --> 01:37:03.000
|
||
|
This is finally what the singularity is all about and people can do certain things and they can't do other things but it's over.
|
||
|
|
||
|
01:37:03.000 --> 01:37:05.000
|
||
|
We'll never get to that point.
|
||
|
|
||
|
01:37:05.000 --> 01:37:08.000
|
||
|
I mean this curve is going to continue the other curve.
|
||
|
|
||
|
01:37:08.000 --> 01:37:11.000
|
||
|
It's going to continue indefinitely.
|
||
|
|
||
|
01:37:11.000 --> 01:37:28.000
|
||
|
So for example with nanotechnology we can create a computer where one liter computer would actually match the amount of power that all human beings today have with 10 to the 10th persons would all fit into one liter computer.
|
||
|
|
||
|
01:37:28.000 --> 01:37:31.000
|
||
|
Does that create ethical problems?
|
||
|
|
||
|
01:37:31.000 --> 01:37:34.000
|
||
|
So I mean a lot of the implications kind of run against.
|
||
|
|
||
|
01:37:34.000 --> 01:37:36.000
|
||
|
Talk about asking the wrong questions.
|
||
|
|
||
|
01:37:36.000 --> 01:37:37.000
|
||
|
I mean holy shit.
|
||
|
|
||
|
01:37:37.000 --> 01:37:40.000
|
||
|
We've been assuming about human beings.
|
||
|
|
||
|
01:37:40.000 --> 01:37:42.000
|
||
|
On the talent question which is super interesting.
|
||
|
|
||
|
01:37:42.000 --> 01:37:47.000
|
||
|
Do you feel like everybody when we get to 2040 will have equal capacities?
|
||
|
|
||
|
01:37:47.000 --> 01:37:56.000
|
||
|
I think it will be more different because we'll have different interests and you might be into some fantastic type of music and I might be into some kind of literature or something else.
|
||
|
|
||
|
01:37:56.000 --> 01:38:01.000
|
||
|
We're going to have different interests and so we'll excel.
|
||
|
|
||
|
01:38:01.000 --> 01:38:03.000
|
||
|
We don't have that now.
|
||
|
|
||
|
01:38:03.000 --> 01:38:04.000
|
||
|
Kind of bullshit is this.
|
||
|
|
||
|
01:38:04.000 --> 01:38:05.000
|
||
|
Come on.
|
||
|
|
||
|
01:38:05.000 --> 01:38:08.000
|
||
|
We don't have that now.
|
||
|
|
||
|
01:38:08.000 --> 01:38:27.000
|
||
|
We don't have different interests now how many people do you know are into patch clamp physiology flying kites collecting old books and knives and and and writing a electric unicycle and bicycles and a motorcycle and and into yes and and fish.
|
||
|
|
||
|
01:38:27.000 --> 01:38:28.000
|
||
|
And Star Wars.
|
||
|
|
||
|
01:38:28.000 --> 01:38:31.000
|
||
|
What what in the world is he talking about?
|
||
|
|
||
|
01:38:31.000 --> 01:38:33.000
|
||
|
Things depending on what your interests are.
|
||
|
|
||
|
01:38:33.000 --> 01:38:36.000
|
||
|
It sounds like we won't have the same amount of power.
|
||
|
|
||
|
01:38:36.000 --> 01:38:39.000
|
||
|
But we won't have fantastic power compared to what we have today.
|
||
|
|
||
|
01:38:39.000 --> 01:38:43.000
|
||
|
And if you're in Texas where there are no regulations you'll probably get it first instead of you and Massachusetts.
|
||
|
|
||
|
01:38:43.000 --> 01:38:45.000
|
||
|
Let me ask you another ethical question while we're on this one.
|
||
|
|
||
|
01:38:45.000 --> 01:38:49.000
|
||
|
So about a few minutes ago you mentioned the capacity to replicate someone's brain and bring them back.
|
||
|
|
||
|
01:38:49.000 --> 01:38:52.000
|
||
|
So let's say I do that with my father passed away six years ago sadly.
|
||
|
|
||
|
01:38:52.000 --> 01:38:56.000
|
||
|
I bring him back and I'm able to create a mind and a body just like my father's.
|
||
|
|
||
|
01:38:56.000 --> 01:38:57.000
|
||
|
It's exact.
|
||
|
|
||
|
01:38:57.000 --> 01:38:58.000
|
||
|
Perfect replica.
|
||
|
|
||
|
01:38:58.000 --> 01:38:59.000
|
||
|
All the thoughts.
|
||
|
|
||
|
01:38:59.000 --> 01:39:01.000
|
||
|
All the bills that he owed when he died.
|
||
|
|
||
|
01:39:01.000 --> 01:39:03.000
|
||
|
Because like that's a lot of money and a lot of book collectors call me.
|
||
|
|
||
|
01:39:03.000 --> 01:39:05.000
|
||
|
Do we have to pay those off or are we good?
|
||
|
|
||
|
01:39:05.000 --> 01:39:12.000
|
||
|
Well we're doing something like that with my daughter and you can read about this in her book and it's also in my book.
|
||
|
|
||
|
01:39:12.000 --> 01:39:16.000
|
||
|
We collected everything my father had written but he died when I was 22.
|
||
|
|
||
|
01:39:16.000 --> 01:39:21.000
|
||
|
So he's been dead for more than 50 years.
|
||
|
|
||
|
01:39:21.000 --> 01:39:28.000
|
||
|
And we fed that into a large language model and basically asked her the question of all the things he ever wrote.
|
||
|
|
||
|
01:39:28.000 --> 01:39:30.000
|
||
|
What best answers this question?
|
||
|
|
||
|
01:39:30.000 --> 01:39:32.000
|
||
|
And then you can put any question you want.
|
||
|
|
||
|
01:39:32.000 --> 01:39:33.000
|
||
|
And then you could talk to it.
|
||
|
|
||
|
01:39:33.000 --> 01:39:34.000
|
||
|
You'd say something.
|
||
|
|
||
|
01:39:34.000 --> 01:39:39.000
|
||
|
You'd think of everything he ever had written and find the best answer that he actually wrote to that question.
|
||
|
|
||
|
01:39:39.000 --> 01:39:41.000
|
||
|
And it actually was a lot of like talking to him.
|
||
|
|
||
|
01:39:41.000 --> 01:39:43.000
|
||
|
You can ask him what he liked about music.
|
||
|
|
||
|
01:39:43.000 --> 01:39:44.000
|
||
|
He was a musician.
|
||
|
|
||
|
01:39:44.000 --> 01:39:46.000
|
||
|
He actually liked Brahms the best.
|
||
|
|
||
|
01:39:46.000 --> 01:39:49.000
|
||
|
And it was very much like talking to him.
|
||
|
|
||
|
01:39:49.000 --> 01:39:53.000
|
||
|
And I put on this in my book and Amy talks about this in her book.
|
||
|
|
||
|
01:39:53.000 --> 01:39:57.000
|
||
|
And Amy actually asked the question, could I fall in love with this person?
|
||
|
|
||
|
01:39:57.000 --> 01:39:59.000
|
||
|
Even though I've never met him.
|
||
|
|
||
|
01:39:59.000 --> 01:40:00.000
|
||
|
And she does a pretty good job.
|
||
|
|
||
|
01:40:00.000 --> 01:40:03.000
|
||
|
I mean you really do fall in love with this character that she creates.
|
||
|
|
||
|
01:40:03.000 --> 01:40:06.000
|
||
|
Even though she never met him.
|
||
|
|
||
|
01:40:08.000 --> 01:40:13.000
|
||
|
So we can actually with today's technology do something where you can actually emulate somebody else.
|
||
|
|
||
|
01:40:13.000 --> 01:40:20.000
|
||
|
And I think if we get further on we can actually do that more and more responsibly and more and more that really would match that person.
|
||
|
|
||
|
01:40:20.000 --> 01:40:23.000
|
||
|
And actually emulate the way he would move and so on.
|
||
|
|
||
|
01:40:23.000 --> 01:40:24.000
|
||
|
It's telling the voice.
|
||
|
|
||
|
01:40:24.000 --> 01:40:26.000
|
||
|
You know my dad, he loved Brahms too because of those piano trios.
|
||
|
|
||
|
01:40:26.000 --> 01:40:29.000
|
||
|
So if we can solve the back taxes problem we get my dad and your dad's box.
|
||
|
|
||
|
01:40:29.000 --> 01:40:30.000
|
||
|
Hang out.
|
||
|
|
||
|
01:40:30.000 --> 01:40:31.000
|
||
|
It would be great.
|
||
|
|
||
|
01:40:31.000 --> 01:40:32.000
|
||
|
Well yeah.
|
||
|
|
||
|
01:40:32.000 --> 01:40:33.000
|
||
|
That would be cool.
|
||
|
|
||
|
01:40:33.000 --> 01:40:34.000
|
||
|
All right.
|
||
|
|
||
|
01:40:34.000 --> 01:40:35.000
|
||
|
All right.
|
||
|
|
||
|
01:40:35.000 --> 01:40:36.000
|
||
|
We got 20 minutes left.
|
||
|
|
||
|
01:40:36.000 --> 01:40:37.000
|
||
|
I want to get the thing that I most want to understand.
|
||
|
|
||
|
01:40:37.000 --> 01:40:38.000
|
||
|
Because by the way this book is wonderful.
|
||
|
|
||
|
01:40:38.000 --> 01:40:40.000
|
||
|
You guys are all going to get signed copies of it when it comes out.
|
||
|
|
||
|
01:40:40.000 --> 01:40:41.000
|
||
|
It's truly remarkable.
|
||
|
|
||
|
01:40:41.000 --> 01:40:42.000
|
||
|
As are all the great books.
|
||
|
|
||
|
01:40:42.000 --> 01:40:44.000
|
||
|
When you agree or disagree they'll definitely make you think more.
|
||
|
|
||
|
01:40:44.000 --> 01:40:51.000
|
||
|
One of the things that I don't think you do in this book is describe what a day will be like in 20, 20, 2045 when we're all much more intelligent.
|
||
|
|
||
|
01:40:51.000 --> 01:40:52.000
|
||
|
So it's 2045.
|
||
|
|
||
|
01:40:52.000 --> 01:40:53.000
|
||
|
We're all million times as intelligent.
|
||
|
|
||
|
01:40:53.000 --> 01:40:54.000
|
||
|
I wake up.
|
||
|
|
||
|
01:40:54.000 --> 01:40:57.000
|
||
|
Do I have breakfast or do I not have breakfast?
|
||
|
|
||
|
01:40:57.000 --> 01:41:04.000
|
||
|
Well, the answer to that question is kind of the same as it is now.
|
||
|
|
||
|
01:41:04.000 --> 01:41:11.000
|
||
|
First of all, the reason it's called a singularity is because we don't really fully understand that question.
|
||
|
|
||
|
01:41:11.000 --> 01:41:12.000
|
||
|
Singularity is borrowed from physics.
|
||
|
|
||
|
01:41:12.000 --> 01:41:16.000
|
||
|
Singularity in physics is where you have a black hole and no light can escape.
|
||
|
|
||
|
01:41:16.000 --> 01:41:18.000
|
||
|
And so you can't actually tell what's going on inside the black hole.
|
||
|
|
||
|
01:41:18.000 --> 01:41:20.000
|
||
|
And so we call it a singularity.
|
||
|
|
||
|
01:41:20.000 --> 01:41:21.000
|
||
|
A physical singularity.
|
||
|
|
||
|
01:41:21.000 --> 01:41:26.000
|
||
|
So this is a historical singularity where we're borrowing that term from physics and calling it a singularity.
|
||
|
|
||
|
01:41:26.000 --> 01:41:28.000
|
||
|
Because we can't really answer the question.
|
||
|
|
||
|
01:41:28.000 --> 01:41:31.000
|
||
|
If we actually multiply our intelligence a millionfold, what's that like?
|
||
|
|
||
|
01:41:31.000 --> 01:41:33.000
|
||
|
It's a little bit like asking a mouse.
|
||
|
|
||
|
01:41:33.000 --> 01:41:36.000
|
||
|
What would it be like if you had the amount of intelligence of this person?
|
||
|
|
||
|
01:41:36.000 --> 01:41:40.000
|
||
|
The mouse wouldn't really even understand the question.
|
||
|
|
||
|
01:41:40.000 --> 01:41:41.000
|
||
|
It does have intelligence.
|
||
|
|
||
|
01:41:41.000 --> 01:41:43.000
|
||
|
There's a fair amount of intelligence.
|
||
|
|
||
|
01:41:43.000 --> 01:41:44.000
|
||
|
But it couldn't understand that question.
|
||
|
|
||
|
01:41:44.000 --> 01:41:46.000
|
||
|
It couldn't articulate an answer.
|
||
|
|
||
|
01:41:46.000 --> 01:41:50.000
|
||
|
I thought we were debating whether or not a mouse or animals were conscious.
|
||
|
|
||
|
01:41:50.000 --> 01:41:52.000
|
||
|
Now it has a fair amount of intelligence.
|
||
|
|
||
|
01:41:52.000 --> 01:41:56.000
|
||
|
Isn't intelligence a like consciousness a prerequisite to intelligence?
|
||
|
|
||
|
01:41:56.000 --> 01:41:58.000
|
||
|
What the hell is going on?
|
||
|
|
||
|
01:41:58.000 --> 01:42:01.000
|
||
|
What kind of clown show is this?
|
||
|
|
||
|
01:42:01.000 --> 01:42:08.000
|
||
|
That's a little bit what it would be like for us to take the next step in intelligence by adding all the intelligence that singularity would provide.
|
||
|
|
||
|
01:42:08.000 --> 01:42:09.000
|
||
|
I just want to try and understand.
|
||
|
|
||
|
01:42:09.000 --> 01:42:11.000
|
||
|
But I'll give you one answer.
|
||
|
|
||
|
01:42:11.000 --> 01:42:20.000
|
||
|
I said if you're diligent, you'll achieve longevity, escape velocity in five or six years.
|
||
|
|
||
|
01:42:20.000 --> 01:42:29.000
|
||
|
And if we want to actually emulate everything that's going on inside the brain, let's go out a few more years.
|
||
|
|
||
|
01:42:29.000 --> 01:42:31.000
|
||
|
Let's say the 2040, 2045.
|
||
|
|
||
|
01:42:31.000 --> 01:42:33.000
|
||
|
Now you talk to a person.
|
||
|
|
||
|
01:42:33.000 --> 01:42:38.000
|
||
|
They've got all the connections that they had originally, plus all this additional connections that we add,
|
||
|
|
||
|
01:42:38.000 --> 01:42:41.000
|
||
|
through having them access computers.
|
||
|
|
||
|
01:42:41.000 --> 01:42:44.000
|
||
|
That becomes part of their thinking.
|
||
|
|
||
|
01:42:44.000 --> 01:42:50.000
|
||
|
So can you suppose that person like blows up or something happens to their mind?
|
||
|
|
||
|
01:42:50.000 --> 01:42:54.000
|
||
|
You definitely can recreate everything that's of a computer origin.
|
||
|
|
||
|
01:42:54.000 --> 01:42:55.000
|
||
|
Because we do that now.
|
||
|
|
||
|
01:42:55.000 --> 01:42:57.000
|
||
|
Anytime we create anything with a computer, it's backed up.
|
||
|
|
||
|
01:42:57.000 --> 01:43:01.000
|
||
|
So if the computer goes away, you've got to back up and you can recreate it.
|
||
|
|
||
|
01:43:01.000 --> 01:43:07.000
|
||
|
You can say, okay, but what about their thinking in their normal brain that's not done with computers?
|
||
|
|
||
|
01:43:07.000 --> 01:43:10.000
|
||
|
We don't have some ways of breaking that up.
|
||
|
|
||
|
01:43:10.000 --> 01:43:14.000
|
||
|
When we get to the singularity of a 2045, we'll be able to back that up as well.
|
||
|
|
||
|
01:43:14.000 --> 01:43:23.000
|
||
|
Because we'll be able to figure out, we'll have some ways of actually figuring out what's going on in that sort of non-biological brain.
|
||
|
|
||
|
01:43:23.000 --> 01:43:28.000
|
||
|
And so we'll be able to back up both their normal brain as well as the computer edition.
|
||
|
|
||
|
01:43:29.000 --> 01:43:32.000
|
||
|
And I believe that's feasible by 2045.
|
||
|
|
||
|
01:43:32.000 --> 01:43:34.000
|
||
|
In your vision of it.
|
||
|
|
||
|
01:43:34.000 --> 01:43:36.000
|
||
|
So you can back up the entire brain.
|
||
|
|
||
|
01:43:36.000 --> 01:43:40.000
|
||
|
If it doesn't guarantee, I mean, the whole world could blow up and lose all the data centers.
|
||
|
|
||
|
01:43:40.000 --> 01:43:42.000
|
||
|
So it's not absolute guarantee.
|
||
|
|
||
|
01:43:42.000 --> 01:43:46.000
|
||
|
What I don't understand is, will we even be fully distinct people?
|
||
|
|
||
|
01:43:46.000 --> 01:43:55.000
|
||
|
If we're sharing memories and we're all uploading our brains to the cloud and we're getting all this information coming back directly into our neocortex, are we still distinct?
|
||
|
|
||
|
01:43:56.000 --> 01:44:00.000
|
||
|
Yes, but we could also find new ways of communicating.
|
||
|
|
||
|
01:44:00.000 --> 01:44:08.000
|
||
|
So the computers that extend my brain, the right computers that extend your brain, we can create something that's like a hybrid or not.
|
||
|
|
||
|
01:44:08.000 --> 01:44:11.000
|
||
|
And it'll be up to our own decision as to whether or not to do that.
|
||
|
|
||
|
01:44:11.000 --> 01:44:13.000
|
||
|
So there'll be some new ways of communicating.
|
||
|
|
||
|
01:44:13.000 --> 01:44:14.000
|
||
|
Let me ask another question.
|
||
|
|
||
|
01:44:14.000 --> 01:44:20.000
|
||
|
I think that this has a very big parallel with peer tube because if I build a peer tube and then somebody else builds a peer tube,
|
||
|
|
||
|
01:44:20.000 --> 01:44:25.000
|
||
|
then we can mirror each other's peer tube and then share each other's content like a different channel.
|
||
|
|
||
|
01:44:25.000 --> 01:44:27.000
|
||
|
But then we each host each other.
|
||
|
|
||
|
01:44:27.000 --> 01:44:31.000
|
||
|
And so you could do that with like, you know, different friends.
|
||
|
|
||
|
01:44:31.000 --> 01:44:35.000
|
||
|
We could have, you know, like host each other's worries and host each other.
|
||
|
|
||
|
01:44:35.000 --> 01:44:36.000
|
||
|
I mean.
|
||
|
|
||
|
01:44:41.000 --> 01:44:43.000
|
||
|
This is what when I was reading the book.
|
||
|
|
||
|
01:44:43.000 --> 01:44:44.000
|
||
|
This is where I kept getting stuck.
|
||
|
|
||
|
01:44:44.000 --> 01:44:45.000
|
||
|
You were extremely optimistic.
|
||
|
|
||
|
01:44:45.000 --> 01:44:47.000
|
||
|
You were optimistic about where we are today.
|
||
|
|
||
|
01:44:47.000 --> 01:44:49.000
|
||
|
Optimistic that technology has been a massive force for good.
|
||
|
|
||
|
01:44:49.000 --> 01:44:51.000
|
||
|
You're optimistic that will continue to be a massive force for good.
|
||
|
|
||
|
01:44:51.000 --> 01:44:54.000
|
||
|
Yet there is a lot of uncertainty in the future you were describing.
|
||
|
|
||
|
01:44:54.000 --> 01:45:00.000
|
||
|
Well, first of all, I'm not necessarily optimistic that things that can go wrong.
|
||
|
|
||
|
01:45:00.000 --> 01:45:04.000
|
||
|
We had things that can go wrong before we had computers.
|
||
|
|
||
|
01:45:04.000 --> 01:45:10.000
|
||
|
When I was a child, atomic weapons were created.
|
||
|
|
||
|
01:45:10.000 --> 01:45:12.000
|
||
|
And people were very worried about atomic war.
|
||
|
|
||
|
01:45:12.000 --> 01:45:17.000
|
||
|
We would actually get under our desk and put our hands behind our head to protect us against atomic war.
|
||
|
|
||
|
01:45:17.000 --> 01:45:19.000
|
||
|
It seemed to work, actually.
|
||
|
|
||
|
01:45:19.000 --> 01:45:20.000
|
||
|
We're still here.
|
||
|
|
||
|
01:45:20.000 --> 01:45:27.000
|
||
|
But if you were to ask people, we had actually two weapons that went off and anger and killed a lot of people within a week.
|
||
|
|
||
|
01:45:27.000 --> 01:45:31.000
|
||
|
And if you were to ask people, what's the chance that we're going to go another 80 years and this will never happen again?
|
||
|
|
||
|
01:45:31.000 --> 01:45:34.000
|
||
|
Nobody would say that that was true.
|
||
|
|
||
|
01:45:34.000 --> 01:45:35.000
|
||
|
But it has happened.
|
||
|
|
||
|
01:45:35.000 --> 01:45:38.000
|
||
|
That doesn't mean it's not going to happen next week.
|
||
|
|
||
|
01:45:39.000 --> 01:45:41.000
|
||
|
But anyway, that's a great danger.
|
||
|
|
||
|
01:45:41.000 --> 01:45:44.000
|
||
|
And I think that's a much greater danger than computers are.
|
||
|
|
||
|
01:45:44.000 --> 01:45:49.000
|
||
|
Yes, there are dangers, but the computers will also be more intelligent to avoid kinds of dangers.
|
||
|
|
||
|
01:45:49.000 --> 01:45:52.000
|
||
|
Yes, it's some bad people in the world.
|
||
|
|
||
|
01:45:52.000 --> 01:45:55.000
|
||
|
But I mean, go back 80, 90 years.
|
||
|
|
||
|
01:45:55.000 --> 01:46:00.000
|
||
|
We had 100 million people die in Asia and Europe from World War II.
|
||
|
|
||
|
01:46:00.000 --> 01:46:02.000
|
||
|
We don't have wars like that anymore.
|
||
|
|
||
|
01:46:02.000 --> 01:46:03.000
|
||
|
We could.
|
||
|
|
||
|
01:46:03.000 --> 01:46:06.000
|
||
|
We would certainly have atomic weapons to do that.
|
||
|
|
||
|
01:46:07.000 --> 01:46:11.000
|
||
|
And you can also imagine computers could be involved with that.
|
||
|
|
||
|
01:46:11.000 --> 01:46:17.000
|
||
|
But if you actually look, and this goes right through one piece, first of all, if you look at my...
|
||
|
|
||
|
01:46:17.000 --> 01:46:24.000
|
||
|
They are using computers right now to hide behind their targeting when they target drone attacks and things like that in foreign nations.
|
||
|
|
||
|
01:46:24.000 --> 01:46:27.000
|
||
|
They just say that the AI said to shoot them.
|
||
|
|
||
|
01:46:27.000 --> 01:46:34.000
|
||
|
They're also doing that in Gaza right now, saying that the AI said to shoot that car.
|
||
|
|
||
|
01:46:34.000 --> 01:46:36.000
|
||
|
So they will use the AI like that.
|
||
|
|
||
|
01:46:36.000 --> 01:46:50.000
|
||
|
They will use it as an excuse, just like they did with the domain server, where the domain server just happened to spit out Remdesivir, Celicoxib, Pomotidine, and Ivermectin.
|
||
|
|
||
|
01:46:50.000 --> 01:46:53.000
|
||
|
But they're not going to use it for the things that they say they're going to use it for.
|
||
|
|
||
|
01:46:53.000 --> 01:46:56.000
|
||
|
They're not going to crack the big puzzles that they say they're going to crack with it.
|
||
|
|
||
|
01:46:56.000 --> 01:46:57.000
|
||
|
They'll make a lot of things better.
|
||
|
|
||
|
01:46:57.000 --> 01:47:00.000
|
||
|
A lot of things will become easier and automated, sure.
|
||
|
|
||
|
01:47:00.000 --> 01:47:07.000
|
||
|
But that's not exactly the same thing as what he's talking about, which is being able to record everything in your brain and recreate it.
|
||
|
|
||
|
01:47:07.000 --> 01:47:11.000
|
||
|
Being able to put things into your brain, to be able to read things out of your brain.
|
||
|
|
||
|
01:47:11.000 --> 01:47:15.000
|
||
|
This is all nonsense.
|
||
|
|
||
|
01:47:15.000 --> 01:47:20.000
|
||
|
Lineage of computers going from tiny fraction of one calculation to 65 billion.
|
||
|
|
||
|
01:47:20.000 --> 01:47:24.000
|
||
|
That's a 20 quadrillion fold increase that we've...
|
||
|
|
||
|
01:47:24.000 --> 01:47:30.000
|
||
|
20 quadrillion fold increase per dollar.
|
||
|
|
||
|
01:47:30.000 --> 01:47:38.000
|
||
|
The first computer could compute more than once per second, and if it couldn't, it would have been mechanical as it was pointed out by somebody in the chat.
|
||
|
|
||
|
01:47:38.000 --> 01:47:40.000
|
||
|
And then it's still nonsense.
|
||
|
|
||
|
01:47:40.000 --> 01:47:51.000
|
||
|
It's per dollar he's talking about, just like the cost of sequencing DNA sequences has gone down exponentially, and they reference this guy's curve.
|
||
|
|
||
|
01:47:51.000 --> 01:47:53.000
|
||
|
It's nonsense.
|
||
|
|
||
|
01:47:53.000 --> 01:48:01.000
|
||
|
The average income in the United States, it went... it's multiplied by about a hundred fold, and we live far more successfully.
|
||
|
|
||
|
01:48:01.000 --> 01:48:10.000
|
||
|
If you actually... people say all the things were created a hundred years ago, they weren't.
|
||
|
|
||
|
01:48:10.000 --> 01:48:12.000
|
||
|
And you can look at this chart.
|
||
|
|
||
|
01:48:12.000 --> 01:48:15.000
|
||
|
I've got 50 charts in the book, which are the kinds of progress we've made.
|
||
|
|
||
|
01:48:16.000 --> 01:48:18.000
|
||
|
Who says that things were great a hundred years ago?
|
||
|
|
||
|
01:48:18.000 --> 01:48:21.000
|
||
|
See, that's just a bullshit statement right there, or a straw man, or whatever you call it.
|
||
|
|
||
|
01:48:21.000 --> 01:48:24.000
|
||
|
Nobody says it was great a hundred years ago.
|
||
|
|
||
|
01:48:24.000 --> 01:48:29.000
|
||
|
Well, I do say it was great about 10 years ago.
|
||
|
|
||
|
01:48:29.000 --> 01:48:34.000
|
||
|
It was great when I was in high school.
|
||
|
|
||
|
01:48:34.000 --> 01:48:37.000
|
||
|
These people are the worst.
|
||
|
|
||
|
01:48:38.000 --> 01:48:40.000
|
||
|
And they are pure evil.
|
||
|
|
||
|
01:48:40.000 --> 01:48:45.000
|
||
|
They lie, they mislead, and they desecrate, and show no reverence for the sacred at all.
|
||
|
|
||
|
01:48:45.000 --> 01:48:47.000
|
||
|
There is no sacred in his world.
|
||
|
|
||
|
01:48:47.000 --> 01:48:51.000
|
||
|
The number of people that live and die in a property has gone down dramatically.
|
||
|
|
||
|
01:48:51.000 --> 01:48:52.000
|
||
|
We actually did it.
|
||
|
|
||
|
01:48:52.000 --> 01:48:54.000
|
||
|
And we actually did it.
|
||
|
|
||
|
01:48:54.000 --> 01:48:56.000
|
||
|
And it was a great deal.
|
||
|
|
||
|
01:48:56.000 --> 01:48:58.000
|
||
|
It was a great deal.
|
||
|
|
||
|
01:48:58.000 --> 01:48:59.000
|
||
|
It was a great deal.
|
||
|
|
||
|
01:48:59.000 --> 01:49:00.000
|
||
|
It was a great deal.
|
||
|
|
||
|
01:49:00.000 --> 01:49:01.000
|
||
|
It was a great deal.
|
||
|
|
||
|
01:49:01.000 --> 01:49:03.000
|
||
|
It was a great deal.
|
||
|
|
||
|
01:49:04.000 --> 01:49:07.000
|
||
|
The number of people that live and die in a property has gone down dramatically.
|
||
|
|
||
|
01:49:07.000 --> 01:49:11.000
|
||
|
We actually did a poll where they asked people, people that live in a property has gone up
|
||
|
|
||
|
01:49:11.000 --> 01:49:13.000
|
||
|
or down, 80 percent said it's gone up.
|
||
|
|
||
|
01:49:13.000 --> 01:49:22.000
|
||
|
But the reality is it's actually fallen by 50 percent in the last 20 years.
|
||
|
|
||
|
01:49:22.000 --> 01:49:27.000
|
||
|
So what we think about the past is really the opposite of what's happened.
|
||
|
|
||
|
01:49:27.000 --> 01:49:29.000
|
||
|
Things have gotten far better than they have.
|
||
|
|
||
|
01:49:29.000 --> 01:49:31.000
|
||
|
And computers are going to make things even better.
|
||
|
|
||
|
01:49:31.000 --> 01:49:34.000
|
||
|
And just the kind of things you're going to do now with a large language model doesn't
|
||
|
|
||
|
01:49:34.000 --> 01:49:35.000
|
||
|
exist two years ago.
|
||
|
|
||
|
01:49:35.000 --> 01:49:37.000
|
||
|
Do you ever worry that...
|
||
|
|
||
|
01:49:37.000 --> 01:49:43.000
|
||
|
So nowhere here social media has not discussed at all the effect on kids, of video games
|
||
|
|
||
|
01:49:43.000 --> 01:49:46.000
|
||
|
and social media has not discussed at all computers are going to make things better?
|
||
|
|
||
|
01:49:46.000 --> 01:49:47.000
|
||
|
You know, computers have made things better.
|
||
|
|
||
|
01:49:47.000 --> 01:49:48.000
|
||
|
Take it as a given.
|
||
|
|
||
|
01:49:48.000 --> 01:49:49.000
|
||
|
That personal income will keep going up.
|
||
|
|
||
|
01:49:49.000 --> 01:49:51.000
|
||
|
Do you ever worry it's just coming too quickly?
|
||
|
|
||
|
01:49:51.000 --> 01:49:54.000
|
||
|
It'll be better if maybe the slope of the Kurtzweil curve was a little less steep?
|
||
|
|
||
|
01:49:54.000 --> 01:49:55.000
|
||
|
That's a big difference in the past.
|
||
|
|
||
|
01:49:55.000 --> 01:49:58.000
|
||
|
They talk about what effect did the railroad have.
|
||
|
|
||
|
01:49:58.000 --> 01:50:00.000
|
||
|
And lots of jobs were lost.
|
||
|
|
||
|
01:50:00.000 --> 01:50:03.000
|
||
|
Or even the cotton genie that happened 200 years ago.
|
||
|
|
||
|
01:50:03.000 --> 01:50:06.000
|
||
|
And people were quite happy making money with the cotton genie.
|
||
|
|
||
|
01:50:06.000 --> 01:50:07.000
|
||
|
And suddenly that was gone.
|
||
|
|
||
|
01:50:07.000 --> 01:50:08.000
|
||
|
And machines were doing that.
|
||
|
|
||
|
01:50:08.000 --> 01:50:10.000
|
||
|
And people say, well, wait till this gets going.
|
||
|
|
||
|
01:50:10.000 --> 01:50:11.000
|
||
|
All jobs will be lost.
|
||
|
|
||
|
01:50:11.000 --> 01:50:15.000
|
||
|
And that's actually what was said at that time.
|
||
|
|
||
|
01:50:15.000 --> 01:50:17.000
|
||
|
But actually income went up.
|
||
|
|
||
|
01:50:17.000 --> 01:50:18.000
|
||
|
More and more people worked.
|
||
|
|
||
|
01:50:18.000 --> 01:50:19.000
|
||
|
We created...
|
||
|
|
||
|
01:50:19.000 --> 01:50:20.000
|
||
|
And if you say, well, what are they going to do?
|
||
|
|
||
|
01:50:20.000 --> 01:50:24.000
|
||
|
You couldn't answer that question because it was in industries that nobody had a clue of.
|
||
|
|
||
|
01:50:24.000 --> 01:50:27.000
|
||
|
Like for example, all of electronics.
|
||
|
|
||
|
01:50:28.000 --> 01:50:32.000
|
||
|
Also, the people who were put out of work by the cotton gen got jobs in electronics factories.
|
||
|
|
||
|
01:50:32.000 --> 01:50:33.000
|
||
|
That's cool.
|
||
|
|
||
|
01:50:33.000 --> 01:50:36.000
|
||
|
Things are getting better even if jobs are lost.
|
||
|
|
||
|
01:50:36.000 --> 01:50:38.000
|
||
|
Now you can certainly point to jobs.
|
||
|
|
||
|
01:50:38.000 --> 01:50:41.000
|
||
|
Take computer programming.
|
||
|
|
||
|
01:50:41.000 --> 01:50:45.000
|
||
|
Yeah, go learn programming is what they said at the beginning of the pandemic, right?
|
||
|
|
||
|
01:50:45.000 --> 01:50:46.000
|
||
|
Learn to code.
|
||
|
|
||
|
01:50:46.000 --> 01:50:52.000
|
||
|
Google has, I don't know, 60,000 people with the program computers and lots of other companies do.
|
||
|
|
||
|
01:50:52.000 --> 01:50:55.000
|
||
|
At some point that's not going to be a feasible job.
|
||
|
|
||
|
01:50:55.000 --> 01:50:56.000
|
||
|
They can already code.
|
||
|
|
||
|
01:50:56.000 --> 01:50:58.000
|
||
|
Lots of language models can write code.
|
||
|
|
||
|
01:50:58.000 --> 01:51:01.000
|
||
|
Not quite the way an expert programmer can do.
|
||
|
|
||
|
01:51:01.000 --> 01:51:03.000
|
||
|
But how long is that going to take?
|
||
|
|
||
|
01:51:03.000 --> 01:51:07.000
|
||
|
It's measuring years, not in the decades.
|
||
|
|
||
|
01:51:07.000 --> 01:51:12.000
|
||
|
Nonetheless, I believe that things will get better because we wipe out jobs but we create
|
||
|
|
||
|
01:51:12.000 --> 01:51:14.000
|
||
|
other ways of having an income.
|
||
|
|
||
|
01:51:14.000 --> 01:51:20.000
|
||
|
And if you actually point to something to save this machine, and this is being worked on,
|
||
|
|
||
|
01:51:20.000 --> 01:51:21.000
|
||
|
can wash dishes.
|
||
|
|
||
|
01:51:21.000 --> 01:51:22.000
|
||
|
You just have a bunch of dishes.
|
||
|
|
||
|
01:51:22.000 --> 01:51:26.000
|
||
|
It'll pick the ones that have to go on the water dishwasher and clean everything else up.
|
||
|
|
||
|
01:51:26.000 --> 01:51:28.000
|
||
|
And that will wash dishes for you.
|
||
|
|
||
|
01:51:28.000 --> 01:51:30.000
|
||
|
Would we want that not to happen?
|
||
|
|
||
|
01:51:30.000 --> 01:51:32.000
|
||
|
Would we say, well, this is kind of upsetting things.
|
||
|
|
||
|
01:51:32.000 --> 01:51:33.000
|
||
|
Let's get rid of it.
|
||
|
|
||
|
01:51:33.000 --> 01:51:34.000
|
||
|
It's not going to happen.
|
||
|
|
||
|
01:51:34.000 --> 01:51:36.000
|
||
|
And no one would advocate that.
|
||
|
|
||
|
01:51:36.000 --> 01:51:39.000
|
||
|
So we'll find things to do.
|
||
|
|
||
|
01:51:39.000 --> 01:51:42.000
|
||
|
We'll have other methods of distributing money.
|
||
|
|
||
|
01:51:42.000 --> 01:51:43.000
|
||
|
And it will be.
|
||
|
|
||
|
01:51:43.000 --> 01:51:46.000
|
||
|
Other methods of distributing money.
|
||
|
|
||
|
01:51:46.000 --> 01:51:48.000
|
||
|
Oh, that's nice.
|
||
|
|
||
|
01:51:48.000 --> 01:51:51.000
|
||
|
Other methods of distributing money.
|
||
|
|
||
|
01:51:52.000 --> 01:51:55.000
|
||
|
It'll continue these kinds of curves that we've seen already.
|
||
|
|
||
|
01:51:55.000 --> 01:51:59.000
|
||
|
It's kind of remarkable that we got large language models before we got robotic dishwashers.
|
||
|
|
||
|
01:51:59.000 --> 01:52:01.000
|
||
|
You have grandchildren.
|
||
|
|
||
|
01:52:01.000 --> 01:52:03.000
|
||
|
What would you tell a young person?
|
||
|
|
||
|
01:52:03.000 --> 01:52:05.000
|
||
|
They buy in, they agree.
|
||
|
|
||
|
01:52:05.000 --> 01:52:11.000
|
||
|
How would you tell them to best prepare themselves for what will be, if you're correct, a remarkably different future?
|
||
|
|
||
|
01:52:11.000 --> 01:52:17.000
|
||
|
I've been less concerned about what will make money and much more concerned about what turns them on.
|
||
|
|
||
|
01:52:18.000 --> 01:52:19.000
|
||
|
They love video games.
|
||
|
|
||
|
01:52:19.000 --> 01:52:21.000
|
||
|
So they should learn about that.
|
||
|
|
||
|
01:52:21.000 --> 01:52:23.000
|
||
|
They should read literature that turns them on.
|
||
|
|
||
|
01:52:23.000 --> 01:52:27.000
|
||
|
Some of those literature in the future will be created by computers.
|
||
|
|
||
|
01:52:27.000 --> 01:52:34.000
|
||
|
And find out what in the world has a positive effect on their mental being.
|
||
|
|
||
|
01:52:34.000 --> 01:52:39.000
|
||
|
And if you know that your child or your grandchild gets to one of the questions that is asked on the screen here,
|
||
|
|
||
|
01:52:39.000 --> 01:52:43.000
|
||
|
if you know that someone is going to live for hundreds of years, as you predict,
|
||
|
|
||
|
01:52:43.000 --> 01:52:46.000
|
||
|
how does that affect the way, certainly means they shouldn't retire at 65.
|
||
|
|
||
|
01:52:46.000 --> 01:52:49.000
|
||
|
But what else does it change about the way they should think about their lives?
|
||
|
|
||
|
01:52:49.000 --> 01:52:52.000
|
||
|
Well, I talk to people and say, well, I wouldn't want to live past 100.
|
||
|
|
||
|
01:52:52.000 --> 01:52:57.000
|
||
|
Or maybe they're a little more ambitious to say, I don't want to live past 110.
|
||
|
|
||
|
01:52:57.000 --> 01:53:03.000
|
||
|
But if you actually look, when people decide they've had enough and they don't want to live anymore,
|
||
|
|
||
|
01:53:03.000 --> 01:53:07.000
|
||
|
that never, ever happens unless these people are in some kind of dire pain.
|
||
|
|
||
|
01:53:07.000 --> 01:53:10.000
|
||
|
In physical pain, or emotional pain, or spiritual pain, or whatever.
|
||
|
|
||
|
01:53:10.000 --> 01:53:12.000
|
||
|
And they just cannot bear to be alive anymore.
|
||
|
|
||
|
01:53:12.000 --> 01:53:15.000
|
||
|
Nobody takes their lives other than that.
|
||
|
|
||
|
01:53:15.000 --> 01:53:19.000
|
||
|
And if we can actually overcome many kinds of physical problems,
|
||
|
|
||
|
01:53:19.000 --> 01:53:23.000
|
||
|
the answer is wiped out and so on, in fact, expect to happen,
|
||
|
|
||
|
01:53:23.000 --> 01:53:26.000
|
||
|
people will be even that much more happy to live.
|
||
|
|
||
|
01:53:26.000 --> 01:53:29.000
|
||
|
And they'll want to continue to experience tomorrow.
|
||
|
|
||
|
01:53:29.000 --> 01:53:32.000
|
||
|
And tomorrow is going to be better and better.
|
||
|
|
||
|
01:53:32.000 --> 01:53:35.000
|
||
|
These kinds of progress are not going to go away.
|
||
|
|
||
|
01:53:35.000 --> 01:53:40.000
|
||
|
So people will want to live unless they're in dire pain.
|
||
|
|
||
|
01:53:40.000 --> 01:53:43.000
|
||
|
But that's what the whole medical profession is about,
|
||
|
|
||
|
01:53:43.000 --> 01:53:46.000
|
||
|
which is going to be greatly amplified by tomorrow's computers.
|
||
|
|
||
|
01:53:46.000 --> 01:53:47.000
|
||
|
That's a great question that has popped on the screen.
|
||
|
|
||
|
01:53:47.000 --> 01:53:48.000
|
||
|
This is from Colin McCabe.
|
||
|
|
||
|
01:53:48.000 --> 01:53:49.000
|
||
|
AI is a black box.
|
||
|
|
||
|
01:53:49.000 --> 01:53:50.000
|
||
|
Nobody knows how it was built.
|
||
|
|
||
|
01:53:50.000 --> 01:53:53.000
|
||
|
How do you show that AI is trustworthy to users who want to trust it, adopt it,
|
||
|
|
||
|
01:53:53.000 --> 01:53:58.000
|
||
|
and accept it, particularly if you're going to upload it directly into your brain?
|
||
|
|
||
|
01:53:58.000 --> 01:54:01.000
|
||
|
Well, it's not true that nobody knows how they work.
|
||
|
|
||
|
01:54:01.000 --> 01:54:04.000
|
||
|
Most people who are using a large language model don't know what data sense went into it.
|
||
|
|
||
|
01:54:04.000 --> 01:54:07.000
|
||
|
They're things that happen in a transformer layer that even the architects don't understand.
|
||
|
|
||
|
01:54:07.000 --> 01:54:10.000
|
||
|
Right, but we're going to learn more and more about that.
|
||
|
|
||
|
01:54:10.000 --> 01:54:18.000
|
||
|
In fact, how computers work will be a very common type of talent that people want to gain.
|
||
|
|
||
|
01:54:18.000 --> 01:54:20.000
|
||
|
And ultimately, we'll have more trust in computers.
|
||
|
|
||
|
01:54:20.000 --> 01:54:22.000
|
||
|
I mean, large language models aren't perfect,
|
||
|
|
||
|
01:54:22.000 --> 01:54:25.000
|
||
|
and you can answer the question that can give you something that's incorrect.
|
||
|
|
||
|
01:54:25.000 --> 01:54:29.000
|
||
|
We've seen that just recently.
|
||
|
|
||
|
01:54:29.000 --> 01:54:33.000
|
||
|
The reason we have these computers giving you incorrect information
|
||
|
|
||
|
01:54:33.000 --> 01:54:36.000
|
||
|
is it doesn't have the information to begin with.
|
||
|
|
||
|
01:54:36.000 --> 01:54:38.000
|
||
|
And it actually doesn't know what it doesn't know.
|
||
|
|
||
|
01:54:38.000 --> 01:54:41.000
|
||
|
It's actually something we're working on so that it knows.
|
||
|
|
||
|
01:54:41.000 --> 01:54:42.000
|
||
|
Well, I don't know that.
|
||
|
|
||
|
01:54:42.000 --> 01:54:45.000
|
||
|
That's actually very good if it can actually say that,
|
||
|
|
||
|
01:54:45.000 --> 01:54:47.000
|
||
|
because right now it will find the best thing it knows.
|
||
|
|
||
|
01:54:47.000 --> 01:54:51.000
|
||
|
And if it's never trained on that information and there's nothing in there that tells you,
|
||
|
|
||
|
01:54:51.000 --> 01:54:54.000
|
||
|
it'll just give you the best guess, which could be very incorrect.
|
||
|
|
||
|
01:54:54.000 --> 01:54:58.000
|
||
|
And we're actually learning to be able to figure out when it knows and what it doesn't know.
|
||
|
|
||
|
01:54:58.000 --> 01:55:05.000
|
||
|
But ultimately, we'll have pretty good confidence when it knows and what it doesn't know,
|
||
|
|
||
|
01:55:05.000 --> 01:55:07.000
|
||
|
and we can actually rely on what it says.
|
||
|
|
||
|
01:55:07.000 --> 01:55:09.000
|
||
|
So your answer to the question is, A, we will understand more,
|
||
|
|
||
|
01:55:09.000 --> 01:55:12.000
|
||
|
and B, they'll be much more trustworthy, so it won't be as risky to not understand them.
|
||
|
|
||
|
01:55:12.000 --> 01:55:13.000
|
||
|
Right.
|
||
|
|
||
|
01:55:13.000 --> 01:55:14.000
|
||
|
Okay.
|
||
|
|
||
|
01:55:14.000 --> 01:55:16.000
|
||
|
You spent your life making predictions.
|
||
|
|
||
|
01:55:16.000 --> 01:55:18.000
|
||
|
Some of which like the Turing test, you've held on to them and remarkably accurate,
|
||
|
|
||
|
01:55:18.000 --> 01:55:21.000
|
||
|
as you move from an overwhelming optimist to now slightly of a pessimist.
|
||
|
|
||
|
01:55:21.000 --> 01:55:22.000
|
||
|
What is a prediction?
|
||
|
|
||
|
01:55:22.000 --> 01:55:25.000
|
||
|
Well, my books have always had a chapter on how these things can go wrong,
|
||
|
|
||
|
01:55:25.000 --> 01:55:26.000
|
||
|
and the problems.
|
||
|
|
||
|
01:55:26.000 --> 01:55:29.000
|
||
|
Tell me a prediction that you are chewing over right now.
|
||
|
|
||
|
01:55:29.000 --> 01:55:30.000
|
||
|
You're not sure.
|
||
|
|
||
|
01:55:30.000 --> 01:55:31.000
|
||
|
Why do you want to make it?
|
||
|
|
||
|
01:55:31.000 --> 01:55:34.000
|
||
|
Why do you don't want to make it?
|
||
|
|
||
|
01:55:35.000 --> 01:55:40.000
|
||
|
So we're kind of treating them like a profit, which is also a bit yuck.
|
||
|
|
||
|
01:55:40.000 --> 01:55:42.000
|
||
|
It's in nanotechnology.
|
||
|
|
||
|
01:55:42.000 --> 01:55:47.000
|
||
|
If someone would create nanotechnology that replicates,
|
||
|
|
||
|
01:55:47.000 --> 01:55:50.000
|
||
|
well, it replicates everything into paperclips.
|
||
|
|
||
|
01:55:50.000 --> 01:55:53.000
|
||
|
It'll turn the entire world into paperclips.
|
||
|
|
||
|
01:55:53.000 --> 01:55:55.000
|
||
|
That would not be positive.
|
||
|
|
||
|
01:55:55.000 --> 01:55:56.000
|
||
|
No.
|
||
|
|
||
|
01:55:56.000 --> 01:55:57.000
|
||
|
Unless you're staples.
|
||
|
|
||
|
01:55:57.000 --> 01:56:00.000
|
||
|
But then, and that's feasible.
|
||
|
|
||
|
01:56:01.000 --> 01:56:08.000
|
||
|
Take somebody who's a little bit mental to do that, but it could be done.
|
||
|
|
||
|
01:56:08.000 --> 01:56:12.000
|
||
|
And we actually will have something that actually avoids that.
|
||
|
|
||
|
01:56:12.000 --> 01:56:18.000
|
||
|
So we'll have something that can detect that this is actually telling everything
|
||
|
|
||
|
01:56:18.000 --> 01:56:21.000
|
||
|
in the paperclips and destroy it before it does that.
|
||
|
|
||
|
01:56:21.000 --> 01:56:25.000
|
||
|
But I mean, I have a chapter in this new book, The Singularity is Nearer,
|
||
|
|
||
|
01:56:25.000 --> 01:56:28.000
|
||
|
that talks about the kinds of things that could happen.
|
||
|
|
||
|
01:56:28.000 --> 01:56:31.000
|
||
|
The most remarkable part of this book is he does exactly the mathematical calculations
|
||
|
|
||
|
01:56:31.000 --> 01:56:33.000
|
||
|
on how long it would take nanobots to turn the world into gray goo
|
||
|
|
||
|
01:56:33.000 --> 01:56:35.000
|
||
|
and how long it would take the blue goo to stop the gray goo.
|
||
|
|
||
|
01:56:35.000 --> 01:56:36.000
|
||
|
That's remarkable.
|
||
|
|
||
|
01:56:36.000 --> 01:56:37.000
|
||
|
The book will be out soon.
|
||
|
|
||
|
01:56:37.000 --> 01:56:38.000
|
||
|
You definitely need to read it until the end.
|
||
|
|
||
|
01:56:38.000 --> 01:56:39.000
|
||
|
But this leads to it.
|
||
|
|
||
|
01:56:39.000 --> 01:56:40.000
|
||
|
Maybe, let me try and answer.
|
||
|
|
||
|
01:56:40.000 --> 01:56:42.000
|
||
|
The question I asked before is what should young people think about
|
||
|
|
||
|
01:56:42.000 --> 01:56:44.000
|
||
|
being working on you so their passions and what turns them on?
|
||
|
|
||
|
01:56:44.000 --> 01:56:49.000
|
||
|
Shouldn't they be thinking through how to design and architect these future systems
|
||
|
|
||
|
01:56:49.000 --> 01:56:52.000
|
||
|
so they are less likely to turn us into gray goo?
|
||
|
|
||
|
01:56:52.000 --> 01:56:54.000
|
||
|
I don't know if everybody wants to work on that.
|
||
|
|
||
|
01:56:54.000 --> 01:56:57.000
|
||
|
But folks in this room, technologically minded, you guys should all be working on not turning
|
||
|
|
||
|
01:56:57.000 --> 01:56:58.000
|
||
|
the world right?
|
||
|
|
||
|
01:56:58.000 --> 01:56:59.000
|
||
|
Yes.
|
||
|
|
||
|
01:56:59.000 --> 01:57:00.000
|
||
|
That would be on the list.
|
||
|
|
||
|
01:57:00.000 --> 01:57:03.000
|
||
|
But that leads to another question which is what will the role of humans be in thinking
|
||
|
|
||
|
01:57:03.000 --> 01:57:06.000
|
||
|
through that problem when they are only a millionth or a billionth or a trillionth as intelligent
|
||
|
|
||
|
01:57:06.000 --> 01:57:07.000
|
||
|
as machines?
|
||
|
|
||
|
01:57:07.000 --> 01:57:08.000
|
||
|
Say that again.
|
||
|
|
||
|
01:57:08.000 --> 01:57:11.000
|
||
|
So we are going to have these really hard problems to solve.
|
||
|
|
||
|
01:57:11.000 --> 01:57:14.000
|
||
|
Right now, we are along with our machines.
|
||
|
|
||
|
01:57:14.000 --> 01:57:16.000
|
||
|
We can be extremely intelligent.
|
||
|
|
||
|
01:57:16.000 --> 01:57:19.000
|
||
|
But ten years from now, fifteen years from now, there will be machines that will be so much more
|
||
|
|
||
|
01:57:19.000 --> 01:57:20.000
|
||
|
intelligent than us.
|
||
|
|
||
|
01:57:20.000 --> 01:57:23.000
|
||
|
What will our role, what will the role of humans be in trying to solve the problem?
|
||
|
|
||
|
01:57:23.000 --> 01:57:24.000
|
||
|
I see those as extensions of humans.
|
||
|
|
||
|
01:57:24.000 --> 01:57:27.000
|
||
|
And we wouldn't have them if we didn't have humans to begin with.
|
||
|
|
||
|
01:57:27.000 --> 01:57:29.000
|
||
|
And humans have a brain that can think these things through.
|
||
|
|
||
|
01:57:29.000 --> 01:57:30.000
|
||
|
And we have this thumb.
|
||
|
|
||
|
01:57:30.000 --> 01:57:32.000
|
||
|
It's not really very much appreciated.
|
||
|
|
||
|
01:57:32.000 --> 01:57:35.000
|
||
|
But whales and elephants actually have a larger brain than we have.
|
||
|
|
||
|
01:57:35.000 --> 01:57:36.000
|
||
|
And they can probably think deeper thoughts.
|
||
|
|
||
|
01:57:36.000 --> 01:57:39.000
|
||
|
But they don't have a thumb and so they don't create technology.
|
||
|
|
||
|
01:57:39.000 --> 01:57:40.000
|
||
|
But they are not conscious.
|
||
|
|
||
|
01:57:40.000 --> 01:57:42.000
|
||
|
A lot of people say they are not conscious.
|
||
|
|
||
|
01:57:42.000 --> 01:57:44.000
|
||
|
You just said that like twenty-five minutes ago.
|
||
|
|
||
|
01:57:44.000 --> 01:57:48.000
|
||
|
Do you see how much bullshittery is happening here?
|
||
|
|
||
|
01:57:48.000 --> 01:57:52.000
|
||
|
This is just an enchanter.
|
||
|
|
||
|
01:57:52.000 --> 01:57:59.000
|
||
|
Somebody who is pushing a mythology and shuffling the shells around in front of you.
|
||
|
|
||
|
01:57:59.000 --> 01:58:01.000
|
||
|
Are we talking about consciousness now?
|
||
|
|
||
|
01:58:01.000 --> 01:58:02.000
|
||
|
Well then we don't know.
|
||
|
|
||
|
01:58:02.000 --> 01:58:04.000
|
||
|
Consciousness is not even scientific.
|
||
|
|
||
|
01:58:04.000 --> 01:58:05.000
|
||
|
We don't even...
|
||
|
|
||
|
01:58:05.000 --> 01:58:08.000
|
||
|
My old boss told me never even to worry about consciousness.
|
||
|
|
||
|
01:58:08.000 --> 01:58:10.000
|
||
|
He just missed that question.
|
||
|
|
||
|
01:58:10.000 --> 01:58:11.000
|
||
|
It's just dumb.
|
||
|
|
||
|
01:58:11.000 --> 01:58:14.000
|
||
|
Now we're talking about intelligence and whatever.
|
||
|
|
||
|
01:58:14.000 --> 01:58:16.000
|
||
|
Oh yeah, then everything's intelligent.
|
||
|
|
||
|
01:58:16.000 --> 01:58:18.000
|
||
|
You know, mouse is intelligent.
|
||
|
|
||
|
01:58:18.000 --> 01:58:20.000
|
||
|
Blue whales are definitely intelligent.
|
||
|
|
||
|
01:58:20.000 --> 01:58:23.000
|
||
|
They have deeper thoughts than us because they have bigger brains.
|
||
|
|
||
|
01:58:23.000 --> 01:58:25.000
|
||
|
Holy shit.
|
||
|
|
||
|
01:58:25.000 --> 01:58:28.000
|
||
|
The monkey can create...
|
||
|
|
||
|
01:58:28.000 --> 01:58:32.000
|
||
|
It actually has a thumb, but it's actually down an inch or so.
|
||
|
|
||
|
01:58:32.000 --> 01:58:34.000
|
||
|
And therefore it really can't grow very well.
|
||
|
|
||
|
01:58:34.000 --> 01:58:37.000
So it can create a little bit of technology, but the technology it creates cannot create other technology.

01:58:37.000 --> 01:58:43.000
So the fact that we have a thumb means we can create integrated circuits that can become a large language model.

01:58:43.000 --> 01:58:45.000
That comes from the human brain.

01:58:45.000 --> 01:58:49.000
Does it come from the human brain or the human thumb?

01:58:49.000 --> 01:58:52.000
What is he taught? Oh my gosh.

01:58:52.000 --> 01:58:57.000
And it's actually trained with everything that we've ever thought.

01:58:57.000 --> 01:59:01.000
Anything that human beings have thought has been documented and it can go into these large language models.

01:59:01.000 --> 01:59:04.000
And everybody can work on these things.

01:59:04.000 --> 01:59:07.000
And it's not true only certain wealthy people will have it.

01:59:07.000 --> 01:59:09.000
I mean, how many people here have phones?

01:59:09.000 --> 01:59:12.000
If it's not 100%, it's like 99.9%.

01:59:12.000 --> 01:59:16.000
And you don't have to be kind of from a wealthy group.

01:59:17.000 --> 01:59:20.000
I mean, I see people who are homeless who have their own phone.

01:59:20.000 --> 01:59:22.000
It's not that expensive.

01:59:22.000 --> 01:59:24.000
And so...

01:59:24.000 --> 01:59:27.000
It's not that expensive or it's absolutely a necessity.

01:59:27.000 --> 01:59:30.000
What a jackass.

01:59:30.000 --> 01:59:32.000
Even homeless people have phones.

01:59:32.000 --> 01:59:33.000
They're so cheap.

01:59:33.000 --> 01:59:36.000
Nowadays, the world is so great.

01:59:36.000 --> 01:59:39.000
The world is so great that even homeless people have their own phones.

01:59:39.000 --> 01:59:40.000
I mean, come on.

01:59:40.000 --> 01:59:41.000
It's crazy.

01:59:41.000 --> 01:59:47.000
Soon homeless people will be able to sequence the bugs that they sleep on with the cheap technology that's available.

01:59:47.000 --> 01:59:49.000
That's the distribution of these capabilities.

01:59:49.000 --> 01:59:52.000
It's not something you have to be fabulously wealthy to afford.

01:59:52.000 --> 01:59:55.000
So you think that we're heading into a future where we're going to live much longer and we'll be much more equal?

01:59:55.000 --> 01:59:59.000
We think we're heading into a society where we'll live much longer, be wealthier, but also have much more equality.

01:59:59.000 --> 02:00:00.000
Yes, absolutely.

02:00:00.000 --> 02:00:01.000
We've seen that already.

02:00:01.000 --> 02:00:06.000
Well, we're at time, but Ray and I will be back in 2124 and 2324.

02:00:06.000 --> 02:00:07.000
So thank you for coming today.

02:00:07.000 --> 02:00:08.000
Thank you so much.

02:00:08.000 --> 02:00:09.000
He is an American treasure.

02:00:09.000 --> 02:00:14.000
Thank you so much.

02:00:14.000 --> 02:00:16.000
Wow, that was painful.

02:00:16.000 --> 02:00:18.000
I mean, but it's good to see it, right?

02:00:18.000 --> 02:00:22.000
I think it's really important to see these things for what they are because that's the illusion.

02:00:22.000 --> 02:00:28.000
That actual people believe this stuff is why I end this...

02:00:28.000 --> 02:00:33.000
a lot of these talks, with this slide right here.

02:00:33.000 --> 02:00:35.000
Oh, come on.

02:00:35.000 --> 02:00:36.000
That's not what I wanted.

02:00:37.000 --> 02:00:40.000
This slide is what I selected, you demon.

02:00:40.000 --> 02:00:42.000
You should have known that.

02:00:45.000 --> 02:00:47.000
Why are they doing this?

02:00:47.000 --> 02:00:51.000
I say at the end, and this is definitely important.

02:00:51.000 --> 02:00:56.000
We are at a time point where they don't want any more people on the planet.

02:00:56.000 --> 02:01:01.000
They want this to be the kind of top of the curve of population.

02:01:02.000 --> 02:01:03.000
And they are sorry.

02:01:03.000 --> 02:01:04.000
I got to go the other way.

02:01:04.000 --> 02:01:06.000
They want this to be the top of the curve of population.

02:01:06.000 --> 02:01:17.000
And then, while we're up here, they want to invert our sovereignty to permissions so that they can start collecting data from our babies, from our children, from our grandchildren.

02:01:17.000 --> 02:01:24.000
And as they collect that lifelong data, then they're also going to try and reduce the population reproduction rate.

02:01:24.000 --> 02:01:27.000
And they're already reducing the population reproduction rate now.

02:01:27.000 --> 02:01:36.000
But as they reduce it, then the population will slowly decrease for around three generations and will get right back down to two billion people.

02:01:36.000 --> 02:01:54.000
Maybe they'll even stop all reproduction in the average person, anybody that takes their medicine, anybody that subscribes to the system just won't find a reason to reproduce or won't be very good at it or won't be able to raise a successful family except in the context of turning them over to the state.

02:01:55.000 --> 02:01:58.000
That's really the slavery that's coming.

02:01:58.000 --> 02:02:00.000
It's multi-generational warfare.

02:02:00.000 --> 02:02:13.000
It's a multi-generational plan that has come to a mini peak in our lifetimes, but is going to continue into the future if it is allowed to, because this diversity needs to be collected.

02:02:13.000 --> 02:02:23.000
The data needs to be collected because that's the data that's going to allow their AI to solve the cause, the singularity to occur.

02:02:24.000 --> 02:02:31.000
That's the whole point. The questions that they're asking are the wrong questions.

02:02:31.000 --> 02:02:49.000
You're supposed to accept the inevitability, because if you accept the inevitability, it's the same thing as accepting this new world order of pandemic RNA being possible and real and gain-of-function RNA being possible and real, which means that you're trapped in that paradigm forever.

02:02:49.000 --> 02:02:56.000
They want you trapped in that paradigm where you think that there's nothing sacred, that everything can be cracked, that everything can be modeled.

02:02:56.000 --> 02:03:00.000
Within five years or 15 years, there's going to be no more disease.

02:03:00.000 --> 02:03:18.000
They want you to think that it's okay, that it's inevitable that you're going to have to turn over your decisions, you're going to turn over your medical data, your medical decisions, you're going to turn it all over to AI because AI is going to be, as they said, millions of times smarter than you.

02:03:19.000 --> 02:03:26.000
Ladies and gentlemen, please stop all transfections in humans, because they are trying to eliminate the control group by any means necessary.

02:03:26.000 --> 02:03:35.000
And yeah, this has been Gigaohm Biological, where intramuscular injection of any combination of substances with the intent of augmenting the immune system is dumb.

02:03:35.000 --> 02:03:40.000
Transfection in healthy humans is criminally negligent and RNA cannot pandemic.

02:03:41.000 --> 02:03:46.000
Thank you very much for being around for this presentation of the independent right web.

02:03:46.000 --> 02:03:51.000
We are going to get to some mini courses soon.

02:03:51.000 --> 02:03:57.000
It's just a question of life getting in the way and me juggling a few balls to make sure I can do this daily.

02:03:57.000 --> 02:04:05.000
And so when I can, those mini courses will start and I think we'll probably make a little bit of an announcement, but maybe we won't either because we'll just start.

02:04:05.000 --> 02:04:11.000
So maybe the first one will start this week and it's going to be the mini course on prions.

02:04:11.000 --> 02:04:23.000
It is going to involve a couple of study halls that involve the dead lady from MIT because she does a few more good presentations about protein folding and prions.

02:04:23.000 --> 02:04:32.000
A little more detail and it can walk us into the understanding that's going to be required with regard to the yeast chaperone proteins.

02:04:32.000 --> 02:04:36.000
And so I am going to do two study halls with her in the coming days.

02:04:36.000 --> 02:04:41.000
And then we're going to break right into Stanley's papers about prions.

02:04:41.000 --> 02:04:50.000
Ladies and gentlemen, Gigaohm Biological, unlike PBS NewsHour, is brought to you by viewers like you, and Gigaohm Biological is made possible specifically by these people.

02:04:50.000 --> 02:05:01.000
And especially by Greg James and Steph Cohen, Stephen Cohen, who, for the last few months, and Greg, for the last couple of years on and off, have been just...

02:05:01.000 --> 02:05:12.000
It's hard to describe that kind of support because more or less those two guys for this month, for example, are going to pay our rent.

02:05:12.000 --> 02:05:18.000
And that means that the support is, yeah, it's going to get us there.

02:05:18.000 --> 02:05:26.000
It's going to get us across the finish line, which I think maybe the finish line is coming in the sense of we're almost there.

02:05:26.000 --> 02:05:32.000
We're almost at self-sufficiency. We're almost at the stage where this is really going well.

02:05:32.000 --> 02:05:43.000
And I think what's going to be fun: my wife, Fearla, is going to start live streaming yoga classes, which is not going to be anything associated with Gigaohm Biological, except I'm going to help her do it because I got a lot of the equipment.

02:05:43.000 --> 02:05:46.000
It's going to make it easier than me editing that stuff.

02:05:46.000 --> 02:05:53.000
It'll give us more time to put stuff up on Substack and more time for me to work on mini courses that I think are going to become a real big hit.

02:05:53.000 --> 02:06:05.000
Especially the mini courses I think that we do for biology, for virology, for immunology, mini courses that will be good for parents and good for high schoolers and good for everybody.

02:06:05.000 --> 02:06:11.000
And I think that's really where we go. So we'll have these little playlists on YouTube and stuff. It's going to be awesome. So I'm really excited.

02:06:12.000 --> 02:06:31.000
And also actually Star Trek is still in the works and just got postponed, but it seems to get better the longer we wait because people act more and more silly and it makes it more and more easy to tell a really elaborate story about how much we were fooled.

02:06:31.000 --> 02:06:38.000
Hey, lots of talk. Got a lot of work to do. Thank you very much and I'll see you again tomorrow.