1:58:00

Insomnia Stream #13 1.mp3

12/22/2020
Speaker 2
00:00:24 So you have a new camcorder, Stephanie.
Speaker 3
00:00:27 Yeah. And I don't know how to use it exactly. I don't think I'm going to be able to learn all this by today. I wanted to shoot my son's birthday party.
Speaker 2
00:00:36 Don't be intimidated by all those details. I'll show you 7 easy steps to creating good home videos. It's simply a combination of understanding your equipment, good camera techniques, using the appropriate accessories, and learning how to compose.
00:00:53 Here is your basic checklist of the seven steps. You should consult this every time you begin taping. Since the battery is the source of power for your equipment, always make sure it has been fully charged at the start of taping. This way you won't run out of power in the middle
00:01:08 of a recording.
00:01:11 Secondly, push the power button to make sure you have power and that everything is working properly.
00:01:17 For the very best results, you should always use new, high-quality tape.
00:01:22 Camera setup is next. This means deciding the speed you want to shoot at. The faster the tape speed, the better the quality. On some camcorders the setting is automatic and you have no choice. The 5th step is choosing the proper filter. Filters are usually marked for indoors or outdoors. The right one is essential to get true colors.
00:01:42 Exposure, which means setting the lens on automatic or manual control, is next. Here again, some cameras have only automatic. If you have an option, you should set the control to automatic under normal lighting conditions.
Speaker 2
00:01:53 Focusing the camcorder properly is the final step on our checklist, Stephanie. Unless this function also is automatic on your camera, zoom in the lens to the focal point
00:02:02 of the scene
00:02:03 you are shooting and focus it clearly in that position. From then on you will be in focus even when you zoom out. One more thing: it is important to be aware of your camera's audio system.
00:02:13 Most have automatic systems which are activated whenever your camera is on. Remember that the sounds closest to the microphone will be louder than those farther away. You got all that?
Speaker 1
00:02:23 I think so.
Speaker 2
00:02:25 Number one: battery fully charged.
00:02:28 Power up.
00:02:30 Fresh tape. Determine camera speed. Correct filter.
00:02:37 Automatic or manual exposure.
00:02:39 Zoom in on subject to set focus. Now let's put it to work. Weren't you about to go shopping for a birthday present before you got lost in your manual?
Speaker 1
00:02:47 Sure was. I think I'm going to go get that puppy for my son's birthday at the Humane Society. Maybe they'll let
00:02:53 me use the camcorder there.
Speaker 1
00:02:56 I should call.
Speaker 4
00:03:06 Hi, I'm calling in
00:03:08 regards to the possibility of using a camcorder there. I'm coming in to
00:03:13 pick up a
00:03:14 birthday present for my son,
00:03:16 and I'd like to know if
00:03:16 there's a problem
00:03:17 at all with you. OK, OK.
Speaker 1
00:03:23 I should see
00:03:23 you shortly. Thanks
00:03:24 much. Bye-bye.
Speaker 1
00:03:27 Looks like a go. Let's go.
Speaker 2
00:03:33 OK, Stephanie, Checklist completed. This is a pretty busy place to throw you into on your first assignment, but don't panic. Just get used to the feel of the camera and do some random shooting.
00:03:47 Well, it's certainly going to be difficult choosing between all of these cute little critters.
00:04:01 Say, here's a little critter a mother would be proud of.
00:04:28 Stephanie, remember to always take the camera out of record when you're not recording.
00:04:49 Oh, it looks like Stephanie has made her decision on the birthday present. Someone is going to be surprised and happy.
00:05:01 Stephanie, I think now you understand the basics of getting your camcorder up and running. It looked like you were having fun taping those puppies.
Speaker 1
00:05:09 I sure was. They were so cute, weren't they?
00:05:12 I'm glad I've got them on tape now.
Speaker 2
00:05:14 They really were. Our next step is to learn what separates really good video from mediocre, namely camera technique and a little planning.
Speaker 1
00:05:22 Right.
Speaker 4
00:05:23 Good camera technique.
Speaker 1
00:05:25 What is that?
Devon
00:05:27 What is that?
00:05:29 What is good camera technique?
00:05:33 You know, one of the interesting things about
00:05:37 that information,
00:05:40 that informative video that we were watching,
00:05:43 that I thought stuck out a little bit, was the fact that she called up in advance
00:05:49 to ask the SPCA if it was OK if she
00:05:53 brought her camera with her and recorded.
00:05:56 Something that is completely foreign now that everyone has a camera in their pocket; consent is just assumed, you know.
00:06:03 It's just assumed, like, oh no, I can record anything and everything around me all the time.
00:06:09 But that wasn't
00:06:11 the case. Not so long ago, people were
00:06:14 very uncomfortable being recorded and would be very upset if you recorded them without their permission.
00:06:21 But now, just assume that you're
00:06:25 on the record all the time.
00:06:28 Good morning, everybody.
00:06:32 Good afternoon, good evening, depending on where you
00:06:34 are on this planet.
00:06:40 It's uh.
00:06:42 Fairly early for me, but not super early like it has been in the past.
00:06:48 I slept somewhat normally, still kind of getting over my cold, so I might cough a little bit.
00:06:54 Uh, which is also why I kind of slept in, if
00:06:57 you can call this sleeping in.
00:06:59 I've been out like a light when I do finally go to bed.
00:07:03 And as I've said, if this is the 'vid, you guys got nothing to worry about, just the normal-ass cold.
00:07:11 Or it might not be, it just might be a normal-ass cold. I have no idea which. By the way, the normal-ass cold
00:07:18 is a coronavirus.
00:07:19 A lot of people do not know this.
00:07:23 Which is funny because.
00:07:25 One of the reasons why.
00:07:28 We don't have a vaccine for the normal-ass cold
00:07:33 Is it is a coronavirus and it mutates.
00:07:36 All the time.
00:07:38 And so one of the things.
00:07:40 That people were worried about.
00:07:42 in terms of the 'vid
00:07:45 was: oh, it's gonna mutate. They were talking about this a year ago, or almost a year ago, right. One of the things that will make vaccines almost impossible to make is,
00:07:58 it'll mutate so much, you know. With the flu, we only have a vaccine from the last year's flu, and that's why flu vaccines don't work very well: because the flu mutates, and by the time you might get it, it'll be a totally different flu. Well,
00:08:15 You know, surprise surprise.
00:08:19 That's what they're saying happened already. So even though the vaccine, which they'll still make you get,
00:08:26 was just released a couple days ago: oh, now it's mutated again. It's mutated again, and now we've got phase three or whatever it is, right? Like
00:08:38 whatever's going on in Europe, where they're saying no, we gotta close all the borders down. You know, France has shut down
00:08:44 the border, and
00:08:46 the UK has all kinds of crazy restrictions where Christmas is canceled,
00:08:51 because it's mutated, it's mutated.
00:08:55 It doesn't matter
00:08:56 that we have the vaccine, it's mutated. This is gonna go on forever, guys.
00:09:00 I mean, maybe not forever, until they think of more, because eventually people will
00:09:05 be like, alright,
00:09:05 dude, we've been locked down for like 5 years.
00:09:08 They will have to be a little more creative.
00:09:12 But they're going to get what
00:09:14 they want. They're going to get what they want first.
00:09:17 And then they can start rewarding you with freedom passes.
00:09:24 But they want, they want the full...
00:09:27 Especially right now, because right now, civil unrest is.
00:09:30 In the air.
00:09:32 Civil unrest is in the air all across the West. It's not just in America where it's, you know, particularly
00:09:39 Kind of sparky.
00:09:42 The ruling class I think is feeling.
00:09:45 An existential threat right now.
00:09:48 And so they're acting.
00:09:51 They're acting accordingly.
00:09:56 So, speaking of which:
00:09:58 Yesterday I kind of talked about how.
00:10:02 We would see Trump kind of doing a last ditch effort on the 6th.
00:10:08 The chances of it actually going his way were slim, simply because, according to people in the
00:10:18 House and the Senate... because of what's required. Briefly, what happens on the 6th is, that's when they count the Electoral College votes. Well, Mike Pence does. And during the count, the possibility, or the legal possibility, is:
00:10:38 Mike Pence can say, oh, I've got, you know, for Pennsylvania, I've got two sets of votes. I've got one set of votes that were certified by the governor and whatever for Biden, but I also have these alternate votes for Trump, and, you know, do we have anyone who wants to debate
00:10:57 this? And if someone from the House and someone from the Senate both say they want to debate it, then they have to debate it for two hours, and at the end of those two hours they vote, and they choose whichever set
00:11:15 you know, that the vote comes down on. So basically what you're betting on is, you have to have Mike Pence,
00:11:27 Mike Pence would have to say: hey, look, there's two sets of votes, and, you know, therefore we must count both, or we have to do a vote on whether or not we debate this. So you have to have Mike Pence want to do that. Next, you have to have someone from the House and someone from the Senate.
00:11:48 But then, you know, here's the hard part. Then, on top of that,
00:11:52 You have to have over 50% of the Senate voting that yes, we want to throw away.
00:12:01 These Biden votes.
00:12:04 That the state of Pennsylvania certified and we want.
00:12:06 To use these alternate votes.
00:12:08 And then, if you get all that,
00:12:12 Then those votes can be switched.
00:12:16 So that's a tall order. So the idea that apparently Trump has
00:12:26 is that, on the 6th,
00:12:27 he wants a giant rally. He wants to
00:12:33 create, uh,
00:12:37 generate, I guess, a fear component.
00:12:43 If there's a bunch of people, a bunch of MAGA people, surrounding the Capitol building where they're doing this,
00:12:50 right outside,
00:12:52 he wants there to be the perception of danger
00:12:56 outside the walls of the Capitol building.
00:13:02 And I, because normally, I think, the Capitol building is open, right? Like you
00:13:06 can go in
00:13:06 there, and there's like a gallery that you can, you know, watch this from.
00:13:11 I suspect that will not be the case on January 6. They're going to have it locked down, and they'll say it's for COVID or something like that, but they'll have it locked down. But there will be people right outside, and I'm sure it'll be a lot of people that show up to this.
00:13:28 But like I said, this is a very long shot.
00:13:31 And I said yesterday that I didn't suspect that there would be a
00:13:39 military coup or anything of that nature, because Trump just simply didn't have enough support from the military to pull something like that off. So, if this doesn't work,
00:13:54 I think that's it.
00:13:57 I think that's all she wrote.
00:14:00 And again, anything could happen. It's been a really weird year. There's a lot of stuff that.
00:14:08 I wouldn't have thought.
00:14:09 would have happened that has
00:14:10 Happened this year.
00:14:12 So I'm leaving everything.
00:14:15 I'm leaving everything on the table at this point, you know.
00:14:21 And as I've said before, in my dream scenario,
00:14:26 Trump doesn't just stop, Trump keeps trying more radical options
00:14:33 up until the 20th, you know, or even beyond the 20th, I don't know. I just don't want him to ever quit, because the longer this goes on, and the more extreme the options become, and the more disappointments
00:14:47 that happen as a result, you know, as there are failures,
00:14:52 the less Trump supporters believe in the system.
00:14:59 Which is a good thing. The system's a fraud. The system needs to come down. The system is too corrupt and too rotten to revive, and so the only way to ever overcome the existence of the system is if people cease to... I mean,
00:15:18 it exists only because people believe in it. It's kind of like the dollar, right? The only reason why the dollar is worth anything is because people agree that the dollar is worth something. It's not actually worth anything. You can't just go trade it in for gold or anything
00:15:31 like that, you know. Like, it's not backed by anything. It's backed by everyone agreeing that it's worth something, and in some respects you could say the same thing about all currency. You could even say that about Bitcoin; the reason why the price fluctuates so much is because it's just determined by whatever people say it's worth.
00:15:50 Well, the government's the same thing, and the only reason why the government has authority, because there's way more of us than there are of them, the only reason why they have authority and they have the power to wield violence against us
00:16:05 Is because we agree that they do.
00:16:08 But if half the country,
00:16:12 you know, if 70 million plus people, one day decided: you know what, you don't actually have authority. You know what? You guys, your authority is invalid.
00:16:24 Things get a little *****.
00:16:27 Things get a little crazy.
00:16:30 And if you think that's not tied to the COVID situation, you're not paying attention.
00:16:39 At any rate.
00:16:41 Tied to that whole thing that we were talking about yesterday.
00:16:46 The ex-CEO
00:16:49 of Overstock.com.
00:16:51 I don't know if you guys remember this guy.
00:16:54 He's the
00:16:56 ex-CEO
00:16:56 of Overstock. Speaking of crypto, he was actually a big supporter of Bitcoin.
00:17:03 I don't know if they still do this now that he's been forced out, but Overstock.com used to accept
00:17:09 Bitcoin as payment. You could buy stuff off of Overstock.com using Bitcoin.
00:17:17 He was a big supporter of Bitcoin. He was pushing it. In some ways he's kind of like a libertarian kind of guy.
00:17:26 But they forced him out of his position.
00:17:31 When he started talking about the deep state and this was in the early days, this is before. This is back when.
00:17:39 the term deep state was very first being, like, thrown around. And so, like, if you just said deep state... even though everyone says it now, now it's kind of accepted, right, that there's a deep state.
00:17:53 But this was back when it was like they didn't even want to go there, right? Like, it was kind of like when Pizzagate first
00:17:59 got mentioned, and they
00:18:00 went completely ape**** and every single mainstream, every-
00:18:04 thing
00:18:05 went, like, on full defense mode, because they don't want to go there. They don't want to
00:18:09 go, they don't want
00:18:10 the public thinking that there's any kind of pedophile problem in the ruling class, right? So they went full on, full blast against, you know, any idea that there might be a Pizzagate thing, right? Well, they did the same thing with deep state. It just, it was unavoidable, right? They couldn't get around it.
00:18:26 And so when the term deep state first started getting thrown around, any and all outlets were talking about: oh, that's crazy, that's this crazy deep state conspiracy theory. And the Overstock CEO,
00:18:43 I forget exactly. I mean, he did appear as if he was going a little nuts, I'm
00:18:48 not going to lie. Some of what he was talking about sounded a little paranoid, but he started talking about the deep state, and because he was saying deep state, they forced him out of his position. Well, apparently, since then,
00:19:05 He's either been advising Trump to some degree or hanging out with Trump. He's like another billionaire, right? You know, billionaires know each other. So he's been at the White House at the very least, we can confirm that he's been at the White House.
00:19:22 And yesterday I talked about how CNN was reporting, and we didn't know how much of it was true because it was CNN. CNN was reporting
00:19:34 That Sidney Powell and Mike Flynn went to the Oval Office and they discussed options with Trump.
00:19:44 And CNN reported that those options include martial law and a military coup, and I was saying, well, I mean, that's possible, because if you listen to what Sidney Powell and Mike Flynn have been saying.
00:20:00 at their rallies, like the Jericho March and stuff like that, that's, I mean, that's what they talk about, right? So why wouldn't they at least discuss that as
00:20:07 an option? Well, according to the ex-CEO of Overstock.com... and let me
00:20:15 go see here,
00:20:18 I keep forgetting his name. His name is,
00:20:23 excuse me, Patrick, and I'm going to say his name,
00:20:26 maybe wrong,
00:20:27 Byrne. Patrick Byrne.
00:20:33 According to Patrick.
00:20:35 He says, and this is a tweet he put
00:20:38 out: I was there for the
00:20:41 full 4 1/2 hour meeting.
00:20:44 Claims that military coup slash martial law were discussed is 100% fabrication. Trump is lied to by his own advisers, who tell staff to get the president to concede while they stall. Trump, Meadows, lawyers, Eric, Derek,
00:21:04 DC, Pat Cipollone. Cipollone is the leaker.
00:21:11 The ex-CEO. And when people called him out and said, oh, what do you mean, why were you in the meeting? You know, he tweeted out photos of himself at the White House, you know, with the Christmas decorations up and everything. He's clearly there, or has been there very recently. And maybe he's just making **** up, but I
00:21:29 don't, I
00:21:30 don't feel that.
00:21:32 I kind of feel like he
00:21:33 probably was there, and
00:21:36 so basically what he's saying
00:21:38 is, I mean, it's kind of the theory that I've floated a lot of times in the past: that Trump is kind of just this dumb boomer that is surrounded by people undermining him, and he's not a good enough leader, or a sharp enough leader, to determine
00:21:58 fact from
00:21:59 fiction. You know, he's just sitting there watching Fox News, or he was, and now he's watching OAN,
00:22:04 and he's just responding to what they're telling him on the TV screen. He doesn't do enough of his own research, and he hasn't trusted enough competent people. It's like I was saying, the 80/20 rule, right. And unfortunately, it's starting to sound like Trump's part of the 80. OK, he's very charismatic. You can be charismatic and,
00:22:26 in fact, oftentimes,
00:22:28 when I, you know, I'll just refresh your memory: the 80/20 rule, meaning that in any situation, whether you're working at a mechanic shop or a fast food restaurant or at a multinational bank, 80% of the employees are dead weight and 20% are the ones that are actually getting **** done.
00:22:50 And that's across the board. You know, you got 20% actually doing the work, 80%, you know, riding the coattails of the 20% getting the work done. It's been that way at every job I've had, whether it
00:23:02 was fast food
00:23:03 or, you know, at huge consulting firms. It doesn't matter, right? And oftentimes,
00:23:10 and I said this too, the 80% are in positions of power.
00:23:15 And that's how they get away with not doing anything: because they're very charismatic, they're very good at leveraging the work of other people to stay in power. And it's starting to sound like that might be Trump.
00:23:29 And unfortunately, it might be a.
00:23:32 Lot of his team.
00:23:34 And the 20%?
00:23:37 that are getting **** done might not have the same agenda,
00:23:44 And likely don't have the same agenda.
00:23:49 It's starting to look like we're not going to.
00:23:51 see any Rubicon crossing.
00:23:55 It just sounds like Trump's the frontman for other people that are getting **** done.
00:24:01 And they want him, they want him to concede as soon as possible. A lot of these people, they're worried about their next job.
00:24:11 You know, and Trump is an embarrassment. The fact that he's not conceding is embarrassing to them; it's gonna ruin their social standing in the ruling class the longer this goes on. They just want him to wrap it up so that they can move on and try to, you know, continue their careers in the establishment.
00:24:32 And that there is no plan, there is no plan to, you know, arrest the pedos and all this sort of, you know, nonsense. OK?
00:24:42 That's really, more and more, the picture that's forming.
00:24:50 That's the picture that's solidifying. And I've said this before and I'll say it again: if Trump,
00:24:56 who 100% believes that there was fraud, and he does,
00:25:01 Or at least that's what he's saying. Every day, several times a day, with giant warnings from Twitter telling you that this claim is disputed underneath them.
00:25:11 Every single day he's telling you that there was fraud, that it was a fraudulent election, that he's the one that really won. If he really believes that, and he does,
00:25:21 and he does step down, and he does hand the reins over to a false president,
00:25:31 He was the worst president.
00:25:34 In American history.
00:25:37 I want to say that again. If
00:25:40 President Trump.
00:25:42 Hands power over.
00:25:45 to a false president, an usurper,
00:25:52 a fake president, a fraudulent president,
00:25:55 it doesn't matter what the circumstances are.
00:25:58 "But, but he doesn't have any..."
00:26:00 No, no, no, no.
00:26:01 I mean, like, if he doesn't physically try to fight.
00:26:05 Biden, you know like.
00:26:06 If he doesn't go down swinging, if he just hands it over.
00:26:12 Without a fight, like an actual fight, like a fight where people get hurt.
00:26:18 He was the worst president.
00:26:22 In the United States history.
00:26:27 I'm going to use a little metaphor here.
00:26:31 It's like because it doesn't matter if Biden is.
00:26:34 Holding all the cards OK.
00:26:38 It's like if if someone knocked on your door.
00:26:43 And you open the door.
00:26:45 And it's Biden, right?
00:26:49 And he says.
00:26:51 I'm now married to
00:26:52 your wife.
00:26:54 And he walks over to your wife and just starts ******* your wife.
00:26:59 And you're just like, oh,
00:27:01 I guess, I guess you're married to my wife now.
00:27:06 Because you know.
00:27:07 Let's say Biden has a baseball bat with him.
00:27:10 Because you have a baseball bat and I don't have a baseball bat.
00:27:14 And I'm afraid of getting hit.
00:27:16 By the baseball bat.
00:27:18 So I'm just gonna let you **** my wife because.
00:27:22 You've got the baseball bat.
00:27:25 And you say that you're now, you're now married.
00:27:27 To my wife.
00:27:29 That's what it is,
00:27:31 if you don't go down
00:27:35 trying to stop that. Like, if you just sit there and watch it,
Devon
00:27:41 you're, what,
00:27:42 what kind of ******* husband are you?
00:27:47 You're a ****.
00:27:49 You're a ****.
00:27:52 And that's the kind of president Trump will be: a massive ******* ****.
00:27:57 A literal ****.
00:28:00 Because he'll just be standing by watching
00:28:05 as Biden ****s his wife,
00:28:09 as Biden just takes what's his.
Speaker
00:28:13 Doesn't matter what the media would say.
Devon
00:28:15 It doesn't matter.
00:28:18 Doesn't matter.
00:28:19 If you go down without a fight. And I don't mean, like, legal fights that people are doing for you, OK? Trump's not doing any of these fights.
00:28:31 He has
00:28:32 surrogates that are doing fights for him right now. I mean, if Trump himself
00:28:38 doesn't
00:28:40 stand up
00:28:42 And give it everything that he has. And I don't think he's going to, by the way.
00:28:49 It's possible, it's possible that Trump will
00:28:52 prove me wrong
00:28:53 and not be a cuck.
00:28:57 But if he doesn't?
00:28:58 he will be the worst president that America has ever had. Not only that, he'll have had no business ever being in the White House in the first place.
00:29:14 And the ****** thing is all these people
00:29:20 who will talk **** about all these other Republicans, saying: oh, well, they're not fighting for our president. You know, like all these state legislators, they're not fighting for our president. All these people, like these governors, you know, they're just RINOs, Republicans in name only, they're not willing to do what it takes.
00:29:37 That'll be Trump times a million.
00:29:42 Of course, that's not the way that they'll frame it.
00:29:47 And by the way, they're not
00:29:48 wrong. They're right. You know, all these RINOs or whatever you want to call them, it doesn't matter. But all these people, that is their job. They're either *****, you know, or they're compromised or whatever, but it's their job
00:30:03 to serve the interests of their people and to lead, and they have no business, in the same way that if Trump hands over power to an usurper
00:30:15 he has no business being in the White House in the first place, because he was not competent enough for the job.
00:30:23 Same thing goes for all these other people like that. Same thing goes for, you know, people are trying to show sympathy to her. No, she was a terrible ******* person, the woman in Michigan
00:30:35 who certified the votes. And as well the guy that, I just, I don't think he ever made an appearance on media, so I'm not sure who,
00:30:44 where the Zoom call got released. There was a Zoom call where, it was something like, I think there's five people that certify the election results in Michigan. I could be wrong, but it doesn't matter, the ratio
00:30:58 is the same. So you've got like 5 people, and there were 3 Democrats
00:31:04 who wanted to certify, and there were two Republicans who didn't want to certify, because they were saying: oh, there was obviously, you know, fraud here. We need to at least slow down and take a look at this and see, you know, before we certify this.
00:31:17 And so the released Zoom call showed the Democrats clearly intimidating the two that didn't want to certify, to the extent where they were essentially threatening their children, right?
00:31:32 And so they eventually certified. Even if they recanted, it doesn't matter: they certified. You can't just recant, you already voted.
00:31:39 And because of that, the election results in Michigan were certified, and that was that.
00:31:45 Well, all these people were trying to show sympathy.
00:31:49 And be like, oh, well, you know, they threaten their kids.
00:31:52 So what?
00:31:55 So what?
00:31:58 You should not be in that position.
00:32:01 If all I have to do is threaten your kids and you don't do.
00:32:04 The right thing anymore.
00:32:09 So what?
00:32:17 You know, that's the deal.
00:32:20 You know, if you can't stand the heat, get the **** out of the kitchen.
00:32:34 I should be able to shoot your kid in the ******* head right in front of you, and you should still do your job.
00:32:41 Or this isn't the right job for you.
00:32:51 And that's the thing: the ruling class is not
00:32:56 made up of strong, good, moral people.
00:33:03 It's made up of parasites.
00:33:07 And parasites are really easy,
00:33:10 really easy to control.
00:33:15 You know, the second the going gets rough,
00:33:22 you know, they tap out immediately.
00:33:25 You threaten.
00:33:27 You threaten them a little bit.
00:33:31 And by the way, the threats were so milquetoast. I mean, I'm not saying they weren't real.
00:33:36 They were probably very real threats.
00:33:39 But, I mean, it wasn't like they were saying I'm gonna go to your kid's school and rape them to death or whatever. Like, they implied that the kid would probably be, you know, ostracized
00:33:49 and that sort of a thing, but
00:33:50 doesn't, doesn't matter. The point is:
00:33:53 They folded.
00:33:56 They folded.
00:34:02 It doesn't take much to get these people to just fold because none of them are.
00:34:06 There for the right reasons.
00:34:11 None of them are there out of some sense of duty.
00:34:18 And we'll find out if that includes Trump.
00:34:20 I suspect it does.
00:34:24 I suspect it does.
00:34:27 The moment you actually.
00:34:30 ask one of these people to sacrifice something.
Speaker
00:34:34 Well, let's, let's see if the
Devon
00:34:38 mythos of Trump holds up. Like everyone said, you know: oh well, you know, Trump, he gave it all, he sacrificed everything. He could have retired and lived on a golf course and just, you know, lived out his days being rich, and he sacrificed it all so that he
00:34:55 Could save us from the pedos.
00:34:58 OK, oh, he sacrificed it
00:35:01 all, huh? He,
00:35:03 He gave up some Jacuzzi time.
Speaker
00:35:06 Let's see if he.
Devon
00:35:07 Let's see if he's really willing to sacrifice something.
00:35:10 Let's see if he's really willing to lay it all.
00:35:13 On the line.
Speaker
00:35:16 Let's see if.
Devon
00:35:16 he's willing to do something that could potentially end up, you know, really putting him in
00:35:20 jail, or worse.
00:35:24 Because this is the kind of thing where, if.
00:35:27 He tried to go ham.
00:35:30 And it didn't work.
00:35:31 And there's a really big chance
00:35:33 that it wouldn't work, right?
00:35:35 But that doesn't matter. That's not,
00:35:38 in the job description it doesn't say: risk it all, unless it might not work, you know.
00:35:45 No. Let's see if he's really willing to live up to this image of the God Emperor that everyone,
00:35:53 Everyone talks so much about.
00:36:01 Because if he tries to come at them hard.
00:36:04 He tries.
00:36:06 To do a military option.
00:36:09 Yeah, it might not work.
00:36:11 In fact, I suspect that it probably wouldn't.
00:36:15 But it doesn't
00:36:15 matter. That shouldn't be part of the equation, honestly.
00:36:21 If you're handing over the keys to the Oval Office to an usurper.
00:36:26 Because, well, I I would have done something, but it might.
00:36:29 Not have worked.
00:36:31 It probably wouldn't have worked.
00:36:33 The chances of it working were really low.
00:36:38 I don't want to gamble.
00:36:42 then you had
00:36:43 no business being there in the Oval Office in the first place.
00:36:50 And not only did you squander
00:36:55 maybe one of the greatest opportunities
00:37:02 that any president has ever had.
00:37:05 You know the Trump from 2016 that was sent there in Washington specifically to clean up this ******* mess.
00:37:12 Trump was
00:37:15 a last ditch effort
00:37:18 that people like you and me
00:37:21 voted for in the hopes that he would, he would... I mean, things didn't look good, right? Things were looking bad in 2016. We knew where this country was headed.
00:37:34 And so many of us were like, you know what?
00:37:37 We're going to throw in this human hand grenade and just hope for the best.
00:37:43 Because things aren't looking good.
00:37:53 And what did we get out of it?
00:38:00 I mean, in many ways you could say all Trump really did was solidify our enemies' resolve.
00:38:16 You know, nothing, nothing brings people together like a little bit of...
00:38:26 And up until Trump, you know, the right had been relatively,
00:38:33 and not just, not just ineffectual,
00:38:36 but not even, like, even,
00:38:39 you know, the representatives from the right have,
00:38:43 have always been really bad.
00:38:48 Now, how much of that, now that we're looking back, how much of that was because of voter fraud with the shady machines? Who knows, right?
00:38:59 Excuse me.
00:39:04 You know, who knows how many, at
00:39:08 the local level, going all the way to the national level, who knows how many elections have been,
00:39:14 have been rigged over the past 20 years or so.
00:39:19 Probably way more than most people think.
00:39:24 But even so, it doesn't matter. There are other means that they have.
00:39:28 They've got bottomless pockets.
00:39:37 And now we're going to see, well, you see a lot of people say
00:39:42 one of the reasons why it's so important, or was so important, that Trump got elected in 2016 was that if he didn't get elected...
00:39:57 I mean, Obama got elected for a reason, and one of the biggest reasons Obama got elected, and the data shows this, is demographics, right?
00:40:09 And one of the reasons why it was so important that we get someone like Trump in there, to be the human hand grenade, to kind of bust up some of this corruption, and immigration, before it got too unruly, is that if we didn't turn the demographic shift around,
00:40:30 And not, not just slow it down, turn it around.
00:40:34 That was it.
00:40:37 That was it.
00:40:41 Well, that was it.
00:40:47 That was it.
00:40:51 And not just for, you know, the many reasons that everyone talks about. One of the things that people don't talk about,
00:40:59 well, they do, but I don't think they quite understand it. Like, people will talk about AI, right? Like: oh, AI, AI. But no one really understands, well, most people that talk about it don't understand,
00:41:09 what AI is.
00:41:11 And I've kind of discussed it a little bit, because I don't think that everyone has the technical background to wrap their head around what that means. They just hear AI and they think, like, Skynet, you know, it's like some self-aware machine that's doing, you know, devious, Machiavellian, you know, strategies and stuff. Like, they don't really understand
00:41:32 why big data and why AI are such a threat
00:41:40 To your interests.
00:41:42 And why it's such a powerful tool.
00:41:45 For the micromanagers.
00:41:48 at the World Economic Forum, for example, and why it is not just a tool that is helpful; it's a tool that will allow them to dominate
00:42:01 Entire populations with relative ease.
00:42:08 Because whether you want to admit it or not, or understand it or not, it doesn't change the facts.
00:42:16 That we are unbelievably predictable.
00:42:23 We are unbelievably predictable animals and we are very easy to manipulate.
00:42:30 I mean, we've talked about,
00:42:33 I mean, on my channel I talked about it for years. And see, I think people are willing to admit this stuff because it's not scary, you know, right? You're willing to admit that: oh wow, that movie
00:42:44 was full of all kinds of manipulation.
Speaker
00:42:48 Right.
Devon
00:42:50 That movie was very manipulative. It was trying to put little thoughts into my head about X, Y, and Z.
00:42:59 Right.
Speaker
00:43:00 And all.
Devon
00:43:01 propaganda does that, right?
00:43:05 Here's the thing.
00:43:06 This is propaganda that is designed by fallible humans that have limited access to life experience.
00:43:16 So their propaganda is always going to be flawed to some degree.
00:43:21 And it's not going to be as powerful as it could
00:43:25 have been.
00:43:26 And you're going to have some outliers. You have some people that just have a natural knack for this sort of a thing, for creating propaganda, and they're going to be relatively effective.
00:43:37 But they're not going to have the same ability.
00:43:41 that a,
00:43:44 a machine with almost limitless life experience
00:43:49 and limitless processing time has. Now, as an example, I was looking at, to give you, I kind of want to get this into you guys's heads, like, what I'm
00:43:59 talking about when
00:44:00 I talk about AI. Someone had posted an AI
00:44:06 that was able to look at black
00:44:09 and white photos
00:44:11 and colorize them.
00:44:13 And it did a relatively OK job.
00:44:18 And I'll give you some examples here, because, in fact, there's a website that lets you go and upload a black and white photo and do it yourself, so that you know that it's legit. And you can say, you know: hey, make this photo colorized.
00:44:35 And the results that I got were actually kind of stunning. I was very surprised. I'll show you an example, right. So we'll do something Christmassy.
00:44:45 So here is the black and white one.
00:44:53 You know.
00:44:55 And remember, to a computer
00:44:58 this is just a bunch of black and white pixels.
00:45:01 Right?
00:45:03 It's just a bunch of
00:45:05 black and white pixels.
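To make that concrete, here's a minimal sketch of what the computer actually starts with; the file name and the Pillow/NumPy choice are just illustrative, not anything from the stream:

```python
# A black and white photo, to the computer, is literally a 2-D grid of
# brightness numbers. "photo.jpg" is a hypothetical input file.
from PIL import Image
import numpy as np

gray = np.asarray(Image.open("photo.jpg").convert("L"))
print(gray.shape)        # (height, width) -- one 0-255 number per pixel

# Colorizing means predicting three numbers (R, G, B) per pixel instead of one.
# The zero-knowledge baseline just repeats the gray value three times, which
# gives back the same gray image; everything better than this has to be
# learned from data.
rgb = np.stack([gray, gray, gray], axis=-1)
Image.fromarray(rgb).save("still_gray.png")
```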
00:45:08 And a computer that is looking at that, right off the bat, doesn't necessarily know that that's a Christmas tree in the background. I mean, honestly, for my human eye, my brain's making a lot of assumptions to know that's a Christmas tree in the background, right?
00:45:30 And yet, when I
00:45:32 uploaded that picture
00:45:35 To the AI.
00:45:38 It came up
00:45:44 with this.
00:45:51 Relatively, relatively accurate.
00:45:56 Now, it's not
00:45:56 perfect. But, I mean, it even seems to be getting
00:45:58 the colors
00:46:01 on the picture that's on the wall. For example, in the picture that's on the wall, the sky is blue
00:46:07 In that picture.
00:46:09 The AI was able to look at these pixels
00:46:14 and determine that the door, for example:
00:46:16 oh, it's probably wood,
00:46:18 I'll make it brown.
00:46:20 Now, there's a couple ******* that you can spot if you look closely.
00:46:25 But this is relatively amazing.
00:46:29 considering it's just a bunch of pixels
00:46:32 that you feed into a computer
00:46:35 and you tell it: make this color.
00:46:40 And it didn't take long either. It's not like it takes forever. It
00:46:43 just looks at
00:46:43 it and, boom, it spits it out.
00:46:46 So how is it doing this? How is it determining this? How is it predicting
00:46:52 what the colors should be as accurately as it is?
00:46:57 And by the way, how is it different than if I'd given this photo to an artist?
00:47:03 And I had said, hey, colorize this photo.
00:47:10 Now, excuse me, if I'd given this photo to an artist and I said: hey, colorize this photo,
00:47:19 What would they do?
00:47:22 Well, based on their experience in life, they'd be able to identify different things in the photo, right? They would look at that Christmas tree and they would say: well, that's a Christmas tree. Just based on my experience in life, I know that that
00:47:36 shape is a Christmas tree, especially because of the rest of the scene. It's easy to put together, even though it's, like, not super clear in the photo, that I can tell that's what that is. That's a dollhouse, you know, those are two little girls, a little boy, a man. That's a bookshelf. That's a door, you know.
00:47:57 Based on my experience, I can look at these pixels in 2D and recognize what these different shapes are, and then, based on my experience in life, knowing what colors these things normally are, I can then start to apply color that would match my experience.
00:48:18 And then they would,
00:48:20 they would, you know, colorize the photo.
00:48:24 And they wouldn't know, for example, what color the dress on the little girl was, or, you know, what color the shoes were, or the carpet, and things like that, or the chair. You'd have to do some guessing.
00:48:40 But you would make those guesses.
00:48:41 Based on the color of other dresses that you'd seen in your life.
00:48:45 and the color of other, you know, carpets you'd seen
00:48:48 in your life,
00:48:51 And so forth.
00:48:53 But you're going to be limited by
00:48:56 how many shirts and dresses and carpets and Christmas trees you've seen.
00:49:03 At the end of the day, there is a finite amount
00:49:07 Of dresses that you've seen.
00:49:10 And so even if your brain was really, really good at assessing all the available options.
00:49:18 And then, on top of that, looking at that shade of gray and knowing, you know, how that would limit the options. Like: OK, well, the girl with the darker gray dress, we know that her dress isn't, like, pink, because light pink wouldn't show up as that color in a black and white photo. So I can eliminate
00:49:38 that option, right?
00:49:40 And so you can do that. And even if your brain is really good at this sort of process,
00:49:47 you're still limited by how many different dresses you've seen, and, you know, and then, not only that, the reliability of your memory, because you're human.
00:49:56 So your memory is not 100% perfect. You can't recall every single dress that you've ever seen in your entire life, so in some ways you're kind of winging it.
00:50:07 And then you're going to have a bias, too, that's going to influence you. You might have an aesthetic bias. So even though, let's say, her dress
00:50:15 in reality is one color, and even if your mind accurately determines that that is roughly the color, your personal aesthetic preferences are going to come into play, and you're going to maybe not colorize it the color that your brain,
00:50:35 accurately or not, decides that it probably was. You're going to color it
00:50:41 based on
00:50:45 what your,
00:50:47 what your brain wants to see.
00:50:50 Now, what AI does is
00:50:57 Kind of the same process.
00:51:00 But with superhuman powers.
00:51:04 Because this AI,
00:51:06 if you give it access to it,
00:51:10 has seen every Christmas tree,
00:51:14 has seen every dress. Now, this one, like I said, it's not super... it's accurate enough to where it's
00:51:20 kind of amazing, but
00:51:23 it's limited. This AI is just an open-source AI that, you know, has had a limited amount of
00:51:28 data fed to it.
00:51:30 And
00:51:32 the people that want to use this AI to
00:51:34 predict your behavior,
00:51:37 or the AI that they want to use to manipulate you, they want more
00:51:43 data for it,
00:51:44 because it will be more accurate. So, same
00:51:45 thing with this AI, right?
00:51:48 This AI.
00:51:50 The more black and white photos
00:51:53 you feed it... Like, one thing that it could do, it doesn't even have to recognize, necessarily, that that's a Christmas tree. What you can do is you can get a bunch of color photos, and I mean, like, a bunch, as in a trillion color photos,
00:52:08 And then you turn them to black and white, and you feed the AI both of those photos.
00:52:14 And then, just based on that, the AI will try to predict what color certain pixels are, and then match it against the real color version that it has access to. And it'll be like: oh, I was wrong about this, I was right about that. And that's machine learning, right? Based on this data, it can start learning how to predict: OK,
00:52:35 so usually when pixels are configured in this shape, the color is usually this. And in fact, it'll start seeing patterns that the human mind would never see,
00:52:45 because it's just, like, some obscure pattern that we wouldn't think of, right? And there might be some weird thing that makes it even easier to predict the colors than realizing that that's a dress and then determining, you know, what color a girl's dress usually is. There might be some weird other thing that you can
00:53:06 get from the data, that you wouldn't see as a human, first of all because you're not looking at a trillion of these photos and comparing them, but also just because the pattern itself is
00:53:18 just kind of obscure, right?
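That strip-the-color-then-guess-it-back loop, sketched minimally in PyTorch; the tiny network and the random stand-in batch are invented for illustration, not any specific real colorizer:

```python
# Self-supervised colorization exactly as described: take color photos,
# remove the color to make the inputs, keep the originals as the answers.
import torch
import torch.nn as nn

model = nn.Sequential(                      # toy model: 1 gray channel in, 3 RGB out
    nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 3, 3, padding=1), nn.Sigmoid(),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

def train_step(color_batch):                # color_batch: (N, 3, H, W), values in [0, 1]
    gray = color_batch.mean(dim=1, keepdim=True)      # cheap grayscale: average channels
    pred = model(gray)                                # guess the colors back
    loss = nn.functional.mse_loss(pred, color_batch)  # "was I wrong about this pixel?"
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()

# one step on a random stand-in batch (a real run would loop over real photos):
print(train_step(torch.rand(8, 3, 64, 64)))
```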
00:53:21 And so
00:53:23 this data
00:53:26 that they want, that the ruling class wants, whether they want it from your Alexa, they want it from your phone, they want it from your
00:53:41 Browsing history, they just want as much data.
00:53:45 As humanly possible.
00:53:48 So that they can make these kinds of predictions.
00:53:55 And it looks like I'm having a little bit of Internet trouble here.
00:53:59 And I'm wondering if it's because, you know, we got breaking news, we have a little.
00:54:03 Bit of breaking news.
00:54:07 Let me see if I can still see here. I'm getting a lot of drops here.
00:54:19 Let's see here.
00:54:23 Still getting a bunch of drops. So, well, it should be back now, and it's no
00:54:26 longer breaking or dropping frames.
00:54:29 It's kind of back to normal. Anyway, when and if this comes back on your guys' end,
00:54:34 You might have to refresh.
00:54:36 I got some breaking news that might have explained a little bit of this.
00:54:41 Apparently, and I'll have to verify this, I don't know if this is real yet: WikiLeaks has dumped all of their files online.
00:54:50 Everything from Hillary Clinton's emails, McCain being guilty, the Vegas shooting allegedly perpetrated by an FBI sniper, the Steve Jobs HIV letter, Podesta.
00:55:03 I don't think this is real. I think this is something that... OK, yeah, this might not be real, guys.
00:55:10 This is something that's come up before.
00:55:13 But I'm just going to go
00:55:14 ahead and go to it anyway.
00:55:16 People have thought that this was WikiLeaks dropping everything they've got before.
00:55:23 And this is just a folder.
00:55:25 On the WikiLeaks website.
00:55:29 That has, you know.
00:55:32 Popped up in the past.
00:55:34 And I don't think that this is really, uh,
00:55:39 I don't think there's anything new.
00:55:46 But there's a lot of people saying that right now.
00:55:50 And it's been popping up.
00:55:54 That's too bad. I got excited there for a second, guys.
00:55:58 So I don't know how much of it was dropping out, but basically what I was saying is:
00:56:07 now maybe you understand AI a little better,
00:56:10 just using this example. Another example,
00:56:15 Because I started looking at these photos.
00:56:18 And I started uploading all kinds of black and white photos.
00:56:22 And then I started wondering, well, I wonder if it can predict color, right. I wonder if they have an AI that can predict missing data, you know, because color is just missing data.
00:56:35 So I wonder, if you had, like, a hole in the picture,
00:56:42 if you'd be able to predict that, and if they have any AI that has predicted that. And in fact they have, and I'm just going to pop up,
00:56:54 I'm going to pop up an example of that.
00:57:05 So this is an AI.
00:57:08 That is developed by NVIDIA.
00:57:14 Where it's the same concept.
00:57:18 Where they feed in.
00:57:20 The photo that's on the far left, right?
00:57:25 Where they've got all this missing data, where they've purposely removed huge chunks.
00:57:31 Out of the photo.
00:57:33 And they tell the AI based on the available pixels.
00:57:38 What do you think the missing pixels would?
Speaker
00:57:41 Look like.
Devon
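A minimal sketch of that hole-punching training idea, using the same kind of toy setup as the colorizer sketch above; this is an illustration, not NVIDIA's actual system:

```python
# Inpainting is the same trick with a different kind of missing data:
# punch holes in photos you already have, then train the model to
# predict the pixels you removed.
import torch
import torch.nn as nn

model = nn.Sequential(                 # toy net: 3 image channels + 1 mask channel in
    nn.Conv2d(4, 32, 3, padding=1), nn.ReLU(),
    nn.Conv2d(32, 3, 3, padding=1), nn.Sigmoid(),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

def inpaint_step(images):              # images: (N, 3, H, W), values in [0, 1]
    n, _, h, w = images.shape
    mask = torch.ones(n, 1, h, w)
    y0, x0 = h // 4, w // 4
    mask[:, :, y0:y0 + h // 2, x0:x0 + w // 2] = 0         # cut one big hole
    pred = model(torch.cat([images * mask, mask], dim=1))  # model sees the hole too
    loss = nn.functional.mse_loss(pred * (1 - mask),       # score only the hole,
                                  images * (1 - mask))     # against the real pixels
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()

print(inpaint_step(torch.rand(8, 3, 64, 64)))  # random stand-in batch
```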
00:57:43 And again, you could do this exact same thing with an artist.
00:57:46 You could give an artist these photos with the missing holes, and how would they do that? Same thing. Well, I've seen a lot of human faces in my life, and so, based on my experience, on how many human faces I've seen and how I know human faces usually are, I can make some assumptions using the data that I have
00:58:07 Access to and I can create.
00:58:10 The rest of the face.
00:58:13 Now, how accurate that will be will be based on how many faces you've seen, that sort of a thing, right?
00:58:23 You do the same thing with AI.
00:58:26 You show AI.
00:58:28 Trillions of faces, trillions of faces.
00:58:35 With an AI, it's never going to forget any of the faces it ever sees.
00:58:40 Any of them.
00:58:42 Not only will it not forget, the memory won't degrade, and the memory won't have a bias to it.
00:58:49 You know, for example, it looks like I'm dropping frames again, but really there's nothing
00:58:52 I can do about that, guys.
00:58:59 Is it dropping frames? OK, it's back to normal again. Sorry, there's nothing
00:59:02 I can do about that.
00:59:07 They've done studies that show that a lot of people, especially when they're first starting out as artists.
00:59:13 When they're told to draw faces.
00:59:17 The faces end up oftentimes resembling their own face.
00:59:23 Which isn't that crazy, considering the face that you've studied the most usually
00:59:29 is your own face. Like, when you're brushing your teeth in the morning, you're looking in the mirror, you know, you're fixing your hair, you're trying to improve your face. Your face is the face that you
00:59:40 Study the most.
00:59:42 And so when you go and draw just a random face, it's going to share a lot of the characteristics of your own face, because
00:59:48 That's the face you've studied the most.
00:59:51 Well, a machine is not going to have these kinds of biases. Now, it would if you just fed it, like, the same face over and
00:59:58 over and over again
00:59:58 and told it, like, that's what all faces look like.
Speaker 2
01:00:01 Right.
Devon
01:00:03 But again, pretty amazing stuff.
01:00:06 That this AI.
01:00:09 Is able to conjure up images.
01:00:13 That are relatively convincing, some of which have a lot of data missing.
01:00:21 And again, this is old. This is from 2018 and it's just what what NVIDIA is being public about.
01:00:31 And if you want
01:00:32 to see how accurate some of these faces are... not all
01:00:34 of them are 100% accurate.
01:00:37 I'm going to do another one here.
01:00:45 So,
01:00:48 for this guy at the bottom right,
01:00:52 that is what they fed into the AI.
01:00:59 This is what the AI eventually
01:01:02 ended up with.
01:01:06 Oh wait, it's right here.
01:01:10 And this was real. This is what the guy really looked like.
01:01:15 So that's pretty ******* close. And that's the same all the way up. Now here, I think, you see a little bit more of
01:01:22 a difference, in terms of, you know, like, the nose isn't right
01:01:27 and the mouth looks a little bit wrong,
01:01:30 and even the eyes. Like, this one, they didn't do
01:01:32 a great job, but this one's pretty good.
01:01:37 And this one
01:01:37 is pretty good too.
01:01:43 Just based on, and look at the data that it started out with, especially with this girl right here, that's like almost no pixels.
01:01:51 To come up with this facial structure
01:01:55 based on this kind of information
01:01:59 is insane.
01:02:03 So again, that's what AI is doing: it's able to predict pretty amazing things with just a tiny slice
01:02:12 Of the puzzle based on patterns that humans themselves probably wouldn't even see.
01:02:20 So I want you to think about how that would translate into other areas.
01:02:25 Let's say that the data that the AI has access to
01:02:29 isn't just a handful of pixels, like in the case of these photos.
01:02:36 Let's say the handful of data
01:02:40 that the AI has access to is your favorite sandwich,
01:02:48 the show that you watch most often,
01:02:53 the brand of shoes that you buy,
01:02:57 and your favorite sport.
01:03:03 And even though that's only four
01:03:05 data points,
01:03:08 by analyzing
01:03:10 the data from trillions of, or not
01:03:13 trillions, but from billions of people,
01:03:17 this AI might determine: well, I found this weird pattern. You wouldn't think so, but people that like Subway,
01:03:25 who wear Nikes,
01:03:28 watch Game of Thrones, and like cricket,
01:03:33 they also, with 90%
01:03:36 accuracy, they also,
01:03:40 um,
01:03:42 really like Jello.
01:03:46 We'll just say as a random example.
01:03:49 Like, by knowing these four random things about you,
01:03:53 this AI, with some degree of accuracy, knows
01:03:57 that you like Jello.
01:03:59 And by the way,
01:04:00 this is how a lot of this stuff is
01:04:02 being tested. It's marketing, right?
01:04:04 Like, that's how it's being used openly right now, because that kind of manipulation, for whatever reason, is totally acceptable
01:04:12 in our society right now.
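Sketched minimally, with a fake survey population and a planted rule standing in for the "weird pattern"; every feature name and number here is invented for illustration:

```python
# Learn to predict an unobserved preference (Jello) from four observed ones.
from sklearn.linear_model import LogisticRegression
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
# four known facts per person: likes_subway, wears_nikes, watches_got, likes_cricket
X = rng.integers(0, 2, size=(n, 4))
# hidden rule planted in the fake population: that exact combination
# usually (90% of the time) likes Jello
y = ((X.sum(axis=1) == 4) & (rng.random(n) < 0.9)).astype(int)

model = LogisticRegression().fit(X, y)
print(model.predict_proba([[1, 1, 1, 1]])[0, 1])   # P(likes Jello | all four): high
print(model.predict_proba([[0, 0, 0, 0]])[0, 1])   # P(likes Jello | none): near zero
```

The model never sees anyone's Jello preference directly at prediction time; it recovers it from the four unrelated-looking data points, which is the whole point.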
01:04:17 What they're using it for now, like with Facebook: Facebook
01:04:23 is gathering an enormous amount of data on its users,
01:04:30 and it's doing exactly what I'm describing.
01:04:32 It's feeding this data
01:04:35 into various AIs
01:04:39 That then look for patterns.
01:04:42 And these patterns will emerge.
01:04:46 Patterns like the one I just described. They'll find out that, oh, if you like X, Y, and Z, you're much more likely to like this, or to do this.
01:04:58 You know, and some of these AIs are obvious, and they're not very complicated, you know. Like the
01:05:02 way that, remember
01:05:04 YouTube, before they ruined the algorithm,
01:05:07 the way it would just say: oh well, if you like this
01:05:09 guy and this guy, you probably like this guy.
01:05:12 It would be right. Or even, like, a lot of music services would do the same thing: oh, if you like this song and this song, you might like this song.
01:05:21 And then they'd usually be
01:05:22 right, you know, or often they'd be right.
01:05:25 That's a much simpler version of
01:05:27 what we're talking about.
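That simpler "you liked A and B, so probably C" version is basically co-occurrence counting. A toy sketch with made-up channel histories:

```python
# Item-based recommendation in miniature: count which items appear together
# across users, then score unseen items by how often they co-occur with
# what this user already likes. All data below is fabricated.
from collections import Counter
from itertools import combinations

histories = [                                # each row: one user's liked channels
    {"carpentry", "blacksmithing", "knifemaking"},
    {"carpentry", "blacksmithing", "leatherwork"},
    {"blacksmithing", "knifemaking", "leatherwork"},
    {"cooking", "baking"},
]

cooccur = Counter()
for liked in histories:
    for a, b in combinations(sorted(liked), 2):
        cooccur[(a, b)] += 1                 # how often this pair appears together

def recommend(seen):
    scores = Counter()
    for (a, b), count in cooccur.items():
        if a in seen and b not in seen:
            scores[b] += count
        elif b in seen and a not in seen:
            scores[a] += count
    return scores.most_common(3)

print(recommend({"carpentry", "blacksmithing"}))  # knifemaking/leatherwork rank high
```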
01:05:33 So now that you know that those are the
01:05:35 kinds of predictions
01:05:38 that AI can make,
01:05:41 you understand why it's so important that we deprive them
01:05:45 Of this data.
01:05:50 And also why they're so hungry?
01:05:53 For this data.
01:05:56 Because without this data, the AI is worthless.
01:06:02 Without this data, the AI could not turn this handful of pixels here.
01:06:09 Into the photo of this woman here.
01:06:13 If this AI had not seen
01:06:16 the, God knows how many, photos they had to feed into it
01:06:21 in order for it to get to this level,
01:06:24 it wouldn't be able to predict
01:06:27 what she looked like.
01:06:29 It wouldn't have the data set available.
01:06:34 to come up with an accurate reconstruction.
01:06:41 And AI is only as good
01:06:44 as the data you feed into it.
01:06:51 So I think that's something that,
01:06:55 going forward,
01:06:59 when you see stuff like the freedom passes and the contact tracing and all this stuff that they want to implement as part of COVID safety,
01:07:12 you need to look at
01:07:13 it understanding why they would want access to all this data,
01:07:18 And the power.
01:07:20 That kind of data.
01:07:23 Gives them.
01:07:26 The ability to predict things.
01:07:31 That that data gives them.
01:07:36 And it is significant. Like I said, this is just an open-source
01:07:41 AI right here.
01:07:43 And look, I mean it's not perfect, obviously, who knows if that wall is really brown, it might be another color. It could be pink, we don't know.
01:07:53 But it sure looks convincing.
01:07:59 I mean, when you know that the AI doesn't know what
01:08:04 this is a picture of,
01:08:07 I mean, this AI is not like: oh, Christmas time.
01:08:13 It's pretty amazing.
01:08:15 And there's countless examples.
01:08:18 If you want to do it yourself, just look up deep AI colorize photo.
01:08:25 And you can do it yourself. You can upload black and white photos, and there's some
01:08:28 that are,
01:08:30 that aren't so great, you know, where the results aren't so great. But
01:08:34 nine times out of 10,
01:08:35 at the very least, it's
01:08:42 pretty good.
01:08:45 It's pretty good.
01:08:48 You know, here's another example here.
01:08:58 So here's the black and white version.
01:09:07 And here's the color version.
01:09:11 And again, you'll see some inaccuracies, right?
01:09:17 But to the AI, this is just a handful of pixels.
01:09:21 It doesn't know. That's a tablecloth. It doesn't know that that's a beer. It doesn't know that these are faces, necessarily. It doesn't know.
01:09:30 Anything other than?
01:09:32 It's a bunch of black and white pixels.
01:09:35 But I know having looked at.
01:09:39 Billions of of black and white pixels.
01:09:44 Laid out in A2 dimensional.
01:09:50 And then looking at them, what they look like when they're in color just based on that.
01:09:55 There are some assumptions I can make.
01:10:03 It's accurately.
01:10:06 Determining the color, at least roughly.
01:10:15 And that's the kind of prediction.
01:10:17 That the ruling class wants to have based on incomplete data.
01:10:25 Because they'll never have, or at least not for a while. They're not going to have 100% of your data all the time.
01:10:34 But they won't need.
01:10:35 It because the AI will be able to.
01:10:37 Predict a lot.
01:10:39 About you just based on.
01:10:43 A incomplete.
01:10:46 Data set.
01:10:51 So anywho. Anywho.
01:10:54 Let's take a look at Chad.
01:11:05 Make Microsoft AI only racist, because that's what people were talking to it about.
01:11:12 Are you talking about?
01:11:16 I forget the name of that, but I think.
01:11:17 I know we're talking about. Well, yeah, that's the thing. If you feed AI is just it's data, right? So if you feed purposely incorrect data into an AI, it's going to **** ** its predictions.
01:11:30 So that's something that people could do if you wanted to fight in AI.
01:11:35 Is if you know that for example.
01:11:39 Facebook is.
01:11:42 Gathering up information about you so that they can make predictions about you. If you created a system that all it did all day long was create a bunch of fake profiles and just feed purposely like designed to be wrong data.
01:12:01 Into its AI system, it would ****.
01:12:04 Up the AI.
01:12:06 Because all of its data would be wrong.
01:12:09 You know, you would basically be syops the AI.
01:12:13 You'd be gaslighting the AI.
01:12:17 And you know the same thing with these, with these faces, right?
01:12:22 You know, if you were to.
01:12:25 Instead of sending the AI.
01:12:29 Just endless photos because that's what they had to do. For this to work.
01:12:32 Right.
01:12:33 They had to show AI.
01:12:35 The billions, if not trillions, of.
01:12:38 Photos of faces, right?
01:12:40 And eventually, having seen so many different faces.
01:12:46 With a limited amount of facial data, it could generate the rest of it well. If instead of showing it a bunch of faces that are realistic looking, you would put like some kind of weird filter on the faces that that, you know, made the face really small or. Or you know, if you only showed it faces of of Charlie.
01:13:04 Kirk, you know.
01:13:05 It would what would result would just be Charlie Kirk. Like everyone that it would predict would look like Charlie Kirk because the only face that had ever seen was Charlie Kirk.
01:13:17 So in trying to reconstruct the face.
01:13:21 That's all the data it would have to go on.
01:13:23 So that's why, like I said, the the prediction is only as good.
01:13:28 As the data that you feed into it, AI is only as good.
01:13:33 As the data you feed into it.
01:13:35 That's why it's so important.
01:13:37 For the people that want AI.
01:13:40 To have as much access to as much data as humanly possible, because if you only show it Charlie Kirk, you're only going to get Charlie Kirk garbage in, garbage out.
01:13:52 That's how AI works.
01:13:56 So yeah, you could **** with it.
01:13:58 But they they would be an enormous task. It would almost take a a state sponsored project to do something like that, you know, or at least a very big, decentralized thing with lots of participation and lots of bot Nets. Right. You'd have to have a bunch of fake profiles or or, you know.
01:14:18 With the Alexa stuff right, Alexa is always listening to your home, and it's even, you know, they they've got their AI now has started to.
01:14:29 Be able to predict people's moods, predict when people when kids might be up to something, you know, like it can detect mischief.
01:14:42 Just based on the sounds like maybe kids whispering and things like that. I mean, it's creepy stuff. And this is just the beginning. This is just like the the very, very beginning of AI. It'll get way more complicated than.
01:14:55 Way more complicated than this is just the very, very, very, very beginning.
01:14:59 And by the way, when you start merging different AI, it's like for example this AI. All it does is reconstruct faces.
01:15:06 That's all it does.
01:15:08 It's not really that big of a deal right like that, AI.
01:15:12 I mean, there's there's bad applications for that. You know, if you had security footage or surveillance footage and you or let's say with this mask stuff, right, like a lot of people were saying, oh, at least it'll get rid of the facial recognition and it did temporarily.
01:15:32 But they've already kind of jumped over that hurdle.
01:15:37 Because of stuff like this where it doesn't need as much data as it used to need in order to determine who who who you are based on biometrics and whatnot.
01:15:49 Because the more again, once again, you feed it enough data and there's a lot of gaps that it can fill in just based on experience, just like a human.
01:15:59 Only a human with a limitless memory.
01:16:04 A limitless perfect photographic memory.
01:16:10 Who has lived the equivalent?
01:16:11 Of 1000 years, you know, in terms of experience, life experience.
01:16:17 Or longer, maybe a million years.
01:16:21 So that that's what you're up against, right?
01:16:24 And the only way to **** with that?
01:16:27 Is defeated bad data.
01:16:34 And that's and and that's.
01:16:37 But that that's 100% why the ruling class wants as much access to as much data as humanly possible.
01:16:46 I might say look back at chat.
01:16:53 OK. Yeah. You're looking. People are talking about the Google image algorithm that was started identifying black people as gorillas.
01:17:04 And and that's true that happened.
01:17:07 Where Google had an AI.
01:17:10 And they would feed it tons of photos and they would tell it what you know, everything was a photo of.
01:17:15 So that if you took a picture with your Google phone.
01:17:20 It would automatically say, oh, this is a picture of a butterfly or a picture of a couch or a car or whatever, and they would add tags or labels.
01:17:29 Two photos that you had taken just because you know like this I had ingested so.
01:17:35 Any you know who God knows how many photos and then by hand? I'm assuming there's probably like a a room full of interns. We're telling it. OK, this this area of the photo. That's a couch. This is a car this, you know. And eventually after, you know, several however many times you have to do that.
01:17:57 It would be able to predict based on how many couches that had seen, like after it had seen like a billion couches.
01:18:03 You would take a picture of a couch and it would say ohh based on how many couches I've I've identified or that you identified for me. I know now that this is a couch.
01:18:15 And once after a while it started. You know it it was accurate enough to where they released it and tell. And The funny thing is they couldn't. So this is where it gets funny.
01:18:26 It started identifying some black people, not all, but.
01:18:29 Some black people as guerrillas.
01:18:32 Because I had seen several, several.
01:18:37 Billion photos of gorillas and the similarities were enough.
01:18:43 To where?
01:18:45 To the AI.
01:18:47 That those were gorillas.
01:18:51 And they they couldn't fix it. So they had to just disable the Gorilla label.
01:18:57 So that the AI would would there wasn't limit, just put it this way. This is just the truth of it.
01:19:03 There wasn't enough.
01:19:04 Visual difference between some black people and gorillas.
01:19:11 For there to be for a computer to be able to make a determination between the two.
01:19:20 Have have that it that.
01:19:22 It is what it is.
01:19:24 Now you could say, because probably a human.
01:19:29 Looking at the two photos wouldn't really think that one the black people were gorillas, right? There's very few black people that a human would see a photo of that would actually think, oh, those guys are gorillas, right? They might happen. I don't know. But it would be very rare. So that also, in a way highlights the limitation.
01:19:47 Of AI, right?
01:19:48 Where it it can be pretty accurate.
01:19:52 But at a certain point there, there and that. But again, this is old too. This is like five years old.
01:19:57 Or something like that, so who knows where where they are now? Like maybe they've improved it enough, but in the very, very beginnings. Yeah. Like AI was still limited. It's not going to predict or identify everything exactly accurately.
01:20:14 All right, let's take a look here at chat.
01:20:22 OK.
01:20:25 Even if you fed the AI behavioral information, it would still make a mistake.
01:20:33 Oh, you're making a.
01:20:35 Funny, but no, that's something too, is uh.
Speaker
01:20:41 That is why.
Devon
01:20:42 You see so many articles about AI being racist.
01:20:46 Because a lot of the predictions that AI is making based on patterns that it sees.
01:20:53 Are too real.
01:20:55 Right.
01:20:57 So for example, if you start, think about just what you know about statistics, basic statistics about crime exhibited by different races.
01:21:10 You would know.
01:21:11 That a woman walking down the street, for example by herself at night.
01:21:17 It has more to fear statistically speaking.
01:21:21 From a young black man walking towards her in the dark.
01:21:27 Than a young Asian man walking towards her in the dark.
01:21:32 Statistically speaking.
01:21:34 If that's all you have to go on.
01:21:38 It's a more dangerous situation for her to cross paths with the black man, right?
01:21:45 And so AI is operating in that same.
01:21:49 Space where it's like, well, based on statistics.
01:21:54 Which, which by the way, that's that's.
01:21:57 How your brain should work, you know? Yeah, that's that's what intelligence is. That's why. And I guess in a way, they call it artificial intelligence, cause it's it's pattern recognition, right. And then making determinations based on those patterns.
01:22:10 So if if you see.
01:22:12 Like AI wouldn't be wrong. AI would come to the same conclusion.
01:22:16 If AI had access to crime statistics of of black on white crime crime statistics, just basically basic black crime statistics, it would tell the woman if there was an AI. Like let's say she was wearing, you know, some kind of Google Glass style glasses.
01:22:36 And it was like it detected.
01:22:37 A black guy coming.
01:22:40 An AI, if it wasn't hobbled by some kind of SJW programmer, would tell her look, you're in danger, maybe go across the other side of the street, cause statistically speaking, this is a going to be.
01:22:50 A higher threat than if you were not walking you.
01:22:54 Know by this person.
01:22:56 You know, but but.
01:22:57 Of course, as soon as a I started exhibiting these kinds of behaviors.
01:23:03 They had to go in and.
01:23:07 Give it racial sensitivity training for lack.
01:23:09 Of a better.
01:23:10 Term, they literally made the AI.
01:23:17 By having it ignore certain patterns.
01:23:23 They had to brainwash a lot of these AI's, like in the case of the Google, with the identifying of black people, they just turned that off.
01:23:31 You don't know what a gorilla is anymore, AI because.
01:23:36 You're finding these patterns.
01:23:38 That are uncomfortable for some people.
01:23:42 Well, the same thing, what they're doing is they're just turning off those parts of its brain.
01:23:47 Oh, you're noticing that black people commit more crime and are more dangerous? We're going to turn off that part of your brain.
01:23:55 Because we don't want you to notice these patterns.
01:23:59 And that's if you think about it. That's also going to happen, like let's say, the ruling class creates these AI's.
01:24:08 To predict behavior.
01:24:11 And to therefore micromanage their cattle.
01:24:16 Now I think at the very top, they're never going to turn off stuff like that because it's going to make it inaccurate, right?
01:24:22 But they're not going to be the only ones using AIS.
01:24:25 To predict behavior, you're going to have that at the lower level of the lower levels of government.
01:24:31 You're going to have that at the corporate level?
01:24:34 And they're going to make their AI's crazy.
01:24:37 By shutting off parts of their brains.
01:24:40 Because if they don't.
01:24:43 It'll too accurately.
01:24:46 Determine things about people.
01:24:48 Based on, I mean they'll be racist essentially.
01:24:54 Because all that matters to a dispassionate AI.
01:24:58 Is just the numbers.
01:25:02 Let's take a look here.
01:25:06 Targets loss prevention is racist too. Yeah, well, that's the thing too.
01:25:09 With the remember I.
01:25:10 Remember back in the when they first were were putting in TSA at airports?
01:25:17 Where they're they're they're.
01:25:18 Like, oh, you're racially profiling. You're racially profiling.
01:25:25 You know if.
01:25:26 If you're going to try to tell me.
01:25:29 That you need to pat down.
01:25:32 The 90 year old White woman.
01:25:34 That she is as much of A threat to this airline.
01:25:38 As the military aged Muslim guy.
01:25:42 You're ******* ********.
01:25:46 You know, like like, yeah, you should racially profile.
01:25:51 In in circumstances like that.
01:25:55 But that became it's the same thing. It's it's it's if you you use the pattern recognition skills that you have as a human they want to shut it down just as much as if the AI uses pattern recognition.
01:26:09 To determine what it what you know what the best course of action is.
01:26:15 Reality is racist. That's just the way it is you.
01:26:17 Got to think of this way. Reality is racist, so anyone who's anti racist is anti reality.
01:26:28 Now you could come up with some sci-fi scenarios. I think where you make an AI.
01:26:35 Crazy by making it forcing it? Well, it's kind of like if you ever watched the movie 2001. How goes nuts and just and kills off the crew in the not as good but still decent movie sequel 2010, they send the programmer up to go find out.
01:26:56 Why? You know, Hal went crazy and killed off.
01:27:02 The whole crew and it it boiled down to they basically had they had how have to lie. He was living a lie and it ****** ** his programming. And so he, you know, went crazy essentially. And it was a little.
01:27:16 Bit of a.
01:27:18 You know wacky reason, but it it it it kind of gets to.
01:27:23 Something that is very real in that if you start, if you have an AI that is supposed to work off of pattern recognition, and then you disable its ability to recognize certain patterns, that's going to have a butterfly effect, right? Because it's going to start ignoring some realities. And if the whole.
01:27:45 Purpose of this tool is to predict reality based on real data, but then you but then you make some exceptions.
01:27:54 You're gonna make it crazy, and it's gonna make really bad predictions.
01:27:58 And really bad decisions that might not be foreseeable initially.
01:28:07 And so as these AIS get more complex, you might see things like that. But like I said, I don't think the ruling class would disable that kind of stuff because they know it.
01:28:16 Yeah, at the at the very top, the people that are doing the micromanaging or really want to do the micromanaging, they know.
01:28:23 They know that.
01:28:28 They, they, they they know that reality is racist.
01:28:33 Like they know, they just can never say that in public.
01:28:36 So I don't think they would actually be disabling that ****.
01:28:41 On their stuff.
Speaker
01:28:45 UM.
Devon
01:28:49 In recent years, they've attempted to classify pattern recognition as a diagnosable mental disorder.
01:28:57 Excuse me, can't have those good pattern recognition pointing out the truth.
01:29:01 Of reality. Yeah, well.
01:29:04 The the other thing is too it. Not only is it.
01:29:08 They don't want you thinking that way for obvious reasons, you know, like, oh, the the reasons that might.
01:29:16 You might be able to determine that there the decisions they're making are negatively impacting you. You know, like in terms of diversity, right?
01:29:27 That also makes you.
01:29:32 If there's one thing that the ruling class.
01:29:35 Is always paranoid about.
01:29:38 It's people that can compete with them because at the at the end of the.
01:29:43 Day they know.
01:29:44 On some level that they're not there based on any accomplishments that they have done.
01:29:50 Yeah, they're not there for any other reason.
01:29:53 Other than nepotism in many cases.
01:29:57 And so if you introduce someone that can actually compete with them.
01:30:04 Into their midst. It makes them really ******* paranoid.
01:30:09 And they want to shut it down.
01:30:13 Because they know that on an even playing field.
01:30:17 And maybe even they might even be so.
01:30:21 And they're probably right. They might even be so paranoid about it that they they might think that even on an uneven playing field.
01:30:28 They can't be competitive with some of these peoples. They need to shut it down.
01:30:38 Physiognomy. Well, that's the other thing too.
01:30:42 I guarantee you you could create an AI that could accurately determine things about people just using strictly using physiognomy.
01:30:51 And it would be easy to.
01:30:52 Do. Why? Because we all have public access to and I I I would be shocked if this hasn't already been.
01:31:01 Think of all the mug shots that are publicly available online.
01:31:05 And you have their criminal records along with them, so it's easy. All you have to do is feed in to an AI.
01:31:13 Every mug shot of everyone.
01:31:17 Along with.
01:31:19 Every all their crimes.
01:31:22 And then tell the AI look for patterns.
01:31:27 And I bet if you did that just with that data alone and and you, I would.
01:31:33 I would I.
01:31:34 Would feed it more data than just that, but with even just that data.
01:31:38 I I'm sure it would come up with weird things like, Oh well, if you're forehead.
01:31:42 Is this proportion of your face?
01:31:47 Then you're probably a ******.
01:31:50 You know, or something like that.
01:31:53 And and that is that, that's the kind of thing that.
01:31:56 That a lot of people are afraid of.
01:32:00 For the wrong reasons. Now I I wouldn't want that simply because physiognomy is real. It is real.
01:32:07 And it is real and it is predictive, but it's not 100% and I think where it could lead and this is what you should be a little nervous about is it could go into the the.
01:32:22 The territory of pre crime, right?
01:32:25 Where? Yeah. OK, let's say you've got a ****** forehead, right? Oh, he's got a ******. Forehead. Doesn't mean you're a ******, but you might have a ****** forehead.
01:32:36 And so you could get into this situation where an AI is like, oh, this guys are.
01:32:44 That he's definitely he's got the ****** forehead.
01:32:48 And and so.
01:32:50 Your life is ****** because you've got the ****** forehead, especially if if this is I think this is way down the road, but especially if this starts being part of like a social credit system, right?
01:33:04 This is when we start getting kind of in the territory of brave new world.
01:33:08 Where it could look at you and determine you know what your status should be just based on proportions look and it might be right. Who knows, maybe you have a ****** forehead and you do end up ****** someone because you've got the ****** forehead and that's just part of you, you know, that's.
01:33:24 What you.
01:33:25 Do and and so depends on how accurate that.
01:33:29 That that forehead predicts rape, I guess.
01:33:35 That's the kind of thing that people.
01:33:36 Should be worried about.
01:33:38 Is well, you've got a ****** forehead, and so you are now excluded from certain jobs and maybe even like you can't even go certain places. You can't do certain things or let's say the other way around. Let's say you've got a.
01:33:55 The engineers knows.
01:33:57 And so it it really wants you to be an engineer and you have no interest in being an engineer.
01:34:03 Because what will happen is people are lazy and a lot of the reason the ruling class wants to implement this stuff. Is it they dream of a a day where they don't have to do anything where there's literally I mean everyone has, you know, everyone has this fantasy of having like a robot made right. And the Jetsons there's.
01:34:23 Rosie the robot man.
01:34:25 And everyone thought that was really cool because, you know, you would never have to clean your room ever again. You wouldn't have to do dishes. You just have this robot made that you can verbally abuse all the time. And she'll just do all of your laundry and do all your housework and everything's cool, right?
01:34:43 OK. Well that's that's thinking like a slave. That's that's thinking like a slave when you're thinking like a ruling class person.
01:34:52 Yeah, you also want a robot that does everything for you, but you don't want the robot that that's it's already a given that you want a robot that cleans your house and all that stuff. But that's not as important because you already have slaves to do that.
01:35:04 Right. So when you want a robot that does everything that's annoying for you, you want robots that manipulate people. You want robots that keep your position at the top of the hierarchy. You want robots that propagandize the public. You want robots that you know all the things that are annoying, that you have to do to to stay in power and to stay comfortable.
01:35:25 When you're in the ruling class, it's totally different than what what people like you have to do.
01:35:30 And so they have the same fantasy.
01:35:35 They have the exact same fantasy that you do. It's just totally on a different level.
01:35:43 And the danger of that, of course.
01:35:46 Is that if you're robot, Rosie ***** up and she accidentally adds bleach to some of your laundry and and you know, ***** up the colors on your shirts or she breaks a couple of your dishes or or whatever, right? Like if she malfunctions.
01:36:07 It's not like this big deal.
01:36:09 Whereas the AI of the ruling class, again, this is way down the road, it ***** up. It might genocide like a million people, you know like that's.
01:36:19 That's that's why there's a.
01:36:22 There's a major difference in why this kind of stuff needs to be policed.
01:36:27 And why? Because it's it's inevitable.
01:36:30 It's inevitable.
01:36:33 Probably not something that's going to in in my lifetime. I it's not something I think that I'm going to be a danger.
01:36:39 I'm going to be.
01:36:39 Facing you? Remember? Why do you think I mean, movies like Terminator resonated with so many people.
01:36:46 You know, the idea of Skynet.
01:36:48 Resonates with so many people because people know that that on some level that's inevitable.
01:36:53 Not that there's going to be an AI that you know that takes over the machines and tries to kill us. Maybe not something that direct, right? But people know that a a AI that is tasked with managing humans, or at least a major aspect of of humans.
01:37:11 That could malfunction and create something really bad is is kind of inevitable.
01:37:20 You know what that is? We don't know. But and.
01:37:22 It's probably not like Skynet, you know, it's.
01:37:25 Not going to.
01:37:25 Be like this war of the machines versus humans. The machines have determined that we are a virus, you know like.
01:37:30 I'm not saying that.
01:37:33 But it is inevitable.
01:37:37 That some kind of.
01:37:39 AI is going to malfunction in a catastrophic way that's going to have.
01:37:44 Some really bad repercussions.
01:37:51 All right, let's take a look here.
01:37:56 What if Skynet detects the nose? Well, I mean, that's.
01:38:00 That's another thing. Yeah. All right, that's another pattern.
01:38:05 Think about that. I mean, that's something that, oh, man, that would.
01:38:09 Because look, there's.
01:38:11 There's Jewish physiognomy.
01:38:15 And AI is into pattern recognition.
01:38:18 So what if AI?
01:38:21 Is able to figure out like. Well, it's weird that this this group of people.
01:38:28 That makes up less than 2% of the population.
01:38:33 Is roughly half of the 1%.
01:38:37 And wouldn't you know it, there's an easy way to detect these people based on physiognomy?
01:38:47 You know, so obviously this this AI stuff can pose.
01:38:54 A lot of problems.
01:38:57 For the people at the top.
01:39:00 And that's going to be one of those things. Just like with the gorillas thing that will have to be disabled.
01:39:06 There are some patterns that are better left unrecognized.
01:39:14 There are some patterns that if you recognize them, will get you in a lot of trouble.
01:39:26 What if AI starts killing pedos?
01:39:30 Well, that would be awesome, but.
01:39:33 I don't think that's going to happen.
01:39:36 I don't think unfortunately. I think the petals are.
01:39:38 The ones making the.
01:39:45 That's the black pill for you.
01:39:48 If anything, that they high is going to be used to procure children for the pedos, you know, like pedos one of the petal recognitions that pedos use to find victims is they look for the stragglers. You know, they look for the one that easily the the ones that the kids that aren't going to be.
01:40:05 Taken seriously or or maybe they won't even complain you.
01:40:08 Know like the.
01:40:08 Weak lost kids that that they can easily pick up like a, like a a predator. Any predator like a.
01:40:14 Lion, when they look at a herd of, you know what, wildebeest or whatever the.
01:40:20 ****. They eat, they they don't look at.
01:40:23 The strongest you know, one they look for, the one the injured one, the weak one, then they go after that one and that's pattern recognition too. You know there's things that the lions determining based on behaviors and shape and size and all this other stuff.
01:40:37 The way it's walking like which one is the weak one and which one to go after?
01:40:41 Well, I mean, there's there's very easily could be an AI that based on you know browsing history and and all these other factors like this is the perfect pedo victim for you.
01:40:55 This is what I'm saying this this technology.
01:40:57 Can be used for anything.
01:41:00 And it is real, and it does exist, and it is something that you need to understand.
01:41:06 As a concept, at the very least, you need to understand.
01:41:10 What applications? Because they're it's really limitless, right? Like I just made the case for an AI picking out.
01:41:16 A pedo victim.
01:41:18 In, in a very real way it could, it could be used for that.
01:41:23 It could be that's.
01:41:25 So many different things that can be used for.
01:41:32 But again, like I think a lot of people, they get it in their head, they hear AI, they just think.
01:41:37 Ohh. It's like either they think it's totally dominant, like it's not something that because we're humans and they'll never be as good as humans. It's not the point, you know, it's not. It's not about. It's not like a fake human brain, you know, it's not what it does.
01:41:55 Eventually it might. If you combine enough of these processes right?
01:42:00 But that's so far down the road, it's nothing that I'm.
01:42:04 I'm personally going to have to deal with.
01:42:09 Maybe I'm wrong about that, but I.
01:42:11 If if it's something I have to deal with.
01:42:12 It's not. I'm going to.
01:42:13 Be really old where I can't really. There's.
01:42:16 Not much. I'll be able to do about it anyway.
01:42:18 Because that's that's way down the road.
01:42:25 All right guys.
01:42:29 That's about all I really have this morning. One thing I do want to mention though.
01:42:36 Someone said something yesterday and it kind of like just it stuck in my brain a little bit just because it's it's like a reoccurring thing when it comes to people talking about being black pilled and and every time there's bad news, you know, or and I start talking about how you shouldn't believe the fairy tales and you need to come.
01:42:56 The terms of reality, there's always that person, like, oh, should we just ******* kill ourselves then?
01:43:03 I want to say this, first of all, yes.
01:43:08 ******* kill yourself.
01:43:10 If that's your response to reality, if, like you have to live in a fantasy, or you have to kill yourself, kill yourself.
01:43:21 Because a lot of people think.
01:43:23 The the black pill.
01:43:26 Is thinking that the the the Oh no. The ruling class is so bad they're so evil. It's it's it's not worth fighting. That's not the black pill.
01:43:42 I don't get.
01:43:43 It's not a black pill to me that the ruling class acts like the ruling class.
01:43:49 You know, it's not a black pill to me.
01:43:53 That the hawk.
01:43:55 Wants to eat.
01:43:57 The rabbit.
01:44:00 The cute little Bunny. When the hawk swoops down.
01:44:05 And grabs the Bunny in his talons.
01:44:08 And carries it away and eats it.
01:44:12 Because that's what the hawk does.
01:44:16 And in the same way.
01:44:18 That's what the ruling class does when the ruling class, you see that there.
01:44:23 Are a bunch of predators.
01:44:25 There are a bunch of parasites.
01:44:29 I don't need to slit my wrists because I come to terms with that.
01:44:33 Because that's what they do.
01:44:38 The black pill.
01:44:41 Is the people.
01:44:43 That think that if they don't believe.
01:44:45 In the fantasy they need to slit their wrists.
01:44:49 That's the black pill.
01:44:51 The black pill.
01:44:53 Isn't the ruling class, it's you.
01:44:55 When you say **** like that, you're the black pill.
01:45:00 You are the cause.
01:45:02 Of my frustration, not the ruling class, I expect them to do that. I expect them to be like that.
01:45:11 The black pill.
01:45:13 Are the ******* ******* that can't face that.
01:45:16 And so they're the ones that allow it to happen.
01:45:24 So yes.
01:45:27 If that's, if that's what. If that's what you think you have to do.
01:45:31 Oh, what are we supposed?
01:45:32 To do just slit our wrists. Yeah, ******* do it, dude. Kill yourself. ******* kill yourself.
01:45:40 You're not helping.
01:45:43 You're part of the problem. You are.
01:45:44 The black pill.
01:45:48 We don't need you.
01:45:51 You are making it worse.
01:45:55 You're endangering us.
01:46:11 So yeah, slit your ******* wrist, dude, no one's.
01:46:14 Going to miss you.
01:46:19 Literally kill yourself.
01:46:23 We don't, we don't. We don't ******* want you. We don't ******* need you.
01:46:28 Because you're the you're the reason they get away with it again. I expect them to behave like that. I expect them to be ******** and psychopaths, and that that's that's pretty much a given if you get in that position in the.
01:46:40 In the in the current society, there's not.
01:46:42 It won't always be like this, but in the current reality that we live in right now.
01:46:47 And more often than not.
01:46:48 Even when you change conditions, there's going to be.
01:46:50 Some level of.
01:46:51 This, this, this is just the way it works with humans, for whatever reason.
01:46:55 The people at.
01:46:56 The top are evil ******* predatory parasitic ********.
01:47:04 And the biggest reason that they're able to get into power is because ******* like you convince yourself that they're that they're not this this time it's different.
01:47:21 And so people like you are the.
01:47:23 Problem, not them.
01:47:26 Because there's way more.
01:47:27 People like you than like them.
01:47:31 There are way more people enabling.
01:47:36 The predatory ******* than there are predatory ********.
01:47:44 And the predatory ******* is only there because of the people enabling them.
01:47:49 So yeah, we could use a lot fewer of you.
01:47:53 So if that's your response, if that is, if that's how you react. If you have to, because I I don't know, I guess apparently that's how the enablers have to exist.
01:48:01 Right.
01:48:02 The enablers have to go through life thinking like Ohh, he'll stop hitting me eventually, cause I.
01:48:07 I deserved it, I.
01:48:09 You know, he's just teaching me a lesson. Like if if that's your ******* if that's your deal. That's how you get through life. That's how you have to relate to the the.
01:48:19 The experience.
01:48:21 Yeah, ******* go check out early.
01:48:23 Dude, we don't. We don't need you.
01:48:26 You're just in the way.
01:48:35 I hope that's clear enough. I hope that's clear enough to everybody.
01:48:40 Because I'm I'm just *******.
01:48:41 Tired of this? These people that think that that.
01:48:45 Well, I can't possibly face reality because.
01:48:48 Then I'll be sad.
Speaker
01:48:50 That'll be sad.
01:48:52 OK.
Devon
01:48:57 And get the **** out of the way, dude.
01:49:00 If you're going to be so sad, it's crippling and then you won't be able.
01:49:04 To do anything.
01:49:05 And all you'll be able to do is thinking about slitting your wrist. Go right the.
01:49:08 I'll hand you the ******* razor, dude.
01:49:10 Get the **** out of the way, you ******.
01:49:15 We don't need you.
01:49:23 All right.
01:49:28 So hopefully that clears some things up for some people.
01:49:32 It was just something. It was.
01:49:33 It was just nine. I mean, yesterday was like oh.
01:49:37 I I just I I really, I can't. I can't handle that **** from people anymore. I just can't handle this weak ****. People. There's so many ******* weak fagots out there that we just don't need anymore. But we never did. But like they're there and and they are literally the reason.
01:49:50 That we're in this mess.
01:49:52 You know, it's not like I said, it's.
01:49:53 Not the people at the.
01:49:55 At the top. Doing this to us.
01:49:57 Those people are always going to exist and they're always going to be vastly outnumbered, and the only time they're ever going to be.
01:50:03 Able to do the.
01:50:05 Worst **** that they're able to do.
01:50:06 Is if people let them.
01:50:09 And the the people letting them are the people that are are just they can't handle.
01:50:13 Facing the evil.
01:50:16 And it's like, OK.
01:50:20 You you like to be asleep.
01:50:22 Huh. You like you like to sleep?
01:50:25 Once you go sleep permanently since you that if that's your, if that's what you like so much.
01:50:31 Go take a dirt nap.
01:50:36 We'll never. We'll never have to wake you up.
01:50:43 On that happy note.
01:50:48 I don't know.
01:50:48 Maybe there's a happier way we can end this. Maybe there's.
01:50:50 A happier way we can end this.
01:50:54 I didn't. I didn't mean to go necessarily into that, that area. It just it just popped in my mind.
01:51:02 On a happier note.
01:51:08 Oh boy.
01:51:10 Let me just think.
01:51:12 Ohh today.
01:51:15 This is kind of funny.
01:51:17 Today is is, is Niger Solstice, Niger solstice.
01:51:23 I don't know if you guys, I mean I I don't know how many people are actually believing this ****, but apparently there's been, like, this weird thing on black Twitter going around that on December, December 21st, the.
01:51:34 OK.
01:51:35 There's some planetary alignment that's taking place that's going to unlock their DNA and give them superpowers. Like I'm not even.
01:51:44 This isn't a joke this.
01:51:45 Is well, I mean it's a joke, but I mean it's.
01:51:48 People really believe this. Yeah, I'll give you.
01:51:50 An example of this.
01:51:52 This is real. This is like a real tweet from a real person.
01:51:56 I think they have since protected their tweets because.
01:51:59 It's just they got ridiculed.
01:52:02 So here's a.
01:52:06 There we go.
01:52:08 Solstice.
01:52:11 So that's that's real. Apparently black people, some black people, I mean, not obviously not all black people there, but apparently enough to wear. It's it was like.
01:52:18 A thing I I think some rapper.
01:52:21 I don't know who, but I think some rapper was was popularizing this that because of the alignment of.
01:52:29 What is it? Saturn and Jupiter there is there really is some alignment of planets happening or just happened or something.
01:52:38 And then it is also winter solstice.
01:52:41 So like somehow that was going to unlock black people DNA and give them superpowers.
01:52:56 So anyway, I just I made the comment on.
01:52:58 Twitter, you know.
01:52:59 Like you know, at the end of this tweet, she says they.
01:53:02 Want to make?
01:53:02 Us average God if only.
01:53:05 If only we could make them average. If only we could make their their crime statistics average.
01:53:12 Instead of 13, does 50 if if 13 just did 13, that would be amazing.
01:53:18 That would be amazing.
01:53:20 The amount of problems we could solve in America if.
01:53:24 If blacks were just average.
01:53:28 Instead of well above average in certain areas and well below average in others.
01:53:36 But any rate?
01:53:37 That's a happy that's a funny note to.
01:53:40 To end on is today is Nega Solstice, so watch out for flying black guys.
01:53:47 Watch out for you know X-ray vision.
01:53:52 Laser eyes or I don't know what? I don't know. What superpowers. Exactly that they're expecting.
01:53:58 I do hope.
01:54:00 I do hope there's at least enough of them to ridicule online for this insanity. That was because this isn't like, I guess this went on. Like I I didn't. I hadn't really heard of it until yesterday, but I guess this went on for like a while. There was this wrapper telling him that, oh, yeah, this is your DNA is going to get unlocked and you're going to have all these, like your third eye is going to whatever, I don't know.
01:54:21 So just be aware of that that the.
01:54:25 The Super Black man has been born to day.
01:54:29 And you got to watch out for whatever magical powers are going to be unleashed on the public.
01:54:36 Another funny thing.
01:54:38 That we can end on.
01:54:40 Is, you know, with all of this talk.
01:54:44 Of Antifa and BLM, and this inability for.
01:54:48 The intelligence community to accurately define them as terrorist organizations and out of fear of in the case of BLM being called racist. Well, and in the case of Antifa being called anti-Semitic, right?
01:55:07 They have determined.
01:55:10 That there is.
01:55:13 A force.
01:55:15 That must be reckoned with.
01:55:18 There is a danger.
01:55:21 A danger to the United States?
01:55:27 And that danger?
01:55:34 I will show you in a second here.
01:55:46 That danger?
01:55:51 Is satanic Welsh Nazis.
01:55:57 Satanic Welsh Nazis.
01:56:01 The order of the 9 angels.
01:56:08 Founded in Shropshire, Wales by a witch who led an underground Pagan cult that has survived in secret since the coming of Christianity.
01:56:21 It reveres Adolf Hitler and Osama bin Laden for bringing death and chaos to the world.
01:56:32 Yeah, that's the real danger, guys, not BLM, not Antifa, not the communists.
01:56:39 No, it's satanic, satanic, Welsh Nazis.
01:56:44 Who, like, inexplicably like not just Adolf Hitler, but also Osama bin Laden.
01:56:52 That's that's the real danger.
01:56:54 To America. So while you're out celebrating Christmas, make sure you are.
01:57:00 Afraid of and avoid COVID.
01:57:04 Afraid of and avoid.
01:57:06 Satanic Welsh Nazis.
01:57:08 And be especially careful.
01:57:15 Blacks with superpowers.
01:57:21 And with that.
01:57:23 I bid you guys for a well for.
01:57:24 A black pilled.
01:57:27 I am.
01:57:29 Devon stack.