1:56:03

Who Writes your Bedtime Stories?.mp3

08/18/2023
Devon
00:00:00 What's up everybody?
00:02:32 How's my audio sound?
00:02:33 Does it sound blown out at all?
00:02:35 Or let me take a look at my levels here.
00:02:38 I had to.
00:02:39 Make some changes to the well.
00:02:43 This is like the fourth, really the fifth, computer
00:02:46 I've been using this week.
00:02:50 Lots of hardware being swapped around.
00:02:54 Lots of computer surgery going on.
00:02:56 I think this is going to be the streaming machine.
00:03:00 The only problem is.
00:03:02 For those of you who are technical, I guess.
00:03:05 Is the video.
00:03:06 Cards in this one are from an old crypto mining machine and there are these AMD they're these cheapo AMD cards.
00:03:14 That I got because you could get a bunch of them. I think they were 550s and they were less than 100 bucks a pop. And if you put like four of them together, it was the most
00:03:27 OpenCL, or whatever it is the AMD GPU cores are called, but it was like the most cores you'd get, bang for the buck.
00:03:38 Unfortunately, just as regular video cards, they're not that great.
00:03:44 And I've always been a NVIDIA person and I didn't realize how many problems.
00:03:49 Well, I don't know if it's like a lot of problems, but there are some incompatibility problems with AMD video cards when it comes to cool,
00:03:59 fancy streaming and visualization software like I've got going on right here.
00:04:05 I'm I've actually.
00:04:06 Had to do some pretty, you know, some electronic gymnastics to make this work.
00:04:13 But it seems to be working.
00:04:17 My other machine is on life support again, but I've got electricity today, which is good.
00:04:23 I haven't had electricity.
00:04:26 Well, hopefully I, you know, knock on wood.
00:04:29 Hopefully I continue to have electricity during this stream.
00:04:33 It has not been a regular thing.
00:04:37 As, I don't know, anyone who lives in the southwest knows, and I don't think I'm doxing my location,
00:04:42 everyone knows that I live in a desert, and there's
00:04:44 not exactly a desert in the northeast.
00:04:46 We're experiencing a heat wave, a heat wave, record heat.
00:04:52 Death Valley was 130 degrees the other day. 130 degrees, the hottest place on Earth.
00:05:00 So that's that's nice.
00:05:03 California is doing rolling blackouts.
00:05:08 Related to this heat wave.
00:05:11 Well, I don't know.
00:05:11 That's what they say.
00:05:12 It's related to.
00:05:13 I I kind of think it is.
00:05:14 I know some people have some kind of
00:05:18 conspiracy about that.
00:05:20 Ohh, it's California trying to do...
00:05:22 No, I think it's just really that their infrastructure is not.
00:05:26 What it should be. I mean, think about it, what they're spending money on is not infrastructure.
00:05:33 And when
00:05:33 you have 130 degrees in Death Valley, people running their ACs.
00:05:42 And, you know, people don't like to be uncomfortable.
00:05:45 I don't like to be uncomfortable, I guess, but.
00:05:48 I'm a little bit used to it right now.
00:05:49 It's like right now it's still the morning.
00:05:52 It's not, it's not noon yet.
00:05:55 And right now where I'm sitting.
00:05:58 It's let's take a look here. Oh, good. It's 82 degrees.
00:06:03 It's 82 degrees, which actually isn't that bad except for it's also.
00:06:10 I don't have the thing with me here, but it's probably like around 60% humidity because of my.
00:06:16 My swamp cooler.
00:06:18 It's been it's been a hot week.
00:06:20 It's been a very hot week, very sweaty week.
00:06:23 With sporadic electricity, so that's been fun.
00:06:30 But I figured I'd get this started here in the morning before it really heats up.
00:06:34 You know, and the power has been going out, it's been going out in the afternoons.
00:06:39 So hopefully we are able to maintain a connection here.
00:06:45 So hope you guys are having a good week having a good summer.
00:06:48 I wish it would end.
00:06:49 I wish this.
00:06:49 I mean, I'm.
00:06:50 I'm done with this summer.
00:06:52 You know when the seasons change.
00:06:55 It's always fun for like, a little bit.
00:06:58 You know, you get like that nostalgia, like when the winter first kicks in, like when you've been going through summer for, you know, from well right now it'd be like an endless amount of time.
00:07:11 And when the winter finally kicks in, and like you have like that first chilly morning, you know?
00:07:16 When you walk.
00:07:17 Outside and it's like, oh, it's a.
00:07:18 Little chilly it's.
00:07:19 Usually not chilly, and there's something about that that activates.
00:07:25 A part of your brain.
00:07:28 That makes you remember the last time it.
00:07:31 Was like that.
00:07:32 I don't know if you guys experienced the same thing, but and this is true of everywhere I've lived right when the season changes enough to where it's like that first day.
00:07:41 Of ohh it's.
00:07:42 It's winter now, or it's fall, or it's summer, you know, doesn't matter.
00:07:46 Anytime the season changes enough to where,
00:07:48 Oh, it's officially different now.
00:07:52 It's like your mind.
00:07:54 Starts reminiscing or not?
00:07:56 I don't know if that's the right word, but it starts recalling.
00:08:00 All this data.
00:08:02 From the last time that the weather was like that.
00:08:08 And on, I guess, the opposite side of that.
00:08:11 So it's fun for like a little bit cause like oh, yeah, I remember last winter, you know, last Christmas or.
00:08:16 You know, last summer we did this or whatever.
00:08:19 And then you get used to the change after a while, and then, at least here right now with the summer, you get really tired of it. It's like, you're like, OK, I get it, it's hot, it's hot all the time. Everything's hot.
00:08:37 It well, it doesn't stop being hot.
00:08:39 I wish it would stop.
00:08:41 You know, it's so hot the last.
00:08:43 Few weeks.
00:08:45 I've got.
00:08:47 I've got cactuses that are that are dying in the heat.
00:08:51 I thought like.
00:08:52 I don't know if they're going to make it.
00:08:53 Some of these guys and.
00:08:54 And they're not.
00:08:56 They're not like, you know, some kind of tropical cactus or something that shouldn't be here, you know, though, all those have died a long time ago.
00:09:03 You know, these are.
00:09:03 Like cactuses, that should survive.
00:09:06 Just fine where I am.
00:09:08 If it would, I don't know.
00:09:11 Stop being really hot all the time and and maybe rain once.
00:09:15 That'd be good.
00:09:17 I water stuff, you know. It's not
00:09:19 like I just let them fend for themselves.
00:09:21 I will, eventually,
00:09:22 once everything is established.
00:09:25 Yeah, it's.
00:09:27 It's been a really hot summer, really hot and dry summer.
00:09:31 I think for a lot of people.
00:09:33 You know, California through to Texas.
00:09:35 Right now, there's like a heat wave map just like this big blotch all over the American Southwest.
00:09:43 Just on fire.
00:09:46 Actually, that that's another funny thing.
00:09:48 I I didn't look into it further, but I guess California has.
00:09:51 A fire tornado.
00:09:52 I don't even want to know
00:09:52 what that is. A fire tornado.
00:09:55 2020, man. It is not,
00:09:59 it's going to go down as one of the worst years.
00:10:03 Well, actually,
00:10:04 well, that's the
00:10:04 ****** thing,
00:10:05 right? This is,
00:10:07 it's like the
00:10:09 opening day of bad years
00:10:11 for the American Empire. But anyway.
00:10:16 So I kind of want to.
00:10:17 Talk about a few things.
00:10:20 I do have.
00:10:22 You know, I do.
00:10:22 I do have my, my, I actually have.
00:10:24 I have an animation that I'm working on and I've got a couple other videos on the web.
00:10:30 But with all this computer hardware swapping around I just.
Speaker
00:10:33 You know.
Devon
00:10:34 Like I I forgot.
00:10:35 That was, there was one video I forgot.
00:10:36 I was even working on until I hooked up.
00:10:38 This hard drive was like, what's this?
00:10:41 And like looked inside, I was like oh.
00:10:42 Yeah, it's like a half-finished video.
00:10:45 It's been a mess with all.
00:10:46 These computers everywhere where I start working on something.
00:10:48 The next thing I know, like you know, there's hard drives and video cards and processors and RAM everywhere and.
00:10:54 Oh, good Lord, I want that to be over.
00:10:56 But anyway, uh.
00:11:00 One of the things I was thinking about.
00:11:01 The other day.
00:11:03 With with you know we we talked about the change, some of the changes that multiculturalism bring and one of those changes of course is the low trust.
00:11:18 Society right, which has rippling impacts that go.
00:11:24 All over the place.
00:11:26 And it's not just the multiculturalism directly, right?
00:11:31 It's not just a direct result of, you know, this group is high trust, and this group is low trust.
00:11:39 And so therefore, if you have less of this group and more of this group.
00:11:44 Then it's, you know, low trust or lower trust.
00:11:51 That's part of it.
00:11:53 That's part of it.
00:11:55 But really, a bigger part of it I think.
00:12:00 is even your high trust population
00:12:06 begins to become low trust.
00:12:10 Not just because you know they're, they're now.
00:12:15 Surrounded by low trust people and interacting with low trust people again, that's that is part of it.
00:12:22 But another part of it is you feel.
00:12:26 Less connected.
00:12:29 To the society that you're in.
00:12:31 So in other words, you treat your family differently than you treat strangers.
00:12:36 I mean, even in a high trust society.
00:12:38 That's true, right?
00:12:40 If you're.
00:12:43 Dealing with immediate family members.
00:12:45 You're going to be less guarded.
00:12:49 You're going to be more open, more, more honest.
00:12:51 Honest, really.
00:12:52 I mean you are.
00:12:54 Maybe even like in a way like less polite, you know, because you're just telling them.
00:12:59 How it is because you love them and.
00:13:00 You know, whatever, but.
00:13:03 You're going to be.
00:13:05 You have an investment in your family.
00:13:08 You want your family to succeed, right?
00:13:11 And part of that is.
00:13:14 The exchange of honest facts, you know, being honest with.
00:13:17 Each other.
00:13:19 And as people realize.
00:13:23 that the society that they live in, they have no familial connection with,
00:13:30 What begins to motivate them other than the success of the society?
00:13:37 Because we all get atomized.
00:13:41 And we all become these individuals.
00:13:44 As you know, libertarians love.
00:13:47 You are no longer in it for the success of the society you're in it for the success of yourself.
00:13:55 And so you're not as.
00:13:57 You're you're less likely.
00:14:00 To be.
00:14:02 Honest about something.
00:14:03 If being dishonest is going to personally benefit you, even if it hurts the society at large.
00:14:13 And so necessarily, what that means
00:14:17 is innovation goes down.
00:14:20 And you know, one example of this, I think what was it, last year or the year before, we had that study come out where they were unable to replicate an obscene amount of scientific research that was peer reviewed and published and everything else.
00:14:38 You know, some group went back and started trying to replicate these studies and found that.
00:14:45 Most of them.
00:14:47 Or I don't know if it was most or about half, but it was a.
00:14:49 It was a large substantial.
00:14:51 Amount of them.
00:14:53 Were just ********.
00:14:56 It was, it was lies.
00:14:58 It was dishonesty.
00:15:00 And you know, that has a ripple effect too, because now other people are basing
00:15:04 their research on, you
00:15:05 know, this crap research.
00:15:08 And it just gets worse and worse and worse.
00:15:12 A big part of that is.
00:15:15 These researchers, they're not.
00:15:18 Motivated by what is good for the nation, what is good for the group, what is good for their society, it's what's good for their career, what's good for them personally, what's good for them right now?
00:15:29 What's going to make them the most money?
00:15:31 You see this with capitalists all the time.
00:15:33 When a capitalist makes a decision, it's not what is going to how is this going to impact?
00:15:41 The community that I live in, how how is this going to impact the society that I'm a part of?
00:15:47 Because really, they're not.
00:15:48 They're not a part, they're.
00:15:49 Not they don't see themselves.
00:15:50 As as part of your society, certainly.
00:15:55 The question then then is how is this going to benefit me?
00:16:02 And so decision after decision is made.
00:16:06 To benefit them directly at the expense of everyone else around them.
00:16:14 And that's just going to increase and increase, until there's not
00:16:21 any kind of...
00:16:25 Because you won't be able to trust research.
00:16:28 You won't be able to trust business partners.
00:16:31 I mean, the,
00:16:32 this is going to have...
00:16:34 There's a reason China
00:16:37 is excelling at the rate that they are: because they don't suffer from this problem.
00:16:43 I mean they suffer.
00:16:44 I mean, the part of this.
00:16:44 Is the human condition right?
00:16:46 The greed is.
00:16:48 Is always going to be a part of the human experience.
00:16:52 So there's there's.
00:16:53 Going to be an element of this going on in Chinese society, everyone in it for themselves because on some level I think that's just how humans operate.
00:17:01 It's a survival mechanism, right?
00:17:05 But I believe and maybe I'm wrong.
00:17:08 That the Chinese.
00:17:11 Are more collective.
00:17:14 They care about.
00:17:16 What happens to their society?
00:17:18 And even if they don't, I guess they have a government that.
00:17:22 That perhaps does.
00:17:24 And that is very handsy.
00:17:27 And so they can't get away with the same sorts of things that.
00:17:32 The the the individualist capitalist libertarians can in the West and do.
00:17:44 So that was just one of the things I was thinking about and it's kind of tied indirectly.
00:17:52 Something else that's been in the news lately, I I think a lot of you guys have been following or you know, at least nominally.
00:17:59 The the Milly Weaver situation, right?
00:18:02 No, I haven't looked up anything this morning, so I don't know if there's anything added, so maybe I'm going.
00:18:06 To be missing something about it, I I doubt it.
00:18:10 But for those of you who don't know.
00:18:13 The basic idea, as I understand it.
00:18:17 Is last week.
00:18:19 Millie Weaver, if you don't know who that is, she's a like a Stringer.
00:18:26 For Infowars, as stringers like a a contract reporter, right?
00:18:31 So she's not a an actual employee of Infowars, but they pay her to do stories and that sort of thing.
00:18:38 They have a few.
00:18:40 People like that.
00:18:42 And most people have have seen.
00:18:46 Millie Weaver.
00:18:47 She's like the blonde.
00:18:49 That has done a lot.
00:18:50 Of work for Infowars, she has her own YouTube channel.
00:18:54 I've never actually.
00:18:56 Checked out her content that wasn't on Infowars and I haven't really seen, to be honest, a lot.
00:19:01 Of her Infowars stuff.
00:19:04 But I saw enough to, to really,
00:19:06 you know, form my view of her.
00:19:07 And look, I don't mean I don't have any.
00:19:10 Animus to her, anything like that.
00:19:12 But really, I didn't really.
00:19:14 I didn't exactly.
00:19:16 View her as like a hard.
00:19:18 news kind of a person, if you know what I mean.
00:19:21 It was,
00:19:21 it was more of like, to translate this to maybe like a local news station, she seemed like the kind of reporter you would send off to go cover the puppy,
00:19:34 You know, the puppy adoption drive or you.
00:19:37 Know something like that.
00:19:39 And not not someone that you have covering the White House or whatever.
00:19:46 And I don't mean to be insulting to anyone.
00:19:49 It's just that's.
00:19:50 It is what it is.
00:19:52 And so I was a little bit surprised.
00:19:56 When someone told me.
00:19:57 Ohh yeah, she released this.
00:20:00 video, this documentary,
00:20:03 that had a big exposé on, you know, some deep state blah blah blah stuff.
00:20:11 And while she was releasing it, you know, she was arrested and and it was crazy.
00:20:19 And I thought to myself, what?
00:20:20 Alright, well, was she really? First of all, let's determine, 'cause already this seems like
00:20:25 out of character, you know. Like, so what?
00:20:28 Let's find out.
00:20:29 What's going on here?
00:20:31 Let's see if she was really arrested and all that.
00:20:33 Stuff and you know.
00:20:35 It seemed like she was really arrested.
00:20:37 She really was arrested.
00:20:39 You know, she streamed, I guess, or uploaded.
00:20:41 I don't know which.
00:20:43 A video of the cop.
00:20:45 Being there and and arresting her and and I don't know if it's her husband or her boyfriend or.
00:20:50 Whatever at their home.
00:20:54 And she was saying like ohh, you know, I'm just I'm releasing this.
00:20:58 And so it's like, alright.
00:21:02 You've piqued my interest.
00:21:04 Let's see, because you know.
00:21:06 This is a little weird.
Speaker
00:21:08 It is a little weird.
Devon
00:21:10 So let me watch this video and see how is this.
00:21:14 tied, if it is at all tied, because there was no real information yet at the moment
00:21:19 As to whether or not this was even related you.
00:21:22 Know but you.
00:21:23 Know it, it seemed like again.
00:21:25 You don't expect.
00:21:26 That the the girl that you're sending off to cover the the.
00:21:29 Puppy adoption stuff.
00:21:31 To get arrested with her with her husband.
00:21:33 You know, with, with an indictment that was apparently a secret indictment.
00:21:37 Like, you couldn't even, you know, there were.
00:21:40 There were reporters calling up the Sheriff's Department trying to get more information on that, and they were vague.
00:21:45 They did say a few things and we'll get it in a second.
00:21:48 But it was vague enough to.
00:21:49 Where you were like, wow, maybe.
00:21:51 I don't know.
00:21:51 Maybe it is.
00:21:53 Let's see what's up with this video.
00:21:56 So I watched her video.
00:21:57 It's called,
00:22:00 what, Shadowgate?
00:22:02 I think.
00:22:05 And I think it's off YouTube now.
00:22:07 I think YouTube has since banned it, and so it's probably on BitChute, which, by the way, you guys should definitely subscribe to me
00:22:18 over on BitChute. There's a lot of stuff that, like, hasn't been on this channel.
00:22:22 So if you've just been.
00:22:25 Checking out YouTube and you're like, wow, you know, Devon hasn't posted in a while. That's not true. I've been doing streams over on DLive, so go over to DLive.
00:22:35 It's DLive forward slash Devon Stack, not Black Pill.
00:22:38 There's some fake ones that are called Black Pill, but you know, it's not me.
00:22:44 So go to DLive forward slash,
00:22:48 you know, Devon Stack, and that's me.
00:22:51 And I've been doing streams there and then uploading them over to BitChute.
00:22:57 So there's actually a decent amount of, you know, hours of content.
00:23:01 That is not on the YouTube channel right now.
00:23:05 And that's because, quite frankly,
00:23:07 the days are numbered here on YouTube. Everyone knows that.
00:23:12 Everyone knows everyone who who uses the autoplay feature knows that you know like.
00:23:18 They don't want you watching anything but CNN and they just want it.
00:23:22 They want to be Netflix, they want to be Netflix and cable TV.
00:23:26 That's what they.
00:23:27 Want it's what they want and that you know.
00:23:30 with maybe some
00:23:32 other content creators with makeup tutorials and just garbage, you know, mindless content.
00:23:39 That's fine, whatever.
00:23:41 I mean, it's not fine.
00:23:42 I wish they weren't doing that.
00:23:43 But you know what?
00:23:45 What can I do?
00:23:48 So go over to DLive and
00:23:50 BitChute and
00:23:50 subscribe over there.
00:23:52 And while you're over there, you can check out that documentary.
00:23:57 And just you know, it's like an hour, hour and change.
00:24:01 And there are some, you know, useful things in it the.
00:24:06 I'll give you the lowdown.
00:24:09 The basic idea of the documentary is it's a lot of stuff.
00:24:13 We already kind of know, but I guess it it gets a.
00:24:15 Little more specific.
00:24:16 And it adds one angle to it that again, we all kind of knew, but it just it just adds some specificity to it.
00:24:24 The bottom line is.
00:24:27 A lot of the products that the intelligence agencies have developed over the years for influence campaigns.
00:24:37 Originally meant for overseas, so influencing elections influencing.
00:24:46 I don't know.
00:24:46 Maybe rebellions influencing revolutions?
00:24:52 The software that they use.
00:24:55 That they developed for these purposes.
00:24:59 Has in this.
00:25:01 A lot of people don't realize this most.
00:25:03 Of what gets done.
00:25:04 By these agencies.
00:25:06 And especially in terms of software, anything technical?
00:25:10 Is done by a contractor.
00:25:13 If you might.
00:25:15 Remember that Edward Snowden didn't work for the NSA.
00:25:20 He worked for Booz Allen Hamilton as a contractor, contracted
00:25:26 by the NSA.
00:25:28 And that's something that is common and.
00:25:31 It's very common.
00:25:34 I've I've especially with agencies like the NSA, I've I've met a lot of people who.
00:25:41 Worked for the NSA through Booz Allen.
00:25:46 There's other agencies, or not agencies, contractors, like Deloitte.
00:25:52 What's the big one?
00:25:53 The big one?
00:25:55 There's a big one I keep forgetting I.
00:25:59 Anyway, there's lots of them and and employee wise.
00:26:03 They have 10s of thousands of employees.
00:26:06 These are these are not small contractors.
00:26:08 These are huge, huge multi national contractors.
00:26:15 OK, so they have, I mean, Deloitte and Booz Allen and and these these types of contractors, they're all over the world.
00:26:24 They're doing the same kind of work for different intelligence.
00:26:27 agencies, and not just intelligence agencies; government agencies, essentially.
00:26:30 So they're in the Middle East, they're in.
00:26:32 You know, they're they're working for the Saudis, developing software.
00:26:38 They're in the, you know, they're all over up in DC. A lot of them have offices in DC and in, you know, in Northern Virginia. Like, a lot of these companies are based in Northern Virginia because that's where all the intelligence agencies are.
00:26:51 And so they developed these products, these software packages.
00:26:57 For use, you know, at the, let's say, the CIA wants to do an influence campaign, they want to change the outcome of an election.
00:27:07 They want to meddle.
00:27:09 If you will.
00:27:11 In foreign elections, you know, they they want to do essentially what they what the left was accusing Russia of doing.
00:27:18 They want to set up shop.
00:27:22 In let's say, I believe that that one of the more recent ones was Afghanistan.
00:27:26 This actually happened.
00:27:29 And they set up all these socks. They have software that manages this.
00:27:35 It's a mixture of, it's like project management software, but it also kind of facilitates the actual
00:27:42 social media account creation, you know, like the sock puppet accounts, the bot accounts. It,
00:27:48 it manages the bots.
00:27:51 It also manages the data, so if you want to go to Afghanistan and influence their election, you know the first thing that they do is they have data collection and and by using this data collection.
00:28:05 They can, you know, they scrape all of the social media accounts and they they have AI that looks for trends.
00:28:12 And let's say that they, you know, the AI determines like, oh, you know, most Afghanis are worried about.
00:28:21 X, Y, and Z, and the cool things that they're interested in are, you know, A, B, and C.
00:28:29 And so if you tailor your message to address X, Y, and Z, and use it with the memes, that, you know, the style, the thing that's popular with A, B, and C, you'll have the most impact.
00:28:43 But not only that, as you deploy these memes, as you deploy these influence campaigns.
00:28:52 Because this is a real time thing, this is a.
00:28:55 This is an ongoing thing that this software is constantly.
00:29:00 scraping these social media accounts and constantly monitoring all interactions. That's part of the problem.
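
To make the scrape-and-surface step concrete: a minimal Python sketch of the idea, tallying what scraped posts talk about and surfacing the top concerns. The posts and the stopword list are invented for illustration; a real pipeline would use far richer NLP than bare word counts.

```python
# Toy illustration of "scrape the posts, let the machine surface the trends."
# All post texts and the stopword list are made up for this example.
from collections import Counter

posts = [
    "power cuts again tonight, third time this week",
    "fuel prices are out of control",
    "no power, no water, and fuel prices keep rising",
]

STOPWORDS = {"the", "a", "and", "no", "are", "is", "of", "this", "again", "keep"}

def top_concerns(posts, n=3):
    """Count non-stopword tokens across all posts; return the most common."""
    counts = Counter()
    for post in posts:
        for word in post.lower().split():
            word = word.strip(",.!?")
            if word and word not in STOPWORDS:
                counts[word] += 1
    return counts.most_common(n)

print(top_concerns(posts))  # e.g. [('power', 2), ('fuel', 2), ('prices', 2)]
```

The systems being described would run something like this continuously over a live scrape, which is what makes the real-time feedback loop discussed next possible.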
00:29:08 You know, a lot of people don't realize or think.
00:29:10 About rather.
00:29:11 That it's not just oh, you know, the government spying on me, they're going to see what ***** I'm looking at.
00:29:16 That's like a
00:29:17 tiny... that doesn't matter at all.
00:29:19 It really,
00:29:19 I mean, it really doesn't.
00:29:20 Unless you're doing something really horrific and they can
00:29:22 blackmail you with it.
00:29:24 They don't really give a **** about that.
00:29:26 What they give a **** about
00:29:27 is the ability to develop,
00:29:32 or to come up with, this big data in the first place so they can tailor this meme campaign
00:29:40 to deploy with whatever objective they have.
00:29:46 But also, because it's an ongoing thing,
00:29:50 They can see how effective it is.
00:29:52 In real time.
00:29:54 So if they make a meme and you know, just as an example, it's not just always just a meme, but let's just.
00:30:00 Say they make a meme.
00:30:02 And they put it on social media.
00:30:04 You know, they can look at the metrics and be like, OK, well, this meme really resonated.
00:30:08 This one fell flat.
00:30:10 And you know before.
00:30:13 So this is where AI gets dangerous when it's mixed with big data.
00:30:16 This is where it gets dangerous because before, if you had this kind of a capability.
00:30:22 I mean, it's still.
00:30:23 Kind of.
00:30:24 You know, it's ****** **.
00:30:25 It's it's a power that.
00:30:29 No one's ever had before.
00:30:31 But, excuse me.
00:30:37 It used to be that you'd have analysts, right?
00:30:40 So you'd have all this data coming in. In
00:30:42 fact, that used
00:30:42 to be the argument that the NSA would give you:
00:30:45 Ohh well, it's OK that we're
00:30:48 sucking up all this data.
00:30:49 We don't have enough analysts to sit down and watch all this stuff.
00:30:52 So like, you shouldn't be worried about us vacuuming up.
00:30:56 All your data.
00:30:57 Because we can't, we can't even use it.
00:31:00 The only reason why we're doing it.
00:31:02 is if, you know, Billy goes ****** nuts and does a mass shooting or something, we can, like, go back and do a social media and whatever. Right? Like, that was their argument not even that long ago, like five years ago. You know, maybe even a little bit less.
00:31:20 But now, as we are coming to realize, they have AI that they can unleash on this unmanageable amount of data.
00:31:31 And now all of a sudden it becomes manageable.
00:31:34 Now you don't need half a million analysts reading posts and looking for, you know, trends and and filling out forms which.
00:31:45 Is what they used to do.
00:31:48 You have AI that is written by probably another one of these contractors, informed by an analyst who probably was also another one of these contractors.
00:31:59 And they use whatever.
00:32:02 Skills and whatever classifications they used when they were analyzing it by hand to write an algorithm to do it.
00:32:10 And you know through trial and error they they perfect it and and.
00:32:15 And they will perfect it.
00:32:17 You know, these AIs are getting smarter.
00:32:20 They're getting.
00:32:20 They're really good at doing just really repetitive.
00:32:25 Small things, right? Like when people think AI and they think ohh that's that's like 1000 year or 100 year, 50 years in the future whatever because it's so you know the the human mind is so complex.
00:32:36 AI will not will.
00:32:37 Not be able to replace us.
00:32:39 You know, for a really long time and everyone's just worried.
00:32:41 About it for no reason and.
00:32:43 If that's the way you look at AI is.
00:32:46 A human mind replacement, you're right.
00:32:49 Because it's not, it's not going to replace the human mind for several years.
00:32:54 I mean, I don't know.
00:32:54 Maybe, who knows.
00:32:56 There might be a a huge uptick in in advancements as AI starts programming itself, but as of right now, that's not the worry though.
00:32:56 The worry is these very specific tasks that used to take 500,000 analysts
00:33:13 to do, which made this big data kind of useless, right?
00:33:19 In a lot of ways it was useless.
00:33:22 It is now very useful.
00:33:26 Very useful because not only can it see.
00:33:30 How well your meme is performing.
00:33:35 After you do it a few times, after it analyzes the process, after it analyzes how it gets shared, who shares it, who shared what meme last time, you know, at what time of day, is there maybe a different time of day that people share memes, it can just sit there and crunch all these numbers until it finds this
00:33:55 optimum meme delivery system.
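
That crunch-the-numbers-until-it-finds-the-optimum loop is essentially what's called a multi-armed bandit. A minimal epsilon-greedy sketch, with simulated engagement rates standing in for real platform metrics; every name and number here is invented:

```python
# Epsilon-greedy bandit: post variants, watch engagement, shift traffic
# toward whatever performs. Engagement is simulated; a real system would
# read these numbers from platform metrics instead.
import random

variants = ["meme_a", "meme_b", "meme_c"]
true_rates = {"meme_a": 0.02, "meme_b": 0.08, "meme_c": 0.04}  # hidden from the learner

counts = {v: 0 for v in variants}
wins = {v: 0 for v in variants}

def pick(epsilon=0.1):
    if random.random() < epsilon:        # explore: try a random variant
        return random.choice(variants)
    # exploit: best observed engagement rate so far
    return max(variants, key=lambda v: wins[v] / counts[v] if counts[v] else 0.0)

for _ in range(5000):
    v = pick()
    counts[v] += 1
    if random.random() < true_rates[v]:  # did this impression get a share?
        wins[v] += 1

for v in variants:
    rate = wins[v] / counts[v] if counts[v] else 0.0
    print(v, counts[v], round(rate, 3))  # meme_b should end up dominating
```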
00:34:01 From there it can,
00:34:02 it can go even further.
00:34:03 It can start predicting.
00:34:07 Predicting how people are going to react and then if the prediction is correct, it's like OK and it creates another rule in how it analyzes data.
00:34:16 If it's wrong, it scraps it.
00:34:17 Try something else.
00:34:18 It's learning.
00:34:19 You know people, they they call it machine learning.
00:34:22 And again, a lot of people think ohh, that's just crap, you know?
00:34:25 You know, they'll never be able to, and it's,
00:34:27 look, you're not looking at it the right way.
00:34:29 It's a very specific thing that it's doing. A very specific thing.
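
That predict-check-revise loop described a moment ago is the core of online machine learning. A tiny sketch under invented assumptions: a perceptron that guesses whether a post gets shared, keeps its rule when it's right, and adjusts when it's wrong. The features and labels are made up.

```python
# Predict, check, update: "if the prediction is correct, keep the rule;
# if it's wrong, revise it," as a bare perceptron.
# Each example: (features, shared?), features = [has_image, posted_evening, short_text]
stream = [
    ([1, 1, 1], 1), ([0, 0, 1], 0), ([1, 0, 1], 1),
    ([0, 1, 0], 0), ([1, 1, 0], 1), ([0, 0, 0], 0),
] * 50

weights = [0.0, 0.0, 0.0]
bias = 0.0

def predict(x):
    score = bias + sum(w * xi for w, xi in zip(weights, x))
    return 1 if score > 0 else 0

mistakes = 0
for x, label in stream:
    guess = predict(x)
    if guess != label:          # prediction failed: revise the rule
        mistakes += 1
        step = label - guess    # +1 or -1
        for i in range(len(weights)):
            weights[i] += step * x[i]
        bias += step

print("mistakes:", mistakes, "weights:", weights)
```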
00:34:34 You know, like Deep Blue, when they built Deep Blue to beat the world's chess master, which happened, like,
00:34:43 I don't know.
00:34:43 It was you.
00:34:44 Know it was a.
00:34:44 While ago now, like 10 years ago.
00:34:47 That was a computer.
00:34:49 That was it was an AI, essentially, that was just really good at one thing.
00:34:55 That's all it was good at, couldn't do anything else, but it could do chess better than the best chess master.
00:35:03 And that's how all these AIs are going to perform.
00:35:07 They're going to get better than any human.
00:35:09 At a really specific task, and eventually you know.
00:35:14 The ones, the AIs, I guess we will have
00:35:16 to freak out
00:35:17 about are the AIs that
00:35:19 Begin incorporating all these little mini tasks and coordinating them into some kind of.
00:35:27 Consciousness or something like that.
00:35:28 But that is a long way off at least.
00:35:30 I don't know.
00:35:31 My unexpert opinion is that's a long way off, but anyway.
00:35:35 So back to the story.
00:35:37 So the CIA or whatever agency would go out to Afghanistan and deploy this software, this AI that has this massive data collection.
00:35:46 And they could tell in real time how effectively it was performing.
00:35:52 And they could and did.
00:35:55 Influence elections successfully.
00:36:01 And what Millie weavers video covers is one of these contractors.
00:36:12 I forget the name of of this specific one.
00:36:14 It doesn't really matter though that much, it's in the, it's in the documentary.
00:36:18 You can check it out.
00:36:20 I'd like to do further research on that particular contractor, and I'm, you know, I'm not trying to avoid it, like, you know, I don't want to talk about it.
00:36:29 I just don't remember the name, because for what we're talking about today it doesn't really matter.
00:36:34 What matters is that this contractor is a private company.
00:36:46 What did I what were we just?
00:36:47 Talking about with.
00:36:50 The loss of in-group preference and doing stuff just to make money,
00:36:56 Regardless of the impact it has on your society.
00:37:03 Because all you care about.
00:37:04 Is yourself alright? Well so.
00:37:07 Because these contractors are private entities.
00:37:13 Private companies run by private individuals.
00:37:18 Who don't feel any connection to you?
00:37:21 At all.
00:37:23 Or to any of their neighbors.
00:37:25 These are citizens of the world if.
00:37:27 They're citizens of anything, or maybe their own kingdoms.
00:37:32 Certainly not members of your society.
00:37:35 And they now have this very valuable tool.
00:37:41 This very valuable influence tool.
00:37:47 And a lot of people.
00:37:48 Like would like to think.
00:37:49 Oh well, you know, there's no way.
00:37:53 There's no way I could be influenced by something like this.
00:37:58 I'm too smart for that.
00:38:01 I'm an individual.
00:38:07 I am an individual.
00:38:10 You can't predict my behavior.
00:38:14 I'm my own person.
Speaker
00:38:17 I'm unique.
Devon
00:38:19 I'm a snowflake.
00:38:26 And of course, that's not true.
00:38:28 All of us.
00:38:30 All of us are very.
00:38:32 You know, and this this is regardless of what you believe with God.
00:38:35 It's not.
00:38:36 It doesn't matter.
00:38:38 It has nothing to do
00:38:38 with, with God. We are
00:38:40 still complex machines, machines though
00:38:43 that have predictable behavior.
00:38:45 I mean you.
00:38:46 I'm sure there's people that you know very well.
00:38:49 That you can predict their behavior.
00:38:51 Right.
00:38:54 And the more you get to know.
00:38:55 Someone the easier it is to predict their behavior.
00:39:01 And the more that you can predict this behavior, the easier it.
00:39:04 Is to manipulate them.
00:39:08 Maybe not for bad reasons.
00:39:11 But it's true.
00:39:12 The more you get think about it like in terms of a spouse or children.
00:39:17 Or even just like a best friend or a parent.
00:39:21 The more you get to know their personality.
00:39:24 The, the more
00:39:26 you know what buttons to push to get the result
00:39:29 that you want.
00:39:30 And again, I'm not saying about anything nefarious, but let's just say you want your mom to take you or let you spend the night at your friend's house. This is something that.
00:39:41 Even kids figure out right?
00:39:44 And you know that there's like, a certain type of day that you should ask, like, don't ask her right when she gets home from work or right when she's cooking or, you know, you start to, you develop, you're basically doing what?
00:39:54 This AI is doing.
00:39:56 You're developing a path of least resistance.
00:40:01 You know that.
00:40:02 OK, if I ask her right after she's had her.
00:40:06 Her glass of wine with dinner and and everything's relaxed and. And you know she's she's watched her soap opera or whatever, you know.
00:40:17 You're going to have the most success.
00:40:20 In that moment, and then you can even you even know how to tailor the question like you know that like, OK.
00:40:25 And if I asked her this certain way, you know I can't just demand that she let me do this.
00:40:30 I have to make it sound like it's her idea.
00:40:33 You know, this is just this is just basic.
00:40:36 Manipulation tactics that everyone uses.
00:40:43 And So what this product is that these contractors have developed?
00:40:48 Is this exact exact same thing?
00:40:53 Just on a massive scale.
00:40:55 And whether you want to believe it or not.
00:40:58 You are, you are manipulable. Yeah, I don't know, some people are easier to manipulate than others. You know, we joke around about NPCs, right?
00:41:08 But everyone.
00:41:10 Has been manipulated at one point or another in their life.
00:41:13 Everybody has.
00:41:14 I have.
00:41:15 You have.
00:41:16 Everyone's been manipulated.
00:41:18 Everyone's had buyer's remorse.
00:41:23 Had buyer's remorse with, like, a girlfriend or something like that. You know, everyone's made a decision
00:41:31 Because they were manipulated into making that decision.
00:41:36 and then realized, after those conditions went away, ohh, I made a mistake. Or sometimes you don't realize that you made a mistake. Some people go years without even realizing they're getting manipulated. And I guess that's the NPCs, right?
00:41:52 And so these private contractors.
00:41:57 Have this software.
00:41:59 That can.
00:42:01 And does.
00:42:03 Manipulate people.
00:42:07 It just does.
00:42:09 Given given the resources and access to data necessary.
00:42:13 For it to work.
00:42:15 It does. It manipulates people.
00:42:23 It does work and it works on everybody to varying degrees, but it works on everybody.
00:42:29 And it's only.
00:42:30 Going to get better.
00:42:32 You know, I think that we saw especially around 2016, you'd see a lot of this stuff kind of in its.
00:42:38 I wouldn't call it its infancy, but it's, you know, it was.
00:42:41 It was a little unrefined.
00:42:42 It was somewhat easier, as you know, as time went on, you would.
00:42:46 Start to notice.
00:42:48 Oh, that's obviously a ShareBlue bot, or that's, you know, whatever, a Hillary bot,
00:42:54 Kind of a thing.
00:42:56 And you would start to because you'd recognize patterns.
00:43:00 Because a lot of this stuff with.
00:43:02 The AI and and everything is just it's.
00:43:05 It's not as complex as a human, right? So it's going to have some of its limitations, but that's going to go away.
00:43:10 That's going to that's going to slowly go away.
00:43:15 You know we have.
00:43:17 They have chat bots that they've developed, right, and they do the Turing test. You know, for those of you who don't know, it's, you basically
00:43:27 talk to a chat bot and try to determine if it's a chat bot, and there's certain methods you can use, you know, ways to talk,
00:43:37 That'll reveal that it's a machine, right?
00:43:40 Like you can ask it certain things that that trip trip it up or or whatever.
00:43:44 And that's getting.
00:43:46 Harder and harder to do.
00:43:48 They're getting chat bots that are that are becoming.
00:43:52 More and more human like and not just chatbot.
00:43:56 There's also AIs, and these are easy to find online, which, by the way, means they're not the best ones,
00:44:03 where you can tell it to read a bunch of newspaper articles and then write a newspaper article based on the newspaper articles that it read, and it does, and,
00:44:13 You know it's.
00:44:14 Like 90% there. Like there's some sentences. They're just like that doesn't even make sense.
00:44:19 But again, this is what's just publicly available for free.
00:44:23 So that's not.
00:44:24 What they're using?
00:44:26 At all.
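
The crude, freely available version of read-articles-then-write-one can be as simple as a word-level Markov chain. A toy sketch on invented text; it produces exactly that ninety-percent-there output, grammatical in patches and nonsensical in others:

```python
# Word-level Markov chain: "read some articles, then write one" in the
# crudest possible way. The corpus is invented for the example.
import random

corpus = (
    "the committee approved the budget on friday "
    "the committee rejected the proposal on monday "
    "the mayor approved the proposal on friday"
)

words = corpus.split()
chain = {}
for a, b in zip(words, words[1:]):
    chain.setdefault(a, []).append(b)   # record which words follow which

def generate(start="the", length=12):
    out = [start]
    for _ in range(length - 1):
        nxt = chain.get(out[-1])
        if not nxt:
            break
        out.append(random.choice(nxt))
    return " ".join(out)

print(generate())  # reads almost like a sentence, occasionally doesn't
```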
00:44:29 So you have these private contractors who are not.
00:44:35 They're not bound.
00:44:39 By FOIA, they're not bound by the Freedom of Information Act.
00:44:44 So if you send a request to this company, like, you know, send me all your emails on all of your dealings with Hillary Clinton or whatever, they just tell you to **** ***.
00:44:55 They don't have to respond to anything.
00:44:57 They're a private company.
00:44:59 They can do what they want.
Speaker
00:45:03 This is.
Devon
00:45:05 What we're witnessing right now.
00:45:10 Is when individualism and capitalism and my private company.
00:45:16 Starts to really, really.
00:45:20 Spin out of control.
00:45:25 You know that this whole thing with, you know, let's let's fight the good fight.
00:45:29 Let's let's go after these censorious.
00:45:35 Companies like YouTube and Facebook and Twitter, and that sort of a thing.
00:45:40 Yeah, it's really kind of a small.
00:45:43 Part of the pie here.
00:45:46 And that, that's that's certainly, I guess important and it's important that people understand that, but that's.
00:45:53 That's only the beginning of the problems
00:45:55 we're going to have.
00:45:57 Unless something changes.
00:46:01 And it's funny because it's all the people.
00:46:05 Right that push individualism.
00:46:10 All these people that.
00:46:13 That say, oh, you know, private companies can do what?
Speaker
00:46:16 They want.
Devon
00:46:18 It's them. But they're getting
00:46:22 screwed over by this stuff
00:46:25 the most.
00:46:28 Now another thing to think about with this this kind of a tool, right?
00:46:31 This kind of influence tool.
00:46:34 I mean, if I was running one of these.
00:46:36 Private companies well.
00:46:37 Let's first take it one more step.
00:46:41 The problem with this, you know, the private company having, we've already described the problem, but what Millie describes in her
00:46:50 documentary is, these companies now, it's no longer just a tool of an intelligence agency. That, I mean, they still, I mean, let's just be honest.
00:47:01 The CIA spies on Americans.
00:47:03 It's in their charter.
00:47:04 They're not supposed to do that.
00:47:05 But everyone knows they do, right?
00:47:06 But at least, like, you
00:47:07 know, there's some
00:47:08 kind of,
00:47:08 I don't know.
00:47:08 I don't know
00:47:09 that really matters that much. But, like, there's nothing to prevent, not even like some charter that they're going to ignore,
00:47:16 there's nothing to prevent the private company from selling these tools.
00:47:22 Not just to people that want to sell you cheap sunglasses on Facebook or something like.
Speaker
00:47:26 That but you.
Devon
00:47:27 Know people that want to influence elections.
00:47:31 Right.
00:47:35 And that's what's happening.
00:47:38 And that is what's happening and it's been happening for a long time.
00:47:47 That is, that's essentially.
00:47:50 What this documentary covers, and it's a little more specific, it talks specifically about.
00:47:56 A tool it talks specifically about, you know, some of the companies, some of the contractors and the people behind these companies and that sort of a thing and.
00:48:07 And yet like.
00:48:07 I said this isn't a deep dive.
00:48:09 That's not.
00:48:10 I I want to talk about the concept because I think that's what a lot of people are missing is kind of this the big picture of what this is.
00:48:16 But if you want.
00:48:17 to find that stuff out, it's in the documentary. And, like I said, the product and the companies mentioned, they're just a drop in the barrel.
00:48:27 There's so many of these.
00:48:30 These contractors out there that are that have these kinds of tools and that are making them available to anyone who wants to influence the public and they work.
00:48:39 And they work.
00:48:43 And I know there's a lot of people that are tired of me banging the OR beating the the queuing on dead horse.
00:48:52 But part of this project management software.
00:48:56 Is exactly the kind of software you would use to manage a product or I'm sorry, a project.
00:49:02 Like QAnon, it's exactly what you would use.
00:49:07 Exactly what you would use.
00:49:13 Think about it.
00:49:15 You release the Q drop or whatever that is.
00:49:20 Whatever it's called.
00:49:21 For those of you, I mean, I think most people are familiar with QAnon, but if
00:49:24 you, the real short version is,
00:49:27 on, now, what is it, 8kun now, used to be on 8chan, before that it was on 4chan,
00:49:30 there's the anonymous poster posting stuff like, I'm an insider.
00:49:34 And don't worry, everybody. Everything's gonna be fine because there's patriots that are doing a secret war against.
00:49:39 The deep state, blah blah blah.
00:49:40 Blah blah. So don't worry, everything's fine and you know, here's some some stuff to research because.
00:49:48 You know, here's another rabbit hole to go down.
00:49:51 And everything's in code and whatever.
00:49:53 If you're using a tool like this.
00:49:57 You know, a lot of people are wondering, like, how is it still going on like this thing has been going on for almost four years now.
00:50:05 This QAnon thing, and it just seems to never die.
00:50:08 It seems to never die no matter how wrong the predictions are.
00:50:13 It just seems to keep going.
00:50:16 Well, now I think you have your answer.
00:50:20 You have a product like this
00:50:23 that, after every Q drop,
00:50:26 you are monitoring the data of all the Q people.
00:50:30 You know, which, you know exactly what they're doing.
00:50:33 You know, you've analyzed, like, let's say that there's one
00:50:36 Q drop that you do,
00:50:39 and it generates a bunch of videos, like a bunch of the Q YouTubers or whatever,
00:50:44 they make a bunch of videos and, you know, you get, like, a bajillion retweets, I guess.
00:50:49 Now, that's not as much of a thing because, you know, Twitter, I think, has banned the accounts.
00:50:54 But like Gab, even, you know, like, 'cause it's not like they only monitor the normal sites, they monitor everything that's open.
00:51:03 Think about Parler.
00:51:04 One of the things that
00:51:07 Parler does is Parler gets a lot of your
00:51:09 personal data, doesn't it?
00:51:13 You think it's not feeding that data directly to to software like this?
00:51:17 Feeding that data directly to products like this?
00:51:21 Absolutely, it is absolutely it is.
00:51:27 And for the sites like Gab, they don't have to participate in, like, an open sharing, which I absolutely think Parler is.
00:51:36 Because it's open, it's an open website, right?
00:51:38 Like you can just go and and scrape accounts all day long, and there's nothing to stop you from doing that.
00:51:43 Same with Twitter, Facebook, I think to some extent.
00:51:49 You have this product.
00:51:51 This AI that can determine OK well that post that you just did.
00:51:57 at this time, and you used this kind of phrasing, really did, it generated a lot of Twitter stuff and a lot of, you know, YouTube videos and a lot of activity, a lot of engagement. That's, that's like, if you want to think of, like, a really small
00:52:17 Version of what I'm talking about.
00:52:20 There are Twitter people that do this OK.
00:52:24 I'm not one of them.
00:52:25 But there are Twitter people that do this.
00:52:27 Where they'll tweet something and they'll immediately open up the metrics.
00:52:35 And they will watch to see how that tweet performs.
00:52:39 And they will.
00:52:40 They'll basically do by hand or manually or whatever.
00:52:45 What this AI is doing.
00:52:47 They'll they'll take note of the time of day that they sent it out.
00:52:52 They'll take note of the phrasing they used.
00:52:54 Maybe the the length of the tweet.
00:52:59 Different, you know, different aspects, but they can see.
00:53:02 The rate at which it's being shared, you know and and when you look at these metrics, it gives you enough data to at least kind of get a feel right.
00:53:10 And there are people.
00:53:12 I know this for a fact.
00:53:14 That really watch that and they craft each tweet.
00:53:19 Carefully based on what they've learned by watching that little screen.
00:53:25 And it works.
00:53:26 If you're good at.
00:53:26 It it works.
00:53:29 And that's what that AI is doing, just in a more complex way than a human could do.
00:53:34 Because it's looking at not just one little screen, it's looking at like a billion different metrics that no one human.
00:53:39 Being could keep track of.
00:53:42 But it's using the same methods.
00:53:45 It's using the exact same methods.
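
The manual habit just described is easy to picture as a script. A sketch with made-up log entries: bucket your own posts by the hour they went out and compare average engagement. The AI in question does the same thing across far more dimensions at once.

```python
# The "watch the metrics screen" habit as a script: bucket posts by hour,
# compare average engagement. Log entries are invented sample data.
from collections import defaultdict

# (hour_posted, word_count, engagements)
log = [
    (9, 12, 40), (9, 30, 22), (13, 15, 85),
    (13, 25, 90), (21, 10, 300), (21, 40, 260),
]

by_hour = defaultdict(list)
for hour, _, engagements in log:
    by_hour[hour].append(engagements)

for hour in sorted(by_hour):
    avg = sum(by_hour[hour]) / len(by_hour[hour])
    print(f"{hour:02d}:00  avg engagement {avg:.0f}")
# Here 21:00 wins. A human eyeballs one screen; the software described
# above crunches thousands of such dimensions simultaneously.
```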
00:53:47 So if you want to know why
00:53:48 QAnon's not
00:53:49 dead, that's why.
00:53:54 I don't know that for a fact, but.
00:53:56 I I'd be very surprised.
00:53:59 If a product like that.
00:54:03 Isn't being used.
00:54:07 And that's how they keep it going.
00:54:14 And that's not all.
00:54:17 You got to think about it's not just monitoring.
00:54:21 how, and tailoring and reacting to, the psyop, like QAnon, right?
00:54:31 You also have to think about.
00:54:38 Narrative herders.
00:54:40 Bedtime story authors.
00:54:45 In other words.
00:54:48 It's so much easier if you if you have someone that's already in existence.
00:54:54 That has a platform.
00:54:56 I mean, look, you could use me as an example.
00:54:58 I wouldn't be immune from, you know, something like this, right?
00:55:02 I have subscribers.
00:55:04 I've got a following.
00:55:06 I have people that that like what I say I I'm sure I influence.
00:55:10 I hope I influence honestly.
00:55:11 I mean that's kind of why I'm doing it.
00:55:13 I'm honest about that.
00:55:14 I'm not,
00:55:14 not trying to trick you into thinking stuff, I'm
00:55:17 trying to be honest.
Speaker
00:55:18 You guys.
Devon
00:55:20 But if you have someone like me or there's and there's lots of people with way bigger, you know followings than I've gotten, you know, some smaller or whatever.
00:55:27 I guarantee you if you're running an influence campaign, that's something that you're going to target.
00:55:32 In other words, why?
00:55:35 Bother with crunching the numbers.
00:55:39 With every single.
00:55:43 Target that you want.
00:55:47 When you can.
00:55:49 Do it more efficiently through a conduit through an asset.
00:55:57 And the asset,
00:55:58 It doesn't necessarily mean the asset knows they're an asset.
00:56:01 OK.
00:56:02 That doesn't mean that they get someone like me and they blackmail me.
00:56:06 Right?
00:56:07 And then.
00:56:07 You have to say this.
00:56:09 Tell your audience this or else.
00:56:12 You know, it's doesn't have to be that or even like, oh, tell your audience this, it will pay you.
00:56:16 It doesn't.
00:56:16 It doesn't have to be that at all.
00:56:20 They just have to use an influence campaign directed at me.
00:56:26 That's all they got to do.
00:56:29 Yeah, assuming it works.
00:56:30 And there's going to be like again.
00:56:32 On some level, I would imagine it would probably work.
00:56:36 Some, something would work to some degree, because I don't think anyone
00:56:39 is immune to this stuff.
00:56:40 Right.
00:56:41 It's just more of like how much it would work.
00:56:44 You know, like Facebook, this was years ago, too.
00:56:47 They they did a little test to see if they could put people in a bad mood and they did it right.
00:56:51 Like with not 100% success and I would like to think I'm because I.
00:56:55 Study this for a.
00:56:56 Living, you know that I'm maybe a little more immune to it than most, but.
00:57:01 I'm sure that they could, you know, put me in a bad mood or something like I'm.
00:57:04 Sure that they could **** with my head.
00:57:06 The right part, maybe not everyone could, but like, I'm sure the right software with the right people behind it, there, you know, no one's, that's the whole point. This is why this stuff's dangerous. No one's immune to it.
00:57:19 And so if you're one of these influence operations.
00:57:25 Why not get some of these people who have already established an audience they've already established trust with an audience?
00:57:33 And and try to.
00:57:34 I mean you're you're to run your own influence campaigns.
00:57:37 You're not going to stop.
00:57:38 It's all you're going to stop doing what you're.
00:57:39 Doing you're keep doing it, but to augment that.
00:57:43 You're going to want to.
00:57:46 Find other influencers who already have.
00:57:51 You know an audience and try to get.
00:57:54 In their heads.
00:57:57 And try to influence their thinking.
00:58:01 And like I said, this doesn't even have to be like a blackmail thing or you're.
00:58:04 Getting paid thing.
00:58:06 It could be let's.
00:58:09 Let's go on his Facebook feed.
00:58:15 You know, constantly post, like, you know, they'll dig,
00:58:19 I would imagine that they've got a lot of data on everyone, including myself.
00:58:22 They've got a lot of data on everybody and they got a lot of data on all these.
00:58:25 Influencers, you know, they can go through the years and years.
00:58:28 Of what you post on the Internet, you know, all kind of any records that you would be able to come up with.
00:58:35 What did he study in school?
00:58:39 How many kids does he have?
00:58:41 You know what?
00:58:42 What jobs?
00:58:43 What's his favorite movie?
00:58:45 What kind of music does he like?
00:58:47 And using that information.
00:58:50 You know, the AI probably already has an algorithm that all you do is you feed it the data.
00:58:55 And then it says OK.
00:58:57 Based on this data that I have about him, he's going to be susceptible to this.
00:59:03 And this and this, these are the.
00:59:05 Angles you need.
00:59:06 To use like psychologically, like that's what it's doing.
00:59:09 It'll create like a psychological profile.
00:59:12 And it will say OK, psychologically these are the.
00:59:15 Angles of attack.
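
As a deliberately crude stand-in for feed-it-the-data, get-back-the-angles: hand-written rules scoring a few psychological levers from invented profile fields. A real profiler would be learned from data, not hand-coded, and none of these fields or rules come from the documentary.

```python
# Toy "psychological profile -> angles of attack" scoring.
# All fields, rules, and weights are invented for illustration.
profile = {
    "topics_posted": ["movies", "parenting", "local news"],
    "posting_hours": [22, 23, 0],
    "engages_with_outrage": True,
}

def angles_of_attack(p):
    angles = []
    if "parenting" in p["topics_posted"]:
        angles.append(("appeal to family and safety", 0.8))
    if p["engages_with_outrage"]:
        angles.append(("outrage bait", 0.9))
    if any(h >= 22 or h <= 1 for h in p["posting_hours"]):
        angles.append(("late-night delivery window", 0.6))
    return sorted(angles, key=lambda a: -a[1])

for angle, weight in angles_of_attack(profile):
    print(f"{weight:.1f}  {angle}")
```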
00:59:18 And they just do what they would do to like Afghanistan to influence their election to the influencer directly, right?
00:59:26 This is one reason one of many.
00:59:29 I don't spend a lot of time on social media if I do, it's it's like right now.
00:59:35 It's like I'm telling people stuff, you know, or I'm researching something specific, right?
00:59:40 Like either I'm releasing stuff.
00:59:43 Or I am researching stuff.
00:59:47 But I don't I don't screw around on like Facebook or not even really Twitter, you know, I'll tweet stuff.
00:59:54 But and this is another thing like think about this, if you're wanting to like, how would you get to an influencer, right?
01:00:02 I mean, YouTube comments.
01:00:04 In the chat right now, I haven't been looking at it and see, that's not like I don't look at the chat.
01:00:08 Sorry guys, I'll I.
01:00:09 Look at the Super chats.
01:00:10 Maybe at the end, but I don't look at the chat very often and maybe after I've had some time to refine who can be in there.
01:00:18 So you can weed out some of these people.
01:00:20 That's one.
01:00:20 That's one of the reasons.
01:00:22 You're gonna have people saying ****. Even if you think to yourself, oh, that's not going to bother me, that's not going to get in my head. And it might not, to the degree that the person saying whatever it
01:00:32 is they're saying,
01:00:33 like, what they want it to do, it might not have the same
01:00:35 impact that they want it to. But,
01:00:37 I mean.
01:00:38 I'm not a robot.
01:00:41 Right.
01:00:43 It's going to have, even if it's a tiny, tiny, tiny amount even.
01:00:46 If it's a tiny crack.
01:00:49 In my wall my.
01:00:51 My massive.
01:00:55 Brick wall protecting my.
01:01:01 Innermost thoughts, or I don't know.
01:01:04 Anyway, you get it.
01:01:06 Like, it's gonna, you know, it's
01:01:07 gonna at least,
01:01:09 it's gonna do something, right?
01:01:11 And that's, we're humans.
01:01:13 That's just the way it is.
01:01:15 In fact, even if, like, let's say I saw a comment that I knew was from a bot, that's gonna
01:01:19 have an influence, cuz now
01:01:21 it's, oh, there's bots in here.
01:01:23 Maybe that's the.
01:01:24 Influence they want.
01:01:24 Maybe they want me to think.
01:01:25 There's bots.
01:01:27 You know, I mean like it, it's you see how this can get.
01:01:31 Real easy to do.
01:01:35 If you have an AI that can crunch all these numbers and and predict the behaviors.
01:01:40 And look, it's it's a machine.
01:01:42 And, well, even if it was a person,
01:01:43 It's going to get it wrong sometimes.
01:01:46 We are human, right?
01:01:48 We're unpredictable.
01:01:49 We don't always behave the way that the machine or the algorithm or the analysts are going to.
01:01:54 Predict that we're going to behave.
01:01:58 What I'm telling you is, as the years go by, it's going to get better and better and
01:02:03 Better at these predictions.
01:02:06 Because at the end of the day, while we might be complex, unpredictable machines, we're still machines.
01:02:12 Soul or no soul, we're still machines.
01:02:15 The soul is still operating within this machine.
01:02:20 Don't believe me?
01:02:22 Well, then, how the **** does Down syndrome work?
01:02:25 Does their soul have Down syndrome?
01:02:29 No, of course not.
01:02:31 The machine that their soul is inside of has Down syndrome.
01:02:36 We're machines.
01:02:39 And because we're machines.
01:02:41 some of our behavior is predictable. It just is.
01:02:48 So that's what she covered in her.
01:02:54 Documentary. It's worth a watch.
01:02:57 It's worth a watch.
01:02:59 I tweeted out, Suzie Dawson did a really long
01:03:05 analysis. She analyzed the
01:03:09 documentary, and it, like, it's a really long thread. Like, I think it's like 70 tweets.
01:03:14 Like, really, it's really long.
01:03:17 And I retweeted that on my Twitter.
01:03:19 If you wanna go check that out.
01:03:21 Susie Dawson, if you don't know, is a journalist who has since fled to Russia.
01:03:26 Snowden style.
01:03:28 Because she was doing a lot of reporting on the intelligence agencies, and honestly probably had tools like this unleashed on her. Because these tools, it doesn't have to be, oh, I want to influence this person to get people to vote for Trump.
01:03:46 Or Biden or whatever, right?
01:03:48 It doesn't have to be that.
01:03:49 If you have someone that you don't like, you can.
01:03:52 You can tailor the message to get them to commit suicide or something like that, right?
01:03:57 Really. I mean, you could.
01:03:59 If your objective is to drive someone crazy.
01:04:03 Drive someone to suicide.
01:04:05 I don't see how the AI wouldn't have that capability.
01:04:08 I mean, if I was writing it, I would definitely include that capability.
01:04:15 That's just the way it is.
01:04:16 I mean, that's why this software is so dangerous: because it becomes more powerful as these AIs get more refined, and as the data the AI has access to becomes more granular and more real time.
01:04:41 All these NPCs, I'm sorry, they're not going to just be NPCs. They're going to be ******* wind-up toys. I mean, they are. That's most people, most people.
01:04:58 Imagine that. Imagine a software tool that the richest, you know, the highest bidder, has access to, that literally controls the behavior of most people.
01:05:13 Do you think stuff like these riots wouldn't use software like this?
01:05:23 I mean, I don't think we have to get into all the reasons why, but.
01:05:27 Seems like that's an easier target to work with, right? An easier target to sway, an easier target to influence, an easier target to whip up into fits of violence.
01:05:46 And actually, Millie touches on that a little. I don't think this was part of it, but it is an important point that when Obama was president, these same people we're discussing were given free phones.
01:06:02 The Obama phone. The Obama phone, I don't know if you guys remember that. Obama gave us a phone.
01:06:13 Well, that does a couple things. One, that phone is a two-way street. So now this AI that you've developed has access to this community's phones. In fact, I mean, they could even make it legal. They could write that into the terms of service. Yeah, you get a free phone, but you have to share.
01:06:32 Like, think of it this way. How many times have you seen, when you've installed a piece of software, at the very end or during the installation process, there's the little check mark where it says: can we send anonymous, so don't worry about it, it's anonymous, anonymous data to help us improve our product?
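For what it's worth, that checkbox usually maps to a consent flag in code. Here is a minimal Python sketch, not any real installer's implementation; the setting name and event fields are hypothetical, and a print stands in for the network call.

```python
import json
import time

SETTINGS = {"share_anonymous_data": True}  # the box, pre-checked by default

def report_usage(event: str) -> None:
    # If the user never unchecks the box, every call here phones home.
    if not SETTINGS["share_anonymous_data"]:
        return
    payload = {"event": event, "ts": time.time()}
    # A real client would POST this to the vendor's endpoint.
    print("would send:", json.dumps(payload))

report_usage("feature_opened")              # sent
SETTINGS["share_anonymous_data"] = False    # the user unchecks the box
report_usage("feature_closed")              # silently dropped
```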
01:06:58 I hope you're not leaving that checked.
01:07:03 I hope you're not.
01:07:03 In fact, I have this theory. I don't think this is right, I think this is wishful thinking, but I have this theory that, at least because of that aspect, because there is that check mark, the high-IQ people are seeing it and going, **** that.
01:07:23 And so the AI is being deprived, at least on some scale, probably a much smaller scale than really matters, but on some scale it's being deprived of the data from the smart people who are going, no, **** that. And so it's going to base all of its decisions on what influences dumber people.
01:07:44 But, you know, that's wishful thinking.
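In machine-learning terms, that theory is a claim about selection bias in the training data. Here is a small Python sketch of the idea, with two invented groups that respond to the same signals differently by construction:

```python
import random
from sklearn.linear_model import LogisticRegression

random.seed(0)

def person(opted_in: bool):
    x = [random.random(), random.random()]
    # The groups follow different rules on purpose, to model the theory.
    engaged = (x[0] > 0.4) if opted_in else (x[0] + x[1] < 0.6)
    return x, int(engaged)

# Only opt-in data ever gets collected, so only it gets trained on.
train = [person(True) for _ in range(500)]
model = LogisticRegression().fit([x for x, _ in train], [y for _, y in train])

for label, group in [("opt-in", True), ("opt-out", False)]:
    test = [person(group) for _ in range(500)]
    preds = model.predict([x for x, _ in test])
    correct = sum(int(p == y) for p, (_, y) in zip(preds, test))
    print(label, "accuracy:", round(correct / len(test), 2))
```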
01:07:47 They have access to it anyway. Well, not everyone does. Not all the advertisers and such might. But the governments and these contractors certainly have access, and they don't require your permission to get it.
01:07:58 But in the case of this Obama phone thing, it could even just be in the terms of service. Oh yeah, you get this phone for free, but we need these anonymous statistics to help improve the Obama phone.
01:08:12 And so they just get that data, and they now have full-on access to what this community does on phones. What memes do they share, what websites do they visit, what songs do they like?
01:08:29 Who are their favorite movie stars? What jargon do they like? What new words have entered their vocabulary?
01:08:41 You know.
01:08:44 You know, if you're getting all these text messages, you now know how to text like them.
01:08:50 What videos are they commenting on?
01:08:52 What are their comments saying? So you get all this data.
01:08:56 And it's just like I was telling you: when you're a kid, you really get to know your mom, and you know exactly how to ask her for the things you want. That's what that AI just accomplished with these Obama phones.
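Mechanically, the profile-building described here is just aggregation over an event log. A toy Python sketch, with invented events and fields:

```python
from collections import Counter

events = [
    {"type": "visit", "site": "example-news.com"},
    {"type": "share", "meme": "meme_123"},
    {"type": "song",  "title": "Track A"},
    {"type": "text",  "body": "fr that was wild"},
    {"type": "visit", "site": "example-news.com"},
]

profile = {"sites": Counter(), "memes": Counter(),
           "songs": Counter(), "vocab": Counter()}

for e in events:
    if e["type"] == "visit":
        profile["sites"][e["site"]] += 1
    elif e["type"] == "share":
        profile["memes"][e["meme"]] += 1
    elif e["type"] == "song":
        profile["songs"][e["title"]] += 1
    elif e["type"] == "text":
        # "Learning to text like them": count the words they actually use.
        profile["vocab"].update(e["body"].lower().split())

print(profile["sites"].most_common(1), profile["vocab"].most_common(3))
```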
01:09:08 And not only that, because it's a two-way street, these Obama phones, they can now use what they've learned to then manipulate, just like you do with your mom.
01:09:25 They can now manipulate the people with the phones into thinking and doing whatever they want.
01:09:33 And I'm sorry, but you know, if you're a recipient of an Obama phone, I don't have a whole lot of hope for your ability to resist an influence campaign. You know, for obvious reasons.
01:09:54 So that is another aspect that she covers.
01:09:58 But like I said, I think that's really just a small piece, because I don't think they need the permission. And also think about just every new phone that you get. You know, the default is it gives you, like, news updates.
01:10:19 You don't think that's part of this? You don't think that plugs in? Part of this software isn't just the AI figuring out what's going to influence you and coming up with stuff.
01:10:35 The reason why none of this stuff was dangerous before, or in existence before, is just the sheer amount of manpower it would take, not just to perform it, but to coordinate it.
01:10:50 Right.
01:10:54 But I don't know how many of you listening have worked for a major corporation that works on, you know, multi-billion-dollar projects, or if not billion, projects that are at least tens of millions of dollars and take, like, maybe 10 years to accomplish.
01:11:11 Think of the iPhone. Think about the project management software required when they came out with the first iPhone.
01:11:19 You don't think that that was a complicated task?
01:11:23 You don't think that? I mean, think of all the different companies involved in that.
01:11:28 I'm sure some people are at least familiar with the Gorilla Glass thing, where Steve Jobs went and talked to the guy who had a glass company and got him to make the glass for the iPhone. But that's just one tiny story in this huge, complicated, ambitious project that required a lot of ******* project management.
01:11:56 Just the hardware.
01:11:58 And then think of the software. You have the operating system that had to be written, and you had all the different apps that had to be written for a brand new operating system.
01:12:08 I mean, this wasn't just something where, you know, Steve Jobs was like, I know, an iPhone. I'm gonna draw a picture of an iPhone on a napkin and then hand it to an engineer, and he's going to make me one.
01:12:19 No, we're talking, like, a lot. A lot. A lot of project management went into this stuff.
01:12:28 And if you've worked on a project like that, and I'm not saying I was involved with the iPhone, but I've worked on big projects like this that cost, you know, tens of millions of dollars and took multiple years to complete, and my part of it was only a really compartmentalized aspect of the bigger picture.
01:12:47 But I would have to participate in the project management software so that the project managers would be able to keep track. I mean, it would be impossible for them to accomplish this without the software, because there are just so many moving pieces, so many different people with different milestones to meet, different requirements, different deadlines.
01:13:11 It's just a mess if you've got 20,000 people working on the same project for multiple years. The only way you can accomplish that is with this project management software. So that's another aspect of this.
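The core of that kind of software is dependency tracking: thousands of tasks that have to happen in a workable order. Python's standard library can express the kernel of it; the tasks below are invented stand-ins, loosely themed on the iPhone example:

```python
from graphlib import TopologicalSorter

# task -> the set of tasks it depends on (all hypothetical)
deps = {
    "prototype glass": set(),
    "design case": {"prototype glass"},
    "write OS": set(),
    "port apps": {"write OS"},
    "assemble units": {"design case", "port apps"},
    "launch": {"assemble units"},
}

# One valid work order that respects every dependency. With 20,000
# people and tens of thousands of tasks, no one keeps this in their head.
print(list(TopologicalSorter(deps).static_order()))
```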
01:13:26 So if you have an influence campaign, and you now have the ability to manage all these moving pieces, you can add more complexity to it.
01:13:40 I don't want you to think that this software, for example, is just what they use to see the effectiveness of, you know, Q Anon, like that example, where it's limited to just how many videos are being created and how many tweets are being made and that sort of thing.
01:13:59 Yeah, you could include all kinds of stuff.
01:14:01 You could also say, OK, we're going to incorporate, or maybe they already have, an algorithm or some kind of prediction of how this will unfold.
01:14:12 But say we know that the people who like Q Anon hate The Daily Beast. I'm just making that up, I don't know that they do. I think they lean hard left, or at least that's my understanding.
01:14:29 So they could say, OK, Q Anon people hate The Daily Beast. We'll have The Daily Beast write an article about Q Anon, and it'll actually make them believe it more. You know, stuff like that.
01:14:45 And it doesn't have to be Q Anon. Every asset you have out there, like I was just talking about, is going to try to influence individual people, whether they know they're an asset or not.
01:15:03 And so that's part of the management software too, if you've got a big overarching campaign.
01:15:09 In fact, that was part of the screenshots of the software in Millie's documentary: if you had an influence campaign that was part of a bigger campaign, you had to tell the management software which larger campaign it was a sub-campaign of.
01:15:26 So the larger campaign, in Q Anon's case for example, is to get Donald Trump elected. That's the overarching campaign, right? And then Q Anon is a sub-campaign within it. And look, Donald Trump getting elected could itself be a sub-campaign of a bigger campaign.
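Structurally, what those screenshots describe is just a tree of campaigns. A plain Python sketch of the hierarchy idea, not the actual software from the documentary, with hypothetical objectives:

```python
from dataclasses import dataclass, field

@dataclass
class Campaign:
    objective: str
    subcampaigns: list["Campaign"] = field(default_factory=list)

root = Campaign("overarching objective (hypothetical)", [
    Campaign("get candidate elected", [
        Campaign("sub-campaign: online movement"),
        Campaign("sub-campaign: targeted press coverage"),
    ]),
])

def show(c: Campaign, depth: int = 0) -> None:
    # Print the tree with indentation to show the nesting.
    print("  " * depth + c.objective)
    for sub in c.subcampaigns:
        show(sub, depth + 1)

show(root)
```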
01:15:51 You see, that's why this software and this data is so dangerous. And I just don't think people are wrapping their heads around it, because they don't work in the kinds of fields where you would. Unless you're someone who's writing code, or someone who's worked with big data to accomplish things, or maybe even used some real specific AI, you don't realize how much better it's getting.
01:16:20 You know, like the other day, there's an AI that upscales footage.
01:16:26 You know how everyone makes fun of those shows from the 80s where they would say zoom, enhance, zoom, enhance, with, like, security camera footage, and it would blow it up to this insane detail? Most people these days know how video works, right?
01:16:44 Back then you had SD video, and this used to drive anyone who worked on video at the time ******* nuts. If you had a security camera on standard def, you're talking not very many pixels, like 480 by 720 at best.
01:16:59 And they would zoom in on, like, basically five pixels, and there'd be a face. Really? You zoomed in on five pixels and made a face out of them? No, it would just be five ******* giant pixels on your screen. There is no zoom, enhance, zoom, enhance.
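The arithmetic behind that complaint is trivial to demonstrate. A few lines of Python with NumPy: blowing up a handful of pixels adds no information, it just makes bigger blocks.

```python
import numpy as np

patch = np.array([[12, 200, 45, 90, 7]])   # five pixels of "a face"
zoomed = patch.repeat(100, axis=0).repeat(100, axis=1)  # nearest-neighbor zoom

# Still exactly five distinct values: five giant squares on the screen.
print(patch.shape, "->", zoomed.shape, "| unique values:", np.unique(zoomed).size)
```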
01:17:17 And that was true, until recently.
01:17:22 What they did was they got a bunch of low-res, or I'm sorry, high-res footage, and then they down-res'd it. So they would shoot something at, like, 6K, and then they would convert it down to, like, 320p, or maybe even less, I don't know.
01:17:43 And they would feed it into this AI, hours and hours and hours of this. And the AI, by looking at hours and hours and hours of the differences between what the 6K footage looked like and what the 320p footage looked like, started to find patterns.
01:18:15 And it could start to predict, just by looking at the 320p footage, what the 6K footage looked like. It didn't always get it right. But it's ******* scary good.
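That training recipe, downscale the good footage and learn to undo the damage, looks roughly like the following Python sketch. Real super-resolution models are vastly bigger; random tensors stand in for actual footage here, and the architecture is a deliberately tiny placeholder.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyUpscaler(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1),
        )

    def forward(self, lowres):
        # Naive upsample first, then let the network sharpen the result.
        up = F.interpolate(lowres, scale_factor=4, mode="bilinear",
                           align_corners=False)
        return up + self.net(up)  # predict a correction on top

model = TinyUpscaler()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

for step in range(100):
    hires = torch.rand(8, 3, 64, 64)                  # stand-in for 6K frames
    lowres = F.interpolate(hires, scale_factor=0.25,  # the "down-res" step
                           mode="area")
    loss = F.mse_loss(model(lowres), hires)           # learn to undo it
    opt.zero_grad()
    loss.backward()
    opt.step()

print("final loss:", loss.item())
```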
01:18:31 I forget the name of it. I'm sure you could find it by looking up AI upscale footage, that kind of thing, on a search engine.
01:18:41 But I had never heard of this until recently. I think I stumbled across it looking for something else, and I was like, what? Because I still had that in my head, the old zoom, enhance, zoom, enhance, and, oh, AI is going to fix that? It did. It was scary.
01:18:59 They had example after example where you feed 320p footage into it. And again, like I said, it's not perfect. If you have text on a sign, for example, that is illegible because it's only a couple of pixels, it's not going to be able to figure out what the text says.
01:19:19 You'd think that's impossible. Well, that's the point: it's actually not impossible. That's way down the road, though, and even that would never be 100% right. But it could predict: what might this text be?
01:19:32 You know, like: this sign is a McDonald's sign. I have read every McDonald's sign that has ever existed. So what might this text be?
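Framed that way, reading an illegible sign stops being reconstruction and becomes guessing against a catalog of things the model has already seen. A toy Python sketch of that scoring idea, with invented templates standing in for learned sign renderings:

```python
import numpy as np

rng = np.random.default_rng(0)

# Pretend these vectors are downscaled renderings of signs seen in training.
templates = {
    "OPEN 24 HOURS": rng.random(16),
    "DRIVE THRU":    rng.random(16),
    "NOW HIRING":    rng.random(16),
}

# A blurry observation: one of the templates plus heavy noise.
observed = templates["DRIVE THRU"] + rng.normal(0, 0.3, 16)

# Pick the known text whose rendering best matches the blur.
guess = min(templates, key=lambda t: np.sum((templates[t] - observed) ** 2))
print("best guess:", guess)   # a prediction, never a certainty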
01:19:40 But anyway, that is down the road. The point is, that's the kind of power they have now.
01:19:50 A machine never gets tired of looking at this data, and it never forgets any of the patterns that it sees. And once it has it figured out, it's figured out.
01:20:05 There's a documentary, speaking of autoplay. I don't know why YouTube kept playing it. Maybe this was part of the influence campaign on me, right? Maybe they're trying to get me to talk to you guys about this. You never know.
01:20:18 But it was one of the things that I saw recently on autoplay, and it played twice. I usually don't have autoplay on, for this very reason. Well, it's usually because all it ever plays me is, like, Fox News and Jordan Peterson.
01:20:31 And so it's usually when I have, like, a new browser, because I've been installing different machines, and I haven't turned it off or whatever, and I just have it on in the background or something.
01:20:42 And it autoplayed this documentary about an AI that they created, kind of like the Deep Blue that beat the chess champion, only it was for the Korean game Go.
01:20:59 And I've never played Go before. I'd never even heard of Go. It looked like Othello from the 80s, where you have the white and black circles on a board. I had never played Othello either, that's just what it looked like to me.
01:21:13 But apparently it's one of the most complicated games ever made. It's a Chinese game, but it's popular in Korea, and it's like 1,000 years old or some ****, or maybe longer, and it's literally the most complicated game in existence.
01:21:30 And a computer had never beaten even, like, an amateur at Go.
01:21:38 And so this company in San Francisco created an AI, and I think it was called AlphaGo or Ultra Go or something-Go.
01:21:48 And the whole documentary was about the process of creating the AI. And, you know, by the end it beats the human. Spoiler alert, I guess, but it beats the world champion at Go.
01:22:04 And it's this big shocker, because this was supposed to be way more complicated than chess, because there are just a million more possibilities.
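The usual back-of-the-envelope comparison: chess is often ballparked at around 35 legal moves per position over roughly 80 plies, Go at around 250 moves over roughly 150 plies. Those are order-of-magnitude estimates, not exact counts, but the gap they imply is easy to compute in Python:

```python
chess = 35 ** 80    # rough count of possible chess game lines
go = 250 ** 150     # rough count of possible Go game lines

print(f"chess game tree ~ 10^{len(str(chess)) - 1}")  # about 10^123
print(f"go game tree    ~ 10^{len(str(go)) - 1}")     # about 10^359
```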
01:22:15 And not only did it beat the top player at Go, it was making moves that had never been made, that no human would make, right?
01:22:27 In fact, they looked like errors to these pro players. The top players were looking at these moves and thinking, I think the computer just ****** up, because this looks like a mistake. I don't even understand where this would go. No pun intended.
01:22:47 And then eventually you figure it out, because as the gameplay goes on, you realize, oh, that's why it did that.
01:23:00 They're going to get better and better and better, and it is something you've got to be aware of, and something we need to talk about.
01:23:08 Because if we can't get rid of it, and we're not going to have access to it ourselves, really the only choice is to deny it access to you.
01:23:21 That really is the only choice you have if they have AIs and these influence machines.
01:23:28 Because, look, you can think that you're totally impervious to influence. You're not. You're just not, OK? Like, no one is.
01:23:38 You might be better than some. There's a gradient. It's not like we're all the same, right? But you're not impervious.
01:23:46 It's going to get to a point, unless something is done to stop this, where the only way to escape it is to deny it access to you. And there are only a few ways you can do that, and a lot of that involves a lot of unplugging.
01:24:05 So anyway, I think it's important, you know, like the title says, who writes your bedtime stories?
01:24:14 It's important to see who's writing my narrative, and are they writing it because they know I'm going to like it?
01:24:26 Excuse me. Is this a specifically tailored narrative for me?
01:24:37 Yeah, it's not like 100 years ago, where if you picked up a book and you liked it, you didn't have to worry that some AI determined you would like it and wrote it specifically so you would like it.
01:24:51 Which is where we're headed.
01:24:56 Who's writing your narrative?
01:24:59 Because everyone in the world has a different narrative of what's going on in the world.
01:25:06 And it used to be we had at least similar narratives, but with this atomized society, a lot of us have wildly different narratives now.
01:25:18 So who's writing yours?
01:25:26 And are they writing it specifically to influence you?
01:25:33 Because our narrative is what determines what we do.
01:25:37 My narrative determines what I do.
01:25:39 You think I live in the middle of nowhere in a bunker because of something other than a narrative? No, my narrative is: look, it's going to be really ****** to be in cities. And it might be more than ******, I don't know, but it's possible.
01:25:55 That was my inner narrative. I told you guys all about this years ago. This is no surprise. I broadcast my narrative.
01:26:04 So what did I do?
01:26:05 Because that was my narrative.
01:26:06 My inner narrative was like, yeah, it's going to be bad to live in cities, and I don't even know if we can rely on supply chains as society just gets more and more atomized. You know, I had a narrative in my head as to how things were going to go, and based on that narrative, I made decisions and took action.
01:26:32 And everyone has that. Everyone has their narrative, and you base your decisions on that narrative.
01:26:39 A lot of people have a totally different narrative than I do, and they do the opposite things that I do.
01:26:47 Here's a perfect example of that.
01:26:52 Now the George Floyd incident.
01:26:56 What was the narrative?
01:26:59 The narrative was: a bunch of racist cops, for no reason at all, killed this innocent black man, like they've been doing to our people for decades.
01:27:19 If you believe that.
01:27:23 If that's your narrative.
01:27:27 It's pretty easy to predict what you're going to do, right? You're going to behave based on that narrative. Your actions are all going to be based on the narrative.
01:27:37 So if you see the looting and rioting, and you think to yourself, that is so insane, that's so out of control, I can't believe it, why are they doing that?
01:27:46 It's because you don't have that same narrative.
01:27:52 Like all those white chicks, you know, all these white people that are kneeling and doing all that ****. That makes sense within their narrative.
01:28:09 Your narrative determines what you're going to do.
01:28:17 Since the dawn of time, primitive tribes have had a narrative. And it's to control the tribe, not necessarily anything nefarious, but, you know, it's good to have everyone on the same page with the same narrative.
01:28:35 Motivated by the same things, based on the narrative, right? You know, the American dream is a narrative. Anyone can become president. You can be born poor and work your way up the system and eventually be in the White House someday. Of course that's not true, but that's the narrative, right?
01:28:58 And it helps certain people if you believe that narrative, because you're going to behave as if it's true. It's a good part of your narrative.
01:29:07 You know, think about all the George Washington stories. I will never tell a lie. I chopped down the cherry tree. That didn't happen. It's part of a narrative now. It's a made-up narrative.
01:29:22 There's a lot of stuff, by the way, that Lincoln said about slaves and black people in general that didn't get included in the narrative.
01:29:37 Got to maintain the narrative.
01:29:43 And even if it's fake, like in the case of Washington and Lincoln and all these people, at least with these fake narratives, the American dream and such, everyone believes it, everyone's on board.
01:30:01 Now, yeah, maybe you'll be under the influence of that narrative, but it's a lot better than everyone having unbelievably different narratives like we have now.
01:30:20 You know, because at least then, if everyone has the same narrative, it's just like with the Bible or any other book, right? There's one book, and then there are just different interpretations of it.
01:30:34 So you can have a book like the Bible, and you have 10 different people read it, and they come up with 10 different ideas as to what these passages mean. It's the same book. It's the same narrative. It's just that they have different interpretations.
01:30:51 Right.
01:30:52 And you at least have that kind of wiggle room, instead of the chaos that we have today, where now you've got 10 different interpretations of the Bible, of the Torah, of the Talmud, of the Quran, of the Satanic whatever, Darwinian, you know, new-age ****, you know, whatever.
01:31:20 So, and that's just one aspect, right?
01:31:22 That's just religion.
01:31:28 So you have all these different narratives going on now, and all these different interpretations going on now.
01:31:38 You know, chaos. You don't have the ability, or the slaves, I guess, don't have the ability, to form a cohesive group when there's that much division among the dissidents, right?
01:31:56 Think of how many different narratives are going on. You know, Q. Well, I don't know that I'd call Q dissident, right, you know, I mean.
01:32:03 But, like, you have people that are following the Q Anon narrative. You have people that are following the natsoc narrative. You have people that are following the paleoconservative narrative. You have all these different narratives, and then different interpretations of those narratives.
01:32:26 And that's just one subset, the right-wing narrative, you could say. See what I'm saying? Multiculturalism is just going to exacerbate this.
01:32:47 Which makes you wonder about the ruling class. People always say this, right? They always say, well, it doesn't matter who gets elected, the same thing keeps happening. Well, maybe, just maybe, they have the same narrative.
01:33:11 I've often wondered about the left-right thing that goes on at the ruling-class level. How much of it is just complete fiction, and how much of it is a genuine difference of opinion? It's probably a little bit of both, right? It's probably the same narrative and just two different interpretations.
01:33:34 But they're definitely on the same page.
01:33:36 They're all going to the same meetings.
01:33:37 They're all part of the same clubs.
01:33:41 They might just have different ways of getting those same objectives.
01:33:46 So anyway, I don't know how long I've been talking.
01:33:48 I think it's been a while though.
01:33:53 Yeah, a little over an hour.
01:33:55 So I'm going to look over and see if I can find some super chats. I don't know if anyone sent some, and I don't know how to look at those, but I will figure that out.
01:34:06 Your activity I think is where it is.
01:34:08 Here we go.
01:34:12 From Cert, $50.00, I appreciate it: Black pills are racist. Indeed, that is what they will have you believe. That is one narrative.
01:34:21 One narrative is that pretty much everything is racist.
01:34:23 You know, I was thinking about this the other day.
01:34:25 So you hear the people saying that racism, you know, can only come from white people, that racism has to come from a place of power. Actually, I can't talk about this on YouTube. See, all the more reason to go over to DLive and BitChute. This might actually be the subject of a future stream.
01:34:50 Alright, we got Doodle Squatch After Dark, $5: Tom Hanks is a Greek citizen now.
01:34:56 I did not know that.
01:34:57 Is that true?
01:34:58 I don't know if that's true.
01:35:00 That would be very curious if that is the case.
01:35:05 By the way, when I made fun of the Tom Hanks stuff, for those of you who haven't been going to BitChute and haven't seen it, I think it's my last video on BitChute, where I make fun of the Tom Hanks pedo dungeon stuff.
01:35:21 I'm not saying that Tom Hanks is a squeaky-clean guy. I'm making fun of the fact that he probably does not have underground tunnels full of... anyway.
01:35:34 Charlemagne, $5: I watched a video on creating a solar-powered air conditioner from the channel Tech Ingredients. I think I saw that, actually.
01:35:46 Excuse me. It did look a little complicated, but it didn't look like it was that expensive.
01:35:57 You gotta remember, for me, anything I can't have shipped out here, well, it has to be something you can get on eBay or something like that, right?
01:36:07 Because if you have to have it delivered freight, like something big, it's really expensive to ship. And then if I have to drive out to go get it, like at a Home Depot, and I don't have a Home Depot or something like that real close by, I'm driving 100 miles or something in my truck, my very old big-block truck, and just the gas on that gets expensive.
01:36:33 James Fields, $19.99, says: message redacted. James Fields.
01:36:38 Not tonight again.
01:36:39 Devon, have you checked out the Pulse Strat project?
01:36:42 I have not.
01:36:44 Not sure what that is.
01:36:45 I am copying and pasting that.
01:36:46 Now to my notes.
01:36:49 Right.
01:36:49 I'll just Google it right now.
01:36:50 Why not?
01:36:51 Let's see.
01:36:54 I hate that, I keep saying I'll Google it. I don't use Google, but I hate that that's part of my vernacular now. Pulse Strat project.
Speaker
01:37:05 I don't.
Devon
01:37:06 I don't know what this is.
01:37:07 It looks more complicated than just like a thing, but I've got that open now.
01:37:12 I'll look into it.
01:37:14 Telecaster Bear, $1.99: Are you Christian? Yeah. I mean, look, I've talked about this before. I don't have the most rock-solid faith all the time. I don't think anyone really does. I mean, maybe a few people do, but.
01:37:28 Yeah, I certainly believe in the theology. I'm not one of these people that gets super into the nitty-gritty of every little thing, and there's all these different flavors of Christianity, and I think some of it's self-defeating, having all these different factions fighting each other. It's different narratives, I guess, right?
01:37:52 But yeah, I think Christianity is a part of my heritage, certainly.
01:37:59 Millennial Hockey 89, $50.00, thumbs up. Appreciate it.
01:38:04 $2.00 from Madcat: Do another Owen Benjamin stream. I'd do that. I mean, I don't invite myself to other people's streams.
01:38:11 Telecaster Bear, $1.99: Thanks for the IG vids during the peak crazy stuff. Yeah, for those of you who know, that got my account banned.
01:38:22 Just to give you an idea of the influence stuff we're talking about: I was posting every video I could find as quickly as I could find it, and I was the one finding the videos and posting them, and it was putting me in a real ******* **** mood.
01:38:40 I mean, because all I was watching all day long was just fighting and fighting and rioting and rioting and fires and burning. And I knew on an intellectual level, like, OK, this isn't everything. But I started to think maybe it was, because it kept going so long and it was so brutal and no one was stopping it. I was just like, what the ****?
01:38:58 So just to give you an idea of how that can mess with your head.
01:39:05 And then not sleeping very much, right? Because I was always posting, always posting, always posting. And with a slow Internet connection, by the way, which is not easy. I was sitting there finding the video, downloading it, which takes some time, then uploading it, which takes some time, and there's only so many things you can do at the same time, because most of what I was doing was on a tablet anyway.
01:39:27 Millennial Hockey 89, $50.00.
01:39:29 Just remember the yes words.
01:39:33 The words yes.
01:39:37 Brent Gordon, $4.99: Stay strong, everyone. I agree. Yeah, look, here's the thing, and I made this part of my Q Anon video: just because we're not getting saved by Superman doesn't mean that you have to be like, oh God, the end is near, we're all going to die. I've never said that.
01:39:59 It's just that you can't keep thinking you're going to be saved by Superman, OK? It doesn't mean that everything's **** because Superman's not going to do everything for you.
01:40:10 You know, and everyone knows who I'm talking about, and yes, I do too, I'm just trying not to be a **** about it. When I hear people talk about, like, oh, you gotta keep everyone's morale up, blah blah, it's like, **** you, dude.
01:40:28 You can't lie to people to keep them happy about what's going on.
01:40:35 You go the pork eater $3. Some kind of weird.
01:40:40 hippo GIF thing. Telecaster Bear, $1.99: Look up Save James. Please save James.
01:40:48 I don't know what that is.
01:40:52 I feel like if I look it up, and I'll look it up right now, I'll find a lot of weird stuff, because that's not very specific.
01:40:58 Save James.
01:41:04 I mean, there's a lot of save James different things.
01:41:08 The one I... well, this is current, actually: Texas judge reversed... oh, that's Save James.
01:41:18 See, here's the thing.
01:41:19 Everyone gets mad at me and calls me a black-piller, like, oh, you're just in a bad mood because you're predicting something bad's going to happen. And I'm like, yeah, but something bad is going to happen, you know?
01:41:32 And with this too, you know, I was like, OK, that's cool, the judge ruled. But see, people get excited over headlines, and they don't drill down into it and realize that's not the end, right?
01:41:45 It's like OK, another example of this was that guy in Albuquerque who shot an Antifa guy, totally legit.
01:41:52 Everyone saw the video; he was well within his rights to do that.
01:41:56 And then he was charged, and then they dropped the charges.
01:41:58 And I told them immediately, look, they dropped these charges, but that doesn't mean they're not going to reassign different charges, right? I think they kind of knee-jerked the wrong charges, and now the prosecutor actually wants to get a conviction. And that's what happened.
01:42:16 And everyone was calling me a Debbie Downer, and I'm just like, no, you've got to understand how this works. But yeah, for those who don't remember, Save James is about that little boy whose mom wants to turn him into a girl and whose dad doesn't. And now not only is the mom getting her way, the dad's going to pay for it. But are we surprised? And again, this is the beginning.
01:42:38 In a couple of years, this happening is going to be about as shocking as finding out that there was a gay marriage in a Unitarian church.
01:42:50 Diversity in Comics Bear, $4.99: Your uploads have been missed. Yeah, like I said, though, there is still content on BitChute and DLive. In fact, there was even one last year I didn't put on BitChute because it was kind of meh, but you know.
01:43:06 And I do have a lot of stuff in the works.
01:43:11 It's 85 degrees currently in the room I'm sitting in, and look, it's just tough, when all your computers are broken and it's 85 degrees, to sit down at your computer and do stuff.
01:43:24 Spike: The system is in total control, we don't stand a chance. No, I mean, some people don't stand a chance. But look, some people don't stand a chance against a lot of stuff. Some people don't stand a chance against alcoholism, you know? I mean, this is just another thing, and it's going to have to get defeated one way or another.
01:43:42 But like I said, you can just deny them access to you. It's difficult, but I do it, and I'm still on the Internet, I'm talking to you guys right now. You can still be on the Internet. It doesn't mean that you have to disconnect, you know, unplug entirely. Just don't watch their crap and don't use their crap social media stuff.
01:44:05 Zach, 5 bucks: Holy ****, caught Devon live. Yes, you did.
01:44:10 Livia... Livia, I don't know... Florea.
01:44:16 It sounds like a plant.
01:44:17 When is your second book coming out? I get asked that a lot, and I know, I wish it was faster. Also, please return to a fixed schedule, at least a video a fortnight. A fortnight is 14 days.
01:44:32 Like I said, that's still true if you're on DLive or BitChute.
01:44:38 Big Thor, 20 NOK. I don't know what a NOK is. I think that's Norwegian kroner or something.
01:44:48 You're a champ, Devin.
01:44:49 Appreciate it.
01:44:53 Let's see here. Alan, £10: Could it be that the America-versus-Russia narrative is fake?
01:45:00 Well, it was, I think, during the Cold War. Honestly, I think a lot of the Cold War stuff was fake, and there's a couple of books on the subject. We were sharing information with each other in the background, but it was a good narrative to give both slave classes, because we work better when we have a boogeyman, right?
01:45:21 So the Russians' boogeyman was, you know, the imperialist, capitalist Americans, and the Americans' was, you know, the Iron Curtain, the, what was it, the Evil Empire or whatever.
01:45:40 But at the same time, it could be one of those things where the ruling class was testing out two different systems simultaneously. Like, let's try out these two different systems at the same time, you guys do this one. But there was definitely sharing going on in the background, and a lot of it was fake.
01:46:00 And even now, right?
01:46:05 And well, if you're talking about the election stuff, they're not talking about Russia. And look, Adam Green, I haven't looked at his stuff yet. He keeps talking about how Russia did interfere. Maybe they did. I haven't seen the evidence; maybe he has. Maybe I have to watch whatever video he's got on that.
01:46:26 Ramon Luevano: Devon, I am a Mexican who discovered you through one of your kin.
01:46:32 I will be leaving the US soon enough.
01:46:35 I wish you and your people the best.
01:46:37 God bless.
01:46:38 All right.
01:46:39 Well, thanks.
01:46:39 I appreciate that.
01:46:43 And look, I'll talk about that some other time, but yes, kind words. Appreciate it.
01:46:53 Penelope Maynard, always with the awesome super chats. I don't even know how to thank you. I think you're the record holder so far, at $200.00. By the way, if you don't follow Penelope on Twitter, I would do so now. Go to Twitter and look for Penelope Maynard.
01:47:20 She's got a lot to offer, and honestly has a very similar background. Like, we both started out really pushing Trump, hoping that he was going to pull us out of a lot of this mess that we're in.
01:47:37 But as time goes by, it just seems like he's another piece of the machinery. And by the way, this is all part of it. With Millie's documentary, she kind of glosses over Trump's participation in a lot of this stuff, with Cambridge Analytica and all this other stuff. I mean, it's not just the deep state and Democrats. Everyone's using these influence programs.
01:48:03 And she does mention Roger Stone, and, you know, who was Roger Stone working for? So it's not rocket science, right? But she makes it sound like, oh, Trump didn't like that, so that's why he fired him. I don't think that's why. I don't think Trump didn't like it. Of course he liked it. They all liked it. It works. Why wouldn't you like it?
01:48:28 You know, people have got to get over this idea that Trump is the character that he plays on TV. I really don't understand why people think that. I don't know that he's as bad as some people think, but he's certainly not as good as most of his supporters think.
01:48:49 But anyway, yeah, Penelope Maynard, thank you very much, really helping me keep the AC effectively going. Morgan Davis, $2.00: Vox thinks it's evil to tell people the truth about Q.
01:49:05 Yeah, I know, I know.
01:49:09 I know.
01:49:14 Does he though?
Speaker
01:49:17 Does he?
Devon
01:49:18 Does he really think that?
01:49:29 I'm not going to start any drama over 2 bucks. I'm not going to start any drama. Oh, I already did. You know, look, I nudged that hornet's nest.
01:49:40 It is what it is.
01:49:42 I'll just say.
01:49:46 Ask yourself.
01:49:49 Does he really think?
01:49:53 You're doing the devil's work.
01:49:56 For telling the people, for telling people the truth.
01:50:03 And I'm going to throw something else in here.
01:50:07 OK.
01:50:11 A lot of people have this take: God probably doesn't exist, but it's better to believe in him, because it makes you have a happier life, right? Like, that's the position. Well, I mean, I have the same criticism of it, by the way. And maybe I have it wrong, but that's my understanding, right? And if it's not Jordan Peterson, there's people that think that way, right?
01:50:36 That, OK, there is no God, I'm an atheist, but I'll have a happier life if I believe in God.
01:50:52 And I don't see the difference between that and saying to people: Q is fake, but it's better if people believe it, because they'll have a happier life.
01:51:14 Sounds like the exact same thing to me.
01:51:18 And anyway, that's all I'm going to say.
01:51:24 So, Gregory Dickinson, $5, really appreciate that. Oh, look, we got another one: Livia Florea, £2.00: Not a plant, bro. Not sure what that means.
01:51:39 Oh, and I've got to see... oh, this isn't showing me everything I'm missing. Let me maximize this screen. There we go. That showed it to me.
01:51:50 Gregory Dickinson: Hey from a bunker in Oklahoma. Right on. Oklahoma is now a lot of Indian land, huh? Hopefully you're not on a reservation now, you know. Most people won't know what that's about, but he does, I'm sure.
01:52:05 Matthew Jamison, $5: Devon, do you ever read Stephen King? I can't help but notice many of his books, and old horror books in general, have rape and child-predator themes. Yes, including especially the book It.
01:52:20 It's like a child-rape ****. And yes, there is a lot of that stuff in his books. I did used to read his books, but it started to get really gross, and it was probably one of the last ones where I was just like, yeah, can't do this.
01:52:41 OK, Spike, 2 bucks: One day, drones will hunt us, no revolt, the right lost. No.
01:52:48 Again, we're not at Skynet levels yet.
01:52:52 That's still really down the road.
01:52:56 I don't advocate that kind of thinking at all.
01:52:59 I think it's self defeating.
01:53:01 I mean, it's quite literally self defeating.
01:53:05 Bjorn... Magnussen. Bjorn Magnussen, $50.00, and no message that I can see.
01:53:16 Really appreciate it, Bjorn.
01:53:19 That that really is helpful.
01:53:21 You just paid my Internet bill for the... well, maybe not after YouTube takes its cut, and provided they don't ban my account anytime soon, but.
01:53:35 That's why you need to go over to DLive. I'm going to remind you guys one last time: go to DLive, and go to BitChute, and subscribe to me over there.
01:53:47 I'll be doing more of these kind of informal talks on DLive, and less on YouTube, for some obvious reasons.
01:53:57 There's also a couple other things coming out that I'm going to check out. I'm not going to say what they are yet, but, like, different platforms that look like they might have streaming.
01:54:10 I wish BitChute would have streaming. I talk to those guys sometimes, and I told them how I would feel about that. I think they're aware of it. It's just a lot of infrastructure, I think, to be able to handle smooth streaming. Even YouTube doesn't have it right all the time, and they've got all the backing of Google.
01:54:28 Anyway, I really appreciate it guys.
01:54:31 Stay strong.
01:54:33 Do not fall into the trap of thinking everything's hopeless. It's not. It's just hopeless to use the system. It's not hopeless.
01:54:41 We've survived many things. This isn't the first time our people have been faced with odds, and look, it might be one of those things where we're in for, like, a couple of centuries of bad. It could be. Look, it is what it is.
01:54:57 You know, we could cry that we weren't the ones born during the time when the boomers were born, where they got to ride the wave of wealth coming from the victory of World War 2. And yeah, you can complain about it and whatever, but it is what it is.
01:55:15 Bottom line, we're still better off than a lot of people that have lived throughout history, and it's always good to have perspective. Yeah, it's bad, but look, one of the reasons why it's bad is that it's been so good. So don't lose sight of that.
01:55:32 Anyway, in the meantime, you guys have a great week, and we'll talk again soon. For Black Pilled, I am, of course, Devon Stack.