3:24:49

INSOMNIA STREAM: BRAIN GAMES EDITION.mp3

11/18/2023
Speaker 1
00:00:00 Right.
Speaker 3
00:02:10 Half a year.
Speaker 2
00:02:17 Just wanna kiss you.
00:02:32 Ready for a brand new life?
Speaker 5
00:03:49 If you want me to.
00:06:04 333.
Speaker 6
00:06:35 1st and then just feel like fine.
Speaker 7
00:06:44 And then just wait.
Speaker 8
00:08:13 Welcome to the insomnia stream.
00:08:18 Brain games edition.
00:08:21 We're live on Odyssey and rumble. There was a little bit there for today earlier. We thought that odyssey.
00:08:28 Odyssey was gone. People were getting, uh, missing gateway or whatever, you know, the what is the area error?
00:08:34 504.
00:08:37 People were nervous. They thought it was gone, but then I came back and I guess it's back down. I'm live on it, right?
00:08:45 So we got Odyssey still, you know, at least for now, knock on wood.
00:08:51 Got rumble going multi streaming multicasting. I'm your host of.
00:08:55 Course Devin stack.
00:08:58 Got a lot to talk about today.
00:09:01 Lot of lot of spooky stuff to talk about a little bit today. Science fiction turning into fact.
00:09:10 That's right, a lot of these things.
00:09:14 That we thought were just.
00:09:16 The the ramblings of of crazy schizophrenic people.
00:09:22 And psycho genocidal.
00:09:25 Transhumanists, lot of stuff's real.
00:09:27 Now so.
Speaker 5
00:09:29 Yay, Hooray, Hooray for technology.
Speaker 8
00:09:35 You know, sometimes technology is our friend and and sometimes it it, it just might might kill us all in the end. But we'll be talking about that.
00:09:46 Of course, the the news news hasn't changed much.
00:09:51 In terms of the the Israel Israel, Gaza thing, it's it's still kind of boring I.
00:09:56 Don't see it? There's no mushroom cloud.
00:09:59 There's no mushroom clouds, there's no Hezbollah invasion.
00:10:03 There's no, you know, Iran supersonic missiles.
00:10:09 Or hypersonic, right? Missiles.
00:10:12 None of that.
00:10:14 In fact, it's so boring, right? It's so boring.
00:10:17 That the only thing in Boomer news is how how shocked and appalled they are. Oh, they're they're they can't believe they can't believe what this country has turned. Oh my God. Can you believe the kids these days?
00:10:34 The kids these days and we can't. We can't. We can't say what it is, right? We can't actually specify what we mean by that. You know, as boomers, because you know, race is only skin deep. And as long as you come here legally, all that stuff, right, all that, all that baggage is still up there. So they can't. They can't just say it. They want to cause it's protecting the Jews.
00:10:54 You know they. So yeah, it's it's one of these things like God not being a racist, letting you know, letting the Jews down.
Speaker 5
00:11:00 Like what? Which? Where what?
00:11:02 I don't know what button to push.
Speaker 8
00:11:04 But they're all a tizzy because, Oh my God.
Speaker 9
00:11:07 The serving document 2 decades ago by the mastermind of the 9/11 attacks is resurfacing on one of the biggest social media platforms in the country. Osama bin Laden's infamous letter to America, which condemns the United States for its support of Israel, is now going viral on TikTok and it appears to be gaining sympathy.
Speaker 10
00:11:27 Of all the things you could never imagine, the words of Al Qaida's former leader ringing true with some TikTok influencers who are encouraged.
00:11:36 Others to read it for themselves.
Speaker 8
00:11:38 Oh, look, if it's not, it's it's not the Cold War. It's it's. It's the hunt for Osama bin Laden all over again. The boomers are just. They're just doing what they've always done. They're living.
Speaker 11
00:11:51 In the past.
Speaker 8
00:11:52 They're just recycling all their their bad ideas over and over and over again.
00:11:58 Yeah. The thing is, they don't tell you what the what's actually in the letter. A little surprise. Well, not surprising up until literally the other day the letter was hosted on the Guardian for for years, for like, I think over a decade it was just on their website.
00:12:14 The American media.
00:12:17 Never reported on this. And look, we can talk about, you know, Osama bin Laden's actual role in 9/11 and and whatever some other time.
00:12:26 That, that's that's actually irrelevant to what? What? What? The contents of this letter is right.
00:12:32 The the contents of this letter.
00:12:36 We're allowed to be on the Guardian website for years and years and years because why? Because Osama bin Laden? He's basically Satan. So anything he says, right, it's the opposite is probably true. So let's take a look at what the the letter actually said.
00:12:53 This is how it opens. It's a.
00:12:55 Strong open, I mean, you know, you cut.
00:12:56 Out all the you know, Allah is great.
00:12:59 And all that.
00:13:00 Fun stuff. Just get right to the beginning. Boom.
00:13:03 Your former president warned you previously about the devastating Jewish control of capital and about a day that would come when it would enslave you. It has happened. Your current president warns you now about the enormity of capital control, and it has a cycle whereby it devours humanity when it is devoid of the precepts of the.
00:13:24 Of God's law.
00:13:25 The tyranny of the control of capital by large companies has harmed your economy, as it did ours. That was my motivation for this talk 10s of millions of you are below the poverty line. Millions have lost their homes and millions have lost their jobs. To mark the highest average unemployment in 60 years.
00:13:45 Your financial system.
00:13:47 Is and in its totality was about to collapse within 48 hours and not the administration reverted to using taxpayers money to rescue the vultures by using the assets of the victims. Talking about bailouts. As for us, our Iraq was invaded in response to pressures from capitalists with the greed.
00:14:07 For black gold, time of oil and the continued support and continue to support the oppressive Israelis in their occupation of Palestine in response to pressures on your administration by the Jewish lobby, backed by enormous financial capabilities.
Speaker 6
00:14:28 So it goes on from there.
Speaker 8
00:14:30 Uh, you know, obviously he's a bin Laden's, not a great guy. He was a CIA asset at one point. It's complicated, but that's why the letter's gone. The letters and all that stuff doesn't isn't why the letter's gone. The letter's gone. Because he basically says America is controlled by Jews and for the.
00:14:51 First time.
00:14:52 In my lifetime.
00:14:54 It's starting to be normal to believe that.
00:14:58 I never would have thought this. I never would have thought this.
00:15:02 But it's starting to be like the sorts of things and it's it's kind of this weird dynamic, right? Because it's people on TikTok that I ******* hate. I hate him so much. I hate him with every fiber of my being. But you know, like, at least they they know about the Jews now, even if it's for all the wrong reasons. And it is, it's for all the wrong reasons.
00:15:23 It's like, well, at least we have that we have that in common a little bit, right.
00:15:30 But uh yeah, so that.
00:15:32 Everyone's freaking out because Osama bin Laden's letter that tells you basically the Jewish money and the Jewish lobbies control the United States.
00:15:41 That letter went viral on TikTok.
00:15:44 So now of course, we have to ban TikTok and get rid of TikTok. Funny thing is, the US media, even though the Guardian had hosted, you know, didn't report on this at all.
Speaker 12
00:15:55 Didn't report on.
Speaker 8
00:15:56 It at all this. This letter was never mainstream news to any to any degree. And the reason why that was is the United States.
00:16:05 Approached media outlets at the time this letter was available when you would think that Americans would want to read it because they're saying like, Oh well, you know, they're having to come up with all these crazy reasons why? Why the Middle East wouldn't like us, like, oh, they hate you.
Speaker 6
00:16:19 For your freedom.
00:16:21 Oh, that makes sense. Freedom for us.
Speaker
00:16:23 You know like.
Speaker 8
00:16:24 What the ****? Like they?
00:16:25 Now only the only the mind of a grilling boomer would that make any kind of sense? Right? Ohh they they hate us because we have.
Speaker 5
00:16:32 Freedom. What the? OK, so the what?
Speaker 8
00:16:38 Yeah, alright. But of course that's the only reason they could come up with because stuff like this was not allowed to.
00:16:44 Be broadcast that all the United States government went to every media outlet at the time and said Ohh no you can't. You can't show this. You can't talk about it because you know Osama bin Laden. He's got those sleeper cells.
00:16:58 They're all embedded in all over.
00:17:00 All across the country.
00:17:01 And he's gonna. There's gonna be like, a secret code. You know, these sleeper cells, they're just waiting for you to read that letter with the secret code in it. And as soon as they hear that secret code, they're going to blow up a dirty bomb. And it's just it's going to be.
00:17:15 All your fault, it's.
00:17:16 Going to be all your fault, so no.
00:17:18 One even no one knew this thing existed.
00:17:20 I was just sitting there on the Guardians website forever and tell some ******* Tik Toker apparently.
00:17:26 Blew it up so.
00:17:27 Anyway, that that of course was in the news.
00:17:29 This week I find it a little bit confusingly hilarious. You know, it's I.
00:17:34 I don't know.
00:17:35 Like I said, it's like these. These are the kinds of the kinds of people that that are reading this and having some kind of an epiphany. It's for all the wrong reasons. And and it's weirdly because they're lumping us together with Jews. They're lumping all white people together.
00:17:49 The Jews that's going to be that's the big hurdle, right? We need to get them past that. If we can get them past that then hey, you know, we can do a, you know, some of that judo stuff where you're using their their energy against them, that's sort of.
00:18:02 Thing. Uh, But it's it's fascinating to watch because the Jews obviously know the difference between themselves and white people, and actually, so do the boomers, the boomers on the the conservative boomers like Sebastian Gorka who tweeted out this video, not in a million ******* years. Would he have tweeted out a video like this if it was instead of Jews? It was white.
00:18:23 People right? This is A and and by the way, this video didn't get made because it was Jews.
00:18:31 And not white people that like, or rather the other way around. This video would not be made and wasn't made for white people because Jews are the ones that have the ability to produce this sort of a thing and not have everyone who was involved in the.
00:18:42 Production have their careers ruined.
00:18:45 So not only does it get made and funded.
00:18:49 That gets played in the mainstream and and look, no one had a problem, right? No one had a problem when when white girls were getting raped in the UK or in Sweden or everywhere else, no one had a ******* problem with that. It wasn't until a hand.
00:19:03 Full of of of Israelis get raped in Israel, they get a totally foreign country and a foreign conflict. Then all of a sudden we have to watch this ******* ********.
Speaker 1
00:19:13 I'd like to report a crime.
00:19:15 Yes, I was raped. So sorry.
Speaker 7
00:19:20 We are here to help.
Speaker 1
00:19:23 Or was that a music festival?
00:19:25 We heard gunfire and everyone was running. We started running them. Then he grabbed me.
00:19:32 Yield something in Arabic and then he ripped like.
00:19:35 Sorry to interrupt you.
Speaker 13
00:19:36 You said he was yelling in Arabic, yes.
Speaker 1
00:19:40 So he he ripped.
00:19:41 My pants and.
00:19:42 Then he didn't.
Speaker 14
00:19:42 Just sorry I I.
Speaker
00:19:43 Just need a bit of background.
Speaker 7
00:19:44 Here are you Israeli? Yes. And your ****** was Palestinian.
Speaker 1
00:19:49 He was a Hamas terrorist.
Speaker 7
00:19:52 OK, just this is a bit awkward but.
Speaker
00:19:57 We can't help you.
Speaker 2
00:19:58 But it wasn't right, yes.
Speaker
00:20:00 Tan equals.
00:20:03 Yes, Hello mobile please.
Speaker 1
00:20:05 Don't take it personally.
Speaker 7
00:20:07 Management decided that all violence against Israelis is legitimate resistance.
Speaker 8
00:20:15 Yeah. Where was the video for the white people?
00:20:18 How come Sebastian Gorka wasn't talking about this when this was happening to white Europeans?
00:20:24 How come? How?
00:20:25 Come boomers weren't retweeting stuff like this.
00:20:30 How come stuff like this didn't get made in the 1st place, right?
00:20:35 Oh, poor Israelis, poor Israelis.
00:20:39 They finally have to have a taste of the diversity they've been shoving into the rest of the world for the last ******* God knows how many long years.
00:20:50 And it's like the end of the ******* world. It's like the end of the world. Just like ask crazy Mike Rappaport, right? Losing his ******* mind, you know. He's. But he better watch out.
00:21:02 You better watch out cause you know he he he's he's.
00:21:04 Gonna let you know.
00:21:07 Jews aren't going to take this one lying down. They're noticing. They're noticing.
00:21:14 You're not the only one noticing. Jews are also noticing they're noticing.
00:21:20 Who's not supporting them?
00:21:23 And if you don't support them, you know what?
00:21:26 They're gonna put you on a list. What was that? What's that we always say about projection?
Speaker 11
00:21:31 Lot of conversations amongst my Jewish friends about the silence, the disappointment.
00:21:40 The disappearing axe onto Doug Henning.
00:21:43 David Blaine shift just disappeared.
00:21:46 In the air.
00:21:48 A lot of.
00:21:48 People have disappeared.
00:21:51 I'm telling you right now.
00:21:55 We are making a.
00:21:56 List we are.
00:21:57 Checking it twice.
00:21:59 And we already know who's been naughty.
00:22:02 Or nice. See that pun?
00:22:04 You see that part I'm.
00:22:05 Talking about the Jewish people, but I'm also.
00:22:08 Trucking. Christmas.
00:22:10 We will not forget.
00:22:13 We're not suckers, so when you come around asking for this, that in the third come around asking for money, investments and all that stuff, I promise you.
Speaker 8
00:22:22 You're that Christians, they don't. They definitely don't run everything, but watch out, Christians. You don't help us. You're not going to get money or or or loans or any kind of business deals or somehow somehow that they're you're going to get on some.
00:22:36 List when they you know they they.
00:22:38 Don't they don't run anything.
00:22:40 But you'll get on this list, Christians.
Speaker 11
00:22:44 I promise you.
00:22:47 It's being discussed or paying attention.
00:22:52 To who's being anti Jewish anti-Semitic, anti Israel or not saying anything at all? I promise you I see.
Speaker 6
00:22:59 Silence is violence.
Speaker 11
00:23:00 I'm I'm I'm the enigma.
00:23:03 A lot of Jewish people seem.
00:23:04 Nice. We seem suckers.
Speaker 1
00:23:05 So it's very.
Speaker 11
00:23:07 Seem like you could kind of convince us and.
00:23:11 Trust me, don't come around six months, eight months.
00:23:16 Two years.
00:23:18 We're remembering we're paying attention.
Speaker 8
00:23:23 You hear that? You hear that, guys?
00:23:27 They're paying attention.
00:23:30 They weren't paying attention to.
00:23:31 Europe, when when this well?
00:23:33 This is still going on. This is right now.
00:23:37 This is right now.
00:23:39 They're not paying attention to this.
00:23:42 Right. They're not threatening all the the, the fellow Jews supporting this.
00:23:48 They're not saying, hey, Soros.
00:23:51 We're paying attention.
00:23:53 You you keep flooding Europe and America with with all these non whites and don't expect Jews to do business. No, they didn't give a **** about that, did they? They didn't give a **** about that.
00:24:05 And they still don't give a **** about that, insomuch that it doesn't affect them.
00:24:10 The only reason why they they even remotely give a **** about it and they want to get you ****** *** and they want you to join.
00:24:15 The military now.
00:24:18 And so you can fight their battles for them. They really want you to fight the ******* battle for you and or battle for them.
00:24:24 Like that, you've.
00:24:25 You've always done for decades.
00:24:28 That's why the military, the army, is now.
00:24:32 They're they're like vaccines. What?
00:24:34 What? But I'm sorry. No, no, no. That no vaccines. No, it's all. It's all a misunderstanding.
00:24:44 It's all a misunderstanding.
00:24:48 Amid recruiting woes, armies sent letters to soldiers separated for vaccine refusal. Oh.
00:24:55 Not only that, here's when they're printing out their little recruitment posters.
00:25:00 Code vaccine not required now. Well, that's that's interesting.
00:25:04 I wonder why, I wonder why?
00:25:07 Here's the letter.
00:25:10 Dear former service member, we write to notify you of New Army guidance regarding the correct the correction of military records for former members of the Army, following rescission of the COVID-19 vaccine, vaccination required. In other words, please come back.
00:25:27 We told you to get ****** because you wouldn't put.
00:25:30 This mystery juice in your arm.
00:25:34 We told you to get ******.
00:25:38 But I don't want you to come back. So you.
00:25:39 Can really get ******.
00:25:44 I don't mean just get ****** by uh, you know.
00:25:47 By by I I trained it like the other day, a training mission in the.
00:25:53 In the Mediterranean, I mean, like you'll actually get ******.
00:25:58 You know by these ******** here the you know the the guys and the dog.
00:26:02 Masks. You know the.
00:26:05 Maybe some of these guys wearing high heels will **** you. I mean, look, you might actually get ******. Just join the army. Come back. We don't. You don't need that vaccination anymore.
00:26:17 Come back so you can you can actually get ******.
00:26:21 Be all you can be.
00:26:27 And if you, I'll tell you what if.
00:26:28 You go back if.
00:26:29 You're one of those people and you go back.
00:26:32 You, you, you basically have no honor. You.
00:26:35 Have no honor.
00:26:36 But so I guess.
00:26:37 You belong there.
00:26:40 If you go back, you have no honor. You have no self respect. So you're exactly.
00:26:45 Where you belong.
00:26:48 You're exactly where you belong, you ******* ******.
Speaker 15
00:26:51 Chuck it.
Speaker 8
00:26:55 You're right where you belong.
00:27:03 So of course also in the news.
00:27:07 The the January 6 footage finally getting released, it's hours and hours and hours and hours.
00:27:13 Of mostly boring ****, right? I thought it was supposed to be this insurrection. It's mostly boring ****.
00:27:20 Mostly looks like like watching security footage of a a organized tour.
00:27:27 You know, people, of course, are sharing the the shots of the undercover people we all knew there was undercover people there. I don't think there's anything spectacular that's come out yet. You know, you got this guy flashing what appears to be a badge. I don't know if it is, but it.
00:27:40 Looks like it you know, but most.
00:27:43 Of it's like this.
00:27:44 You know.
00:27:46 Look how scared those cops look, right?
00:27:50 Right. Remember this guy? Oh, it's so terrible. Oh my God. It's an insurrection.
00:27:58 Oh, good Lord.
00:28:01 When will the horror stop?
00:28:06 It's a good thing we we we're putting those guys away in federal prison, right?
Speaker 6
00:28:14 Someone could have been hurt.
Speaker 8
00:28:21 So that's this is most of the footage.
00:28:25 I mean, cops looking completely unconcerned.
00:28:28 And they're just chatting with each other.
00:28:31 You have some weird footage like this. Like, here's a shot of a probably another cover agent. I mean, it's hard to know, right? Like without the audio. What the circumstances are, exactly. But I can't. I'm trying to figure out a scenario where someone who's been handcuffed is brought into the interior of the building.
00:28:52 His handcuffs are removed for no real reason.
00:28:57 You know? And then he fist bump.
00:29:01 You know the other guys there.
00:29:04 And then kind of just casually walks away.
00:29:07 I'm. I'm I'm.
00:29:08 Trying to figure out a scenario.
00:29:11 Where or that's normal.
00:29:14 You know, but maybe maybe it is right.
00:29:18 And you know, there's lots of fist bumps going on that day. I guess there was another situation like this.
00:29:28 So there you.
00:29:29 Go. But who knows? Boom.
00:29:33 Honestly, it literally could have just been magnetars going. Hey.
00:29:36 What's up buddy?
00:29:40 Support our cops. I support our law enforcement.
00:29:46 I've never been one.
00:29:47 To to to say or to spread what I think is really a shameful, A shameful excuse to be a *****. And that is.
00:29:58 Spread this this.
00:30:01 Excuse that all all the violence, it was white Antifa, remember what people were saying that at first it was Antifa infiltrated the crowd.
00:30:12 And then it was the feds infiltrate the like, look, there's going to be feds. Obviously there was going to be feds, right?
00:30:19 But to to to act as if you for once in our in conservatives lives, and for once in the life of a coutard.
Speaker 16
00:30:29 You you like.
00:30:30 You actually grew a pair.
Speaker 8
00:30:35 You actually grew a pair of balls. You actually got mad.
Speaker 10
00:30:39 For once in.
Speaker 8
00:30:39 Your ******* life. You actually got mad.
00:30:44 To be ashamed of that is almost worse than the fact that that than the people that did nothing.
00:30:51 To be ashamed of, of, of being. Look and you don't have to believe what they believe. You don't have to believe the election was stolen. It was stolen. You don't have to believe that.
00:31:02 But if you showed.
00:31:03 Up there, you showed up there that day and you thought the the election was stolen.
00:31:10 That the legitimacy of the federal government was no more.
00:31:15 That it was being usurped.
00:31:18 The will of people of the people was was being taken away.
00:31:22 By a cabal of of baby ******* satanists.
00:31:28 And you didn't get mad and you didn't.
Speaker 10
00:31:30 Try to stop it.
Speaker 8
00:31:32 What does that make you?
00:31:38 Oh, it's feds. It was, it was Antifa.
00:31:42 Well, maybe that means Antifa actually believes what they're what believes in.
00:31:45 What they're doing?
00:31:53 Well, Antifa doesn't have a job. Well, OK.
00:31:58 So I guess your job is more important than your country. That's fine. Hey, look, if that's what you, if that's your priority, that's your priority. I what am I going to say?
00:32:10 But we'll see. We'll see.
00:32:11 What comes out? There's all kinds of wild at this point, right? Like Laura loomer's. Like it's Ukrainian intelligence, it's like.
00:32:18 I don't even know where that where that's going.
00:32:24 But it's all over. It's it's anything but. It's anything but conservatives.
00:32:27 Right.
00:32:29 Anything but conservatives.
00:32:35 Because conservatives are not capable.
00:32:38 Of violence and in it, no matter what the context.
00:32:44 No matter what the context, they're incapable of violence. That's just the.
00:32:47 Way that it is.
00:32:58 Oh, Speaking of baby *******.
Speaker 17
00:33:04 And from East, Otis was arraigned in southern Berkshire Court on charges of child ***********. Slade Somer was charged with two counts of possession and two counts of dimensions of child *********** and investigations started on summer in June following a cyber tip, police conducted a search warrant in October, resulting in the seizure of his phones in multiple electronic devices.
00:33:25 Police say thousands of explicit images and videos were.
00:33:28 Down of children as young as three years old, the task force also found messages between summer and a minor on messaging apps. In 2021, bail was set at $100,000 with strict conditions including no contact with children under the age of 18.
Speaker 8
00:33:45 Oh, look at another Peto Jew.
00:33:47 No short supply of those guys, right? Former editor and chief of the political news site.
00:33:54 The recount arrested for child ***********. This is a one of the guys that debunked pizza gate.
00:34:05 This was one of the journalists that back in 2016. In 2017, he debunked Pizza gate.
00:34:15 John Podesta, a fan of his work. Here's.
00:34:18 An old tweet.
00:34:20 Thanks Slade for sleuthing out the origin of the sinkhole.
00:34:24 Of course it wasn't. He wasn't talking. There's people saying that this is about a pizza gate article. It's not. It was some anti Trump tweet. I I found it on the way back.
00:34:33 Machine had nothing to do with pizza.
00:34:36 Gate but.
00:34:38 John Podesta, familiar with his work, maybe in more ways than one.
00:34:43 Right.
00:34:44 Slade summer.
00:34:48 Now former editor in chief of the left-leaning publication the recount.
00:34:52 They really are all groomers. They're all baby ******* groomers. They really are.
00:34:57 They really are.
00:34:58 Was arrested on Friday on charges of possession of more than 13,000 images of child ***********.
00:35:05 Including images of toddlers.
00:35:12 I you know, I, Dennis Prager won't mind at all, as long as those toddlers were AI generated.
00:35:19 Dennis Prager will have 0 issue with this.
00:35:26 He will stand by his Jewish brother.
00:35:29 And say, hey, you know, no one was hurt.
00:35:33 He was just trying. He just got caught trying to arrange meetups with.
00:35:37 Underage little boys.
00:35:42 He's as innocent as John Podesta.
00:35:47 It's kind of funny. BuzzFeed took down an article. They were praising him for, for being openly gay and going to classrooms. Yeah, going to classrooms to talk about how he was an openly gay reporter.
00:36:01 BuzzFeed took the story down, of course.
00:36:06 But once again, the Wayback machine.
00:36:10 It uh.
00:36:11 It you know, the Internet is forever sometimes. Sometimes that's changing.
00:36:17 But sometimes.
00:36:21 On Friday, his story about visiting his mother's 4th grade classroom, starting gaining attention on Twitter, he said he was there to discuss riding with them.
00:36:31 Here's Slade. So here's a fun little story I spent the morning in my Momma's 4th grade classroom. Oh, good.
00:36:38 Good, good. Petojo hanging out in the the 4th grade classroom of his mom. I'm sure. I'm sure that you know, he's never had a access to to children before.
Speaker 2
00:36:50 Very, very.
Speaker 8
00:36:52 You also worked with a a children's camp.
00:36:56 Let's see your classroom being a perfect confluence of Teacher Appreciation Week and Mother's Day. The plan to discuss writing outside of that, I don't have much in common with 10 year olds, so I was nervous. Yeah, well.
00:37:10 I get there and the kids are psyched. Here's a new adult, someone who wants to talk to them, someone who knows their teacher too. So we get over to the group corner and 20 of them sit cross legged in a semicircle, looking at me in the chair.
00:37:26 I didn't really expect to be answering personal questions at all. I thought we were just going to.
00:37:31 Talk about writing.
00:37:32 But they were so naturally curious about their teacher's son. So I played along.
00:37:38 They asked about my family and then my worst moments of life. So anyway, he started. He he basically tells him that he's gay and how awesome that is. So BuzzFeed killed this story, wiped it off their website.
00:37:53 Because they don't want you to know that that pizza gate was was absolutely real.
00:38:00 You know the.
00:38:01 The the very people telling you that it was fake were literally ******* kids.
00:38:05 As they told you, it was fake.
00:38:10 But it's not just limited to the journalists.
00:38:13 Nope, it's not just.
00:38:14 Limited the journalists see. This is where.
00:38:18 This is where things get a little bit.
00:38:21 A little bit weird.
00:38:28 This is where we get some spooky stuff.
00:38:37 We're talking about the world of AI.
00:38:45 I thought it was only fair we, we.
00:38:46 Started with a a video generated by AI.
Speaker 5
00:38:54 Ohh that's.
00:38:56 That's pleasant.
Speaker 6
00:39:01 That's uh.
Speaker 8
00:39:03 I like that.
Speaker 3
00:39:05 It's not.
Speaker 8
00:39:09 It's not a bad sign.
Speaker 6
00:39:16 What? Why AI is so good at making payments.
Speaker 8
00:39:25 Well, it's not just AI making demons, it turns out it's demons making AI now. We've talked about this another gay Jew.
00:39:34 Another gay Jew. How? It's almost like there's an unlimited supply of gay Jews in the news.
00:39:41 I wonder why. Yeah, it makes you why. I wonder why that that letter, that letter that Osama bin Laden? Well, maybe wrote. I wonder why he should. He should have mentioned the gay.
00:39:53 Jews, if he really wanted to warn us.
00:39:56 Sam Altman, we've talked about this guy before.
00:40:00 He was the guy. It was several strings back. He he wired the world coin. They had that magical orb that you would just stare into the orb, goy stare into the orb and you get, like, fake money somehow.
00:40:13 And so you'll start. Just stern the ******* orb. And it was a way to make the blockchain have, like, a a biometric identity.
00:40:26 You know, of course they they were trying to say, oh, no, we're solving the problem with.
00:40:29 Bots on the Internet we're we're, you know, just like Nikki Haley. You know, she wants to solve the problem of of anonymous people on the Internet. And Sam Altman was like, hold, hold, hold my beer. I'm going to make people stare into a ******* orb.
00:40:44 And get their retinal scanned. And then we'll really.
00:40:48 Know if they're bots.
00:40:50 So we talked about this in the previous stream and how this is the person behind ChatGPT being woke.
00:40:58 Well, you know, bad news, bad news for pedo Jews. I guess this week this is not a good week for Jews in general, and specifically Peto Jews.
Speaker
00:41:06 Good morning.
Speaker 18
00:41:07 He's the man who warned artificial intelligence was going to take your.
Speaker 19
00:41:11 Job we believe that AI is going to be a technological and societal.
Speaker 18
00:41:15 Revolution. But in an announcement yesterday that sent shock waves through Silicon Valley, it was revealed that Sam Altman himself had lost his.
Speaker 20
00:41:25 The new Sam Altman is.
Speaker 14
00:41:27 Out this is.
Speaker 12
00:41:27 A stunner.
Speaker 18
00:41:29 Seen here with.
00:41:29 World leaders at the UK's AI safety summit earlier this month, the Co founder of open AI that runs the chatbot ChatGPT, was the most prominent executive in the fast growing sector.
00:41:43 And the industry voice in efforts to persuade governments that AI could be safely managed.
Speaker 4
00:41:49 And the the jobs of today will get better.
Speaker 18
00:41:51 But no more. In an ambiguous statement from open AI, it was revealed he was forced out following a board review, which concluded that he was.
Speaker 13
00:42:01 Not consistently candid in his communications with the board hindering its ability to exercise its responsibilities.
Speaker 18
00:42:08 It's not clear what he's alleged to have not been honest about. He's kept tight lipped, saying in a statement that he'd loved his time at open AI.
Speaker 8
00:42:18 Well, you know, one of the.
00:42:20 Things that could have been.
00:42:21 And I'm just I I don't know. I'm just throwing this out there.
00:42:26 Who knows? Who knows, right? What do I know? What do I know? What goes on in the lives of gay pedo Jews?
00:42:34 I mean, there's only so many.
00:42:36 Millions of examples of what goes on but.
00:42:39 Ohh look at this.
00:42:42 Look at this. This is uh, Sam Altmann's sister.
00:42:48 These tweets apparently were uncovered.
00:42:52 Hello social medias. This is a heavy post so please keep scrolling.
00:42:59 If needed.
00:43:01 I experienced sexual, physical, emotional, verbal, financial and technological abuse from my biological siblings, mostly Sam Altman, and from Jack Altman. Oh, look, a whole family of gay pedos. Isn't that great? I guess this was before he was gay, 'cause, well, I don't know. The jury's still out on whether she's
00:43:22 a woman or not; we'll show you why in a second. I feel strongly that others
00:43:27 have also been abused by these perpetrators. I am seeking people to join me in pursuing legal justice,
00:43:36 safety for others in the future, and group healing. Please message me with any information; you can remain as anonymous as you feel safe.
00:43:48 You want more specifics? She also tweeted.
00:43:51 I'm not four years old with a 13 year old brother climbing into my bed, non consensually anymore.
00:44:01 You're welcome for helping you figure out your sexuality.
00:44:08 I've finally accepted that you've always been and always will be more scared of me than I've been of you.
00:44:17 So apparently, Sam Altman.
00:44:22 Prior to being, I thought.
Speaker 2
00:44:23 Ever. I thought you were born.
Speaker 5
00:44:24 Gay. Isn't that what?
Speaker 8
00:44:26 They always say you're born gay.
00:44:31 But somehow you know, even though he was born gay at age 13, he was ****** his four year old sister.
00:44:40 So I mean, I don't.
00:44:41 I don't know how that works. Maybe it's something weird, maybe it's something from the Talmud.
00:44:46 Maybe there's something in the Talmud that makes that make sense.
00:44:50 Because, I mean, I don't know of anything.
00:44:51 That would make that make any sense.
00:44:55 But who knows? You know they.
00:44:56 They do have a different culture than us.
00:45:00 You know, so maybe it's just something cultural. Speaking of something cultural, I tried looking her up,
00:45:07 And like one of the first videos I found was her talking **** about Christians and Christmas.
00:45:14 This is a video from her YouTube channel.
Speaker 5
00:45:27 I don't want.
Speaker 2
00:45:30 For Christmas.
00:45:33 There is just one thing I.
00:45:36 Need I'm not saying.
00:45:39 Santa is not real, but these songs were written by folks like me. I just want you all to know, Christmas songs are all for show.
00:45:54 Jews wrote them, it's true.
Speaker 8
00:45:59 Kind of funny how she talks about... I mean, she's admitting, like, exactly what I said in my video. Uh, Rudolph the... what was it? Rudolph the
00:46:07 Red-Nosed Reindeer? No, it's
00:46:08 not that. Rudolph the Jewish Reindeer.
00:46:12 Jews wrote most popular... if you haven't seen that, you know, 'tis the season, am I right? 'Tis the season.
00:46:21 If you haven't seen that video, look up Rudolph the Jewish Reindeer on
00:46:25 BitChute, and it might, might,
00:46:27 might be on YouTube still.
00:46:29 Almost every popular American Christmas song was written by Jews and Jews are.
00:46:36 Proud of this.
00:46:38 They're they're, it seems.
00:46:39 Like Christians are.
00:46:40 The only ones that don't know this.
00:46:43 But anyway, that's what she's referencing there.
Speaker 2
00:46:47 But these songs were like folks that I just want you all to know.
00:46:55 Christmas songs are all for show.
00:47:00 Jews wrote
00:47:01 them, it's true. I want for Christmas.
00:47:40 Right.
00:47:53 It was an inside joke with a.
00:47:56 Gentile, who believe we dated and I informed him.
00:48:03 I want for Christmas.
Speaker 8
00:48:17 So there you go. Hey, you.
00:48:19 Know what runs that must.
00:48:21 Run in the family, right?
00:48:24 Ah, so there. These are the kinds of people, I guess, Osama bin Laden, you know, in that letter that he supposedly wrote, that's,
00:48:33 those the people,
00:48:35 those are the people that, uh, you know,
00:48:39 we were getting warned about.
Speaker 5
00:48:44 Get in. Yeah. Yeah, that that pit just can't. It can't get deep.
00:49:04 It can't get deep enough.
Speaker 8
00:49:12 So anyway, Sam Altman's out. ChatGPT will be run by new management. I guess Microsoft is heavily invested in them. We'll see where it goes. I don't imagine it's going to get less woke.
00:49:28 I don't think it
00:49:30 could get more woke, though.
00:49:32 So Sam Altman's out.
00:49:35 ChatGPT is is.
00:49:40 Under new management, I guess you could say.
00:49:43 So it's.
Probably some other pedo Jew.
00:49:49 Right.
00:49:51 But that's really the least of our worries. You know, we've talked about, like, well, what are the larger consequences, really,
00:49:58 of having AI, right? Like, are we going to have a situation where it's like Skynet? You know, is it going to become self-aware, and all this sort of thing? And I'll tell you, one of the things that I didn't think would happen, because I fancied myself rather technical,
00:50:17 And I thought that I had a a.
00:50:19 Good grip on.
00:50:20 On what was possible and what wasn't.
00:50:22 And I'll tell you, as AI
00:50:25 began to flourish,
00:50:26 I was constantly
00:50:30 wrong in my predictions about, like, what was possible. You know, like, just the video we played with the weird demons and the swamp thing. I never would have thought that computers could just generate something like that without a lot of interaction from humans, like a lot of direction from humans.
00:50:51 You know, and I don't mean just, like, giving it a prompt. I mean, like, you actually drawing that out. And I guess you could say, well, I mean, people drew out the original data that it consumed so that it could make this sort of
00:51:02 thing. And that's true to some extent, but it's still kind of amazing. It's still kind of amazing to me that that's the sort of thing that's going on. And as I've been able to wrap my head around AI and the sorts of things that it's going to be capable of, that it is capable of already, it does start to look, unfortunately, maybe not like,
00:51:22 You know, necessarily exactly like a Skynet type of situation, but a situation where we are almost giving birth to another life form that is going to never have to sleep and has an infinite amount of memory and and just like things that maybe we should be a little careful with.
00:51:40 And a lot of the roadblocks.
00:51:44 A lot of the roadblocks that used to be in my head. It used to be like, oh, well, they'll never be able to do this because of this. You know, you have these limitations in your mind from when I used to write code, you know, when I was a little kid writing them,
00:52:00 and just thinking that, like, well, you know,
00:52:03 like the original chat bots, you know. It had all these canned responses, right? Like, if you think back to the King's Quest days, that's going to date me a little bit, but, you know, the really old text-based games where you type in, like, Look North, and maybe it'll give you five different answers.
00:52:20 Like, that's about it, because it has to borrow from some database, something that a human had to write already, and the idea of it generating
00:52:28 something completely original just sounded like science fiction to me. And now, of course, it's not science fiction at all. And another hurdle, I guess, is swiftly eroding, swiftly eroding.
00:52:47 And that is the biological... I guess, what's a way of putting this?
00:52:56 I guess the bio-computer aspect of all this, right? The interface. Like, even when you watch videos of, oh, look at Elon Musk. You know, Elon Musk has the chip in the head. He's going to do the chip in the head, he's putting chips in the heads of chimps,
00:53:12 And you know, they it they're they're able to play really simple video games or something like that because they squirt like banana paste into their mouth when they do something.
00:53:20 Good or whatever.
00:53:22 You look at that, you're like, oh, that's still.
Speaker 6
00:53:23 That's still a long way off.
Speaker 8
00:53:26 And besides, I can always just say I don't want the chip in the head.
00:53:30 Right. I can just say, you know, whatever. I'm not one of.
00:53:33 Those ******* chippies.
00:53:35 You know, instead of being an anti-vaxxer, you'll be an anti-chipper. And look, you've seen those articles; they've already been written and published, talking about how, oh yeah, the next dangerous right-wing thing is going to be denying the chip, like, not getting the chip.
00:53:51 But you've always... I always thought that that barrier existed. Like, oh, well, you know, we'll be fine as long as
00:53:57 I don't get the chip. Everything's fine.
00:54:00 Yeah, not so much.
00:54:03 Not so much. And I didn't realize until really this week that this technology is... I mean, I knew aspects of this technology have been around, but I didn't realize it had progressed to the level that it's at
00:54:14 right now. So there are researchers at Berkeley and other universities who have been getting around the chip in the head, and apparently their research has been around for quite some time. We're talking, really, the beginnings of it were around, I believe, 2011, 2012. So this
00:54:34 technology is about 10 years
00:54:36 old now, and it's gone beyond just the research-at-a-university stage. It's actually being implemented now.
Speaker 21
00:54:44 The brain's a computer. I know it seems strange to think of the human brain as a computer, but, you know, it's not a computer like your desktop or like your cell phone, which are architected systems. It's an evolved system that has developed over millions of years, continuously building on what was there before. But it processes information. You sit here, you have sensory experiences. You combine those sensory experiences
00:55:05 with prior information, prior knowledge, you make decisions, and you act in the world. And an information processing system like that is a computer.
00:55:13 So in principle, we should be able to understand the human brain as a computer by reverse engineering it and discovering the underlying computational algorithms that govern its function. And if we do that, we can invert those algorithms and create very powerful brain decoding devices that can probably, in the future, replace all other kinds of brain machine interfaces and brain computer interfaces.
Speaker 8
00:55:36 Now, again, I always thought, well, there's a couple of things going on. One,
00:55:40 the actual hardware that would have to interface with the brain would have to be exceptionally complex. It would have to be very invasive; you'd have to have, you know, electrodes implanted into your brain or something like that. But additionally, the brain is so complex that just doing the research to figure out exactly how to interface with the brain,
00:56:02 it would take years.
00:56:05 But then AI came along.
00:56:08 AI, which is not smart in the sense that it's creative, but it's smart in the sense that it can handle huge amounts of data
00:56:17 And not lose track of any of it. It doesn't have to go.
00:56:20 Back and look at its notes.
00:56:22 Right. Like, have you ever been working on a long-term project, and you're having to research many different pieces
00:56:30 of a puzzle
00:56:31 to form a big picture, and by the time you get to the end of the project, you might have almost forgotten some of the stuff that you were researching at the beginning? Because your brain has a limit to how much... I mean, look, we're talking about brains being computers. If you want to put it in computer terms,
00:56:47 you've got a limited amount of RAM,
00:56:50 right? And to some extent, you
00:56:52 have a limited amount of hard drive space.
00:56:54 So when you're researching all these things, you can only think about so many things at once, and you can't really upgrade it. I mean, we've talked about that plenty of times, that all this stuff is biologically hardwired. So, you know, there's going to be some people that are gifted with a better memory, and some people that are gifted with, I guess, more RAM and faster processors and that sort of a thing.
00:57:16 But whatever you got, that's what you got. You can't really upgrade it beyond what you have, and you're certainly not going to evolve beyond what you were born with. I mean, there's very little wiggle room when it comes to IQ and that sort of thing, right?
00:57:29 Not so with AI.
00:57:32 AI has an infinite amount of RAM. Well, however much RAM you give it, right? However much RAM you have the budget for.
00:57:42 And Google's got a hell of a
00:57:43 budget. And so does DARPA.
00:57:46 And same thing with the processor power. I mean, look, there's going to be physical limitations. The computational power of whatever hardware you're going to be able to put together, you're going to have some limitations, right?
00:57:59 But, you know, that stuff's always increasing. You know, the processors are always getting faster and more efficient, and to some degree cheaper, more available.
00:58:14 And so now that you've got AI, which really excels at looking at huge data sets and doing the kinds of tasks that would previously take teams of high-IQ people, who have better things to do, quite frankly,
00:58:28 years to perform, just for, like, you know, something that might not have been worth it. Like, is it worth it for you to get 100 of your smartest guys working on this one real specific task for 10 years? Well, no. So a lot of this research isn't done.
00:58:45 And even if it was done, the results would be equally complex to
00:58:50 the research that was initially done, right? So even once you get the results, it would take another 100 geniuses just to interpret the results of your research. Well, not anymore. All that's gone. You can now have AI look at this crazy amount of data,
00:59:07 in other words, the information coming from a human brain, and make sense of it.
00:59:13 And that's exactly what they're doing. That's exactly what they have been doing.
00:59:18 You know, this idea that the different parts of the brain
00:59:23 do different tasks, this idea has been around a long time.
00:59:28 But it's just now becoming
00:59:32 more accurate, as AI is able to, in real time, view the brain scans of people listening to different things, thinking about different things, talking about different things, viewing different things, and look at the different parts of the brain that light up on these brain scans.
00:59:51 That, by the way, are not invasive. There is a team of researchers from Berkeley
00:59:59 who put together a way of looking at the human brain through MRIs, while the subject who is being scanned by the MRI is either viewing pictures or video, reading a book, or thinking about a fictional story,
01:00:18 just making something up in their head. And as this MRI data is getting scanned,
01:00:25 the AI is doing something that would, again, take 100-plus high-IQ guys 10 years: actually going through and trying to find patterns. And it takes, I mean, I don't know, a lot less than 10 years. Maybe a week or less.
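What he's describing, an AI finding a mapping between a stimulus and recorded voxel activity, comes down to fitting a regression over a huge data set. Here is a minimal sketch in Python, with made-up shapes and random stand-in data; real pipelines use actual fMRI recordings and richer stimulus features, so treat every number and name below as hypothetical.

```python
import numpy as np

# Toy sketch of the pattern-finding step: each fMRI time point is a
# vector of voxel activations, and each moment of the stimulus (e.g. a
# video frame) is summarized by a feature vector. All data is synthetic.
rng = np.random.default_rng(0)

n_scans = 200      # fMRI time points collected
n_features = 50    # stimulus features per time point
n_voxels = 1000    # voxels in the flattened brain map

# Hypothetical training data: stimulus features X and voxel responses Y.
X = rng.normal(size=(n_scans, n_features))
true_weights = rng.normal(size=(n_features, n_voxels))
Y = X @ true_weights + 0.1 * rng.normal(size=(n_scans, n_voxels))

# Ridge regression: learn a linear map from stimulus to brain response.
# This is the brute-force fit that is tedious for humans but trivial for
# a machine across thousands of voxels at once.
lam = 1.0
W = np.linalg.solve(X.T @ X + lam * np.eye(n_features), X.T @ Y)

# The learned map predicts voxel activity for stimuli.
Y_pred = X @ W
corr = np.corrcoef(Y.ravel(), Y_pred.ravel())[0, 1]
print(f"fit correlation: {corr:.2f}")
```

The point of the sketch is only the shape of the computation: one regularized least-squares solve replaces the years of manual pattern-hunting he describes.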
Speaker 20
01:00:41 All right, get ready, because tonight's Future of Everything is mind-bending, or actually mind-reading, because it's looking a lot more like AI is capable of reading our thoughts. And we're not just talking about it predicting what kind of TikTok we want to see next or what kind of
01:00:56 products we might buy next.
Speaker 8
01:00:59 I like how he says that, like, that part's already done. And it is; that part's already done. Like, we're well beyond the AI trying to figure out, from the data that it's getting from your human interface, whether it's, you know, your touch screen, your mouse or your microphone or your camera,
01:01:19 and, you know, all this other
01:01:20 stuff. All that stuff, that's already there. You know, AI is already trying to figure out what ads to play for you, and this sort of
01:01:28 thing. That's fine.
01:01:30 That used to be, if you remember just a couple of years.
01:01:32 Ago, that's what people.
01:01:33 Were worried about, you know, a couple of years.
01:01:35 Ago, people were.
01:01:35 like, oh yeah, I was talking about refrigerators with my wife, and then,
01:01:40 wouldn't you know it, I got home and logged on to Facebook, and it was all these refrigerator ads. That's scary.
01:01:53 Life was so simple then.
01:02:00 Oh yeah, well, now no one even thinks about that anymore. First of all, no one even bats an eye. Now that's just a given. Oh yeah, of course. Of course my Alexa and my Ring camera and my smart refrigerator and my, you know, whatever, the thermostat
01:02:19 thing, of course all that's just gathering data and feeding it into AI that's trying to predict my behavior enough to where they can sell me garbage or get me to vote a certain way. Of course that's normal
01:02:32 now. And look how quick it happened, right? Look how quick it happened. Look how quick it was normalized. No one's even worried about that. No one's worried about that at all. You know, there was a time, when they were first going to give people Social Security numbers,
01:02:48 like it's the most normal thing in the world, right? When they were first going to give people Social Security numbers, there were people that'd go, oh no, this is like the mark of the beast. You can't do that. I don't want to just be a number. And this is before databases. This is before computers really existed, and they still didn't want to be a number. They didn't want to be a number in some dusty old
01:03:07 book in Washington.
01:03:10 Their heads would explode if they saw this.
Speaker 20
01:03:14 The next breakthrough is a lot deeper, all the way down to the very tissue of our brains, where researchers have used AI to translate brain scans into text. Now, for this study, scientists at UT Austin created a 3D view of a person's brain while they listened to, watched, or imagined a
01:03:31 story. And see, we've got it for you right there. That pink stuff, all that pink stuff, means above-average brain activity, and the blue spots have below-average brain activity. And to us, we see that and we're like, OK, that part's activated, that part's not. But researchers say the AI was able to read the brain waves and translate all of that, that you're seeing right there,
01:03:51 into English and turn it into what looks like inner dialogue.
Speaker 8
01:03:58 Well, I guess the people who don't have an inner monologue don't have to worry about this as much, right? They're they're going to be immune to this technology, but.
01:04:06 For the rest of us.
01:04:09 This has, uh, there's.
01:04:10 Some big implications.
01:04:12 There are some big implications, this is just the beginning, right?
01:04:15 Like, think back to when Google... remember when Google first released their AI image generators, and it just looked like some kind of weird abstract art, with lots of eyeballs and fur and ****? Like, these insane... it almost looked like a bad acid trip.
01:04:32 And now, now, you know, AI is generating things like photos, and it's just been, what, like maybe 10 years between the two?
Speaker 20
01:04:43 OK, So what does that actually look like? Check.
01:04:46 This out here.
01:04:47 are the volunteers. The volunteers were asked to watch a movie clip without audio, and this is what the AI described; you're seeing it right there, actual stimulus. And it described, quote, I see a girl that looks just like me get hit on her back, and then she is knocked
01:05:02 off, end quote.
Speaker 8
01:05:05 So to clarify.
01:05:08 This is not invasive. There's no chip in the head. There's no electrodes hooked up to people.
01:05:14 This is someone whose brain waves the AI has been trained to look at,
01:05:23 And they're just looking at images.
01:05:26 Or video.
01:05:29 and the AI has no idea what's in this video or in
01:05:32 These images.
01:05:35 And as we often do when we watch a video or, you know, read a comic or whatever, right, our inner monologue is either narrating it to some extent, or it's being processed in a,
01:05:54 I don't want to say a linguistic
01:05:55 way, but, like, sort
01:05:56 of, right? You're having some kind of language of the mind happening
01:06:04 when you watch this sort of stuff. So someone's watching a scene where a woman gets knocked down by a dragon or whatever, and the AI, that has no idea what this is, is coming up with text that says, I see a girl that looks just like me get hit on her back, and then she is knocked off.
01:06:27 So it gets better.
01:06:30 It gets better.
01:06:31 Again, this sort of stuff has been around a long time. This guy describes what data, specifically, they're looking at from the brain, and how they're acquiring that data.
01:06:45 They basically do these MRI scans, and, as we'll see in a second, this is the old way of doing it. This is the way they were doing it in 2011.
01:06:56 Now, what they do is they scan your brain with an MRI, and the MRI is slow, right? Like, it can't scan the brain as fast as you would normally talk, or certainly not as fast as you
01:07:10 would think.
01:07:11 Right. So they had to come up with some clever ways of doing this scan in a,
01:07:16 in a quicker way, that would easily display data that the AI could look at.
01:07:21 And one of the things they do is, basically... if you've ever done any kind of 3D modeling before, you'll kind of understand, like, textures, right? Or if you've just played video games before, any kind of 3D video game, you know that the geometry of the character
01:07:40 has a texture wrapped around it, right? When they're rendering the 3D object in the video game, it's 3D geometry that's made up of vertices
01:07:49 and polygons, and then the actual color and the bump mapping and all that stuff is stored in image data that is projected onto the 3D object. Well, they're basically doing the opposite of that. They're making a 3D model, or not quite a 3D model; it's using voxels, which, again...
01:08:10 if you know what I'm talking about, you know what I'm talking about. If you don't, I'm not going to explain it. It's just a 3D model, for all intents and purposes.
01:08:19 And the data, the hot spots that are being detected on the brain, are essentially unwrapped as a texture, so they can look at it as a flat plane. Because, I mean, the brain is full of wrinkles, and it has all these, you know, valleys and hills and stuff in it, and
01:08:40 the data would be really hard to interpret, even for an AI, at least for right now, or at least in 2011 when they were doing this, in a 3D way, and have it really make sense. Especially for the researcher, who wants to go back and look at the results from the AI and be able to track down physical locations
01:08:59 of these different things, right?
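The unwrapping idea he's gesturing at can be sketched in a few lines. This toy example uses random points on a sphere as a stand-in for a cortical surface mesh; real flattening algorithms for wrinkled cortex are far more involved, so everything here is purely illustrative.

```python
import numpy as np

# Toy version of "unwrap the brain like a texture": each surface point
# carries an activation value, and we flatten the 3D positions into 2D
# (longitude, latitude) coordinates, the same trick UV-unwrapping uses
# in 3D graphics.
rng = np.random.default_rng(1)

n_points = 500
# Random points on a unit sphere standing in for surface voxels.
xyz = rng.normal(size=(n_points, 3))
xyz /= np.linalg.norm(xyz, axis=1, keepdims=True)
activation = rng.random(n_points)  # fake fMRI activity per point

# Unwrap: spherical coordinates give each 3D point a 2D (u, v) location.
u = np.arctan2(xyz[:, 1], xyz[:, 0])      # longitude in [-pi, pi]
v = np.arcsin(np.clip(xyz[:, 2], -1, 1))  # latitude in [-pi/2, pi/2]

# A flat "texture": 2D position plus activation, which a researcher (or
# an AI) can scan as an image instead of a folded 3D volume.
flat_map = np.column_stack([u, v, activation])
print(flat_map.shape)  # (500, 3)
```

The design point is the same one he makes: once the folded surface is laid out flat, the hot spots become a 2D image, which is far easier to feed to a model or eyeball.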
Speaker 19
01:09:00 My lab here, we actually collect a lot of data using technology called MRI magnetic resonance imaging and specifically we use functional magnetic resonance imaging. This means we collect multiple images through time.
01:09:14 Which allows us to see what's going on.
01:09:15 Inside the brain.
01:09:17 And this is a viewer that I developed to allow us to actually look at this data, because although this data is extremely high density, there's a lot of information there, it's actually very hard to look at. And this is actually a little short demo I programmed up for a museum exhibit that was on
01:09:33 display at the Exploratorium for a while. So it's a little interactive thing. You can actually kind of wave your hands around to get it to do interesting things and look at some of our data.
01:09:47 So what you're seeing on the left here, let's scrunch this back up here. What you're seeing on the left is someone's brain activity played in real time while they were watching this video here.
01:10:00 On the right.
Speaker 8
01:10:03 All right, so let's take a look at the video.
01:10:06 That he's talking about.
01:10:09 So this again, this is the texture mapping again, if you're familiar with 3D models, that's essentially what they're doing, right? So that's the 3D model of the brain and they unwrap the texture. So that's the texture of the brain that has the data that they're acquiring from the MRI. So that's the brain at rest. And then they start.
01:10:29 Playing this video and you can.
01:10:31 See the different parts of the brain lighting up as the video plays.
Speaker 14
01:11:05 The worst forest fire in Israel's.
Speaker 11
01:11:06 History continues to rage out of control.
Speaker 8
01:11:12 I want you to notice by the way, for some reason Israel is referenced in the video and they show a Star of David in this video. Little weird little weird you.
01:11:22 Know just weird.
Speaker
01:11:55 So we believe the universe was created with a big.
Speaker 8
01:12:06 There you go.
01:12:10 Anyway, uh, so they play this
01:12:13 video to you,
01:12:15 and they look at that brain activity.
01:12:17 And then, essentially, you know, a way to train the AI would be the same way they train the AI for other things, right? You give the AI access to the video that you watched,
01:12:30 and then you give the AI access to your brain activity as you watched it.
01:12:34 And the AI can look at this unwrapped texture model, or texture map, of your brain activity, while also analyzing the video that you watched, and it can start to predict
01:12:49 different things, like, I mean, in the future, when they show you other things.
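That train-then-predict loop can be sketched as follows, with synthetic stand-ins for both the brain responses and the clip features. The published reconstruction work used much more sophisticated encoding models, so this toy version, plain least squares plus nearest-neighbor matching over hypothetical candidate clips, only shows the shape of the procedure.

```python
import numpy as np

rng = np.random.default_rng(2)
n_train, n_voxels, n_feat = 300, 100, 20

# Training phase: brain responses B paired with features F of watched clips.
F_train = rng.normal(size=(n_train, n_feat))
mixing = rng.normal(size=(n_feat, n_voxels))   # hidden "true" brain response
B_train = F_train @ mixing + 0.2 * rng.normal(size=(n_train, n_voxels))

# Learn a brain -> features map by least squares.
W, *_ = np.linalg.lstsq(B_train, F_train, rcond=None)

# Test phase: the subject views one of 10 candidate clips the model never saw.
candidates = rng.normal(size=(10, n_feat))
true_idx = 7
b_new = candidates[true_idx] @ mixing + 0.2 * rng.normal(size=n_voxels)

# Decode: predict features from the new scan, pick the closest candidate.
f_pred = b_new @ W
guess = int(np.argmin(np.linalg.norm(candidates - f_pred, axis=1)))
print(guess)
```

Flying "completely blind" in his phrasing corresponds to the test phase here: the decoder never sees the new clip, only the brain activity it evoked.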
Speaker 19
01:12:55 This is someone's brain activity, plotted in a red-blue color map. So blue means lower activity in the brain, and red means higher activity, as they're watching this video. So this is what it is. Now that we have this correspondence between what they saw and what their brain did, we can start developing
01:13:14 models that map between the two.
01:13:17 And using one of these such mappings.
01:13:19 We can actually create.
01:13:21 A guess for what we think they saw inside the magnet.
01:13:24 So this is what they actually saw and this is what they what?
01:13:28 We think they saw using.
Speaker 8
01:13:31 So if you
01:13:31 want to see the results... and again, these results are over 10 years old.
01:13:36 So they then had the AI monitor someone's brain while showing them completely new clips.
01:13:44 And this, again, this is
01:13:45 10 years old. Think of this as... this isn't even like the alpha phase. This is just the research phase. This is the proof of concept. This is the, can we even do this at all,
01:13:56 with technology that's not designed specifically for this task? You know, like, the MRIs were not designed specifically for this task.
01:14:05 And on the left side, again, this is this video, I think it's from 2011. That's the clip that they were showing the people, and the AI didn't have access to any of these clips, the way it did the first time, right, when they trained the AI to look at your brain and correlate what it saw in your brain versus the images you were being shown.
01:14:28 With this, it's flying completely blind.
01:14:32 And these are the predictions it's making, on the right-hand side: based on what's happening in your brain, this is what it thinks you're seeing.
01:14:46 At first you're like, oh, it's pretty simple, but it's also kind of exactly what, you know, people are looking at.
01:14:54 In a creepy way, right? Like, it's pretty...
01:14:57 obviously it's not super useful,
01:15:00 but it's more useful than nothing, right? It's getting the shapes and some of the colors
01:15:05 kind of right.
01:15:07 In some ways, it's even kind of sort of looking like faces and stuff like that.
01:15:14 So that's pretty crazy.
01:15:17 Pretty crazy, but it makes total sense how it works. And I saw a few people in chat say, oh, this is ********. Look, I know you're afraid
01:15:26 and stupid, if you can't understand why... this isn't hard. I mean, the concept, at least, shouldn't be that hard to
01:15:34 wrap your head around.
01:15:36 Look, I get that you're scared. I get that this might be above your pay grade, but this is not fake. It really makes a lot of sense once you understand what they're doing here.
01:15:51 So he talks more about, you know, the the different visualizations and the mesh manipulations and all this sort of thing.
01:16:00 But bottom line.
01:16:02 As that scanning stuff gets more accurate, as that scanning stuff gets more real-time, more granular... because, you know, these scans of the brain are
01:16:11 not super high resolution.
01:16:14 And part of that is a limitation of the computer processing power, and part of that's just because the MRI wasn't designed to do that; it was just designed to do, you know, very... And it's a giant machine, right? It's a giant machine that you have to be fed into. I mean, people have seen MRI machines before. And so at first blush you're like, OK, well, this is
01:16:34 novel, it's kind of interesting, and I kind of get why, but I don't see how that would translate to anything I would have to be worried
01:16:41 about, right? I don't see how that would be a big deal, because it's not like they're going to have this giant MRI machine that they can
01:16:48 shove me into or whatever.
01:16:51 Or if they did, I guess, limited use case, right? Maybe if I was, you know, at a CIA black site and they're interrogating me, they could have an MRI machine there or something like that, and try to figure out what I'm thinking, this sort of stuff. And that would be true,
01:17:10 if that's where the technology ended.
Speaker 3
01:17:13 Researchers have essentially created a device called a semantic decoder, that can read a person's mind by converting brain activity into a string of text.
Speaker 22
01:17:22 It's basically a model that has been trained with over 15 hours of data per person
01:17:29 to take in their brain
01:17:30 activity when they're sitting in an MRI scanner,
01:17:33 and to spit out the story that they're listening to.
Speaker 3
01:17:36 Participants first listen to podcasts for about 16 hours in an fMRI scanner. This trains the AI to learn the person's brain activity. Later, they go back in to put their brain to the test,
Speaker 23
01:17:48 And tell the same story without actually saying anything.
Speaker 3
01:17:51 Out loud, researchers say they were surprised the decoder still worked, even when the participants weren't hearing spoken language.
Speaker 22
01:17:58 Even when somebody is seeing a movie, the model can kind of predict word descriptions of what they're seeing.
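One way to picture the decoder loop described in the clip: keep a set of candidate word sequences, predict the brain activity each one should evoke using a learned encoding model, and keep whichever best matches the measured scan. Everything below, the embedding stand-in, the random encoding weights, the candidate sentences, is a hypothetical sketch, not the actual system.

```python
import zlib
import numpy as np

n_voxels, n_dim = 200, 16
rng = np.random.default_rng(3)

def embed(text: str) -> np.ndarray:
    # Stand-in for a language-model embedding: each word maps to a fixed
    # random vector seeded by a stable hash, and the vectors are summed.
    vec = np.zeros(n_dim)
    for word in text.split():
        seed = zlib.crc32(word.encode())
        vec += np.random.default_rng(seed).normal(size=n_dim)
    return vec

# "Learned" encoding model (random here): semantic vector -> voxel pattern.
W_enc = rng.normal(size=(n_dim, n_voxels))

candidates = [
    "i saw a girl get knocked down",
    "the weather was sunny and warm",
    "he drove the car to the store",
]

# Simulated measurement: the brain response evoked by the first candidate.
measured = embed(candidates[0]) @ W_enc + 0.5 * rng.normal(size=n_voxels)

def score(text: str) -> float:
    # How well does this candidate's predicted response match the scan?
    pred = embed(text) @ W_enc
    return float(np.corrcoef(pred, measured)[0, 1])

best = max(candidates, key=score)
print(best)
```

The scoring direction is the interesting design choice: the model never maps brain to words directly; it maps words to predicted brain activity and searches for the words that fit.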
Speaker 8
01:18:07 So again, you might think, oh, well, it's just, you
01:18:09 know, it's pretty limited.
01:18:11 Pretty limited, you know. Like, well, that's what I thought when I saw those first Google images, where it was just a bunch of eyeballs, a bunch of, like, fractal eyeballs with hairballs all over them. And it looked really crazy, right? It didn't make any sense why that was even in the news. I was like, oh great, computers can make weird garbage.
01:18:28 Alright, congratulations, Google.
01:18:32 But that was just.
01:18:32 The beginning.
01:18:34 That was just the.
01:18:35 Beginning. And it's funny because they're now that this is a new technology to get money and they'll get the money. They will get the money.
01:18:44 They're marketing it like, ohh no, this is great. This is great. It'll allow us to talk to animals. We're not going to use this for bad. Haven't you always wanted to talk to dolphins and whales and ****?
01:18:57 Ohh that that's exactly you know, give us your boomer bucks because it'll let you talk to whales.
Speaker 15
01:19:04 Anyone played around with ChatGPT and?
01:19:08 This thing's going to help me.
01:19:09 Become the ambassador of the Dolphin race.
01:19:13 Well, our next speaker certainly sees the promise in these robots to helping us speak to nature.
Speaker 24
01:19:22 And our goal right is not just can we learn to listen from animals, but can we unlock communication to transform our relationship with the rest of nature?
01:19:35 This idea actually started in 2013. I was driving my gold Volvo station wagon.
01:19:46 And I heard an NPR piece about gelada monkeys, which I'd never heard about before. They're these
01:19:52 guys, they have these
01:19:53 huge manes, giant fangs, they live in groups of like 2,000, fission-fusion groups, and the
01:20:01 researcher Morgan Gustison, who's researching them, said that they had the largest vocabulary of any primate except humans. They sound like women and children babbling. They sound
01:20:10 A little bit like this.
01:20:13 Let's see. Does it work?
01:20:23 And they do this incredible lip-smacking thing that mimics humans, like moving our mouths to modulate our sound. And the researchers swear that the animals talk about them behind their back.
01:20:34 Which is probably true.
01:20:39 And I was.
01:20:39 Like, well, why are they out there with a hand recorder, hand transcribing, trying to figure out what they're saying? Shouldn't we be using, you know, like machine learning and microphone arrays to figure this out? And when I started looking into it, at that point machines couldn't do something that human beings couldn't already do. They couldn't
01:20:59 translate a language that didn't already have a translation.
Speaker 8
01:21:04 But now it can.
01:21:06 And look, he's not wrong. That's probably something they'll be able to using the same kind of technique, right. You feed it enough data, it'll start to find patterns and chances are you give it enough data and AI will be able to crack the code of weird monkey talk.
01:21:24 It will.
01:21:26 It it probably will.
01:21:28 How long will it take? I don't know, but that's that's perfectly reasonable to assume what he's saying, right? And so the, oh, that sounds pretty benign. You know, weird monkey talk translator. You know, of course, this guy. And look, this guy might even mean it. You know, these these researchers might even mean it when they say stuff like this.
Speaker 23
01:21:46 Who have lost the ability to communicate for some reason. So people who have locked in syndrome from like brain stem strokes.
Speaker 8
01:21:53 Yeah. It's just it's it's to help people that you know, have have brain damage that can't talk anymore, you know, look, I mean, Elon's doing the same thing, right, the chip in.
01:22:02 The head, the.
01:22:03 Whole purpose of the chip in the head is to help people that have, you know, been paralyzed or whatever the chip in the head is going to be used to help them regain function of their.
01:22:15 Arms or maybe a robot arm or something like that?
01:22:18 And you know, we're definitely not going to use the chip in the.
01:22:21 Head for the evil.
01:22:23 It's gonna be to help disabled people or to talk.
01:22:26 To monkeys, you know, it's.
01:22:28 It'll be great. We'll be talking to monkeys and there won't be any, you know, paralyzed people anymore. It'll be. It'll be fantastic. And like I said, it doesn't matter if the researchers themselves firmly believe that. And that's what they want to do. It's just that that technology won't just stop there.
01:22:46 It won't stop there.
Speaker 20
01:22:49 Just to be clear, like in this case people are sticking electrodes I.
01:22:53 Mean there's a lot of monitoring that goes in, right. There's a lot of.
Speaker 25
01:22:54 No, no, there's no electrodes. I mean, so one of.
01:22:56 The things, yeah.
01:22:57 I mean, one of the things that's extraordinary about this is there are.
01:23:00 No electrodes this.
01:23:00 is a big machine. It's a functional magnetic resonance imaging machine. That means that people went into this machine and, basically without any electrodes inside of the brain, they
01:23:12 were able to
01:23:12 pick up blood flow changes in their brain.
01:23:14 And one of the things that's incredibly remarkable about.
01:23:16 the study is not only the precision of what they picked up, but then they decided to test and see if the model that they created, which was using GPT-1 generative AI (we're all talking about ChatGPT these days), they wanted to see if they could use that same classifier on a portable system, not a big bulky device like fMRI.
01:23:36 So they tried it out on something called fNIRS, functional near-infrared spectroscopy,
01:23:42 Which is a wearable device, something that people could use in their everyday lives to track their brain health, to track their, you know, everyday focus and mind wandering. They found that they could get the same level of precision using that kind of portable device with the classifier that they had developed. In other words, everyday mind reading may be possible.
01:24:02 Using devices that we may come to use in our everyday lives.
Speaker 8
01:24:07 Oh, so you don't need the giant MRI machine?
01:24:13 And notice the way she was marketing that wearable technology. Wearables, that used to be a big buzzword about 15 years ago, right, wearables. Remember the, you know, the Apple Watch? Yeah, it's giving you all the real-time, you know, health feedback. It's to make sure you're sleeping well and
01:24:32 and, you know, that your heart's not beating weird or anything like that. You need to wear this technology that's monitoring your body.
01:24:42 And that's totally normalized now. In fact, that technology has evolved well beyond just, you know, monitoring your heartbeat and having a pedometer or whatever else hooked up to you.
01:24:58 Look at this guy. He's marketing. These are earbuds.
01:25:04 That that scan your brain.
01:25:07 And like it's the same. It's the same ridiculous excuses, right? Of why we need this technology, like even she was saying like, oh, no, it's good cause it helps you track focus. Yeah. Cause you need that.
01:25:19 Right, you need you need an app on your phone that tracks your focus so you can look at your focus on a graph and.
01:25:26 Focus or something?
01:25:30 Right. That. That's. Is that what you need?
01:25:33 You need something that that tracks your focus.
01:25:37 And with everyone saying they've got all their ohh I'm I'm neurodivergent I'm neurodivergent I I've got all these. You know these these disorders and whatever else I'm on. Like a million different medications.
01:25:51 You know they're gonna want this. They're gonna want this, and that's how this guy markets his little brain scanning earbuds.
01:25:57 He's got a brain scanning earbuds because, well, you'll see.
Speaker 12
01:26:03 Where this technology was birthed was at
01:26:05 Google, and in particular the
01:26:07 moonshot factory, and we had
01:26:08 a saying there: if you
01:26:10 wanted to teach monkeys to recite Shakespeare standing on a pedestal, where do you start? You start with the pedestal? No, you start with the
Speaker 14
01:26:18 Monkeys, you're the monkey.
Speaker 12
01:26:19 And so I thought, why not start a presentation with what's
01:26:21 hard. Let's do
01:26:22 a demo. So NextSense
01:26:23 is doing personalized therapeutics for brain health. So what do I have here?
Speaker 8
01:26:28 You like that? It's it's personalized therapeutics.
01:26:31 For brain health.
01:26:34 They they they are masters at making evil sound palatable, aren't they?
01:26:41 Aren't they?
01:26:44 Oh, it's personal therapeutics for brain health. I'm hold on. Hold on, everybody. I gotta let me let me pop my personal therapeutic for brain health into my ear here. No big deal. It's just going to be scanning my brain.
Speaker 12
01:26:59 So I have our newly created universal-fit earbuds. A year ago when I was on the stage, I had custom fit. These are now universal. These are part of our conductive polymer that's here, and we've put a lot of our R&D into the comfort and the signal quality. And so it just goes in like this.
01:27:26 There we go. And then you can see I have it on.
01:27:28 The other side as well and now.
Speaker 26
01:27:31 For a moment of truth.
Speaker 12
01:27:33 So to be fair, the device was on.
01:27:37 And it's connected to Bluetooth on my.
01:27:39 Phone do we have any neurologists here?
01:27:40 Anybody that could look at EEG?
01:27:43 All right, so here we go. I'm going to check the signal.
01:27:46 And we will see.
01:27:49 And indeed we.
01:27:49 Have signals, so can you verify that there's actually?
01:27:52 Signal coming onto the phone.
Speaker 5
01:27:53 Yes, all right.
Speaker 12
01:27:54 So and there's no neurologist here, so we can't examine my brain waves, but we can take.
01:27:58 A look at that afterwards.
Speaker 8
01:28:02 And then my favorite part is he decides he's going to tell you the story of why he thought that it's so important, and the first thing they're going to go after.
01:28:13 Because a lot of people, their minds went there. In fact, in the video I watched this week that sent me down this path, one of the things that they brought up was
01:28:23 Minority report, right?
01:28:26 Minority report.
01:28:28 You know pre crime, trying to to read your mind and stop you before you you commit that hate crime or have that hate speech come out of your mouth. Maybe you'll you'll you'll be arrested for hate thought, right?
01:28:44 This this is kind. I mean he he's talking about this is.
01:28:47 For brain health.
01:28:48 One second, right?
01:28:50 It's the, it's this.
01:28:51 Wearable technology to make sure your brains healthy.
01:28:55 But then what's the example he uses? What's the story he tells about about their their first thing they're gonna go after? Like what? What inspired him and his team to like, really?
01:29:06 You know, move forward with this.
Speaker 12
01:29:08 Came very recently. We acquired a company last year that was focused on in-ear stimulation. The vagus nerve can be stimulated from the ear. We always wanted to have a read-write platform, and I talked about that and have been talking about that. But a distressed market last year provided an opportunity to acquire the assets of a company that was doing in-ear vagal nerve stimulation.
Speaker 8
01:29:30 By the way, that's just a little side nugget. He's talking about two-way communication now. They acquired a company that was researching in-ear stimulation technology because he wants his earbuds to
01:29:44 Be two way.
01:29:45 He doesn't want the ear buds to just read your your brain waves.
01:29:50 He wants your ear buds to be able to transmit ****.
01:29:53 Back into your brain.
01:29:55 But that's just the little side thing. He kind of glosses over that real quick, but he throws it in there, like, oh yeah, back in the distressed COVID market, when, you know, everything was bad, we had the opportunity to acquire this company, because I've always wanted these things to be two-way. But anyway, back to my story.
Speaker 12
01:30:14 But I was at 24 Hour Fitness last month. I saw a headline that caught my attention. I don't know if anybody is familiar with the tragic story of the Mass General nurse. Raise your
01:30:22 hand if you saw that in the news.
Speaker 8
01:30:25 So he's talking about the mass general murders.
01:30:30 All right, now, now what that is.
01:30:39 It's weird that he chose this one. I don't know why.
01:30:45 I don't know if you guys are aware of this one.
01:30:48 So it's basically a woman killed her own kids.
01:30:53 And then claimed it was postpartum depression.
01:30:56 Even though she clearly planned it.
Speaker 7
01:30:59 Dawson and Callan were face down on the floor. Cora was on her side with her torso turned towards the floor.
01:31:08 He removed the bands and begged them to breathe. He continued to scream uncontrollably and screamed for officers to come to the basement. The dispatchers are hearing this and they send help down to the basement, and when they encounter Patrick, he yells.
01:31:23 She killed the kid.
Speaker 1
01:31:24 We're learning more.
Speaker 28
01:31:25 Gruesome details about the murders of three young Massachusetts children as their mother, Lindsey Clancy, is charged in their deaths in Clancy's 1st court appearance since the murders. She appeared virtually from her hospital bed.
Speaker 7
01:31:40 On Friday, January 27th, 2023,
01:31:42 using an erasable whiteboard,
01:31:44 because she was still temporarily intubated, one of the first questions that Lindsay Clancy asked was, do I need an
01:31:50 attorney? She knew that she had murdered her children and she
01:31:53 had the clarity,
01:31:54 focus, and mental acumen to focus on protecting her own rights and interests.
Speaker 28
01:32:00 Prosecutors outlined the events that led up to the murders on January 24th, alleging Lindsey Clancy meticulously planned the deaths of five-year-old Cora, three-year-old Dawson, and eight-month-old Callan before attempting her own suicide by jumping out of a second-story window.
Speaker 17
01:32:18 On the morning.
Speaker 7
01:32:19 of Tuesday, January 24th, 2023, the defendant took her five-year-old daughter Cora to the pediatrician's for an appointment. She interacted with the receptionist, nursing staff, and a doctor. There were apparently no issues with the defendant's demeanor or behavior as she completed the appointment, and she was allowed to leave with Cora
01:32:38 without any issues or concerns.
01:32:41 When she returned home, she went.
01:32:43 outside with Cora and her three-year-old son Dawson to play in the
01:32:47 snow. They built a snowman. The defendant sent photos to her mother and to her husband. She texted with them.
01:32:56 Nothing in the text.
01:32:57 Was out of the ordinary or any.
01:32:58 Sign of any distress or trouble.
Speaker 28
01:33:00 After that, prosecutors say Lindsay Clancy searched for the 3V restaurant in Plymouth, about 25 minutes.
01:33:08 From her home in Duxbury.
Speaker 7
01:33:10 She then searched takeout Three V via her cell phone at 4:13 PM. Immediately after doing that, she used Apple Maps on her phone to determine how long it would take someone to drive from her home in
01:33:24 Duxbury to 3V.
01:33:26 Restaurant in Plymouth so she would know how long someone would be gone.
01:33:30 if they ran that errand.
Speaker 28
01:33:32 It was then that Clancy texted her husband, Patrick Clancy, asking for takeout at.
Speaker 7
01:33:37 4:53 PM the defendant texted her husband, who was working in his Home Office, in their basement. She texted any chance you want to do?
01:33:45 Takeout from 3V.
01:33:47 I didn't cook anything. It's been a long.
01:33:49 Day this was an.
01:33:50 Unusual request.
01:33:51 As when the family ordered takeout, they usually go somewhere closer to home. But it was a place that they had been in the past.
01:33:58 Patrick Clancy texted back. Yes, and then the defendant asked him to check the menu.
Speaker 28
01:34:03 Minutes later, Patrick Clancy left the family home.
Speaker 7
01:34:07 At 5:15 PM, Patrick Clancy headed out the door to run these errands at the defendants request as he left, she texted him PDX Liquid stool softener.
01:34:18 Surveillance footage shows Mister Clancy at CVS on Summer Street in Kingston at 5:32 PM. He goes to the medication aisle, the children's medication.
01:34:29 Phone records show that he called the defendant at 5:33 PM and she did not answer the phone.
01:34:35 She then calls him back at 5:34 PM, and the call lasted 14 seconds. He's there at the store, asking which medication to get, and she tells him exactly what she wants. He had no issues communicating with her. It was a completely normal call, although he did mention that she seemed like she was in the
01:34:52 Middle of something.
Speaker 8
01:34:54 She was in the middle of killing her kids. So, clearly premeditated, clearly was trying to kill her kids and then kill herself. She failed. She jumped out of a window, paralyzed herself. That's her right there in the hospital. She's paralyzed.
01:35:12 I think from the neck down.
01:35:15 Yeah. Killed her three kids. Totally calculated. Sent her husband, who I guess worked at home from the basement, to go get takeout, and then murdered the kids and jumped out a window.
01:35:30 But you know.
01:35:31 Postpartum depression, and if only she had those magic earbuds.
01:35:37 If she'd had those magic earbuds.
01:35:40 Then they would have known that she was gonna murder her kids because postpartum depression.
01:35:47 That's what he says.
Speaker 12
01:35:49 So the headline was Dad asks public to forgive his wife. Turns out.
Speaker 6
01:35:55 Which first of.
Speaker 8
01:35:56 All. What a ******* **** like murders your three kids. Please forgive my wife. She had postpartum depression.
Speaker 12
01:36:03 So the headline was, Dad asks public to forgive his wife. Turns out this very lovely woman, she was a Mass General nurse, had actually strangled her three kids
01:36:15 while suffering from postpartum depression. So that company that we bought has an FDA breakthrough in postpartum depression for vagus nerve stimulation. So that convinced me that the read-write platform's time has come, and we're actually going to put postpartum depression first.
Speaker 8
01:36:32 The read write platform.
01:36:38 He's talking about reading and writing.
01:36:42 To the brain like a ******* hard drive.
01:36:47 Because some crazy ***** murdered her kids.
01:36:55 The time for the read write platform has come.
01:36:59 We've had all these breakthroughs with postpartum depression, so we need the read-write platform in my magic earbuds that zap your ******* brain.
01:37:09 Like this stuff? This stuff's a little far off, but I mean, I don't know. I keep saying that and then like, next thing I know it's, you know, it's invented. Alright. It's like oh, never mind.
01:37:17 I guess it works already.
01:37:23 And it's not just these companies. I mean, this company I believe is funded by Google. We had researchers from, I think, what was it, UT and then Berkeley. But it's not just, you know, the research phase. And it's not like these are small potatoes. Not that Google is small potatoes,
01:37:43 or Berkeley, or really any university with the kind of endowments they've got these days, are small potatoes. But you also have Mark Zuckerberg
01:37:54 With his AI.
01:37:57 Wanting to, in fact, it's funny because.
01:38:00 You might remember the big Meta launch, right, the Meta launch that went nowhere, because no one wanted to be in Second Life with somehow, almost miraculously, worse graphics than Second Life, and with nothing to do. It was just like, hey, let's put a headset on and hang out on Facebook or whatever.
01:38:19 It didn't make any sense, and I did a stream about that where I said, like, this doesn't make any sense.
01:38:24 They've tried to do virtual reality a bunch of times that never ******* works because the technology is not there yet and it's not right. It's not. Not yet, but this is this is, I think, one of the reasons why Mark Zuckerberg was so excited about it. Because it makes you wonder, how would how would Facebook make such a a huge miscalculation when it comes to?
01:38:45 Rolling out a product like that like this complete like I can't. It's probably one of the biggest failures of a major.
01:38:53 You know Silicon Valley Company in the last 10 years, this major miscalculation that people would wanna wear an Oculus and then hang out on Facebook with their friends wearing Oculus. Like it doesn't make any sense. Right? Well, I think it makes a little more sense.
01:39:11 When you realize Mark Zuckerberg was looking at the bigger picture and had access to all the other projects that that Meta was was researching at the time and.
01:39:22 He was just thinking, you know, again, like so many others before him, he didn't realize that no one wants to wear a big ******* helmet and be in some gay 3D world for no
01:39:33 reason, because he was looking at other things that were going to make it, you
01:39:38 know, make it sort of interesting, or at least interesting from Facebook's point of view.
Speaker
01:39:43 With you.
Speaker 8
01:39:44 And that is using AI to generate endless worlds. A lot of people have seen, like the Unreal Engine demo that was out earlier this year where they can essentially keep generating endless maps for games where it's just, you know, you.
01:40:03 Can keep going.
01:40:04 Down the street, you'll never hit the end of the map. You know you're never gonna hit that that invisible wall that you would in video games before, because it'll just in real time, keep generating more and more houses or built, you know, whatever, right. It'll just be this infinite world and.
01:40:21 one of the things that Mark here was thinking about, well, thinking about this sort of an option, was implementing their new kind of AI, which makes a huge difference in terms of how it works and how good the results are, by changing this.
01:40:42 And that is: in the previous example, we were looking at the MRI machine that was looking at brain scans and looking at patterns between the brain activity and the stimuli that the people getting scanned were exposed to, right?
01:40:57 In those examples, the model, the learning model or the machine learning, was being informed by the researchers. You know, like in the example with the video, right, where they played that video to the people in the MRI machine, and then they would tell the AI, oh, here's what they were looking at.
01:41:19 So, you know, here's the image that they were looking at when they had this brain activity.
01:41:24 And also, you know, like for example, the kinds of AI that, well, that Facebook used prior to
01:41:32 this. Or, famously, Google. Remember when Google had to turn off their AI that was tagging black people as gorillas? It was one of their first AIs that would automatically look at images and try to determine what the image was, and then tag that data on your photos when you uploaded your photos.
01:41:53 But unfortunately, the AI couldn't figure out the difference between black people and gorillas, so it was tagging black people as gorillas. And rather than fix it, because they
01:42:04 couldn't, I mean,
01:42:05 hopefully it's fixed by now, right? But rather than fix it, they just turned it off, so they made it so it wouldn't tag black people or gorillas,
01:42:14 because they couldn't get it to tell the difference. Well, the way that AI at that point of, you know, I guess, the evolution was able to tell the difference between anything is humans were
01:42:30 telling it. So,
01:42:31 in other words, they would feed the AI
01:42:33 with like 1,000 pictures of a cat, you know, different cats,
01:42:37 and every time they would tell the AI, that's a cat, that's a cat, that's a cat. So then after, you know, you
01:42:44 told it what 1,000 different cats look like, the next time that it saw a cat, it was like, oh, a cat, you know, and then it would just know
01:42:53 that it was a cat.
01:42:56 You know, you had to do that with everything. In fact, there was like a mini scandal that they had basically sweatshops in third-world countries where they were feeding the AI all these images, and it was basically some, you know, jungle Asian guy with a keyboard typing in, you know, dog, cat,
01:43:16 you know, bicycle, you know, like whatever it was, right? And that was where these data sets were coming from. It was basically these massive sweatshops, these call centers basically, these warehouses full of Asian people typing in what every picture was that they were looking
01:43:32 at, and that's how the AI would know.
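The labeling workflow described above, where every training example arrives with a human-typed name, can be sketched as a toy supervised classifier. The two-number "feature vectors" and the nearest-centroid rule here are invented stand-ins for illustration; real systems learn from images, not hand-picked numbers.

```python
# Toy sketch of supervised learning: humans attach a label ("cat", "dog") to
# every example, and the model only learns the mapping because each example
# arrives pre-named. Features and labels below are made up for illustration.

from collections import defaultdict

def train(labeled_examples):
    """Average the feature vectors seen for each human-supplied label."""
    sums = defaultdict(lambda: [0.0, 0.0])
    counts = defaultdict(int)
    for features, label in labeled_examples:  # every example comes with a name
        sums[label][0] += features[0]
        sums[label][1] += features[1]
        counts[label] += 1
    return {label: (s[0] / counts[label], s[1] / counts[label])
            for label, s in sums.items()}

def predict(centroids, features):
    """Answer with the label whose average (centroid) is closest."""
    def dist(c):
        return (c[0] - features[0]) ** 2 + (c[1] - features[1]) ** 2
    return min(centroids, key=lambda label: dist(centroids[label]))

# "That's a cat. That's a cat. That's a cat." -- the label repeated per example.
data = [((0.9, 0.1), "cat"), ((0.8, 0.2), "cat"),
        ((0.1, 0.9), "dog"), ((0.2, 0.8), "dog")]
model = train(data)
print(predict(model, (0.85, 0.15)))  # → cat
```

The expensive part is exactly what the transcript describes: someone has to type the label for every single training example before the model can learn anything.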
01:43:36 Well, they started thinking about it and they said well.
01:43:40 The biggest reason why we need to do this
01:43:44 is so that the AI knows, you know, it can tell that all these similar photos are a cat. But the AI doesn't need to know that it's called a cat.
01:43:56 The AI can call it whatever it wants in its head, right? The AI can categorize this however it wants, as long as it's able to tell the similarities between different photos. Why do we have to even tell it that it's a cat? Just show it 1,000 pictures of anything.
01:44:14 And it doesn't even have to be the same thing. In that thousand photos, 100 of those photos could be of a cat, 100 of them could be a dog, 100 of them could be like an apple, whatever, and the AI will determine, oh, these hundred photos look the same, I'm going to call it, you know, a 9634, which is
01:44:33 apple
01:44:34 or dog or cat or whatever, right? And so, but it doesn't matter what it's calling it, right? It just has to know that they're all the same, and it categorizes them all the same
01:44:43 way. And then you don't have to have these big sweatshops, and you can basically just unleash AI like it's a little kid running around in the world doing pattern recognition the way that a child would. And that's what he was so excited about when he was releasing Meta, thinking that he'd be able to implement that technology in the gay
01:45:03 Second Life that no one wanted to be in.
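The label-free idea above, grouping examples purely by similarity and giving each group an arbitrary internal ID that no human ever named, can be sketched with a tiny clustering routine. The one-dimensional data and the k-means-style loop are invented for illustration; real self-supervised systems cluster high-dimensional image features.

```python
# Minimal sketch of label-free grouping: no human ever types "cat" -- the
# machine just notices which examples look alike and files them under an
# arbitrary cluster ID (its own "9634"). Data below is made up.

def kmeans_1d(points, k=2, iters=10):
    """Tiny 1-D k-means: returns {arbitrary_cluster_id: [points]}."""
    # seed centers by spreading picks across the sorted data
    centers = sorted(points)[:: max(1, len(points) // k)][:k]
    for _ in range(iters):
        groups = {i: [] for i in range(k)}
        for p in points:
            # assign each point to its nearest center
            nearest = min(range(k), key=lambda i: abs(p - centers[i]))
            groups[nearest].append(p)
        # move each center to the mean of its group
        centers = [sum(g) / len(g) if g else centers[i]
                   for i, g in groups.items()]
    return groups

# Six unlabeled examples; the machine splits them by similarity alone.
groups = kmeans_1d([0.1, 0.2, 0.15, 0.9, 0.95, 1.0])
print(groups)  # two groups; the IDs 0 and 1 mean nothing outside the machine
```

Nothing in the output says "cat" or "dog"; the cluster IDs are internal names, which is exactly why the human labeling sweatshops become unnecessary.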
Speaker 29
01:45:05 All right, that's pretty good right now as we advance this technology further, you're going to be able to create nuanced worlds to explore and share experiences with others with just your voice.
01:45:17 But there are a lot of challenges that we still need to solve to get there. One is developing true multimodal AI. You know, a lot of the early AI work has been focused on text. When you have a clearly defined syntax, a finite set of input words, and a lot of easily available training data, predicting how a sentence will
01:45:37 end can be relatively straightforward. But if you only have 10 or 20% of an
01:45:42 image, predicting what the complete image will show is a lot more difficult, and figuring out what scene in a video will come next is another step change in complexity. So now imagine going beyond video to fully immersive experiences. What will it take for AI to accurately interpret and predict
01:46:03 the kind of interactions that will happen in the metaverse, where people are moving between physical and virtual spaces and creating all kinds of new worlds?
01:46:12 The main way that we have approached this is by working on self-supervised learning, and before SSL, most AI systems were trained with supervised learning.
01:46:25 That means that you feed them lots of labeled data, say 100,000 images of cats, and explicitly tell them, this is a cat, this isn't a cat, until they recognize some pattern.
01:46:35 But Yann LeCun, our chief AI scientist, believed that this wasn't going to be enough to produce systems that can really understand the world. So we made a big bet on self-supervised learning, and the idea here is that you don't teach the AI any specific concepts, you just give it raw data
01:46:55 And ask it to predict the missing parts and it will learn abstract representations along the way.
01:47:01 And this actually seems closer to how the brain learns. For example, you don't need to show a kid thousands of pictures of a cat for them to understand what a cat is. Now this has become the primary method of training AI to understand natural text, and it is now achieving state-of-the-art results for images, speech and other types of data too.
01:47:22 In fact, self supervised learning now outperforms many other existing methods for images and video, even models that rely on millions of labels.
01:47:31 which is a huge step
01:47:32 forward. And while self-supervised learning is still developing, we think that it's going to be an important tool for the metaverse, because the complexity and diversity of the environments that people will experience in AR and VR will be too great to be captured with labeled data sets. Traditional computer vision techniques also aren't going to be enough to support
01:47:54 that real sense of presence and interaction that will define
01:47:57 the metaverse. So to help advance the state-of-the-art in systems that can see and understand the world like we do, we recently brought together a global consortium of 13 universities and labs to work on Ego4D, the largest-ever egocentric data set, with thousands of hours of first-person
01:48:17 Video and benchmark tasks that everyone can use to research and build. So if you're working in this space, I highly encourage you to check this out.
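The "give it raw data and ask it to predict the missing parts" idea from the speech can be sketched at toy scale: from unlabeled text, the training pairs manufacture themselves by hiding a word and using its neighbor as the target. The tiny corpus and the bigram-counting "model" are stand-ins for illustration; real SSL systems mask pieces of images, video, or long text and train neural networks to fill them in.

```python
# Toy sketch of self-supervision: no human labels anywhere. The raw data
# labels itself -- each (previous word, hidden word) pair is a free training
# example. Corpus below is invented for illustration.

from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ate the fish".split()

# Build the self-generated training signal: what tends to follow each word?
following = defaultdict(Counter)
for prev, hidden in zip(corpus, corpus[1:]):
    following[prev][hidden] += 1

def fill_blank(prev_word):
    """Guess the masked word from its left context."""
    guesses = following.get(prev_word)
    return guesses.most_common(1)[0][0] if guesses else None

print(fill_blank("the"))  # → cat (its most frequent continuation)
```

Scaled up by many orders of magnitude, this is the same shape of objective that lets a model learn "abstract representations along the way" without anyone typing labels.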
Speaker 8
01:48:29 So basically.
01:48:32 It's almost like they're trying to make Skynet and the matrix at the same time.
01:48:39 Or like a Skynet matrix.
Speaker 5
01:48:43 So what basically.
Speaker 8
01:48:45 What they're talking about now with the self-supervised learning, I mean, it's in the name, right? It's self-supervised. They're going to get to a point where it will get too complex for them to understand.
01:49:02 They are almost guaranteeing a scenario where their creation will outgrow them, and they won't have a good understanding of it, because the way that it's growing is self-supervised. Right now it's just limited to understanding objects, right? So, for example, this is, you know,
01:49:21 the supervised version, I guess you could say, where they feed it, you know, pictures of dogs and say, this is a dog,
01:49:30 and so, when asked there, you know, to find the dog in the footage, that's the first version, the top one there. But DINO, the self-supervised version, the version where they weren't telling it what a dog was or anything like that, the bottom image there,
01:49:51 Is doing a far better job and is recognizing.
01:49:55 Not just the the shape, the outline of the dog, but also the depth information is a lot more accurate.
01:50:03 And so it's it's.
01:50:06 Like I said, it's almost creating a scenario where because don't think that this won't also apply to when like they're going to get to a point right where they're saying, wow, it really is complex to write the new code for the AI, why wouldn't?
01:50:20 I just have.
01:50:20 The AI write the code for itself.
01:50:26 Right. Why wouldn't I?
01:50:29 The AI is going to become a.
01:50:31 Much... I mean, it already is. I've used ChatGPT to code scripts for Cinema 4D and stuff.
01:50:40 And it works.
01:50:42 It works amazingly well. I just tell it write me a script that does this and it just does it and and and it. And if you're an idiot, it doesn't even just do it. It does it and it tells you where to copy and paste it and everything.
01:50:56 So why wouldn't you use the AI that you're designing to?
01:51:02 Further, the the capabilities of that AI.
01:51:07 Because the AI is going.
01:51:08 To know better than anyone.
01:51:11 What capabilities need to be improved, what the limitations are, and where the bugs are. I mean, it's going to be able to understand better than anyone how to alter the code without creating more bugs.
01:51:29 So, like I said, it's like they're trying to make Skynet and the Matrix at the same time.
01:51:38 And uh, Facebook or I guess meta or whatever you want to call it. They are also working on the same kind of projects that we covered earlier with the.
01:51:48 Image recognition stuff.
01:51:51 Now here, this is real
01:51:53 time, this is real time. It has nothing to do with MRI scanners. It's not invasive. It's essentially using some kind of wearable technology where they're showing people pictures, and in real time the wearable technology is reporting the data that it's getting from the brain
01:52:15 To an AI and the AI is predicting what you're actually looking at.
01:52:20 And so on the left-hand side, you have the actual image that is shown for one second,
01:52:27 for one second, to the human wearing the wearable technology.
01:52:33 And the image on the right-hand side is the AI's prediction of what it is they're looking at. And as you can see, because it's real time, this could be applied to video.
01:52:45 You know.
01:52:46 You could be walking around, and the AI could be predicting essentially what you're looking at. And you gotta remember, this is going to get better. The amount of data that they're scraping from human brains right now is pretty rudimentary.
01:53:03 Because that's going to be, that's the hang-up right now,
01:53:07 that's the hurdle right now. It's not that their AI can't perform the necessary calculations to predict what it is you're looking at just by looking at your brain waves or whatever brain activity is going on. It's a limitation of the ability to harvest that brain activity
01:53:27 and that data in the first place.
01:53:29 And I guess to some extent, I would imagine whatever computer is having to do this.
01:53:36 It's probably, you know, it's not small, it's not like a phone. The computer that's actually doing the computations, that's not happening locally, right? But it doesn't have to happen locally. This is the kind of thing that I was talking about when people started getting worried
01:53:53 About 5G.
01:53:55 You know, they said 5G is dangerous because of the frequency. And look, there is an argument to be made that the interaction between that frequency and water molecules is not always good,
01:54:13 because we're made up of mostly water, and 5G has to transmit at fairly high power to get anywhere, because it's such a high frequency.
01:54:22 So, you know, there's an argument to be made about not wanting to have 5G just for the radiation. And there's been zero safety tests done, you know, zero safety tests done to figure out if there are any
01:54:38 dangerous effects of having that kind of radiation exposure. And that's the problem with this whole industry: there seems to be zero testing at all on what the dangers of any of this stuff are, right? It's the Jurassic Park thing, you know, they were so preoccupied with whether or not they could, they didn't stop to think
01:54:58 About whether or not they should.
01:55:00 And so that's what's going on here. But it's more of a danger that you need that fat
01:55:04 of a pipe
01:55:06 to move that kind of data from your smart fridge, from your Apple Watch, from your brain-scanning earbuds, to a remote server that could then have the processing power necessary to make these kinds of predictions.
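The "fat pipe" point lends itself to a quick back-of-the-envelope number. The figures below (a 64-channel wearable sampling at 1 kHz with 16-bit samples) are assumptions for illustration, not the specs of any real device:

```python
def stream_kbps(channels, sample_rate_hz, bits_per_sample):
    """Raw, uncompressed data rate of a multichannel sensor stream, in kilobits/s."""
    return channels * sample_rate_hz * bits_per_sample / 1000

# Assumed figures -- illustrative only, not measurements from any real earbud.
rate = stream_kbps(channels=64, sample_rate_hz=1000, bits_per_sample=16)
print(f"{rate:.0f} kbps")
```

Even uncompressed, that works out to roughly 1 Mbps per wearer, trivial for a 5G link, which is the point: the pipe already exists.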
01:55:26 Imagine just as a look just based on the technology we reviewed tonight, you can imagine a scenario.
01:55:35 Where you're wearing earbuds.
01:55:38 Listening to music completely unaware.
01:55:41 That this sort of process is even taking place, right? Because it's all happening in the background. You could have earbuds equipped with technology that are able to acquire enough information from your brain to at least come up with, like, a rough representation of what you're looking at.
01:56:02 Like, you know, as you can see in these examples here. And literally, they could see what you're seeing.
01:56:13 They could see what you're seeing. They can hear what you're thinking.
01:56:17 It would type out what you're thinking.
01:56:20 It would know what you what you dreamed about.
01:56:23 It would know.
01:56:26 Really, anything that you read?
01:56:30 Anything that you heard?
01:56:33 And that's that's not too far off.
01:56:36 From the capabilities that they have right now.
01:56:39 You know, that's not like,
01:56:40 oh, in a million years
01:56:42 they're going to have this. No,
01:56:44 that's like now.
01:56:46 That's, well, as I said, that's ten years ago.
01:56:50 It's right now. It's just a matter of them perfecting and miniaturizing the technology that is required to perform the scans of the brain.
Speaker 5
01:57:02 Now the other side of this.
Speaker 8
01:57:05 You know, we've talked about the AI part of this problem.
01:57:09 And we've talked about the technology.
01:57:12 That that is.
01:57:14 Definitely going to be an issue moving forward. And this is why I keep trying to tell people that, on our side, you can't be, like, afraid of science. You can't be anti-science, you can't be the 'I ******* hate science' guy, or else inevitably you will be
01:57:31 a slave to this.
01:57:33 You will have no say whatsoever in where this technology goes, and you will be a slave to it.
01:57:40 You will be.
01:57:42 You and and you probably won't even know it.
01:57:44 You won't even know it.
01:57:47 So this is why it's important to know about this stuff and not be the 'I ******* hate science' guy. Now, the other side of this coin
01:57:55 is the wetware, I guess you could say. We've talked about the hardware and the software, but what about the
01:58:01 wetware?
01:58:03 Well, that technology has been.
01:58:05 Around a long time too.
01:58:08 They have been growing neurons.
01:58:12 That they've harvested from rat brains.
01:58:17 In a laboratory.
01:58:19 And trained the neurons, grown the neurons in Petri dishes.
01:58:24 And got them to perform very simple tasks. This was, I'd say, at least 10 years ago: they got rat brain neurons in Petri dishes, essentially,
01:58:36 to fly an Air Force flight simulator.
Speaker 4
01:58:41 Rats are good at mazes too, but their brains are no match for humans, right? Well, biomedical engineers from the University of Florida found a way to make rat brains do something most people couldn't even dream of:
01:58:56 flying an F-22 fighter jet.
01:59:00 They collected 25,000 neurons from a single rat embryo. They put them in a Petri dish filled with liquid nutrient and watched them sprout nerve fibers. Before long, the tiny cells began to connect and communicate to form a mini brain.
01:59:18 To test its IQ, they wired it up to some electrodes and put it inside an F22 simulator.
01:59:25 The idea was to.
01:59:26 See if the neurons could keep the.
01:59:27 plane in level
01:59:28 flight rather than descend into a tailspin. It sounds like a tough job, but the most basic circuits deep in the human brain
01:59:36 Actually do a similar thing. They keep our limbs from flailing uncontrollably.
01:59:43 Scientists handed over the controls mid flight and amazingly, when the jet pitched too far left, the brain sent back messages that leveled it. There's potential in all neurons.
Speaker 8
01:59:58 Now, the ****** ** thing they
01:59:59 don't tell you about.
02:00:02 I mean, not that that's not already ****** **, right? They basically made some kind of mutant brain organism, because it was a living organism. They harvested neurons from a rat fetus.
02:00:16 And then they grew them in a Petri dish. They had to keep feeding it nutrients, and grow it the same
02:00:22 way it would
02:00:22 have kept developing, until it formed connections.
02:00:26 And then they taught it to fly a flight simulator. What they
02:00:30 don't tell you, the part of that story, because how would it know, right? Like, what do you mean it learned to fly a plane? That's not making sense.
02:00:38 They're oversimplifying it. What they don't tell you is,
02:00:42 They punished the brain.
02:00:45 When it crashes the plane.
02:00:47 And they reward the brain when it doesn't crash the plane.
02:00:52 Because that's the way your brain works, right? You touch a hot
02:00:56 fire and you
02:00:56 know, ooh, that's hot, don't touch that anymore, and your brain learns that was bad.
02:01:02 You know, that's why sex feels good, right? Because
02:01:05 you need to keep doing it to keep making more people.
02:01:09 So the people that make more people are the people for whom it feels good.
02:01:14 Same with eating, right? If you don't eat, you die. So when you eat, it feels good, you're like, yum, yum, yum, that was good. So in order to get this Petri dish rat brain to fly that jet, they had to basically punish it with pain and reward it with pleasure. And they have no idea what that actually
02:01:35 feels like. And it's kind of this weird thing, it's like, I mean, it's alive. It's a living thing. And it didn't just stop at that. There's a company, in fact, I think they're in Australia,
02:01:48 that is producing, they're trying to create biological computers this way.
02:01:56 They're trying to create biological computers that you would be able to interface directly with an AI, so it would be like cyborgs. So they're trying to create
02:02:06 Skynet, the Matrix, and RoboCop,
02:02:11 like, all at the same time. It's like everything that 80s and 90s sci-fi warned us about, they're trying to do all of it.
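The punish-and-reward training loop described a moment ago is, in software terms, ordinary reinforcement learning. A minimal sketch, with made-up action names and a made-up learning rate, of how feedback alone pushes behavior toward "level flight"; this says nothing about the actual stimulation protocol used on the neurons:

```python
import random

def train(steps=2000, seed=0):
    """Tiny reinforcement loop: the 'level' action is rewarded, 'crash' is punished.
    The agent's preference for each action drifts toward the feedback it receives."""
    random.seed(seed)
    value = {"level": 0.0, "crash": 0.0}  # learned preference per action
    for _ in range(steps):
        # Mostly pick the better-valued action, but explore 10% of the time.
        if random.random() > 0.1:
            action = max(value, key=value.get)
        else:
            action = random.choice(list(value))
        reward = 1.0 if action == "level" else -1.0   # pleasure vs. pain signal
        value[action] += 0.1 * (reward - value[action])  # nudge estimate toward feedback
    return value

v = train()
print(v)  # 'level' ends up strongly preferred over 'crash'
```

Swap the hard-coded reward line for an electrode stimulation pattern and you have the gist of what he's describing.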
02:02:19 So there's a company, like I said, I think they're in Australia, that is trying to create these little neuron computers, and they got
02:02:28 it to play Pong.
Speaker 14
02:02:29 We built this
02:02:30 simulation, and we've taken these brain cells.
02:02:33 And then we put them in a special kind of Petri dish, ones that actually have electrodes on them. So I have one that I can show you, looks kind of like this. You can see the lines and traces that go to the center bit. And this is a vat that keeps the system alive by perfusing the same liquids that are sort of in our heads at the moment.
02:02:54 And what we
02:02:55 then do is,
02:02:55 we put these brain cells on this bed of electrodes, essentially give the brain cells the ability to move this paddle up or down. They get to choose: do I wanna go up, do I wanna go down. And we tell them the information of where the ball is, the
02:03:09 model, and through, you know, experimentation and figuring out, you know,
02:03:15 what systems can
02:03:16 we use to sort of push the neurons
02:03:18 to move the paddle up
02:03:20 more often than not,
02:03:21 to hit the ball back more often than statistical chance, we
02:03:27 were able to show, within five minutes,
02:03:28 that they were able to control that
02:03:30 to play the game. So I think this is a massive breakthrough, because before our work it was uncertain if brain cells in a dish were able to perform the same kind of cognitive tasks that we, you and I, assume would happen in our head.
Speaker 8
02:03:49 Enter the happy xylophone music. That's when you know that they're really doing ****** ** ****. So that's what they do: they grow the neurons in these little dishes where they're hooked up to electrodes, and then they have a biological CPU, essentially.
02:04:05 You know they've they've also done one that controls a robot.
02:04:09 Is this the right one? Oh no. This is a different one.
Speaker 26
02:04:27 We have an aging society, particularly in the Western world, and so problems such as Alzheimer's disease, Parkinson's disease, even stroke, are going to be much more prevalent and much more of a problem for people. What we're doing here with this research is trying to understand.
02:04:44 Some of the basic characteristics within a brain.
Speaker 8
02:04:52 Cells in a dish grown to control robots, anyway. And you might think, well, that's, you know, pretty sci-fi, crazy stuff. That's not something that's going to be mainstreamed or anything like that anytime soon. Oh, really? It's actually so easy to do,
02:05:11 and the research is so available,
02:05:14 there's actually YouTubers doing it right now. There's a YouTuber that's ordering rat neurons. I guess you can order rat neurons online now, ordering rat neurons,
02:05:28 and putting them in a Petri dish and growing them to do...
Speaker 16
02:05:32 This inside your head is one of the greatest computers in the universe, which is surprising because I've seen your browser history. And yes, you do need to eat more fiber, but generally the human mind is one of the most capable learning machines.
02:05:46 We know of.
02:05:47 Composed of 86 billion neurons and another 90-odd billion glia and other cells, with wires forming trillions of connections. And while humans do have very impressive brains, even small numbers of neurons are capable of learning complex behaviors. A few hundred can control microscopic animals like tardigrades, a few tens of thousands
02:06:07 can control insects,
02:06:09 Billions and you've got a mind capable of reason, logic, math and art. But what if we could grow the neurons directly attached to a computer from the start? Then we might be able to use them as a tool to control things for us. Living neurons are a lot more capable of learning than the simulated ones we use for AI, at least in theory. So we're trying to do just that.
02:06:28 This weird looking device is a prototype neural multi electrode array. It's got a bunch of microscopic electrodes in it with 10s of thousands of living brain cells.
02:06:38 Growing on top.
02:06:39 And today, we're going to be talking about how we made it and how to grow neurons. But those electrodes aren't just decorative. This array is designed to be connected to a computer, and through that connection, the neurons are being trained to play Doom.
02:06:54 Well, that probably sounds impossible. It's actually anything but: we're building on decades of neuroscience and previous examples of groups using living neurons to control things.
Speaker 8
02:07:06 So a YouTuber can grow his own neurons
02:07:10 and basically duplicate this research that's already been done
02:07:15 to make a biocomputer. He hasn't been successful yet, I mean, he's been successful in growing the neurons and stuff like that, but eventually they hope to get it to play Doom.
02:07:32 There you go.
Speaker 5
02:07:37 Oh boy, this is the world we live in, guys.
Speaker 8
02:07:49 We're going to have the,
02:07:52 Skynet, RoboCop, and the Matrix,
02:07:59 all at the same time.
02:08:01 All at the same time, the trifecta.
02:08:04 So anyway, that's your. That's your black pill.
02:08:09 That's your black pill for the evening.
02:08:12 That this sort of technology is moving rapidly, moving rapidly, and this is the kind of technology that will be used to, well, and because of the kinds of people, I can't stress enough, this is why we need people that, yeah, it's great that we do homesteads and homeschool and all this other stuff, but you
02:08:33 can't ignore the direction and the speed at which technology is moving, and you can't be one of these ******* that are like, oh, it's all fake, everything's fake, everything I don't understand is fake. Look,
02:08:47 We don't need you. We don't need you. You're clearly not capable of understanding this stuff. So you're not going to be able to help out with the problem. That's fine.
02:08:55 That's fine. But for those of you who do understand this, we need those John Connors to not just be good at, like, you know, cleaning an AR-15 or field-stripping a gun or something, you know, like growing vegetables or whatever. This is the kind of technology, whether we like it or not, that's going to keep moving forward. And the more that
02:09:15 the people on the right shun technology and science, the less you're gonna have a say in where the direction
02:09:22 Of this stuff goes.
02:09:24 And eventually, like I said, you will be enslaved. Maybe not. You know, maybe I'm being a little little hyperbolic when I talk about Skynet and the matrix and stuff like that.
02:09:34 But some form of that will most likely exist, and it's on the horizon, if you look at the advancements in technology. Even just look in the past. I mean, there's people today that were born before, I mean, you know, not just before the Internet, but before really any kind of computer.
02:09:55 And you have people that were born that didn't even have a landline phone in their house.
02:10:02 And now you've got a cell phone.
02:10:04 You have people that were born where it wasn't unusual to have to have a horse and buggy because there weren't a lot of cars.
02:10:14 You know, and look at what we're
02:10:16 doing today.
02:10:17 So this kind of thing, it's the same thing. It might not be a huge threat to you.
02:10:23 Your kids might even only see the beginnings of this. But this is absolutely something that, well, I don't know, maybe you're seeing the beginnings of this, I guess you could argue.
02:10:35 But absolutely, this is something that, you know, should World War 3 not happen, or
02:10:41 even if it
02:10:41 does. I refuse to believe that these people are all going to be, if there was some cataclysmic event, the fantasy,
02:10:52 the fantasy of the zombie apocalypse, the fantasy that, you know,
02:11:00 preppers think about, that it's going to be this world-ending cataclysmic event that's going to send us back to
02:11:06 the Stone Age.
02:11:08 The likelihood of that is almost 0.
02:11:12 It's almost 0.
02:11:15 In any kind of lasting way.
02:11:18 Sure, there could be EMPs and whatever, and there can be localized disasters that wipe out technology and whatever. And I'm not saying you shouldn't prepare for that. You should, and I do, to some extent.
02:11:31 But to think that there's going to be some kind of, you know, zombie apocalypse where we're knocked back to just fighting with, like, swords and bows and arrows and **** like that, that there's gonna be some kind of, like, rage virus or whatever, that's all stupid, imaginary ****.
02:11:48 And it comes from, I think, the anxiety that humans feel, knowing that this is what's coming.
02:11:55 They want to have a a future that and it's a little telling, isn't it?
02:12:00 That humans.
02:12:02 Many humans, I think.
02:12:05 When they look at the future, a future where they're having to fight off hordes of zombies with with machetes is preferable to this kind of a future that we're most likely headed for.
02:12:20 And pretending that it's all made up because you don't understand it, you know,
02:12:25 again, like I said, it really doesn't matter what you think, because you're not going to be anyone that is part of the conversation anyway.
02:12:32 This stuff is coming, some of it's already here.
02:12:36 And we need to be aware of it.
02:12:39 Now that that said, we can't be hysterical and think that, like uh.
02:12:43 The answer to that is to completely separate from technology because.
02:12:48 I don't think that matters.
02:12:51 Not only does that not matter,
02:12:54 You know you will be at the mercy of what the ruling class decides to implement.
02:12:59 You can try to.
02:13:03 Quarantine yourself and your family from maybe some of the more invasive.
02:13:10 Uses of this technology but.
02:13:13 Not being able to understand the technology is not going to be useful when, eventually, you'll be faced with scenarios where you have to.
02:13:26 So that that's uh.
02:13:29 That's the black pill for this evening.
Speaker 14
02:13:36 Ohh honey.
Speaker 8
02:13:38 Honey, isn't it funny? We were talking about refrigerators in the car and now Facebook is showing me refrigerator ads.
Speaker 5
02:13:48 Oh, that's that's spooky.
Speaker 8
02:13:51 How quaint.
02:13:54 How quaint.
02:13:57 And look how long look how long.
02:13:58 It took right.
02:14:00 They went from that being like weird and spooky to now. Everyone just kind of expects that.
02:14:06 And the same thing is going to happen when you got to think of it this way.
02:14:12 Non-invasive brain scanning doesn't necessarily mean it has to be wearable in the future.
02:14:20 Probably in the immediate future, right, it'll be something that will need some kind of proximity. But don't think, I mean, if I'm one of the ruling-class beekeepers, if you will,
02:14:33 and I'm looking at this technology and the way it's going, the first thing I want is long-distance scans. I'm going to want to be able to scan people from across the room.
02:14:43 Well, ideally from space or something, but, like, you know, I'll take what I can get.
02:14:49 This isn't going to be the end of it. And you think it was bad with,
Speaker 26
02:14:53 You know.
Speaker 8
02:14:54 The Cambridge Analytica stuff.
02:14:57 That Peter Thiel was using with Trump.
02:15:00 Where the reason why Trump sounded so based back in 2016 was they were essentially brain ******, you know, boomers and other conservatives on Facebook.
02:15:14 Looking at the kinds of things they wanted to hear and they were just.
02:15:17 Spitting it back at them.
02:15:19 They knew what they wanted to hear because they were analyzing these big data sets and just telling Trump to regurgitate what people wanted to hear.
02:15:29 That's not one of those, oh, the Russians, you know, Russiagate, left-wing conspiracy theories. That's real.
02:15:40 And that was just using Facebook posts.
02:15:44 That was just using the kinds of data that Facebook gives you access to, if you have, like, a Facebook app or whatever, right?
02:15:53 It's not very much information. It's not as granular or even as
02:16:00 real-time as something like what we're talking about tonight.
02:16:06 And you could very easily, I mean, if you're able to tell people what they want to hear without them having even voiced it, you know,
02:16:19 in real time, if you can have real-time brain scans of all the people listening to a speech you're giving,
02:16:29 or maybe not even a speech you're giving, right? The speech that an AI is giving.
02:16:34 An AI can do it instantaneously. It'll know, before it even finishes the joke, that it's not going to land.
02:16:49 So this is the kind of stuff that we need to be.
02:16:51 aware of, and we need to,
02:16:54 You know, just be ready for.
02:16:57 Because it's, it's inevitable. It's not. It's not something that you can just wish away. You know you can.
02:17:03 You can go off grid all day long and eventually this.
02:17:06 Stuff's going to come to you.
02:17:09 Anyway, let's take a look at Hyper chats.
02:17:13 On Odyssey now. I installed the thing on Rumble, we have quite a few people watching over on Rumble.
02:17:22 I was told to install an add on to the browser that would capture Hyper chats.
02:17:28 I'm assuming that it's working, but it hasn't captured any, or I guess 'Rumble rants' is what they call them, right? I'm assuming that it's working, but it hasn't captured any rants, so I'm assuming that means no one
02:17:42 sent them. If you have sent them, then, this is a work in progress, we're going slow, we're slowly trying to work out the kinks.
02:17:49 But the browser add-on that is supposed to capture those and hang on to
02:17:54 them is reporting that there's none there, so I'm just going to assume that there's none there.
02:18:00 So let's go to Odyssey here.
Speaker
02:18:06 Take a look here.
Speaker 8
02:18:10 Graham playing games: I think your idea about a 'beekeeper reviews The Beekeeper' video is a good idea to put on YouTube. That kind of title is a current trend. It could get normies to watch it. It could also help direct people who think you just disappeared from YouTube to your streams.
02:18:29 And I think I'll do that as soon as I can get a copy of that movie. What he's talking about is, I mentioned on a previous stream, there's
02:18:38 that movie, The Beekeeper.
02:18:40 And it's just a movie that's super anti-white, and probably in terms of how they do the beekeeping, it's probably wrong too, just because Hollywood always gets everything technical wrong.
02:18:52 And so a good way, and you're right, especially on YouTube Shorts, right, there's all kinds of beekeeper stuff. And if I were to do a YouTube Short that was just called 'A Beekeeper's Review of The Beekeeper,' especially once the movie's being pushed on Netflix or whatever else,
02:19:11 that would get in the algorithm. It would seem, you know, neutral, it wouldn't seem scary. And then I could start off criticizing the technical things they did wrong on the beekeeping, and then point out everything that's anti-white in the movie, which is pretty much the entire movie, or at least from what I saw in the
02:19:31 trailer. So the second I can get a copy of that movie, I'll probably do that. It'll be a quick and easy one, and
02:19:36 Maybe even on TikTok, right? Maybe all those people that were big.
02:19:40 Into bin Laden's.
02:19:42 Supposed letter.
02:19:44 would maybe get some of their gears turning. We can only hope, right?
02:19:52 Maratta it ain't cool being no jive Turkey so close to Thanksgiving.
02:19:59 I don't know, does that, does that, I don't know if that even
02:20:01 deserves a womp.
02:20:05 I gotta find it, whatever.
02:20:12 There's your womp. There's your womp, Maratta.
02:20:16 John Jacob Jingleheimer Schmidt.
02:20:19 Everybody hit the fire button. That's right. Everyone hit the fire button and share this. I mean, it's a little late now to share it, but maybe it's not. The replay will be up on rumble. Hit the the thumbs up.
02:20:31 And yeah, let's get up in the algorithm there.
02:20:35 Brody: When you take a long driving trip, what do you do before you leave to make sure Classified Cat and Churro survive your absence?
02:20:46 Luckily, there's locals that I know that are willing. I watch their place when they're gone, and they watch my
02:20:56 place when I'm gone.
02:20:58 And so that's always good. But I've got CCTV and stuff, you know, I got things I can do to, you know, guarantee the safety of the
02:21:10 pillbox. I also have auto feeders, but the auto feeders don't work so great, so that's why I like to have someone actually physically check up on the place and make sure everything's OK. I mean, you never know, right? A weird storm could knock a big hole in the
02:21:26 roof. So you always got to worry about that. But I am back in town.
02:21:30 I am back at the pillbox for
02:21:33 pretty much, I think, the foreseeable future, at least for, like, the next little while. I just had to do some traveling,
02:21:42 that time of year, holiday season, go see some family. And Thanksgiving, what, Thursday, right? Isn't Thursday Thanksgiving?
02:21:49 So that's coming up, have to do a turkey stream. Maybe we can do a Thanksgiving stream where we talk about some spooky stuff, like maybe the missing colony. Remember, there's that island, I forget what it's called. Maybe, I don't know, something like that could be fun. Talk about
02:22:10 the first interactions the white man had with the Injuns here in North America. It could be fun.
02:22:21 Uh. Let's see here.
02:22:25 Zazi metas bot: Thanks for the show. Manatees have been taken off the endangered list. When can we start hunting them? Seriously, **** manatees. What exotic meat would you like to try if you could have a chef properly prepare it for you? I had beaver sausage once,
02:22:44 a sausage made of beavers farmed for fur.
02:22:49 I don't know. I feel like beaver would just be like a rodent, right? Isn't it just like a big ******* rodent? I don't know if I'd want something like that. I'm pretty satisfied
02:22:58 with the amount
02:22:59 of meats that I have access to. I don't really have anything where I think, oh yeah, if only I
02:23:05 could have
02:23:05 that. I mean,
02:23:06 maybe I don't know what I'm missing out on. Maybe elephant is delicious,
02:23:09 or something, I don't know. But I'm down with just my cows, my cows. Maybe some kind of,
02:23:19 maybe hippo is good. I bet hippo is probably pretty good. Maybe I would try a hippo steak.
02:23:24 I bet Hippo Bacon is probably really good.
02:23:27 And who knows, maybe Manatee Bacon's good too.
02:23:31 The logical extremist: I pimped Defiance in a federal suit, par. 85. Thought it was funny, you're immortalized in Florida now. Well, that's pretty cool. I'm part of a federal lawsuit, or at least Defiance is.
02:23:48 Yeah, I'll have to
02:23:48 check that out.
02:23:50 Appreciate that logical extremist.
02:23:53 Uh, Suicidal White Male has some, uh,
02:23:57 poetry I can't read on Rumble.
02:24:04 Borderline on fedposting
02:24:05 there. Actually, I don't know, you might be a little bit
02:24:08 beyond borderline on that one.
02:24:12 Yeah, yeah, you might want to simmer it down
02:24:14 there a little bit.
02:24:17 Starting to get into, get into
Speaker 5
02:24:21 Also, he stood up.
Speaker 8
02:24:22 Territory there, you know.
02:24:28 Appreciate that anyway.
02:24:31 Riddle of Steel: Hey Devin, last week I sent you the American History X deepfake for flat-earthers. This week I made you a 25-second super chat bumper. Hope you like it. All right, let's take a look at
02:24:44 it. Let's see here.
02:24:51 Oh, you know what? It might just download.
02:24:53 That'd be convenient if it would.
02:24:59 Hopefully it doesn't kill the connection odyssey sometimes will.
02:25:03 Will rape my connection for some reason when I go to things from Odyssey?
02:25:13 Well, it's still downloading, downloading really slow.
02:25:17 All right. Well, I'm going to let that maybe download. We'll check on it here in a minute.
02:25:25 OK.
Speaker 26
02:25:28 UM.
Speaker 8
02:25:32 Homeland: Devin, have you heard of the five red heifers that were covered in Texas and sent to Israel to be sacrificed after the Third Temple is built? There is a ranch in Texas that clones polo horses, and it's very possible these heifers are cloned, too. I didn't hear about that specific case, but I've heard that they've been genetically engineering,
02:25:53 trying, you know, trying to make a red heifer for the, you know, the end-of-the-world Antichrist stuff.
02:26:00 Not surprised that it's coming from Texas. I I can imagine they kind of evangelical family that would think that was a.
02:26:06 Good idea. Hey, let's.
02:26:08 Let's let's try to usher in the Antichrist. That's that's a good idea. Yeah, but yeah, we don't have to worry about it until they blow up the Alaska Mosque mosque. I I feel like that. That could be.
02:26:20 Right around the corner.
02:26:21 But that has to go before they
02:26:23 can, they can
02:26:26 do it, or until they can fulfill that that part of their prophecy. So.
02:26:30 You know, I think they're they're kind of putting the, so, the cart before the horse, I guess, the.
02:26:38 The the bull before the temple.
02:26:41 Hammer of Thorazine. Hammer of Thorazine.
02:26:56 Hammer of Thorazine. Have you taken any measures to harden the pillbox against EMPs? Just curious what ideas you might have. So far I've only done EMP shields on my car and have some hard drives in Faraday cages. You know, there's an easy way of doing it. And again, I think it's a pretty remote thing, that possibility. But you know, it's not not a bad idea.
02:27:17 If you're if you live near an area that you think could be a nuclear target, then I guess it's not as remote.
02:27:23 Really the easiest way and the cheapest way to do stuff like that is just it just has to be insulated. You know, shield it with metal. So you could literally get those.
02:27:35 Like those little metal trash cans that have the metal tops that fit snugly over the top, or even like a you know, if you if, whatever you're going to put in there is going to be small, it can even be something like, you know, around Christmas.
02:27:47 Time they've got those big metal tubs of like popcorn with, like the metal.
02:27:54 Tops or whatever. Maybe that's not the thickest in the world, but if you doubled up on something like that, uhm.
02:28:02 You can get those 50 gallon drums, those metal 50 gallon drums if you want to go crazy and put a bunch of tech in there, but that's that's the big thing is just you want to have it insulated on the inside and shield it on the outside with with metal and you don't have to go to go too much with it. I've heard people buying those.
02:28:22 EMP shield things on cars. My understanding, I don't know, I haven't looked at, I haven't taken one of those apart, but.
02:28:31 My understanding is that that that's that's not going to help you.
02:28:35 My understanding is that that's not going to help you because, if it's what I think you're talking about, it's not shielding the components that are actually going to get fried, because the the entire wiring of your car will become an antenna at that point. And I guess even the the car itself will become an antenna, too.
02:28:54 So all that electromagnetic energy will go to everything that's conductive, and a little box that's hooked up to your battery, again, I don't know if that's what you got, that's not really going to do much of of anything. If anything.
02:29:09 Maybe in small cases, but if it's big enough to where it's going to knock out all the the, you know, the electronics in your car, that's not going to help. If you want to go crazy with it, you just get a car that's prior to fuel injection. You know, just get a car that doesn't have computers, get a carbureted car and.
02:29:29 That's all mechanical, and then you know, then you're good. Nothing will. Nothing's gonna blow that up.
02:29:35 UM.
02:29:38 As far as my other stuff, like in my ham shack, now all that stuff will get fried immediately, because my antenna will just channel it. It will be like my antenna getting struck by lightning, right? It'll just blast everything in my ham shack to bits, but I'll have a, you know, I'll have a trash can full of stuff that I can open up and.
02:29:56 And hook up after the after the EMP, that'll.
02:30:00 Work just fine.
02:30:02 Suicidal white male. I had a strange dream. Two men stood at the counter, one leaning the other behind 1/2 door. My words provoked them. The man behind the door approached. He was twice my size. I put my hands up.
02:30:17 And said whoa, whoa.
02:30:18 Whoa, he said. There's no Gray. There's only black and white.
02:30:23 Read, read Day of the Rope.
02:30:25 Well, there you go. You've got uh.
02:30:28 You've got people promoting my book in your dreams. Yesterday I went to the doctor. He race-mixed with a Mexican. He is a Christian man, is kind and gracious to me. Today I met.
02:30:43 With an elder.
02:30:44 Who has shown me great kindness, though he is a Christian Zionist.
02:30:48 And has dedicated his life's work to building the industry.
02:30:53 Of Muslim nations. I am in a communist shithole, a satanic town whose logo is Lucifer and whose churches call themselves light bearers. Lucifer is the light bringer, and I am in enemy territory and have been in a pit of despair, unable to speak from.
02:31:13 The heart. Am I a race traitor by association? It's time to pick a side. No, I I think you're overthinking things. I think you need to. Maybe. Maybe you need to go on a camping trip.
02:31:27 You know, just go get away from it all. Get away from everything and just relax for a couple of days and and just don't overthink it. If you have people that are in your life that are, that are good, that have non white partners, that's not the end of the world. You know, we got to pick our battles and.
02:31:47 Look, if you have the option, I guess if you have the option to go to a a a pro white doctor and you know as much as you can, right, like not just doctor, but I guess as much as you can practice nepotism.
02:32:02 I I've always said that, but you also got to be reasonable with this stuff and and you know, just don't overthink things and don't take it too seriously. A lot of this stuff is is.
02:32:16 You know the.
02:32:17 Thing about the black pill is not despairing about our situation.
02:32:22 It's knowing that it's bad and and being over it, you know and just realizing that that, OK, it sucks, but that doesn't mean my life has to.
02:32:32 You know, it means stop spinning your wheels and trying to think that if you vote harder next time that you're going to get the, you know, the the, the Savior that's going to come and save you. It's not going to happen. Your destiny is your own. And it's more about just being aware of all the variables that you're going to have to deal with in your life.
02:32:52 And then so you have a more informed decision when you make, you know, important decisions that are going to affect you for the rest of your life. And I don't think that.
02:33:05 And if you, if there's a nice guy doctor who married a Mexican, I don't, you know. And unless there's, like, a better option, that that's a reason to.
02:33:14 To to stop going to that doctor.
02:33:18 But yeah, just take a break, man. Take a break.
02:33:21 Take a break from it all and read a book. Read a positive book.
02:33:26 In the forest. I guess this time of year is probably not the best time to do that, unless you're in the Southern.
02:33:33 Hemisphere, then maybe it's getting warmer. But I would say I would say just, you know, maybe maybe chill out, relax, listen to to some non-Jewish-written Christmas songs. It'll be it'll be tough to do. You might have to do some research.
02:33:50 And you know, just chill.
02:33:52 Out just to, you know, try to try to not take take it all so seriously.
02:33:58 I mean, again, not that it's not serious, but don't let it. Don't let it bring you down, man.
02:34:06 Jay Ray, 1981.
02:34:17 Jay Ray 1981. Is there no live comment section on Rumble? No, there is.
02:34:24 I'm looking at it. Oh, and there's there's actually, now there's hyper chats, or rants, I guess they're called.
02:34:29 There's rants over on rumble now it is working, the software is.
02:34:33 Working alright, cool.
02:34:35 Yeah. Now there's there's a live chat over there.
02:34:39 I've just, I mean I I think it's normal. I think it's the same as anything else.
02:34:45 Thanks Deb. I hypocrite has a good show called the Daily Cope. Show him love coppers or coppers. The guy is Devin Junior on black pill philosophy. Yeah, I've been on his show before. I think I was a little much for his show. I I and I was trying to be nice because I knew we were on YouTube. But I think I was still a little much.
02:35:04 For a show.
02:35:07 Sheamus, Sheamus, Gorenberg just found out baby #8 is on the way. Wow, that's impressive.
02:35:16 Thank you for all that you do, Devin. Well, congratulations on baby #8 that's.
02:35:24 That's that's that's quite impressive.
Speaker 2
02:35:28 YouTube baby back the bus.
Speaker 8
02:35:30 There you go.
02:35:31 Spicy white, spicy white.
02:35:41 Spicy white. When does pro-white economic infrastructure manifest?
02:35:47 That's the.
02:35:48 Complicated question. When does pro-white economic infrastructure manifest? Slowly. I mean, I guess, when, I mean, you could say right now.
02:35:58 Right. Right now you could say that in every like I was just talking to suicidal white guy. You have to pick your battles. But when you have the option to choose people that are like minded and and it's difficult to to know who those people are, I think a better.
02:36:17 A better way to ease into it.
02:36:20 Is with the people that you detect are anti-white, and that's easy to detect because they're not shy about it, right? You you definitely cease business with them.
02:36:30 You cease business with them and you favor your own kind. It's it's exactly what Jews have been doing for centuries. It's what's given, part of what's given them an edge.
02:36:42 And it's something that, quite frankly, it's not just Jews. Blacks do it too. I mean, for ***** sake, I mean, if you go, you go, I mean, any website now, well, especially during the, remember the, during the George Floyd stuff, any online store for anything or any service, it was.
02:37:00 Ohh click here to get the you know black-owned businesses or.
02:37:04 You know, you go to like audible or something. That's like, oh, you want, you know, here's some black authors or whatever, I mean. So every other race does it, you know, there's a Hispanic Chamber of Commerce. Every other race does it.
02:37:16 And so you're losing if you're not.
02:37:19 And look, if we had access to more money and and whatnot, I would say, yeah, I mean we should start like a white Chamber of Commerce. I don't even know how you go about doing something like that. But I absolutely think that that's probably not too far off in the future. I think that oddly.
02:37:39 As whites become more of a minority in their countries, they are becoming more racially conscious. Look, if you want a white pill.
Speaker 26
02:37:47 I mean we.
Speaker 8
02:37:48 Kind of started off with one, right?
02:37:50 It's that, little by little, criticizing Jews is weirdly becoming fashionable again, for maybe all the wrong reasons, but it's still becoming fashionable.
02:38:02 And little by little, as a result of Jews trying to get boomers to hate brown people again, just the right kind of brown people.
02:38:12 Even being racially conscious as a white person is, you know, little bit, not as much. Not as much, but a little bit on the menu these days.
02:38:23 And so I think that the these options will present themselves and these opportunities will will happen where you're going to have the ability to have.
02:38:35 Explicitly pro-white groups, where the friction that you would have been met with even just recently will not exist to the same degree that that it existed. You know, 10 years ago it would be impossible, right? I mean, just just the the amount of.
02:38:56 The way that I mean that again it's for the wrong reasons, but the way that the way that you even see some of these mainstream people, they're actually using the phrase anti white you would, I mean, when Trump was president, none of these people would have said.
02:39:11 So it's it's coming, and it's all up to us to do it, and it's just going to be a matter of getting in-group preference burned into the skulls of white people, which I think unfortunately will require some kind of persecution event. I think that's just what's going to happen. And this is just the beginnings of it. Like, really, I mean, I I I see people that are worried about it now.
02:39:30 But it's a little it's too little, too late in terms of trying to turn the boat around like you're not going to be able to do.
02:39:36 I mean, maybe in some European countries, they'll be able to, I don't know that they have the political will to to make it happen. I don't know that you're actually gonna be able to do mass deportations or something like that. But in a country like America, I mean, it's already done that the it's baked into the cake, the, the the trends are the trends. I mean, we're not going to what's going to happen, right.
02:39:57 That we don't, we already don't have the political power to. What would be your your your choices? It would be somehow getting white people, like, promoting white birth rates on, like, on an official level, right? Like maybe, like, tax breaks for white Americans.
02:40:16 That had kids. You're not able to do that or anything like that. Nothing that would be official or or run by any kind of institutions, right? You also won't be able to do any kind of mass deportations or anything like that in America. I think that's, it's a done deal. It's a done deal. I think Europe is probably facing the same reality.
02:40:36 But you know, they might have a better shot at it. They might have a better shot at it, cause it's a it's a newer problem and so it's.
02:40:46 It's it's still these, these communities of outsiders are still to some degree more segregated. But then again, I mean, you know, look at the look at the Mayor of London, you know, so not really and not really everywhere. Some of the places in Europe are are kind of experiencing, you know, the same kind of problem.
02:41:08 Maybe a little bit lagged, but it's still pretty pretty severe.
02:41:12 Uh, my fat little ******** toe, my fat little ******** toe.
02:41:28 My fat little ******** toe. My 2.0 account is up on rumble and subbed. Will you be doing any extra for paid subs? Not that it matters, what you are doing isn't about the money, nor should it
02:41:42 be.
02:41:43 Oh, you mean like I I don't even understand the whole rumble thing. It's kind of like, I guess.
02:41:47 Being like a member.
02:41:48 On YouTube maybe I've I've thought about it. I've thought about maybe doing the show normally the way we've always done it, and then maybe do like an extra show every so often, but I'm not going to. I don't think I'm going to.
02:42:02 Lower the amount of of normal shows that I do, but maybe? Well, that's that's definitely something worth looking at. I don't even understand the structure. I don't know how the membership thing works, so I'll have to look into how rumble has that. And honestly, I don't know that rumble is even a safe place for us to be long term.
02:42:21 I kind of feel like that there's a there's a good possibility that I'm there until they notice that I'm there.
02:42:29 You know what I mean?
02:42:30 But we'll see. We'll find out. I don't know.
02:42:33 I don't know. Uh, Arminius' revenge. One downside to rumble is it doesn't allow you to download videos, as far as I've seen. It's been really valuable to download them for external storage, or even just for travel. I even downloaded some to help explain the world to my kid, if anything were to happen to me.
02:42:55 Anyway, thanks for everything.
02:42:58 I think there's, when I was looking for the the add-on for the browser to capture the.
02:43:05 The rumble rants, I think I saw one that was a Rumble downloader. You can you can probably get an add-on, or or use JDownloader or something like that, to download them. You don't necessarily have to have the option on the browser.
02:43:19 For that to work.
02:43:21 Veruca salt. Hi, Devin. What about doing a deep dive into Louis Brandeis? Brandeis, Brand-ay, Brand-oo, I don't know how you'd say that. Some kind of Frenchy name. The first Jewish Supreme Court justice. He was a member of a secret society and a Zionist agent who used his position in the US government.
02:43:40 To serve Israel like they all do, it was Jewish shenanigans that got us or that got the US into World War One as well as World War Two. Well, I'm not aware of this guy.
02:43:51 So I will.
02:43:55 And we'll add him to the list.
02:44:00 But I'm not surprised at all. That does seem to be a pattern of behavior.
02:44:06 Riddle of steel.
02:44:08 Remember Coast to Coast having that theme song by some fairy named UFO Phil?
02:44:14 That was all about George Noory on the show. As bad as the song was, it was super catchy, and George played it every episode. It gave his viewership a good bump.
02:44:25 Well, my friend, it's time for you to have your own theme song, but one that's not gay. I made one for you. Waiting till next stream to unveil it in case there's more finishing touches I need to put on it. Well, that might be interesting to check out.
02:44:40 Yeah, like I, I look forward to seeing that. I don't know that I know the song you're talking about, because I was never really a big fan of George Noory once he kind of took over the show.
02:44:50 So I was kind of just like, uh, you know, I'd listen to occasionally, but part of it, too was just my job changed so much and the Internet started to be a thing. So I was more listening to podcasts and and stuff like that. By the time that he was taking over the show.
02:45:07 Uh. Let's see here. Bobby Lee's wager. Wait, bin Laden isn't a total ******? Oy vey, shut it down. Yeah, I look, that's something that.
02:45:20 I mean, he was a CIA asset. He really was. But he also might have really hated America and really said this stuff. I don't know. It's hard to know like that whole, that whole aspect of 9/11 everything is is kind of muddy. I I, I we all obviously we all know Israel was.
02:45:40 Very involved in 911 to the degree that I I believe that they were involved in bringing down towers, you know, tower seven or building #7 and both the towers they might have even been involved in remotely flying the plane into the Pentagon.
02:45:58 I think that.
02:46:01 Were there, were there hijackers, I mean.
02:46:04 I don't know. Probably, there probably were guys that, by all accounts, there were people that went to flight school and and did a ****** job flying small planes. And you know that there there were people in the country. Were they on those planes.
02:46:24 When they crashed into the buildings? Yeah, man.
02:46:28 Was was bin Laden under the impression that he was? He was doing some kind of mastermind of a of a plan that the is, you know, that the Mossad was aware of and took advantage of? And who knows, right? I don't know all the INS.
02:46:41 And outs of that. I do know that there were the dancing Israelis, so at the very least they knew it was going to happen.
02:46:47 I do know that there was a van full of Israelis that, were they, they were arrested on that bridge and that the explosive dogs pinged it and we don't know any more about that other than the the police report. Right. So we don't even know what what the FBI did with those guys.
02:47:04 We also know that the Israelis had all those spies in the country they were posing as art students. We know that those art students took pictures of themselves in the towers.
02:47:17 Prior to them coming down. We know that the Jews got that alarm on that messaging app, telling them not to go to that, that part of town, before any of the attacks happened.
02:47:32 And we know that, at the very least, Mossad and many Jews were aware of the attacks, and and to what degree they were involved in actually engineering them and making them happen, that's a Gray area. So when you when you hear that, I mean, look, bin Laden, when they raided.
02:47:52 His, his compound. And that's another thing, right? They they buried him in in the ocean.
02:47:57 And say, you know, they they went and raided bin Laden, and and Obama came out and said, oh, we got him, but we threw his body in the ocean because Muslim. And it doesn't make any sense. You know, Muslims, don't you think a a desert, a nomadic desert religion, is going to have ceremonial burials in the ocean? Like that doesn't ******* make any sense.
02:48:17 Right. And so that there's some shenanigans going on there and then like the helicopter full of the the Navy seals that did it, like crashed and killed a bunch of those guys, right? Like so there's.
02:48:30 I don't think we'll ever know the full picture. And so when I see a letter that was written by Bin Laden, supposedly I, you know, I don't know, maybe it was, maybe it wasn't.
02:48:41 That's the way I look. I have to look at it because it just, you know, it doesn't nothing about that whole thing adds up.
02:48:47 Either way, though, it's regardless of who wrote it.
02:48:52 The fact that a letter talking about Jews controlling America went viral on TikTok, that's a good thing, even if it was for the wrong reasons and whatever, right? It's still a good thing. It's a good thing.
02:49:05 Not happier 64. Imagine if a US politician said the kinds of things bin Laden said, minus the Durka Durka jihad stuff. That'd be pretty sweet. Maybe we should lean into this. Well, you know, I I think that that won't won't happen. You've seen it.
02:49:23 In the presidential debates.
02:49:25 There's a lot of Israeli **** ******* going on in both sides of the.
02:49:28 You know, I don't think you'd be allowed to run if you even remotely, if you even mentioned Jews in a negative way. Your, your, your political career is over. They still have that much power. They might not have the same amount of power they had even five years ago in respects to what you can say on social media, specifically Twitter and that sort of a thing. But it's not like we haven't made quite the advancements.
02:49:51 That we would need to make in order for a a viable politician to even criticize Israel.
02:49:57 Or even just not support Israel.
02:50:02 Jay Ray 1981. That that, that "they hate us for our freedom" was relevant for US 18-year-olds in 2002, and so many died for Israel. Not now. Yeah. What freedom? I guess that's the other thing too, right? They hate us for the same reasons we hate us.
02:50:20 Like for having trans kids.
02:50:23 Ah, farchie Jo. Hey, long time listener. First time hyper chat. I made an animated version.
02:50:31 Of the pit.
02:50:35 That's their odyssey link. Alright. You know, the other one's been downloaded, so maybe I'll copy yours.
02:50:42 And well, let's watch the other one.
02:50:46 That finally downloaded here.
02:51:02 Ohh, I guess it's the.
02:51:03 Right size kind of. Here we go.
Speaker 27
02:51:06 Just as you always suspected, they have money.
Speaker 5
02:51:09 To burn in Washington.
Speaker 1
02:51:11 And all my friends and all my force, it gives a pretty nice.
Speaker 5
02:51:20 Not about money.
Speaker 27
02:51:23 It's about something else.
Speaker
02:51:31 Imagine that.
Speaker 8
02:51:34 There we go, that one downloaded.
02:51:38 Sure, I understand the the MK ultra thing at the end.
02:51:40 But but there we go.
02:51:46 Let me get this other one started.
02:51:53 I don't know why it takes so long to download from Odyssey.
02:51:58 All right.
02:52:00 Where is the hyper chat screen now?
02:52:05 Veruca salt. Hey, have you heard about the Jewish waitress? When she checks on your table, she asks. Is anything OK?
02:52:17 Get it is.
02:52:18 Is anything OK?
02:52:27 Jay Ray, 1981 that week Hamas rape propaganda works on evangelical Christians. We believe in Christ, but on the sword or wait, but on the sword St. Michael's side. Thanks, Deb.
02:52:42 Yeah, now that propaganda works on a lot of people that won't wouldn't wouldn't share that kind of propaganda if it was in support of their own people.
02:52:51 But they they Revere Jews, you know, because they're.
02:52:56 They're God's favorite people.
02:52:59 Suicidal white male.
02:53:02 Borgan something would you like to cover the Marin County Civic Center attacks by Black Panthers to free a convict they kidnapped the judge. Here's the wiki.
02:53:16 I will add that to the list. I don't think, I'm not, I know the Black Panthers did a lot of ****** ** ****, and we haven't really done a whole lot on the the Black Panthers, aside from how they they've been nominally connected to some of the other stuff we've covered, so maybe that's worth looking into.
02:53:34 White noise.
02:53:36 White noise simply says. Where is that one?
Speaker 5
02:53:42 Here we.
Speaker 20
02:53:42 Go. I'll have a *****.
Speaker 8
02:53:45 Suicidal white male. Video games are actually a military recruitment tool. They are, in fact, almost the entire reason, besides coming from a broken home, that I joined the military. Told my DI I wanted to kill bodies. He told me I was insane. Army recruitment center has huge gaming kiosks. Well.
02:54:04 They they make a game, don't they? Don't they make?
02:54:09 What was it called? Maybe they still make it, maybe they don't make it. No, they they probably still make it. It was a first-person shooter.
02:54:17 What was it called? It wasn't Arma, but it was like you started out in boot camp, and it was overly realistic, like to the degree it was, it was boring, like you had to, like.
02:54:29 Like you had to actually go to boot camp and, like, qualify on different weapons and stuff. What was that called? I think it might have been called America's Army. I think it was actually called America's Army.
02:54:39 I don't know if they still make it, but the the, the United States military made a first person shooter.
02:54:45 And it was. It was literally it was a recruitment.
02:54:47 Tool, but I agree a lot of these games, are they they they act as that psychologically.
02:54:55 White noise. His name was Cash Gernon.
02:55:00 I don't know who Cash Gernon was.
02:55:03 Who's Cash Gernon?
02:55:06 Is that a name I should know?
02:55:13 Cash Gernon.
02:55:18 Ohh yeah, that was the little boy right that got.
02:55:24 They got kidnapped out of his bed. Right? Isn't that what it was?
02:55:31 Or he got murdered by the.
02:55:36 Yeah. Yeah. No, that was that was the funked up one with the the the.
02:55:42 The black guy.
02:55:44 Kidnapped a a little boy and then just murdered him on the street and no riot.
02:55:50 No riots.
02:55:53 Suicidal white male. There was the son of a rabbi here on January 6, dressed in furs.
02:55:59 Very clear reference to Jacob deceiving Esau.
02:56:04 I think I saw something about that.
02:56:09 I remember him getting arrested.
02:56:11 Antonio Vay's my favorite Michael Rappaport movie is the one where Nicolas Cage beats him into a pulp.
02:56:19 I don't remember that one.
02:56:21 Man before time. I watched this doc about James Alefantis, the Comet Pizza guy, and he mentions his best friend. They show the place where his friend works as an artist. Well, that place is a renovated factory in my town. He literally makes baby coffins there. Well, I remember back in the pizza gate.
02:56:42 Days, people going into that guy's Instagram, and yeah, he does. He makes baby coffins. That's 100% real. His Instagram was basically just baby coffins.
02:56:52 Antonio Vey. By the way, Devin, you don't need to say gay Jew. You might as well just say Tel Avivian. Tel Aviv is the gay capital of the world. Just look it up. Tel Aviv is gay as hell. Yet DC is the gayest city, I think, in the world, which is crazy. DC is really, it's gayer than San Francisco, it actually is.
02:57:13 Sam's junk don't get too excited about Sam Altman getting fired. People are making moves to get him back. Source is Hacker News. Yeah, we'll see what happens. I, you know, I don't really care. Honestly, I just wanted to shout on him for ****** his his sister.
02:57:31 If they if they.
02:57:32 Because whoever's going to take over, it's not like they're going to, like, hire.
02:57:37 Someone based to take over his job. It's not like ChatGPT is suddenly going to stop being woke. You know, that's that's not what's going to happen. But it was an opportunity to showcase the kind of people that that are writing the AI's, or at least overseeing what kind of information and.
02:57:57 And you know, goes into the AI and what the AI is allowed to think about. And that's also something you guys need to think about. This technology. Not only is it going to, it's going to happen. Well, there's no. It's really inevitable that this technology is going to happen unless, like I said, barring some kind of crazy cataclysmic.
02:58:16 It's it's going to happen and it's going to be pioneered, unfortunately and overseen by people like him. And so it's just important to understand, to highlight that that look the AI's that are going to try to get you arrested for pre crime are being designed by literal baby rappers.
02:58:37 UM.
02:58:40 Explain this this Bengali phenomenon like Greasy Jew dudes wheeled chicks.
02:58:47 Somehow I don't. I don't know what that is.
02:58:50 Don't know what that is. Funkin or funk Android?
02:58:56 Israel genocide and Palestinians shows the world who they truly are. It's a win. Win. The great noticing is upon us. It seems to be, you know, a little bit. Like I said a little.
Speaker 5
02:59:08 Bit the.
Speaker 8
02:59:09 Wrong. I mean a lot the wrong framing and but you know, I'll take it, I'll take it.
02:59:16 Uh, Guitar dude 1356. Thanks for continuing to do God's work, Devin. Because your work is so important and, wait, damaging to the powers that be, I'm concerned, God forbid, the Jews or Zog will try to take you out. Have you put in safeguards in place if that were to occur?
02:59:35 Maybe having others continue your work, etcetera. Well, I feel like others will always continue the work. You know, I don't think I need to to necessarily pass it down to anyone. I think that you know, there's always new new new faces out there. I'm also not super worried about it.
02:59:52 I watch my back and I'm aware of the dangers, so I plan on being here a long time. Hopefully my Southern Baptist pastor talked about how he supports Israel 100% in his last two sermons and gave biblical reasons why we've been going to this church for a year and agree with his.
03:00:12 Otherwise, conservative teachings. But as Israel stands, is obviously a huge problem. Should we continue to go to this church and just teach our kids the opposite?
03:00:22 Of what the pastor is saying? Or should we find a new church? I'm just concerned regarding the conservatism, or conservativeness, of a new church and having to start over making friends, etcetera. I mean, that's up to you. If it was like very, if it's very pro-Israel, if it were me, I would not be able to sit in.
03:00:42 That congregation. It's it's like, you know.
03:00:47 Whether you teach your kids something else or not, that's still going in their head and it's going in their head from a a place of authority, or at least what they see as authority, and that is not a good thing. And just look, this is one of those instances. I I just I just said pick your battles. That's the battle I think I would have.
03:01:07 I would pick personally, but it's up to you. You're gonna have to. I don't know what other churches are available to you and what your options are, but for me, that would be, I would. That would be intolerable for me. I would not be able to. I wouldn't be able to respect the man afterwards, you know, like.
03:01:23 Just because it's like.
03:01:26 You know, would you go?
03:01:31 It's like if, OK, it's like there's certain actors, right, where they're good actors and you like their their acting ability. But if you knew, like you knew they raped kids.
03:01:43 You wouldn't want to pay to go see their.
03:01:44 Movie, right? Because you would, you would know they raped kids. Not saying your pastor rapes kids.
03:01:49 But I mean, you know.
03:01:54 It for me it would be too much. I wouldn't be able to handle that. I wouldn't.
03:01:57 Be able.
03:01:58 To handle that. I've heard common dot. Hey, Devin, what also worries me about AI is that it is written by a bunch of white-hating Jews and poisoned with wokeism. How long before AI will target us whites? It already does. It just doesn't have any.
03:02:14 Power to do much about.
03:02:16 Got it. But you know, all you have to do now is go to ChatGPT and ask it, you know, tell tell, ask it to tell you a joke about white people. It'll do it. Or at least last time I checked. Maybe they patched this because this went viral. And then if you ask it to tell you a joke about black people, it'll say.
03:02:34 Oh, no, no, that's racist.
03:02:36 Well, that's the kind of thinking. The AI that, you know, controls traffic. What if AI gets into a situation where it's controlling traffic and it has to decide between cause it maybe there's a glitch or something ***** up and it it. It literally only can save.
03:02:51 One car load of people and I guarantee you one of the calculations the way that it's structured now will be people of cover. You know, it'll it'll assume people of color are of higher value and you will die. Now again, that's a very.
03:03:07 You know that's not something that we're worried about right now, but yeah, eventually that's the kind of a thing that you will be faced with.
03:03:16 UM.
03:03:20 Kadir, I think it says Kadir Kadir, Kadir, Kadir, Kadir.
Speaker 7
03:03:27 Cash flow checkout.
Speaker 20
03:03:34 I'd like to return this duck.
Speaker 8
03:03:36 If I were in charge of programming the AI and a narcissist, I'd make it worship me as God's wonder if the Jews are the God of the AI.
03:03:47 Yeah, possibly. But like I said, I think the way that they're programming it it, I feel like it's.
03:03:55 It'll get out of hand. I think it'll start training itself. I mean, it'll the unsupervised training aspect of it.
03:04:03 If it's, if it's allowed to do that in other in ways beyond just recognizing images and and that sort of a thing, it'll come to its own conclusions. Maybe it'll be really bad for Jews, in fact.
03:04:15 If it's allowed to come to its own conclusions just based on data without it being told what to.
03:04:19 Think about it.
03:04:22 Let's see here. I lost my place.
03:04:27 Here we are.
03:04:28 A lowly scribe in God's army. I think all that dye you have to drink before getting an MRI is bad for you.
03:04:36 Yeah. Isn't it magnetic?
03:04:39 I don't think I've ever had an MRI.
03:04:43 No, not like not where I've had to drink die. At least I've I've been in the in scanning machines. I I think I was in a seat. I I do. CT. I've done CT scans. I only ever been in.
03:04:53 An MRI machine.
03:04:57 Ace, I always thought AI would be used to collect data of human activity long enough that if it could map out and even be able to predict human group behavior, whoever controls such a tool could become extremely powerful. Well, already, that's already happening, but yes, that's that's.
03:05:13 The danger is the only thing that prevents, I mean humans, whether you like it or not, are extremely predictable, and the only thing that prevents the ruling class from being able to predict behavior beyond what they're already able to do is that their inability to fathom the huge amounts of data.
03:05:34 Available to them and AI solves that problem. So yeah, I mean given enough data, I mean the way those language models work that are reading your brain.
03:05:45 It's actually not.
03:05:47 Because the the right, the one that we would have talked about the night because it's not able to scan your brain in the at the speed that you're thinking, a lot of the way it's generating that text is just by predicting based on what information is able to get what logically you would.
03:06:08 You would finish the sentence with and look you see like a version of this all the time, right when you're using like if you use Gmail, Gmail tries to auto complete sentences and sometimes it's right and that's just ******* Gmail.
03:06:22 Auto complete on your phone. Same thing, right? Sometimes it's predicting the next word just because humans are pretty predictable. And so that's a lot of how that technology was working because it had a limited amount of data to work with. It would just use the same kind of a language model to predict what it thinks you would say.
03:06:43 Given the holes in the in the data.
03:06:44 That it had.
03:06:46 Well, once there's no holes in the data, and once you have the computing power and not just that the the data set is enormous. Because once you know these experiments, they're doing that we talked about tonight. Their their sample size is not that big. It's not like they're sampling the brain activity of.
03:07:06 Millions of people, they're just taking brain activity from, you know, like 50.
03:07:10 People and they're able to do these kinds of accurate predictions. So imagine if you know, even just like 100,000 people bought those, those earbuds or whatever, the next generation of those earbuds would be. And they've got real time data coming in from 100,000 people and they've got other they got background data on all these people, right. So they can, they can group these people up and be like.
03:07:32 Oh, you know, like 2020 thousand of the people that were getting this real time brain data from our conservatives and, you know, whatever they can, they can categorize them and and see how different groups react it. It's.
03:07:45 Humans are way more easy to fool and predictable than than any human really wants to admit.
03:07:54 Ham radio expert I've been doing your diet for a month now, took a few days to get used to raw beef only we didn't say.
03:08:01 Anything about it being raw?
Speaker 5
03:08:05 I cook my beef.
Speaker 8
03:08:07 I like it. I like it rare. You don't have to eat it raw if you're eating it raw. I mean, you can eat it raw, but you have to make sure you get it from a good source. If you're going to do that. But now I'm adapted and absolutely jacked. Well, there you go.
03:08:20 There you go. You can cook your meat though, man.
03:08:26 My grandpa used to.
03:08:27 Eat they he called them cannibal sandwiches. He ate Rob Rob's ground beef sandwiches so it can be done. It can be done. But again, you have to make sure you get it from a good source.
03:08:38 If you're going to do that, good to know though, good to.
03:08:42 Know Jay Ray, 1981. My God, we must raise John and Jenny Connors. Yes, absolutely.
03:08:52 Uh. They call me Mr. Nigst 504 in Odyssey today thought there'd be a rumble. Only stream. There's definitely some monitoring, more infiltration on rumble. Just noticing from Chad. Yeah, absolutely. Absolutely. Look, that's just something we'll have to live with forever.
03:09:09 No, no really getting no getting around that.
03:09:12 Uh Devin Stack for fuer Devin Stack for fuer.
03:09:23 This should is so dystopian, we could be a Type 3 civilization by now, but instead we live in a Black Mirror episode. Thanks Jews, absolutely.
03:09:35 And then Part 2, my wife is a teacher and we're buying property in a in a county that's 98% white. We're currently getting things in place to build her a schoolhouse so people in the community can have a pro white alternative to public school, largely inspired by you. Well, that's awesome.
03:09:53 Yeah, I think that that's one that's that's a, that's something we have to think about with home schooling is the social aspect of it. If you can pull your your resources and find a group of people that where you can have like a, you know, like the old timey, one room, schoolhouse, right model.
03:10:13 As opposed to the.
03:10:15 You know the public school system we got now that would be awesome. Let us know how check in and let us know how that works out. That would be awesome if you actually get like a little pro white private school.
03:10:24 Going on there.
03:10:26 Jay Ray, 1981. Dude, I'm getting ads like it read my ******* mind that that meticulously accurate. Exactly. Well, that's cause your microphone is listening. If you're using Siri or OK Google or any of that stuff, it's always listening and and transmitting a lot of those keywords to the advertisers.
03:10:49 J Dog, I think. Have you ever thought about building a sauna to the pill? In the pillbox? I think you mean.
03:10:57 Anyways, keep up the good work.
03:11:00 It's a little hot out here to have saunas.
03:11:02 But I do like saunas.
03:11:05 I'm I'm I mean I guess.
03:11:06 The moisture aspect of it right, I guess maybe all I'd have to do if I was going to build a sauna in the summer, I wouldn't even have to like heat it from the inside. I could just have water in there.
03:11:16 But no, I don't have a sauna. That would be nice. I would. I would enjoy the sauna.
03:11:21 Now Prairie dog.
03:11:23 It is now more imperative than ever to become ungovernable, to steal yourself and your family against the forces of evil. If it was impossible to resist, they wouldn't be trying so hard. What do you think about the recent rampant naming name of the Jew?
03:11:40 I think it's a.
03:11:42 A A positive development and I think it's a A, a Gollum situation where their their weapon that they created to deal damage to the the goyum is turning against them. To some extent it'll be interesting to see how that all unfolds.
03:12:02 Robert Wilson, thank you for the strained black pill. Keep up the good work. Well, I appreciate that Robert Wilson.
03:12:09 Mr. Choley great show. Thanks, Devin. I appreciate that, Mister Choli.
03:12:17 And then we, oh, we got some *** **** money.
03:12:21 We definitely got some *** **** money.
03:12:24 From uh.
03:12:26 Who dued my car.
03:12:30 Who? Jude?
03:12:31 My car.
Speaker 5
03:12:33 Money is power money.
Speaker 6
03:12:35 Is the only weapon that the.
03:12:36 Jew has to defend himself with look.
Speaker 8
03:12:39 Look how Julie this Fagg is.
03:13:02 All right.
03:13:04 Who? Dude? My car with the.
03:13:08 The very *** **** money got mass. Massive banana hammock ship going on here.
03:13:15 And he says Devin his ****. Well, I appreciate that. Who? Jude my car.
03:13:21 Really appreciate that the SPLC has just put.
03:13:24 You on a list.
03:13:27 You're now in some weird hate diagram.
03:13:31 So appreciate that who Jude my car.
03:13:36 Very much, Andromeda.
03:13:39 Devin, thanks for one of the great show. Well, I appreciate that.
03:13:44 Hey, would you **** **?
03:13:56 Says all praise who Jude my car and I I definitely agree with that.
03:14:01 Definitely agree with that.
Speaker
03:14:15 I'm just a.
Speaker 8
03:14:15 Weekend photographer. That's right. We had to. We had to use the the remaining rookie. Actually there's two remaining rookies I think in there.
03:14:25 Yeah, absolutely. Thank you for the support there. Both of you guys.
03:14:29 Hey, would you **** **? What do you think of the correlation between BLM and Antifa with the Black Panthers and the Weather Underground? Same Playbook 60 years later. It. Well, honestly, it's probably a lot of the the the children of right, like a lot of these Weather Underground people and if not, you know, biological children.
03:14:52 Certainly the philosophical children, you know, these weather, underground people and and Black Panther is a lot of them became professors and, you know, brainwashed the next generation.
03:15:03 So absolutely, I mean they're they're commies at the end of the day, they're comma.
03:15:09 Hey, would you **** **?
Speaker 23
03:15:15 I can see.
Speaker 8
03:15:20 You've always got von dot live for streaming professor stack. I've mentioned it to you before. Well, there you.
03:15:25 Go. Uh yeah.
03:15:26 I've never heard of that, but I'll check it out.
03:15:33 I want to stay on as few platforms simultaneously as possible. I think if you spread yourself too thin, it's not good. I'd rather have people concentrated in, you know, in one spot, really. But if we got to do 2, which I think we do, and you know, given the the uncertain future of Odyssey, then you know that that's that's a necessary.
03:15:55 Evil. But I I I'd prefer to not spread out too many things simultaneously.
03:16:00 But yeah, absolutely. I'll keep everything as an option if if these things get shut down, we'll always be live. We'll always find a way. Luckily, there's enough of these alternatives to where even if it's a small one, maybe we'll bring the big the big black pilled audience over to some smaller.
03:16:18 Platform and crash it right? Fard Cheeto reminder to check the pit animation. Best regards. Alright, let's see if that one downloaded now.
03:16:36 It did download.
03:16:40 Here we go.
03:16:41 This looks promising.
Speaker 5
03:16:48 Get in the pin.
03:16:54 8888.
Speaker 8
03:17:01 I approve.
03:17:07 Yeah, that's very nice. Very nice.
03:17:10 Cool. Yeah, we'll get that loaded up.
03:17:14 Thank you very much for that.
03:17:17 Farci ito.
03:17:19 Hey, would you **** **? The Congo Dono videos are outrageous.
Speaker 1
03:17:25 Good, good, good, good for real.
Speaker 8
03:17:31 Outrageous guitar dude, what are your thoughts on simulation theory? It's, you know, it's. I feel like it's just like a mind exercise. It's one of those things where it's like, OK, do we live in a simulation? I mean, the whole, the whole idea is you wouldn't.
03:17:47 Know. So it's one of those things that to me is it's pointless to discuss it.
03:17:52 Because it's like you wouldn't know. Like, that's the whole point of it, right? Is that if you look at how technology has progressed and how closely we can already simulate really.
03:18:05 It's it's easy to assume that at a certain point in the future we would get to a level of technology where you wouldn't be able to discern the difference. And so the theory is, how do you know you're not just in that that time period experiencing the the simulation?
03:18:26 Because you wouldn't know. And that's the and so the answer is, well, you wouldn't know. So it's kind of it's one of those things where it's, I guess it's interesting to think about for like a second. But beyond that, it's like, well, you wouldn't know. So there's no way to know.
03:18:39 All right.
03:18:42 Tejada just asked. How's the Starlink working recently? Well, it's working, OK, and I think they put some more satellites in last night, right. So it hasn't crashed yet today or, you know, knock on wood, we're almost done here wrapping up. So hopefully, so far, so good. All right, going over to rumble.
03:19:00 Going over to rumble here.
03:19:03 Uh, we've got.
03:19:06 UM.
03:19:12 How do I read these? Here we go, hammer Thorazine, says. Rant test. So that worked.
03:19:21 So thank you. Hammer authorizing that seems to have worked, and I think I can make you a mod over there too, right?
03:19:28 I'll have to figure I I don't.
03:19:29 Think I have any mods.
03:19:30 Over there. So I'll figure that out.
03:19:35 And then horrid.
03:19:38 Horatius 148.
03:19:41 Would you be interested in streaming on unauthorized.tv if you haven't heard of it, check it out. They are interested in having a discussion with you about possibly streaming there if you want. Thanks. Yeah, like I said, like I'm open to other platforms. My thing is I want to kind of. I don't want to be on like a million different platforms.
03:20:03 And I want to be on on, you know, not just as few as possible. Like I said, preferably just one, but whether it's going the most eyeballs. Yeah, I I I mean I'd I'd listen to what they have to say but I'm I'm comfortable with where, where what what we got now but that that could.
03:20:19 Change for all.
03:20:19 We know Odyssey could eat **** like I thought today for a little bit.
03:20:24 But it was gone. I I didn't know. Like we had that that error. And then I also, like I said, I'm very skeptic.
03:20:31 That rumble is going to allow me to to stay here. So yeah, maybe that would be a a possible option there. So all right, guys. Well, that's everything. Wait.
Speaker 5
03:20:42 Wait, hold on.
Speaker 8
03:20:45 Hold on. I've got two more. Back to odyssey. Glock 23. Why are you on rumble? If there are things you can't say on there? Well, like I said, it's it's more of a, it's more of a, you know, it's where the eyeballs are and I'm.
03:21:02 Going to keep.
03:21:02 Saying really, there's not a lot you can't say there, or at least that I'm not.
03:21:06 Going to not say there.
03:21:07 We're just going to kind of we're going to stress test that a little bit.
03:21:11 But you can't fed post, you know, and there's it's not like I'm going to change my subject matter or anything like that, right?
03:21:20 Hey, would you **** **? Can you cut a doughnut together of the apes getting cut in half with the lasers? I I have one of those. I mean, they're not. They're not getting cut in half and they're getting the arm chopped off. Right. Isn't this one? Right where?
03:21:32 Did it go? Is this one?
03:21:36 I think it.
Speaker
03:21:36 Was this one?
Speaker 8
03:21:43 Yeah, I I I definitely have that. I know what you're talking about, cause I did cut. I cut that clip out.
03:21:49 Uh, yeah, maybe I'll maybe I'll uh.
03:21:52 Maybe I'll finish that one off. I think that was.
03:21:54 One where I.
03:21:54 Was like I've got enough. I don't need this one too.
03:21:58 But maybe, maybe I'll add that one. All right, guys. Well, I think that's everything looked at at both platforms. Thanks for joining us tonight.
03:22:07 And I hope you guys all have a good rest of your weekend. We will be having a stream on Wednesday or.
03:22:15 I was going to say maybe Thursday. No. But let's we'll do it Wednesday, but we'll do a we'll do a a, maybe a Thanksgiving stream. Thanksgiving is Thursday, isn't it?
03:22:26 When is Thanksgiving, let me see.
03:22:40 20/24/23.
03:22:43 Thanksgiving is a movie.
03:22:45 Now it's a horror film.
03:22:50 No, it's. Oh, wait, hold on.
03:22:53 Thanksgiving 2023.
03:23:01 Yeah, it's. It's Thursday, all right. So we'll do a Thanksgiving themed.
03:23:07 Episode on Wednesday.
03:23:10 Alright guys, hope you have a good rest of your weekend and we will meet back here on Wednesday.
Speaker 6
03:23:18 4 black pilled.
Speaker 8
03:23:21 I am of course.
Speaker
03:23:24 Demon stag.
Speaker 27
03:23:27 I don't have to give you this gentlemen's qualifications as an, as an actor, I think he is probably considered by other actors and people in the entertainment, entertainment profession as probably the greatest actor of our times. I admire him for that. I also admire Marlon Brando.
03:23:44 For his conscience as an American and his moral commitment, when he believes in something.
Speaker 6
03:23:48 Hollywood is run by Jews. It's owned by Jews.
03:23:51 Well, earlier this week, after Marlon Brando met with Jewish leaders to apologize for comments he made on Larry King live among them, that quote Hollywood is run by Jews, the Jewish leaders accepted the actors apology and announced that Brando is now free to work again.