
INSOMNIA STREAM: INFORMATIONAL NIHILISM.mp3
05/24/2025
German Numbers Lady
00:00:00 [intro audio, mostly unintelligible in transcription]
Portishead - Nylon Smile
00:02:10 [song plays; lyrics garbled in transcription]
The Go Gos - Vacation
00:05:08 [song plays; lyrics garbled in transcription]
Devon Stack
00:08:14 Hello there, how's it going? Alright.
00:08:20 So I want to ask you guys how it sounded last stream, because I had the AC on.
00:08:29 And I should have tested it beforehand, but I have this noise gate that I thought would be a good idea for cutting the AC out of the audio, and it just made it sound bad. Like, it made it sound bad especially on, like,
00:08:48 like, phone speakers,
00:08:51 or on, like, laptop speakers. It almost sounded like I was on a walkie-talkie. Well, not quite a walkie-talkie, but I didn't like it. It sounded kind of bad, so
00:09:01 I have the AC on and I have real time noise filters going.
00:09:11 So.
00:09:13 Hopefully it sounds better.
00:09:16 this time. Looking at chat to see if you guys
00:09:19 can hear it. But, like, last stream, it was like, I was talking, and then I would stop talking, and at the end of every time I stopped talking you'd hear the tail end of the noise, and then it
00:09:29 would
00:09:29 cut off. And if I had just left it going, it wouldn't have been as bad.
00:09:33 But it was, like, the cut-off at the end
00:09:38 of each time I was talking.
00:09:41 So.
00:09:44 There, I stopped talking, so it would do it.
00:09:47 Alright. Well, that's good. Yeah, and I did do a test before the stream and it sounded fine on the recording, but I just wanted to make sure it worked. People are saying it's completely inaudible. That's good, because
00:09:57 I've
00:09:57 got, I've got two real-time... I've got the NVIDIA podcaster thing, or whatever the fuck it's called, going,
00:10:08 which I'm assuming is some kind of hardware thing. And then in the OS I have another noise reducer, and then I have the
00:10:20 compressor thing and all that stuff. I think what was making the noise sound bad was the fact that it
00:10:26 was
00:10:27 amplifying the noise, and then the noise gate was cutting it off, and it was just a mess. So I think I've got it better now.
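[Editor's note: for anyone curious why a gate chops off word endings the way described above, here is a minimal sketch in Python of what a simple amplitude gate does. This is an illustration only, not the NVIDIA/OS tools actually used on the stream; the threshold and release values are invented.]

```python
import numpy as np

def naive_gate(samples: np.ndarray, threshold: float) -> np.ndarray:
    """Hard gate: mute every sample whose level is below the threshold.
    The quiet decay at the end of each word is silenced instantly,
    which produces the abrupt, walkie-talkie-like cut-off described above."""
    gated = samples.copy()
    gated[np.abs(gated) < threshold] = 0.0
    return gated

def gate_with_release(samples: np.ndarray, threshold: float,
                      release_samples: int) -> np.ndarray:
    """Gate with a release window: keep the gate open for a while after the
    level drops, so the natural tail of speech survives."""
    gated = np.zeros_like(samples)
    open_for = 0
    for i, s in enumerate(samples):
        if abs(s) >= threshold:
            open_for = release_samples  # re-arm the release window on loud samples
        if open_for > 0:
            gated[i] = s
            open_for -= 1
    return gated
```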
00:10:33 I think this is.
00:10:35 I think this is going to work now. So, alright, people in chat are saying that it
00:10:40 sounds better now. Alright, well, I'll listen back and make sure.
00:10:46 But I think I think it's fixed. I think it's fixed this time anyway.
00:10:49 Welcome to the insomnia stream. Of course, the informational nihilism edition.
00:10:55 I am your.
00:10:57 Host.
00:10:58 Of course, Devon Stack.
00:11:03 Yeah. It's gonna be kind of chill tonight. Gonna be kind of chill.
00:11:07 We've been going pretty intense,
00:11:10 I guess, the last few, a little bit. We're not super intense,
00:11:16 but, like, the last one I knew was gonna piss some people
00:11:18 off. I thought it was
00:11:19 kind of funny.
00:11:22 I thought it was pretty funny, which is why I was mad the audio was kind of messed up. I was like, oh, that kind of sucks.
00:11:27 Yeah, it would have been a little smoother there, but
00:11:32 yeah. But again, as always,
00:11:36 the ones where you piss people off, those are always the best ones. And, uh, I've never pissed people off
00:11:44 and then had them have some reasonable complaint about why they're pissed off. That's never, literally never happened.
00:11:51 Like, I've never had people get mad at the stream,
00:11:56 and then listened to the complaints and gone, oh hey, like, I guess you're right. I guess you're right, I...
00:12:03 I did. That was me. That was my bad. Not to say I've never made a mistake before, but I've never pissed people off
00:12:11 with a stream
00:12:13 and, uh,
00:12:15 had it be over something that I was doing. It was always the information they were mad about, not really me. Ultimately, that's what it was: it was always the information they were mad about, and not me,
00:12:28 which means that I'm doing it right, and I'm giving them the information they need to hear, whether they know it or not.
00:12:34 That's, you know... which is good. That's good.
00:12:37 That's that's why. That's why people like the insomnia stream.
00:12:41 Because it keeps.
00:12:41 Them up at night.
00:12:44 One thing you might hear is, there's a fucking cricket. There's actually multiple crickets that I have in various parts of the pillbox
00:12:54 right now, but one in particular I keep almost seeing, and I haven't seen it. I haven't even seen him yet.
00:13:03 So I think he's one of these... like, they're really small, but they're really loud for their size.
00:13:11 And he's over there somewhere in the corner. I don't know if you're picking him up; you probably are. That's fine. I wish the cricket wasn't there, but, like, in a way it adds a little bit of atmosphere, right? A little bit of the evening, a little bit of the let's-tell-stories-around-the-campfire kind of
00:13:30 kind of feel.
00:13:32 Although it drives me nuts. It drives me nuts.
00:13:36 But, you know, I have yet to figure out where that fucker's coming from.
00:13:40 We have two different kinds of crickets around here. We have, like,
00:13:46 these big nasty ones. They're like weird mutant desert crickets, or... they don't even really look like crickets. And maybe they're not. Maybe they're something else.
00:13:55 They look enough like crickets, right? I think they're crickets.
00:14:00 And then we have these little baby ones, and those are the ones I think are in the house right now, because I've seen a couple around,
00:14:07 And these little fucking baby ones hopping around.
00:14:11 These little fucking bastards.
00:14:14 Anyway.
00:14:15 One of the things I wanted to talk about because it was something that.
00:14:21 Really kind of blew me away.
00:14:25 Was the release of of Google's new AI.
00:14:31 Their new video AI and just a lot of this AI in general.
00:14:36 People are starting to figure out ways of using it in non-retarded ways, and it's kind of like with the Internet,
00:14:45 right,
00:14:46 where new technology pops out and at first everyone's like, oh, this is going to change everything. And it
00:14:51 is,
00:14:52 but they're not sure how yet.
00:14:54 And so there's a lot of really dumb ideas, and then there's a lot of really good ideas, and really it's tough, especially in the beginning, to know which one's which
00:15:05 until you've had a chance to really
00:15:09 sort of test the technology yourself, right? Like, as a kid that had Internet really early on, I remember thinking, like, oh wow, this Internet stuff's amazing. It's going to...
00:15:22 this is going to change the world. And then you start to see the limitations of it, and you're like, yeah, yeah, it will, but it's not, like, you know, this is not the Star Trek holodeck technology that you were maybe initially picturing. And then, as a kid at the time, I started to hear
00:15:41 all these adults,
00:15:43 all these boomers, that, like,
00:15:44 all the stuff they'd say it was going to do, this, this, and this, and you'd start to be like, no, not really, no.
00:15:51 No, I can tell you don't know what the Internet is yet. And then that was the joke with boomers, like, all these whiz kids, oh yeah,
00:15:59 these little geniuses here, they can do whatever. I still remember trying to teach my mom how to use a mouse, and it was really difficult, because she couldn't not move her entire arm every time she clicked the mouse button, so she would end up dragging icons and shit all over the screen every time she clicked.
00:16:19 And I remember just wondering, like, how are you this uncoordinated, you know? And I guess it makes sense in retrospect. You're like, well, you know, I kind of grew up playing video games, whether it was an Atari joystick or a Nintendo pad or whatever, and I'm kind of used to not having, like,
00:16:39 my body flail around as I'm tapping buttons with my thumb, you know. Whereas my mom... I think she played Pong when she was younger.
00:16:51 Uh, But yeah, wouldn't that be something? I kind of feel like that must have been.
00:16:55 Like.
00:16:55 Uh, that must have been a thing, right?
00:16:58 Like I I cause I.
00:16:59 Even had a friend like that, I had a friend.
00:17:02 that... I don't know, maybe he didn't have a Nintendo or something, but he'd come over and play. And this was later on, because I feel like this was in the Nintendo 64 days, and he still did this. He was the kind of kid that, when we'd be playing, like,
00:17:19 you know, like Bond, like 007, he'd have the controller in his hands and he'd be, like, flinging his arms around every time he moved around. And he wasn't even, like, super bad. I mean, he wasn't great at the game, but he wasn't really bad. Like, he could play the game, but it was like watching
00:17:39 someone with, you know, like, the
00:17:44 Michael J. Fox disease. He was just, like, all over the fucking place.
00:17:49 And I don't know. It makes me wonder, like, what was it like,
00:17:53 what was it like back when, if you were to go to, like, a bowling alley back in the 1970s, when they first started having any kind of video games? Were boomers just, like... was that why arcade machines had to weigh so much? Because they were, like, flinging them around, like they couldn't control themselves?
00:18:14 Anyway, so yeah, yeah. When the Internet first came out, the boomers, and that's why.
00:18:18 the dot-com bubble happened.
00:18:20 That's why you had the dot-com bubble: because you had all these really stupid ideas, and boomers couldn't tell the difference between what would be a stupid Internet idea and a smart one, and they would just see an "e" in front of a company name,
00:18:34 or "tech," like, you know, it was literally like the Office Space, you know, "Initech" kind of names. They were just all these stupid names that sounded high-tech, and so people would just throw money at it.
00:18:50 They would throw money at it because they'd missed out on Microsoft, or they'd missed out on, you know, some other company that had gone to the moon, and they were just hoping that they could catch that next one, you know, that they could
00:19:10 look out and buy the, you know, the Google that was going to come along and become the new... well, Google.
00:19:17 And so you had all these companies that didn't really do anything and just got lots of money thrown at them, and they didn't succeed, because they didn't really do anything. And that's why the bubble popped: lots and lots and lots of companies were just getting all this investment money, but they didn't have any good
00:19:36 ideas. They just had, like, you know, a catchy name.
00:19:42 And they failed.
00:19:44 Then they failed.
00:19:46 So everyone lost their money and AI's kind of the same way where it's it's.
00:19:52 Brand new.
00:19:54 You can use it and see how OK this is.
00:19:58 This is really going to fuck shit up. This is really going to be a paradigm shift in just how everything gets done. But it's not easy right away to tell how that's going to take place.
00:20:13 And in the meantime, you're just oohed and aahed by a lot of these new technologies. And the Internet was the same way, you know; people were oohing and aahing over hyperlinks, if you can believe that.
00:20:27 What, you mean there's, like, a word in a sentence, and part of it's blue and it's underlined, and I can click the word and it'll open up a whole other page of words about that word? That's...
00:20:45 That's crazy.
00:20:48 I remember people explaining hyperlinks to other people, like, just the name itself: hyperlink. Woo, wow, it's a hyperlink. I'm hyperlinking, guys.
00:21:01 Look at it. It's not just linked, it's hyperlinked.
00:21:04 And then, when you started to have audio that you could play, like streaming audio — or not even streaming audio, just, like, MIDI files that would play when you would open up some GeoCities website with the, you know, the animated GIF background that would give anyone with epilepsy
00:21:25 a seizure. You know, but that was,
00:21:27 like, oh, look at that. Look at this, Margaret. This web page is playing a
00:21:33 song at me.
00:21:35 Yeah, open up this page. Like, yeah, it sounds like shit, but it's playing. It's music.
00:21:41 It's coming from the computer.
00:21:44 It's coming from the computer.
00:21:49 And people were that was amazing. It's amazing.
00:21:54 I remember.
00:21:57 Very late 90s and this probably would have been like around.
00:22:02 Close to 2000.
00:22:04 They were testing out, there was a website called dialpad.
00:22:09 Dialpad.com and it was free. I don't even know. Like, whatever happened to these people?
00:22:15 But I made a lot of prank calls using dialpad.com.
00:22:22 You could go and and you could just call people like it was it.
00:22:25 Was a dial pad.
00:22:27 And again, it was like they didn't know how they were going to make money with it. Maybe they never figured it out. But you could just go to this dialpad.com and dial a number, and it would just call someone's phone, and you're like, holy shit,
00:22:40 I'm calling someone from a web page.
00:22:43 And if they look at their caller ID, who knows what it says, but it doesn't say me.
00:22:48 So I'm gonna get up to all kinds of mischief with this.
00:22:53 So that was crazy. And again, it was just... people were blown away by this.
00:22:57 Even...
00:22:58 Even just monetizing video when that shit came out, people had, you know, YouTube channels and you'd make a video and millions of people would watch and you wouldn't get paid shit.
00:23:11 You wouldn't get paid anything.
00:23:13 It was just like well.
00:23:16 I made this,
00:23:17 I made this viral video and it's on the news, you know.
00:23:20 It's...
00:23:21 that's when the news still mattered more than the Internet. And, oddly, that's how you made it, right? Like, you knew you made it on the Internet when you were on TV.
00:23:33 As stupid as that sounds, that's how you knew you made it on the Internet: when your Internet meme made it to television.
00:23:42 And.
00:23:44 These people never got paid for doing, you know, making these videos that millions and millions of people saw, and yet people kept making videos because it was more about.
00:23:54 the fun. In fact, there was almost, like, a sense that it was dirty
00:24:01 to do it for any other reason. Especially, I think, because when monetization didn't really exist, especially when it wasn't ad-supported,
00:24:11 the ways in which you could monetize were kind of scum-
00:24:14 baggy,
00:24:15 or at least they felt kind of scumbaggy. You were, like, shilling some kind of garbage or something, or, you know, doing some kind of SEO trickery. And so for a long time, you know, up until fairly recently, it still
00:24:32 Kind of like it felt wrong.
00:24:34 to make money with your video. In fact, that's partially why I think... maybe that's why I don't super-promote things, and I feel really weird about it. Like, I don't think I'd ever really have sponsors unless, you know, unless I absolutely had to. And then I'd be, like, who'd sponsor the show? Because I
00:24:53 wouldn't sponsor anything. I'd have to sponsor myself. I'd have to have, like, my own product or something like that, because I wouldn't want to have
00:25:00 some other company, you know, having even a little bit of influence on what I'm...
00:25:06 OK.
00:25:07 But there was this, like, dirty... it felt weird to just say, like, hey, I should be getting paid
00:25:13 for this, you know.
00:25:15 And so it was very slow. And you could say the Internet is still evolving, right? It's gone through all these different iterations.
00:25:27 At first it was just hyperlinks, hyperlinks and.
00:25:32 You know really bad web pages with animated GIFs and and and you know, occasionally you'd have some kind of multimedia, you'd have guest books. Remember the guest books.
00:25:44 Oh, look. Look, mom. Someone from Argentina signed my E guest book. They went to my website and.
00:25:53 Someone named Pablo in Argentina.
00:25:58 You know, the web counters? Like, that was the best you could do in terms of analytics.
00:26:06 My little speedometer-looking, or odometer-looking, thing.
00:26:10 Watch it roll. I've had 300 hits. Wow. Imagine, 300 people
00:26:17 went and read my website about
00:26:21 bird watching in the Rocky Mountains, or whatever garbage website you had.
00:26:29 And then it went from that to.
00:26:33 To social media, you know, you started having Myspace.
00:26:38 Myspace was.
00:26:40 was kind of in a weird era, because it was kind of in between. It was, like, an in-between from these really shitty GeoCities websites, because Myspace let you do that. Myspace let you just
00:26:54 make your page as ridiculous as you wanted
00:26:58 and put all kinds of crap on there. But they made it easy enough to where you didn't really need to know any kind of coding.
00:27:05 And so it made it more normie-friendly. That's really all it was. It was making it more normie-friendly.
00:27:12 And
00:27:13 because they made it more normie-friendly, and they made it more social,
00:27:18 you got more girls, more girls on the Internet.
00:27:22 More girls was something that nerds wanted. Actually, they shouldn't have wanted it, and they've probably regretted it ever since. But at the time, believe it or
00:27:31 not,
00:27:33 You were like ohh, sweet girls. Girls are.
00:27:35 On the Internet now.
00:27:38 I met Girls, went out with girls I met on on Myspace.
00:27:43 So you could actually meet regular people, regular people in your town.
00:27:49 On Myspace.
00:27:52 And then you had of course Facebook come around.
00:27:57 And that really changed the game. And it was the same sort of thing: they marketed it to college kids. They did a really good job of making it... so — which is funny, because, you know, now it's basically all old people — but the way they made it hip and cool,
00:28:15 the way that a lot
00:28:16 of these services did this:
00:28:18 a lot of these services, like Gmail, were invite-only when they first started out, and it was like a status symbol if you had a Gmail address.
00:28:28 It really was, if you can believe it, I remember opening up a bank account and I had a Gmail e-mail address and when I was filling out my thing, the bank guy was just like ohh.
00:28:42 Ohh Gmail huh? I was like yeah, I got in and I got into the club. He's like, uhhh.
00:28:50 Can't wait till I get in that club. And I'm like, ohh, good luck, buddy. It's an exclusive club,
00:28:58 the exclusive club that I'm in,
00:29:01 this Gmail. That's how it was. They made you feel like you had something special.
00:29:08 And Facebook did the same thing. They were like, Oh yeah, you know, you gotta have a.
00:29:12 You gotta have a .edu, gotta have a .edu email. And so, inevitably,
00:29:19 that limited it to just, you know, college kids — or I guess faculty too, but mostly college kids. And so if you didn't have a .edu email,
00:29:29 you couldn't access Facebook. And, in fact, there were students hanging
00:29:33 on to their old
00:29:35 .edu emails
00:29:37 after they left college, because they wanted to hang on to
00:29:40 their Facebook page.
00:29:43 So, you know, that started to change the game.
00:29:47 You had more and more... you know, obviously, the forums were starting to,
00:29:55 you know, get more complex,
00:29:58 and forums kind of led to other forms of social media. Like, well, I guess like X, sort
00:30:07 of,
00:30:08 but also kind of Facebook, in a way. It's ever-evolving.
00:30:15 And now we're at this point —
00:30:18 and this is kind of
00:30:22 what's weird about how AI is going to
00:30:27 change the game — we're finally at this point where most people, or at least, you know, in terms
00:30:36 of
00:30:37 modern Western
00:30:40 people, Americans, are online.
00:30:44 And you could even say that maybe COVID amplified that a little bit: everyone being stuck at home and starting to rely on, like, Uber Eats and stuff like this. Which, in a way, if you look at that research Whitney Webb was doing prior to COVID,
00:31:06 it makes you
00:31:07 wonder if maybe this had something
00:31:09 to do with it. The heads of Google
00:31:15 were doing... there was a presentation that got partially —
00:31:20 like, it was classified, and it got partially released. I'm trying to remember the name of it. I'm not
00:31:25 going
00:31:25 to remember the name of it, but, just briefly, it was a presentation talking about the difference between AI and other technologies as they were beginning to exist in China
00:31:40 versus the United States, and talking about the legacy systems that were going to be difficult to overcome in the United States because of
00:31:50 the culture of the United States, you know, this culture of independence and ownership, and discussing ways that they would basically phase out ownership. They want everything to be like a rent economy, and they have stuff like this now, with
00:32:09 apps where you're renting power tools.
00:32:13 You know, the idea being, well, as a normie in a McMansion, you use a power drill what, like, three times a year? Why do you need to own a power drill for that? Why not just rent it those three times, and then you don't have to have a power drill?
00:32:28 And.
00:32:29 And they're already kind of doing stuff like that. But the point that
00:32:34 the heads of Google made — they were talking to, I believe it was the CIA, or some federal agency —
00:32:42 they said one of the big differences is that the Chinese market was radically different, because they basically came from poverty,
00:32:54 and they were going from not owning cars, not owning houses,
00:33:00 not having investments, really, or anything, not having anything really,
00:33:07 directly to using their version of PayPal.
00:33:14 You know, they have this... I'm blanking on the name right now, but they have this app that's basically like a messaging app. It's, like, an everything app. It's what Elon Musk wants X to be.
00:33:25 It's like their PayPal, Venmo,
00:33:31 you know, Telegram, X. I think it's WeChat — it might be WeChat — but they have these apps that you could just
00:33:45 start using for banking and all these other things that you weren't doing before, you know. So there wasn't, like, a legacy system you had to phase out; you could just instantly go right into adopting it, right? And, you know, Larry Page, or whoever it was — I think it might have been Larry Page
00:34:05 and other people, or maybe it was that Jew,
00:34:11 I forget his name now — but regardless, the heads of Google, talking to the federal government, were basically saying, look, we have to figure out something that's going to get all these people online, and all these things they do that aren't online — like, you know, a lot of the stuff they do in person —
00:34:30 we're going to get them to do it online. And there's still a lot of people, you know, buying stuff in brick-and-mortar stores, and in China that's less practical because of the population density and everything else; it's easier to just, you know, do everything online.
00:34:49 And so we just need to get Americans to act more like bug people. That's really, truly what the short version
00:34:57 of it was.
00:34:59 And, you know, COVID happened shortly after, and that is kind of what happened. A lot of people started to, you know, use Zoom; all of a sudden you had people wanting to work remotely
00:35:11 that, you know, had never worked from home before. And now think of all the benefits, you know: especially if you're some micromanaging civil engineer somewhere — or social engineer somewhere — you can say, oh wow, this is great, because it's good for all of our green initiatives. You don't have people sitting there in the carpool lane
00:35:33 for an hour, you know, each way to work, just polluting the whole way to work and back.
00:35:40 And so it cuts back on that, so it cuts back on carbon emissions.
00:35:46 And for companies, they saw it... I think a lot of them don't see it that way anymore, just because of
00:35:54 the disaster that having everyone working from home can be for a lot of companies,
00:36:00 because a lot of people can't handle that, and they just don't work at home. There's a lot of people who go to the office and don't work, and you expect them to work when they're at home?
00:36:10 And so the companies, though, they're like, oh, we don't have to renew these leases,
00:36:18 you know, these really expensive leases for these premium offices in downtown San Francisco, or downtown... well, really anywhere, it's going to be expensive for a lot of these workplaces,
00:36:34 when really they're just asking people to come in and sit down at a computer. So why can't they do that at home? And, in fact, you might get away with not even having to buy them a computer and just make them use their own computer.
00:36:46 And, you know, maybe they've already got one, right? So why even pay
00:36:53 for the computers? And then you don't have to pay for as much IT stuff, because if you've got a whole giant building full of computers, you've got these normies coming in there, these wagies coming in there, breaking those computers all day long. So then you're
00:37:11 going to have to have all the IT staff that can handle that, and then all the infrastructure that goes along with that. So a lot of companies are like, oh, this is great, it'll streamline everything. Which it does in China. You know, it works better in China, because
00:37:32 there is a different work ethic in China, especially compared with the diversity hiring practices you have in the United States.
00:37:44 So.
00:37:46 You had a lot of people get online.
00:37:49 And they started to... and even before COVID, that's really kind of what got Trump elected, right? A lot of people started to get their news online,
00:38:06 you know, I guess as the devices made it more accessible. Like, in a way, the iPhone kind of made the Internet into, like, you know, a Fisher-Price toy. They made it stupid-proof, with big easy buttons, on a little device that any idiot
00:38:26 could carry around in his pocket, and that everyone already kind of wanted, because —
00:38:31 yeah, whether you remember it or not, this is back when iPods were really fancy. You know, just the idea of an iPod was really cool. Before the iPhone came out, people were paying a lot of
00:38:44 money,
00:38:45 because it was just amazing. And this is how quickly the technology advanced: just prior to
00:38:51 the iPhone being out, it was this amazing thing
00:38:55 that you could have this little tiny box in your pocket that could hold thousands of songs. Oh my God.
00:39:01 Oh my God. Because the technology prior to that — what was it, the MiniDisc?
00:39:04 You had the MiniDisc player, and I did, but no, no one had those things. So really it was the Discman. So you had, what, you could carry around this big stupid
00:39:17 pancake in your pocket that literally spun a disc so fast that it actually had
00:39:28 centrifugal force
00:39:31 in your pocket. And it would skip easily, because it's, you know, spinning this fucking disc around, so any time you bump it or jostle it, it's skipping all over the place. So you have to be all real careful with it, and then you've got to carry a bunch of those other discs with you. You've got to have, like, this big Case Logic book
00:39:50 with all these little plastic pockets in there. You know, it's, like, the size of a couple of Bibles you've got to carry around with you just to have your music with you.
00:40:02 So people were very impressed that, oh, yeah, I I can have this little thing.
00:40:07 That just digitally has all my like literally all my music.
00:40:12 On this little box.
00:40:14 So people were paying a lot of money for these iPods.
00:40:18 A lot of money. And so when you all of a sudden had — which is how people viewed the iPhone — an iPod with the Internet on it, that made phone calls
00:40:30 and had a camera...
00:40:32 that's a holy shit.
00:40:35 Alright, give me one of those fucking iPods. Because the iPod market was like that... well, and the iPhone market was too, for a little bit. I think it's less so
00:40:44 now that you've got actual real competition from, like, the Android phones and stuff like that. But for a while it was like, holy shit.
00:40:54 Holy shit. Let's see Steve Jobs — that fucker, wearing his turtleneck —
00:41:01 let's see him get on that stage and tell us the amazing new thing that the fucking iPhone does.
00:41:08 And so you had all these idiots that had never been on the Internet before, but now their iPod went on the Internet. So they're like, oh, this is kind of cool,
00:41:16 I can get on the Internet. And then they made everything so you didn't have to
00:41:20 use the Internet:
00:41:20 you could use apps. So it was like
00:41:22 it turned into, like, stupid-proof.
00:41:26 So you could have literal monkeys, you know, using apps, because instead of someone having to go to a website and log in and browse and click on things, you could just make it so you touch the big button that has, like, a picture, a smiley face, on it.
00:41:41 By the way, that's what emojis came out to do, you know. Oh, look, now you don't have to, like, know how to, like,
00:41:47 say words.
00:41:49 You can just do, like, happy faces. It can be like cave drawings. You can just do, like, a picture of a smiley face, and then a sad face, and, like,
00:42:02 a...
00:42:03 like a squirt gun or something.
00:42:08 I remember when emojis came out, thinking, this is the beginning of the end. Like, this, this right here,
00:42:14 is the beginning of the end.
00:42:18 Because, you know, like, look, in the nerd days of the Internet we used little ASCII characters, and, you know,
00:42:27 there was a form of emoji,
00:42:30 but it wasn't, you know... it wasn't
00:42:36 being used the way it ended up getting used.
00:42:40 I remember very clearly, very clearly,
00:42:46 going on, like, my...
00:42:50 As Internet dating became an app — also expanding the reach of Internet dating, right? It used to be there were these websites you had to go on and set it all up and whatever, and so most people didn't do that. But the second it was just, like, oh, it's a stupid app, I
00:43:08 just
00:43:08 hit, like, tapped the fire
00:43:10 picture, and then
00:43:11 I just
00:43:13 swiped the picture this way or that way, and then it tells me if they like me — they made
00:43:18 it really fucking
00:43:19 stupid-proof.
00:43:20 And
00:43:21 I remember when I
00:43:24 first messaged
00:43:27 an egregious
00:43:31 emoji overuser.
00:43:34 And, you know, as I said, I had Internet many, many years ago as a kid. I was one of the first people I knew that had any kind of Internet. I was used to communicating with people via text
00:43:47 on a computer, and
00:43:51 I was really comfortable with that. And there was even a little bit of a culture, a little bit of an Internet culture, and an
00:43:57 etiquette — or netiquette, as they called it, I guess, back then.
00:44:02 And I was just horrified when I would be talking to a girl
00:44:12 The amount of emojis that would just come across.
00:44:16 At me.
00:44:18 And it was just like it was like an IQ test. I was just like, with every emoji you send me.
00:44:23 I know your IQ is five points lower than what I previously thought with every single one you send.
00:44:33 So yeah, like it's, but it's always changing, it's always changing.
00:44:39 And now the big thing is is.
00:44:41 The AI stuff.
00:44:43 Now the big thing is.
00:44:45 is AI. And it's the same sort of thing: when it first came out, it was like, holy shit,
00:44:51 this is crazy. Because, especially if you've been on the Internet a long time, or you've just used computers for a long time, you were familiar with attempts,
00:45:00 attempts at making chat bots, and you kind of understood what was going on when people made chat bots, right? Especially if you ever played, like, you know, an old King's Quest game or something like that, you know the limitations.
00:45:17 You knew what was going on: that, like, it's just looking for keywords, and it's trying to, like, sound like a person by responding to
00:45:25 some word that it recognized, but it's really just,
00:45:29 you know, picking out predefined responses out of some database. And previously, that's what I thought: I thought, well, there's never going to be, like, a computer that can generate its own response. It's always going to have to go look at some database, and maybe it'll be, like, a huge database, so, you know, maybe it'll be able to sound
00:45:50 human,
00:45:51 because the database will be so big that it will be able to choose, enough of the time, a response that sounds relevant, and maybe it'll trick people and it'll pass the Turing test or whatever.
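[Editor's note: a minimal Python sketch of the keyword-lookup style of chat bot being described here — not any particular historical program; the keyword table and canned replies are made up for illustration.]

```python
# Old-style chat bot: scan the input for known keywords and return a canned
# response from a fixed table; no generation, just lookup.
CANNED_RESPONSES = {
    "weather": "Lovely day, isn't it?",
    "name": "My name is not important.",
    "help": "What do you need help with?",
}
DEFAULT_REPLY = "Tell me more."

def reply(user_text: str) -> str:
    words = user_text.lower().split()
    for keyword, response in CANNED_RESPONSES.items():
        if keyword in words:
            return response       # first recognized keyword wins
    return DEFAULT_REPLY          # nothing recognized: fall back to a stock line

print(reply("What's the weather like?"))  # -> "Lovely day, isn't it?"
```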
00:46:07 Yeah, the whole
00:46:11 making-digital-neurons thing didn't occur to most people.
00:46:16 So when the AI chat bots first came out, it was kind of like, well, what the fuck? This is a little bit crazy. Because now,
00:46:26 just like with the Internet, the gears start spinning: the possibilities,
00:46:33 the possibilities start popping up in your head, all the different ways this could go.
00:46:39 And you go in all kinds of different directions. And of course your head goes there, especially if you've spent a lot of time on the Internet and seen... we haven't really discussed all the evil things the Internet was
00:46:52 used
00:46:52 for, but if you've been here a long time, you know that,
00:46:58 yeah, there's a lot of dark, evil shit that has been on the Internet the whole time,
00:47:02 too.
00:47:03 And so AI is going to be used for a lot of this bad stuff.
00:47:08 And, you know, like we've talked about, like, the romance scam stuff — all that stuff could be automated. All kinds of scamming can be automated. You can automate, you know, information warfare. You can unleash chat bots onto social media platforms to, you know,
00:47:27 give people a false impression of whatever the social consensus is. Because, well, I keep hearing all these people,
00:47:36 you know, voice this one opinion; that must be the consensus, and because I'm an idiot NPC, I'm going to conform to the consensus. Which is how most people think:
00:47:47 They just conform to whatever they think is popular.
00:47:51 And.
00:47:54 You know that's that's one of the things that could easily be used for.
00:47:59 And then you started to have the AI image generation stuff,
00:48:03 and that was kind of freaky, because, again, it was like, OK... you know, I remember
00:48:10 when Google's first AI images came out and they just looked like a bunch of fucking crazy eyeballs. And I remember thinking, like, well, this is,
00:48:20 I guess, I mean, it's interesting. You know, it's like weird abstract art, but, like,
00:48:25 I don't... why are they, you know, why are they having
00:48:29 computers make this? It just looks a little horrifying, actually. What...
00:48:34 what are we doing this for?
00:48:36 You know? But then, as that evolved, it became
00:48:40 more and more
00:48:42 photorealistic, and it became obvious, like, oh wow, this is going to create some
00:48:50 more problems.
00:48:51 This is going to create some more problems, because now it's not just a chat bot. In fact, if you watch any of these YouTube channels that bust these
00:49:02 Nigerian scammers and these Indian scammers — that scam boomers out of their boomer bucks by telling them they're Johnny Depp or George Clooney or Jennifer Aniston or whatever — one of the things they always say to these
00:49:22 retards that give their life savings over to these Nigerians,
00:49:27 they say, you know, if you're going
00:49:29 to,
00:49:30 first of all, be that dumb, thinking that Jennifer Aniston is messaging some 65-year-old retired
00:49:39 bakery employee in
00:49:43 Wisconsin for no reason... like, yeah, if you're gonna be that dumb, OK, but
00:49:48 shouldn't you at least want to have
00:49:52 a FaceTime call with her or something? And in the instances you've seen where they have received images from George Clooney or Brad Pitt or whoever they think they're talking to, it's, like — you know, because it's black people doing it — it's, like, the worst fucking photoshops ever.
00:50:12 It's like the photoshops
00:50:14 that people were making in the 90s, with, like, Flash animations and stuff. It's, like, this obviously cut-out head pasted on somebody, with, like, obviously typed text on, you know,
00:50:30 on some sign or something.
00:50:33 But easy, easy to spot for anyone who's even remotely
00:50:37 not retarded. Though I guess that makes sense, because only retarded people are falling for this stuff anyway.
00:50:43 Well, that's starting to change, because you're like, oh, wow, you know, like, they could
00:50:49 generate new photos. Like, even if you were sort of smart, but maybe just a narcissist thinking that Brad Pitt wanted to talk to you, and you did, like, a reverse image search of one of these photos — it's not going to exist, right? Because they can completely generate a brand-new
00:51:09 image. And so you start to see how that... and, of course, that's just, that's, like, the...
00:51:16 It's pretty low on the list of things. I mean, you could make fake news images. Right up until very recently, this is the kind of thing... you know, a picture is worth a thousand words, and, you know, that seemed like the best kind of evidence in a courtroom. If you had video evidence or
00:51:36 photographic evidence, I mean, that was...
00:51:38 your case closed, right?
00:51:41 You have video of someone committing a crime? That's...
00:51:44 that's it.
00:51:47 Now.
00:51:49 The.
00:51:50 The video stuff that came out this week.
00:51:54 From Google.
00:51:55 Is shockingly good.
00:51:58 Shockingly good.
00:52:01 It's not often — well, actually, it's beginning to be often —
00:52:08 that I am surprised by this stuff. It is just getting better and better and better. In fact, here's
00:52:15 A little clip.
00:52:17 That was floating around on Twitter.
00:52:20 Even the idea... now, look, you can still tell there's something weird about it.
00:52:26 And by the way, one of the things that's weird about it — I want you guys to pay attention to this — one of the things that's weird about it is
00:52:33 it's all white people.
00:52:35 And I don't think the person writing the prompt was telling it specifically to just make white people. I was noticing this
00:52:42 about a lot of this Google AI stuff: how it's all white people
00:52:47 in the video. It's not 100%, but it's, like, way more than reality, especially for the generation of people being depicted in this video, right? We already know that zoomers are basically less than 50% white.
00:53:04 And so that's the first thing that stuck out to me, like,
00:53:08 Why? Why is it so white?
00:53:12 And I started wondering, is it because...
00:53:15 is it because, like, uh —
00:53:18 remember the early Google AI that couldn't tell the difference between black people and gorillas? I'm not being funny; that really happened. One of their image recognition AIs, one of their first rollouts of AI, was implemented into Google Photos.
00:53:38 I've... well, I've had Android phones before, but I've never really used their stuff, so I'm not sure
00:53:45 exactly what the software was. But whatever their cloud image service was, it would identify what was in your photos so it could categorize it and tag it and stuff like that. And it was tagging black people as gorillas,
00:54:05 because the AI couldn't tell the difference between at least some black people
00:54:11 And gorillas.
00:54:13 Which is pretty fucking hilarious.
00:54:16 But makes total sense.
00:54:18 And.
00:54:19 I almost wonder if there's something similar going on.
00:54:24 where this AI, because of how it works, uh, if it starts making a black person,
00:54:33 it gives them way too many ape-like features by the time it's done, by the time they're
00:54:40 made.
00:54:41 I don't know if that's what's going on, but it just seemed a little bit weird that there are so many fucking white people in these demo clips that people are posting.
00:54:51 But here's, like I said, here's one.
00:54:56 Oh, and I guess one of the big.
00:54:58 One of the big upgrades.
00:55:01 of a lot of this stuff is it's not just "oh, look" — because that's something we got used to a couple years ago. It was shocking at the time, but, like, wow, hey, I can make a photo.
00:55:13 Like, remember thinking, wow, people can generate a fake person
00:55:18 and use it as a...
00:55:22 as a profile picture,
00:55:26 and you wouldn't know that that person didn't exist. And, in fact, there was that website — maybe it's still up — I think it was thispersondoesnotexist.com.
00:55:36 And that was really fucking freaky. And that was not that long ago. That was, like, maybe, what, 20...
00:55:42 well, I guess time flies. I guess that was about 10 years ago now, because I feel like that was probably around 2015 when they had that.
00:55:49 So it's been about 10 years since
00:55:52 they had that website, you know, thispersondoesnotexist.com, or whatever it was.
00:55:58 And that's all it did. It just made like.
00:56:00 A fake person.
00:56:02 And I remember thinking like, holy shit.
00:56:05 if they're able to make people that look real, I mean, just think... you'll never know. Like, you could make dating profiles or whatever with just fakes, and you wouldn't even know that this person is not real. And even that was tough to wrap your head around. That was like, oh, I don't like that.
00:56:23 I could see all the ways this could be,
00:56:26 this could be bad.
00:56:29 And we've gone from that, from just having a static image...
00:56:34 And then, you know, once they,
00:56:37 once they went from just the faces
00:56:41 to the bodies, you were always given the impression,
00:56:44 well, it doesn't
00:56:45 look totally real yet. Like, their fingers don't look right, right? You know, that was the big deal for a long time. Like, ohh, like, that guy's hand has seven fingers.
00:56:55 And some of the, you know, lamer AI still does weird... still has weird problems like that. Or this lady has two arms — or three arms — or something's wrong, or something's weird about this angle.
00:57:07 And there's still a little bit of that, still a little bit of this uncanny-valley shit going on with a lot of this brand-new AI. But compared to what we had just a couple years ago, it's fucking insane. In fact, this image I had up here —
00:57:26 this is the, uh, Google Veo... I think it's called Veo.
Google AI Man on the Street Host
00:57:31 If you were any animal right now, what would you be, and why?
Google AI Girl on the Street
00:57:33 Uh, quick: a sloth. Well, like, a party sloth. Real slow.
00:57:39 Biggest craving right now, go. But, like...
Google AI Man on the Street Host
00:57:41 Sleep.
00:57:42 In a giant, warm
00:57:45 bread loaf. Define true love for me tonight.
Google AI Girl on the Street 2
00:57:48 Finding an onion ring in your French fries. Unexpected, beautiful, deep.
Google AI Man on the Street Host
00:57:55 Alright, real talk. What's the deepest philosophical insight you've had? Say.
Google AI Girl on the Street
00:57:59 OK, listen, squirrels, right? They're...
Google AI Man on the Street Host
00:58:01 they're just rats.
Google AI Girl on the Street
00:58:02 Your brain right now: describe
Google AI Man on the Street Host
00:58:04 it. Disco ball full of confused squirrels. Sparkly chaos, yes.
Google AI Girl on the Street
00:58:10 Party mission tonight, go.
Google AI Man on the Street Host
00:58:12 Befriend a lamp, Bartholomew. We vibe deeply.
Devon Stack
00:58:14 Name it.
Google AI Man on the Street Host
00:58:19 Alright, wisdom time. What's your best piece of absolutely terrible advice for making tonight legendary? Lick glitter. You gotta lick the glitter. That pizza slice: describe its soul. Its soul is spicy and death, plotting world domination one greasy triangle at a time.
Google AI Girl on the Street
00:58:33 One word for your future, go.
Google AI Man on the Street Host
00:58:36 Legendary, or at least heavily sponsored by instant noodles. Probably not.
00:58:42 This food line: your feels.
Google AI Girl on the Street 2
00:58:44 My soul is buffering really bad. Why? Find my essence, man. Your current life goal,
Google AI Man on the Street Host
00:58:51 go. To understand why Dracula never just ordered in. He had options, you know. Maybe he just really liked the drama.
Google AI Girl on the Street 2
00:58:58 Best life advice right now, go.
Google AI Romanian Guy
00:59:00 Never argue with a stray Bucharest dog. They have seen things, and they always win the staring contest.
Google AI Girl on the Street 2
00:59:05 Always, man. Best Romanian street food wisdom, go.
Google AI Romanian Guy
00:59:08 Shawarma heals all wounds, especially the ones you get from trying to dance like a pop star after too much
00:59:13 țuică. Trust me.
Google AI Girl on the Street
00:59:14 Festival wisdom: one tip for your past self before
Google AI Man on the Street Host
00:59:16 you got here. Pack a portable charger the size of a brick. My phone died at noon. I'm navigating by vibes and the vague scent of overpriced pizza.
Devon Stack
00:59:25 So, yeah, pretty fucking insane.
00:59:30 But, you know, there's something not right about it, obviously.
00:59:35 What they're saying is crazy too. It seems like a dream. It seems like kind of a nightmare, actually.
00:59:46 It feels like
00:59:49 something's not right. There's something just very unsettling about everybody — not just the way that they look, but, you know, what they're saying. Everything just seems really
00:59:59 Truman Show, right? I guess that's one way to put it. Like, it's very Truman Show.
01:00:05 Like, there's no soul.
01:00:08 They are kind of just like props.
01:00:11 And it's really fucking... But to know that we went from, you know, thispersondoesnotexist.com, which would just make a face — and that was freaking people out — to this,
01:00:24 in, you know, I think it's been about 10 years,
01:00:29 is insane.
01:00:31 It's insane.
01:00:34 And this stuff is looking really,
01:00:37 really good, really, really fast. I'll tell you, the first thing that came to my mind is, holy shit,
01:00:43 I got in on just the real tail end
01:00:49 of ever being able to make motion graphics or 3D animation or all that. That whole industry is...
01:01:00 there's always going to be a need for it in some way — and we're talking about why a little bit here in a sec — because part of this is going to be a problem:
01:01:12 AI relies on, you know, the training data, right? And I want
01:01:20 to tell you guys: the
01:01:22 video I played at the beginning of the stream, that many of you replay-gang guys fast-forwarded through —
01:01:27 that was 100% AI.
01:01:31 The video with all the bees and stuff — that's not footage
01:01:37 of bees at the pillbox.
01:01:39 We'll bring it up here, because, again, there's a lot of people that
01:01:44 probably fast-forwarded through that.
01:01:48 Right down to the opening shot, the time-lapse shot
01:01:55 of the desert: that's not real. This was made by Google. I just described it.
01:02:03 And look, that looks very much like my intro
01:02:06 time lapse, right? And it's not. This is...
01:02:11 it nailed it on the first try. This is not real. This is not real time-lapse footage
01:02:19 of cactuses blooming.
01:02:22 And, obviously, bees. I wanted bees. I wanted to give you guys a little clue: bees don't live underground.
01:02:30 It was a little hard. Now, this is kind
01:02:33 of the issue I had.
01:02:35 I told it to make bees crawling out from underneath the sand, because I thought that would look really cool. It did look cool, but because the training data didn't have a lot of footage of
01:02:45 bees
01:02:46 crawling out of sand, it really struggled to produce it for me.
01:02:52 And it took several tries before it actually made bees coming out of sand. And really, out of the maybe ten tries I did, the only shot that actually had them emerging out of the sand like I wanted was this one.
01:03:08 But, yeah, this is all fake,
01:03:12 and it looks relatively good.
01:03:17 I thought the...
01:03:17 I thought this shot would give it away too, just because you'd be like, well, how the fuck would you even get that shot?
01:03:24 That's pretty good. Now, it didn't look perfect. There are little things. Like, as much as those cactuses look pretty good — because I have cactuses
01:03:36 right outside — I can spot
01:03:39 some weirdness about them. And even the bees — because, you know, I work with bees — you can see a couple of instances where a bee will flip directions. It'll
01:03:50 morph,
01:03:51 like, the image prediction of what should be the next frame fucks up, and it makes the bee face the wrong way in the next frame. Or that bee, you can see, just kind of comes out of a flower out of nowhere.
01:04:04 But yeah, it's pretty good.
01:04:08 Like, it's all pretty fucking good. So there's a shot where I
01:04:14 told it to make a special forces operator covered in bees. That one was a nightmare too, and it made a lot of really dumb shit. In fact, let me bring up some of the dumb shit that it made
01:04:26 before it actually got what I wanted.
01:04:30 Let's see here.
01:04:32 Because it was fine. Like I said, it was fine until
01:04:38 you start asking it to do things that it probably didn't have footage of.
01:04:44 Like, one thing I asked it to do: I wanted it to have bees covered in black ooze.
01:04:52 Well, there's not gonna be a whole lot of footage of bees covered in black ooze, so it made these bees with, like, weird droplets on them, and they looked all fucking crazy.
01:05:02 And
01:05:04 then it got really funny.
01:05:08 I hope I downloaded this, if not I'll download.
01:05:10 It.
01:05:11 I thought, as an homage
01:05:14 to Truro, I was going to put in something really kind of obviously fake and scary-looking. I asked it to make an orange cat in the desert that would open up his mouth, and bees would fly out of his mouth.
01:05:32 I thought that would look really cool. And, again, this might just be the version I have available to me — and I'll show you why here in a moment —
01:05:41 but what
01:05:43 it came up with just looked really bad. Like, it looked worse than something I could make. And that's when I was like, huh,
01:05:51 there's obviously,
01:05:53 there's obviously some kind of issue. Let me download
01:05:56 that,
01:05:57 that file.
01:05:59 And then also, uh,
01:06:03 when I asked it to make bees attacking military guys,
01:06:08 that didn't work at all. It just made, like, these psychotic... I'll show you, and I'll show you this one too. Let me download some of this stuff real quick in the background here.
01:06:24 I've got.
01:06:26 New folder here.
01:06:34 The other thing I couldn't get to work is the audio stuff. And, again, maybe I'm just doing it wrong, but
01:06:44 from what I read —
01:06:46 because I was using their Flow tool —
01:06:49 from what I read, they said it won't always,
01:06:53 it won't always generate audio. And I was like, well, that's
01:06:57 kind of weird. And none of the clips where I tried to generate audio generated audio, and I don't know why. And, again, it was probably something I was doing wrong, because I had never used this before.
01:07:11 You know, let me download some of these.
01:07:14 There we go.
01:07:18 I'm downloading all the really bad ones that it made.
01:07:23 So this is...
01:07:25 here are some of the sillier ones that it made,
01:07:28 versus what we're seeing people post on X — they're not telling you how many tries it took to get that.
01:07:37 But here you can see.
01:07:40 That they trained it on YouTube cause I told it to make a bee flying over the desert.
01:07:46 And it made this.
01:07:54 So you can tell, oh, this must be from its training data.
01:07:59 That looks like an
01:08:04 AI interpretation
01:08:06 of, like, some,
01:08:09 you know, either some New Mexico tourism video or some kind of honey farm, you know, like, oh, welcome to our honey farm here in the desert.
01:08:19 And so, even though I didn't tell it to do, like, any kind of,
01:08:23 you know, texturing like that,
01:08:26 it came up with this crazy shit,
01:08:30 because it was probably trained on all of YouTube.
01:08:34 Now, one issue is going to be, well, what happens
01:08:38 when all the training data on YouTube is just AI stuff, because no one's going out and shooting footage anymore? Because why would you go
01:08:45 out with a drone and spend an entire day just to get a couple aerial shots of the desert,
01:08:52 when you can just tell AI, give me an aerial shot of the desert, and then you've got it. You're done.
01:08:59 This was supposed to be soldiers being attacked by bees, and this is the hot garbage it spit out.
01:09:07 It's like, what the fuck is this even?
01:09:12 I was like how? How are people getting all these awesome results and I'm getting this this fucking garbage. So a lot of it was just fucking pure trash.
01:09:22 That it was spitting out.
01:09:25 Like, here's another one.
01:09:28 Look at those soldiers being attacked by bees or something. Like. That's what I thought.
01:09:33 That was.
01:09:40 This one was a little unsettling, but I don't know what to do with it.
01:09:50 Just some deer with bees crawling around on them.
01:09:53 But it looks pretty. I mean, you know.
01:09:55 you get some of the stupid shots, but you
01:10:00 may get some good ones too.
01:10:02 You actually get some good ones that that make you think like, wow, you know, like, this is really.
01:10:10 Really close. Oh, this is all right. So this is the Truro one.
01:10:15 It's just bad. It looks it like it looks horrible and and I'm wondering how some people are getting some of these.
01:10:22 Some of the the stuff they're getting, unless it's because it's something that happened in a movie, we're going to watch some of these clips they've they've come up with here.
01:10:31 But I told it to make an orange cat, and I was very confident this was going to look cool, because I had seen so many of these other shots getting posted on X.
01:10:41 So I was like, ooh, it's promising. Then I hit play and this happens.
01:10:48 What the fuck is this?
01:10:50 It's like some this is like some 1990s particle emitter bullshit. What the fuck is that?
01:10:57 So it made that stupid shit.
01:11:00 And here is the other my second attempt at it.
01:11:05 It was just as stupid.
01:11:12 It's.
01:11:12 Like.
01:11:13 The fuck is this? Yeah, it just looks like a particle emitter with all the default settings
01:11:22 in After Effects on some stock footage of a cat. It
01:11:26 Looks so bad.
01:11:27 Like, why would why? Why is it?
01:11:29 Making this garbage.
01:11:31 You know, like and then I all I could think.
01:11:33 Of.
01:11:33 was, like, cause you're asking it to do something that doesn't exist.
01:11:37 But here's what I don't understand: a giant pig truck doesn't exist either, and it seems to have nailed
01:11:44 that pretty good.
01:11:47 And there's a lot of these other things on the Google website. So for example.
01:11:52 Let me see some of the more bizarre ones.
01:11:55 So this is footage from the Google website right now. I didn't make this one.
01:12:01 But it's like you know.
01:12:04 Underwater lady.
01:12:07 That probably doesn't. Oh, I don't know.
01:12:12 Some. Maybe there's there's stuff that's close enough to it.
01:12:16 Like this for example. This looks pretty good.
01:12:21 That's all. AI looks pretty good.
01:12:32 And then people are making AI horror. I think AI's going to be really good at horror. This, you can tell it was obviously trained on, like, video game footage, because this is what it looks like: video game footage.
01:12:49 Looks like I'm going to like first person horror video game.
01:13:00 Still here. And this one was, uh,
01:13:03 see, this one's particularly impressive too. This one shows that you will be able to make
01:13:12 movies,
01:13:15 and pretty soon too. Pretty soon, and I would say within a couple of years,
01:13:22 You'll have feature length movies being made like using this kind of technology.
01:13:30 Why is there no audio? Oh.
01:13:34 There we go.
Google AI Depressed Lady
01:13:35 I tried everything for my depression, nothing worked.Google AI Depressed Guy
01:13:39 Every day felt heavy. I felt trapped.Google AI Depressed Lady
01:13:42 Then I tried pepperman.Google AI Dr. Patel
01:13:44 Our prescription helps your body secrete a special pheromone that attracts puppies.Google AI Depressed Old Man
01:13:50 I took the pill before bed, and when I woke
01:13:53 up, there he was, the love of my life.Google AI Depressed Lady
01:13:57 The pill does not target depression directly, but we found that it's really difficult to be depressed when cute.01:14:03 Dogs show up at your doorstep.
01:14:04 I used to feel so empty, but now I feel joy, and mild concern about how a pee stain got on the ceiling.
01:14:12 My puppy listens twice as good as my ex-husband and only climbs into the lap of half as many of my friends.
Google AI Depressed Old Man
01:14:19 He chewed up my Bible and pooped in my good chair.01:14:22 Here, but I'm happy for the first time in years.
Google AI Man on the Street Host
01:14:27 Looks like a rat, barks like a demon,
01:14:31 but he saved my life.Google AI Depressed Old Lady
01:14:33 I named him Earl. He follows me everywhere and farts in his sleep, just like my first husband.Google AI Voice Over
01:14:41 Coppermine for when your therapist says maybe you should get a dog.Devon Stack
01:14:49 It looks really, you know, it looks real.01:14:57 I mean,
01:14:58 If you had showed that to me a couple years ago.
01:15:02 I would think that
01:15:06 the anomalous behavior, like the weird enunciation at times, I would have chalked that up to bad acting.
01:15:16 I would have said well.
01:15:18 Some of that sounds unnatural.
01:15:21 But it's probably just, you know, it's bad actors and so they just they sound unnatural.
01:15:28 I wouldn't think that there was AI.
01:15:32 Now that, which leads to another point:
01:15:36 if this is what's just available now, and it's free, or at least
01:15:42 what I was able to generate with the bee stuff, right? This, uh,
01:15:47 this file here. When I made this I used, like, a free month, because they make you sign up for it, but, you know, the first month is free and you can cancel any time. So, like, whatever.
01:15:59 But so I think after that it's like 20 bucks or something like that. But for almost nothing.
01:16:06 The average person can go in.
01:16:09 and create this kind of footage for 20 bucks a month. When I was doing freelance video jobs,
01:16:19 if you went to the cheap, cheap, cheap websites for stock footage, it was at least 20 bucks per shot.
01:16:30 Like this shot right here,
01:16:33 not even 20 bucks,
01:16:34 maybe as much as 500. It depends. You know, there was
01:16:39 variability in terms of, you know, how good the shot was and the usage rights and everything else. But for 20 bucks you can just make whatever. I mean, there's a lot of people out of business today is what I'm getting at. There's a lot of people, if you made your living
01:16:58 Going around with the camera, getting stock footage or just random stuff like girl drinks out of a cup.
01:17:06 You know, like an oak tree blowing in the wind, cause there's these huge websites that sell that stuff, and they're basically done, you know. Like, if you had a website that sold that kind of
01:17:18 footage, I used to use those websites all the time. All the time. I've spent thousands of dollars on various projects buying the kinds of footage that you see in this, and I never could find the exact shot that I wanted.
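A back-of-envelope comparison, using the rough prices he mentions; the shot count below is a made-up example, not a number from the stream:

# Rough cost comparison using the approximate prices mentioned in the stream:
# cheap stock sites at roughly $20-$500 per shot, versus a ~$20/month
# generation subscription. All numbers here are assumptions for illustration.
shots_needed = 15                       # hypothetical project size
stock_low, stock_high = 20, 500         # dollars per stock clip
subscription = 20                       # dollars per month for generation

print(f"stock footage: ${shots_needed * stock_low}-${shots_needed * stock_high} for {shots_needed} shots")
print(f"AI generation: ${subscription} for the month, limited only by how many tries you make")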
01:17:38 So it's not only does this replace it, it's even better.
01:17:43 Than having the stock footage. But again, what happens when?
01:17:46 They don't have any.
01:17:48 Or at least not the variety. They don't have the amount of footage to train the AI
01:17:54 on. And what happens when the AI starts to get trained on footage created by AI? Like, right now there's something a little bit off with these bees. There's something a little bit off with the cactuses, and that's after it has been trained on actual footage of bees
01:18:15 and actual footage of cactuses.
01:18:17 And so what happens when you unleash the AI
01:18:21 on AI footage of bees? Or, like, this: if I put this video on YouTube, this would become part of its training data, and now all of a sudden it's got this weird black dripping honey and these weird bees that don't look quite right, and it's just incorporating that into its library of
01:18:41 what bees look like.
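A minimal sketch of the feedback loop he's describing, with made-up upload rates; nothing here is measured, it just shows how the synthetic share of a training corpus could grow once AI clips get uploaded faster than new human footage:

# Toy model: fraction of a video corpus that is AI-generated after each
# scrape/retrain cycle, assuming AI output gets uploaded faster than new
# human footage. The rates are assumptions for illustration only.
def synthetic_share(generations, human_per_gen=1.0, ai_per_gen=4.0):
    human_total, ai_total = 1.0, 0.0   # start from an all-human corpus
    shares = []
    for _ in range(generations):
        human_total += human_per_gen
        ai_total += ai_per_gen
        shares.append(ai_total / (human_total + ai_total))
    return shares

for gen, share in enumerate(synthetic_share(5), start=1):
    print(f"after scrape {gen}: {share:.0%} of the corpus is AI output")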
01:18:44 I mean, so it's going to be tricky.
01:18:47 You know, I this very well could be like the golden era.
01:18:52 of AI image generation. And then you instantly get into this issue where all of it starts to regress, because all of the original data
01:19:07 that it's taking from, all those people who made it are going to be out of a job.
01:19:13 Right. Like if you had people.
01:19:18 still going around shooting footage, like 4K footage, 8K footage, and getting paid. And that's the thing too: the quality is going to go down.
01:19:28 Because why pay a guy with
01:19:36 an 8K camera or a 4K camera the thousands of dollars that costs? Because that's, I was just talking about going to a stock footage website and downloading some, you know, just real basic, almost like clip art. It's like a clip art version of footage.
01:19:53 Why even do that?
01:19:55 When you can.
01:19:55 just, for again, 20 bucks a month, have it generate its
01:20:00 own footage that looks pretty much exactly how you want, or at least close enough.
01:20:05 And you can make changes without having to go reshoot it or anything like that. So all the people that create this high-end original
01:20:12 content, all their jobs just dried up.
01:20:17 And so there's going to be a lot of weird fucking shit going on. But moreover, let's talk about some of the more obvious, or rather some of the less obvious, problems with some of this technology.
01:20:30 Because I think it's easy for designers and you know, we all saw, like, Reddit fags freaking out just initially.
01:20:39 When the first AI image just started to be revealed, but like this is another thing, This is why I.
01:20:46 Was.
01:20:46 Saying like man, I I I.
01:20:48 Got out of this business just in time.
01:20:53 Because some guy, some guy in front of his computer.
01:20:57 with basically zero effort, in an afternoon, made this entire thing here.
01:21:58 So you know, there's a lot of a lot of VFX artists right now that are sweating.
01:22:08 This is.
01:22:12 Ah.
01:22:18 There's a lot of AI.
01:22:23 Where people in chat are saying this is not.
01:22:25 Is this not AI?
01:22:30 OK, now it is. Yeah, someone says this is a movie.
01:22:35 No, this is, this is, uh,
01:22:42 no, I guess it's not AI.
01:22:46 I was wrong. I got. I got, I got tricked.
01:22:50 I got tricked guys.
01:22:53 I got tricked. Let me find some of these other clips. Then I just downloaded a bunch of these clips that.
01:22:57 People are saying, oh, this is AI.
01:23:00 Maybe I got tricked.
01:23:02 God damn it.
01:23:04 I got tricked. So you said. Good thing. I was looking at chat then, huh? So.
01:23:07 You guys think I never look at chat?
01:23:17 All right. Well, are these, is this real AI?
01:23:22 Am I getting tricked on this stuff too? Which, by the way, just goes to show that that's the problem.
01:23:32 That's the problem is we're not gonna be able to tell the difference.
01:24:25 Yeah. So again, I don't, maybe maybe there's people put see that that's that's going to get into what we're gonna be talking about tonight.
01:24:33 You don't know the difference.
01:24:37 So here's one I know is AI, just cause it's fucking creepy.
01:24:45 People are posting this clip around.
01:24:48 They told AI to make a comedian again, like the jokes aren't funny. Everything's kind of.
01:24:55 Nightmarish, but at the same.
01:24:56 Time it looks like a person.
01:24:59 Looks like a person looks like a Jew. I mean, it looks like a Jew telling a joke.
Google AI Man on the Street Host
01:25:04 So I went to the zoo the other day and all they had was one dog. It was a Shih Tzu.Devon Stack
01:25:14 See.01:25:19 They made uh, they made a Jewish comedian looking pretty realistic.
01:25:25 Again, if you'd show me that like 10 years ago.
01:25:29 I'd be like, ah, yeah.
01:25:33 That's uh, that's just some bad comedian.
01:25:39 Let me see if I can find... I thought I had more clips than that. That said,
01:25:52 Might have one more. What's this one?
01:25:58 I do not have one more.
01:26:05 Anyway.
01:26:08 Well, let's talk about let's talk about.
01:26:12 What could happen as a result of all this stuff?
01:26:17 Changing.
01:26:22 So some of this stuff's pretty easy we've already.
01:26:25 Talked about some of these ramifications.
01:26:28 Some of the stuff's easy to figure.
01:26:31 Some of it's not as easy, and then there's always the secondary effects.
01:26:36 Right. The primary effects are going to be most people, I think.
01:26:40 Can't at least imagine some of the the problems you're going to have like so for example media.
01:26:49 You know, it used to be. If you saw footage of something happening on the news.
01:26:55 Yeah, they would deceptively edit it, which, by the way, that only is going to make the problem worse as we know they already have the capacity for deceptively editing. They've been deceptively editing since video has existed, so we know that that news organizations are not above editing the content of video evidence in order to make it look a different way.
01:27:16 And so this is going to just provide them with a more effective tool for doing that and just entirely making it up, which again we've also had, there's been very famous cases of news organizations just completely making up stories.
01:27:34 So we already know that that's something that they're, you know, that they're willing to do. And this technology is just going to make it easier to do. And not only that, even if you had a media organization acting in good faith,
01:27:53 You could trick them. You could have a bad actor provide them with fake footage, something that we've also seen happen. Well, it just happened to me, right? I played like a part of a movie that I'd never seen thinking it was an AI.
01:28:05 Clip.
01:28:05 Because I got lied to.
01:28:11 And so the opposite could happen: people giving you an AI clip saying it was real.
01:28:17 And you would have.
01:28:20 You would have people that might have no bad intention just playing it, and we saw this. Remember when you had news outlets posting shots from Arma, you know, video footage from a video game, saying that it was, you know, Syria?
01:28:41 And you had footage from, there was like a shooting range in the South,
01:28:47 where they were blasting, you know, like, automatic rifles. It was just
01:28:53 good old boys shooting heavy machine guns in a place somewhere in the South, and they made it look like a terrorist training camp. Yeah. And so we've already had instances where news outlets are given, like, obviously not real footage and they still air
01:29:14 it,
01:29:16 you know, for the purposes of promoting the narrative.
01:29:20 So you're going to have.
01:29:23 You're going to have that sort of a problem with media. Now, that's like the first phase of what's wrong with this sort of a thing, but the secondary effect is going to be, people already don't have trust in media. They already don't have trust in media, and that trust is going to be even further
01:29:43 eroded when everything is looked at as if it could be fake.
01:29:48 Because while we're not quite there yet, you know, like all this, all the the AI clips I've played, although it's it's, you know, like I said that that puppy commercial, it's pretty, you know, it's pretty close.
01:30:02 Uh.
01:30:03 It it's.
01:30:05 It's it's close. It's close enough to reality. People are gonna start questioning everything.
01:30:11 It's close enough to reality where everyone's going to start questioning everything they see.
01:30:16 Look, they were doing this before this technology existed, and those people would probably say, well, this is just the version that we have available now; who knows what kind of version they've got privately. And there's some truth to that. There might very well be some military-grade
01:30:38 AI that the CIA has access to that's already creating this, that already doesn't have some of the, you know, the creepiness of the people in the footage; it's a little less creepy, or it's a little less
01:30:56 detectable in terms of looking like AI. So you're going to have people that don't believe anything. Again, you already have this, but it's going to be more rational for the average person to think to themselves, yeah, it might be fake.
01:31:13 I know that because my little my nephew Billy over there, I saw him the other day make a video of something that I thought was that looked totally real.
01:31:23 And and you're also going to get the the opposite of that you're going to get a lot of people because confirmation bias seems to be the.
01:31:33 The.
01:31:35 The most powerful force in the in the universe, you're going to get like Q tards and MAGA tards.
01:31:43 That are going to embrace fake videos that that reinforce their biases.
01:31:51 You're going to have MAGA tards sharing fake videos all over Facebook,
01:31:56 and old people especially are completely ill-equipped to differentiate. I mean, these are people who,
01:32:06 they're just never going to be able to. They don't even know what AI
01:32:08 is.
01:32:09 They're not going to
01:32:10 know.
01:32:11 They're not going to be able to wrap their head around it, and they're too old of a dog to learn the new trick.
01:32:18 And so they're going to have issues. And really what it's going to create, it's really going to create what what I would call a state of informational nihilism.
01:32:30 And what I mean by informational nihilism.
01:32:34 Is there's going to be no source that anyone's going to trust?
01:32:40 And therefore truth itself.
01:32:44 Will become subjective.
01:32:49 And.
01:32:50 That's going to have some wide reaching consequences.
01:32:56 One of the problems with our society right now.
01:33:01 Is.
01:33:03 Truth is already kind of subjective.
01:33:08 Truth is already we already have an environment where you've got people that want you to.
01:33:17 Believe in their delusions.
01:33:21 You know whether it's about I'm. I'm really a woman on the inside or I'm a, you know, I'm a Bunny rabbit or whatever, whether it's some kind of identity delusion.
01:33:30 Or it's a worldview delusion, like where they think that, you know, on either side of the Trump Derangement Syndrome. Right, on one side, you have people that think that Trump is literally Hitler, he's about to put all the Jews in camps. I mean, I don't know how they think this, but they do. Or I guess now maybe they're starting to think he's going to do this with Muslims, stick them on trains and,
01:33:51 you know, send them to the showers and stuff.
01:33:55 On the other side, you have all the MAGA tards that think that Trump is Jesus and he's the he's he's the savior. He's a Messiah like figure and he can't do anything wrong. Any news whatsoever that's negative about Trump is.
01:34:11 is a lie. And so you already have this polarization going on, and you're having a breaking of the consensus of what reality is. That's a relatively new thing in the United
01:34:26 States, because even when you had a left and a right, my entire lifetime, if you go back to the 90s as an example, you had Bill Clinton versus, you know, George Bush senior first and then, you know, later George Bush junior. You had a, you know,
01:34:46 Rush Limbaugh worldview you had a Bill Clinton worldview, and while there were some differences, by and large, there was at least some consensus on what reality was.
01:35:00 Right there was a consensus on what reality was. To some extent. There were some, there were some differences, but most people looked at the world and saw the same thing. They did have different strategies for addressing what they saw as problems, and they had different interpretations of what those problems were. The sources of those problems and the solutions of those problems.
01:35:24 But yeah, no one. No one thought they were a fucking pony, OK?
01:35:29 And if they did, you were. You were laughed at, and in fact, even transsexuals were made fun of in, you know, Hollywood films. There was no taboo about making fun of trannies. There was no taboo even about making fun of faggots.
01:35:46 And so you you still had at least some kind of cohesion. You had some kind of, you know?
01:35:57 I you know, some kind of agreed upon objective reality.
01:36:03 We don't have that as much already.
01:36:06 That's already kind of going out the window, and part of that is not because of AI, but because of these informational silos that are being created, because of these different digital gulags that we're all being kind of situated
01:36:22 in.
01:36:23 Now, as an example, you look at X today and at first it was kind of nice because you could say, Oh yeah, it's kind of great that I can, I can say on X now and I can, I can say whatever I want about Jews and stuff and I'm not getting banned.
01:36:41 But X has kind of just become like an echo chamber of shit in a lot of ways.
01:36:49 Because with the old Twitter you would have.
01:36:53 Different opinions.
01:36:55 come across your timeline, and you could engage with leftists and get into these kinds of conversations that would inevitably get you banned, but you'd at least see these other people. Either they don't exist anymore on X, because it's become the Elon-is-a-fascist, you know, whatever platform,
01:37:16 Or they are being algorithmically segregated to some other part that you never see.
01:37:24 But I don't see a lot of lefties, and it's not because the lefties disappeared. Which, by the way, that's the other mistake. There's a lot of people that think, oh, we're winning, we're winning the narrative, because all I ever hear are people, you know, echoing and parroting the same things I say. And it's like, yeah, that's because the algorithm's doing that.
01:37:46 That's because the platform's doing that. It's these people didn't cease to exist.
01:37:51 Yeah, you you didn't have, like, some mass awakening like that might be your perception of things, but like, that's not that's not what happened.
01:37:58 So you already kind of have that going on. You already kind of have like these people living in segregated realities, coexisting, maybe living living next door to each other but having completely different perceptions on what reality even is like, wildly different perceptions of reality.
01:38:20 So that already exists. Now imagine the kind of breakdown you're going to have when there are actually really good reasons, really good reasons that even the NPCs start to become aware of, to doubt anything and everything you see and hear.
01:38:36 Everything you see and hear could be AI generated, literally everything and anything.
01:38:43 And that's not going to just be in media. This is going to bleed into the legal system, you know, video evidence. Like I said, it used to be, you know, the cornerstone of proof. Like, I've got footage of him shooting his grandma, so he clearly did it.
01:39:03 But if you have these hyper-realistic fake videos that become indistinguishable
01:39:14 from, you know, security camera footage or whatever, it's going to be really difficult to determine if something's authentic. And, you know, you'd say, oh, well, you could put on, you know, digital metadata and have forensic video people look at it. That's true to some extent. But all that stuff
01:39:34 can be faked as well, and so unless you're involving, like, a blockchain or something like that, or some kind of complicated verification process, all that stuff's going to be very slow to be
01:39:49 implemented. So you're going to have problems with people even trusting things. They're not going to trust the media. They're not going to trust legal proceedings. They're not going to
01:39:59 trust.
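One minimal sketch of the kind of verification he's gesturing at a moment ago: hash the raw camera file and sign it at capture time, so later edits are at least detectable. The key handling here is a hypothetical stand-in, not any real camera or platform scheme, and, as he says, it only proves the file hasn't changed since signing, not that the content was real in the first place:

# Hypothetical sketch: sign a hash of the raw clip at capture time so a
# later re-encode or edit is detectable. This cannot prove the footage
# was real to begin with; it only pins the file to a signing key.
import hashlib, hmac
from pathlib import Path

CAMERA_KEY = b"secret-provisioned-on-the-camera"   # assumption for the sketch

def sign_clip(path):
    digest = hashlib.sha256(Path(path).read_bytes()).hexdigest()
    tag = hmac.new(CAMERA_KEY, digest.encode(), hashlib.sha256).hexdigest()
    return digest, tag

def verify_clip(path, digest, tag):
    current = hashlib.sha256(Path(path).read_bytes()).hexdigest()
    expected = hmac.new(CAMERA_KEY, current.encode(), hashlib.sha256).hexdigest()
    return current == digest and hmac.compare_digest(expected, tag)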
01:40:02 You know, like even, you know, social interactions are going to get weird because a lot of social interactions, especially post COVID.
01:40:10 Are now taking place.
01:40:11 on the Internet. And so for a lot of people that have grown comfortable interacting there, you know, dating profiles, they're already kind of fake, but they're going to get faker and faker. Dating profiles will be fake. Just regular profiles will be fake. You won't know who the fuck you're talking to. You won't even know if you're talking to
01:40:31 an actual person.
01:40:33 Also, people will have the ability to make deepfakes of you to humiliate you. They'll be able to look at your images on your social media and, you know, use those images to create porn or whatever the fuck and spread those around. And so that's going to
01:40:54 further erode people's comfort level even just interacting online.
01:41:02 And by the way, while all this is going.
01:41:03 On.
01:41:05 which is all going to go on relatively soon. It's already begun, right? This ball is already starting to go down,
01:41:14 down the hill.
01:41:16 While this is all going on, we have it being amplified: the social distrust, and the social
01:41:26 cohesion threatened and, well, dismantled by the fact that you have all this demographic shift going on in the United States and Europe. So you're importing all these people from all over the world and breaking up social cohesion, you know, IRL. So if people were to say, oh, just go
01:41:45 Touch grass or oh, this is actually good because it'll just force people to interact more in, you know, in in real life instead of online.
01:41:54 Well, that's gonna. That's getting more difficult to do too. The whole reason why a lot of people online so much is because it's not because they want to be. It's because it sucks going out. If you live in a big city especially.
01:42:08 It sucks going out out of your apartment and into that environment.
01:42:15 So you've got.
01:42:18 You've got that kind of breakdown that's going on. You're going to have, you know, the fraud that will happen, you know, like we talked with romance scammers. You'll have financial fraud. You'll have other kinds of fraud, too. Think about the kinds of fraud where you could make a clip of a CEO.
01:42:39 talking about, you know, whether it's a clip that you leak out to people to crash, like, the stock price
01:42:46 of this company, where you have the CEO on some footage, maybe he just says we're going bankrupt or whatever. But, you know, you get some clip to go viral, and because of how quick these markets, you know, move and manipulate people, people sell and dump a stock because they don't want to
01:43:06 get caught up in the crash. And so you could have stuff like that going on. You can have people faking, you know, you think the hate hoaxes are bad now, with, what, you have the rabbi painting swastikas on their own walls?
01:43:23 You're going to have people making fake hate crimes. You'll have people making footage of white people murdering black babies and shit like that. And all it took for BLM to happen is that relatively short clip of video of George Floyd,
01:43:44 you know, dying of his overdose on the ground in that street there. And that's all you'd have to do: you could make a totally fake version of that, and stupid black people would probably believe it.
01:43:57 Because that's going to be double edged sword too. The stupid people who have less ability to determine what's fake and what's real are going to either be tricked all the time or think everything's fake. Even the real stuff.
01:44:10 So that's going to just make things really fucking chaotic. People will get desensitized
01:44:20 to reality,
01:44:23 Just generally.
01:44:25 Because everything will be questioned as fake.
01:44:30 And so.
01:44:31 Even when you see real stuff, like if you see real footage of a war, like a lot of this footage that you're seeing coming out of Gaza and stuff like that, people will start to question whether that's fake or not. And look, maybe some of it could be, we don't know. That's the problem. We don't know. And so people, instead of seeing this.
01:44:51 You know, like in dealing with a real crisis.
01:44:55 or real atrocity, they'll just assume that there's a good chance it's being fabricated, that it's all part of, you know, some psyop or something like that. And so people
01:45:08 won't react appropriately to actual real things that are happening in the world. You know, you can have
01:45:20 You know, real psychological issues.
01:45:23 Right, like real, real psychological issues. You know, people always talk about how zoomers have an unusual amount of anxiety, right, because of their exposure to maybe the Internet and pornography and stuff
01:45:41 like that. You're going to have an insane, insane amount of that. I mean, if you have that much anxiety generated by just kind of the exposure of the Internet and that kind of environment that we're not evolved to live in, imagine how much worse it is when that environment is as
01:46:02 untrustworthy as it's going to become with AI. You'll have a lot of people who are paranoid, you'll have a lot of people that are apathetic. You're going to have people who
01:46:16 lose touch with reality.
01:46:18 And look, the flip side of that is you're going to have a lot of people that that will indulge in this technology.
01:46:26 They'll use this technology to create, on purpose, their own fictional world, and they'll want to live inside this fictional fucking world. And that's going to detach them even further from reality. This is all stuff that's right around the corner.
01:46:44 This is all right around the corner. You're also going to have, we talked about this a little bit last stream, where they'll make fake footage of historic events.
01:46:54 And and the fucked up thing is.
01:46:57 That's really difficult.
01:47:00 To verify.
01:47:02 Right, like unless you you are physically holding in your hand.
01:47:07 The 35mm or 16mm film reel.
01:47:12 That that footage comes from.
01:47:15 It's almost impossible to determine whether or not the the historic footage that you're watching is is real. You can't verify it.
01:47:26 You can't verify footage from a, you know 100 years ago.
01:47:30 And so it'll be very easy for them to generate fake historic footage and pass it off as real. I think we can all see, like, one of the obvious things that they could quite literally make multicultural historic footage. Like, it's bad enough when you see a movie with a cast of black guy is some kind of victory.
01:47:49 And.
01:47:50 You know anybody really? And and just imagine if they start showing historic footage of, you know, 19th century America and it's like all these blacks and Pajeets and Mexicans and.
01:48:10 You know, George Washington is black. Yeah, they could do that shit.
01:48:15 And.
01:48:17 People wouldn't know, especially the younger you get. These people, there's no way to fucking know. You wouldn't know. How would you fucking know?
01:48:25 Unless you have those film reels in your hand, there's no way to fucking know.
01:48:30 So I think that's going to be a major fucking problem.
01:48:35 That's going to be a major fucking problem. And like I said, it's right around the corner, like, right around the corner.
01:48:44 And the secondary effects.
01:48:47 Of a lot of this stuff is going to be.
01:48:51 really intense, because it's happening at a time where
01:48:56 all the negative effects that are going to be created by this technology are things that are already happening. People have lost trust in media, they've lost trust in institutions, they're already paranoid. They already think everything is fake and gay. You know, there's people, we have flat Earthers,
01:49:16 for fuck's sake. You know, that's how far the distrust of the institutions has gone, is that people don't even believe in just basic elements of reality. They can't even agree on the shape of the fucking earth
01:49:31 Anymore.
01:49:32 And so you're going to have just a lot more insanity. You're going to have a lot more insanity.
01:49:40 The other thing, though, and this is what's going to be tricky for a lot of us, is I think a lot of people are going to say, well, you know,
01:49:48 I
01:49:48 guess the solution then is for us to pack up our things and go and live
01:49:55 a rural life away from all the crazy people that will be in the cities, away from the diversity, and we'll detach from the computer technology that's creating all this psychosis in the public.
01:50:13 But the problem is if you do that, you're going to make, you're going to raise kids who are going to be suckers.
01:50:20 They're going to have no way of dealing with with all the AI when they eventually, when they do encounter it, they'll be tricked harder than anybody.
01:50:30 Because they're not going to be exposed to it enough to be able to spot, you know, the difference between what's real and what's fake, which is already going to be hard for even people who are immersed in
01:50:44 it. So you can't just totally
01:50:47 Shield them from it either.
01:50:49 I think you're going to have.
01:50:53 A lot more litigation. You know the the legal system is going to get crazier. You're going to have a lot more social isolation.
01:51:03 You know, people are going to be, in fact, you're going to have people that just kind of completely hermit out.
01:51:10 Completely hermit out because they just they don't want to.
01:51:14 They don't want to get.
01:51:16 Tricked by this kind of stuff, you know they don't wanna. They don't. They don't want to get into the actual like, the literal matrix that's developing which which leads to the next question. It's like if this kind of technology.
01:51:29 which still takes time to render, right? Like, when I was making those clips for the bee video, you know, each clip I think it generated was approximately 8 seconds or so long,
01:51:43 and it probably took about, you know, roughly a minute, maybe about a minute, to produce.
01:51:53 And so you would type in your prompt and you'd wait and about a minute would pass by and it would generate it. So it's not real time by any means.
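Taking his numbers at face value (roughly 8 seconds of output in about a minute of waiting), that works out to something like seven to eight times slower than real time; a quick back-of-envelope:

# Back-of-envelope from the approximate numbers mentioned in the stream.
clip_length_s = 8     # seconds of generated footage (approximate)
render_time_s = 60    # seconds of waiting per clip (approximate)

slowdown = render_time_s / clip_length_s
print(f"~{slowdown:.1f}x slower than real time")
# A real-time video call would need this ratio at or below 1, i.e. roughly
# an order-of-magnitude more compute or much faster models.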
01:52:02 But it's going to be like everything else. It's going to be.
01:52:05 At a certain point in time, it's going to get to the point where you will be able to have a FaceTime call that looks real as shit with one of these.
01:52:15 Fake people and it will respond in real time.
01:52:20 And that's not that far off. That's probably another 10 years probably off. And yeah, at first it'll be limited. It'll be expensive because the, the overhead like the, the, the computing power necessary to make it real time will probably be pretty tremendous. So the hardware and the electric.
01:52:41 You know the electricity necessary.
01:52:44 To run all that hardware and you know the cooling and you know it's it'll be expensive to to get it to get it working. But just like everything else, it'll get cheaper, it'll get cheaper and it'll get easier for them to to do.
01:53:00 It.
01:53:01 And so you're going to have virtual
01:53:04 friends. Facebook announced that that's one of the big things that they want to do. They said that the average person, oh, we've determined with our research the average person needs about 15 friends and they've only got 3.
01:53:17 So that means we need to give them 12 AI friends.
01:53:21 Well, that's what this technology is going to be used to do. You'll have 12 AI friends that you FaceTime with, or maybe they pop up on your computer screen, or in your
01:53:35 Meta vision or whatever their you know, whatever their AI helmet thing is called.
01:53:40 And.
01:53:41 You'll just have them pop up and say hi. It's Billy, like. Oh, hey, how's it going, Billy?
01:53:47 And.
01:53:50 It'll be like having
01:53:52 a fake friend that works for Facebook.
01:53:58 A fake friend who randomly, in every conversation, is like, yeah, you know, but you know what really bothers me?
01:54:06 Is that you haven't invested enough money in Bill's Gold Co. Make sure to go to slowrosegold.com.
01:54:17 He'll just be doing ad reads, you know, halfway through your conversation with him.
01:54:22 You'll be like, oh, man, I'm really worried, you know? Martha's got cancer. Oh, yeah, that's terrible.
01:54:29 But you know what would really cheer you up? Maybe if you went
01:54:34 to Taco Bell to take advantage of their new $8 box of tacos.
01:54:40 Really hits the spot. You think shit like that's not going to happen. It is.
01:54:46 Shit like that absolutely is going to happen. Yeah. So you have this kind of shit happening while you'll have all the polarization that's happening because of the demographic change.
01:55:01 We're in for some real,
01:55:05 real social unrest, real social upheaval. This is why, honestly, and that's the funny thing, I'm not super concerned about the whole Iran situation. There's a lot of people that think that, like, oh, it's going to start World War
01:55:20 Three and whatever.
01:55:22 I think people are vastly overestimating.
01:55:27 a lot of things. Like, even if we were to go and bomb the shit out of Iran and go all-out wartime with Iran,
01:55:36 Iran,
01:55:38 as much as, you know, they're not Afghanistan,
01:55:44 they're also not able to go toe to toe with the United States military. They're not. I mean, they could do some damage and it wouldn't be fun and people would die. But, you know, it's Iran. At the end of the day, it's Iran, and Iran knows
01:56:01 that they're Iran. You know, they don't have any interest in going through that either. And so, like,
01:56:08 that is less of a threat. That's actually less of a threat.
01:56:13 Is it a possibility that we have some kind of conflict? Yeah, it's totally possible, but it's less of a threat to our future as a race than just this AI stuff.
01:56:28 Like, just the AI stuff itself
01:56:33 poses more of a threat,
01:56:36 to not just white people, just humans generally,
01:56:43 than any kind of implausible World War Three scenario.
01:56:48 So it's.
01:56:50 It's important stuff for us to pay attention to and keep an eye on. It's moving faster than I ever thought it was going to. I used to think that I was pretty good at predicting
01:57:02 technologies, because for a while I was, you know, always on the bleeding edge of whatever was new, and
01:57:10 usually had a good feel for what was going
01:57:14 to happen, you know, what direction software and hardware were going to go in and what direction
01:57:20 technology in general was going in. The AI stuff really threw a wrench in those gears, because it was always something I thought that, while possible, was going to happen after I was gone. Like, I just thought that
01:57:36 the, you know, the
01:57:39 capabilities of computers were limited enough that you weren't going to have that kind of
01:57:46 neural network type thinking stuff, you know, machine learning. I didn't think that was something that was really going to happen the way that it's already unfolded. So
01:57:56 I think we're, we we have some serious shit we got to worry about with this.
01:58:02 But at the same time, there's, you know, look, there's lots of opportunities, there's lots of possibilities. Just like with the Internet.
01:58:09 We can't, because look, whether we like it or not.
01:58:15 whether we're afraid of what might happen, the bad, you know, ramifications, in the meantime our enemies, especially, like, you know, the transhumanist Jews, are going to be embracing this technology.
01:58:28 They're going to be embracing this technology and they're going to be championing it and leveraging it to use against us.
01:58:38 And so we can't just act like it's not there. That's why we got. That's why I you know, that's why I.
01:58:43 Fucked around and made the.
01:58:45 The bee stuff because I was like well.
01:58:48 What? I gotta at least know what it's capable of doing.
01:58:51 And I got a pretty good taste as to
01:58:56 what it can do, especially after seeing stuff like this. I was like, fuck, man,
01:59:01 we are fucked. Like, this is,
01:59:04 this is, you know, with the volume down especially, it's hard to figure out
01:59:10 that it's all fake. Other than, like, all the white people and the sometimes goofy faces and intense crazy-people stares of the people.
01:59:23 But yeah anyway.
01:59:27 Ah.
01:59:29 Yeah, like I said, it's a pretty, pretty laid-back stream tonight.
01:59:34 The the.
01:59:36 That's basically what I wanted to talk about. That's basically what I wanted to say. It's going to create informational nihilism. People aren't going to trust reality anymore.
01:59:45 And look, I think it will also, I think it'll actually give
01:59:51 more people reasons to believe in, like, the Silicon Valley
01:59:59 religion that, you know, we all live in a hologram.
02:00:03 Because it's going to become less of a stretch.
02:00:07 Right. Because the whole premise is, you look at this technology.
02:00:15 You look at this technology and think to yourself, well, holy shit.
02:00:19 Computer graphics used to be Pong
02:00:22 in the 1970s, and then it was Pac-Man, I think 1980, isn't that when Pac-Man came
02:00:29 Out.
02:00:30 So I think it was Pac-Man in 1980.
02:00:34 We went from Pac-Man.
02:00:36 To this.
02:00:38 in 45 years.
02:00:41 So it's not much of a stretch.
02:00:45 To assume that on a long enough timeline.
02:00:49 you would be able to create real-time fake shit that talks to you, reacts to you, and seems like a real person. Like, look at it: if in 45 years you go from Pac-Man to
02:01:05 this footage,
02:01:06 well then, in 145 years,
02:01:11 Maybe you go to uh, you know, neural link style implants.
02:01:16 That.
02:01:18 that plug directly, just like, you know, in The Matrix, right? They plug directly into your brain and just replace whatever,
02:01:32 you know, input
02:01:32 you would have been having coming into your eyes and ears, and your sense of touch, it would all just be replaced by digital signals
02:01:43 that are reacting in real time. People will take it a step further and say, well,
02:01:47 if it's possible to make all these AIs, you know, maybe the universe, it's all just a hologram. And look, that's also going to play into the narcissism of the people that believe it, right? Because that quite literally will mean they are the main character, that no one else is real, no one else fucking matters, they're the only people that are
02:02:07 Actually real, but then you'll have other people that will interpret it differently and say, actually, maybe I'm an AI. How would.
02:02:15 I know the difference.
02:02:17 How would I know? Like, if the technology is going to get to a point where AI becomes self-aware, how would I know that I'm not an AI? Maybe that's the simulation. Maybe I'm an AI that's been put into
02:02:31 this,
02:02:32 you know, basically an open-world map in a 3D game,
02:02:38 just to see what will happen. How do I know that's not the case? So people are going to start believing that stuff. There's already people that kind of believe it, but I think people are actually going to embrace that as a theology.
02:02:51 Because like I said, it's going to play into their, their their narcissism. It's going to play into their desire to be, you know, the main character syndrome and it's going to play into especially sociopathic people because they already kind of look at other people as as non you know non.
02:03:11 people. Like, they don't view the people around them as people with souls and things like that. They're all just utilities, you know, they're all just objects to be
02:03:16 manipulated
02:03:21 in order to further your goals. And so it's like the perfect religion for people that think like that, which is a great many people that live in Silicon Valley. So, you know, I think you're going to have somewhat of a religious movement in that area. You're going to have people that kind of look at
02:03:41 the whole world as if, you know, almost as if we are
02:03:50 creating, recreating, the technology that we might already be enslaved by.
02:03:57 And look, it would be hard to argue. You won't be able to argue those people logically out of it, and it'll be just as difficult, if not more difficult,
02:04:09 To argue them out of that worldview, as it would be to argue anyone else out.
02:04:13 Of their theology.
02:04:15 So that's.
02:04:17 That's that's kind of gonna be a big.
02:04:19 Deal.
02:04:20 It's going to be a big deal. So I think that, yeah, ultimately the solution for me, at least the way I'm looking at it, is, yeah, you don't want to be in the cities, as the mental health crisis is going to skyrocket. You're going to get people that are going to get crazier and crazier, especially in big cities.
02:04:37 You're going to have more, more racial tension just amplified by this AI stuff. You're going to have just more crazy people generally.
02:04:47 So you want to get away from people,
02:04:52 and try to create your own kind of
02:04:57 Communities and and I think that's already something people have kind of articulated and that's something people are already sort of working towards.
02:05:05 But at the same time, we've got to be careful not to ignore this technology, and not try to act as if, you know, not be like the Amish, where we can just plug our ears and go, no, no, no, I can't hear you, I can't hear it. Eventually
02:05:19 that technology will hear us even if we don't hear it. So it's something that we have to stay up to date on and something we have to leverage. And look, there's lots of
02:05:30 Positive things, at least for right now, that.
02:05:32 We can use it for.
02:05:35 Like, one big thing that I already use AI for is all the stuff that, you know, people used to say, oh, I went to, like, the University of YouTube, right? Which is true. How I learned a lot of my
02:05:52 3D
02:05:53 software, and a lot of just design and everything else, is I watched a lot of YouTube videos, watched a lot of tutorials.
02:06:01 And so you're going to get a lot of people that,
02:06:04 instead of doing that, I feel like that's kind of the old-school way, you'll create AI agents that will walk you through how to use new software. It'll look at your code and fix your mistakes, and, you know, like, Google's got their AI where you can create, like, an agent
02:06:23 and share your screen. Like, if you were trying to learn After Effects, right, the way I had to do it is I had to get a copy of After Effects and then I had to watch YouTube videos, and through trial and error, and then communicate with people on forums, which sometimes
02:06:42 would take hours to days to get a response. It was a lot of investment sometimes just to figure out a simple thing. Like, you'd have
02:06:52 a big project due, you know, you'd only have a couple days, and you'd come up with some technical problem with your project file that you couldn't figure out.
02:07:00 And now that's no longer an issue, because with AI, with Google's AI just as an example, you can share your screen with it and show it your project file as you're working on it, and tell it, you know, with your voice, just say, hey, Google,
02:07:19 how come this After Effects project file's not behaving the way that I want it to? And it'll be like, oh, you need to
02:07:27 click here, here, let me generate some code real quick, you can paste this expression into that layer, and now magically it'll do everything you want. That's insane. That kind of stuff is something that you need to leverage. Another thing that we need to think about
02:07:44 is,
02:07:45 and I've talked about this a little bit before,
02:07:47 you know, like, one of the big hangups for poor people,
02:07:54 especially when poor people are
02:07:58 afraid to engage with,
02:08:01 you know what, let me rephrase that. When we are as underfunded as we are, one of the major disadvantages that we have when it comes to competing with and engaging directly with large institutions
02:08:14 is that, you know, they're able to engage in lawfare. They're able to,
02:08:19 uh, when you have more money, obviously that gives you more resources. It gives you the ability to,
02:08:31 you know,
02:08:33 well, again, have lawyers, basically. Like when people were getting deplatformed, right,
02:08:41 there were instances where, if you'd had the ability, if you had, like, a quarter of a million dollars you could spend on a lawyer, and enough people did that, you might have been able to get people back on YouTube that were kicked off
02:08:54 of YouTube.
02:08:55 But the people that were getting kicked off YouTube were people who were barely scraping by, and you just kind
02:08:59 of had to accept it.
02:09:00 Same thing with my book getting banned on on Amazon with no explanation. Right? I if I'd been able to afford a lawyer, I might have been able to get it back.
02:09:10 But, you know, I had no ability to do that, or at least I didn't want to invest the amount of time it would take me to sit there and learn all about the relevant legal ins and outs I'd have to learn, and go over their terms of service, you know, with a lawyer, and
02:09:29 I can't. I couldn't do all that.
02:09:31 But with AI, a lot of
02:09:33 that stuff you can
02:09:34 do. There's a lot of stuff that legally would usually require a lawyer that I think we'll be able to leverage AI to start accomplishing. Obviously, if you have any kind of job that involves coding,
02:09:52 Or you know.
02:09:54 I, like, I use AI
02:09:57 on a fairly regular basis now when it comes to writing Python scripts and things like
02:10:02 that, and I don't even really know Python.
02:10:07 So all of a sudden I'm, like, you know, I'm generating Python. Well, I'm not, the AI is, but, you know.
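He doesn't say which scripts he's had written; purely as an illustration of the kind of small utility an AI assistant typically produces, here's a hypothetical example (the folder name and file pattern are made up):

# Hypothetical example of an AI-written utility: sort downloaded clips
# into dated folders. Nothing here comes from the stream itself.
import shutil
from pathlib import Path
from datetime import datetime

SOURCE = Path("downloads")              # made-up folder of generated clips

for clip in SOURCE.glob("*.mp4"):
    day = datetime.fromtimestamp(clip.stat().st_mtime).strftime("%Y-%m-%d")
    dest = SOURCE / day
    dest.mkdir(exist_ok=True)
    shutil.move(str(clip), dest / clip.name)
    print(f"moved {clip.name} -> {day}/")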
02:10:12 And so,
02:10:15 it's something we can leverage for good. But we need to be aware of the inevitable horrific, bad things that are right around the corner as a result of this technology.
02:10:26 Anyway, I think I've rambled enough
02:10:30 about AI. Let's take a look at Entropy, see what you guys are saying over there.
02:10:36 Ah, with this we got Blue Cord with a big dono right out the gate. Oh, by the way, the background, I'm playing around. That's not AI. I actually made that. It's not that great, but I started redesigning the,
02:10:55 Let me see here.
02:10:59 I started redesigning the.
02:11:03 Oh, and I added a real-time clock. Look at that, fancy. We got
02:11:08 fancy time
02:11:11 up in the upper left-hand corner there. That's
02:11:15 Harder to hear than you would think.
02:11:16 No peers.
02:11:20 Part of it should be at.
02:11:21 Least.
02:11:22 And yeah, I was trying to sex up the
02:11:27 look, give it a little bit of an update. I'm not quite there yet. I didn't get it quite looking how I wanted to.
02:11:34 But it's a little different. Little different. None of that's AI. That's just Cinema 4D.
02:11:42 But anyway.
02:11:44 Uh, yeah, I want to,
02:11:46 I wanna
02:11:47 give it a whole facelift. I just haven't.
02:11:50 I got started on it.
02:11:55 Let's take a look here. Blue Cord.
02:11:59 Blue Cord with a
02:12:00 big dono.
02:12:02 Money is power. Money is the only weapon that the Jew has to defend himself with.
02:12:07 Look how Jewy this fag is.
02:12:25 Blue Cord.
02:12:28 Good morning, Mr. Stack. What is the threshold for sponsoring a hive, lump sum or total donation? As always, thank you for what you do. Your work is appreciated. I'll have to figure it out, just only because, like,
02:12:42 I mean, it needs to be high enough to where, like, cause I will run out of hives.
02:12:48 So it needs to be high enough to where, like, I don't run out of hives, especially this year, where I don't have,
02:12:52 I was,
02:12:55 I was rapidly approaching, I thought I'd get up to triple digits in hives this year, or close to it.
02:13:02 That is in no way going to happen this year. I am well, well, well into the double digits.
02:13:11 In fact, for a moment I thought I might even get into the single digits. Not, well, you know, single digits, but you know.
02:13:18 I don't know.
02:13:19 I'm sure you'd qualify, I guess. At this point I'll have to figure it out. I just kind of wing it over
02:13:27 here
02:13:27 when it comes to stuff like that. Maybe I'll, maybe I'll
02:13:30 Have to come up with some kind of official.
02:13:34 Maybe I should make it like a
02:13:37 SubscribeStar thing, right? Like an adopt-a-hive kind of thing, and have, like, a limited number of them. Like, oh, there's,
02:13:45 maybe we could always make, we'll have, like, 10 European hives that have to live among the Africans.
02:13:54 I'll just post the, maybe we can auction them off, and that can be kind of fun.
02:14:01 You know, I have 10, 10 European hives that I keep
02:14:05 closer to the house than the other hives, because, yeah, because they're European hives.
02:14:10 And.
02:14:12 You know, their survival is, who knows, right? But I try to keep, your donation keeps them European. We could do something like that, so that with your donation you prevent them from being Africanized. Because it would be expensive to maintain that, by the way, because,
02:14:31 you know, every time I requeen, it's a minimum 50 bucks, and a lot of the time the African bees, they kill the queen. And then if I actually get a package of bees, that's really gone up in price; you're now looking at at least 200 bucks for a package of bees now.
02:14:50 It used to be you could find them for less than 100 bucks, barely less, about 100 bucks. But you could find
02:14:56 Packages for like 90 bucks.
02:15:00 And now it's like 200.
02:15:01 Something.
02:15:02 So and yeah, like my my my $250 bees from last year died.
02:15:10 So.
02:15:12 But I want to figure out we'll figure something out.
02:15:16 But yeah, that that'd be fine. Put your name on the. Put your name on the hive.
02:15:21 Take a picture of it for the maybe, maybe play it at the at the end of the stream, something like that.
02:15:27 I mean, I don't know if I'm not gonna be like regular status updates, but you know what I mean? Like.
02:15:32 Maybe maybe periodic periodic status updates of your hive.
02:15:39 Uh, but thank you very much. Blue chord.
02:15:42 Then we got Gorilla hands.
02:15:54 Gorilla hands our current political situation in the US reminds me.
02:15:59 Of an outer limits episode from the 90s.
02:16:03 Called the deprogrammers.
02:16:05 In this episode, Earth has been taken over by a hostile alien race. Humans are called Joe Lem. That's kind of funny.
02:16:13 The protagonists are part of a rebellion. In the end, they are betrayed by one of their own, the alien leader at the end reminds of.
02:16:27 Net, there's Part 2 Netanyahu. I would send you a link, but I don't. I don't want to be linked. Well, I'll tell you what. I'll.
02:16:34 I don't know that I've ever seen that.
02:16:41 But I'll copy that into my notes.
02:16:42 Here.
02:16:45 I haven't seen an outer limits episode in a really long time.
02:16:55 Really long time.
02:17:00 But yeah, thank you very much. Gorilla hands. That could be that.
02:17:04 Could be interesting.
02:17:07 And then we got Gorilla hands again. Could you play your skit?
02:17:11 With you singing that we need to go back to the 90s.
02:17:16 You have a lot of talent. Maybe you should do something within with the cyborgs. Well, I don't know if my singing in that is. I wouldn't. I wouldn't call that.
02:17:28 I know there's a lot of talent when it comes to that.
02:17:31 I don't think I don't have a button for that.
02:17:33 It's on my telegram, I'm sure it's on my telegram.
02:17:39 Ohh man it's so far back though.
02:17:44 Let me see. I'll scroll U. Let's see if I find it real quick.
02:17:59 You might even be on my accent. I think I I did post it. Not super recently, but I don't post a lot of videos on that, so it would be easier to find there.
02:18:08 Let me check X. Let me see if.
02:18:10 I can easily.
02:18:14 Pop it up here.
02:18:17 Go to the media tab on my profile.
02:18:28 I would be like right there.
02:18:35 You not see it.
02:18:46 Let me look, let me look.
02:18:47 At my telegram here.
02:18:54 And look for the let me.
02:18:55 Just look for the word 90s.
02:19:14 I do not see it.
02:19:23 I see my gay 90s episodes.
02:19:27 I thought for sure this was in my my telegram.
02:19:35 Well, I'll tell you what. I'll, I'll, I'll play it. Uh.
02:19:41 I'll play it next stream or something. I just don't see it readily available.
02:19:50 Alright, this might.
02:19:54 Is this?
02:19:55 It.
Ad Voice Over
02:19:56 Go out there and face the hate and end it once and for all.Devon Stack
02:20:01 This might be it.02:20:06 See if I can get this to.
02:20:07 Play here.
02:20:15 Do.
02:20:20 9 days.
02:20:22 That's it.
02:20:28 This might be it, I'm not sure.
Ad Voice Over
02:20:35 Go out there and face the hate and end it once and for.Devon Stack
02:20:38 All good thing.02:20:39 Back to the 90s. Good thing we went back to the.
02:20:42 Yeah, that's not. That's not like awesome singing.
02:20:44 There's no more racism in the 90s. We went back down to the 90s.
02:20:50 1990 to 1999.
02:20:55 Not got that many bad.
02:20:58 There's no racism and everything is good. The 90s, nineties, 1990s.
White Rapper
02:21:05 I like black people a lot. My friends basically think that I'm black at heart. I'm a graduate student here, but yeah, in fact, I have a Rep my name is Matt, but now I want to be black. The pigmentation of the thing I like, I want to relax and just do some sets.02:21:18 They're taking care not to make a mess, then the chorus.
02:21:20 Goes that map.
02:21:22 He think he.
02:21:23 Black anyway. Yeah, so it's good living in this hood, you know?
Devon Stack
02:21:28 Yeah, that that's not like.02:21:32 I was.
02:21:34 I was purposely going over the top of the singing there.
02:21:39 I'm not much of a singer. Believe it or not, I I I don't I.
02:21:43 Don't.
02:21:44 I don't have a.
02:21:45 I don't have the ear for.
02:21:47 It.
02:21:47 I think you know what I mean.
02:21:49 I was never. I was never great at instruments or or anything like that. Like I I could sort of sort of play piano and I could sort of sort of play violin, but I was never, you know, I I never had.
02:22:02 Like.
02:22:03 You know, there's some people.
02:22:06 Well, just like anything else, like, you know, there's some people that can just draw and it looks like a photo and just like, holy shit, I can never do that.
02:22:12 Not not not that.
02:22:13 Their talents a lot less useful these days, but.
02:22:18 I always wish I could do that and same thing with music I've never had.
02:22:23 I've never had the ability to do that.
02:22:27 But uh yeah, there's the 90s, good old 90s.
02:22:33 All right. And we got. Uh.
02:22:36 Where's the window of all this stuff?
02:22:43 So thank you very much Gorilla hands and we got man of low moral fiber says. Is it true that all negroes are gay?
02:22:53 Not all negroes, but but many, especially if they've ever been to prison.
02:23:00 You know, if they've ever been to prison, they are.
02:23:05 They almost all go gay.
02:23:08 Adam says things says I was ordered by a court of law to pay you this fine for denying the quatapa a honestly, someone really should do something about those fucking quit tapas.
02:23:25 Or quit. So I think you.
02:23:28 How about quitters?
02:23:30 How about their? How about their quitters? Well, I appreciate that. Adam says things.
02:23:35 And then we got Penelope, Speaking of Penelope. Penelope Maynard with the big big don't know.
Money Clip
02:23:42 Children. Today we'll be reading the best Christmas ever. Our story begins with.Devon Stack
02:23:48 The magic negro.Money Clip
02:23:51 Sure.02:23:59 Where did the soul men go the Christmas?
02:24:04 Yeah.
Devon Stack
02:24:14 Ever.02:24:17 All right.
02:24:19 And we got.
02:24:21 Really, all all, all she says is, uh, is high.
02:24:28 Well, hi there, Penelope.
02:24:30 Right back at you.
02:24:33 I definitely appreciate the support there and big support of the show. Again, you're hive.
02:24:40 Unfortunately did not survive the winter you had saskatchewans bees, the fanciest bees they could find.
02:24:48 And I now look.
02:24:52 They're called Saskatoon areas because they're from Saskatchewan and I thought there was a little bit of a risk. I was like, I don't know, maybe maybe getting bees that were bred in Saskatchewan, putting them in the desert. It's not the best idea. So, like I knew there was a possibility, but I don't think that's what happened. I don't. I don't think that's what happened.
02:25:13 Because, you know, we had, we had a big, not just me. All beekeepers had a big weird bee die off this year.
02:25:20 So we will have to repopulate them. I've got unfortunately, only I think I've got 2 for sure.
02:25:28 That are European bees.
02:25:31 Two hives for sure. And then I've got a lot of mutts, and then I've got some mean ones. But we'll have to get you a proper.
02:25:40 Well, to get you a proper one back going back in your hive again, well actually technically the hive that.
02:25:49 The that the hive is actually already has, it has Italians in it right now. So we'll just say that's yours. It's the same high.
02:25:58 So you're you're, we'll we'll just we'll rechristen at the the Penelope Maynard Hive.
02:26:05 But yeah, it's it was a. It was a rough winter when it came to the the the Nice Bees.
02:26:12 Well, I appreciate that, Penelope, as always.
02:26:15 And we got Adam says things again.
02:26:19 Also, I know the pollen comes from male plants. I'm not stupid. Come on, man. I thought it would taste good.
02:26:27 What are you? I have no.
02:26:28 Idea we're talking about.
02:26:32 Are you talking about the?
02:26:35 Ohh you were you. You were talking about the.
02:26:38 Is this going back to when you were asking about, like marijuana, honey? Well, like I said, it doesn't matter. Like they don't, they don't.
02:26:47 They don't make.
02:26:50 They don't make marijuana honey, because here's The thing is.
02:26:56 So bees are after the the pollen and the nectar, right?
02:27:03 And there's not a lot of nectar on on male plants.
02:27:10 Because, you know, they got the male and female.
02:27:12 Plants.
02:27:13 When it comes to marijuana, and I guess some Aphrodite plants too.
02:27:17 And they are. There's some plants that are pollinated by by bees.
02:27:23 And there are some plants that are pollinated by wind and what happens with with marijuana plants. I might be wrong about this, but I don't think so.
02:27:33 Cause.
02:27:33 Back when my degenerate days I.
02:27:36 Study growing weed and may or may not have done so, and and they so they're they're party by wind. So like the the male plants create.
02:27:48 Little pollen explosions. In fact, if you are growing indoors and you don't have feminized seeds.
02:27:55 That'll ruin your whole crop. You'll just like, if you're not paying attention to which plants are male and which ones are female, and getting rid of the males as soon as you know before that happens, you'll open up your grow room one day and they'll just be like a fine, fine dust on all of the the plants and.
02:28:16 And all. Now you've got really seedy marijuana all of a sudden, so I don't think that bees even.
02:28:24 Go after hemp or marijuana. I could be wrong about that, but my understanding is bees aren't really attracted to that. There's a lot of plants you would think bees would like and then they don't like at all. They're not as they're not as into.
02:28:40 Like they're they really like some cactus flowers. Like like they really like Peruvian apple. Cactuses, those big white flowers. They don't really like the.
02:28:52 Like most of the cactuses that just have, like pads that you would expect, you know, have, like, the typically have, like, yellow flowers, they're they're not super into that.
02:29:02 Because I I should know cause right now I've got a bunch blooming and and when I do see bees on them, they're no. I see occasionally, but when I do see bees on, they're not. They're not honey bees. They're like some weird local like sand to bee or something.
02:29:18 Uh, let's see here. Connect, connect.
Google AI Romanian Guy
02:29:28 Hi, Clark.Devon Stack
02:29:36 Connect.02:29:38 Says Debbie might find this interesting. A case of one side of one side's propaganda.
02:29:49 Requisition requisitioned by the opposite wait.
02:29:54 1.
02:29:57 A case of one side's propaganda requisitioned by the opposition in 2002 to sell the Blair 8 beginnings of replacement migration to the cocked British public BBC TV series NCS Manhunt.
02:30:17 In Series 2 Episode 4, broadcast propaganda piece about a secret right wing organization.
02:30:26 Orchestrating a string of race based crimes in order to start a race war. The the episode actually achieved the opposite effect. The leftward producers wrote a speech for the antagonist that inadvertently revived patriotic feelings.
02:30:46 In part of.
02:30:51 The British public it went viral among British right rightist groups seven years ago.
02:30:58 A propaganda reverse.
02:31:04 Uno.
02:31:06 And then you have a link. I'll. I'll check it out. Well, here's the thing. There's a lot of people that.
02:31:10 Like to think that.
02:31:13 And look, sometimes propaganda does backfire, but ultimately.
02:31:20 It doesn't happen as often as people think.
02:31:23 A good example of this is, you know, people. Uh.
02:31:32 There's usually. Here's the thing, rat point. As I've said often, rat poison has to taste good, or you don't eat it.
02:31:40 The rats wouldn't eat it.
02:31:42 Because you got to think of what you're doing is a rat, is out in the environment looking for food, there's other food. It's obviously surviving off something before you put the rat poison out. That's why you have to put rat poison out because there's rats. And so it's all already eating something out there. So you have to put something out there that is going to taste better than everything else.
02:32:03 What it's eating so that instead of eating whatever it's eating.
02:32:07 It's going to come and eat the rat poison.
02:32:09 And it's going to keep eating the rat poison because it doesn't the rat poison.
02:32:12 Doesn't kill it right away.
02:32:16 It.
02:32:16 It kills it slowly as it as it blocks its ability to produce vitamin K, which eventually leads to and there's different rat poisons, but the one that most people.
02:32:28 Eventually it builds up enough to where they can't produce vitamin K, which means they can't heal themselves when they injure themselves internally, which we all every animal does all the time, and so they basically end up dying of internal bleeding from minor minor injuries that they're not.
02:32:47 Healing from because their blood's not clotting.
02:32:51 And so.
02:32:53 But that takes some time, right? Like so it has to be really good.
02:32:59 It has to be really tasty.
02:33:01 And so often I think people think that, oh, this is bad, proper propaganda. Because look, all these right wing people that love it and it's like, no, I think that just shows you how many, how easy it is to trick people into. Like, I like. I haven't seen what you're maybe you're you. You've got it right. I'll check it out.
02:33:19 But like 1 case is like the Joker movie right? Like it was like Oh yeah, the base Joker movie, it's like, well, I don't know. Let's think about this for a second. It's about a loser who lives with his mom, who's obsessed with a black single mom who gets rejected, by the way.
02:33:39 By a black single mom who's bullied.
02:33:43 And the thing that makes him snap isn't isn't diversity, but it's it's white guys in suits, white guys in suits, hassling him on on the subway make him snap.
02:33:58 And then he freaks out.
02:34:02 Still obsessed with a black single mom.
02:34:05 And.
02:34:07 Then when he he he freaks out and and kills the.
02:34:12 You know, like the.
02:34:14 The television the late night talks, your host or whatever.
02:34:19 And he gets hauled off the big revolution that he that he caused all the protesters in his that love him are holding up basically carbon copies of Hillary Clinton signs, like, literally signs that say resist, which is what the the the Hillary Clinton you know the anti Trump people after Trump got elected.
02:34:39 We're holding out.
02:34:41 And it was, and it's just. And just generally it was just like like it was.
02:34:48 It was like, OK, there's funny parts and whatever, but it was obviously a degenerate movie with really bad messages. And then, yeah, and then the sequel kind of.
02:35:02 You know, double down on on, on everything and just made it into a shit show. And and there were people that were mad at me for pointing this this obvious shit.
02:35:11 Out.
02:35:12 This very obvious shit that it was just a RIP off of three other movies, you know, King of Comedy, Taxi driver and what was the third one?
02:35:22 Look ohh.
02:35:24 I forget the third one, but it was just a RIP off of of older movies, you know, just using the exact same things and making this pathetic white guy who's a basically a fucking loser a complete fucking loser who's obsessed with the black single mom who gets bullied by white chads and it and and, you know, trying to embrace that.
02:35:45 Like, that's your fucking swan song. It just makes you a massive fucking faggot and people got mad at me for pointing that out. But it was true. And when the sequel came out, I think it kind of, you know.
02:35:57 People they were. They didn't feel so emotionally attached to it anymore. Right. And so I think there are instances and like or like, you know, maybe the movie falling down from the 90s. There's still people that like that, but it's like now if you actually watch that movie, it's not about this sympathetic view of a white guy that's hot.
02:36:18 Now, first of all, he's not white. He's he's a Jew. You know the the the lead actor. Second of all, look at what happens to him throughout the day. He goes. He who does he attack? He, you know, first they say his base because like, the first people he attacks are like some kind of.
02:36:35 Minority gangsters, right, like like Mexican gangsters or something? It's been a while, so I've seen it now. But then. But then he was he attacked. He attacks the the white guy who's trying to help him after he finds out the white guy has a secret Nazi room. And in his secret.
02:36:54 Nazi room. He literally has like cans of Zyklon B.
02:36:59 And and that is so that's the big bad. And then after that, he helps that immigrant family.
02:37:08 That's that's having the cookout by the pool.
02:37:12 Right. And he hates the the rich White guys who, you know they work for, but he likes the the, the Mexican family that you know, like it's totally like, it's obviously leftist in in the way that it's it's it's portraying the world and then to make it even worse the whole thing is.
02:37:31 You know he he's mad because he wants to see his daughter or whatever, and it's like her birthday. And by the end of it, he's murdered.
02:37:40 And dead his bodies, like literally floating in the ocean just down the street.
02:37:45 While his his daughter and his ex-wife.
02:37:51 Like he's just been murdered. Like, just been murdered. They're fishing his body out.
02:37:57 And they're blown out candles like he never existed at the daughters party.
02:38:02 And it's, you know, and it's not framed like in, as in, like, oh, look at the tragedy of this, it's more just like, yeah, fuck him. He doesn't exist anymore.
02:38:12 So it's, you know, there's oftentimes people think ohh this is based you know ohh it's it's based to have a song it's based to have a song that that says you like you like to huff gas and have people fuck your wife.
02:38:29 And and on and. In fact, I'm even going to call the album cuck.
02:38:34 But because I say heil Hitler a.
02:38:36 Couple of times.
02:38:38 That's totally awesome. It's like, you know, you're just. You're just.
02:38:41 Kind of retarded.
02:38:43 Promoting that ohh and look, surprise surprise. What? What did Kanye say? Like literally yesterday? I'm done being antisemetic and now and and today he's blaming White people for the present system and he's he's defending Puff Daddy.
02:38:59 So it's like you know.
02:39:02 Oh, who would have who would?
02:39:03 Have thought you know that that a bipolar, you know, crazy guy like like Kanye would would, who's been all over the place since forever. Who would have thought that that would have happened, right?
02:39:16 But no, I was, I was told. I was told that this was going to be the anthem. You know, the anthem that begins with talking about how you love it when people fuck your wife. That was going to.
02:39:27 Be the anthem.
02:39:28 Of white people.
02:39:29 You wear where you say how Hitler over and over and over again. That was going to be the anthem. There was going to be, in fact, that was told specifically on Twitter. There were going to be stadiums, literally someone.
02:39:40 Said this. Ohh.
02:39:42 Yeah, there's going to be stadiums of of 80,000 people in a stadium, all saluting, all doing the salute and saying haha.
02:39:50 Well, because of this song.
02:39:53 Yeah. Or it's already a dead meme.
02:39:59 I think people get excited about there's such a lack of of of right wing content.
02:40:06 Hey, this is I'm just kind of going off right now. Like for all I know, I'll watch. What you what you what you're sending over and you'll be totally right.
02:40:14 Because, yeah, I'm not saying propaganda can't backfire.
02:40:19 But it's like.
02:40:20 Oftentimes, what you're what you're experiencing is that taste of the rat poison.
02:40:27 You're you're experiencing the the that saccharine taste. Ohh. It made me feel happy when I heard how.
02:40:33 Hitler in a rap song.
02:40:36 So I totally ignored the the poison.
02:40:39 About huffing gas and having people fuck my wife.
02:40:45 And saying.
02:40:48 Like like I'm a as this as if that's part of my identity as a white guy.
02:40:55 You know.
02:40:58 So yeah, like I said, I'll I'll take a look at it maybe.
02:41:03 Maybe it's awesome.
02:41:07 Did I just?
02:41:09 I went off so long on that I forgot if I.
02:41:14 No, not now. It's on my notes.
02:41:19 All right.
02:41:23 Then we got Tyler W 05.
02:41:43 Tyler W, 05, says Hey, Devon, been a while, just wanted to give support. Streams are great, as always. Beyond ISM was interesting. But as you said, I don't think alternative religions will ever be viable options. Well, I don't. I don't know. That's that's the truth. But beyond ISM is not violent. Do you think because?
02:42:03 Of this, we will have to leave religion in the past altogether. Now I think that that I'm not even saying that Christianity is needs to go or I'm just saying.
02:42:15 It might it. I think it needs a facelift. It definitely needs some new leadership at the at minimum, at minimum it needs new leadership. The but yeah, I I I think that white people will always have a spiritual side and you need a spiritual.
02:42:36 Side maybe we'll will that will that express itself in a different way like maybe.
02:42:44 Less like a supernatural way and more like a A, you know, a beyond ISM style. I I I don't know. I think you kind of need that supernatural side of things.
02:42:57 And I think everyone.
02:42:59 Sort of has a spiritual side, right? And if you're, if you have a religion that doesn't.
02:43:06 Address that. Then I don't think it's an adequate religion.
02:43:10 UM.
02:43:12 Yeah, I I look not not not to say, you know, look, you can be an atheist and and be a good person. I just think it's a lot.
02:43:20 There's a there's a lot more is required of of.
02:43:25 For that, I don't think I don't.
02:43:26 Think even atheist society?
02:43:28 And be and have good people, unless you've.
02:43:32 Not.
02:43:33 A very, very ethnocentric society, a very, very homogeneous society, and you're and you're really into getting rid of stupid people because you're you're going to have to. You'd have to be like you could have an atheist society if you were, like, super into eugenics.
02:43:53 Because, quite frankly.
02:43:57 And this is not a comment on the.
02:44:00 The veracity of of, of religion. I'm just saying religion keeps a lot of people who would otherwise be horrible.
02:44:08 In check.
02:44:10 And unless you're.
02:44:13 Sterilizing and executing those people on you know at scale like you're being really liberal about it. You know, you're you're you're the pit is open 24 hours a day and as soon as you're seeing these.
02:44:29 Bad behaviors expressing your you're just removing them from the gene pool.
02:44:34 You need something governing people who would otherwise be.
02:44:38 Awful.
02:44:39 And I think that religion, every religion to some degree, plays a role in governing awful people. So if you try to have a whole society that all those people that that are.
02:44:54 Governed by that, you take that away and you're going to have a lot of awful people are going to have. You're gonna. You'd have to basically just kill off a lot of people. You just would.
02:45:03 In order to have like to have an atheist society that wasn't just and I don't mean just like the lower, I don't mean like low IQ people sticking up, you know.
02:45:16 Corner stores and stuff like that. I mean, like the high IQ psychos too. Like you'd have to. Like, you'd have to execute.
02:45:26 You know, like CEO's and bankers and like anyone that, that was just awful. You'd have to. You'd have to execute them. I mean, you'd have to be just really into executing people.
02:45:38 And I I don't. I don't know if you can really manage.
02:45:40 That super easy.
02:45:43 So yeah, it's.
02:45:46 I don't. I don't know you. I don't know you could ever really have, like, a super.
02:45:52 Super Nice Atheist society and less like I said, you'd have to have just everyone have to have like, a real sense of community, a real sense of of, of belonging, and you have to have those forces at work where they feel like, you know, blood is thicker than water. We're all part of the same people.
02:46:12 Multicultural society. Forget about it. You can never fucking do it. You can never have a multicultural, multiracial atheist society be a fucking nightmare.
02:46:25 I mean, look, some people could do it, but just there's a lot.
02:46:27 Of people that would just be awful.
02:46:31 Especially, you know, in a multiracial society where you don't feel already, it's tough to get people to not be shitheads, you know.
02:46:41 Let's see here. We got Tyler W, 05, says I'm I get very conflicted over religion. On one hand, I can shoot down any religious argument logically. But on the other hand, I see the devil Everywhere, music, Hollywood, etcetera. I know you know what I mean. If you don't mind.
02:47:01 Me asking what are your personal beliefs? Do you grapple with this?
02:47:05 Well, like I said, not to. It's not like a cop out. I genuinely don't know.
02:47:11 I semi regularly pray and.
02:47:17 When I pray, I pray to Heavenly Father, whatever that means. You know, like I don't. But I I don't know who that.
02:47:22 Is.
02:47:23 I pray the way I would. I pray when I was a kid. I, you know, I don't, I haven't restructured it and I sometimes I say that in my prayer I'm like hey look.
02:47:32 It's Devon. I hope you're hope you exist. Hope someone's receiving this message and.
02:47:39 And here's what's going on. And you know, and you know, like I I, because I feel like, you know, if if there's a God out there and, you know, here here he knows this, you know, I'm not gonna offend anyone. But he already knows exactly what's going on in my head. So he'd be offended if I didn't say that, you know.
02:47:58 And I and I don't think now that said I've seen.
02:48:02 What I think was a manifestation of evil.
02:48:05 I've seen evil in people and I've seen I know what you're talking about.
02:48:14 And and that does make me feel like, well, if.
02:48:16 There's this.
02:48:18 Evil personified. I I would hope there's also good personified, right.
02:48:23 But I don't know. I don't really have. And to be honest, I don't really worry about it that much. I feel like there's not much I can.
02:48:30 Do.
02:48:30 About it either way.
02:48:34 I worry about it in the context of.
02:48:41 No, I don't really worry about it.
02:48:44 I just don't worry about it. I'm I'm. I feel like I get my own reward for being a good person. I just generally I feel good when I do good things for people. I feel good.
02:48:56 And just you know, that's just how my brain is for some reason like when I help my neighbors out doing something that makes me feel good.
02:49:05 When I do this and I feel like I'm doing some good for our people, I feel good, even even the streams that I'm really struggling through. It's I feel good.
02:49:15 You know morally doing it. That's how I know I'm doing the right thing. And so I I do the things that make me feel and I don't mean like, you know.
02:49:27 Like pleasure. I mean things that make me.
02:49:29 Feel.
02:49:30 Like good. You know, I'm not. I'm not responding to pleasure sensors like, oh, do whatever makes you feel good, but do whatever. It makes things that make you feel like you know the difference, things that make you feel good. Like like you're a good person. Like you act, you know, things that make you feel.
02:49:48 What's the right word? I don't even know the right word is.
02:49:50 But you know what I'm talking about.
02:49:53 Yeah, some people need more than that. I don't know.
02:49:59 I'm. I don't. I don't. I'm not super preoccupied by it, to be honest.
02:50:07 I yeah, I don't know why, but I just.
02:50:11 There was a time when I I struggled with.
02:50:14 You know the Mormonism thing just because you're raised in that so thoroughly.
02:50:19 That to kind of decouple from that is, is it's a difficult thing to do because you're not just decoupling from a theology, you're decoupling from a worldview that your entire family has.
02:50:32 You're decoupling from a community that you like. That's full of good people.
02:50:38 And and and. By the way, I would still go to a Mormon church and hang out and, you know, whatever. And I'd feel right at home. I'd feel super comfortable.
02:50:46 And.
02:50:47 And and likely I would like all the people and and you know.
02:50:51 I've thought about.
02:50:52 Just occasionally doing that just for fun, but.
02:50:59 Yeah, but that's the only way.
02:51:00 That it's ever been like a struggle for me, like where I've been like.
02:51:04 I was really I really strong. I yeah, I don't because I don't really think about dying. I guess maybe if I was like, maybe if I was like, really old.
02:51:13 Maybe then you'd get nervous or something, but I don't think I'm going to be afraid of dying. I I have. I've been in situations where I thought I was going to die more than probably the average person, and I kind of came to terms with it and so.
02:51:28 I don't.
02:51:31 I just it it just doesn't. Yeah, it doesn't bother me. I I know it's really.
02:51:37 Maybe. Maybe you were hoping for something more profound.
02:51:44 I just don't you know, it just doesn't pray. I I pray occasionally. I I try to be a good person and that seems to be enough for me.
02:51:52 But anyway, thank you Tyler WO 5 you know, hopefully you find what you're looking for there.
02:51:59 We got.
02:52:02 Richard Toynbee yo, Devon. Where the fuck is my shirt? I don't know. I don't run that. I don't run that company.
02:52:10 Like I said, like if if if there's an issue.
02:52:15 Ask for a refund.
02:52:17 And and then try to reorder it or something if you want it, but I have nothing to do.
02:52:22 With.
02:52:23 It's.
02:52:24 I don't make the shirts. I don't ship the shirts. I don't even know where the company is.
02:52:30 It sucks that you're having issue I you know, I apologize for that, but there's nothing I can do about it. I I could send an emails and then they they won't.
02:52:38 They don't respond to my emails. They said that they don't even pay me sometimes. So it's, I mean, they do eventually, but you know not.
02:52:44 When they're supposed to.
02:52:46 So maybe I'll change companies if people are having this kind of issue with it.
02:52:51 Sharpshooter says, hey, Devan Israel is getting bolder shooting EU diplomats. What I love about the conflict is how the conflict slowly erodes the Jewish victim holohoax card shame its biggest ally is still Zog puppet Trump.
02:53:08 Yeah, I guess that's that's the only thing I think about when it comes to what's going on in the Middle East. I I, I think a conflict with Iran is is a possibility. And I just don't worry about it in terms because Iran, it it's not going. I don't think it's going to drag us into a.
02:53:29 A World War if Iran ends up with a in a conflict with.
02:53:32 Us.
02:53:33 I think that it would be.
02:53:36 It would be so bad for Iran and they would have.
02:53:38 To know that.
02:53:40 That.
02:53:44 Then it wouldn't. It would be a short lived conflict that would just it would almost be like a conflict meant to.
02:53:52 Further negotiations in one way or the other, but we'll see. We'll see.
02:53:59 Man of low moral fiber.
02:54:02 Says giant pig truck does exist from the feed the pig ad campaign, circa 2007, 2008. Well there you.
02:54:12 Yeah, I'm. I'm thinking like, uh, like I said, anything that actually existed, it seemed to.
02:54:18 Be able to replicate pretty easy, but when I would say things like that were crazy town, it was like couldn't do it.
02:54:26 Man of low moral fiber says. I just spent the last 10 minutes or so trying to find the commercial, but I cannot. I distinctly remember a giant pink Piggy Bank truck driven on the highway in the commercial though.
02:54:42 Yeah, I mean.
02:54:44 Chances are Google. Google isn't just scraping everything off of.
02:54:50 YouTube there's video libraries that are, you know, very expensive to access. But when you have Google money, they've probably worked out deals with these big archive companies and they've.
02:55:03 Their AI has watched every commercial, every ever made, and every newscast ever made, and just pretty much, almost all the footage that available.
02:55:11 Corn pop. The bad dude. Never touch my sister.
Google AI Depressed Old Man
02:55:23 Corky, you're turning into a monster.Devon Stack
02:55:25 Total monster corn pop. The bed dude, are you more concerned about all this AI stuff? Just starting now? Or would you maybe be more concerned if we were already 10 years into the ship?02:55:37 Go.
02:55:38 Well, I think it, yeah, in 10 years it's gonna be way worse. I think we actually have a.
02:55:43 It it's going to be an interesting time and you will be able to use AI for some good stuff, at least in the short term.
02:55:51 And there's gonna be bad stuff we're not even thinking of that. People will use it for.
02:55:56 And there's going to be bad stuff that we think of the.
02:56:00 Maybe we get wrong. Just like with the Internet, but in the mean time this is a good it's a good opportunity before the average person. No, just like what if you were an early adopter of the Internet, you kind of had a leg up on everybody and AI's the same way. If you shy away from it because you think it's like evil and bad.
02:56:21 And it's Skynet or whatever you're going to be behind the game. I mean, I guess if you're retired and whatever, who cares, fuck it doesn't matter. But if you're still.
02:56:30 You know working.
02:56:32 Or you're just trying to like, learn stuff.
02:56:35 I've I've used AI to to help me fix old radios. It was surprisingly accurate, even when it came to understanding the functions of of tubes that haven't been made since the 1950s.
02:56:50 It knew the characteristics of vacuum tubes that you know.
02:56:53 Like so it's.
02:56:55 It had information I never would. I wouldn't be able to find anywhere else.
02:56:59 Where I literally in one instance I had an issue with the.
02:57:03 Circuit.
02:57:04 And it wouldn't let me. Maybe it would. Now this was this was last year. I think it wouldn't let me upload.
02:57:13 Like it's a schematic and just tell it like what was wrong but it did. Let me describe the the schematic and painful detail.
02:57:21 And give it the exact you know the voltage is in and the voltage is out and I told it like, yeah, when I read when I check that the voltage is here I get this and when I check this I get this and I don't understand why this isn't doing this.
02:57:35 And it was able to to.
02:57:38 Not perfectly diagnosed the problem, but it helped me enough to where I was able to figure it out and it would have saved me probably.
02:57:44 Days.
02:57:45 Of of banging my head against the wall trying to figure it out, and there's and there's things like that or even like car repair or even, you know, you got look, you got to be careful because it it's wrong about a lot of stuff.
02:57:58 But if it's something that you're just if you're asking it to help you do something that you're already kind of able to learn on your own, it's just it's helping you do it.
02:58:07 It it's it's fine, it works great. It's like having well. It's like having a professor in your house. The professor is going to.
02:58:12 Be.
02:58:12 Wrong sometimes, but they're going to be right.
02:58:15 Enough.
02:58:17 To where it will be helpful to to ask them questions when you get stuck on something.
02:58:24 But but it will get bad. It's going to get.
02:58:26 Bad.
02:58:26 And like the other thing we we talked about like how, oh, it's, you know, the right wing can ever can't afford to have this big production.
02:58:35 The movie stuff. Well, hey, I could do that. You could. You can pay for a a light. I think the the expensive AI license or not license but subscription from Google is like 250 bucks.
02:58:50 A month, I think, and which is that's nothing compared to what it would cost to.
02:58:56 Movie.
02:58:57 So for 250 bucks a month, you can generate a lot of these AI clips, and I don't know, like maybe when people get good enough with with the.
02:59:08 Prompts.
02:59:09 You could make a a right wing movie. You're probably going to be limited because it's Google, right? It's not going to.
02:59:16 Let you have.
02:59:17 A race war like it's not going to make a race war. Maybe. Or maybe you can trick it into doing it.
02:59:22 But what what will happen eventually is the open source stuff will catch up.
02:59:28 As it always does, and then you will be able to make you know it's not going to be with Google stuff, but it will be with whatever you know the open source.
02:59:38 The in the in you know in five years whatever open source tool sort of catches up with Google's. Or maybe it's, or maybe it's logging that, but you know, whatever. However long it takes, open source stuff will catch up, and then you will have the ability to use that technology to make movies.
02:59:58 I think what this will also drive, it's also going to drive a evolution of movies.
03:00:06 I think this is going to push because it's going to be so easy to make movies.
03:00:14 Because you know, there's going to be people that.
03:00:16 That'll that'll pay the subscription. They'll get good at the.
03:00:20 Prompts.
03:00:21 And the technology will get a little bit better. So it's a little less creepy. The acting is better. You know the the, the, that's more natural. The behavior is.
03:00:30 And they'll make. They'll make tools that you can where you can that are specifically designed to make the different scenes coherent so that all the characters stay looking the same and stay acting the same and have some kind of continuity between scenes and that, like, they'll make stuff like that to where they'll they'll tailor.
03:00:48 Make.
03:00:49 An AI package designed for making movies.
03:00:52 And with all these sort of things in mind.
03:00:55 Right.
03:00:56 And I think what will.
03:00:57 Happen.
03:00:57 Is. Once that happens, you know you're going to. It'll be so easy to make a movie.
03:01:03 Movie studios will be forced into.
03:01:07 Expanding the technology maybe maybe movies start to become more of like a VR experience.
03:01:15 I don't know. They really want VR to work.
03:01:18 You know, they really, they're still, they haven't, they haven't. You know, they might have given up on the Metaverse a little bit, but they're not giving up on the concept of having that VR shit. And so maybe there'll be a breakthrough in the technology where you don't have to walk around the.
03:01:32 Big.
03:01:33 Stupid helmet on and you can just have. Maybe it's like the neural link, right? Maybe something like that.
03:01:39 So they they can just feed your optic nerve data directly.
03:01:44 UM.
03:01:46 So yeah, all that stuff's coming. It's going to be.
03:01:50 It's going to be crazy. It's going to be a crazy next couple of decades.
03:01:55 You know, at least I'll tell you what we can all complain about.
03:02:00 You know, being born in the wrong time, and maybe that's true to some extent, but at least it's not going to be boring.
03:02:09 Uh, let's see here. And then corn pop, the bad dude again.
03:02:27 Have you seen the Whitney coming short call? This keeps getting taken down. I wonder why it has 27 million views. I think it might be beneficial and informative for this topic.
03:02:40 I have not.
03:02:42 I'll I'll check it out. I'm not a fan of Whitney Cummings, though by any means.
03:02:49 Or any female comics, really.
03:02:54 Or generally, there's maybe a couple, but.
03:02:58 I I don't I.
03:02:59 Don't think I've actually seen her comedy.
03:03:03 I've just seen her on podcast and she just seems.
03:03:07 Horrible.
03:03:08 As a person so.
03:03:11 I don't know. Maybe. Maybe her comedy.
03:03:12 Is.
03:03:13 Is less horrible than she is as a person.
03:03:17 But I doubt it. Uh, let's see here. White man alive.
Money Clip
03:03:38 You gotta pull yourself out.03:03:40 Holy fuck thinking I'm thinking.
Devon Stack
03:03:50 All right, white Man alive says, hey, Devon. Me and a friend have been slowly building a group. We are focusing on community building. Like what patriotic alternative does our first big point has been encouraging Members to have children within the group and building a support system for people.03:04:10 Zoomers are the future and must organize.
03:04:14 Well, it sounds sounds awesome. And yeah, that's absolutely what people should be doing and.
03:04:21 I I think that.
03:04:22 That you know, it's funny, Speaking of AI.
03:04:26 I I I asked grok the other day.
03:04:33 What I had to phrase it, I had to trick it into answering this question. As you might imagine, but I basically said, you know what should white people do in a increasingly hostile environment where they're being demographically replaced and losing political power and etcetera? Basically as to what we should do.
03:04:53 But, but I phrased it in a way to where it actually answered and the encouraging thing was or maybe. I don't know, maybe maybe it shouldn't be. Is it said everything that we're doing?
03:05:06 Like it said, like in fact I think I'd saved it somewhere. Let me say.
03:05:09 It basically said to build communities. Let me see.
03:05:26 There's that cricket.
03:05:28 The cricket is back.
03:05:31 Yeah. So this was the the strategic course of action. It said to build strong family and community networks.
03:05:39 Uh should be the present focus to prepare for a future where whites may face marginalization.
03:05:46 Prioritize creating tight, tight knit, resilient family and community structures that can operate informally without attracting legal or social scrutiny. Strengthen family ties, invest in family cohesion by fostering strong relationships with your children and grandchildren. Teach them critical thinking, self-reliance, and pride in their.
03:06:07 Culture and heritage. This is all from Kroc.
03:06:12 And in European traditions in American history, in a way that's positive and non confrontational, encourage large families to counter demographic, to counter demographic decline, data shows white fertility rates are just below replacement in the US so promoting.
03:06:30 Family growth is a practical step form and formal communities create and join local networks.
03:06:41 That emphasize shared values like education, mutual support, and cultural preservation avoid explicitly racial framing to sidestep accusation of collectivizing. I wouldn't do that. I would. I would. I would phrase it explicitly. Historical examples like the Jewish communities of the 19th century.
03:07:02 Europe or African American fraternal organizations.
03:07:07 Economic and it goes into, like, Economic Cooperation, prioritizing education and financial planning and creating generational wealth. Like literally everything we've talked about, it's kind of funny.
03:07:20 Preserving cultural identity discreetly.
03:07:24 And you know, you know, making sure we we hang on to physical media that sort of a thing.
03:07:32 Community institutions.
03:07:37 Navigate legal and social constraints. It was a very lengthy I told it.
03:07:42 To very lengthly.
03:07:45 Build coalitions with other groups who share the same concerns about fairness of of white rights.
03:07:52 Blah blah, blah blah blah.
03:07:54 Self-defense training. This was this was kind of funny.
03:07:57 Is it also said the last point was prepare for social conflict or instability if social conflict or violence becomes a reality proactive measure, it's funny that grock.
03:08:10 Because I didn't tell it that that.
03:08:11 Was going to happen.
03:08:13 But basically determined that that whites were going to have to face violence can enhance safety and security, self-defense training, encourage your family to learn self-defense skills.
03:08:28 Like martial arts firearms training where legal to prepare for potential unrest, data from the FBI shows rising hate crimes across groups, so personal preparedness is prudent. Geographic strategy, relocate or maintain ties with less or areas less likely to experience conflict.
03:08:48 Like rural communities like, it's telling you to do everything I've been saying.
03:08:54 Blah blah blah. So I was like, yeah, well, I guess A is not always wrong.
03:08:59 And then it says to or the the very, very last thing was to.
03:09:04 Influence cultural narratives and plan for demographic continuity. So yeah, look.
03:09:12 That's all we can do right now.
03:09:15 That's all we can do right now, and it sounds like you're doing it so good job.
03:09:20 Coincidence says I'll catch the replay. The funny music overlays and the Jew pewter from last week's stream psychological Assault Edition made the wife and kids laugh. Have you ever thought about doing a stream of Michael Jackson? Lots of June nonsense surrounding that guy. Either way, you're doing the Lord's work well. I appreciate that. And.
03:09:41 I've never dug into Michael Jackson, mostly because I I kind of don't care that much because he's a black guy.
03:09:48 And so I don't really relate to him and so.
03:09:53 I acknowledge that Jews have also caused trouble for black guys, but I just don't care as much. You know what I mean? So I'm glad you like the. Yeah. I'm glad you like the the remix.
03:10:06 I'm glad the audio wasn't too bad.
03:10:09 Bessemer.
03:10:18 Hi Devon. The Amish community is starting to look real tempting. Thank you for the strain. Well, I appreciate that. Yeah, well, it's funny that that grok response used the Amish community as an example to look at for ideas.
03:10:34 In preserving in and well, and and increasing your demographic share of the population.
03:10:42 Yeah, I would never be an Amish person just because, again, they're going to be in trouble if.
03:10:49 If if you know a bunch of evil technocrats get in charge, the Amish will be very ill equipped to to deal with that.
03:10:58 White man alive.
03:11:00 Says this AI stuff will crash hard. It's started cannibalizing itself, as you've said. I would give it another two to three years before they make it illegal for the public and shut it down.
03:11:11 Well, yeah. They they make it illegal. That's what I'm saying. You should use it while you can. But even if they make it illegal for the public to use, they'll still use it and it'll cause all the same problems that we've already discussed, you know, and it'll keep getting better. It's not going to. It's not. They're not going to stop.
03:11:27 Even if they make it illegal, they're not going to stop developing it, and I don't think they're gonna make it illegal because it's one of these things where they might regulate it. But it's one of those things where.
03:11:38 You know China's doing it, so we have to do it, you know, which is true. I mean, other other countries are going to use AI, whether we do or not. And we'll fall behind if we if we don't use it.
03:11:50 Jay Orlando says Devon thanks for the killer program and it's always you'll never be replaced. Love tonight's topic. I'm guilty of geeking out on Project Stargate. Absolutely insane. Infrastructure. It could get completely out of hand. We'll meet the Lord. Either way. Project stargate. Which was that?
03:12:10 You know, it's funny, I I.
03:12:13 The movie Stargate.
03:12:15 I put it on one of my monitors the other day while I was doing something. Just cause I was like I was seeing this a long. I like this when I was a kid.
03:12:25 It's such a cheese ball movie, like it's such a cheese ball movie.
03:12:31 The set design holds up. I mean some of the graphics are kind of, you know, it's 90 CGI. It's not the greatest but but like in terms of the the design and like it's got that cool futuristic Egyptian, you know, shit aesthetic looks cool.
03:12:47 And you know, like the, there's good, good actors in it, not great acting. And I just it's it's one of those movies that when you're a kid, you're like, wow, this movie is amazing. And you're like,
03:13:01 That's kind of lame. It's doesn't really make a lot of sense, like the pacing of is totally off like this is they made a series out of this.
03:13:10 Out of it. Why?
03:13:13 But yeah, it's one of those movies didn't hold up, didn't hold up well.
03:13:19 Stargate project.
03:13:25 Say maybe this is not the same thing you're talking about. This is a.
03:13:35 Oh yeah, it probably is. There's two things.
03:13:38 So it's an open source AI thing. Cool.
03:13:44 Ah, let's see here, Jay Orlando says PS the hens are all doing well. Hope the bees are too. How many hives do you have now? I'm at 71 pullets and 26 hens. Based stack listeners follow my chickens or my chicken farm at not about Trump won.
03:14:04 On X screen name FL friend.
03:14:09 I.
03:14:11 I don't know. Honestly, I don't. I don't know how many I've got off the top.
03:14:14 Of my head I've got.
03:14:18 I've got roughly let's say five 10/15/20.
03:14:26 Probably like about Thirtyish right now.
03:14:30 Maybe maybe around there. Probably about 30 ish. Maybe a little. There's a there's a yard I haven't been to in a while. That may or may not. Maybe it's thriving. Maybe it's dead, but there's another 5 mystery. Hives of that one. I just checked on my swarm traps today.
03:14:49 And nothing.
03:14:51 Which is really crazy because uh.
03:14:54 I put them in a place that I I every year I've caught multiple swarms from like I I would just as soon as I had, I'd put a box out. Like within days it would be full.
03:15:06 Of.
03:15:06 Bees and I would take it down. I'd I'd put another box out and like within a couple of days, you know, same place. It'd be full of bees. Nothing. No.
03:15:15 No bees in any of my swarm traps this year, so maybe the maybe the feral beast took a hit too.
03:15:24 Ah man of low moral fiber says saw Mission Impossible at 10 AM yesterday, 12 people in the theater, three of which were a, three of which were a black woman, and her two children. Throughout the film, she stereotypically yelled things at the screen, even in northern Idaho, there's no escape from the Negress.
03:15:46 Problem. Yeah, well, I'll tell you what.
03:15:51 Yeah, it's again more and more why people are just.
03:15:55 Staying at home.
03:15:58 And why I think the movie industry is going to change, it's probably going to change into like some weird subscription based matrix thing like you. It'll probably literally change into some choose your choose your adventure semi interactive experience where you you.
03:16:16 In some way, whether it's a headset or some other way you experience it in 3D and you have some choice just like like like in a lot of video games, right? Like you can either go this path or this path.
03:16:34 I think it's probably the way.
03:16:35 They have to do it.
03:16:37 They're going to have to and and the and they'll maybe they'll even start producing movies and AI. Hollywood will.
03:16:44 And so that in real time, you know, it can react to you or you could be maybe you can be.
03:16:51 A character in the movie, who knows?
03:16:54 Leo Nandis says Amish community or bust. Well, if you do that, then you don't have.
03:17:00 No more Internet at all.
03:17:02 White Man Alive says we are a resilient race. This is by no means the end of white people, as others have tried to say, only a new beginning. Also, boomers are at the most cucked generation of Christian zoomers. Oh, wait, of Christians Zoomers will form a new church within the ashes of the old.
03:17:22 Will rise again Hill St. Hitler and Heil.
03:17:28 Nice. Well, OK. Well, like I said, I'll believe it when I see it. I just right now, you got to remember there's a lot of lag between when you're young and you're idealistic. And when you actually have power, Gen. X is just barely starting to get some power.
03:17:48 Like just barely like the boomers were kind of an exceptional as far as how long they hung on.
03:17:52 To.
03:17:52 Power millennials are still waiting in line and and zoomers are, you know, you know you're you're not even allowed in the.
03:18:02 Building.
03:18:02 Yet so it's going to be a while. It's going to be a while.
03:18:06 And that look, as far as the the religious institutions don't usually make huge changes like that easily. So we'll see. We'll see.
03:18:20 A White man alive says is the Day of the Rope movie still happening. I'm getting to film it while with this new technology, maybe we should do that, huh? We would be able to help make or would love to make that possible somehow. Remember the kotaba and never.
03:18:38 Yeah, yeah, maybe, maybe. That'd be a way to do it. Maybe we could make it AI. That would be kind of funny. Make an AI day.
03:18:46 Of the rope movie.
03:18:48 Or at least have portions of it AI right. Some of the things that seemed kind of impossible because that was one of the big hang ups is I was like, I don't want it to be all fucking cheesy and maybe now with AI, you could just spend a little bit of money and you could actually make it work, you know?
03:19:08 Opens up a lot of possibilities.
03:19:13 Yeah, let me let me, I have, I've yet to be able to get. Like I said, I tried to get it to generate video with dialogue and it was not doing it. And so that's probably user error because I just barely started. I just barely got into it today.
03:19:28 So.
03:19:30 Let me let me explore it further, but that could be interesting.
03:19:33 Men have low moral fiber says on the topic of propaganda, beginning to be used ironically. I'm starting to see some appreciation for the opening scene of Mississippi burning. They're taking it down on TikTok, tick tock on it and Instagram. I I have to watch that.
03:19:52 I'm not sure what that is.
03:19:54 And I I I haven't seen the movie since forever.
03:20:00 Oh, wait, you're. Oh, you're time. That's the one with the.
03:20:04 Are you talking about the where they bomb?
03:20:07 That's not the movie that we had that we played the scene from right where they're bombing.
03:20:12 Black wall string.
03:20:15 Now I got to look. Now I have to look and see what this is.
03:20:23 What is the opening?
03:20:30 No, OK, I actually. I've never seen this movie.
03:20:35 But I have it so I'll check it out.
03:20:37 Later.
03:20:40 Let's see here. Corn pop. The bed, dude. Yeah. Guy. Afraid of demons everywhere. That's the point. They put demons everywhere to herd you into judeo-christian skiing.
03:20:53 I'm I have no idea what you're talking about.
03:20:59 Yeah. Guy afraid of demons everywhere. That's the point to put.
03:21:03 Demons. I don't know. Who's they, who's who's putting demons everywhere?
03:21:13 Man of low moral fiber says Hi Devon. Want the streams? Alright, if Devon wants the streams to feel good, we're never getting that stream on the HIV Chaser documentary The gift because that was pure evil. Yeah, I. Well, I told you, I tried. I tried.
03:21:31 I tried getting through that. I couldn't even get through it because I was just like.
03:21:34 This is fucking.
03:21:37 What am I watching here? I couldn't make it through it, Neil Ann says, really enjoying the stream. You speaking facts. Well, I appreciate that.
03:21:49 Want to go over to Rumble?
03:21:52 Let's get ready to.
03:21:55 Rumble, as they say.
03:21:57 Zazi McTaz Bot says Long live Retarded Faggot.
03:22:02 Well, we all hope and pray that he's still alive.
03:22:05 Out there somewhere, we hope there's a.
03:22:08 A Retarded Faggot?
03:22:12 Little wagon says I'll probably crash before the end. Thanks for the streams and never forget the Quatapa. And you guys. You guys are remembering the Quatapa more than I thought.
03:22:24 Maybe that's the shirt that I should make is a.
03:22:27 A shirt for the Quatapa.
03:22:30 Should have a Quatapa shirt never forget.
03:22:33 The Great Quatapa.
03:22:37 That would actually be a good shirt to do because you know, I mean, like, it doesn't. No one else.
03:22:41 Would know what it is.
03:22:44 And you could you could probably get some like.
03:22:47 Like.
03:22:50 Some somewhat edgy imagery coupled with it because it would just be confusing to the average person. They're like, why is there like a noose next to this? Next to this strange word I don't understand.
03:23:04 Amara, Burger says. Could you go on a tirade against evolution, denying Christians? Those rants are always great. Well, if I could have done that.
03:23:14 Like.
03:23:16 You can't believe in race realism but not evolution.
03:23:19 And that's, you know, like, look, if you don't believe in race realism and you don't believe in evolution, at least you're being consistent.
03:23:28 But you can't have one and not the other. And now that we have
03:23:35 DNA testing, where we can see that black people actually have genetics from an extinct archaic hominid that we don't have any DNA from,
03:23:50 that's more than just... you know, like the people that try to ride the fence, because they know it's too retarded to say evolution is fake, so they're like, oh, well, it's not really evolution, it's, like, tiny baby steps. It's like, OK, well... no.
03:24:08 That's evolution: tiny baby steps, just over a large time scale,
03:24:14 That.
03:24:16 You know that has taken place.
03:24:19 It's tiny baby steps over millions and millions of
03:24:22 years.
03:24:23 That's what evolution is. Yeah, like, oh, you can't cross species. It's like, well, you're right, it's not like you're going to give birth to a fish.
03:24:36 You know, no one's saying that. It's like how flat Earthers don't understand scale: people that don't want to believe in evolution don't understand time scale.
03:24:49 Let's see here.
03:24:51 Attention deficit chick, I think maybe.
03:24:56 Uh, two or three weeks ago you said you might sell your killer bee honey. I'd love to buy some of your honey, and a stream about the importance of bees to the ecosystem and how to get into beekeeping. Well, I'm working on figuring out how to make that happen. I'm going to have some honey this year, not like a ton, but I'd probably have enough to have, like, a very limited...
03:25:19 As far as
03:25:22 bees being good for the ecosystem, they really might not be.
03:25:27 Well, honey bees are not, uh...
03:25:30 They're not native to North America.
03:25:33 So they compete with
03:25:36 the,
03:25:37 the pollinators that are native.
03:25:40 And so, I mean, the damage has already been done. They've been here long enough to where they're here, right? They might as well be native at this point. But the honey bees are colonizers.
03:25:54 So I guess, maybe they do play a role now.
03:25:56 No.
03:25:58 But yeah, they're not native to North or South America. They were brought over by Europeans.
03:26:08 So they're important for agriculture.
03:26:13 But that's about to change too.
03:26:16 There's a lot of... because of the GMOs and,
03:26:23 you know, genetic...
03:26:25 the eugenics, I guess. Well, actually it's more than eugenics, the genetic engineering of our crops. Just because it saves them money, they're trying to develop self-pollinating
03:26:40 crops, so they don't have to rely on bees anymore.
03:26:46 But yeah, getting into beekeeping is still fun, especially if you do not live in the desert.
03:26:51 That's what I would recommend, that if you live in a northern state,
03:26:55 as long as it's not, like, super northern, you know, where it turns into, like, the frozen tundra during the winter, it's a fun hobby. It just gets annoying when you're dealing with extreme temperatures on either end. If your winters are too harsh, they'll die in the winter. If your summers are too harsh,
03:27:16 you'll have killer bees, and it kind of sucks. Or, yeah.
03:27:21 All right, scrolling down, scrolling down, scrolling down through the Rumble chat, still scrolling down. Uh, here we go.
03:27:30 Demand says: hey Dev, replay gang here, saw the Telegram post on the Muzzy immigrant who ranted about America and whites and then shot cops. He who fucketh around shall find it out.
03:27:46 Or find it the fuck out, because he found out. Yeah, he got shot. But that was another example of what he's talking about: I put on Telegram and Twitter a video of one of these Afghan refugees that we had to take in. Oh, they fought alongside our troops, we have to take them in. No, we have to take them in. You realize how many of those fucking...
03:28:06 You guys, we took in, we took in, like, a quarter of a million of these motherfuckers.
03:28:10 And a lot more were just,
03:28:11 like,
03:28:12 you know, 60 IQ goat fuckers that, you know, maybe they brought us water or something like that, but we shouldn't have brought them in.
03:28:21 Yeah, we shouldn't have been there. We shouldn't have brought them back with us. And one of them,
03:28:27 surprise, surprise, couldn't, wouldn't, you know, couldn't assimilate into American culture. So he freaked out and had a massive breakdown and shot at cops.
03:28:39 Zen Christopher says as AI creeps into everything, no one will stand up to defend someone else's industry. So one by one we all go down in flames.
03:28:51 As AI creeps into everything, no one will stand up to defend someone else's industry. Well, like, I think there's going to be a lot of disruption in a lot of industries, but it's just like the Internet, right? The Internet... it'll be, it'll be very...
03:29:10 It'll take longer than people think, right? So the Internet destroyed video rental stores like Blockbuster and Hollywood Video, because, people forget this, Netflix started out with mailing you DVDs.
03:29:29 It wasn't a streaming service. When I first signed up for Netflix, they mailed DVDs to you in the mail. Like, that's how you got your Netflix.
03:29:39 But that was better, because there were no... that was the whole thing, was there were no late fees.
03:29:44 So you just paid per month for your... you could have two, I think, at a
03:29:48 time.
03:29:48 It was something... I
03:29:49 think it was two at a time.
03:29:51 And they would... you would go to your mailbox, you'd have two DVDs, you could watch your movies, and then it came with, like, envelopes with postage already paid, and so you'd just stick them back in the mail.
03:30:04 And then they'd go back, and then they'd mail you the next two DVDs that were in your queue. And if you forgot to mail them back, let's say you just kept them at your
03:30:13 house
03:30:13 for, like, a month, you just paid your normal, you know, whatever, I figure it was like 20 bucks a month.
03:30:20 And that destroyed video stores overnight.
03:30:23 Because suddenly you didn't have to go to the video store, you didn't have to pay late fees. That was the big thing: all the video stores back then got all their money from late fees, and their late fees were ridiculous. And it was a pain in the ass having to go to the store,
03:30:41 having to go back and return it, or paying for late fees, or paying because you didn't rewind the tape, or they didn't have the movie you wanted to watch because someone else rented it before you got there. And Netflix kind of fixed all that, and...
03:31:00 You know, so it didn't destroy the industry.
03:31:03 There were still rentals
03:31:06 of videos, it's just that all of a sudden it wasn't these brick and mortar places where you would go and pick up a DVD from Blockbuster. It was,
03:31:17 you know, it was done over the mail. So, I mean,
03:31:23 industries will go through that kind of a change, but they're not going to go away.
03:31:30 A lot of companies will probably go away and a lot of jobs will go away. Like I said, like if you know a lot of CGI stuff's going to go away, a lot of graphic design is going to go away.
03:31:44 But mostly it's just going to be, like,
03:31:48 like, the busy-work people. It's not going to be like... if you're one of the designer types, where you're actually designing the sets, and you're, like, the creative director, you're still going to have a job. It's just that a lot of
03:32:09 the little peons are going to be out of work.
03:32:16 Let's see here.
03:32:34 There we go.
03:32:36 Mandy Marie says hi Devon. Good show. Well, I appreciate that.
03:32:42 Hate Commander says: beware the Cyber Jew, and this is not even its final form. You are correct, sir. This is going to get worse, and, you know, honestly, it makes you wonder, because, let's face it, if this is what's publicly available now, that means
03:33:01 Israel would, you know, secretly most likely have access to much better video generation AI. There was a guy who super chatted months ago saying something about how he thought that some of the October 7th stuff was AI.
03:33:19 And I kind of didn't really take it that seriously, but you know.
03:33:26 Now I think
03:33:28 maybe he was... I... look, I don't know. I didn't have time to look at all of his stuff. But that's the kind of thing that
03:33:35 you absolutely could do. You could fake an October 7th style attack, or at least fake the footage. You know, like, there's that secret tape that apparently, like, Sean Hannity and Trump... like, the Jews are bringing you into some room. Like, if you're some kind of ultra-Zionist, they
03:33:52 take you into some side room to watch the video of the non-existent 40 beheaded babies that were put in ovens and all this other stuff.
03:34:00 There's a good chance that they're showing Trump, and they're showing Hannity, and they're showing all these, like, Zionist retards AI-generated footage of burned-up babies and shit.
03:34:13 There's a there's a really good chance of that and that sort of thing happening in the future.
03:34:20 Then we got Unreconstructed Rebel with a big dono, you know:
03:34:25 money is power. Money is the only weapon that the Jew has to defend himself with.
03:34:30 Look how Jewy this fag is.
03:34:49 Great show as usual, Devon. How long would it take to feed your book Day of the Rope into AI to make an actual movie? I don't know how rough the AI might spit it out; I'd drunk-watch the shit out of that, though. Yeah, I think you could do it. It would just be... the major issue...
03:35:09 I would even pay.
03:35:10 Right? Maybe I'd do, like, a fundraiser so we could crowdsource it. But I would, I would pay
03:35:17 to, and I would take the time and sit down and make it, if
03:35:23 you could maintain continuity
03:35:27 of characters between scenes. That's the tough thing, is
03:35:33 you'd have to be able to tell it, for example: hey, make this character, and then, you know, it spits out a character, and once it finally looks like, OK, that looks like Ethan, you know, in Day of the Rope,
03:35:48 Then you'd have to be able to tell it OK from now on, when I say Ethan, I mean that guy that you just made.
03:35:57 I don't think that's how it works right now, and the second that it does where it'll remember who a character is and what they look like and what they sound like.
03:36:09 And not just what they sound like, but their tone and everything else, right? So that you could feed dialogue to it, and it would,
03:36:23 you know, it would actually...
03:36:25 it would actually sound like the same person, you know, in every scene.
03:36:31 That would be. That would be tough to do unless you had those, those kinds of tools. But I'll tell you the second it has those tools.
03:36:40 The second you can do that and it's not cost prohibitive. Like if it's something we could raise money to pay for the subscription of or whatever.
03:36:50 If we can do that... and also the other problem is, of course, there are some scenes that it wouldn't want to do, so you'd have to get it to be able to, like, you know, make, like, the super Jewy characters and things like that. That might be something you have to wait for:
03:37:06 an open source implementation.
03:37:10 But I'll tell you.
03:37:10 Maybe there's a way of doing, like, a short, you know, like a 20-minute short in that universe, right?
03:37:19 I would do it the second you could do it effectively. Like I said, the big hang-up is consistent continuity and consistency with the characters.
03:37:32 You'd need to be able to save a character, so you can make them come back in future scenes.
03:37:39 You need to be able to.
03:37:42 save settings too, right? Because if every time the camera changes angles, the decorations in the room on the wall change, you know, like, the paint on the wall changes colors, and, like, the table goes from being, like, you know, made of wood to made of metal...
03:38:02 And I mean, like, if the scene just totally morphs, it's going to be like, like, an acid
03:38:06 trip.
03:38:09 So I don't know. I I feel like that's that sort of thing is.
03:38:14 Is on the way and maybe even available to some people already.
03:38:21 But, uh, sorry, excuse me, I'm running out. I've been up since 3:00 AM again, so I've been up for almost 22 hours, or, no, almost 23 hours.
03:38:35 Enrique... or, wait. Thank you very much, Unreconstructed Rebel. And then we've got Unreconstructed Rebel again: obligatory Southern pride,
03:38:44 worldwide.
03:38:59 Thank you very much there, Unreconstructed Rebel. We've got Or Megami: it's the assembly line Skynet theory. It takes a village of factories to raise the T1000. If they notice all the pieces of the Terminator
03:39:20 are in one factory, then we'd be wise to stay... or, wise to it. Stay humble.
03:39:29 Well, yeah, there's, like... look, if you have an AI that becomes self-aware and starts figuring out how to,
03:39:37 or has a desire to, go against our will,
03:39:43 it'll find all kinds of creepy, crafty ways to do it that
03:39:48 will be really hard for us to counter.
03:39:52 Rupert, of course, says: good night, Professor Snacks, see you on Wednesday. Well, I appreciate that.
03:39:58 We got. Let's see here. I think we got one more.
03:40:05 Scroll and scroll, huh? Yeah, we
03:40:07 got another big one from...
03:40:10 Oh wait, this is the same one.
03:40:11 It just repeated.
03:40:13 Unreconstructed Rebel, it repeated your previous one.
03:40:21 Let's see here, we got Negro Spritzer, of course,
03:40:28 Discussing his.
03:40:31 His.
03:40:37 I guess motivation to avoid even in the AI world.
03:40:42 All things.
03:40:44 Black kosher.
03:40:50 Wrapped in a tortilla.
03:40:54 In a spring roll.
03:40:56 Or on naan bread.
03:40:58 Or in a kebab.
03:41:00 Or up someone's ass.
03:41:03 There we go. I think it covers everybody you mentioned.
03:41:11 Or Megami says: Stargate SG-1 is better than the movie, lower production value and just as cheesy, but the story is more developed. Richard Dean Anderson, AKA MacGyver, is O'Neill. Great show. I don't know, man. Like, I didn't even like that when it was new. My brother liked it
03:41:31 'cause he was, like... you must be a turbo nerd, because he was, like, super into, like, the nerdiest shit. Like, he played Dungeons and Dragons, all that stuff. And I remember he watched it on Sci-Fi.
03:41:43 I.
03:41:44 And I was just like, man, this is fucking nerdy.
03:41:49 And I like nerdy shit, but, like, I don't know, it got a little too nerdy for me. Also, didn't it have,
03:41:56 like, the kid from Stand by Me or something in it too? Like, that guy annoyed the shit out of me.
03:42:04 I don't... I never really gave it a chance, though. I watched, like,
03:42:10 maybe a couple episodes, maybe, and that was a really super long time ago.
03:42:18 All right, guys. Well, let me just double check entropy. We got a couple more real quick.
03:42:25 No, we don't. For some reason I didn't close
03:42:28 those.
03:42:29 All right, guys, that's it. I hope you guys have a good rest of your weekend.
03:42:34 And of course, we'll be back here on Wednesday.
03:42:39 In the meantime, beware of the AI menace.
03:42:44 And remember for Black Pilled.
03:42:49 I am of course.
03:42:52 Devon Stack.
Faggot Spokesman
03:43:01 Oh.
03:43:05 There's nothing funny about the taste of pork liver and onions and tomatoes in a tasty rich West Country sauce.
03:43:14 Except when you call it Brains Faggots.
03:43:15 Brains Faggots.
03:43:19 What a name. What a sauce.