You’re Probably Going to Lose Your Job. Get Valuable Now.
AI isn’t coming “eventually.” It’s coming this year... and the biggest scam is that you’re training the model that replaces you every time you use it. We break down Nvidia’s blowout quarter and why the market still shrugged, then zoom out to the darker part: the circular financing and “investment” theater where Big Tech writes checks to OpenAI… and OpenAI hands it right back as infrastructure spend. We talk through the real endgame (jobs → AGI → UBI → asset owners win), and we close with the only sane answer: own assets, learn agentic AI, and build yourself into the part of the system that can’t be commoditized.
💥 Have you left your "honest ⭐️⭐️⭐️⭐️⭐️" review?
This episode is proudly brought to you by Fridays.
Because real wealth starts with your health. If you want to feel sharper, stronger, and more in control, visit joinfridays.com and use code HIGHER for an exclusive discount.
📩 NEWSLETTER: https://tr.ee/O6FWkv
👕 THS MERCH: http://www.thspod.com
🔗 Resources:
Nvidia just reported record Q4 revenue (Christopher M. Naghibi via X)
The Macroeconomic Consequences of AI (Moody’s Analytics)
Report: Jobs That Are Most And Least Impacted By AI (Forbes)
The next chapter of the Microsoft–OpenAI partnership (OpenAI)
The Investment Example (Christopher M. Naghibi via X)
What Sam Altman Doesn't Want You To Know (More Perfect Union)
⚠️ Disclaimer: Please note that the content shared on this show is solely for entertainment purposes and should not be considered legal or investment advice or attributed to any company. The views and opinions expressed are personal and not reflective of any entity. We do not guarantee the accuracy or completeness of the information provided, and listeners are urged to seek professional advice before making any legal or financial decisions. By listening to The Higher Standard podcast you agree to these terms, and the show, its hosts and employees are not liable for any consequences arising from your use of the content.
Transcript
All right, before we even start the show.
Speaker A:Rajeel, can you hear me?
Speaker B:Okay?
Speaker C:Yes.
Speaker C:Yes.
Speaker A:You wanna.
Speaker A:You wanna admit to us if you actually watched the last show or not.
Speaker D:I. I usually do, but I didn't watch it.
Speaker C:I didn't watch this one.
Speaker C:No, no.
Speaker D:You see me commenting?
Speaker D:No.
Speaker D:Monday, I watched it.
Speaker A:No, not the live shows.
Speaker A:The last podcast on Tuesday.
Speaker A:I know you watched the live sheets.
Speaker C:Yeah.
Speaker C:No, no, I gave you a special shout out.
Speaker D:You didn't.
Speaker A:We knew.
Speaker B:We knew.
Speaker C:It was like, oh, let's test him.
Speaker C:If we say this and he mentions
Speaker A:it, he's just trying to bait you in watching the show.
Speaker A:We didn't do that.
Speaker C:I'm rage baiting.
Speaker A:He's trying to get another.
Speaker A:Another view.
Speaker D:Oh, yeah, of course.
Speaker B:So many views.
Speaker C:Welcome back to the number one financial literacy podcast in the world.
Speaker C:This is the higher standard sitting in front of me in the fighter T shirt.
Speaker A:Yeah, that's right.
Speaker C:Like it? You can find it at thspod.com. It's Christopher Naghibi.
Speaker C:Cue the camp pop C4.
Speaker C:Old school.
Speaker A:He's I and I am him.
Speaker E:You are him.
Speaker C:Himothy.
Speaker A:Himothy sitting across from me, my man.
Speaker B:The only man in the world who
Speaker A:actually wears a quarter zip with basketball shorts and a T-shirt.
Speaker B:It's got to be the brand.
Speaker A:Say it, Omar, everybody.
Speaker C:Thank you, my man.
Speaker C:You can't prove that I'm wearing shorts when I sit behind the desk in the production suite, if you will.
Speaker C:Slim Rajeel.
Speaker C:Fine Fijian.
Speaker C:What's up, my guy?
Speaker D:What's up, everyone?
Speaker A:We lied to the audience.
Speaker A:Told them that you wouldn't be here this week because you weren't supposed to be here this week.
Speaker A:And then we would change the recording date.
Speaker A:Now you're here this week and we just.
Speaker C:We honestly didn't like the show without you, so we pushed the recording back a day just to have you on.
Speaker A:Yeah, that's why we did it.
Speaker C:That's why we did it.
Speaker D:Okay.
Speaker D:Surprise.
Speaker B:Hey, I'm out here.
Speaker C:It's actually my daughter's birthday today.
Speaker C:So special.
Speaker C:Happy birthday to you, Arya.
Speaker B:Happy birthday Ar.
Speaker C:We got a lot to get into today.
Speaker A:She's probably not listening to the rest of the show.
Speaker A:It's going to be.
Speaker C:Yeah, she stopped right there.
Speaker A:Yeah, right there.
Speaker A:Just cut her off.
Speaker C:We're going to get into a little thing called manipulation at some point in the show.
Speaker A:Yeah.
Speaker C:But first we're going to start off with why you need to hurry up and get valuable now because you probably will lose your job.
Speaker A:Yeah, I'm gonna apologize in advance.
Speaker A:This one's gonna be expletive-filled rage.
Speaker A:I'm angry and it's gonna come out.
Speaker A:There's no avoiding it at this point in time.
Speaker A:So it just is what it is, right?
Speaker A:And if you find that this topic is grim, I apologize.
Speaker A:I'm probably the only one saying the quiet part out loud to all of you.
Speaker A:And again, before we get started with any of that.
Speaker A:Look, this show is lovingly brought to you by the team over at Fridays.
Speaker A:You want to support the show?
Speaker A:That's a great way to do it.
Speaker A:You can go to www.joinfridays.com and enter code HIGHER.
Speaker A:You will get a hundred dollars off your first order.
Speaker A:They sell the GLP-1s.
Speaker A:You meet with a doctor.
Speaker A:Very, very quick and painless process.
Speaker A:You can actually do it all electronically without actually physically talking to anybody if you'd like to.
Speaker A:You can also get NAD+ and a bunch of other longevity drugs.
Speaker A:I am a big fan of both personally. Go to joinfridays.com and use code HIGHER.
Speaker D:And also they provide free nutrition services and coaching.
Speaker A:And the skinny guy will tell you,
Speaker C:yes, he knows, because it works.
Speaker C:Look, he is the proof in the pudding.
Speaker D:I am him.
Speaker A:That is a wide-angle lens
Speaker B:making
Speaker A:you feel good about you, one lens at a time, right?
Speaker A:So look, I've been staying up late at night every night.
Speaker A:I only sleep a couple hours compared to most people.
Speaker A:And I can validate this because I
Speaker E:get the text messages you do.
Speaker A:It's dark, it's grim, it's ugly.
Speaker A:And there are some things we need to talk about that are important as it relates to AI, because there's a narrative going around that I think is obscuring what's really happening.
Speaker A:All of this is not as painless as I think people would like you to believe, because it's sensational, it's new, it sounds cool.
Speaker A:But the fact of the matter is, AI is in fact coming for your job.
Speaker A:Not tomorrow, but certainly this year.
Speaker A:Not three years from now, not five years from now, not the what if.
Speaker A:Because why would you want to stigmatize a population by telling them their job's gonna be taken out near term?
Speaker A:You don't do that, right?
Speaker A:You try to get them used to the idea of this adoption and adopting the technology.
Speaker A:You're adopting technology that will take your job.
Speaker A:It is learning how to do your job.
Speaker A:Now, for every single one of you using a large language model, an LLM, to help you do your job: you are in fact teaching your successor how to do your job every single damn day.
Speaker A:Right.
Speaker A:And you might think this is making you more efficient.
Speaker A:It is.
Speaker A:Right up until it makes you so unnecessary in the process that you are no longer an efficient part of that process.
Speaker C:Yeah, it's the equivalent of training your replacement before you go on family medical leave.
Speaker C:When you go on, let's say, maternity leave, right,
Speaker C:then you say, I'm not going to teach this person everything, because I don't want them to replace me by the time I come back. Except this time they will.
Speaker A:It's not good.
Speaker A:So in this game, the shell game is a very, very bad one.
Speaker A:And we're going to actually spend some time and I didn't put this in the show notes on purpose.
Speaker A:I'm going to put it in there now.
Speaker A:Guys.
Speaker A:You guys have it.
Speaker A:I wanted to get your live reaction to some things in a video that I saw. It's about OpenAI's Sam.
Speaker A:Yeah, I always want to call him Sam Bankman-Fried.
Speaker A:That's a Freudian slip.
Speaker A:Because some days I feel that way.
Speaker C:Yeah, right.
Speaker A:Some days I feel that way.
Speaker A:So I'm gonna put in the show notes here in a minute.
Speaker A:Don't worry about it now.
Speaker A:It's not gonna pop up anytime soon.
Speaker A:It's not something we're talking about first.
Speaker A:First, I want to start the show off getting into a little bit of a backstory on Nvidia and the semis and what we're seeing here in the market, because something very unusual happened yesterday.
Speaker A:We're recording this Thursday, February 26th.
Speaker A:Okay, so let's go right off the top.
Speaker A:This is actually on my X if you looked at it.
Speaker A:This is Nvidia.
Speaker A:They just reported their Q4 revenue.
Speaker A:We knew that was coming out yesterday during the live show.
Speaker A:$68.1 billion, up 73% year over year, led by data center AI chips and beating expectations.
Speaker A:We all know the capex story here.
Speaker A:So if you listen to the show, you follow us on social media, you know this.
Speaker A:I'm not going to repeat what you already know.
Speaker A:EPS, earnings per share, was $1.62, and Q1 is guided to about $78 billion.
Speaker A:They've already had 13 successive quarters of revenue beats.
Speaker A:They are making money hand over fist.
Speaker A:Okay, this raises a real question however.
Speaker A:How much of this growth is tied to sustained capex on infrastructure versus near-term hyperscaler build-outs?
Speaker A:Right.
Speaker A:Meaning if you build all this stuff, right, and it's Field of Dreams and
Speaker B:they come, it's great.
Speaker A:But if you build this stuff and nobody shows up, and you're talking to yourself on a livestream all by yourself in the middle of the day at 11am, it could be not so great.
Speaker A:Right?
Speaker A:Yeah.
Speaker B:So, yeah, cloud and enterprise spending is
Speaker A:huge now, but is it durable?
Speaker A:And that's going to remain a question.
Speaker A:And what happens if the broader compute demand slows or dare I say gets more efficient?
Speaker A:So for example, right now, technology is what it is.
Speaker A:But if we get more efficient processing power, more efficient energy supplies, what happens to
Speaker A:all this?
Speaker A:Well, bad things happen, right.
Speaker A:Especially to their profits.
Speaker A:Well, the market seemed to kind of believe that.
Speaker A:And today, Rajeel, go to CNBC, type in Nvidia, NVDA, and look at their stock price.
Speaker A:They actually fell off today, the day after their 13th consecutive quarter earnings beat.
Speaker A:The stock market's going, meh.
Speaker C:They took a hit.
Speaker A:Yeah, they took a hit.
Speaker A:That is a strange, strange-ass reaction.
Speaker A:Right.
Speaker A:And I don't make any, you know, excuses for it.
Speaker A:I just know that it's strangely unusual, and the behavior is very, very uncomfortable for me.
Speaker A:I'm gonna add the stuff in the show notes now for you to look at later on, Rajeel, but it's in the show notes, and we'll naturally talk about it.
Speaker A:So don't worry about it.
Speaker A:Do you have a chart?
Speaker C:Well, it's.
Speaker C:And I'll just add too.
Speaker C:I mean, some of this, this is.
Speaker C:Yeah, no, you just want to go to Nvidia. Just NVDA.
Speaker A:Yeah.
Speaker C:But some of this also has to do with.
Speaker C:And we've talked about this on the show before.
Speaker C:Right.
Speaker C:Is all the circular financing that's involved around.
Speaker C:There you go.
Speaker C:All the AI companies.
Speaker A:Right.
Speaker A:And all the money. That's down 5.46%.
Speaker A:This is the day after their earnings and revenue beat: $184.89 per share.
Speaker A:So again, down $10.67.
Speaker A:This is after a phenomenal earnings quarter.
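As a quick sanity check, a sketch using only the figures quoted above (not official market data): a $10.67 drop to $184.89 per share does work out to the 5.46% decline mentioned.

```python
# Back out the implied prior close and percent move from the quoted figures.
close = 184.89   # quoted closing price per share
drop = 10.67     # quoted dollar decline on the day
prior_close = close + drop              # implied prior close
pct_change = drop / prior_close * 100   # percent decline from prior close
print(round(prior_close, 2), round(pct_change, 2))  # 195.56 5.46
```

So the two numbers the hosts quote (the dollar move and the percentage move) are consistent with each other.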
Speaker C:Right.
Speaker A:The market doesn't believe it.
Speaker C:The market is, yeah.
Speaker C:Starting to declare shenanigans.
Speaker A:Right.
Speaker C:You got Nvidia that, I mean, not too long ago came out and said, we're going to invest $100 billion into OpenAI.
Speaker C:Right.
Speaker C:And they'll probably be profitable sometime years from now.
Speaker C:That's the goal.
Speaker C:We're not profitable now; we'll break even by then.
Speaker C:Right.
Speaker C:And you're like, man, that's a long ways out.
Speaker C:And relying on a lot of companies to also do well in the process. And you have to think: if one of these companies doesn't do well, what happens to the entire system?
Speaker A:So.
Speaker A:And we're going to get into that in detail on this show and I'm also going to talk about how this impacts our jobs.
Speaker A:Okay.
Speaker A:Because this is a very scary process and thought process for me.
Speaker A:Pull up this chart here of the quarterly.
Speaker A:Just click on that image.
Speaker A:There you go.
Speaker A:Yeah.
Speaker A:The Financial Times.
Speaker A:So this is a chart from Financial times showing you Nvidia's earnings.
Speaker A:You can see ChatGPT's release really kicked off these 13 consecutive quarters of just insane growth.
Speaker A:Okay.
Speaker A:Insane growth like this is never sustainable in perpetuity.
Speaker A:Right?
Speaker A:Right.
Speaker A:The only place you have to go right now is downside risk.
Speaker A:The stock and the market is pricing that in because they're going quarter after quarter.
Speaker A:They beat, beat, beat, beat, beat.
Speaker A:At some point in time it ain't going to be that way.
Speaker A:Yeah, right.
Speaker A:So go ahead and close that, Rajeel, and I want to get a little more broad in the sense of why this is such a serious conversation.
Speaker A:Look, AI is a real thing, okay.
Speaker A:I don't know that anybody can look at it and say this is not going to increase efficiency and be good for humanity on some level.
Speaker A:However.
Speaker A:Yeah, big asterisk here.
Speaker A:The Internet is a good thing.
Speaker A:And the Internet is still around today.
Speaker A:As a matter of fact, all this
Speaker B:AI runs off of the basic information
Speaker A:found on the Internet.
Speaker B:We had a dot com bubble burst, okay.
Speaker B:Because the money that was pumped into these markets and the valuations that were there were not real.
Speaker B:It was based on future earnings potential
Speaker A:that did not materialize.
Speaker B:Now yes, these guys are earning money.
Speaker A:AI is earning money and some of that has materialized.
Speaker A:But the expectations of earnings that we have from these people, those are never going to materialize.
Speaker A:I'm saying that now.
Speaker A:Never going to materialize.
Speaker B:And just be thematic through this episode.
Speaker A:You have a company like OpenAI, which originally started off as a nonprofit, mind
Speaker B:you, and has now somehow thrown that
Speaker A:by the wayside and has this for-profit model.
Speaker A:They make billions of dollars a year.
Speaker A:I think they're somewhere around the $20 billion range.
Speaker A:Ish.
Speaker A:Mark.
Speaker A:Yeah.
Speaker A:I think some people are saying 13; I think it's about 20 billion.
Speaker A:Right.
Speaker B:But they have over a trillion dollars in commitments and spending over the next couple of years.
Speaker A:Right.
Speaker B:That to me signals AI bubble problems in and of itself.
Speaker A:Right.
Speaker B:People say, well, AI is not going to crash.
Speaker A:I'm not saying AI is going to go away.
Speaker B:What I'm saying is Just like the dot com bubble burst, reality kicked in.
Speaker A:The Internet's still here.
Speaker C:Yeah.
Speaker B:Reality will kick in in this sector.
Speaker C:Good things will come from the infrastructure and everything that does get built out.
Speaker A:Right.
Speaker C:It was, with the dot-com bubble.
Speaker C:It was really the fiber optics, the fiber cables,
Speaker A:Right.
Speaker C:that were laid out that ultimately helped enable broadband Internet.
Speaker C:Right?
Speaker A:That's right.
Speaker C:But that wasn't their original purpose.
Speaker C:And that's not the reason why, leading up to the dot-com crash, everything expanded and inflated.
Speaker A:Right.
Speaker A:Yeah.
Speaker A:And this is capitalism at its core.
Speaker A:But let's be clear.
Speaker A:Pull up OpenAI's revenue chart.
Speaker A:Yeah, actually, the chart at the top right there.
Speaker A:That one.
Speaker A:Perfect.
Speaker A:Yeah.
Speaker A:I saw this earlier today.
Speaker A:So this is really interesting when you look at this, if you look at the purple.
Speaker A:So I'll try to explain this as best I can.
Speaker A:Click on the chart, see if you can make it bigger.
Speaker A:And I've seen this before earlier.
Speaker A:There is basically a bar chart at the very bottom.
Speaker A:It's okay, go back to the previous one.
Speaker A:I can explain it.
Speaker A:I saw it earlier.
Speaker A:The purple is what will be their income from consumer subscriptions.
Speaker A:Yeah.
Speaker A:Okay.
Speaker B:I don't want to be the asshole here, but if you're going to take our jobs using agentic AI and AGI.
Speaker A:Yeah.
Speaker B:How the fuck are we going to pay for OpenAI?
Speaker B:And what am I paying for exactly?
Speaker B:If it's not going to make me more efficient at work, why am I paying for it?
Speaker B:To what, write memos to my mom?
Speaker B:Yeah, let me see.
Speaker B:Like what are we doing here?
Speaker B:That's the bulk of their income up until about then.
Speaker A:Okay.
Speaker B:If they're not using it to be more efficient in their jobs and they're not using it for their second jobs, what the fuck are you using ChatGPT for?
Speaker A:Right?
Speaker B:I mean, you'll use it anecdotally, but you can certainly get away with a free model at that point in time.
Speaker B:Which, by the way, you can run Ollama and some of the local models locally on your computer.
Speaker B:You don't need ChatGPT for some
Speaker A:of the low-end stuff.
Speaker A:Right.
Speaker B:So question number one.
Speaker B:And then number two, they say API usage, which is really your agents and
Speaker A:your bots and people using stuff. Like on our show: our show uses an API into Claude.
Speaker A:That's the number two source of income.
Speaker A:And then number three is agents, the actual agents, you know, Claude agents.
Speaker A:And that, that's actually a good point to stall here.
Speaker A:We're going to get to a point as well.
Speaker A:We're all going to have AI agents.
Speaker A:We talked about that in a previous show.
Speaker A:We're all going to have an AI agent, an AI that knows us better than these general large language models.
Speaker A:Right.
Speaker A:And that we don't have to keep prompting that AI agent.
Speaker A:Okay, well, if they're saying 50% of
Speaker B:their income comes from a large language
Speaker A:model, but we know that agents are going to be a growing portion of
Speaker B:their income, at what point in time
Speaker A:do we go, okay, wait a minute, that doesn't make sense because Perplexity just yesterday rolled out something called Perplexity Computer, which is basically an agentic AI model in the cloud that learns you.
Speaker B:Learns all about Sayonomar.
Speaker A:Yeah.
Speaker B:In the cloud you don't have to buy a computer, any hardware for it.
Speaker A:It remembers everything that you tell it
Speaker B:and builds a database based on you and will give you feedback.
Speaker A:So you can say, hey, that one thing we did that one time? Do it again.
Speaker B:That's all you have to say.
Speaker A:And it will do it.
Speaker C:Yeah, yeah, yeah, yeah.
Speaker B:And the best part about Perplexity's model is it takes every available major LLM,
Speaker A:including Claude and OpenAI, and it'll use the one that's most pertinent for what job you want it to do.
Speaker A:Yeah.
Speaker A:On its own.
Speaker A:You don't got to prompt that.
Speaker C:Yeah.
Speaker C:If there was a world where you could, you know, build out your own agent, and companies were restricted to only using agents that their employees have built out, and the agent itself had its own, let's just say maybe not a tax ID number, but its own, like, identification number.
Speaker A:Right.
Speaker C:Then I could get behind this in, you know, to some degree.
Speaker C:Right.
Speaker C:But why would they do that?
Speaker C:Why wouldn't companies just take that and build it out themselves?
Speaker A:Well, exactly, and that's what we're doing is we talked about this in previous shows.
Speaker A:Number one, you get the data.
Speaker A:Number two, you build the model.
Speaker A:Except I'm a fucking idiot, okay?
Speaker A:I'm an idiot the whole time.
Speaker A:I'm like, okay, you get data in the form of, like, an Excel template, and you populate it, and you train the model on how to use the data.
Speaker B:No, I'm an idiot.
Speaker A:We're building the model.
Speaker A:We're the data.
Speaker C:Yeah, yeah, Right, right, right.
Speaker B:We're the data for the companies.
Speaker B:Guess what?
Speaker B:Once we're done giving them the
Speaker A:data, which is how we do our jobs.
Speaker A:Yes.
Speaker B:Then the model is replacing us.
Speaker A:Right.
Speaker B:That's the model.
Speaker B:It's never been about how do we
Speaker A:get more efficient to get the result we want.
Speaker A:It's been about how do we replace the worker.
Speaker B:And anybody who's telling you otherwise just ain't looking far enough down the road.
Speaker B:That's where this goes.
Speaker A:Yeah.
Speaker C:Well, that's the goal.
Speaker A:Right.
Speaker B:It's 100% the goal.
Speaker B:And even if it isn't the intentional goal, that's the only way the capex spend makes sense.
Speaker C:Yeah, right.
Speaker B:It's the only way.
Speaker B:Rajeel, let's go to the next article.
Speaker C:But then at some point, right, if a lot of your value right now, or at least last year, remember, we talked about this, I think, on the last episode.
Speaker A:Yeah.
Speaker C:Where if a company announces they're going to contribute, let's say, $100 million into capital expenditures, and, you know, the market investors see that and the analysts see that, they start valuing the stock higher.
Speaker C:Like, look, they're really leaning into the space.
Speaker C:Right.
Speaker C:But at some point you got to.
Speaker C:You have to start questioning.
Speaker A:But that's also bullshit too, because.
Speaker A:And we're going to talk about this
Speaker B:later, if you say you're leaning into
Speaker A:the space and you're going to spend $100 million.
Speaker A:Okay.
Speaker A:And you're, I don't know, telling, hey, we're going to make $100 million investment or $100 billion investment into OpenAI.
Speaker A:Yeah.
Speaker A:But then OpenAI takes that and goes.
Speaker A:Buys product for me.
Speaker A:Yeah.
Speaker B:You're playing hide the sausage.
Speaker B:You just paid yourself.
Speaker C:Yes, exactly.
Speaker A:Right.
Speaker C:And then.
Speaker C:There's so many middlemen, and they all range in size.
Speaker C:Right.
Speaker C:But I have it here, too.
Speaker C:So Oracle's a big player in all of this.
Speaker A:Huge player.
Speaker A:Huge player.
Speaker C:Right.
Speaker C:So you got to start wondering, like, okay, if one of these doesn't do well, how big was that player?
Speaker C:And if that player is big enough, is it going to hurt the entire system?
Speaker A:Right.
Speaker C:Their credit default swaps right now hit their highest in years.
Speaker C:That's bad.
Speaker C:Dude, we talked about this.
Speaker C:Their bonds are being traded like junk bonds.
Speaker A:Yeah, well, and then Meta got real smart.
Speaker A:They started doing JVs and putting this outside of their company through these joint ventures.
Speaker C:So what are JVs?
Speaker A:So they have a joint venture where they're basically financing this outside the company, as opposed to the company just going and getting direct financing.
Speaker A:Right.
Speaker A:So you're in a joint venture, vis-a-vis this company, with Meta, with the end result being profitable.
Speaker A:Right.
Speaker A:Instead of them just getting a loan and building it themselves.
Speaker C:Yeah.
Speaker C:So, like, OpenAI.
Speaker C:Right.
Speaker C:They'll.
Speaker C:They'll lean in and use Oracle to, like, rent their computing space.
Speaker C:Right.
Speaker A:And.
Speaker C:And use some of their data centers and their infrastructure instead of using all of their own, to kind of, like, spread the risk out a little bit.
Speaker C:But it's like, dude, what.
Speaker B:What if it.
Speaker B:What if it.
Speaker C:What if they decide to just close
Speaker B:that whole arm?
Speaker B:And it could very well happen.
Speaker A:So Moody's just came out with an analysis on this whole thing, the macroeconomic consequences of AI.
Speaker A:And my boy Mark Zandi wrote it.
Speaker A:Led the.
Speaker A:The writing of it.
Speaker A:I should say paraphrasing a whole lot.
Speaker A:And this article is several pages.
Speaker A:It's in the show notes on every major platform that we're on.
Speaker A:New Moody's report says AI will be one of the most consequential economic forces of the century.
Speaker A:Yeah, thank you, Captain Obvious.
Speaker A:Appreciate that.
Speaker A:It's meaningful that you said it.
Speaker A:Boosting productivity and GDP, but uncertain effects on jobs.
Speaker A:Okay.
Speaker B:And we're already seeing a disconnect between
Speaker A:GDP and the employment numbers.
Speaker A:Right.
Speaker B:Fewer people are getting hired, but GDP
Speaker A:is still going up.
Speaker B:What's going on?
Speaker A:Well, we're not seeing the impacts of AI.
Speaker A:Bullshit.
Speaker A:Yes, you are.
Speaker A:That's the impacts of AI already.
Speaker A:Can you imagine when we actually get up to speed in cadence?
Speaker C:Right?
Speaker A:All right, it says, but with uncertain effects on jobs.
Speaker A:And I said, and I'm quoting here, I believe those effects are already being seen.
Speaker A:You know why?
Speaker A:Moody's baseline scenario says AI adds significantly to growth over the next decade, with modest job disruption offset by new opportunities.
Speaker A:That's their baseline.
Speaker A:I don't believe that modest is the right vernacular, but okay, fine, whatever.
Speaker A:They got a political agenda here as well.
Speaker B:Other scenarios range from modest recessions if AI adoption stalls.
Speaker A:So option one, AI adoption stalls.
Speaker A:Or there's some type of, I don't know, cataclysmic financial situation at one of these companies that then has reverberating impacts.
Speaker A:Recession trigger right there.
Speaker A:Think of the dot-com bubble burst.
Speaker A:That's real.
Speaker A:That happens.
Speaker C:And there's.
Speaker C:And there's a reason why I feel like the headlines are as optimistic as they are, and they're all still promoting growth.
Speaker C:And the GDP numbers are. Well, the US can't afford a recession right now.
Speaker A:And the US is all in on this
Speaker C:because when we do, in fact, have one, whether it gets actually labeled a recession or not, right.
Speaker C:The thing is, with the headline risk of it being a recession, things just tend to go down even further, which ultimately means less tax revenue, which ultimately means debt ceiling has to go up even higher.
Speaker C:Right?
Speaker A:So there is no solution to this, by the way.
Speaker A:There is no solution to the debt.
Speaker A:So the debt is so big of a snowball rolling down such a steep hill at this point in time, you're not stopping it.
Speaker A:Yeah, okay.
Speaker A:Find another solution.
Speaker B:And everybody's like, oh, AI is going to solve this problem.
Speaker B:No, it's not.
Speaker C:It's a game.
Speaker C:It feels like a game of chicken.
Speaker A:It is a game of chicken.
Speaker C:It's right.
Speaker C:It feels like we need to hurry up and get this to a point where it's all become profitable because almost all of these AI projects are all operating on a loss right now.
Speaker C:And we need to hurry up and get to a point where it's profitable.
Speaker B:Let me give you the best possible scenario.
Speaker B:And it's fucked up too.
Speaker B:I'll give you the best possible scenario.
Speaker A:AI is successful, okay?
Speaker B:AI is successful.
Speaker B:We get AGI in the not too distant future. Artificial general intelligence: as intelligent as you, as me, as a human.
Speaker B:Now I got a bot that can do my job better than me, because I spent the last year, year and a half programming it on how to do my job, because I was too lazy to do my own goddamn job.
Speaker B:So I went out using an LLM, and the LLM learned how to do my job because I kept asking questions about how to do my own job, right?
Speaker B:So now I got AGI, which can do my job, and they don't pay it a salary, no healthcare, no HR problems.
Speaker B:So companies eventually shift to that, as begrudging as they may be.
Speaker B:So all the manual work's gone.
Speaker B:They keep one supervisor, two supervisors to oversee the results and just filter through things at a high level, because you need some human interaction for now.
Speaker A:Yeah.
Speaker B:And then you got a population of Americans who aren't working and you say, okay, well, guess what?
Speaker B:Elon Musk has been touting universal basic income.
Speaker B:Ubi.
Speaker B:Except here's a problem.
Speaker C:Andrew Yang.
Speaker B:Yeah, we've seen how capitalism works, Chief.
Speaker B:You know how this works?
Speaker B:The rich assholes stay rich by not giving you as much as they're getting.
Speaker C:I Feel like they're going to be fair this time, though.
Speaker B:No, they're not.
Speaker C:Come on.
Speaker B:That's not the way this plays out.
Speaker B:So you know what happens?
Speaker B:Everybody gets ubi.
Speaker B:And those who have assets, who have tangible wages and actual, like, earnings from companies they own, still make money.
Speaker B:Everybody else gets the same socialist pot of pie to fuck around with.
Speaker C:They're going to give us UBI and I hope they get UTIs.
Speaker B:That's the kind of humor we're at on this.
Speaker A:Sorry.
Speaker C:Sorry.
Speaker E:That's what we're.
Speaker B:Sorry.
Speaker B:Drink some cranberry juice, kids.
Speaker A:Good for you.
Speaker C:I mean, it's grim, man.
Speaker B:It's grim.
Speaker B:And look, I don't want to be a pessimist and I'm not trying to be like, you know, over the top and pushing my thoughts on people, but that's the best scenario.
Speaker B:Yeah, that's dark.
Speaker A:Yeah.
Speaker C:And in the worst case scenario, it flops.
Speaker C:What does that mean?
Speaker A:Right.
Speaker C:Listen, even if you're in this space, and let's just say you've been sitting on the sidelines, you're like, yeah, I get the Mag Seven and all that, but, like, I'm not.
Speaker C:I'm still new to investing.
Speaker C:I don't want to.
Speaker C:I don't want to get into this.
Speaker C:Look, for everyday, average Americans, your 401ks are going to get hit.
Speaker C:Your 401ks are all invested into these large caps, right?
Speaker C:Into the S&P 500.
Speaker C:Right.
Speaker C:It's like these companies are propping up everything.
Speaker C:If it goes down and say you didn't invest in it directly, indirectly, you're in it.
Speaker D:Right?
Speaker C:So everyone's going to feel it.
Speaker A:Yeah, well, look, I can.
Speaker A:I can throw out a ton of different ways you can make money using agentic AI.
Speaker B:And I might even.
Speaker A:I'll throw out.
Speaker A:You know what, if you listen to the end of the show, I'll throw out at least five solid ideas and ways to make money with agentic AI right now.
Speaker A:That'll make money.
Speaker A:Look at that.
Speaker C:Hopefully you stuck around this long to get that teaser.
Speaker A:22 minutes.
Speaker B:If you haven't got the teaser.
Speaker A:All right, let's get back to the
Speaker B:Moody's chart real quick.
Speaker C:That should be the intro for the show.
Speaker A:The AI-powered stock market is overvalued, bordering on frothy.
Speaker B:This is from the Moody's Analytics report.
Speaker A:There's many, many charts and graphs.
Speaker A:I pulled this one out.
Speaker A:Look at.
Speaker B:The only other spike since then to now is the Y2K bubble.
Speaker A:And this looks very much like that bubble.
Speaker B:It's not at the height of where it was once at.
Speaker A:We're at about 20 versus 25.
Speaker B:So we're about 20% off of where it was.
Speaker C:Right.
Speaker C:And if this gets corrected like the dot-com bubble, look, the dot-com crash erased about $5 trillion in value in U.S. equities.
Speaker B:And just to be clear, AI is not going anywhere.
Speaker B:I'm not saying that it will.
Speaker B:The Internet didn't go anywhere.
Speaker A:Right, Right.
Speaker B:No one's saying that it did, but everybody said, oh, you know, Internet's gonna take your jobs.
Speaker B:The Internet changed a lot of jobs.
Speaker B:It created a lot of jobs.
Speaker B:But the Internet's purpose was never to take people's jobs.
Speaker B:AI's purpose, though, is to take your jobs.
Speaker C:It was really to promote jobs.
Speaker A:Right?
Speaker C:Yeah, exactly.
Speaker B:So the only way this works is if AI takes jobs.
Speaker A:Yeah.
Speaker A:Okay.
Speaker B:There is no world, there is no outcome.
Speaker B:I don't care who's telling you this, okay?
Speaker B:Just trust me, bro.
Speaker B:No, yeah, okay.
Speaker B:The whole purpose, the only way this capex spend makes money, is if people lose jobs.
Speaker B:That's it.
Speaker B:End of statement.
Speaker C:And there have been a few cities now that are on record where communities came out and actually stopped data centers from being built before construction started.
Speaker C:I don't have the names off the top of my head.
Speaker C:Maybe Rajeel can look them up.
Speaker A:That's all right, let's keep going with this.
Speaker C:But that could potentially be like.
Speaker C:That in and of itself is a signal to the markets too, where people are getting hip to this.
Speaker C:And like, no, no, no, no, no, no, no.
Speaker C:We don't want none of that in our community.
Speaker A:Right, well, yeah, and now you've got the White House supposed to announce any day now that xAI and all the major AI players are going to get some type of power access deal going with the White House.
Speaker A:We're going to give them nuclear power.
Speaker A:And I'm sitting here going like, so
Speaker B:we're just going now like no one's like, you know, maybe we shouldn't use a fast charger here.
Speaker B:You know, I mean, let's keep the battery preserved.
Speaker B:You know, it's just, it's just nuts that that announcement is looming.
Speaker A:It's been all over the Twittersphere, or the X-sphere, I think is what you call it now.
Speaker A:Whatever.
Speaker B:So this is the conclusion here.
Speaker B:Overall, AI lifts productivity and output, but labor market effects and measurement challenges mean outcomes aren't guaranteed.
Speaker A:I believe it is guaranteed.
Speaker A:No offense, Mark.
Speaker A:You're still my zaddy, but that's the way it works.
Speaker C:Okay, Mr. Zandi, we.
Speaker A:We got history.
Speaker A:All right, let's go to the Forbes article.
Speaker A:Rajeel.
Speaker C:Yeah.
Speaker B:And this came out this morning, and
Speaker A:I read this literally at like 2am, and in between my usual thinking about Saeed naked, this is what I was on.
Speaker C:You're welcome.
Speaker A:Oh, that sucks.
Speaker A:Oh, it's okay.
Speaker A:Okay, well, you've got to have a membership to read Forbes.
Speaker A:Can you close this?
Speaker C:Dude, they want all the memberships.
Speaker E:That's whack.
Speaker A:Okay, well, screw you, Forbes.
Speaker A:We're not.
Speaker A:We're not subscribing.
Speaker A:All right, so I already have it here in the show notes because I'm that kind of guy.
Speaker A:I plagiarize everything.
Speaker C:Yeah.
Speaker A:All right.
Speaker A:Report: jobs that are most and least impacted by AI. Any guesses?
Speaker A:Boys, before we go on the list,
Speaker C:I mean, all the trade stuff, right?
Speaker C:So plumbers, electricians.
Speaker A:No, the most.
Speaker C:Least impacted.
Speaker A:Most and least impacted.
Speaker C:Okay, so least.
Speaker C:So least impacted.
Speaker A:I mean, trade stuff.
Speaker C:Yeah, Trade stuff.
Speaker A:Okay.
Speaker C:But like financial sector stuff.
Speaker C:Analysts work.
Speaker C:Yikes.
Speaker A:Yeah.
Speaker A:Call centers.
Speaker C:Call.
Speaker C:Yeah, that's a big one.
Speaker A:Yeah.
Speaker C:Call centers.
Speaker B:Well, that.
Speaker C:That one.
Speaker C:That one gives me a little more pause, because you've got to really get your whole customer base to adopt it and not get too frustrated.
Speaker A:No, no, no, dude, I know.
Speaker A:Okay, so little backstory on AI that maybe people don't understand.
Speaker A:Okay.
Speaker A:AI has been used by the military for over 10 years.
Speaker A:Okay.
Speaker C:Yeah, yeah.
Speaker A:They've been using it in the field because they want to make quick decisions in the battlefield without this human emotional pause element.
Speaker A:And a quicker decision can mean life or death.
Speaker A:Yeah.
Speaker A:They've already started the transition to making those decisions in the battlefield operation centers, where normally humans looking at the totality of the circumstances make decisions.
Speaker A:Now the humans are reviewing the AI as part of their decision-making process.
Speaker A:Right, Right.
Speaker A:AI said to do this.
Speaker A:Here's what we think.
Speaker A:Okay.
Speaker A:It's aligned.
Speaker A:Okay.
Speaker A:The logic's good, right?
Speaker B:So that's been in use for years.
Speaker A:You are fooling yourself if you think that AI isn't good enough right now to.
Speaker A:To sound almost human.
Speaker C:See, in theory.
Speaker C:In theory, that sounds great.
Speaker C:Like from a bird's eye view, high level.
Speaker C:Okay, we talked about this.
Speaker C:One of your former classmates or maybe professors at Yale.
Speaker A:Yeah.
Speaker C:Something brown.
Speaker A:Yeah.
Speaker A:Jim Brown.
Speaker C:Jim Brown, right.
Speaker C:That's who it was.
Speaker C:And he was online, but he was
Speaker B:warning about this like five years ago.
Speaker C:Five years, yeah.
Speaker C:Because he was hip to it probably.
Speaker A:Yeah.
Speaker A:He was talking about quantum computing and all of that back then.
Speaker A:Right.
Speaker C:And I remember we.
Speaker C:I was watching one of his interviews and it said, you know, at some point, eventually you have so much data.
Speaker A:Right.
Speaker C:That all the cars out there are all like.
Speaker C:This is just one example.
Speaker C:Are all operating off of this AI.
Speaker A:Right.
Speaker C:And you probably decrease the number of deaths in car accidents by over 95%.
Speaker C:So everyone's willing to adopt this.
Speaker C:What do you mean 95% less people die from car accidents?
Speaker C:Yeah.
Speaker C:Take.
Speaker A:Did you see Dara, the Uber CEO, on Diary of a CEO recently?
Speaker C:No.
Speaker A:He literally just said, like in 10 years, 95% of our workforce will be AI.
Speaker A:We won't have anybody driving cars.
Speaker C:Yeah, See, so then, and then you got to.
Speaker C:But then you got to think this from a high level.
Speaker C:Right?
Speaker C:That sounds great.
Speaker C:Okay.
Speaker C:But then you think, okay, it's going to come down to making a decision.
Speaker C:If I go this way, or God forbid, if ever were to happen, the.
Speaker C:The case that it does, if I go this way, two people die.
Speaker C:If I go this way, one person dies.
Speaker C:It calculates the obstacle, the path of least resistance.
Speaker A:Right.
Speaker C:I'm gonna go this way.
Speaker C:But what if it's one on one?
Speaker C:And then, and then, and then, and then Jim Brown.
Speaker C:And then Jim Brown says, Chris's genes
Speaker B:are better than Saeed's.
Speaker C:Genes are better than Saeed's.
Speaker C:Or how about this?
Speaker C:He paid more in taxes than Saeed.
Speaker C:Or how about this?
Speaker C:He's at the end of his lifespan.
Speaker C:This person has still a lot of life to live.
Speaker C:Yeah, no, like, look, who factors all this in?
Speaker C:And then I think what Mr. Brown said.
Speaker C:And then he's like, I think at that point you just got to decide.
Speaker C:And that's the part where I was
Speaker B:like, nope, I'm out.
Speaker B:No, thank you.
Speaker B:Chris's hair looks mighty thin.
Speaker B:Weak genetics.
Speaker C:No, but it's like, it's like usually when you hear stuff like that, it's like, no, no, no, no.
Speaker C:That's not like you're going to tell me that I decided.
Speaker C:But really it was programmed by something on the back end that I'm not hip to or aware of.
Speaker A:But let me, let me be just horribly cruel.
Speaker A:Okay?
Speaker A:These decisions are already happening on battlefields every single day.
Speaker C:Yeah, yeah, right.
Speaker A:In real wartime scenarios.
Speaker A:That's how this happens.
Speaker A:And that, that's, that's the cost of speed, of efficiency, of decision making.
Speaker A:You save millions of lives by making decisions quickly, even if it was the same one you would have made if you had more time.
Speaker A:And that sucks.
Speaker A:But that, that's the reality of it.
Speaker A:And that's unfortunately where AI goes.
Speaker A:And let's go through this, this report a little bit because I'll tell you how it affects you and me as opposed to the battlefield.
Speaker A:93% of jobs in the US can be done at least partially by AI.
Speaker A:93%.
Speaker C:That's not good.
Speaker B:That's not a small number.
Speaker C:That's not good.
Speaker A:According to a new study.
Speaker A:And companies could shift more than $4.5 trillion in labor costs to AI.
Speaker A:And where do you think that 4.5 is shifting from?
Speaker C:See, it feels like Congress is taking a little too long to take action
Speaker A:because they don't give a shit about you and me.
Speaker B:You want to know why our representatives, for the most part, I mean, AOC
Speaker A:and some younger representatives excluded here, are
Speaker B:so old? They don't give a shit about us.
Speaker B:They care about their main constituents of voting population, which also happen to be generally older demographics who are the same people.
Speaker B:The president doesn't want them to lose value in their homes.
Speaker B:I get that.
Speaker B:Fine.
Speaker B:There's.
Speaker A:Who voted for you.
Speaker A:All good.
Speaker B:But then how are you going to solve the affordability crisis?
Speaker B:Oh, we'll have lower interest rates.
Speaker A:Yes.
Speaker B:It's the easiest lever to move.
Speaker B:Except guess what, this does not solve the problem.
Speaker B:It drives values up over time.
Speaker B:And okay, if you want values to go up over time, you know how you solve the affordability crisis under that scenario where you have lower interest rates and higher home values?
Speaker B:You got to pay more money.
Speaker C:Yeah.
Speaker B:You just told me that you're not going to pay me more money, Chris, because it's going to take my job.
Speaker C:One.
Speaker C:Yeah.
Speaker C:One way to solve the home affordability crisis is to allow values to crash.
Speaker C:And guess who's not going to allow values to crash.
Speaker B:And again, with UBI, if we all have the same income,
Speaker B:How the hell are any of us going to buy houses unless we get the same manufactured prefab home for everybody in America?
Speaker B:Is that what we're talking about?
Speaker C:You heard about the new proposals that are out, right?
Speaker C:So, so they want, they want Trump homes.
Speaker A:Okay, I'm sorry, what now?
Speaker C:Trump homes.
Speaker A:Okay.
Speaker A:Is that, what does that mean?
Speaker C:So these are proposals that.
Speaker C:So he wants to build more homes and he's going out to the builders saying we need to build more homes.
Speaker A:Right.
Speaker C:And a million more homes to come online.
Speaker C:And the investors, whatever private equity companies come into this, will allow you to, you know, rent the home.
Speaker C:And for the first three years, those payments that you rented can go towards a down payment of the home.
Speaker C:Right.
Speaker C:So it's essentially a rent to own type model.
Speaker A:So now we're going to extort people into three years of guaranteed free cash flow for landlords.
Speaker B:I'm a landlord and I don't want someone staying with me because they need to.
Speaker A:This is not serious, is it really?
Speaker A:Yeah.
Speaker C:Oh, no, this is, this is that.
Speaker A:A distinct concept of the early...
Speaker B:How is that not a conflict?
Speaker C:Yeah, and they'll be.
Speaker C:Exactly.
Speaker C:And they want.
Speaker C:He's going to force Fannie and Freddie to get behind this.
Speaker E:Right.
Speaker C:So this is, this is, this, this is one of the solutions proposals that's on the table.
Speaker C:And you're like, I don't know, man,
Speaker B:I feel like, I feel like we're in a dystopian movie.
Speaker A:I feel like we're at the part where the bad guy ascends and won't ever lose power.
Speaker B:And I'm not like this is going to sound political.
Speaker A:It's not political.
Speaker A:It's the entire class of politicians, Republican and Democrat.
Speaker A:Right.
Speaker A:I feel like this elite class of special kind of asshole is taking over
Speaker B:and it's just getting to the point where I'm just, I'm, I'm at my wits end.
Speaker A:Yeah.
Speaker B:And the sad part is then you see someone like Jerome Powell, who I respect, saying we just haven't seen it in the job numbers.
Speaker B:He came out literally yesterday saying AI may have had a bigger impact on
Speaker A:jobs than we've seen in the data.
Speaker C:But no, Jerome, but unemployment goes down and they're saying the labor market is strong.
Speaker C:Meanwhile, just literally I'm pulling up CNBC as I'm like in the parking lot here.
Speaker C:eBay, 800 jobs, 6% of its workforce.
Speaker C:I'm like, all I'm seeing, all I'm seeing is people laying off.
Speaker C:And you're telling me that the job market is strong.
Speaker C:I'm not seeing any positivity.
Speaker C:Yeah, zero positivity.
Speaker A:In the world's defense, those headlines don't
Speaker B:sell, because, you know, you're not going to drown out all the bad stuff with, like, oh, but don't worry, Target's hiring.
Speaker C:Yeah, yeah, you know, you know, I
Speaker B:mean it's not going to happen.
Speaker A:Right.
Speaker B:So it's a grim look. And look, that's what sells. And here's how you know it sells.
Speaker B:Every single one of us, those listening to the show, everybody watching this stuff on the platforms, every single one of you has an inkling of fear and concern because of the uncertainty.
Speaker B:What happens next?
Speaker B:Humans like certainty.
Speaker B:We're tribal.
Speaker A:We like to know what's going to happen in the future.
Speaker B:You like the confidence of stability.
Speaker B:And what rocks that stability is uncharted technology which could in fact challenge you for your job.
Speaker B:We know it's 93%, according to Forbes, with $4.5 trillion in, quote, labor costs being shifted toward AI.
Speaker B:We're all afraid.
Speaker B:We just don't want to say the quiet part out loud.
Speaker B:And some people are like, oh, Chris, there's always jobs that are created.
Speaker B:Okay, fuck you, guy.
Speaker A:Right?
Speaker B:You and I both know that you're afraid.
Speaker B:And you want to know why?
Speaker B:This consumer sentiment study's out right now.
Speaker B:It says the wealthier earning population is more afraid than the not wealthy population.
Speaker B:You want to know why? People doing actual manual labor aren't training those models every single day.
Speaker B:And the people are stocking shelves, right?
Speaker B:They're not as worried about their jobs, Chief, because AI ain't going to be stocking shelves.
Speaker C:So I know a lot of people who are kind of hip to the space, that have been listening to us for some time now, because they've been in my DMs, but they're hopeful.
Speaker A:They're probably upset with my rage bait this episode.
Speaker C:No, they don't know which way to root for.
Speaker C:Right?
Speaker B:Did you say yeah.
Speaker A:What?
Speaker C:Yeah, you're upset at me, bro.
Speaker B:Don't make me.
Speaker C:I'm not gonna be able to sleep tonight.
Speaker A:Yeah.
Speaker C:The part that they're secretly rooting for, that they don't want to admit to, is: maybe I want all of it to fail, just to secure our jobs for as long as we can.
Speaker B:Yeah.
Speaker B:If you've been coding AI.
Speaker A:Yeah.
Speaker B:Yeah, imagine that, right?
Speaker B:You could work one day and AI tells you itself.
Speaker A:Hey, bro, you go home.
Speaker A:Yeah.
Speaker C:Your login doesn't work anymore.
Speaker C:You go, you go to scan in.
Speaker C:It just beeps.
Speaker C:Nope, you are not allowed.
Speaker C:Access denied.
Speaker A:Yeah, I'm gonna need the ID back.
Speaker A:Yeah.
Speaker C:You are the weakest link.
Speaker C:Goodbye.
Speaker D:So we all have to start thinking about a plan B. Yeah.
Speaker A:Oh, don't worry.
Speaker A:I get the plan B.
Speaker C:The plan B will be at the end of the show if you're still plugged in.
Speaker C:But it's like.
Speaker C:But here's the thing.
Speaker C:Just like the dot com Bubble.
Speaker A:Right.
Speaker C:All the infrastructure and all the technology behind all this is already being built out.
Speaker C:Too much money's into it.
Speaker C:So even if it, let's say, crashes
Speaker B:or it pops, it's not going anywhere.
Speaker C:It'll be here and then it'll just pick up after.
Speaker B:They'll be winners and losers.
Speaker B:Yeah, yeah.
Speaker B:Unfortunately, we're the losers.
Speaker A:Humanity.
Speaker B:Not you and me.
Speaker B:Just.
Speaker A:Just to be clear.
Speaker B:So let me keep going with this
Speaker A:because I got a lot of data and we're only halfway through the show and there's a whole lot here.
Speaker B:And trust me, you want to stay
Speaker A:through the end on this one because.
Speaker A:Yeah.
Speaker A:I'm going to hit you right in the ding ding with some stuff that you're going to talk to your friends about.
Speaker C:You don't have ding dings.
Speaker A:Yeah.
Speaker A:Even the non ding dings of you.
Speaker A:All right.
Speaker B:Researchers studied more than 18,000 tasks across
Speaker A:1,000 jobs to determine where AI could be applied.
Speaker B:The upshot: AI capability is growing fast.
Speaker A:That's the upshot.
Speaker A:It's cute.
Speaker B:And it could soon take over an even larger segment of the economy.
Speaker B:That's the positive news.
Speaker B:Okay.
Speaker B:The report is an update of a prior study, conducted by the same research and consulting firm, with over $20 billion in annual revenue, across the same number of jobs and tasks.
Speaker B:The upshot: AI-driven change is coming sooner than expected.
Speaker A:You think?
Speaker B:Software development is one of the most heavily impacted areas.
Speaker A:You saw that with the SaaS companies getting hit pretty hard in the stock market recently.
Speaker A:Although there's been a rebound today. LLMs have improved in their ability to code.
Speaker B:Others include business and financial operation jobs, management roles, office and administrative support roles, and legal analysts.
Speaker B:According to the report, the top six most impacted jobs by percentage of tasks that can be done by AI as of today.
Speaker B:Number one, financial managers.
Speaker B:84% impacted.
Speaker A:Claude just rolled out Financial wealth management.
Speaker A:Oh, wow.
Speaker A:Yeah.
Speaker B:Press control plus.
Speaker A:I don't think you can make that bigger, Jill.
Speaker A:Yeah, you know, don't worry about it.
Speaker A:We can, we can.
Speaker A:We'll use the sexy sounds of Chris's voice, too.
Speaker A:Give you the list.
Speaker A:Yeah.
Speaker A:There you just put title.
Speaker A:There you go.
Speaker C:Boom.
Speaker A:There you go.
Speaker A:Everyone's up.
Speaker B:All right, so number two, computer and mathematical roles.
Speaker A:67% impacted.
Speaker A:Makes sense.
Speaker A:Six, seven.
Speaker A:Yeah.
Speaker B:All right, number three, business and financial operations.
Speaker B:60 to 68% impacted.
Speaker B:Okay, that's a pretty broad category, Guy.
Speaker A:Business, business and financial operations.
Speaker C:Entire operations.
Speaker A:Jesus.
Speaker A:Right?
Speaker B:Office and administrative support.
Speaker A:60 to 68% impacted.
Speaker C:Yeah, I can see that.
Speaker A:Yeah, I can, too.
Speaker D:It's crazy.
Speaker B:Legal occupations, oh, 63% impacted.
Speaker A:Here's the one that I love the most.
Speaker A:You ready?
Speaker B:You ready?
Speaker B:All you big shots out there earning big paychecks, thinking that you're hot shit, you can't be replaced.
Speaker B:Management jobs, including C suite, 60% impacted.
Speaker C:Can you imagine?
Speaker B:You don't need executives running things anymore.
Speaker B:You know people are gonna do, right?
Speaker B:The whole point of being an executive, a senior leader in a corporation, is you're gonna lead, either, one, culturally, which, I'm gonna be honest, most of these assholes are so disconnected they can't do.
Speaker B:And number two is that you provide some level of seniority and tenure of time.
Speaker B:Cause you have experience.
Speaker B:I guarantee you that LLM you've been training, Chief.
Speaker B:Because you haven't been working full time.
Speaker B:You watch CNBC and golf all day long.
Speaker B:It can do your job better than you now, right?
Speaker C:Imagine you go to any corporate office in, in the C suite, corner office.
Speaker C:It's just a Mac Mini.
Speaker A:Yeah.
Speaker B:Little Mac Mini.
Speaker C:A little Mac Mini, Jillian.
Speaker C:It's like, that's my boss.
Speaker B:I guess that's your boss.
Speaker B:That's gonna happen.
Speaker B:You laugh.
Speaker B:Yeah, that's real.
Speaker B:Can you imagine it doing performance reviews?
Speaker B:You take lunches, minus one, you go home.
Speaker A:All right.
Speaker C:I'd be interested to see like what, what Claude thinks about in office versus work from home.
Speaker C:I would be interested to see what it has to say about that.
Speaker A:I've never gone on that path.
Speaker A:We'll do that on an episode.
Speaker A:Yeah.
Speaker A:All right.
Speaker B:So I am going to go into
Speaker A:the next chapter of my rant. And for full disclosure, I did give everybody, like, a full heads up.
Speaker A:This was going to be this kind of episode.
Speaker A:Right.
Speaker A:I mean, I wasn't hiding the ball here, was I?
Speaker D:No, no.
Speaker A:Yeah.
Speaker C:You did say be careful.
Speaker A:I did.
Speaker A:I did call Saeed before the show and say, hey bro, I apologize in advance because I'm pissed off.
Speaker A:All right.
Speaker A:So I went down a bit of a rabbit hole last night, which started with Microsoft's announcement of their investment into OpenAI.
Speaker A:They put out this press release, just brought up on the screen here.
Speaker A:OpenAI, Microsoft.
Speaker A:Look, it looks all sexy and branded
Speaker C:and more circular financing.
Speaker A:Oh my God.
Speaker B:Since the partnership began, they've shared a vision to advance artificial intelligence responsibly and make its benefits broadly accessible.
Speaker B:Really?
Speaker A:I thought your plan was to get rich.
Speaker A:Yeah.
Speaker B:Weird.
Speaker B:What began as an investment in a research organization has grown into one of the most successful partnerships in our industry.
Speaker A:What started as a non profit is now very profitable.
Speaker A:And we like that profit.
Speaker A:So we want more, more profit as
Speaker B:we enter the next phase of this partnership.
Speaker B:We signed a new definitive agreement that builds on our foundation, strengthens our partnership and sets the stage for long term success for both organizations.
Speaker A:Yeah, we're robbing Peter to pay Paul.
Speaker A:And he's Peter and I'm Paul.
Speaker A:Right, right.
Speaker A:All right.
Speaker B:First, Microsoft supports the OpenAI board moving
Speaker A:forward with formation of a public benefit corporation and recapitalization, i.e., giving them lots of money.
Speaker B:Following the recapitalization, Microsoft holds an investment
Speaker A:in OpenAI Group PBC, the public
Speaker B:benefit corporation valued at approximately $135 billion.
Speaker C:It's a lot of money.
Speaker A:Yeah.
Speaker B:Representing roughly 27% on an as-converted diluted basis, inclusive
Speaker A:of all owners, employees, investors and the OpenAI Foundation.
Speaker B:Excluding the impact of OpenAI's recent funding rounds, Microsoft held a 32.5% stake on
Speaker A:an as-converted basis of the OpenAI for-profit.
Speaker B:Okay, so I'm going to take all this, which sounded really complex and just really cerebral.
Speaker B:It made you go, oh, they want to make the world a better place for you.
Speaker A:Okay.
Speaker C:Yeah.
Speaker A:So I'm going to, I'm going to say this to you colloquially.
Speaker A:Okay.
Speaker A:Yeah.
Speaker A:Gangsters like 50 Cent, we're here, right?
Speaker C:Take me behind the curtain.
Speaker A:Yeah.
Speaker A:What if I told you a giant company announced a multi billion dollar investment in a startup?
Speaker C:Oh, okay.
Speaker C:They must really believe in this thing.
Speaker A:There you go, that's a good one.
Speaker B:And the startup immediately turned around and
Speaker A:used that money to buy the giant company's own product.
Speaker A:Wait a minute, wait a minute.
Speaker A:Yeah.
Speaker B:So the investor gets to say, look
Speaker A:at us funding the future, i.e.
Speaker B:What Microsoft has just said.
Speaker A:Right.
Speaker B:The startup goes, gets to say, look at us at our explosive revenue growth,
Speaker A:which is what OpenAI is saying.
Speaker A:Right, Right.
Speaker B:And the cash, well, it basically leaves through the same door it came in.
Speaker A:Microsoft says, I'm giving you OpenAI a bunch of money.
Speaker B:And OpenAI says, hey, I'm giving you a bunch of money.
Speaker A:There you go.
Speaker A:That would be money laundering in other circumstances.
Speaker A:But fine, whatever.
Speaker A:This is accounting.
Speaker C:I mean we've all seen this happen with Enron.
Speaker B:So this sounds completely absurd whenever I break it down colloquially, but that's exactly
Speaker A:what's going on with the investment in OpenAI right now.
Speaker A:Yeah.
Speaker B:Big tech firms announce eye popping multibillion dollar investments.
Speaker B:OpenAI then uses a huge portion of that capital to buy cloud compute infrastructure and services generally from the exact same company writing the check.
Speaker A:Okay.
Speaker B:It's not venture capital in the traditional sense, because venture capital puts money in
Speaker A:where they have a risk of loss
Speaker B:with upside potential. This is prepaid infrastructure spend with a valuation headline attached.
Speaker B:Basically, they can say, hey, look, we're making more money, so you should value
Speaker A:our company as more.
Speaker A:Right.
Speaker B:That's how OpenAI is currently achieving their valuations.
Speaker A:Right.
Speaker B:Microsoft donates over a hundred billion dollars.
Speaker B:They say they're making more than $100 billion, but they're just passing it back to Microsoft.
Speaker B:And then guess what?
Speaker B:They get to say, look, give us a multiple of our revenue, which in this case is $100 billion higher because of what Microsoft did, even though we're giving it right back to Microsoft.
Speaker B:So you should value us, OpenAI, at a multiple of that $100 billion to make us worth trillions of dollars.
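[Editor's note: the round trip Chris is describing can be sketched with toy numbers. Every figure below is hypothetical for illustration, not an actual deal term.]

```python
# Toy model of the circular-financing round trip described above.
# All numbers are hypothetical illustrations, not actual deal terms.

investment = 100.0       # $100B check from the big tech investor
spent_back_ratio = 0.9   # share of that check spent on the investor's own cloud/chips
revenue_multiple = 10.0  # multiple the market applies to the headline revenue figure

# Most of the check flows straight back as the investor's own sales.
investor_sales = investment * spent_back_ratio

# Meanwhile the startup pitches a valuation off the headline number the
# investment created, not off money earned from outside customers.
implied_valuation = investment * revenue_multiple

print(f"Back to investor: ${investor_sales:.0f}B of its ${investment:.0f}B")
print(f"Implied startup valuation: ${implied_valuation:.0f}B")
```

The same toy math would apply to the Nvidia pledge discussed below, with chips in place of cloud compute.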
Speaker C:Right.
Speaker C:And it doesn't just, it doesn't just stop here with them.
Speaker C:Right.
Speaker C:Nvidia says they're going to invest $100 billion into OpenAI.
Speaker C:And OpenAI uses Nvidia's chips.
Speaker A:That's right.
Speaker B:Ding, ding, ding, ding.
Speaker C:Look.
Speaker C:So it's happening here, it's happening there, and it's all flowing.
Speaker A:Oh, we got a chart for that.
Speaker A:Don't worry, we'll get to that in a minute here.
Speaker B:Now, let's add it all to scale
Speaker A:and make the numbers make sense.
Speaker A:Okay.
Speaker B:OpenAI's annualized revenue surpassed $20 billion.
Speaker B:That in and of itself was a good number.
Speaker B:In February, they cut their committed future spend to $600 billion.
Speaker A:That was down from 1.3 trillion in the prior announcements.
Speaker A:Right.
Speaker A:That figure itself was whacked down.
Speaker A:CEO Sam Altman was like, well, maybe that was a bit of an overreach.
Speaker A:Yeah.
Speaker B:So he cut it by more than half, from 1.3, 1.4 trillion down to 600.
Speaker A:$600 billion.
Speaker B:This is still a company with tens of billions in revenue and hundreds of billions of dollars in future spend.
Speaker C:That's hundreds of billions.
Speaker A:Yeah.
Speaker B:That's like me committing to spending tens
Speaker A:of millions of dollars because I know I'm going to make that much in the future.
Speaker B:Yeah, I got work that much.
Speaker C:I got it.
Speaker B:I'm like that.
Speaker B:I'm the new king, right?
Speaker B:I'm like that.
Speaker E:Give it to me.
Speaker B:Yeah, look.
Speaker B:Does this face look like a face that would lie to you?
Speaker A:Right.
Speaker B:Trust me, bro.
Speaker C:I'm good.
Speaker C:Hey, I'm good for it, bro.
Speaker B:You know this.
Speaker C:Yeah.
Speaker B:I'm one song away from 50 Cent.
Speaker B:That's it.
Speaker B:Just one.
Speaker A:One.
Speaker B:I go viral one time.
Speaker C:Give me, just give me a Dr. Dre sample.
Speaker B:This is financial engineering with very, very,
Speaker A:very good PR, but a gamble masked as a venture-capital-type structure.
Speaker B:It's like the biggest sports bettor in
Speaker A:Vegas telling you what he is betting on and then betting on you betting on him because he told you what he was betting on.
Speaker C:Gangster.
Speaker C:That was a good reference, right?
Speaker A:That was all me too.
Speaker A:That wasn't AI.
Speaker A:Yeah, that was good.
Speaker A:That's all, that's all Chris's.
Speaker A:Because everything goes back to a Vegas sports reference now.
Speaker B:Okay, now in the show notes is this chart.
Speaker A:Make it as big as you can.
Speaker A:Rajeel.
Speaker B:This is prior to the Microsoft deal announcement.
Speaker B: This is the inner workings, and I'm going to give you a little bit of a backstory from a credit and finance perspective that not all of you listening to the show are going to know.
Speaker C:Okay?
Speaker A:Okay.
Speaker B:Concentration bad.
Speaker A:Okay.
Speaker B:Single event risk bad.
Speaker B: Anytime you make any kind of credit decision, whether you're in venture capital, hedge funds, wherever you are, even if you're buying stock for yourself, you don't put all of your money into one company in your 401k.
Speaker B:You spread it amongst the world so you have less single event risk.
Speaker B:If one company goes down, in theory, it shouldn't tank your entire investment.
Speaker A:Right.
Speaker C: If I'm underwriting a company and they sell a product, and they say, look, I made $2 million last year selling all these products, and these companies have to keep coming back and buying more products.
Speaker C:Right.
Speaker C:They, they need me.
Speaker C:Yeah.
Speaker C:And then I take a look at all the companies that are coming and buying your product and one company makes up 90% of all that.
Speaker C:It's like that's a huge concentration.
Speaker C:What if you lose that company?
Speaker A:Yeah.
Speaker B: So basically this is showing Nvidia at the center, and around Nvidia are two big-ass circles: OpenAI and Oracle.
Speaker B:Now OpenAI is clearly the biggest offender between Nvidia and OpenAI.
Speaker B: Everybody is pimping everybody some money and getting money back in this architecture.
Speaker B: And if you're driving and can't see the chart, trust me, it's bad.
Speaker B: It's incestuous as shit.
Speaker B:It's not good.
Speaker A:Okay.
Speaker B: If this were dating relationships, you wouldn't sleep with any of these people.
Speaker A:Maybe Oracle.
Speaker B:Maybe Oracle.
Speaker A:Why?
Speaker B:Because Oracle's only got a couple ins and one out.
Speaker C:Like it's just, you know, and the owner's the owner.
Speaker A:Yeah.
Speaker A:Yeah.
Speaker B:So I mean it is what it is.
Speaker B: But you got AMD, Microsoft, Nebius, CoreWeave, Intel, Nvidia, xAI, Figure, Mistral, Nscale.
Speaker A: I mean it goes on and on and on.
Speaker B:Isn't.
Speaker A:I think.
Speaker C: And the US, they're considered a large investor in Intel.
Speaker A:Yeah.
Speaker C: They have like a 9 or 10 percent ownership stake in the company.
Speaker A: And then you got Claude, which is Anthropic, which has government contracts that are now being moved to a higher risk category if Anthropic doesn't open up their code to the government.
Speaker A:I mean, it's a whole dog and pony.
Speaker C:By the way, recently how the CEO of Anthropic and OpenAI were on stage together and they refused to like hold.
Speaker B: Yeah, yeah, that was Sam Altman, who, by the way, I'm gonna be the guy who says it.
Speaker A:Okay.
Speaker A:Okay.
Speaker B: He used to come off as so likable, and now for some reason I feel like he's taken a pivot.
Speaker A:Okay.
Speaker A: Regil, this video that I gave you the link to, get that prepped up.
Speaker A:Remember?
Speaker C:Because he, he came out.
Speaker C:I remember.
Speaker C:I remember my first time like reading into him.
Speaker C: He's like, we built out the board of directors a certain way.
Speaker D:Right.
Speaker C: Where it's split down the middle, where half of the people want high regulation and the other half don't.
Speaker A:Right.
Speaker C:And that was, that was done intentionally.
Speaker A:All right.
Speaker A: I'm gonna give my bias out here before Regil hits play.
Speaker C:14 minutes long.
Speaker A: It's 14 minutes long and it's worth it.
Speaker A: If you're listening to the show, you're gonna want to stay for all 14 minutes.
Speaker A:This is impactful.
Speaker A:Okay.
Speaker A: We're gonna stop this video a couple times and talk, because there are entry and exit points in it.
Speaker A: This video tells you a whole lot.
Speaker A: I found it on Facebook; a friend sent it to me.
Speaker A: I'm sure it's available other places too.
Speaker A:We're going to play it.
Speaker A:The audio is more important here than anything else.
Speaker A: This is a history of Sam Altman.
Speaker A:Okay.
Speaker A:And.
Speaker A:Oh, okay.
Speaker A:Most people don't know this.
Speaker B: I have known a lot of this, and it's always bothered me.
Speaker A:Okay.
Speaker A:So full disclosure.
Speaker A:Regil, when you're ready.
Speaker D:Let's go.
Speaker D:Sorry.
Speaker A:No, no.
Speaker E:Who's pretty friendly to big tech.
Speaker E:And I'll explain in a second.
Speaker F: How can the company with 13 billion in revenues make 1.4 trillion of spend commitments? And you've heard the criticism, Sam.
Speaker F: First of all, we're doing well more revenue than that.
Speaker F:Second of all, Brad, if you want to sell your shares, I'll find you a buyer.
Speaker A:I just.
Speaker E:Enough.
Speaker E:So why does Altman seem so upset here?
Speaker E: After all, this interviewer is just pointing out a basic fact: that OpenAI has committed to spend over $1 trillion on AI infrastructure over the next eight years, despite only bringing in around $13 billion a year in recurring revenue, less than 1% of what they're promising to spend.
Speaker E:I'm no money genius and I'm personally terrible at budgeting, but that doesn't seem great.
Speaker E: That's all part of a promise being made by the industry, led by Sam Altman, that once a certain level of machine learning intelligence is reached, all of our problems will be solved.
Speaker F:The housing crisis, cancer, poverty, climate change, mental health, democracy, universal basic income, cure a bunch of diseases, this cancer and that one, and heart disease.
Speaker F:Helping you try to accomplish your goals and be your best, very high quality healthcare, important new scientific discoveries, the marginal cost of energy are going to trend rapidly towards zero.
Speaker F: A more equal world, universal extreme wealth for everybody.
Speaker E:In exchange for all that, Altman is asking all of society to put all of our eggs, our data, our economy, our water and resources, everything into one basket.
Speaker E: He's offering us one massive "just trust me, bro."
Speaker E:So shouldn't we trust Altman?
Speaker E:Should we accept his deal?
Speaker E:Is it even our choice?
Speaker E:Altman isn't a technologist or scientist.
Speaker E:He's an investor and dealmaker and really good at it, supposedly.
Speaker E:But his whole career is a series of just trust me, bro moments.
Speaker E:So let's examine the deal Altman is offering all of us.
Speaker E:Should we believe Sam Altman's promises?
Speaker E:And what's the cost to the rest of us if those promises turn out to be lies?
Speaker E:So let's go back and look closer at Altman's early days in the tech industry.
Speaker E: Altman's first big deal was selling his first company, Loopt, a service for locating your friends.
Speaker E:That's something that inherently needs lots of users to work or else you're just locating yourself.
Speaker A:The operative idea seems to be ubiquity.
Speaker A:I mean, get it out there in more ways than you can possibly imagine and make it to everybody.
Speaker E: But the whole time, Loopt refused to say how many users they had.
Speaker E:Altman just insisted there were, quote, way more users than any other similar service.
Speaker E: It turns out, though, that towards the end, Loopt only had 500 users.
Speaker E: When Reuters reported this, Altman insisted it was a hundred times more than that and that he'd provide evidence. He never did.
Speaker E:Just trust me, bro.
Speaker E: Loopt sold to the Green Dot Corporation, who shut it down immediately and never used any of the tech.
Speaker E: Green Dot investors allege it was a dirty deal done to enrich Sequoia Capital, a VC firm with a stake in Loopt, and two board members at Green Dot who helped approve the deal.
Speaker E:Altman left Green Dot as soon as he was legally able, walking away with millions for building an app that no longer existed in any form.
Speaker E:And luckily for Altman, someone saw something in him.
Speaker E:Peter Thiel.
Speaker E:Thiel, who once said that Altman should be treated as more of a messiah figure, gave Altman millions to start his own VC firm, Hydrazine Capital.
Speaker E:And that's not all the capital Altman controls.
Speaker E: He was also hired as president of Y Combinator, or YC, an influential venture capital firm and startup incubator where Loopt got its original funding.
Speaker F: I think the president of YC is sort of the unofficial leader of the startup movement.
Speaker E: And Altman personally traded on that influence.
Speaker E: The New Yorker reports that up to 75% of Hydrazine's capital was invested in YC companies.
Speaker E: Altman used his inside view to get a cut of YC's power, despite promising he didn't cross-invest in YC companies.
Speaker E:That's two big lies so far.
Speaker E: The user base of Loopt, which needed users to exist, and his investments.
Speaker E: Then Altman co-founded OpenAI.
Speaker F: He's sort of a semi company, semi nonprofit, doing AI safety research.
Speaker E: OpenAI was launched as the supposedly nonprofit OpenAI foundation, with a charter full of lofty goals: a primary fiduciary duty to humanity, and avoiding enabling uses of AI or AGI that harm humanity or unduly concentrate power, while acting to minimize conflicts of interest among employees and stakeholders.
Speaker E: The evidence that they do that? Just trust me, bro.
Speaker E:OpenAI's primary financial backers were tech billionaires and millionaires like Altman himself, Peter Thiel, Reid Hoffman and Elon Musk, and tech companies like Amazon Web Services and Infosys.
Speaker F:We wanted to build this with humanity's best interest at heart.
Speaker E:But in exchange, OpenAI is asking for a lot.
Speaker E:Putting all of society's eggs in one basket, if you will.
Speaker E:They want electricity, water, infrastructure, capital, your data, your writing, your art, and for humanity to adjust to job loss.
Speaker E:Deepfakes and everything else, all in exchange for some future promise of technology that fixes everything.
Speaker E:So can we trust him with all of this?
Speaker E:Let's look at some of his biggest statements and promises to show how they tie to all the eggs in the basket.
Speaker E:Altman insists he doesn't own any of OpenAI, and he barely takes his salary.
Speaker F:I paid enough for health insurance.
Speaker F:I have no equity in OpenAI.
Speaker F:I'm doing this because I love it.
Speaker E:But he doesn't hide that he's already rich.
Speaker E:Trying to do a rich guy using money for good Batman thing.
Speaker A:That Batman.
Speaker A:Such a wonderful person.
Speaker A:I don't deserve it, but we millionaires decided that you do.
Speaker E: But let's look at how this is part of his honesty problem, and it ties in to the eggs in the basket, because Altman is invested in all the stuff necessary to build OpenAI.
Speaker E:One of the eggs OpenAI needs is a ton of data.
Speaker E:You can't build a large language model without examples of language and content, and one source of that data is Reddit.
Speaker E: Altman owns a stake in Reddit and sat on its board. Reddit got its start in the same inaugural Y Combinator class as Loopt.
Speaker E: Reddit's story also includes co-founder Aaron Swartz, who died by suicide in 2013.
Speaker E: When Reddit later licensed its data to OpenAI, Reddit co-founder Alexis Ohanian felt in his bones the deal was wrong.
Speaker E:It's a less noble version of what Reddit co founder Aaron Swartz was targeted by law enforcement for.
Speaker E:Swartz wanted to open the knowledge up to everyone.
Speaker E:Altman wanted to put it in his product.
Speaker E: That never happened due to regulatory issues.
Speaker E:But just like Reddit's data going to OpenAI, a look at the areas Altman's wealth is invested in show a deep connection to other needs of the organization.
Speaker E:He's invested in AI, networking equipment companies, thermal battery companies, and even companies mining the rare earth metals that server farms require.
Speaker E:And once it's all built, Altman will profit off the problems AI creates.
Speaker E: We're going to focus on three: rising energy demands and costs, misuse like fraud and deepfakes, and job loss and economic collapse.
Speaker A:He makes money in those three.
Speaker C:Yeah.
Speaker E:Altman says again and again that OpenAI needs more power.
Speaker E: OpenAI has promised to build out gigawatts of capacity. That much compute would require as much electricity as 1.5 billion people, the equivalent of the entire population of India.
Speaker E:But Altman has a solution.
Speaker E: Of course, nuclear can be an extremely efficient and clean form of energy, but Thiel and Altman want to own it.
Speaker E:Altman is invested in Helion and Oklo.
Speaker E:Helion is working to build the first ever nuclear fusion power plant, a type of energy creation that many scientists say won't work.
Speaker E:And Oklo is building microreactors, literally truck sized nuclear reactors, which is a bit concerning considering this investment strategy.
Speaker F:Part of our model is make the cost of mistakes really low and then make a lot of mistakes.
Speaker E: But for now, Oklo hasn't figured out their reactors yet and is just using gas to keep up with the promises they made.
Speaker A:Nuclear startup Oklo and natural gas firm Liberty Energy today announcing a partnership to provide energy to large scale customers.
Speaker E:Altman has also invested in multiple companies offering protection against AI bad actors, identity verification to prevent deepfakes, and even companies offering insurance for losses due to AI scams and hacking.
Speaker E:That's like Batman not making any money off of crime fighting, but then selling Batmobile drove into my house insurance while also running the Uber for henchmen startup that the Riddler uses and selling the Joker white makeup.
Speaker E:One other big promise Altman makes is that when the AI he sees as inevitable makes many jobs obsolete, it'll create so much wealth that it can be shared with everyone.
Speaker E:Just like his smaller scale Reddit promise.
Speaker E: That turned out to be bullshit. Worldcoin is a technology company and cryptocurrency funded by all the usual suspects of techno-fascism.
Speaker E:Worldcoin's backers say it can be a way to give out some form of universal basic income when AI starts replacing jobs.
Speaker F: I think this idea that we have a global currency that is outside of the control of any government is a super logical and important step on the tech tree.
Speaker E: But it also sells itself as a solution to identity verification problems created by AI.
Speaker E:They want to use these orbs as a method of trusted identity check.
Speaker E:And you don't get your universal basic income until you scan your eyes into the orbs.
Speaker E: And like many of Altman's other projects, from Loopt to ChatGPT, it requires universal adoption to be of any business use.
Speaker E:A currency and identification system are pretty useless if other people don't use them.
Speaker E:So again, Altman is making an offer.
Speaker E:Give us your identity and we'll give you cryptocurrency.
Speaker E:It's a classic Altman deal.
Speaker E:I'll fix everything if you sign over everything.
Speaker E:Just trust me, bro.
Speaker E:It's almost like Altman wants to build a whole other economy just in case the one we have now falls apart.
Speaker E:Well, we'll get to that.
Speaker E: OpenAI then restructured around a for-profit organization. That for-profit organization has none of the same legal responsibilities as the nonprofit did, and brought in new investors like Microsoft, which invested $13 billion, which OpenAI largely spent on Microsoft products.
Speaker E:And it's not just Microsoft.
Speaker E:Nvidia has promised to invest $100 billion in OpenAI over the next few years.
Speaker E:Money that OpenAI will spend buying Nvidia chips.
Speaker E:OpenAI has similar circular deals with AMD, the Qatari government, and Larry Ellison's Oracle.
Speaker E:How about the 20 bucks you owe me?
Speaker B: Well, I only got 10, so here's 10; I owe you 10.
Speaker B:Hey Mo, you owe me 20.
Speaker E:Well, here's 10 I'll owe you 10.
Speaker A:You owe me 20.
Speaker A: Here's 10, I owe you 10.
Speaker B:Here's a 10 I owe you.
Speaker B:Here's a 10.
Speaker B:IOU.
Speaker E:Here's a 10 I owe YOU.
Speaker C:Good.
Speaker B:Now we're all even.
Speaker E:The entire economy is tied to the success of Altman's project.
Speaker F:We might screw it up.
Speaker F:Like this is the bet that we're making.
Speaker F:We're taking a risk along with that.
Speaker E:Who is the we taking the bet?
Speaker E: Here's OpenAI's CFO: banks, private equity, maybe even governmental.
Speaker B:The ways governments can come to bear.
Speaker B:Meaning like a federal subsidy or something.
Speaker B: Meaning, like, just, first of all, the backstop, the guarantee that allows the financing to happen.
Speaker E: Through all of that stammering, the CFO of OpenAI is making a clear point.
Speaker E:The government, your tax dollars, are responsible for saving the AI project.
Speaker E:That's more eggs in the basket.
Speaker E:And that basket is based on the promises of Sam Altman, who, as we've illustrated, lies and breaks promises a lot.
Speaker E:So if we really look at the basket.
Speaker E:Maybe we shouldn't have been putting all those eggs in there.
Speaker E:And it gets worse.
Speaker E: While we were editing this video, news broke that OpenAI is seeking a $750 billion valuation and is in talks with Amazon for a $10 billion investment.
Speaker E:That's money that OpenAI would spend on Amazon infrastructure.
Speaker E: So, I'm going to need more eggs.
Speaker A:So this all happened, of course, before Microsoft upped their investment from 10 billion to over 100 billion.
Speaker C:Yeah.
Speaker A:And that is how they got their valuation.
Speaker A:Yeah.
Speaker C:And it's, and that's the thing, right.
Speaker C: In order for anything like this to really gain steam, you need adoption.
Speaker A:Right.
Speaker C: Because look, our currency is a fiat currency, and the only reason it works is that everybody believes in it.
Speaker A:Right.
Speaker C: So they're gonna go to any length to make sure as many people as possible adopt it, to where there's leverage there.
Speaker C:Yeah, yeah, man, I see it, I see it.
Speaker C: Yeah, it's like when it all first got released, I knew it, because Tesla was great when it first got rolled out and everyone's driving the cars around, but really what it's doing is picking up all the data.
Speaker C: Your data is driving around. It's collecting data on where you're going, what you're doing, everything the cameras around it pick up.
Speaker C: It's really a collection of data.
Speaker C: Everyone sees it as a car company. No, it's a data company.
Speaker A: If it was just a car company, it wouldn't be stopping production of the Model X.
Speaker C: Yes.
Speaker A: And the Model S. Because it was never about the cars.
Speaker A:Right.
Speaker C:It was never about the cars.
Speaker C:Right.
Speaker A: So look, boys, we're in the overtime hours now.
Speaker A: Mark, I had a whole Jane Street conversation we were going to have here, and we'll probably save that for another show.
Speaker A: It's insane, the manipulation going on in the Bitcoin world.
Speaker A:We're going to hear more about that over time.
Speaker A:But I am legitimately afraid for the future in a way that I think is measurable.
Speaker A:You've got to own assets.
Speaker A:And I promised something in the middle of the show that I was going to tell people how to make money with AI and some things that I've seen.
Speaker A:And I'm going to fulfill that promise before we call it a wrap for today.
Speaker A: Look, you've got to own assets. You've got to own a business, not just a W2.
Speaker A:Wages are going to change.
Speaker A:Even if you are still employed, your job is going to change.
Speaker A:It's going to be more review of work and less doing work.
Speaker A: And if that's the case, then they're going to pay you more or less.
Speaker A: There are going to be more people doing your job or fewer people doing your job.
Speaker A:The math here just only works one way.
Speaker A:There's too much downside risk because.
Speaker A:Right.
Speaker C:There will be more people out there that are willing to do your job for less.
Speaker A:That's right.
Speaker A:And more people who need jobs.
Speaker A: Because UBI is not going to make people feel good about themselves or their lives.
Speaker C: So to be explicitly clear, right, like, what would UBI be like?
Speaker C:I mean, is it going to keep up with Social Security?
Speaker C:Like, what are we talking about?
Speaker A:I mean, I doubt that highly.
Speaker A: And Social Security in and of itself is not enough.
Speaker A: My mom's on Social Security, and I probably give her somewhere between $3,000 and $4,000 a month in spending.
Speaker A:Yeah, let's be clear.
Speaker A:You have to adopt AI and you have to understand it.
Speaker A: I'm not talking about just using a large language model.
Speaker A:I'm talking about understanding agentic AI and understanding what's going on and keeping up to date with it.
Speaker A: Because if not, you're basically going back to the dot-com era and saying, I'm not going to adopt the Internet.
Speaker A:That sounds absolutely asinine right now.
Speaker A:So I'm not telling everybody to avoid this technology.
Speaker A:I'm telling you to get sharp and get good with it.
Speaker A:Right now, number one, there are people that are using AI to help them make money directly.
Speaker A: Some people, for example, have found out that with access to real-time data feeds, like Coinbase, in some of these accounts, you get access to data faster than Polymarket does.
Speaker A: So they're using agentic AI to make small bets on where Bitcoin is going in a delayed market like Polymarket, where more up-to-date real-time feeds like Coinbase are telling them in seconds where it's going to be before Polymarket's feeds catch up.
Speaker A: So if you have an agentic AI, which can move fast, you can arbitrage that time, because your speed of execution is faster.
Speaker A: And people are literally filling up crypto wallets with just arbitrage bets like that.
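[Editor's note: the loop being described can be sketched in a few lines. Everything below is a hypothetical stand-in: the feed functions are stubs, not real Coinbase or Polymarket APIs, and a real bot would also need order placement, fees, and risk limits.]

```python
# Sketch of the latency-arbitrage idea: compare a fresh price feed to a
# lagging market, and bet in the direction the lagging market hasn't
# priced in yet. The two feed functions are stubs standing in for a real
# Coinbase ticker and a Polymarket-implied price.

def fast_btc_price() -> float:
    # Stand-in for a fresh spot price (e.g. a Coinbase websocket tick).
    return 64_900.0

def lagging_market_implied_price() -> float:
    # Stand-in for the stale price still implied by the slower market.
    return 64_000.0

def decide_bet(fresh: float, stale: float, threshold: float = 0.005):
    """Bet 'up' if the fresh feed has already moved above what the
    lagging market implies, 'down' for the reverse, else no bet."""
    edge = (fresh - stale) / stale
    if edge > threshold:
        return "up"
    if edge < -threshold:
        return "down"
    return None

signal = decide_bet(fast_btc_price(), lagging_market_implied_price())
print(signal)  # fresh feed is ~1.4% above the stale one, so this prints "up"
```

The whole edge is that the fast feed updates seconds before the lagging market does; once that lag closes, the edge is gone.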
Speaker C:Yeah, yeah, yeah.
Speaker A: The prediction markets are the wild wild west with AI, and people are building agentic AI bots that do nothing besides place micro bets and make the money.
Speaker A:If they lose, they lose.
Speaker A: But if they win, they win incremental bets.
Speaker A: Those incremental bets number in the hundreds, if not thousands, throughout a day or a week.
Speaker A:Right.
Speaker A:So that's number one.
Speaker A:And that's just people using the markets.
Speaker B:And this is not illegal.
Speaker B:Right.
Speaker A:This is just people using the data they have access to.
Speaker A:And now we all have access to real time data and APIs like we've never seen before.
Speaker A:Right.
Speaker A:And the world is not adjusted yet.
Speaker A:And I know there's been the rhetoric and the fodder, but everybody thinks everybody's using AI.
Speaker A: A lot of people are using large language models, you know, Claude or ChatGPT, but they're not using the real complex stuff.
Speaker C:Right, right.
Speaker C:Using it for all of its benefits.
Speaker A:Yeah.
Speaker A:Number two, you can use it to buy back time.
Speaker A:Okay.
Speaker A: If you use Perplexity's Comet or you get a Mac Mini, you can use agentic AI to facilitate your day.
Speaker B: And even if you're not using it in a work capacity, and I'm not advocating that you do, at some point we all will.
Speaker A:You're going to wind up having this basic personal assistant for your life, send an email to this person.
Speaker A: You can use an agentic AI, through either WhatsApp or even iMessage, and literally have something you can message at any point in time, like a personal assistant, to schedule things in your calendar and get things done.
Speaker A:Right.
Speaker A: We all know: hey, I meant to put that in the calendar, babe, but I didn't, because it requires going into your calendar, typing a title, picking a time, pausing your day.
Speaker A:Whereas we all send text messages.
Speaker A:Imagine sending a text message to your agent.
Speaker A:Hey, plan a date night with Joanna and see if you can get a reservation at this restaurant.
Speaker A:And it'll go and do that on its own.
Speaker A:Yeah.
Speaker A:That in and of itself will save you time and make you more efficient.
Speaker A:And the only thing I can give most people back is more time.
Speaker A:And that is one way to do it.
Speaker A:That's number two.
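[Editor's note: the text-to-calendar flow described above can be made concrete with a toy sketch. In practice an agent would hand the message to an LLM with a calendar tool; the tiny regex parser below is a hypothetical stand-in so the idea is runnable.]

```python
import re

# Turn a casual text like "dinner with Joanna on friday 7pm" into the
# kind of event dict an agent could hand to a calendar API. A real
# assistant would use an LLM for this parsing; regexes stand in here.

def parse_request(message: str) -> dict:
    time_match = re.search(r"(\d{1,2})\s*(am|pm)", message, re.I)
    day_match = re.search(
        r"\b(monday|tuesday|wednesday|thursday|friday|saturday|sunday)\b",
        message, re.I,
    )
    return {
        # Everything before " on " is treated as the event title.
        "title": message.split(" on ")[0].strip(),
        "day": day_match.group(1).capitalize() if day_match else None,
        "time": time_match.group(0) if time_match else None,
    }

event = parse_request("dinner with Joanna on friday 7pm")
print(event)
```

The point is the interface, not the parser: you text the agent, and structured scheduling happens without you opening the calendar.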
Speaker A:Okay.
Speaker A:Number three, you need to understand that your body of knowledge that you can soak in will never be as good as AI's body of knowledge.
Speaker A:But if you understand how to work with AI and use that to be more efficient in the things that you do on a daily basis, you will make more money.
Speaker A:Yeah, right.
Speaker A:And a great example of that is, let's say you're going to write an email to somebody and you're upset.
Speaker A: Run that email through AI, keeping all the proprietary information out, and ask it for feedback.
Speaker A: And yes, some would argue it's sycophantic, that it's just going to confirm what you already think.
Speaker A: So ask it for a neutral opinion on whether you should send it or not. Those pauses will help you keep emotional volatility in check, and that will make you more money over time.
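[Editor's note: a sketch of that "pause before you send" habit: wrap the draft in a prompt that explicitly asks for a neutral, critical read. `ask_model` below is a stub standing in for whichever LLM client you actually use, such as the OpenAI or Anthropic SDKs.]

```python
# Build a tone-check prompt around a drafted email. The prompt explicitly
# asks for a neutral read to counter the sycophancy problem mentioned
# above. ask_model() is a stub; swap in a real chat-completion call.

def build_review_prompt(draft: str) -> str:
    return (
        "Give me a neutral, critical read of the email below. "
        "Is the tone professional? Should I send it as-is, soften it, "
        "or not send it at all? Do not just agree with me.\n\n---\n"
        + draft
    )

def ask_model(prompt: str) -> str:
    # Stub response; a real implementation would call an LLM API here.
    return "Consider softening the second sentence before sending."

draft = "Per my last email, this is the third time I have had to ask."
print(ask_model(build_review_prompt(draft)))
```

The key design choice is in the prompt itself: asking for a neutral, critical read, and saying "do not just agree with me," is what turns the model from a yes-man into a pause button.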
Speaker A:Yeah.
Speaker A:Oh, yeah.
Speaker C:Okay, I see what you're saying.
Speaker A: Okay. And that's part of number two.
Speaker A: Number three.
Speaker A:If you learn how to build an agentic model.
Speaker A:Right.
Speaker A:And this is not rocket science.
Speaker A: All you need to do is access your terminal if you have a Mac, for example, or learn how to use Perplexity in a way that gets things done.
Speaker A: There's a company right now that sells an entire suite of skills for law firms.
Speaker A:Right.
Speaker A: That you can download into your agent.
Speaker C:I'm seeing a lot of this.
Speaker C: I'm seeing a lot of advertisements: you know, invest in this, we'll tell you, we'll go through all these different AIs.
Speaker C:I haven't tested any of them.
Speaker C:I'm a little skeptical.
Speaker B:That's right.
Speaker A:And you're going to be skeptical no matter what you read, because you don't know.
Speaker A:And you've heard about the data breaches and stuff like that.
Speaker A:Right.
Speaker A:So here's what you do.
Speaker A: You start selling a course. There's a lot of that out there.
Speaker A:Or you can take it up a notch for 2,500 bucks.
Speaker A:I will set up your installation for you based on what you want.
Speaker C:Think of it as a trade school.
Speaker A:Right, Exactly.
Speaker A: And you set it up for them.
Speaker B:I had a guy hit me up today.
Speaker A: Hey, Chris, can you help me set it up, or do you know anybody who can set me up?
Speaker A:I'm willing to pay top dollar.
Speaker A:I know how valuable this is to me from a time perspective.
Speaker A:And he wants it for a marketing firm.
Speaker A:Wow.
Speaker A: And literally you can go to, you know, Anthropic's Claude, whatever, and just type in: hey, give me a manual on how to do this.
Speaker A:And then tell it what skills you want there and then use that to give it to the guy as a manual for his own product.
Speaker A:Right.
Speaker B:This is not rocket science stuff.
Speaker B:You have to understand that people just don't trust it.
Speaker A:And if you have enough intimate knowledge of how it works, you can in fact get better with this.
Speaker A:Yeah.
Speaker C:And from that point on, it's just a referral game.
Speaker A:That's right, yeah.
Speaker A: If your website is not getting traffic, or you're having trouble converting leads on something, if you own your own business, right.
Speaker A: You can have this be your 24-hours-a-day, seven-days-a-week presence.
Speaker A: The number one way most customer-service-oriented firms lose business is they don't answer the phone call quickly enough.
Speaker A:Right.
Speaker A:They're not responsive enough.
Speaker A: Agentic AI can be your responsiveness, whether you're using Perplexity's Comet or your own agentic model.
Speaker A:It can literally answer your calls.
Speaker A:It can literally do all of that.
Speaker A: Right now there are voice modules, and there's the Bubbles skill set, which you can download to your agentic AI, which allows it to send iMessage messages.
Speaker A: Imagine someone text-messaging your business.
Speaker A: Now they can text message any time of day or night, and it'll give you a list of stuff to do in the morning.
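[Editor's note: a minimal sketch of that after-hours responder idea. A real setup would route through an LLM and an SMS or iMessage bridge; here a keyword-based stub shows the shape: acknowledge instantly, queue the message for the morning list.]

```python
# Toy after-hours auto-responder: every inbound text gets an instant
# acknowledgment and lands on a morning to-do queue. The keyword routing
# is a stand-in for what an LLM would do in a real agentic setup.

MORNING_QUEUE: list[str] = []

def auto_reply(text: str) -> str:
    MORNING_QUEUE.append(text)  # surfaces as the morning to-do list
    lowered = text.lower()
    if "quote" in lowered or "price" in lowered:
        return "Thanks! We'll text you a quote first thing in the morning."
    if "hours" in lowered:
        return "We're open 9-5 weekdays; we'll confirm by text shortly."
    return "Got your message. A real person will follow up in the morning."

print(auto_reply("Can I get a quote for a kitchen remodel?"))
```

The instant acknowledgment is the point: the business never leaves a lead hanging, and the queue becomes the owner's morning list.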
Speaker A:Right?
Speaker A:Right.
Speaker A: I mean, there are so many ways to utilize this to benefit most Americans, make more money, or free up more time.
Speaker A: Right.
Speaker A: It's impossible to list them all.
Speaker A:But I'm telling everybody right now, you've got to 100% absolutely adopt this technology now.
Speaker A:Get on the cutting edge of it.
Speaker A:It'll make you less afraid.
Speaker C:Even if it's not so much you learning all the different platforms.
Speaker C:I mean, hopefully by now you've gotten used to and grown your prompting skills.
Speaker A:Right.
Speaker C: That in and of itself is a skill that needs to be honed.
Speaker A:You know what I saw last night?
Speaker A: Somebody said to Perplexity's Comet, which is their agentic model that taps into all 19 or so models: make me a Bloomberg terminal.
Speaker A:And it did.
Speaker C:That's it.
Speaker A: A Bloomberg terminal is about $30,000 a year now.
Speaker A:I don't know if it has the same real time data as Bloomberg.
Speaker A:I don't know how robust it is, but I saw a demo of it working and it was pretty damn close.
Speaker C:Oh, so you don't.
Speaker C:Okay.
Speaker C:Wow.
Speaker A:And it looked just like a Bloomberg terminal.
Speaker C:I mean, I know with time too.
Speaker C:We've talked about it, the vibe coding.
Speaker A:Right.
Speaker C:If you give it time and work with it, you can get it to where you need it to get to.
Speaker A:I think you're already there.
Speaker A:Yeah, I think, I think you're already there.
Speaker A: I think, unless you're doing something truly complex, you can vibe code your way there.
Speaker A: And I get why vibe coding is so addictive.
Speaker A:You sit there and you talk about this stuff and I Know a lot of business owners.
Speaker C:It's like a puzzle, right?
Speaker A: It's like a puzzle. Once you get into it, you see this massive progress in a narrow window of time.
Speaker A:And then you spend a lot more time on the nuances, getting them dialed in to where you want them to be.
Speaker A:Right.
Speaker A:And it's just really, it's addictive because you're seeing these results and you've seen delivery to the markets and I've seen that.
Speaker A:I've shared this on my X feed and my threads feed.
Speaker A:Delivery to the markets of apps and websites is at an all time high.
Speaker A:You're just seeing the delivery cadence be unbelievable.
Speaker A:It's like an inflection point, straight up.
Speaker A:Wow.
Speaker A:So you can do that.
Speaker A:I mean, if you want to make a mobile app right now, if you've got a good idea for a mobile app, you can vibe code it right now.
Speaker A:Yeah.
Speaker A:With no help, end to end.
Speaker A:You by yourself, with no coding experience, and deliver it to the App Store in probably 48 hours, maybe three days.
Speaker A:Maybe three days.
Speaker A:Right.
Speaker A:And start having people use your app.
Speaker A:What would have taken teams months before?
Speaker A:Right.
Speaker A:I mean, I'm.
Speaker A:And that is not overstating it.
Speaker A:I've seen it done.
Speaker A:I can do it.
Speaker A:I was using an app service for our real estate brand.
Speaker A:Right.
Speaker A:And I just canceled it.
Speaker A:I love those guys.
Speaker A:Yeah.
Speaker A:I can build that entire app myself.
Speaker C:Why would I pay?
Speaker C:Right?
Speaker A:And here's the problem.
Speaker A:Unless you're selling a prepackaged, white-glove app, okay.
Speaker A:If you're not keeping it up to date quickly, then what's your value to me?
Speaker A:Right?
Speaker A:Right.
Speaker A:It's taking you months and years to roll out, when it should be.
Speaker A:Boom, boom, boom, improvement, improvement, improvement, improvement.
Speaker A:Because that's the speed at which I can make that today if I see something on somebody's website right now.
Speaker C:Yeah.
Speaker C:And it should be making improvements on its own too.
Speaker C:I mean, I mean, that's another thing, another great thing.
Speaker A:So video editing on AI is getting much, much better.
Speaker A:And agentic AI will take that up a notch.
Speaker B:It's not quite there yet, but it's
Speaker A:very, very close to where you could say, hey, and use the same MPEG, MPEG format that YouTube uses on its backend to run its platform.
Speaker A:It uses that same thing to cut up and edit videos for you.
Speaker A:So this whole like editor hustle on social media, that'll be gone.
Speaker A:Agentic AI will do it all for you.
Speaker A:You just say, hey, look, even our show: when it downloads to the drive, the drive tells the agentic AI.
Speaker A:The agentic AI will look at it, and it'll pull the transcript right away.
Speaker A:I'm not doing any of this.
Speaker C:Why?
Speaker A:Right, so if you're in that space, right, and you want to deliver viral content, right.
Speaker A:You can learn how to prompt and build that with agentic AI right now.
Speaker A:And if you're one of those guys who's got like 14 clients and you're spending hours a day manually cutting using an editing software, you just train your AI to do it.
Speaker A:And then your sole purpose, your only purpose is getting business in the door.
Speaker C:Yeah.
Speaker C:And I think also, I don't know how much in the banking industry, but in other industries I'm sure we're already at the point where you're going to go to your annual review next year and they're going to ask you: how have you used AI to make yourself more efficient?
Speaker A:You know what's crazy is they penalized that a year ago.
Speaker C:They penalized it a year ago, but now they expect you to use it to become more efficient.
Speaker C:If you're not doing it enough well, then you're not finding ways to improve.
Speaker A:Yeah, you're not, you're not growing, you're not growing.
Speaker A:Yeah, you're not gonna be part of our long term situation if you're not teaching the models how to do your job.
Speaker A:Saeed, how am I going to replace you?
Speaker A:Right?
Speaker C:Can't keep paying you forever.
Speaker A:Another great one.
Speaker A:If, let's say you got a 9 to 5 job, right?
Speaker A:And you're like, Chris, I want to get involved in this, I want to understand, but I don't want to risk my job to do it.
Speaker A:How do I, how do I solve this problem?
Speaker A:Well, I got you, okay?
Speaker A:Faceless YouTube channels, right?
Speaker A:YouTube is the number one most-watched platform over all the streaming platforms.
Speaker A:YouTube is number one.
Speaker C:Okay, yeah, but I've heard that YouTube was going to crack down on this and demonetize accounts.
Speaker A:I'm sure they will at some point in time.
Speaker A:But for right now, if you're, if you're doing this, it's a great way to learn.
Speaker A:You can make some money on it.
Speaker A:If one video goes viral, you make a couple hundred bucks here and there.
Speaker A:But basically all you do is you have AI look at whatever topic you're interested in on Wikipedia to make a 30, 40 second, maybe one minute long narrative, right?
Speaker A:And what you do is you run that narrative through ElevenLabs, and ElevenLabs gives you the audio.
Speaker A:You lay that audio onto images that you have AI create for you.
Speaker A:You piece it all together and you put it on the Internet, and it's a faceless YouTube channel that just does Shorts with hooks for viral content.
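The faceless-channel workflow described here can be sketched as a simple pipeline. This is a hypothetical illustration only: the narration, text-to-speech (standing in for ElevenLabs), and video-assembly steps are stubbed-out placeholder functions, not real API calls.

```python
# Hypothetical sketch of the faceless-channel pipeline described above.
# In a real build you would plug in an LLM for the script, the
# ElevenLabs text-to-speech API for audio, and a video tool for assembly.

def write_narrative(topic_text: str, max_words: int = 120) -> str:
    """Stub: condense source text into a short narration script."""
    words = topic_text.split()
    return " ".join(words[:max_words])

def synthesize_audio(script: str) -> bytes:
    """Stub standing in for a text-to-speech call (e.g. ElevenLabs)."""
    return f"AUDIO[{len(script.split())} words]".encode()

def assemble_short(script: str, audio: bytes, images: list[str]) -> dict:
    """Stub: lay the audio over AI-generated images into one Short."""
    return {"script": script, "audio_bytes": len(audio), "images": images}

def build_short(topic_text: str, images: list[str]) -> dict:
    """Chain the three steps: narrative -> audio -> assembled Short."""
    script = write_narrative(topic_text)
    audio = synthesize_audio(script)
    return assemble_short(script, audio, images)

short = build_short("Some Wikipedia topic text goes here", ["img1.png", "img2.png"])
```

The point of the sketch is that each stage is a swappable component; once the chain works once, every new video is just a new topic string fed through the same pipeline.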
Speaker D:Man, I've seen a lot of those on like Instagram.
Speaker A:Yeah.
Speaker A:And you're gonna see more.
Speaker A:You're gonna see more.
Speaker B:And that, that's where humanity really comes in.
Speaker A:Rarity.
Speaker C:Now, as of February, YouTube is cracking down on AI-generated automated channels, because they know it's a real problem.
Speaker A:Yeah, but how do you crack down on it?
Speaker A:It's the future.
Speaker A:All movies gonna be made this way.
Speaker A:What are you gonna do?
Speaker A:You're gonna say you can't?
Speaker A:We're not gonna adopt it?
Speaker A:It's delivering too much.
Speaker C:Like it says here, see, targeted content channels focusing on mass-produced Reddit stories, generic AI voiceovers with stock footage, repetitive ambient loops.
Speaker C:The fireplace.
Speaker A:Yeah.
Speaker C:And channels that simply reuse the same thumbnails, titles and templates across hundreds of videos are at high risk.
Speaker A:I'll tell you right now.
Speaker A:We've got a YouTube channel.
Speaker A:Never seen any of that help our algorithm at all.
Speaker A:We don't do any of those things.
Speaker A:And I don't see any boost.
Speaker C:Yeah, YouTube, at the very least, promote the channels that aren't doing it.
Speaker A:So YouTube is such a hypocrite too.
Speaker A:They say stuff like this, but they don't mean it.
Speaker A:So number one.
Speaker A:They want podcasts to join their platform.
Speaker A:We join their platform.
Speaker A:You put an hour long podcast on their platform, somebody watches 20 minutes.
Speaker A:That's good retention for a podcast.
Speaker A:Now, our retention is typically higher, but that's good retention.
Speaker C:Yeah, right.
Speaker C:Yeah.
Speaker A:But they consider that 20 minutes of an hour long podcast to be a 33% retention.
Speaker A:Right.
Speaker A:Versus if somebody watches five minutes of an eight minute clip and they go, oh, that retention's way higher.
Speaker A:We're gonna promote that channel way more than we promote the podcast channel.
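The retention math they're describing, as a quick sketch:

```python
# Retention = minutes watched / video length.
def retention(watched_min: float, length_min: float) -> float:
    return watched_min / length_min

podcast = retention(20, 60)   # 20 minutes of an hour-long podcast
clip = retention(5, 8)        # 5 minutes of an 8-minute clip

print(round(podcast * 100))   # 33 -- what YouTube sees for the podcast
print(round(clip * 100))      # 62 -- the short clip looks far "stickier"
```

Same absolute watch time would favor the podcast, but as a percentage of runtime, the short clip wins, which is the incentive the hosts are complaining about.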
Speaker B:So you wanted podcasts to come on your platform, you created an avenue for podcasts
Speaker A:on your platform, but then you penalize them.
Speaker C:Yeah.
Speaker A:For having good podcast retention.
Speaker C:That's why.
Speaker A:Makes no sense.
Speaker C:Yeah, that's why if you've ever wondered why all these different shows have different channels for their clips.
Speaker C:Yeah, yeah, that's why they do that.
Speaker A:Yeah.
Speaker A:So they can game the retention system.
Speaker A:Yeah.
Speaker A:And it's terrible, but that's how it is.
Speaker C:Yeah.
Speaker C:And they hope there's crossover between the clips channel versus the other channel.
Speaker B:All right, well, look, I came in hot.
Speaker A:I gave it to you hot.
Speaker A:What'd you think?
Speaker A:It was good.
Speaker C:No, I mean.
Speaker C:I mean scary.
Speaker C:Scary good.
Speaker C:Scary good.
Speaker A:Yeah.
Speaker A:Did I scare you?
Speaker A:With Jill, you're quiet today.
Speaker C:He's soaking it all in.
Speaker D:I wasn't really prepared for this, man.
Speaker D:You just went in straight.
Speaker D:Straight to 100.
Speaker A:Yeah, I. I knew it was gonna be a hot episode.
Speaker A:Sorry.
Speaker C:Yeah, no, no, it's good.
Speaker C:And look, it's realistic and it's what the people need to hear.
Speaker B:Yeah, it gets the people going.
Speaker B:Nobody knows what it means.
Speaker C:Nobody understands.
Speaker D:The Sam Altman video was really good.
Speaker D:I started Googling, like, who owns Helion and all that stuff.
Speaker D:Yeah, dude's worth, like, almost $2 billion and owns, like, almost a billion off of Reddit.
Speaker A:That's right.
Speaker D:Like, it's.
Speaker C:It's.
Speaker A:Yeah.
Speaker A:And that guy's death was also questionable, too.
Speaker D:Oh, it's like he got suicided or something.
Speaker A:Suicided?
Speaker C:Yeah, if you were suicided online.
Speaker C:Yeah, man.
Speaker C:It's like all the whistleblowers can't.
Speaker C:I don't even want to touch this, bro.
Speaker C:I'm trying to stay alive.
Speaker A:Yeah.
Speaker C:Listen, if you like the show, please head over and make sure you subscribe.
Speaker C:Ring that notification bell.
Speaker C:Moist.
Speaker C:Good stuff.
Speaker C:Help the channel out.
Speaker C:Head to joinfridays.com.
Speaker C:Use code HIGHER to help your boys out.
Speaker C:Or you can go to thspod.com, buy yourself some merch.
Speaker C:Get your merch.
Speaker C:Get your merch.
Speaker C:But the best thing you could do is refer the show to a family member or a friend.
Speaker C:That could really help out the show a lot.
Speaker C:Anything for you.
Speaker A:No, I'm good.
Speaker A:I think I've talked enough for this one.
Speaker C:Anything?
Speaker D:No, just.
Speaker D:Thank you.
Speaker D:Have.
Speaker D:Have a great day.
Speaker B:Have a great day after that.
Speaker B:All right, good night, everybody.
Speaker B:Have a great day.
Speaker B:Bye.
