TOPLEY’S TOP 10 July 11 2024

1. Venture Funding of AI Start-Ups Doubles

From Dave Lutz at Jones Trading


2. America’s startup boom is still going strong. Here’s what it means for the economy

NPR Greg Rosalsky

What’s driving the boom
Haltiwanger says that, basically, two big buckets of new businesses are being created these days.

New businesses in the first bucket are capitalizing on a huge post-pandemic population shift. Many office workers are now either fully remote or hybrid. “People are not spending five days a week at the office in major downtown areas,” Haltiwanger says. Where people spend their time, they spend their money. Bad news for businesses in downtown areas. Good news for businesses where office workers live.

That’s why one of the big areas for new business growth is in food and accommodations, particularly in the outskirts of cities. Haltiwanger, together with Ryan Decker, calls this “the donut effect.” There’s now a hole lacking vibrant economic activity in many major business districts, and a delicious fried dough of new business opportunities in the suburbs surrounding them. Office workers need their doughnuts, coffee and sandwiches near their office, which is now more often at home.

However, if the story of the new business boom were limited to just delis, gyms and doughnut shops in suburban areas, the upside would be somewhat limited. Sure, remote and hybrid work is a revolutionary change for a large fraction of the workforce, but the business boom it has fed could be seen as mostly just a geographic reshuffling of economic activity. Fewer coffee shops in Manhattan. More in New Jersey or Brooklyn. That would likely have only a limited upside for the economy.

That’s why Haltiwanger is much more excited about the other big bucket of new businesses he has identified in the data: tech startups. This boom is proving, he says, to be the most persistent. These tech startups come in many stripes, but one subcategory has really caught his and other economists’ attention: startups working in artificial intelligence.

“I think we’re in a new tech wave,” Haltiwanger says. “I think AI is the poster child of this.”

https://www.npr.org/sections/planet-money/2024/07/02/g-s1-7139/economy-startup-boom-america-productivity  Found at Barry Ritholtz Big Picture Blog https://ritholtz.com/2024/07/weekend-reads-617/


3. Roaring Kitty and CHWY

CHWY rally did not get back to 2023 levels…All-time high was $120 in 2021


4. Tesla’s Share of U.S. Electric Car Market Falls Below 50%

NYT By Jack Ewing
Tesla accounted for 49.7 percent of electric vehicle sales from April through June, down from 59.3 percent a year earlier, as the company led by Elon Musk lost ground to General Motors, Ford Motor, Hyundai and Kia, the research firm Cox Automotive said. It was the first time the company’s market share fell below 50 percent in a quarter, according to Cox. The firm, a leading auto industry researcher, estimates market share based on registrations, company reports and other data.

https://www.nytimes.com/2024/07/09/business/tesla-electric-vehicles-market-share.html


5. Not Yet, But Small-Cap Tech PSCT Is a Good Chart to Watch


6. More Mag 7 Market Cap Stats

Blackrock Insights

https://www.blackrock.com/us/individual/insights/blackrock-investment-institute/outlook


7. AI Power Demand May Grow by 10X

The Surging Problem of AI Energy Consumption, by Kelly Barner
 
On April 9th, Rene Haas, CEO of British semiconductor and software design company Arm Holdings, made a statement about data center energy consumption ahead of a partnership announcement with U.S.- and Japan-based universities.
As Haas said, “by the end of the decade, AI data centers could consume as much as 20% to 25% of U.S. power requirements. Today that’s probably 4% or less.”

Twenty-five percent of all power consumed in the United States might go to data processing in less than six years. No wonder interest in AI and advanced computing has been driving up the stock prices of companies that own power plants: Vistra is up 84 percent and Constellation Energy 63 percent.

Many of the sources I consulted in preparation for this episode reference a report from the International Energy Agency. Titled simply Electricity 2024: Analysis and Forecast to 2026, this 170-page report is full of data points, analysis, and projections.

For instance, the report states that a request to ChatGPT (one of the most popular examples of generative AI widely available today) requires an average of 2.9 watt-hours of electricity. That is equivalent to running a 60-watt light bulb for about three minutes, and nearly 10 times as much energy as the average Google search.
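Those equivalences check out with simple arithmetic. A quick sketch (the 2.9 Wh figure is from the IEA report; the ~0.3 Wh per Google search is an assumed value implied by the "nearly 10 times" comparison):

```python
# Back-of-the-envelope check of the energy figures quoted above.
chatgpt_wh = 2.9   # average energy per ChatGPT request (IEA estimate)
google_wh = 0.3    # assumed average energy per Google search (commonly cited)
bulb_watts = 60    # a traditional incandescent bulb

# Runtime of a 60 W bulb on 2.9 Wh: time (hours) = energy / power
bulb_minutes = chatgpt_wh / bulb_watts * 60
print(f"60 W bulb runtime: {bulb_minutes:.1f} minutes")  # -> 2.9 minutes

# Ratio of a ChatGPT query to a Google search
print(f"ChatGPT vs. Google search: {chatgpt_wh / google_wh:.1f}x")  # -> 9.7x
```

So "about three minutes" and "nearly 10 times" are both consistent with the report's 2.9 Wh figure.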

If that doesn’t have your mind spinning, AI power demand is expected to grow by at least 10X between 2023 and 2026.

Companies that are heavily invested in the AI and data processing space are well aware of this problem. Microsoft and Google have well-defined plans to achieve net negative emissions (forget net zero). Apple aspires to be carbon neutral globally, including its supply chain, by 2030.

How they are all going to hit those targets without changing something about AI energy consumption is a mystery to me.

AI’s Insatiable Need for Energy
AI runs on GPUs (graphics processing units), a type of chip used to process large amounts of data. Processing requirements and energy consumption increase when the AI is responding to a query. The more complex the model or the larger the dataset, the more energy is consumed to complete the job.

In addition, queries involving imagery are more energy intensive than those focused on text. Generating one image using AI can use the same amount of energy as charging a smartphone, according to researchers at Hugging Face, a collaborative AI platform.

Energy isn’t just consumed when we use AI; it is also consumed when the AI is being trained.

Alex de Vries, a data scientist and Ph.D. candidate at Vrije Universiteit Amsterdam, distinguishes between a training phase and an inference phase. Training is the process of building the model and teaching it how to learn on its own, while inference is when the trained model responds to new queries.

ChatGPT took relatively little energy to train but consumes a lot doing inference, which makes sense, because it has to do a lot of complex things in a very human-friendly way for an enormous volume of queries.

GPT-3 was the version cited as consuming about 10 times as much energy per query as a Google search. GPT-4 probably uses more power because it is a larger model with more parameters.

https://artofprocurement.com/supply/the-surging-problem-of-ai-energy-consumption/


8. Rents Falling in Florida-Redfin

https://www.redfin.com/news/rents-fall-in-florida-austin-june-2024


9. F-16 Transfers to Ukraine

Axios Jacob Knutson

https://www.axios.com/2024/07/10/ukraine-russia-f16-jets-nato-summit


10. 80% of Adults Want Parental Consent Around Social Media-Prof G Blog

Peer Pressure-Prof G-Scott Galloway

Age-gating social media is hugely popular. Over 80% of adults believe parental consent should be required for social media, and almost 70% want platforms to limit the time minors spend on them.

https://www.profgalloway.com/age-gating