Wall Street Brunch: Earnings Arrive Amid Hormuz Standoff

Seeking Alpha Blog

This piece offers no actionable intelligence for AI or semiconductor investors. The headline references earnings season and geopolitical tensions around the Strait of Hormuz, but the summary confirms there's no substantive technology sector content to analyze. For those tracking AI infrastructure buildout, hyperscaler capex cycles, or chip supply chains, this is a pass.

The absence of tech-specific content is itself noteworthy given the current market environment. We're in the midst of a critical earnings period where investors need clarity on several fronts: whether hyperscalers will maintain their projected $200-plus billion in combined AI capex for 2025, how quickly inference workloads are scaling relative to training, and whether NVIDIA's Blackwell ramp is meeting the aggressive deployment timelines customers have telegraphed. Microsoft, Meta, Amazon, and Google collectively spent roughly $190 billion on capex in 2024, with the majority directed toward AI infrastructure. Any deceleration or reallocation would have immediate implications for NVIDIA, Broadcom, and the broader semiconductor equipment chain.

The Hormuz reference is potentially relevant only in the narrowest sense. Roughly 20 percent of global oil flows through that chokepoint, and sustained disruption would pressure crude prices upward, feeding into inflation expectations and potentially constraining Federal Reserve policy flexibility. Higher rates would compress tech multiples, particularly for unprofitable AI application companies trading on revenue multiples. But the direct impact on semiconductor supply chains is minimal, since chip fabrication and assembly occur primarily in Taiwan, South Korea, Japan, and, increasingly for leading-edge nodes, Arizona and Ohio. Energy costs matter for fab operations, but they're not the binding constraint on production.
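The rate sensitivity can be made concrete with a steady-state Gordon-growth multiple, 1/(r − g). The inputs below are hypothetical, chosen only to illustrate why the same rate move hits high-growth multiples hardest:

```python
def gordon_multiple(discount_rate: float, growth_rate: float) -> float:
    """Steady-state value-to-cash-flow multiple under Gordon growth: 1 / (r - g)."""
    assert discount_rate > growth_rate, "multiple is undefined when g >= r"
    return 1.0 / (discount_rate - growth_rate)

# Hypothetical scenario: the discount rate rises 100 bps, from 8% to 9%.
low_growth_ratio = gordon_multiple(0.09, 0.02) / gordon_multiple(0.08, 0.02)
high_growth_ratio = gordon_multiple(0.09, 0.06) / gordon_multiple(0.08, 0.06)

# Mature business (g = 2%): multiple falls to ~86% of its prior level.
# High-growth business (g = 6%): multiple falls to ~67% of its prior level.
print(round(low_growth_ratio, 3), round(high_growth_ratio, 3))
```

The asymmetry is the point: the closer growth expectations sit to the discount rate, the more violently the implied multiple reacts to a rate shock, which is why richly valued AI names carry the most duration risk.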

What investors actually need right now is granular data on AI monetization. We're seeing the first wave of inference revenue at scale, with companies like OpenAI reportedly approaching $4 billion in annualized revenue and Anthropic crossing $1 billion. The question is whether enterprise AI spending is cannibalizing traditional software budgets or representing net new investment. If it's substitution rather than expansion, that changes the growth trajectory for infrastructure providers whose valuations price in sustained 30-plus percent compound annual growth through 2027.
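It's worth seeing what a 30 percent compound rate actually implies. A minimal sketch, using a hypothetical $100 billion revenue base (the 30 percent rate is from the text; the base is an assumption for illustration):

```python
def project_revenue(base: float, cagr: float, years: int) -> list[float]:
    """Compound a base revenue figure forward at a fixed annual growth rate."""
    return [base * (1.0 + cagr) ** n for n in range(years + 1)]

# Hypothetical: $100B of infrastructure revenue compounding at 30% from 2024.
path = project_revenue(100.0, 0.30, 3)  # 2024 through 2027
print([round(v, 1) for v in path])
# Three years at 30% means 1.3**3 ≈ 2.2x the starting base -- valuations
# pricing that in are assuming demand more than doubles, not merely shifts
# out of existing software budgets.
```

If enterprise AI spend is mostly substitution, the aggregate budget pool doesn't grow at anything like that rate, and the compounding assumption breaks well before 2027.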

On the semiconductor side, the focus should be on TSMC's capacity allocation and whether Samsung's yield improvements on 3-nanometer processes are real or aspirational. NVIDIA's Blackwell chips are manufactured on TSMC's 4-nanometer process with CoWoS-L advanced packaging, and any bottleneck in that packaging capacity directly constrains how quickly hyperscalers can deploy next-generation clusters. TSMC has guided to roughly $40 billion in 2025 capex, with advanced packaging capacity expansion as a priority, but lead times remain extended.

For investors tracking this sector, the signal-to-noise ratio matters. Articles that promise earnings analysis or market-moving developments, but deliver geopolitical commentary without connecting it to specific supply chains, customer demand patterns, or valuation frameworks, waste time during a period when the gap between AI winners and pretenders is widening fast. The next few weeks will clarify whether current AI infrastructure valuations reflect sustainable demand or speculative excess, and answering that question requires focus on company-specific fundamentals rather than macro headlines with no sector linkage.