How Micron Technology Is Poised to Benefit from AI Investments

Artificial Intelligence (AI) continues revolutionizing industries worldwide, including healthcare, retail, finance, automotive, manufacturing, and logistics, driving demand for advanced technology and infrastructure. Among the companies set to benefit significantly from this AI boom is Micron Technology, Inc. (MU), a prominent manufacturer of memory and storage solutions.

MU’s shares have surged more than 70% over the past six months and nearly 104% over the past year. Moreover, the stock is up approximately 12% over the past month.

This piece delves into the broader market dynamics of AI investments and how MU is strategically positioned to capitalize on these trends, offering insights into how investors might act now.

Broader Market Dynamics of AI Investments

According to Grand View Research, the AI market is expected to reach $1.81 trillion by 2030, growing at a CAGR of 36.6% from 2024 to 2030. This robust growth is propelled by the rapid adoption of advanced technologies across industry verticals, rising data generation, advances in machine learning and deep learning, the proliferation of big data, and substantial investments from governments and private enterprises.
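For readers who want to sanity-check that forecast, the short Python sketch below applies the standard compound-annual-growth-rate formula to the figures cited above; the 2024 base it derives is an implied value, not a number from the report, so treat it as illustrative only.

```python
# Illustrative check of the cited forecast (assumption: the 36.6% CAGR compounds
# annually over the six years from 2024 to 2030).
target_2030 = 1.81e12   # ~$1.81 trillion by 2030, per the Grand View Research figure above
cagr = 0.366            # 36.6% CAGR cited above
years = 2030 - 2024     # six compounding periods

implied_2024_base = target_2030 / (1 + cagr) ** years
print(f"Implied 2024 market size: ${implied_2024_base / 1e9:.0f}B")  # roughly $279B

# Year-by-year path from that implied base back up to ~$1.81 trillion in 2030
for year in range(2025, 2031):
    size = implied_2024_base * (1 + cagr) ** (year - 2024)
    print(year, f"${size / 1e12:.2f}T" if size >= 1e12 else f"${size / 1e9:.0f}B")
```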

AI has emerged as a pivotal force in the modern digital era. Tech giants such as Amazon.com, Inc. (AMZN), Alphabet Inc. (GOOGL), Apple Inc. (AAPL), Meta Platforms, Inc. (META), and Microsoft Corporation (MSFT) are heavily investing in research and development (R&D), thereby making AI more accessible for enterprise use cases.

Moreover, several companies have adopted AI technology to enhance the customer experience and strengthen their presence in the Industry 4.0 era.

Big Tech has poured billions of dollars into the AI revolution. So far in 2024, Microsoft and Amazon have collectively allocated over $40 billion for AI-related initiatives and data center projects worldwide.

DA Davidson analyst Gil Luria anticipates these companies will spend over $100 billion this year on AI infrastructure. According to Luria, spending will continue to rise in response to growing demand. Meanwhile, Wedbush analyst Daniel Ives projects continued investment in AI infrastructure by leading tech firms: “This is a $1 trillion spending jump ball over the next decade.”

Micron Technology’s Strategic Position

With a $156.54 billion market cap, MU is a crucial player in the AI ecosystem because it focuses on providing cutting-edge memory and storage products globally. The company operates through four segments: Compute and Networking Business Unit; Mobile Business Unit; Embedded Business Unit; and Storage Business Unit.

Micron’s dynamic random-access memory (DRAM) and NAND flash memory are critical components in AI applications, offering the speed and efficiency required for high-performance computing. The company has consistently introduced innovative products, such as HBM3E, the industry’s fastest, highest-capacity high-bandwidth memory (HBM), designed to advance generative AI innovation.

This month, MU announced it is sampling its next-generation GDDR7 graphics memory with the industry’s highest bit density. With more than 1.5 TB/s of system bandwidth and four independent channels to optimize workloads, Micron GDDR7 memory enables faster response times, smoother gameplay, and reduced processing times. The best-in-class capabilities of Micron GDDR7 will optimize AI, gaming, and high-performance computing workloads.

Notably, Micron recently reached an industry milestone as the first to validate and ship 128GB DDR5 32Gb server DRAM, addressing the growing speed and capacity demands of memory-intensive generative AI applications.

Furthermore, MU has forged strategic partnerships with prominent tech companies like NVIDIA Corporation (NVDA) and Intel Corporation (INTC), positioning the company at the forefront of AI technology advancements. In February this year, Micron started mass production of its HBM3E solution for use in Nvidia’s latest AI chip. Micron’s 24GB 8H HBM3E will be part of NVIDIA H200 Tensor Core GPUs, expected to begin shipping in the second quarter.

Also, Micron's 128GB RDIMMs are ready for deployment on the 4th and 5th Gen Intel® Xeon® platforms. In addition to Intel, Micron’s 128GB DDR5 RDIMM memory will be supported by a robust ecosystem, including Advanced Micro Devices, Inc. (AMD), Hewlett Packard Enterprise Company (HPE), and Supermicro, among many others.

Further, in April, MU qualified a full suite of its automotive-grade memory and storage solutions for Qualcomm Technologies Inc.’s Snapdragon Digital Chassis, a comprehensive set of cloud-connected platforms designed to power data-rich, intelligent automotive services. This partnership is aimed at helping the ecosystem build next-generation intelligent vehicles powered by sophisticated AI.

Robust Second-Quarter Financials and Upbeat Outlook

Solid AI demand and constrained supply accelerated Micron’s return to profitability in the second quarter of fiscal 2024, which ended February 29, 2024. MU reported revenue of $5.82 billion, beating analysts’ estimate of $5.35 billion. That compares with $4.74 billion in the previous quarter and $3.69 billion in the same period of 2023.

The company’s non-GAAP gross margin was $1.16 billion, versus $37 million in the prior quarter and a negative $1.16 billion in the previous year’s quarter. Micron’s non-GAAP operating income came in at $204 million, compared to operating losses of $955 million and $2.08 billion in the prior quarter and the same period last year, respectively.

MU posted non-GAAP net income and earnings per share of $476 million and $0.42 for the second quarter, compared to a non-GAAP net loss and loss per share of $2.08 billion and $1.91 a year ago, respectively. The company’s EPS also beat the consensus estimate of a $0.24 loss per share. During the quarter, its operating cash flow was $1.22 billion versus $343 million in the same quarter of 2023.

“Micron delivered fiscal Q2 results with revenue, gross margin and EPS well above the high-end of our guidance range — a testament to our team’s excellent execution on pricing, products and operations,” said Sanjay Mehrotra, MU’s President and CEO. “Our preeminent product portfolio positions us well to deliver a strong fiscal second half of 2024. We believe Micron is one of the biggest beneficiaries in the semiconductor industry of the multi-year opportunity enabled by AI.”

For the third quarter of fiscal 2024, the company expects revenue of $6.6 billion ± $200 million, and its gross margin is projected to be 26.5% ± 1.5%. Also, Micron expects its non-GAAP earnings per share to be $0.45 ± $0.07.
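To make the "midpoint plus or minus spread" framing concrete, here is a minimal, illustrative Python snippet that converts the guidance quoted above into explicit low and high bounds; the numbers are simply the ones in this paragraph.

```python
# Converting the "midpoint ± spread" guidance quoted above into explicit ranges.
guidance = {
    "revenue ($B)":     (6.6, 0.2),    # $6.6 billion ± $200 million
    "gross margin (%)": (26.5, 1.5),   # 26.5% ± 1.5%
    "non-GAAP EPS ($)": (0.45, 0.07),  # $0.45 ± $0.07
}

for metric, (midpoint, spread) in guidance.items():
    print(f"{metric}: {midpoint - spread:g} to {midpoint + spread:g}")
# revenue ($B): 6.4 to 6.8
# gross margin (%): 25 to 28
# non-GAAP EPS ($): 0.38 to 0.52
```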

Bottom Line

MU is strategically positioned to benefit from the burgeoning AI market, driven by its diversified portfolio of advanced memory and storage solutions, strategic partnerships and investments, robust financial health characterized by solid revenue growth and profitability, and expanding market presence.

The company’s recent innovations, including HBM3E and DDR5 RDIMM memory, underscore its commitment to advancing its capabilities across AI and high-performance computing applications.

Moreover, the company’s second-quarter fiscal 2024 earnings beat analysts' expectations, supported by the AI boom. Also, Micron offered rosy guidance for the third quarter of fiscal 2024. Investors eagerly await insights into MU’s financial performance, strategic updates, and outlook during the third-quarter earnings conference call scheduled for June 26, 2024.

Baird Senior Research Analyst Tristan Gerra upgraded MU stock from “Neutral” to “Outperform” and raised the price target from $115 to $150, citing meaningful upside opportunities for the company. Gerra stated that DRAM chip pricing has been rising while supply growth is anticipated to slow. Also, Morgan Stanley raised its rating on Micron from “Underweight” to “Equal-Weight.”

As AI investments from numerous sectors continue to grow, Micron stands to capture significant market share, making it an attractive option for investors seeking long-term growth in the semiconductor sector.

Investing in AI: Should You Bet on AMD, Broadcom, or NVIDIA?

Is NVDA the Top Player in AI Stocks?

Initially famed for gaming GPUs, NVIDIA Corporation (NVDA) has evolved into a leader in data center hardware, spearheading AI advancement. The company’s Hopper GPUs are in high demand, accelerating AI applications from recommendation engines to natural language processing and generative AI large language models like ChatGPT on NVIDIA platforms. At this point, NVDA’s dominance in AI and data center markets is undeniable.

For the first quarter that ended April 28, 2024, Nvidia’s revenue more than tripled year-over-year to a record $26.04 billion. NVIDIA’s Data Center segment (primarily connected to its AI operations) chalked up $22.60 billion in revenue, a 23% sequential gain and a massive 427% rise over the same period last year.

The chip giant’s operating income surged 690% from the year-ago value to $16.91 billion. NVIDIA’s non-GAAP net income amounted to $15.24 billion, or $6.12 per share, compared to $2.71 billion, or $1.09 per share, in the previous year’s quarter.

Buoyed by a robust financial position, NVDA increased its quarterly dividend by 150% from $0.04 per share to $0.10 per share of common stock. The increased dividend is equivalent to $0.01 per share on a post-split basis and will be paid to its shareholders on June 28, 2024.
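As a quick arithmetic check on the figures above, the short Python sketch below reproduces the 150% increase and the $0.01 post-split equivalent from the quarterly dividend amounts quoted in this paragraph.

```python
# Reproducing the dividend arithmetic described above (figures from the article).
old_quarterly_dividend = 0.04   # $ per share before the increase
new_quarterly_dividend = 0.10   # $ per share after the increase
split_ratio = 10                # 10-for-1 forward split

increase_pct = (new_quarterly_dividend / old_quarterly_dividend - 1) * 100
post_split_equivalent = new_quarterly_dividend / split_ratio

print(f"Dividend increase: {increase_pct:.0f}%")               # 150%
print(f"Post-split equivalent: ${post_split_equivalent:.2f}")  # $0.01 per share
```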

Moving forward, the company guided for roughly $28 billion in revenue for the second quarter of fiscal year 2025, representing a projected 7.5% sequential gain. Its non-GAAP gross margin is expected to be 75.5%, plus or minus 50 basis points.

Analysts expect NVDA’s revenue for the fiscal 2025 second quarter (ending July 2024) to increase 109.7% year-over-year to $28.32 billion. The consensus EPS estimate of $6.35 for the current quarter indicates a 135.1% improvement year-over-year. Moreover, the company has an excellent earnings surprise history, surpassing the consensus EPS estimates in each of the trailing four quarters.

Nvidia’s comprehensive offerings, from chips to boards, systems, software, services, and supercomputing time, cater to expanding markets and diversify its revenue streams. Moreover, the chipmaker’s shares have surged more than 130% over the past six months and nearly 190% over the past year. NVIDIA's trajectory suggests momentum fueled by accelerating AI adoption, promising a bright future.

Amid this, do AI stocks Broadcom Inc. (AVGO) and Advanced Micro Devices, Inc. (AMD) stand a chance to be as big as the industry leader, NVIDIA? Let’s fundamentally analyze them to find the answer.

Broadcom Inc. (AVGO)

Broadcom Inc. (AVGO) is emerging as one of Nvidia's toughest rivals in the race for networking revenue, especially as data centers undergo rapid transformation for the AI era. As a global tech leader, AVGO designs, develops, and supplies semiconductor and infrastructure software solutions. The company produces custom AI accelerators for major clients and recently projected $7 billion in sales from its two largest customers in 2024, who are widely believed to be Alphabet Inc. (GOOGL) and Meta Platforms, Inc. (META).

AVGO will announce its fiscal 2024 second-quarter earnings on June 12. Forecasts indicate a 37.4% year-over-year revenue surge to $12 billion, reflecting steady growth and financial resilience. Moreover, analysts expect a 5% uptick in the company’s EPS from the preceding year’s period to $10.84.

Broadcom has consistently exceeded consensus revenue and EPS estimates in each of the trailing four quarters, including the fiscal 2024 first quarter, when its net revenue increased 34% year-over-year to $11.96 billion, with triple-digit revenue growth in the Infrastructure Software segment to $4.57 billion. AVGO’s gross margin grew 22.8% from the year-ago value to $7.37 billion.

On top of it, the company’s non-GAAP net income for the three months came in at $5.25 billion or $10.99 per share, up 17.2% and 6.4% year-over-year, respectively. Also, its adjusted EBITDA increased from the prior-year quarter to $7.16 billion.

Looking ahead, the company forecasts nearly $50 billion in revenue for fiscal year 2024, with adjusted EBITDA projected to be approximately 60% of revenue. The company anticipates a 30% year-over-year surge in networking sales, driven by accelerated deployments of networking connectivity and the expansion of AI accelerators at hyperscalers. It also expects generative AI to account for 25% of semiconductor revenue.

The artificial intelligence megatrend is poised to significantly drive Broadcom's revenue and earnings growth in the upcoming decade. During a recent earnings call, Broadcom CEO Hock Tan emphasized, “Strong demand for our networking products in AI data centers, as well as custom AI accelerators from hyperscalers, are driving growth in our semiconductor segment.”

On May 20, 2024, AVGO announced its latest portfolio of highly scalable, high-performing, low-power 400G PCIe Gen 5.0 Ethernet adapters to revolutionize the data center ecosystem. These products offer an enhanced, open, standards-based Ethernet NIC and switching solution to resolve connectivity bottlenecks as XPU bandwidth and cluster sizes grow rapidly in AI data centers.

Patrick Moorhead, CEO & chief analyst at Moor Insights and Strategy, noted, “As the industry races to deliver generative AI at scale, the immense volumes of data that must be processed to train LLMs require even larger server clusters. Scalable high bandwidth, low latency connectivity is critical for maximizing the performance of these AI clusters.”

He added, “Ethernet presents a compelling case as the networking technology of choice for next-generation AI workloads. The 400G NICs offered by Broadcom, built on its success in delivering Ethernet at scale, offers open connectivity at an attractive TCO for power-hungry AI applications.”

With the company's expanding presence in the AI space, Broadcom stands out as a compelling alternative to major chip companies such as NVDA and AMD. Over the past six months, shares of AVGO have gained more than 42%, and nearly 63% over the past year, making it an attractive addition to your investment portfolio.

Advanced Micro Devices, Inc. (AMD)

Advanced Micro Devices, Inc. (AMD) has been at the forefront of innovation in high-performance computing, graphics, and visualization technologies for decades. While NVDA may be the first name that comes to mind in AI processor sales, AMD has established itself as a formidable competitor in the GPU space, particularly excelling in chips tailored for AI workloads.

However, AMD's influence doesn't stop at hardware; it has been actively expanding its AI software ecosystem. The company recently unveiled the AMD Ryzen™ AI 300 Series processors, featuring what AMD bills as the world’s most powerful Neural Processing Unit (NPU). These processors are designed to bring AI capabilities directly to next-gen PCs, promising a future where AI-infused computing is seamlessly integrated into everyday tasks.

Additionally, the next-gen AMD Ryzen™ 9000 Series processors for desktops solidify AMD’s position as a leader in performance and efficiency for gamers, content creators, and prosumers alike.

Moreover, the company’s comprehensive roadmap for the Instinct accelerator series promises an annual cadence of cutting-edge AI performance and memory capabilities across each generation. Beginning with the imminent release of the AMD Instinct MI325X accelerator in Q4 2024, followed by the anticipated launch of the AMD Instinct MI350 series powered by the new AMD CDNA™ 4 architecture in 2025, AMD is poised to deliver up to a 35x increase in AI inference performance compared to its previous iterations.

In the first quarter that ended March 30, 2024, AMD’s non-GAAP revenue increased 2.2% year-over-year to $5.47 billion. Both its Data Center and Client segments experienced substantial growth, each exceeding 80% year-over-year, fueled by the uptake of MI300 AI accelerators and the popularity of Ryzen and EPYC processors.

Moreover, the company’s non-GAAP operating income grew 3.2% from the year-ago value to $1.13 billion. Its non-GAAP net income and earnings per share rose 4.4% and 3.3% from the prior-year quarter to $1.01 billion and $0.62, respectively.

AMD expects its revenue in the second quarter of 2024 to be around $5.7 billion, with a projected growth of 6% year-over-year and 4% sequentially. Meanwhile, its non-GAAP gross margin is expected to be around 53%.

Street expects AMD’s revenue for the second quarter (ending June 2024) to increase 6.7% year-over-year to $5.72 billion. Its EPS for the ongoing quarter is projected to reach $0.68, registering a 17% year-over-year growth. Moreover, the company surpassed the consensus revenue estimates in each of the trailing four quarters.

While Nvidia’s Data Center segment posted $22.60 billion in revenue last quarter alone (an annualized run rate of roughly $90 billion), experts predict that the company could surpass the $100 billion mark in annual Data Center sales at this pace. In contrast, AMD's recent guidance forecasts sales of $3.5 billion for its MI300 AI chips in 2024. There is still a sizable gap between NVIDIA and AMD in AI revenue. To put things into perspective, NVDA's networking revenue alone is approximately four times larger than AMD's total AI chip sales.

Nonetheless, AMD is poised to drive AI innovation across various domains with a diverse portfolio spanning cloud, edge, client, and beyond. The stock has gained more than 55% and 39% over the past nine months and a year, respectively.

Bottom Line

With the global artificial intelligence (AI) market projected to soar from $214.6 billion in 2024 to $1.34 trillion by 2030 (exhibiting a CAGR of 35.7%), leading chip companies, including NVIDIA, Broadcom, and Advanced Micro Devices, are rapidly expanding their market presence, vying for a piece of the pie.

Given their solid fundamentals and promising long-term outlooks, NVDA, AVGO, and AMD appear in good shape to thrive in the foreseeable future. Thus, investors can place their bets on these stocks to garner profitable returns and capitalize on the upward curve of AI.

Why Nvidia’s Stock Split Could Drive Further Market Gains

NVIDIA Corporation (NVDA) shares topped a record high of $1,000 in a post-earnings rally. Last week, the company reported fiscal 2025 first-quarter results that beat analyst expectations for revenue and earnings, reinforcing investor confidence in the AI-driven boom in chip demand. Moreover, the stock has surged nearly 120% over the past six months and more than 245% over the past year.

Meanwhile, the chipmaker announced a 10-for-1 forward stock split of NVIDIA’s issued common stock, making stock ownership more accessible to employees and investors.

Let's delve deeper into how NVIDIA’s stock split decision could attract more investors and propel future gains.

The AI Chip Leader

NVDA’s prowess in AI and semiconductor technology has been nothing short of remarkable. Its GPUs (Graphics Processing Units) have become synonymous with cutting-edge AI applications, from powering self-driving cars and training and deploying LLMs to revolutionizing healthcare diagnostics and e-commerce recommendation systems.

Amid a rapidly evolving technological landscape, NVIDIA has consistently remained at the forefront, driving innovation and redefining industry standards. Led by Nvidia, the U.S. dominates the generative AI tech market. ChatGPT’s launch in November 2022 played a pivotal role in catalyzing the “AI boom.”

NVDA holds a market share of about 92% in the data center GPU market for generative AI applications. The company’s chips are sought after by several tech giants, including Amazon.com, Inc. (AMZN), Meta Platforms, Inc. (META), Microsoft Corporation (MSFT), Alphabet Inc. (GOOGL), and Tesla, Inc. (TSLA), for their diverse applications and high performance.

Nvidia surpassed analyst estimates for revenue and earnings in the first quarter of fiscal 2025, driven by robust demand for its AI chips. In the first quarter that ended April 28, 2024, NVIDIA’s revenue rose 262% year-over-year to $26.04 billion. That topped analysts’ revenue expectations of $24.59 billion. The company reported a record revenue from its Data Center segment of $22.60 billion, up 427% from the prior year’s quarter.

“Our data center growth was fueled by strong and accelerating demand for generative AI training and inference on the Hopper platform. Beyond cloud service providers, generative AI has expanded to consumer internet companies, and enterprise, sovereign AI, automotive and healthcare customers, creating multiple multibillion-dollar vertical markets,” said Jensen Huang, founder and CEO of NVDA.

“We are poised for our next wave of growth. The Blackwell platform is in full production and forms the foundation for trillion-parameter-scale generative AI,” Huang added. 

NVDA’s non-GAAP gross profit grew 328.2% from the year-ago value to $20.56 billion. The company’s non-GAAP operating income was $18.06 billion, an increase of 491.7% from the prior year’s quarter. Its non-GAAP net income rose 461.7% year-over-year to $15.24 billion.

Furthermore, the chipmaker reported non-GAAP EPS of $6.12, compared to the consensus estimate of $5.58, and up 461.5% year-over-year.

Nvidia’s Stock Split: A Strategic Move

Alongside an outstanding fiscal 2025 first-quarter earnings, NVDA announced a 10-for-1 stock split of its issued common stock. Nvidia’s decision to split its stock aligns with a broader trend among tech giants to make their shares more appealing to a wider range of investors, particularly retail investors. The chipmaker aims to democratize ownership and attract a vast investor base by breaking down the barrier of high share prices.
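To illustrate the mechanics, the hypothetical Python sketch below shows how a 10-for-1 split divides the per-share price and multiplies the share count while leaving the value of a holder's position unchanged; the $1,200 pre-split price and five-share position are assumed purely for illustration and are not figures from the article.

```python
# Hypothetical illustration: a 10-for-1 split changes price and share count,
# not the value of an investor's position. The numbers below are assumptions.
pre_split_price = 1200.0   # assumed, illustrative pre-split share price
shares_held = 5            # assumed, illustrative position
split_ratio = 10           # NVIDIA's announced 10-for-1 forward split

post_split_price = pre_split_price / split_ratio
post_split_shares = shares_held * split_ratio

value_before = shares_held * pre_split_price
value_after = post_split_shares * post_split_price
assert value_before == value_after  # the split leaves position value unchanged

print(f"Before: {shares_held} shares x ${pre_split_price:,.2f} = ${value_before:,.2f}")
print(f"After:  {post_split_shares} shares x ${post_split_price:,.2f} = ${value_after:,.2f}")
```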

As more individual investors gain access to Nvidia’s shares post-stock split, we could see heightened trading activity and increased demand, potentially exerting upward pressure on its share prices. This strategic move reflects the confidence of NVIDIA’s management in its future growth trajectory and underscores its commitment to inclusivity in the investment landscape.

Bank of America analysts, led by Jared Woodard, head of the bank’s research investment committee, described the share split as “another large-cap tech pursuing shareholder-friendly policies” in a note to clients.

NVIDIA marks the fourth Magnificent Seven big tech company to announce a stock split since 2022, following Google, Amazon, and Tesla’s efforts to make shares more accessible, according to Woodard and his team.

In recent years, as the share prices of several Big Tech companies surged past the $500 mark, it has become challenging for retail investors to buy shares. Consequently, these companies have been exploring ways to simplify the process for nonprofessional investors to buy in. BofA added, “Big Tech is going bite-sized” to lure retail investors, which might signal more market-beating returns.

Historical Data Suggests That Stock Splits Indicate a Bullish Outlook

Examining historical data on stock splits reveals a generally positive picture. While immediate post-split gains aren’t guaranteed, companies like Apple Inc. (AAPL) and Google have witnessed substantial appreciation in their share prices following splits. AAPL’s 4-for-1 stock split, which took effect in August 2020, primarily influenced investor sentiment and trading dynamics.

Following the split, Apple’s stock continued its upward trajectory, driven by solid performance in its core businesses, including iPhone sales, services revenue, and wearables. Throughout the latter half of 2020 and into 2021, its share price experienced significant appreciation, reaching new all-time highs.

Given NVIDIA’s robust fundamentals and leadership in AI and semiconductor technology, there’s reason to believe that its recent stock split could lead to similar outcomes.

BofA’s sell-side analysts have consistently been bullish on Nvidia shares, and following the first-quarter earnings release, they raised their lofty 12-month price target for the chip giant from $1,100 to $1,320. If the outlook proves accurate, Nvidia shares could surge by another 26%, and the stock split could support that bullish move, as per Bank of America’s reading of history.

“Splits have boosted returns in every decade, including the early 2000s when the S&P 500 struggled,” noted Woodard and his team. BofA’s research indicates that stocks have delivered 25% total returns within the 12 months following a stock split historically, compared to the S&P 500’s 12%.

Further, the bank highlighted that stock splits often ignite bullish runs, even in stocks that have been underperforming. For example, both Advanced Micro Devices, Inc. (AMD) and Valero Energy Corporation (VLO) experienced significant share price increases after announcing stock splits despite their prior poor performance. According to analysts, “Since gains are more common and larger than losses on average, splits appear to introduce upside potential into markets.”

However, it's essential to heed the standard caveat provided by the Securities and Exchange Commission (SEC): “Past performance is not indicative of future results.” In line with this, Bank of America emphasized that “outperformance is no guarantee” after a stock split. Companies still see negative returns 30% of the time following a split, with an average decline of 22% over the subsequent 12 months.

The analysts noted, “While splits could be an indication of strong momentum, companies can struggle in a challenging macro environment.” They pointed to companies like Amazon, Google, and Tesla that faced difficulties in the 12 months following their stock splits in 2022 due to a high interest-rate environment.

Bottom Line

NVDA has a significant role as a global leader in AI and semiconductor technology, with its GPUs driving innovations across numerous industries, such as tech, automobile, healthcare, and e-commerce. Nvidia’s fiscal 2025 first-quarter results suggest that demand for its AI chips remains robust.

Statista projects the global generative AI market to reach $36.06 billion in 2024. This year, the U.S. is expected to maintain its position as the leader in AI market share, with a total of $11.66 billion. Further, the market is estimated to grow at a CAGR of 46.5%, resulting in a market volume of $356.10 billion by 2030. The AI market’s bright outlook should bode well for NVDA.

The company also recently made headlines with its announcement to undergo a 10-for-1 stock split. While stock splits generally do not change the fundamental value of a company, they make its shares more accessible and attractive to retail investors. So, the recent stock split could significantly increase retail participation, driving heightened trading activity and potentially exerting upward pressure on Nvidia’s share prices.

Historically, stock splits generally indicate a positive impact on stock performance. Companies like AAPL, GOOGL, and AMD experienced substantial price appreciation after stock splits, with enhanced accessibility to retail investors driving higher demand and liquidity.

However, it is crucial to acknowledge that past performance is not indicative of future results. While stock splits can signal strong price momentum, they do not guarantee outperformance.

In conclusion, Nvidia’s stock split will likely attract more retail investors, potentially boosting trading activity and supporting stock price appreciation. Coupled with the company’s strong position in the AI and semiconductor markets, the stock split could facilitate further growth, aligning with historical trends of positive post-split performance.

Understanding the Bearish Signals in This Chipmaker's Stock Chart

Intel Corporation’s (INTC) shares plunged nearly 31% in April, marking their worst month in more than two decades, as the prominent chipmaker continues to grapple with executing a turnaround. Moreover, the stock has dropped approximately 40% year-to-date.

Most of INTC’s sell-off occurred after its recent financial results, which included a bleak forecast, indicating that the company’s turnaround efforts will require more time and investment. Further, Intel’s factory operations faced challenges in March, adding to investor concerns.

Mixed First-Quarter Earnings and Weak Forecast

During the first quarter that ended March 30, 2024, INTC’s net revenue increased 8.6% year-over-year to $12.72 billion. However, that missed analysts’ estimate of $12.78 billion. Also, the company’s Foundry business reported $4.40 billion in revenue, down 10% year-over-year.

The chipmaker’s gross margin rose 30.2% from the prior year’s quarter to $5.22 billion. Its operating loss was $1.07 billion, compared to $1.47 billion in the previous year’s period. However, Intel Foundry posted a $2.50 billion operating loss during the quarter. In 2023, this unit reported a hefty operating loss of $7 billion.

Furthermore, INTC’s net loss came in at $437 million, versus a $2.77 billion loss in the same quarter of 2023. The loss per share attributable to Intel was $0.09, compared to $0.66 in the prior year’s quarter, narrower than the consensus loss per share estimate of $0.15.

Intel’s primary business remains manufacturing chips for PCs and laptops, categorized under its Client Computing Group (CCG). This business unit’s revenue amounted to $7.50 billion, a 31% increase year-over-year.

In addition, Intel produces central processors for servers and other components and software, which are classified under its Data Center and AI business segment. Sales in this segment rose by 5% year-over-year to $3 billion. However, the chipmaker faces stiff competition in the server market, particularly against AI chips from companies like NVIDIA Corporation (NVDA).

In addition, for the second quarter of fiscal 2024, the company expects its revenue to come in between $12.5 billion and $13.5 billion. It projects a GAAP loss per share of $0.05 for the current quarter, while its non-GAAP earnings per share are expected to be $0.10.

INTC recently revised its current-quarter revenue guidance after the U.S. Department of Commerce revoked certain export licenses intended to send its chips to the Chinese tech company Huawei.

On May 7, the chipmaker said in an 8-K filing with the SEC that it had received a notification from federal regulators that they were “revoking certain licenses for exports of consumer-related items to a customer in China, effective immediately.”

On Wednesday, Intel announced that due to the Commerce Department's directive, it expects revenue for the second quarter to fall below the midpoint of the original range of $12.5 billion to $13.5 billion. However, the company continues to expect full-year revenue and earnings to be higher than in 2023.

Intel Faces Fierce Competition

INTC, a longstanding leader in the semiconductor industry, has been facing rigid competition from rivals, including Advanced Micro Devices, Inc. (AMD) and Nvidia. Intel remains dominant in the PC chip market, but AMD is gaining ground in server, desktop, and mobile segments, as per the latest figures from Mercury Research.

Intel remains the leading player in the server CPU segment, with a market share of 79.2% during the first quarter; however, this is down from 82% in the year-ago quarter, indicating some erosion in its market share. On the other hand, AMD made gains in this segment, rising from just 18% a year ago to 23.6% in the first quarter of 2024.

Also, Intel's market share in the mobile CPU segment was 80.7% in the first quarter of 2024, compared to 83.8% in the prior year’s quarter. Accordingly, AMD’s 19.3% mobile share in the first quarter was 3.1 percentage points higher than in the same period of 2023. Further, AMD gained on Intel in desktops, with a 23.9% desktop share in the first quarter of 2024, up 4.7 percentage points from a year ago.

Besides, INTC continues to fight for server market share against competitor NVDA, particularly in AI chips. Nvidia commands around 80% of the AI chip market with its graphics processors (GPUs), which AI builders have favored over the past year.

Earlier in April, Intel introduced its latest AI chip, Gaudi 3, as competition from NVDA intensified. The company claimed the new Gaudi 3 chip is over twice as power-efficient and can run AI models 1.5 times faster than Nvidia’s H100 GPU. Also, it is available in various configurations, such as a bundle of eight Gaudi 3 chips on a single motherboard or a card designed to fit into existing systems.

Intel tested the chip on models like Meta's open-source Llama and the Abu Dhabi-backed Falcon. It highlighted that Gaudi 3 could be instrumental in training or deploying models, including Stable Diffusion and OpenAI’s Whisper model for speech recognition.

Also, Intel is losing market share to rivals such as Arm Holdings PLC (ARM), Samsung Electronics, and Taiwan Semiconductor Manufacturing Ltd. (TSM).

Analysts Lowered Price Targets for Intel Shares

Goldman Sachs analysts slashed their price target for INTC stock from $39 to $34 and lowered their adjusted EPS estimates for the 2024-2026 period by an average of 18%. Also, they reaffirmed their “Sell” rating for the stock, which has been in effect since July 2020.

“We worry the company will continue to cede wallet share within the overall Data Center Compute market to the likes of Nvidia and Arm,” Goldman analysts said.

Meanwhile, Bank of America Corporation (BAC) cut its price objective to $40 from $44, citing higher costs, lower growth, and fierce competition. According to BofA analysts, the bleak second-quarter revenue guidance highlights that “topline growth remains lukewarm on limited AI exposure, while underutilized manufacturing and elevated costs.”

They added that Intel’s “enterprise incumbency, US-based manufacturing assets and weak investor sentiment provide turnaround potential.”

Bottom Line

INTC’s first-quarter 2024 earnings surpassed Wall Street’s expectations for EPS but fell short on sales. The chipmaker also provided a weak forecast for the current quarter.

After the U.S. Department of Commerce recently revoked certain licenses for exports of chips to Huawei in a bid to curb China’s tech power, Intel revised its second-quarter revenue guidance, anticipating revenue below the midpoint of the initial range of $12.5 billion to $13.5 billion.

INTC’s stock fell more than 30% in April, marking its biggest monthly decline since June 2002. Moreover, the stock is trading below its 50-day and 200-day moving averages of $38.33 and $39.74, respectively, indicating a downtrend.

Despite INTC’s more than 50 years of dominance in the semiconductor industry, it now faces intense competition from competitors like AMD, NVDA, TSM, Samsung, ARM, and more. Also, the ongoing AI boom has caused a shift in enterprise spending away from Intel’s traditional data center chips.

With limited AI exposure, the intensifying competition raises doubts about Intel’s future dominance in the semiconductor industry.

INTC’s CEO Pat Gelsinger told investors on an earnings call to focus on the company’s long-term potential.

Analysts expect INTC’s revenue to increase marginally year-over-year to $13.06 billion for the second quarter ending June 2024. However, its EPS for the current quarter is expected to decline 18.2% year-over-year to $0.11. For the fiscal year 2024, the chipmaker’s revenue and EPS are expected to grow 3.3% and 4.8% year-over-year to $55.99 billion and $1.10, respectively.

“While 2024 should mark a bottom in many aspects of the business, the pace of the climb back up is likely to remain unclear,” Stifel stated in a note to clients.

Given INTC’s disappointing revenue guidance, regulatory issues, and fierce competition, it could be wise to avoid investing in this stock now.

Why Super Micro Computer (SMCI) Could Be a Hidden Gem for Growth Investors

In March 2024, Super Micro Computer, Inc. (SMCI) became the latest artificial intelligence (AI) company to join the S&P 500 index, just a little more than a year after joining the S&P MidCap 400 in December 2022. Shares of SMCI jumped by more than 2,000% in the past two years, driven by robust demand for its AI computing products, which led to rapid sales growth.

Moreover, SMCI’s stock has surged nearly 205% over the past six months and more than 520% over the past year. A historic rally in the stock has pushed the company’s market cap past $48 billion.

SMCI is a leading manufacturer of IT solutions and computing products, including storage and servers tailored for enterprise and cloud data centers, purpose-built for use cases such as AI, cloud computing, big data, and 5G applications. The company has significantly benefited from the ongoing AI boom in the technology sector.

According to ResearchAndMarkets.com’s report, the global AI server market is expected to reach $50.65 billion by 2029, growing at a CAGR of 26.5% during the forecast period (2024-2029).

Specializing in servers and computer infrastructure, SMCI maintains long-term alliances with major tech companies, including Nvidia Corporation (NVDA), Intel Corporation (INTC), and Advanced Micro Devices, Inc. (AMD), which have fueled the company’s profitability and growth.

Let’s discuss Super Micro Computer’s fundamentals and growth prospects in detail:

Recent Strategic Developments

On April 9, SMCI announced its X14 server portfolio with future support for the Intel® Xeon® 6 processor through early access programs. Supermicro’s Building Block Architecture, rack plug-and-play, and liquid cooling solutions, along with the breadth of the new Intel Xeon 6 processor family, enable the delivery of optimized solutions for any workload and at any scale, offering superior performance and efficiency.

The upcoming processor family will be available with Efficient-core (E-core) SKUs that raise performance-per-watt for cloud, networking, analytics, and scale-out workloads, and Performance-core (P-core) SKUs that increase performance-per-core for AI, HPC, storage, and edge workloads.

Also, the upcoming processor portfolio will feature built-in Intel Accelerator Engines with new support for FP16 on Intel Advanced Matrix Extensions.

In the same month, SMCI expanded its edge compute portfolio to accelerate IoT and edge AI workloads with a new generation of embedded solutions.

“We continue to expand our system product line, which now includes servers that are optimized for the edge and can handle the demanding workloads where massive amounts of data are generated,” said Charles Liang, president and CEO of SMCI.

“Our building block architecture allows us to design and deliver a wide range of AI servers that give enterprises the solutions they need, from the edge to the cloud. Our new Intel Atom-based edge systems contain up to 16GB of memory, dual 2.5 GbE LAN ports, and a NANO SIM card slot, which enables AI inferencing at the edge where most of the world's data is generated,” Liang added.

Also, on March 19, Supermicro unveiled its newest lineup aimed at accelerating the deployment of generative AI. The Supermicro SuperCluster solutions offer foundational building blocks for the present and the future large language model (LLM) infrastructure.

The full-stack SuperClusters include air- and liquid-cooled training and cloud-scale inference rack configurations with the latest NVIDIA Tensor Core GPUs, Networking, and NVIDIA AI Enterprise software.

Further, SMCI announced new AI systems for large-scale generative AI featuring NVIDIA's next generation of data center products, including the latest NVIDIA GB200 Grace™ Blackwell Superchip and the NVIDIA B200 and B100 Tensor Core GPUs.

Supermicro is upgrading its existing NVIDIA HGX™ H100/H200 8-GPU systems for seamless integration with the NVIDIA HGX™ B100 8-GPU, thus reducing time to delivery. Also, the company strengthens its broad NVIDIA MGX™ systems range with new offerings featuring the NVIDIA GB200, including the NVIDIA GB200 NVL72, a comprehensive rack-level solution equipped with 72 NVIDIA Blackwell GPUs.

Additionally, Supermicro is introducing new systems to its portfolio, including the 4U NVIDIA HGX B200 8-GPU liquid-cooled system.

Solid Third-Quarter 2024 Results

For the third quarter that ended March 31, 2024, SMCI’s revenue increased 200.8% year-over-year to $3.85 billion. Its non-GAAP gross profit grew 163.9% from the year-ago value to $600.59 million. Its non-GAAP income from operations was $434.42 million, up 290.7% year-over-year.

The server assembler’s non-GAAP net income rose 340% from the prior year’s quarter to $411.54 million. Its non-GAAP net income per common share came in at $6.65, an increase of 308% year-over-year.

As of March 31, 2024, Super Micro Computer’s cash and cash equivalents stood at $2.12 billion, compared to $440.46 million as of June 30, 2023. The company’s total current assets were $8.06 billion versus $3.18 billion as of June 30, 2023.

Charles Liang, President and CEO of Supermicro, said, “Strong demand for AI rack scale PnP solutions, along with our team’s ability to develop innovative DLC designs, enabled us to expand our market leadership in AI infrastructure. As new solutions ramp, including fully production ready DLC, we expect to continue gaining market share.”

Raised Full-Year Revenue Outlook

SMCI expects net sales of $5.10 billion to $5.50 billion for the fourth quarter of fiscal year 2024 ending June 30, 2024. The company’s non-GAAP net income per share is anticipated to be between $7.62 and $8.42.

For the fiscal year 2024, Supermicro raised its guidance for revenues from a range of $14.30 billion to $14.70 billion to a range of $14.70 billion to $15.10 billion. Its non-GAAP net income per share is expected to be from $23.29 to $24.09.

CEO Charles Liang said he expects AI growth to remain solid for several quarters, if not years, to come. To support this rapid growth, the company had to raise capital through a secondary offering this year, Liang added.

Meanwhile, finance chief David Weigand said that the company’s supply chain continues to improve.

Bottom Line

SMCI’s fiscal 2024 third-quarter results were exceptional, with a record revenue of $3.85 billion and a non-GAAP EPS of $6.65. This year-over-year revenue growth of 200% and year-over-year non-GAAP EPS growth of 308% significantly outpaced its industry peers.

After reporting outstanding financial performance, the company raised its full-year revenue forecast, pointing to solid AI demand.

Super Micro Computer, which joined the S&P 500 in March, has a unique edge among server manufacturers aiming to capitalize on the generative AI boom. Notably, the server maker’s close ties with Nvidia allow it to launch products superior to competitors, including Dell Technologies Inc. (DELL) and Hewlett Packard Enterprise Company (HPE).

The company has a history of being among the first to receive AI chips from NVDA and AMD as it assists them in checking server prototypes, giving it a head start over rivals. This has positioned SMCI as a key supplier of servers crucial for generative AI applications, leading to a remarkable 192% surge in shares so far this year.

According to an analyst at Rosenblatt Securities, Hans Mosesmann, “Super Micro has developed a model that is very, very quick to market. They usually have the widest portfolio of products when a new product comes out from Nvidia or AMD or Intel.”

Moreover, analysts at Bank of America project that SMCI’s share of the AI server market will expand to around 17% in 2026 from 10% in 2023. Argus analyst Jim Kelleher also seems bullish about SMCI. Kelleher maintained a Buy rating on SMCI’s stock.

According to the analyst, Super Micro Computer is a leading server provider for the era of generative AI. Alongside a comprehensive range of rack and blade servers for cloud, enterprise, data center, and other applications, SMCI offers GPU-based systems for deep learning, high-performance computing, and various other applications.

Given its solid financials, accelerating profitability, and robust near-term growth outlook, investors could consider buying this stock for substantial gains.