Outlook 2024 - AI & Robotics: Full steam ahead!
2023 was a foundational year for the theme. We expect 2024 to continue this process in what should prove to be a true revolution. Welcome to the AI age.
Semiconductors: too much, too fast?
Need for chips
If there is one sector that has truly benefited from the Artificial Intelligence (AI) frenzy, it is semiconductors. The poster child is obviously NVIDIA Corp, which has leveraged its position as a leading provider of Machine Learning (ML) solutions to capture most, if not all, of the rising demand for GPU-enabled computing power, shattering even the most optimistic expectations quarter after quarter with diabolical regularity.
Other players have followed similar ascending trajectories (e.g., AMD), buoyed by a euphoria fueled by the rising popularity of large AI models, which require tremendous amounts of computing power to be trained, but also to be used (aka inference). And, of course, every CEO and CFO has tried to get a slice of the pie by relating their business to the technology. A recent example is ARM’s IPO, with management promoting the company as a central player in the AI supply chain even though the business still relies mostly on the smartphone market.
Where do we go from here?
This euphoria rests on the postulate that AI models will retain their current growth trajectory, which supposes both steady growth in users and use cases and a continuous increase in model complexity. While both are likely over the long term (after all, if this were not our conviction, we would have closed the strategy!), they are more uncertain over the coming quarters. The novelty effect is indeed already starting to wear off, with many users waiting for applications to mature before committing further to the technology. More importantly, thorough efforts have been initiated to optimize current models, as cost reduction appears paramount to ensuring the sector’s profitability and thus longevity, especially in anticipation of the advent of a multitude of applications.
This leads to questioning the actual capacity required over the short term. There is no doubt that there has been a chip rush to cover initial needs, as proven by the scarce availability of GPUs in the past few quarters. With normalizing user growth and ongoing optimization efforts, target nominal capacity may be reached much sooner than expected, leading to a temporary lull in demand. On top of that, China will increasingly be out of the equation due to U.S. sanctions and will no longer be able to stockpile hardware as it did in recent months.
Given current valuation levels, any sign of early normalization would trigger a significant sell-off in the shares of companies with the most exposure to hardware. It may be some time away, as 1H24 comps will remain favorable, but when it happens, it will be the signal for us to reduce our exposure and switch from general-purpose hardware to more specialized chips.
First signs of change
The AI hardware currently in use is indeed primarily derived from technologies not expressly designed for it: CPUs are general-purpose chips, while GPUs were initially designed for 3D applications. However, their flexibility perfectly fits the early stage of the AI landscape, with much experimentation in algorithm design. But as the sector matures, specialized chips are bound to take over, especially when it comes to inference, as relying on GPUs for such tasks will be far too costly.
While the current landscape for such chips is limited, as generative AI (genAI) remains in its infancy, the first signs of change are materializing: Alphabet is already on the fifth generation of its so-called TPU; Microsoft Corp unveiled two specialized chips for its cloud infrastructure in November; and chipmaker Marvell is reporting growing traction for its specialized products. Competition is already brewing, which is logical considering what is at stake and the fact that major players do not want to become too reliant on NVIDIA Corp. Volatility is to be expected, but excitement is guaranteed!
Data: let it flow
Navigating the data maze
Data is often referred to as the oil of the 21st century. This claim has never been more accurate than since the advent of genAI. Such models indeed require colossal amounts of data to be trained accurately: to put things in perspective, training the initial version of ChatGPT required over 300bn words, largely scraped from the internet and a large array of books. In addition, maintaining a model’s relevance means regularly feeding it fresh data. This implies the existence of a complex and robust supply chain able to capture, store, format, and exploit data efficiently and at scale.
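The capture-store-format-exploit chain described above can be sketched in a few lines. This is a purely illustrative toy; the cleaning rules and the two-document corpus are invented for the example and bear no relation to any actual training pipeline:

```python
import re
from collections import Counter

def capture(sources):
    """Capture step: gather raw documents (here, plain strings)."""
    return list(sources)

def clean(doc):
    """Format step: strip markup-like tags, collapse whitespace, lowercase."""
    doc = re.sub(r"<[^>]+>", " ", doc)           # drop HTML remnants
    return re.sub(r"\s+", " ", doc).strip().lower()

def exploit(corpus):
    """Exploit step: basic corpus statistics a training run would rely on."""
    words = " ".join(corpus).split()
    return {"documents": len(corpus), "words": len(words),
            "vocabulary": len(Counter(words))}

raw = capture(["<p>Data is the 21st-century   oil</p>",
               "Fresh data keeps a model relevant"])
corpus = [clean(d) for d in raw]
stats = exploit(corpus)
print(stats)  # {'documents': 2, 'words': 11, 'vocabulary': 10}
```

Real pipelines add storage and deduplication layers at web scale, but the stages are the same ones named in the text.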
Only an infrastructure built on the cloud can address this problem. Cloud computing indeed allows the resources required for any given task to be scaled easily and dynamically. So easily, in fact, that many companies abused it, realized they were spending too much and too inefficiently, and entered a period of optimization over the past few quarters. Given the sentiment echoed across the latest earnings season, we believe the worst is behind us. The only significant downside risk would come from a deep recession and a general consumption cut. However, even in such a scenario, cloud technologies would remain in demand given their inherent efficiency advantages.
Some winners already emerge
Beyond the “basic” cloud services (compute and storage), the rise of the cloud leaves the entire software ecosystem striving to support and enhance the technology’s potential as a major beneficiary. We have notably in mind players active in the observability segment, i.e., how to optimize cloud workloads (e.g., Datadog Inc), and in the data integration segment, i.e., how to aggregate and prepare data to exploit it efficiently and accurately (e.g., Snowflake). Both segments already have players of a respectable size showing explosive growth.
However, we believe the focal point lies within integrated ML platforms. The current complexity of deploying AI applications requires specialized skills and tools, an openly recognized bottleneck. This challenge has prompted the emergence of platforms embracing low-code or no-code methodologies, enabling individuals with moderate training to engage in model training. A well-known example is Dataiku's platform, whose so-called "visual" ML manages the process from data ingestion to training. While this sector may be in its early stages compared to its slightly more mature counterparts, it unmistakably holds the key to differentiation. This is particularly evident as capabilities rapidly advance in tandem with the increasing sophistication of AI models, creating a positive feedback loop.
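To make the ingestion-to-training chain concrete, here is a toy version of the stages such platforms expose as visual building blocks. The "model" is a deliberately trivial threshold rule invented for the illustration; it has nothing to do with Dataiku's actual methods:

```python
# Illustrative only: the ingest -> prepare -> train chain that low-code
# platforms let non-specialists assemble visually.
def ingest(rows):
    """Parse 'feature,label' records into (float, int) pairs."""
    return [(float(f), int(l)) for f, l in (r.split(",") for r in rows)]

def prepare(data):
    """Min-max scale the feature to [0, 1]."""
    xs = [x for x, _ in data]
    lo, hi = min(xs), max(xs)
    return [((x - lo) / (hi - lo), y) for x, y in data]

def train(data):
    """Toy 'model': threshold halfway between the two class means."""
    mean = lambda v: sum(v) / len(v)
    m0 = mean([x for x, y in data if y == 0])
    m1 = mean([x for x, y in data if y == 1])
    threshold = (m0 + m1) / 2
    return lambda x: int(x > threshold) if m1 > m0 else int(x < threshold)

rows = ["1.0,0", "2.0,0", "8.0,1", "9.0,1"]
model = train(prepare(ingest(rows)))
print(model(0.9))  # a high scaled value lands on the "1" side
```

The value such platforms add is precisely that each of these steps becomes a configurable block rather than code to write.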
Regulatory concerns are easing, for now
Whoever owns the data decides where AI will go. Hence, problems materialized when it appeared that some models, mainly in the image generation segment, had been trained on artwork in violation of authors' copyright. Lawsuit announcements quickly followed, with legal procedures still ongoing at the time of writing. This creates a potentially significant headwind for the theme, as it would impose constraints across the entire data pipeline.
However, as we recently wrote, some relief unexpectedly came from the U.S. executive order to regulate AI. This order, while clearly emphasizing security, said little, if anything, about the use of data, leaving the issue to be settled by future standards and jurisprudence. Although it does not fully clear the road of obstacles, it shows that data is not the regulator’s priority, corresponding to a pro-business stance. Consequently, although we expect some regulation to emerge at some point, especially in more privacy-oriented jurisdictions such as the E.U., we do not see it as a showstopper for the foreseeable future.
Physical automation: the most impacted by macro?
The double-edged sword of recession
Our fundamental conviction regarding physical automation has not changed: due to a variety of factors ranging from an aging population to a chronic shortage of workers, rising robot penetration is a question of when rather than if. However, despite these strong structural drivers, the segment’s expansion remains correlated to the capacity of its end markets to absorb it. Unfortunately, manufacturing industries are typically low-margin businesses with long investment cycles and some inertia. Add concerns about an economic slowdown, and the landscape for automation suppliers is not the most favorable. This has translated into disappointing returns this year, particularly for sectors where we had high expectations, such as warehouse automation.
Yet we remain convinced that players active in the space have reasons to believe, even (especially?) in the current challenging macroeconomic context. Automation is a significant cost saver, which should be inherently appealing to low-margin industries. The upfront-cost problem can be alleviated through ingenious pricing models (e.g., AutoStore with its pay-per-pick pricing, under which it bears all the installation costs and bills according to actual use). We therefore see potential for innovative players active in cost-sensitive fields (e.g., logistics) or bringing transformational capabilities (e.g., cobots), and believe that investors, having priced in a recession, will start to factor in the subsequent recovery. On the more traditional segments, we keep a firm conviction on Chinese automation, given the strong level of political support for the development of a local ecosystem, indispensable to remain competitive in the global economy. Our currently limited exposure to the robotics segment will therefore remain focused on automated logistics and Chinese players.
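The appeal of a pay-per-pick model to a cash-constrained operator can be shown with back-of-the-envelope arithmetic. All figures below are invented for illustration; they are not AutoStore's actual pricing:

```python
# Back-of-the-envelope comparison: upfront automation purchase vs pay-per-pick.
# All numbers are hypothetical, chosen only to illustrate the trade-off.
UPFRONT_COST = 1_000_000      # one-off purchase and installation (USD)
FEE_PER_PICK = 0.05           # vendor bears installation, bills per use (USD)
PICKS_PER_YEAR = 2_000_000

def pay_per_pick_cost(years):
    """Cumulative cost under the usage-based model after `years`."""
    return FEE_PER_PICK * PICKS_PER_YEAR * years

# Horizon beyond which buying outright becomes the cheaper option.
break_even_years = UPFRONT_COST / (FEE_PER_PICK * PICKS_PER_YEAR)
print(break_even_years)  # 10.0 in this hypothetical scenario
```

In this sketch, the operator pays nothing upfront and only overtakes the purchase price after a decade, which is exactly why such models lower the adoption barrier for low-margin industries.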
Augmented reality: better off in the shadows?
Meta Platforms Inc’s Metaverse has had its day. This is good news, as the hype around what was essentially a half-baked social network was completely obscuring the actual ongoing innovations in the field, such as virtual universes generated to train AI models. The most blatant illustration of how wrong Meta Platforms Inc’s approach was comes from Apple Inc, which, instead of trying to be a jack-of-all-trades, launched a headset initially targeting professional use cases – although the jury is still out regarding its potential commercial success. Still, we think this is indeed the way forward in hardware, as there is much more value in professional than in consumer environments for the foreseeable future.
However, from our perspective, the most exciting developments are on the software side. Computer-Aided Design (CAD) and Computer-Aided Engineering (CAE) are already well established but are gaining further traction with new generations of tools that allow far more complex tasks and deeper integration. Their natural extension is, of course, the digital twins we wrote about, which allow the lifecycle of a given asset to be simulated while staying connected to it through data feedback. Combined, these tools allow for substantial productivity gains and cost savings, which are particularly welcome in a challenging macroeconomic context.
Still, we see the holy grail as digitally simulated training universes, i.e., metaverses for machines. Such sandboxes allow more controlled and accelerated testing of algorithms before their deployment in real-world applications. This means lower development costs due to less reliance on development hardware and, ultimately, more safety for end-users, as a higher number of scenarios can be tested. When it comes to players active in the field, aside from usual suspects such as Unity Software Inc, we paradoxically see NVIDIA Corp as an emerging champion. It will take time for the company to shed its image as a pure hardware player, but this embodies our view that the theme will slowly but surely be driven by software instead of hardware.
Application: get ready, as this is only the beginning
The potential of AI is just starting to unfold
Think about it: 24 months ago, chatbots were considered useful gadgets at best, while art was considered intrinsically reserved for humans. Fast forward to the present day, and these conceptions have been blown to pieces by a technology (the transformer model) that was only invented in 2017 and went mainstream with ChatGPT in late 2022. Now, factor in that several dozen companies, ranging from startups to conglomerates, with billions in funding, are working day and night to improve and optimize the technology. No sector will escape it, with applications already targeting segments as varied as personalized marketing, education, healthcare, engineering, cybersecurity, energy, and climate change. And, for all the impressive things AI has produced in the past few quarters, they will undoubtedly look like protohistory in no more than a decade.
The race is more than on
We believe that, for all their flaws, AI applications have already started to have a major impact on businesses. This impact is mostly felt in terms of strategic questioning for now, as genAI remains too unreliable to implement in critical applications, limiting it to the experimental stage. However, not a month goes by without a critical breakthrough in capability: the progress of OpenAI’s GPT models since the launch of ChatGPT is a case in point. Multimodality, i.e., the ability to handle several types of inputs such as text and images, is about to become the norm and will lift the technology’s capabilities to levels that are hard to imagine today.
A race to innovation is ongoing. Behind it, a commercial rush has also started, as AI applications already command a pricing premium, exemplified by the paid versions of ChatGPT or Microsoft Corp’s Copilot. With the regulatory overhang removed and capabilities improving by the day, we do not see the trend reversing soon, although the technology will need to improve in terms of reliability. For well-defined applications, the tsunami will be impossible to stop.
In this regard, one of our strongest convictions remains Robotic Process Automation (RPA), i.e., the automation of basic and repetitive office tasks. The field is not new but will significantly improve in capability thanks to genAI. Most importantly, it targets productivity gains, which are crucial for businesses in a normal setting but become paramount amid economic headwinds. Similarly, we see specialized platforms targeting marketing and sales benefiting strongly, thanks to their capacity to drive conversion rates. Platforms targeting support functions of all kinds will also thrive, as these become low-hanging fruit.
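What RPA automates can be made concrete with a minimal sketch: rule-based routing of repetitive support requests. The keywords and queue names are invented for the illustration; production RPA suites add UI automation and, increasingly, genAI on top of such rules:

```python
# Minimal RPA-style task: route repetitive support tickets by keyword rules.
# Keywords and queues are hypothetical, for illustration only.
ROUTES = {"invoice": "billing", "refund": "billing",
          "password": "it_support", "login": "it_support"}

def route(ticket):
    """Assign a ticket to a queue; unrecognized ones escalate to a person."""
    text = ticket.lower()
    for keyword, queue in ROUTES.items():
        if keyword in text:
            return queue
    return "human_review"

tickets = ["Cannot login to my account", "Wrong invoice amount", "Other issue"]
print([route(t) for t in tickets])  # ['it_support', 'billing', 'human_review']
```

The productivity gain comes from the long tail of such small, deterministic tasks; genAI's contribution is to handle the phrasing variability that rigid keyword rules miss.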
Autonomous driving: in a ditch?
We are strong believers in autonomous driving, considering the significant economic and societal advantages of the technology. After positive signs in the past few years, recent developments, notably Cruise having to suspend operations in California following an accident, have somewhat curbed our enthusiasm. Not so much in terms of technology, considering the circumstances of the accident, but in terms of public trust. Such an event, which was bound to happen, can lead to significant slowdowns in progress, as both the public and regulators will want clarification and corrective actions. And indeed, for Cruise, the consequences were the departure of its CEO, funding cuts, and potential fines on the horizon. Due to a different approach to regulation, we think our China-based top pick in the segment, Baidu, is somewhat insulated – alas, it is also likely to be slowed by U.S. embargoes on AI chips. All in all, the segment will likely take some time to get back on track.
Catalysts
Technological breakthroughs. AI models are getting better by the day, with potential major breakthroughs around every corner. Such breakthroughs could create competitive advantages so significant that businesses would be compelled to make the switch.
Technical democratization. More and more tools focus on facilitating the deployment of AI models. If the field is no longer a matter for specialists, thanks to low-code/no-code approaches, a major growth ceiling would be removed.
Public perception. The perception of AI by the mainstream public has been ambiguous. The first major advanced applications have the potential to make a positive impression, creating a virtuous cycle.
Risks
Regulation. The first significant AI regulation, embodied in a U.S. executive order, was relatively permissive but left large parts to be addressed by future rulings. Those could be much more aggressive, potentially hampering the development of AI technologies.
Election scandals. 2024 will be an election year in the U.S. The staggering progress of AI deepfakes could trigger a major scandal, which could turn into a national crisis and fuel critical opposition.
Major economic slowdown. Despite its capacity to enable productivity gains, the theme is not insulated from a recession. Some early-stage companies could run out of funding, while some subsegments (e.g., automated logistics) are more exposed to a GDP slowdown than others.
Companies mentioned in this article
AMD (AMD); ARM (ARM); Alphabet (GOOGL); Apple Inc (AAPL); AutoStore (AUTO); Baidu (9888); Cruise (Not listed); Datadog Inc (DDOG); Dataiku (Not listed); Marvell (MRVL); Meta Platforms Inc (META); Microsoft Corp (MSFT); NVIDIA Corp (NVDA); OpenAI (Not listed); Snowflake (SNOW); Unity Software Inc (U)
This report has been produced by the organizational unit responsible for investment research (Research unit) of atonra Partners and sent to you by the company's sales representatives.
As an internationally active company, atonra Partners SA may be subject to a number of provisions in drawing up and distributing its investment research documents. These regulations include the Directives on the Independence of Financial Research issued by the Swiss Bankers Association. Although atonra Partners SA believes that the information provided in this document is based on reliable sources, it cannot assume responsibility for the quality, correctness, timeliness or completeness of the information contained in this report.
The information contained in these publications is exclusively intended for a client base consisting of professionals or qualified investors. It is sent to you by way of information and cannot be divulged to a third party without the prior consent of atonra Partners. While all reasonable effort has been made to ensure that the information contained is not untrue or misleading at the time of publication, no representation is made as to its accuracy or completeness and it should not be relied upon as such.
Past performance is not indicative of, nor a guarantee of, future results. Investment losses may occur, and investors could lose some or all of their investment. Any indices cited herein are provided only as examples of general market performance, and no index is directly comparable to the past or future performance of the Certificate.
It should not be assumed that the Certificate will invest in any specific securities that comprise any index, nor should it be understood to mean that there is a correlation between the Certificate’s returns and any index returns.
Any material provided to you is intended only for discussion purposes and is not intended as an offer or solicitation with respect to the purchase or sale of any security, and should not be relied upon by you in evaluating the merits of investing in any securities.