SINGAPORE (Feb 26): One of the hottest tech companies in the world today neither makes trendy, state-of-the-art smartphones nor dominates the e-commerce space. It just designs chips that are used in artificial intelligence (AI)-powered devices. Founded by Taiwanese-born engineer Jen-Hsun Huang, Nvidia has risen to become a graphics chip powerhouse with a market capitalisation of US$152 billion ($200.7 billion), making it the world’s No 2 pure-play chipmaker behind Intel (market value of US$220 billion).
Huang, who has been described as a tech visionary in the mould of Apple’s Steve Jobs, Tesla’s Elon Musk and Amazon.com’s Jeff Bezos, bet early on that the ground would shift away from Intel, which has long dominated microprocessors in PCs. He wagered rightly that the future was voice and video rather than just text and images, and that it would require increasingly sophisticated graphics chips. “Nvidia is a platform company with a highly unique and leveragable architecture for some of the fastest growth markets in technology, including gaming, artificial intelligence and autonomous cars,” Vivek Arya, analyst for Bank of America Merrill Lynch, who has been bullish on the firm for years, gushed in a recent report about the company.
Nvidia has also been one of the world’s best-performing tech stocks and a darling of retail investors. Last year, CNBC’s stocks guru Jim Cramer named his pet dog Everest Nvidia. Its shares are up 26% just this year, 138% since January last year, 1,889% over the past five years and a whopping 3,804% from its post-financial crisis lows of late 2008. Last May, Japan’s SoftBank Group, a key investor in AI-related technology firms, took a 5% stake in Nvidia, which further fuelled the stock.
Nvidia developed its niche years ago as a maker of graphics chips for video game consoles such as Sony Corp’s PlayStation, Microsoft Corp’s Xbox and Nintendo Co’s Switch. Its recent ascent has been powered by stronger demand for its core graphics chips, which have found new growth drivers in AI, machine learning and deep learning. Nvidia has also been a key beneficiary of the huge boom in cloud services, as a producer of chips optimised for the data centres that Amazon, Microsoft, Google, IBM and Oracle Corp use to run remote servers.
Designed for sophisticated gamers, Nvidia’s graphics processing units (GPUs) were an accidental choice for training AI systems as well as for deployment in machine and deep learning. Many independent AI researchers and institutions started buying graphics cards meant for gaming and using them to run AI algorithms, because GPUs can perform thousands of mathematical calculations in parallel.
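The fit between gaming GPUs and AI comes down to data parallelism. A toy sketch (in plain Python, not any Nvidia API) of a dense neural-network layer shows why: each output neuron’s dot product is independent of the others, so a GPU can compute them all simultaneously instead of one after another.

```python
# Illustrative sketch: a dense neural-network layer is a matrix-vector
# product. Every row's dot product is independent of the others, so on a
# GPU they would all run in parallel across thousands of cores.
def dense_layer(weights, inputs):
    # One dot product per output neuron; no row depends on another row.
    return [sum(w * x for w, x in zip(row, inputs)) for row in weights]

weights = [[0.5, -1.0], [2.0, 0.0], [1.0, 1.0]]  # 3 neurons, 2 inputs each
inputs = [1.0, 2.0]
print(dense_layer(weights, inputs))  # [-1.5, 2.0, 3.0]
```

The same structure repeats across every layer and every training example, which is exactly the workload graphics cards were already built to accelerate.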
Catherine Wood, chief investment officer of New York-based, tech-focused asset management firm ARK Investments, says demand for AI-related chips, hardware and software is off the charts. Atif Malik, Citigroup’s semiconductor analyst in San Francisco, says: “Convergence of computational power, big data and machine learning algorithms is driving major innovations in image and speech recognition and language translation.” Self-driving cars, virtual assistants such as Siri and Alexa, robotics, game mechanics, healthcare and security are just some of the applications driving demand for AI chips. IT research consultancy IDC projects AI-related hardware, software and services revenues will jump to US$58 billion by 2021 from just US$12 billion last year.
Cars of the future
Among Nvidia’s biggest initiatives are its forays into next-generation autos. Nvidia has long supplied chips to luxury car makers such as Daimler, the maker of Mercedes-Benz, for in-car infotainment systems that power audio, video, navigation systems, Bluetooth and USB connectivity. Now, it is expanding into Level 3 and Level 4 autonomous car platforms — which still require a driver but have a high degree of automation — with its chips and software. With its deep knowledge of data centres and cloud infrastructure, Nvidia is working on a cloud-to-car solution, a system for training deep neural networks in data centres to help enable safer autonomous driving. Among its partners are Audi and Volkswagen as well as Uber, which are using its chips and software for their driverless cars. China’s Baidu is also partnering with Nvidia for its own autonomous vehicles.
To be sure, 2017 was the year when all the stars aligned for Nvidia. Just as it had begun firing on all cylinders at the start of last year, Nvidia stumbled on another winner: cryptocurrency mining. As bitcoin prices surged from US$900 to US$19,900 in 2017, there was sudden demand for some of its more powerful gaming chips from crypto “miners”, who attempt to earn crypto assets such as bitcoin, Ethereum, Litecoin and others by using high-powered computers to solve the complex cryptographic puzzles that enable a transaction to be verified and added to the blockchain. The first miner to unravel each puzzle is rewarded with a stash of newly issued virtual coins.
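The “puzzle” in proof-of-work mining is a brute-force search, which is why it rewards raw parallel compute. A hypothetical toy miner (not real bitcoin code; real networks use a vastly harder target) illustrates the idea: hash the block data with trial nonces until the digest falls below a difficulty threshold.

```python
import hashlib

# Toy proof-of-work miner: find a nonce whose SHA-256 digest of the block
# data starts with `difficulty` zero hex digits. Every nonce can be tried
# independently, so the search parallelises perfectly across GPU cores.
def mine(block_data: str, difficulty: int) -> int:
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty):
            return nonce  # first nonce that satisfies the target
        nonce += 1

winning_nonce = mine("block with pending transactions", 4)
print(winning_nonce)
```

Each extra zero digit makes the search roughly 16 times harder, so miners who can try more nonces per second win more rewards, which is what fuelled the run on Nvidia’s cards.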
For its part, Nvidia has been at pains to stress that direct sales of chips to crypto miners accounted for a tiny portion of its revenues last year, under 3%, despite the hefty prices that the miners continue to pay on the open market to get hold of its chips amid shortages.
Amit Daryanani, analyst for RBC Capital Markets in San Francisco, believes Nvidia’s traditional gaming chip business will continue to grow more than 15% annually, thanks to the rise of e-sports and more graphically intense video games. Yet, at the same time, its newer segments, such as chips for servers and autonomous cars, will grow much faster as they burgeon into massive money spinners.
Competitors rev up
Can Nvidia maintain its huge lead in AI chips? For starters, chips used in data centres are in such high demand, and margins have been so consistently high, that a slew of other companies are ready to take a slice of the fast-growing pie. Among them are Intel, which will have its Knights Mill, a processor from its famed Xeon Phi line that has been tweaked for machine learning; and Advanced Micro Devices, whose Naples server chip was introduced late last year.
Google’s parent Alphabet has been working on its own AI chip, called the tensor processing unit. The application-specific integrated circuit (ASIC) was designed to provide more energy-efficient performance for its search engine’s deep-learning AI applications, which are capable of learning by processing massive quantities of data. Another player is programmable logic device maker Xilinx, whose field-programmable gate array (FPGA) chips optimised for machine learning are already being used by Amazon Web Services and by cloud services operated by Chinese search engine giant Baidu. There is also a growing number of well-funded start-ups in California working on chips specifically designed for cloud infrastructure and data centres. Nvidia has been so successful that everyone wants to imitate the chip giant.
A bigger challenge for Nvidia may come from Apple and Amazon, which harbour ambitions of becoming chip giants almost as powerful as, if not more powerful than, Nvidia. Though mostly known for its iconic iPhones, iPads, Apple Watch and iMacs, Apple is already a formidable chip player. In 2008, it bought PA Semi, a chip design firm that has helped it become a global semiconductor powerhouse in its own right. Apple now designs its own chips that run its iPhones and iPads. Because hardware built from outsourced components can be easily copied, in-house chips help Apple differentiate its products from those of rivals and improve the user experience.
Apple recently also began designing its own power-management chips, which handle charging, battery life and energy consumption in iPhones, reducing its reliance on current supplier Dialog Semiconductor. Last year, it began designing its own graphics chip and severed ties with its supplier Imagination Technologies. Apple also developed a neural engine as part of its new A11 Bionic chip — a cutting-edge processor used in the iPhone X that can handle many AI functions locally, which vastly reduces the amount of user information transmitted to the cloud, thereby helping to secure the data. In addition, Apple is part of a private-equity-backed consortium that recently took control of Toshiba Corp’s memory chip unit.
Enter challenger Amazon
While Apple makes chips mostly for its own use, Nvidia’s real challenge is likely to come from one of its key customers, e-commerce behemoth Amazon, the world’s largest cloud service provider. San Francisco-based tech website The Information reported recently that Amazon was developing a chip designed for AI to work on the Echo and other hardware powered by Alexa, its virtual assistant. The custom chip will be able to do on-device processing instead of relying on a device’s connection to the cloud. Currently, when a user makes a request to Alexa, the information is transmitted to the cloud, processed there, and a response is sent back to the device. So, when you ask Alexa to do a complicated task, there is a slight delay in the response, and the communication can be intercepted by hackers even though it may be encrypted.
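The latency argument can be seen in a hypothetical sketch (not Amazon’s actual architecture): a cloud-backed assistant pays a network round trip on every request, while an on-device AI chip answers locally with no network hop. The 50 ms delay below is an assumed figure for illustration only.

```python
import time

NETWORK_DELAY = 0.05  # assumed 50 ms each way; real-world latency varies

def cloud_process(request: str) -> str:
    time.sleep(NETWORK_DELAY)           # request travels to the cloud
    response = f"processed: {request}"  # heavyweight model runs remotely
    time.sleep(NETWORK_DELAY)           # response travels back
    return response

def on_device_process(request: str) -> str:
    return f"processed: {request}"      # local AI chip, no network hop

start = time.perf_counter()
cloud_process("what time is it")
print(f"cloud round trip took {time.perf_counter() - start:.2f}s")
```

Eliminating the round trip is also what shrinks the attack surface: data that never leaves the device cannot be intercepted in transit.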
An AI chip that is able to recognise speech locally on the device will help drastically cut response time, make the communication less vulnerable to hackers and dramatically enhance the capabilities of Alexa and Amazon hardware. Amazon has quietly developed chip-making capabilities over the past two years, first through a US$350 million acquisition of Annapurna Labs, a secretive Israeli chipmaker, and, more recently, through the hiring of the semiconductor industry’s top talent from Intel, Nvidia, Apple and Qualcomm.
Nvidia may have met its match in cryptocurrencies. Crypto mining is so much in vogue that it has caused a global shortage of graphics chips so severe that there is little or no supply left for PC gamers who want to put a new graphics chip in their desktop to play some of the newer games. Nvidia’s GeForce 1080 Ti GPU, launched last year for US$700, currently sells online for at least twice the list price — that is, if you can get hold of one. Powered by soaring demand, Nvidia’s gaming chip revenue surged an annualised 29% in the October-to-December quarter to US$1.7 billion. Some of the more bullish analysts expect Nvidia’s gaming chip segment revenues alone to surpass US$9 billion this year. Gaming chips still make up half of Nvidia’s total revenues, though other segments are growing much faster.
ARK’s Wood says that, while the recent crypto mining boom has been a boon for Nvidia, it may not remain a viable source of revenue growth over the long term. “Ethereum is changing its core algorithm from proof-of-work to proof-of-stake, potentially obviating the need for crypto mining and GPUs,” she says. “While GPUs will be used to mine other cryptos, like bitcoin, their wild popularity based on Ethereum — the second most valuable crypto asset today — may prove fleeting.”
RBC’s Daryanani disagrees: “Even if the algorithms change to proof-of-stake in the long term, the need to build out a decentralised world computer will command material compute power benefiting companies such as Nvidia.”
Nvidia founder Huang himself is probably not overly worried about Amazon, Apple or Google chasing their own AI chip dreams, or indeed Intel, where he once worked as an engineer. What keeps him awake at night is China, which has an audacious 30-year blueprint to dominate AI and has identified AI chips as a centrepiece of the strategy. Beijing is pouring billions of dollars into chip design start-ups such as Horizon Robotics as well as into R&D of AI chips. That China will continue to close the gap is not in doubt, but it could still take years before it does. “Nvidia is fairly well positioned to dominate the graphics chips segment for the next few years,” says Daryanani. Its data centre chip business, which has grown at triple-digit rates over the last two years, is expected to chalk up another 100% growth this year and next. With net profits expected to grow 34% this year to US$4.3 billion and gross margins of more than 62%, Nvidia, for all its long-term challenges, remains in acceleration mode for now. It will probably take a sharp slump in tech spending to slow its momentum.
Assif Shameen is a technology writer based in North America