ARM And Nvidia: What Does This Partnership Mean For Today’s Hottest Tech Trends

News this week that two key players in AI hardware are to come together in a $40 bn acquisition could have far-reaching consequences for today’s most important tech trends.

Technology developed by Nvidia and ARM is often hidden from sight but has provided the raw computing power that has led to recent advances in AI, the Internet of Things (IoT), autonomous cars, personalization, and wearables, as well as cloud and edge computing.


The proud new parent is Nvidia, the world’s leading manufacturer of graphics processing units (GPUs) – processors dedicated to crunching the complicated maths necessary to create state-of-the-art computer graphics in real time. In recent years, the same hardware has also proved to be the most effective tool currently available for processing the equally complex algorithms used in machine learning and deep learning applications.

Nvidia’s GPUs were famously adopted for deep learning research in the late 2000s – a development widely credited with enabling much of the field’s progress over the last decade – with Google’s Google Brain project among their early, high-profile adopters. They have also served as the AI “brain” in Tesla’s cars.

Nvidia is set to become the new owner of ARM, a UK-based developer of central processing units (CPUs) that is dominant in the smartphone market. ARM does not manufacture its own chips but licenses its designs to phone manufacturers worldwide, including Apple and Samsung. More recently, it has also designed specialized AI processing chips and an AI computing platform for developing and running industrial AI applications.

Although both companies have fingers in many pies, each dominates one particular field – GPUs for Nvidia and mobile processors for ARM. And the most obvious crossover in what they do is in the field of AI. Combining these strengths creates a partnership that is well-placed to lead in a world where smart, connected devices are becoming increasingly important to our lives.

ARM’s CPUs are already present in billions of devices worldwide, thanks to the growth of mobile and IoT. Strategically this makes them a great acquisition for Nvidia, as a vehicle for rolling out its deep learning technology to “the edge.” This is the fast-growing share of compute power dedicated to analyzing and interpreting data onboard the actual devices that capture it, rather than sending it to the cloud for processing by remote computers. Over the next few years, moving compute workloads to the edge is expected to lead to increased speed, energy savings, and security improvements across all processes and operations that are driven by data collection and analytics.

In short, a merger of resources and capabilities between Nvidia and ARM makes them uniquely positioned to capitalize on what will be the most important and impactful tech trends of the next decade. Much of this may be invisible to us as end-users, who, unless we want to build a high-specification gaming PC, might never buy a product or service marketed directly by either brand.

Aside from Nvidia and ARM shareholders, the winners here could be businesses that rely on leveraging these trends to generate new revenue streams built on providing data services. More effective integration of AI across the spectrum of devices that make up the IoT, from the edge to the cloud, means more opportunities to innovate through smarter, connected technology deployments. More intelligent and secure data capture and processing on our devices will lead to more useful (and safer) mobile applications, more capable of keeping up with the vast growth in the amount of data we have the ability to capture.

On the other hand, there are likely to be losers too – and in this case that could potentially mean the largest providers of CPUs to cloud data centers, Intel and AMD. Thanks to this duopoly, the bulk of the world’s machine learning workload so far has been carried out on Intel or AMD processors, often working alongside Nvidia GPUs in the cloud. This includes the core business functions carried out by web giants such as Amazon (shopping), Google (searching), and Facebook (socializing), as well as those carried out on their platforms by third-party service providers such as Netflix or Uber.

With the move to edge and mobile computing, where ARM is the undisputed leader, this is likely to change. More data collection and analytics can be carried out directly on your phone – such as the personal data-driven calculations Amazon runs to work out what to try to sell you, or those Netflix or Spotify run to predict what you want to watch next. This could lead to greater levels of personalization and more useful predictions – with no need for the tech companies carrying out the work to even see the data on which the decisions are based.

This week’s announcement of the $40 bn deal might not come as a huge surprise to anyone who has been following the two companies closely – they have cooperated frequently in the past, and Nvidia has been clear about its intention to move into edge devices. Now that they are set to combine forces fully, their impact on the future direction of today’s hottest trends could be dramatic.