The ARM processor architecture has become ubiquitous in mobile devices and embedded systems. Since its creation in the 1980s, ARM processors have evolved to meet the needs of increasingly complex and power-efficient devices. This article provides an overview of the major milestones in ARM’s history and the development of the Cortex series of processor cores.
The Origins of ARM
ARM originally stood for Acorn RISC Machine (and later Advanced RISC Machine). The history of ARM begins with Acorn Computers, a British personal computer company. In the early 1980s, Acorn was seeking a processor to succeed the 6502 used in its BBC Micro home computer. Finding nothing suitable, it set out to design its own 32-bit processor on RISC (Reduced Instruction Set Computer) principles, which promised a simpler, more efficient design than the complex CISC (Complex Instruction Set Computer) processors popular at the time, such as the Motorola 68000.
In late 1983, a small team at Acorn led by Sophie Wilson and Steve Furber began work on this new RISC processor, named ARM for Acorn RISC Machine. The first chip, the ARM1, came back from fabrication in 1985. Built on a 3μm process, it contained about 25,000 transistors and ran at 6MHz.
The ARM1 served mainly as a development coprocessor for the BBC Micro; its follow-up, the ARM2, powered Acorn's Archimedes computer released in 1987. Although the Archimedes saw only modest success outside the UK education market, the efficiency of the ARM processors attracted interest from Apple. This eventually led to Apple licensing ARM cores for use in the Newton PDA released in 1993.
The Formation of ARM Holdings
As Acorn continued developing ARM processors in the late 1980s and early 1990s, they recognized the opportunity to license ARM cores to other companies. This would allow ARM architecture to be used in custom chip designs across the electronics industry.
In 1990, Acorn spun off its microprocessor division into a new company named Advanced RISC Machines Ltd, formed as a joint venture with Apple and chip manufacturer VLSI Technology and commonly referred to as ARM. This allowed the company to focus on developing and licensing the ARM instruction set architecture and cores rather than being tied to Acorn's fate.
Throughout the 1990s, ARM pursued a licensing model that was unique in the industry. Rather than just license an off-the-shelf core, ARM offered a platform for partners to create customized chips using the common ARM instruction set. This business model proved incredibly successful.
Through the 1990s, ARM signed licensees including Texas Instruments, Samsung, and Sharp, and its reach grew further when Digital's StrongARM designs passed to Intel. As devices like cellphones and PDAs adopted ARM processors — Nokia's GSM handsets, built on TI chipsets, were an early high-volume win — the installed base of ARM chips accumulated rapidly, and cumulative shipments passed one billion ARM-based processors in the early 2000s.
The Evolution of ARM Cores
The first cores developed by the newly independent ARM were the ARM6 family, used in Apple's Newton, followed by the hugely successful ARM7 family — most notably the ARM7TDMI, whose compact 16-bit Thumb instruction set made it a natural fit for cellphones — and then the ARM8, ARM9, and ARM10 families as ARM pushed performance higher through the 1990s.
The ARM11 family, launched in 2002, was a major milestone, introducing a high-performance, energy-efficient core in a small footprint designed for mobile applications. ARM11 cores powered early smartphones, including the original iPhone and the first Android handsets.
As smartphones and mobile devices became more complex, ARM continued innovating its core designs. In 2004, ARM introduced the ARM11 MPCore, its first multicore processor for mobile devices capable of running operating systems like Symbian, Windows Mobile, and Linux.
ARM introduced the Cortex brand in 2004 to mark a new generation of processor cores, split into three profiles: Cortex-A for applications, Cortex-R for real-time systems, and Cortex-M for microcontrollers. The first Cortex-A core was the Cortex-A8, announced in 2005 and designed for smartphones, home entertainment, and other devices requiring complex operating systems and multimedia support.
Later Cortex-A cores included the Cortex-A9, announced in 2007, which improved performance for multicore implementations, and the Cortex-A15, announced in 2010, which pushed ARM performance further into low-power servers and networking applications.
More recent high-end Cortex-A cores include the Cortex-A72, A73, and A75, built on more advanced manufacturing processes, while designs such as the Cortex-A77 bring laptop-class performance to mobile applications.
The Cortex-R series focuses on real-time applications that require predictable timing behavior and reliable execution. Cortex-R cores have found use in automotive systems, industrial equipment, and networking infrastructure.
The first Cortex-R core was the Cortex-R4, introduced in 2006. Later generations such as the Cortex-R5 strengthened support for safety-critical systems with features like dual-core lock-step execution. The 64-bit Cortex-R82 targets computational storage and edge workloads, including on-device machine learning.
ARM launched the Cortex-M series in 2004 to address the growing market for deeply embedded applications like IoT devices, wearables, and microcontroller-based designs. The Cortex-M cores are designed to be ultra low power while still providing good performance for 32-bit embedded software.
Over the years, the Cortex-M series has expanded to cover use cases ranging from tiny battery-operated devices to more complex embedded products. Recent additions like the Cortex-M55 integrate vector processing (Helium) to accelerate machine learning and DSP workloads at the edge.
The Importance of Architectural Licenses
A key business strategy that enabled ARM's widespread adoption was architectural licensing. Rather than licensing one of ARM's off-the-shelf cores, an architectural licensee gains the right to implement the ARM instruction set with a microarchitecture of its own design, giving it full control over the resulting chip while preserving software compatibility.
Early architectural licensees included DEC, whose StrongARM designs later passed to Intel as XScale, powering PDAs, cell phones, and networking gear. The model has continued into the ARMv8-A era, with Apple, Qualcomm, Samsung, and others shipping custom-designed cores in their mobile SoCs.
Architectural licenses also allowed ARM processors to be adapted for specialized applications like automotive, networking, servers, and AI acceleration. Although licensees must remain compatible with the ARM instruction set, they gain enough freedom to differentiate their designs while retaining access to the ARM software ecosystem.
The Rise of ARM in Mobile and Beyond
By the early 2000s, ARM cores started to dominate the mobile space. Early smartphones relied on ARM processors, eventually leading to the ARM vs x86 clash with Intel as computers moved toward mobile form factors.
The lightweight architecture, customizability, and power efficiency of ARM proved far better suited to mobile applications than x86. Smartphone giants like Apple, Qualcomm, Samsung, and others invested heavily in custom ARM-based designs.
The emergence of iOS and Android running on ARM cemented its status as the computing architecture of the mobile era. As ARM pushed performance upward with advanced Cortex cores and the 64-bit ARMv8 architecture, ARM chips began to encroach on the low-power laptop space as well.
Outside of mobile, ARM’s R and M series cores continue to excel in embedded, IoT, industrial, automotive, and other applications needing power efficiency and real-time capabilities. ARM’s hold in mobile has also enabled it to aggressively pursue server and networking markets competing with Intel x86.
The Future of ARM
ARM remains the dominant architecture for mobile and embedded devices given its flexibility, wide software and tool support, and sheer scale. ARM also has a growing presence in markets like networking infrastructure and automotive thanks to its power efficiency.
Challenges for ARM’s continued growth include increased competition, especially in servers, from alternative architectures like RISC-V. There are also questions around ARM’s model as the mobile market matures and shifts more toward integrated offerings from Apple, Qualcomm, and other giants.
Regardless, ARM maintains a towering position and influence over the computing landscape. The architecture's entrenched status in mobile ensures ARM processors will power the smartphones and mobile devices of the future, even as the architecture continues making inroads into new markets.
Having conquered mobile computing, ARM is poised to help define the next era of computing across a diversity of intelligent and connected devices.