Intel at 50



A recap and preview of Intel’s best contributions to the embedded industry.

Founded in 1968 as “NM Electronics” by a couple of Fairchild refugees, Robert Noyce and Gordon Moore, and quickly renamed Intel®, the company will turn 50 in 2018. Closing 2015 with sales of $55.4 billion, Intel is a powerhouse not just financially but as a continuing influence on the tech industry. While Intel’s failure to carve out a meaningful beachhead in the smartphone market is often cited as evidence of the company’s fall from grace, Intel continues to exert a disproportionate influence on the embedded market.

It will do so for many years to come.

Looking back

Intel’s first MOS SRAM arrived in 1969, followed by the company’s 4004 microprocessor and EPROM in 1971. Interestingly, experts say that the EPROM was the last major innovation in computer memory technology (after DRAM and SRAM). Yet it is Intel (with Micron) that is bringing a new memory technology to market, called 3D XPoint, perhaps in 2016-2017.

According to Intel’s technology museum, as reported by www.techrepublic.com, Intel’s first 8-bit MPU was the 8008 in 1972, followed by the wildly successful 8080 MPU in 1974 (Figure 1). In 1981, IBM chose Intel’s 8088 for the new IBM PC on the condition that a second source was available (it was: Advanced Micro Devices, AMD). As the PC changed the industry, Intel evolved from the 8088 to the 80286 and all the way to 80486 CPUs. Along the way, the company radically changed technology marketing with the Intel Inside campaign, which brought mainstream consumers’ attention to the actual CPU.


Figure 1: Intel’s 8080 was a commercial success in 1974. (Courtesy: www.wikipedia.com and Konstantin Lanzet.)

Intel participated with Xerox and Digital Equipment Corporation (DEC) in the development of Ethernet in 1980, then went on in 1998 to buy substantial assets from DEC, including the StrongARM processor. In a bona fide stumble, Intel had the chance to beat ARM at its own game, but remained singularly focused on its own architecture: what might have been the “ARM 10” from Intel, alas, never was. Intel introduced various Pentium® processors throughout the late 1990s while ratcheting up performance, clock speed, and heat.

Soon after “Y2K” (the year 2000), consumers were complaining about poor battery life in their too-hot laptops, and upstart Transmeta got Intel’s attention with a lower power x86-compatible processor (Figure 2). Intel tapped its Israeli design team to build a lower power Pentium, and the resulting Pentium M (“Banias”) shipped in 2003, marking the beginning of the end of Intel’s race for ever-higher clocks and ushering in the era of multicore processors. Also in 2003, Intel introduced the Centrino® platform, which few people realize was the first instance of pairing a processor with wireless networking (not monolithically, though). Today, no one can imagine computing solely over wired Internet. Intel saw the future and made it possible.


Figure 2: Transmeta’s Crusoe TM5600 got Intel thinking about lower power. (Courtesy: Wiki Commons; author Futase_tdkr).

A string of Intel processor and architecture innovations followed, starting with the first Intel Core architecture, followed by Core™ Duo, Core™ 2 Duo, and a rapid march to today’s 4th, 5th, and recent 6th generation Core processors, code-named “Skylake” (Figure 3). Along the way, Apple announced its switch to Intel processors in 2005, ditching PowerPC and completely changing Apple’s future. Intel processors have been inside myriad Apple products since then, although Intel has never supplied a processor for an iPhone (only modems and peripheral ICs).


Figure 3: Intel’s 6th Generation Core architecture, Skylake, is the pinnacle of Intel’s processor efforts to date. (Courtesy: Intel.)

Beyond processors and memories, Intel has worked quietly behind the scenes on Wi-Fi, PCI Express, SATA, Linux, Android, and all manner of Windows software. If it relates to just about any market where Intel sells ICs, it’s a fair bet that Intel has engineers and developers working on it. Yet the company rarely gets credit for this behind-the-scenes effort.

Looking ahead

As Intel approaches its 50th, the company’s pipeline is impressive. Already its 14nm FinFET (tri-gate) process technology has inspired companies like TSMC to work harder at their own process technologies, driving FinFETs mainstream in 2015 to the benefit of the whole market. As well, Intel’s continued collaboration with Micron on 3D XPoint memory promises either to merge DRAM with flash for better performance, or to essentially assure designers that all memory will eventually be nonvolatile. It’s being called a disruptive technology.

And Intel’s complete dominance of the server and “cloud” space, with Xeon processors, data plane software like DPDK, and instruction set extensions like AVX, has made other companies take notice. A bigger ecosystem always expands a market, and ARM is aiming aggressively at Intel’s hegemony in servers. At the same time, ARM is driving some of what it learns there down into the lower power/cost end of its product line.

At IDF 2015, Intel made the audience collectively scratch their heads at what the company is pondering in wearables and connected IoT devices. Since introducing the low power, 32-bit Quark™ SoC two years ago, Intel has been demonstrating myriad examples of Intel really, really inside all kinds of systems, from clothing and RealSense™ cameras to medical patient monitoring systems.

There doesn’t seem to be a traditional, cohesive (public) roadmap for how Intel plans to get inside every connected IoT doodad. However, the second-generation Quark will be even lower power (and lower clock frequency), moving into ARM Cortex-M territory. If Intel repeats its 2003 Centrino success (CPU + Wi-Fi) with Quark, expect to see Quark + cellular on an SoC. That powerful combination is exactly what’s needed to reach the 20 billion to 50 billion connected IoT devices universally forecast.

In short: in the next two years, Intel will continue to fuel the market in process technology, performance, and wearables.
