Monday, January 22, 2018

Silicon wars heat up in 2018 – Intel at CES

With the start of the new year comes the massive Consumer Electronics Show in Las Vegas and with it a flurry of technology announcements from the major silicon companies. The publicity focus is not just on shiny new electronics but on a range of new technologies driving cloud services, fog computing, edge data centres and the future of network connectivity. A big battle is clearly shaping up for the brains of autonomous vehicles, and the same can be said for the connected home, smart cities, healthcare, entertainment and gaming.

Intel takes the stage

Before launching into his annual CES keynote on the near-term future of processing, Intel’s Brian Krzanich first needed to publicly respond to the “Spectre” and “Meltdown” security threats that have dominated tech news. He said Intel has not yet seen a real case where these vulnerabilities have led to a cyber exploit. Nevertheless, Intel will issue software patches for 90% of its products in the coming days, with fixes for the remainder to follow. Intel is going to some lengths to assure customers that these software patches will have minimal performance impact. Already, Amazon Web Services, Microsoft, and Google are stating that their public cloud computing resources are secure and experiencing only minimal performance variations. For assurance, Intel cites the following:

  • Apple: “Our testing with public benchmarks has shown that the changes in the December 2017 updates resulted in no measurable reduction in the performance of macOS and iOS as measured by the GeekBench 4 benchmark, or in common Web browsing benchmarks such as Speedometer, JetStream, and ARES-6.”
  • Microsoft: “The majority of Azure customers should not see a noticeable performance impact with this update. We’ve worked to optimize the CPU and disk I/O path and are not seeing noticeable performance impact after the fix has been applied.”
  • Amazon: “We have not observed meaningful performance impact for the overwhelming majority of EC2 workloads.”
  • Google: “On most of our workloads, including our cloud infrastructure, we see negligible impact on performance.”

This may not be the end of the trouble for Intel or its CEO, who is facing questions about his reported sale of stock options during the period before the vulnerabilities were made public.

The data tsunami will benefit society

Krzanich used his CES keynote to highlight the opportunities brought about by the explosion of data. This tsunami of data is driving the next great wave of the technology revolution and, with it, profound social change.

“Data is going to introduce social and economic changes that we see perhaps once or twice in a century,” Krzanich said. “We not only find data everywhere today, but it will be the creative force behind the innovations of the future. Data is going to redefine how we experience life – in our work, in our homes, how we travel, and how we enjoy sports and entertainment.”

For technologists, the main questions are how much data will be generated, where the processing will occur, and what the storage and networking requirements will be. For example, Intel says:

  • The average Internet user consumes about 1.5 GB of data per day
  • An autonomous vehicle will generate about 4 TB per day
  • A connected airplane will generate 40 TB per day
  • A smart factory could generate 1 petabyte of data, roughly equivalent to the output of 700,000 people playing with smartphones (a quick check of that equivalence follows below)
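
That factory comparison checks out against the per-person figure above: 700,000 people times 1.5 GB per day is roughly one petabyte. Here is a minimal back-of-the-envelope check in Python; treating the factory figure as a daily total is an assumption, since Intel does not state the time unit.

    # Back-of-the-envelope check of Intel's data-volume comparison.
    # Assumption: the smart-factory figure, like the per-person figure, is per day.
    GB = 10**9   # decimal gigabyte, as used in marketing figures
    PB = 10**15  # decimal petabyte

    per_person_per_day = 1.5 * GB   # average Internet user
    smartphone_users = 700_000      # Intel's stated equivalence

    total = per_person_per_day * smartphone_users
    print(f"{total / PB:.2f} PB")   # ~1.05 PB, consistent with the ~1 PB factory figure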


Conventionally, data has been stored for later processing. Today’s applications increasingly presume that it will be processed and analysed in real time. Virtual reality and augmented reality devices will need to perform real-time stitching of multiple video streams. Only some of the terabytes of data need to be carried over the network, but the latency requirements will be tight.

Immersive media

For CES, Intel is putting a heavy emphasis on “immersive media.” For example, one experiment being conducted with the National Football League places dozens of connected, 5K video cameras around an athletic field. The image streams are stitched together in real time. The system calculates volumetric pixels – called voxels – which can be viewed from any angle, creating an immersive experience for the viewer.  The first set-up in this experiment is producing 3 TB of data per minute.
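
Intel has not detailed the internals of that pipeline, but the basic idea behind a voxel representation is simple: the space above the field is divided into a regular 3D grid of cells, and reconstructed surface points from different cameras that fall into the same cell are fused into one volumetric element. The Python sketch below is purely illustrative; the 2 cm cell size and the sample coordinates are assumptions, not NFL or Intel parameters.

    # Hypothetical sparse voxel grid; the 2 cm cell size is an illustrative
    # assumption, not an Intel/NFL specification.
    CELL = 0.02  # metres per voxel edge

    def world_to_voxel(x, y, z):
        """Map a world-space point (metres) to an integer voxel index."""
        return (int(x // CELL), int(y // CELL), int(z // CELL))

    # Sparse store: only voxels the cameras actually observe get an entry.
    voxels = {}  # (i, j, k) -> RGB colour

    # Two cameras reconstruct nearly the same surface point; both observations
    # land in the same voxel, which is how the separate views get fused.
    for point in [(55.301, 20.105, 1.799), (55.309, 20.112, 1.795)]:
        voxels[world_to_voxel(*point)] = (200, 180, 160)

    print(len(voxels), "occupied voxel(s)")  # 1: the observations merged

Finer cells and more cameras multiply the number of observations quickly, which helps explain the 3 TB per minute Intel quotes.
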
Krzanich also announced the launch of Intel Studios, which has just completed the construction of a state-of-the-art sound/video stage in Los Angeles that is capable of stitching video streams from 100 cameras. The 10,000-square-foot dome is described as “the world’s largest volumetric video stage.” The post-production room is equipped with Intel-powered graphics workstations and servers with the ability to crunch over 1 TB of data every 10 seconds. Paramount Pictures is the first major Hollywood studio to sign up as a partner.

Neuromorphic computing

Intel Labs has developed a prototype chip called “Loihi” that is based on the principles of neuromorphic computing, meaning that it aims to mimic the processing of the human brain. Loihi combines training and inference on a single chip. The researchers say it mimics the natural learning process by forming new connections between neurons, which in hardware means reprogramming the transistor connectivity on the chip. The Loihi chip is currently capable of rudimentary image recognition in the lab. Intel plans to share it with research institutes later this year.
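
Intel has released only high-level details of Loihi, but the building block of most neuromorphic designs is the spiking neuron: it integrates incoming spikes, fires when a threshold is crossed, and its connection weights adapt based on activity, so learning happens in the same silicon that does inference. The sketch below is purely illustrative; the constants and the simple Hebbian-style update are assumptions, not Loihi’s actual learning rule.

    # Toy leaky integrate-and-fire neuron with a Hebbian-style weight update.
    # Purely illustrative: not Intel's Loihi architecture or learning rule.
    THRESHOLD = 1.0   # membrane potential at which the neuron spikes
    LEAK = 0.9        # per-step decay of the membrane potential
    LEARN = 0.05      # weight increment when input and output spike together

    weights = [0.3, 0.5, 0.2]   # synaptic weights from three input neurons
    potential = 0.0

    def step(input_spikes):
        """Advance the neuron one time step; returns True if it fires."""
        global potential, weights
        potential = potential * LEAK + sum(
            w for w, s in zip(weights, input_spikes) if s
        )
        fired = potential >= THRESHOLD
        if fired:
            potential = 0.0
            # Hebbian-style plasticity: strengthen synapses that contributed.
            weights = [w + LEARN if s else w for w, s in zip(weights, input_spikes)]
        return fired

    # Feed a repeating input pattern; the contributing synapses strengthen over
    # time, the "learning while running" property neuromorphic chips target.
    for t in range(10):
        fired = step([1, 1, 0])
        print(t, fired, [round(w, 2) for w in weights])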

Intel is also announcing the design, fabrication and delivery of “Tangle Lake,” a 49-qubit superconducting quantum test chip. This is quite a gain over the 17-qubit design that Intel announced a few months back, but it is clearly still a prototype for ongoing research. Krzanich said Intel’s goal is to produce a complete quantum computing system – from architecture to algorithms to control electronics. The company reckons that we are still five to seven years away from overcoming the manufacturing challenges required for a commercial product. Getting to a commercial system will probably require chips with one million or more qubits on board. Delft University of Technology in the Netherlands is one of Intel’s research partners.
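
For a sense of what those qubit counts mean: describing a general n-qubit state classically takes 2^n complex amplitudes, so every added qubit doubles the bookkeeping. A small illustration follows; the 16 bytes per amplitude assumes a standard double-precision complex number.

    # How the classical description of a quantum state grows with qubit count:
    # a general n-qubit state needs 2**n complex amplitudes.
    BYTES_PER_AMPLITUDE = 16  # assumption: two double-precision floats

    for qubits in (17, 49):
        amplitudes = 2 ** qubits
        size_bytes = amplitudes * BYTES_PER_AMPLITUDE
        print(f"{qubits} qubits: 2**{qubits} = {amplitudes:.2e} amplitudes, "
              f"~{size_bytes:.2e} bytes")
    # At 49 qubits the description already runs to petabytes, and each added
    # qubit doubles it again.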

Moving fast with autonomous vehicles

The highlight of the 2018 CES show for Intel is its progress with autonomous vehicles. It was just about one year ago that Intel agreed to acquire Mobileye, a developer of machine vision systems for automated driving, for about $15 billion.

Mobileye, which is based in Israel and is now a wholly-owned division of Intel, holds the leading market position in computer vision for Advanced Driver Assistance Systems (ADAS). Its portfolio includes surround vision, sensor fusion, mapping, and driving policy products. Mobileye's EyeQ chips are already installed in about 20 million vehicles.

This installed base of Mobileye vehicles provides a strategic crowdsourcing mechanism for Intel and its auto manufacturing partners to develop the highly accurate maps needed to gain centimetre precision in the guidance of autonomous vehicles. Up to 2 million vehicles from BMW, Nissan and Volkswagen are now expected to use the Mobileye Road Experience Management (REM) technology to crowdsource this type of data.
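
The statistical intuition behind that approach is straightforward: a single vehicle fixes a landmark only to within roughly metre-level error, but averaging many independent observations shrinks the error roughly with the square root of their number, which is how a large fleet can reach centimetre precision. The toy illustration below makes the point; the 1 m per-vehicle noise and the landmark position are assumptions for the example, not REM parameters.

    import random

    # Toy illustration of crowdsourced mapping: many noisy observations of the
    # same landmark average out to a much more precise position estimate.
    # The 1.0 m per-vehicle noise is an assumption, not a Mobileye REM figure.
    random.seed(42)
    TRUE_POSITION = 104.37    # landmark position along a road, in metres
    PER_VEHICLE_NOISE = 1.0   # std. dev. of a single vehicle's estimate

    for n_vehicles in (1, 100, 10_000):
        observations = [random.gauss(TRUE_POSITION, PER_VEHICLE_NOISE)
                        for _ in range(n_vehicles)]
        estimate = sum(observations) / len(observations)
        error_cm = abs(estimate - TRUE_POSITION) * 100
        # Errors typically shrink as more vehicles contribute observations.
        print(f"{n_vehicles:>6} vehicles -> error ~{error_cm:.1f} cm")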

Mobileye’s upcoming EyeQ4 and EyeQ5 chips for Level 3/4 autonomous driving programs go into production in 2018 and 2020, respectively. Mobileye currently has OEM relationships with GM, VW, Honda, BMW, PSA, Audi, Kia, Nissan, Volvo, Ford, Renault, Chrysler, SAIC and Hyundai. Intel’s latest automated driving platform combines automotive-grade Intel Atom processors with Mobileye EyeQ5 chips to support Level 3 (L3) through Level 5 (L5) autonomous driving.

Flying taxis too

Intel is working with Volocopter, a start-up based in Germany that is developing autonomous, fully-electric, vertical take-off flying machines. The company plans to offer air taxi services in major cities.  The prototype uses the same technology that Intel is supplying to drone manufacturers.

Parting ways with Micron on future 3D NAND

While most of Intel’s CES news is about building partnerships, there was one item moving in the opposite direction. Micron and Intel agreed to work independently on future generations of 3D NAND. The companies had previously been engaged in a partnership for NAND memory and are currently ramping products based on their second-generation, 64-layer 3D NAND technology. The new business arrangement will take effect after the companies complete development of their third-generation 3D NAND technology, which is expected to be delivered toward the end of this year and into early 2019. Neither company expects a change in the cadence of its 3D NAND technology development for future nodes. Intel and Micron will also continue to jointly develop and manufacture 3D XPoint at their joint-venture fab in Lehi, Utah, which is now entirely focused on 3D XPoint memory production.