Chip technology: is Intel losing touch?


Lisa Su had an impossible job. In just a few years before her, three CEOs had failed to get the American chipmaker AMD back on track, had failed even to secure AMD a comfortable niche in the shadow of the giant Intel.

But now, after a few years at the top, the Taiwan-born, 51-year-old engineer has shown that there is another way. Not only do the two new competing game consoles, Microsoft's Xbox and Sony's PlayStation, run, like their predecessors, on AMD processors. When it comes to the hearts of desktop computers, the central processing units (CPUs), AMD is technologically far ahead of its big competitor. AMD's share price is soaring, while Intel's has slumped.


Lisa Su, PhD in engineering and head of the chip manufacturer AMD.

(Photo: AMD)

But the resurgent, technologically superior competitor is not Intel's only problem. A few days ago, Apple revealed the first detailed information about its new M1 CPU. Thorough independent tests are still pending, but it is already clear that Apple has managed to develop a chip with a big lead in the class of energy-efficient CPUs.

Apple has already announced that, after a transition period of two years, all of the company's computers will be converted to the new platform. Intel as a supplier of CPUs is out of the game. The chip company dominated the market for decades; PC chips are even referred to generically as "Intel architecture", even when the CPU in question comes from AMD, for example. At first glance, losing Apple as a customer is not too painful for the company from Santa Clara, California. Apple computers account for only a small percentage of all computers sold, and Intel earns most of its money, around 60 percent, not with consumer devices but with professional applications such as chips for server computers.


But this business is also in danger, due to a development of which Apple's sensational success is only a symptom. Apple developed its CPU based on a design from ARM. ARM, a UK-based company, designs chips but does not produce them; it only sells licenses for those designs to others, including Apple, Samsung, and Qualcomm. ARM-based chips were initially used primarily in mobile devices, where, in addition to sufficient performance, energy efficiency is particularly important; otherwise the batteries would drain too quickly.

But over the years, ARM CPUs have become faster and faster without mutating into energy guzzlers. One reason is that ever more transistors could be fitted into the same area: Apple's M1 packs a gigantic 16 billion of these tiny switching elements onto its silicon die. There are still no ARM chips for high-performance PCs, but if the development of recent years continues, their arrival seems only a question of time. Then the lucrative server market, which has become enormously important with the cloud boom, would also be in danger.

Intel should get its internal problems under control

Intel, still the largest chip manufacturer in the world, is far from dead. But the company must get its internal problems under control. Far too often in recent years, deadlines for promised development steps had to be scrapped, partly because the new manufacturing processes, which allow more transistors to fit in the same area, could not be brought up to speed in the company's own fabs, the high-tech facilities specialized in chip production. Meanwhile, Santa Clara is even considering outsourcing production to external specialists, as AMD under Lisa Su has long since done.


If that works and Intel manages to achieve its goals, the group could gather strength for further development, which will still require a great deal of research. It is well known that the miniaturization of chips cannot go on forever. Technologies such as quantum computers are still too far from mass-market readiness; in principle, they are unsuited to many computing tasks anyway. For memory chips, a three-dimensional design is already in use: the semiconductor structures are not only arranged on the surface but stacked in several layers on top of each other. For CPUs, this approach is still in its infancy. Especially now, when technologies such as machine learning demand enormous computing power, it is important that the decades of progress in CPU chips, known as Moore's Law, do not come to a standstill.
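Moore's Law, mentioned above, roughly describes a doubling of transistor counts about every two years. A minimal illustrative sketch of that arithmetic, taking the article's figure of 16 billion transistors for the M1 (2020) as a starting point; the projected figures are extrapolations, not manufacturer data:

```python
# Illustrative sketch of Moore's Law: transistor counts double roughly
# every two years. Starting point: the 16 billion transistors the
# article cites for Apple's M1 (2020). Later figures are projections
# under this assumption, not real product data.
def projected_transistors(start_count: int, start_year: int, year: int,
                          doubling_period_years: float = 2.0) -> int:
    """Project a transistor count for a given year under Moore's Law."""
    elapsed = year - start_year
    return round(start_count * 2 ** (elapsed / doubling_period_years))

if __name__ == "__main__":
    for year in (2020, 2022, 2024, 2026):
        count = projected_transistors(16_000_000_000, 2020, year)
        print(f"{year}: ~{count / 1e9:.0f} billion transistors")
```

The exponential form makes clear why the trend cannot continue indefinitely: each doubling requires fitting twice as many transistors into the same area, which eventually collides with physical limits.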

