We can now pinpoint when the disaster began for Intel: Apple's displeasure with the chip giant dates back to 2015, when Intel introduced the Skylake architecture, according to François Piednoël, a former Intel engineer. The new processor had too many problems at the source for Cupertino's high quality standards, and it was at this point that Apple reportedly made the decision to do without Intel.
Tim Cook's company used the first Skylake chips in the 2015 iMac and again in the 2016 MacBook and MacBook Pro, and problems soon arose. According to the engineer, it was "basically the poor quality of Skylake that caused Apple to flee the platform." Pulling no punches, and now far removed from Intel's payroll, Piednoël maintains that Skylake's quality was "abnormally bad" and that complaints from Apple were constant: "Apple was the client that reported the most problems in this architecture."
The former Intel employee believes that Apple began considering other options at that time, and that it may have decided to manufacture its own processors when it found no solid alternatives. But is this really the only reason for such a major change in strategy? It may be one of the causes, but probably not the main one. By depending on no one and making its own decisions, Apple gains significant autonomy over the manufacturing process.
Moreover, Apple can design its chips from scratch and tailor them specifically to the needs of the computers they will power, optimizing resources to the maximum. As both designer and manufacturer, and given the size of the company, Apple can apply large economies of scale and ultimately achieve greater profitability by treating processors as just another in-house component. The move could have a major impact on the industry, because it may eventually 'infect' Apple's rivals. What if Microsoft designed and manufactured its own processors for the Surface?