Bob Noyce's Penetration Pricing As A Market-Creating Innovation: App & Infrastructure Cycles
The origins of the semiconductor industry have many parallels with the AI platform shift unfolding before us
👋 Hey friends! To the regular readers, thank you for reading as always. Please share it with friends if you enjoy Missives!
To the new readers, I’m Akash and welcome to my newsletter, Missives. Every week I analyse trends and GTM strategies in Software and FinTech. I always love hearing from readers; you can find me on akash@earlybird.com.
Current subscribers: 2,644, +60 since last week
Join 2,600+ other founders, operators, and investors below for regular Missives.
I’ve been reading Chris Miller’s Chip War recently, an excellent study of the semiconductor industry’s evolution. Fairchild Semiconductor is an inextricable part of Silicon Valley lore and as I reflected on their role in the inception of the industry, it became clear that many decisions taken by management in the 1960s are instructive for the AI platform shift unfolding today.
Miller masterfully chronicles how integrated circuits emerged and revolutionised societies, as well as how the global supply chain and its various choke points developed (the Acquired episode on TSMC is also a canonical resource).
Miller pays homage to the myriad creators behind integrated circuits and transistors, including Jack Kilby and Bob Noyce. Applying Carlota Perez’s framework for technological revolutions, Miller traces the installation phase of semiconductors back to the Space Race between the US and the Soviet Union. Sputnik triggered significant investment from the US military in advanced computing; outnumbering the Soviet military and its apparatus was not feasible, but winning by dint of precision technology was.
The first large order for Fairchild Semiconductor’s chips came from NASA, which had a singular goal of surpassing the Soviets’ rocket programs by sending a man to the moon. It was evident to NASA that transistor-based computers were far superior to vacuum-tube equivalents that had been responsible for the US military’s success in World War II, but it was still remarkable that a company as young as Fairchild Semiconductor managed to deliver the integrated circuits for the computer that took Apollo 11 to the moon. The Apollo program was an inflection point for Fairchild Semiconductor; revenue climbed from $500k in 1958 to $21 million in 1960.
In many ways, using startup GTM parlance, NASA was a lighthouse logo that provided the cachet needed for wider enterprise adoption, in this case across the US military. By 1965, the inevitability of integrated circuits as the new computing paradigm was becoming abundantly clear, as Gordon Moore made his now famous prediction that Fairchild would double the number of transistors that could fit on an integrated circuit every year for the next ten years. The realisation of Moore’s Law would pave the way for not only more computing power but also cheaper transistors through economies of scale. As we now know, that prediction was too conservative - even in 2023, we’re perhaps still a few years away from Moore’s Law running out.
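Compounding that prediction shows why it was so consequential. A minimal sketch, with a hypothetical starting transistor count (64 is a round illustrative number, not a figure from the book):

```python
def transistors(start: int, years: int, doubling_period_years: float = 1.0) -> int:
    """Transistor count after `years`, doubling every `doubling_period_years`."""
    return int(start * 2 ** (years / doubling_period_years))

# Moore's 1965 prediction: a doubling every year for ten years
start = 64  # hypothetical 1965 baseline
for year in (1965, 1970, 1975):
    print(year, transistors(start, year - 1965))
# 1965 64
# 1970 2048
# 1975 65536
```

A thousand-fold increase in a decade, with unit cost falling in step - the engine behind the cheap chips Noyce would later price so aggressively.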
Bob Noyce passed away before he could share a Nobel Prize with Jack Kilby for inventing the integrated circuit, but in Moore’s eyes Noyce’s second biggest achievement was how Fairchild cut prices to increase market share and create entirely new categories of consumption. In 1965, defense dollars were still responsible for 72% of all integrated circuit orders. Moore and Noyce knew that chips would eventually power personal computers and devices, and US defense secretary Robert McNamara’s decision to cut costs in military procurement accelerated this transition. Noyce’s stroke of genius was to create new markets for their integrated circuits by aggressively slashing prices; chips that had previously sold for $20 were cut to $2. At times, Fairchild was even selling products at negative gross margins, all in the pursuit of TAM expansion and category creation.
Moore later argued that Noyce’s price cuts were as big an innovation as the technology inside Fairchild’s integrated circuits.
Retrofitting frameworks from the subsequent corpus on pricing would frame Noyce’s savviness as penetration pricing; Fairchild received an order for 20 million chips in 1966 from computer firm Burroughs, and by 1968 the computer industry was buying more than the military, with Fairchild chips serving 80% of the computer market. The price cuts were a market-creating innovation that diffused the computing power of integrated circuits across society and ignited the deployment phase of the technology (whilst financial capital suits some categories, production capital remains more suitable for the majority of software).
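The arithmetic of that bet is stark. A back-of-the-envelope sketch using the figures above ($20 cut to $2, the 20-million-chip Burroughs order); the $3 unit cost is a hypothetical assumption purely to illustrate a negative gross margin, not a number from the book:

```python
old_price, new_price = 20.0, 2.0

# Volume multiplier needed just to hold revenue flat after the cut
breakeven_volume_multiple = old_price / new_price
print(breakeven_volume_multiple)  # 10.0 - sell 10x the units

# Burroughs' 1966 order of 20 million chips at the new price
order_units = 20_000_000
revenue = order_units * new_price
print(f"${revenue:,.0f}")  # $40,000,000

# Hypothetical unit cost illustrating selling at a negative gross margin
unit_cost = 3.0
gross_margin = (new_price - unit_cost) / new_price
print(f"{gross_margin:.0%}")  # -50%
```

Penetration pricing only works if the elasticity is there - Noyce was betting that a 90% price cut would expand the market by far more than the 10x needed to break even, and the computer industry proved him right.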
Market-creating innovations do exactly what the name implies - they create new markets. But not just any new markets, new markets that serve people for whom either no products existed or existing products were neither affordable nor accessible for a variety of reasons. These innovations transform complicated and expensive products into ones that are so much more affordable and accessible that many more people are able to buy and use them. In some cases they even create entirely new product categories.
- Clayton Christensen
LLMs remain nascent in terms of enterprise adoption but sufficient capital is being invested to realise the technology’s potential to deliver unprecedented productivity gains and act as a beacon of optimism through turbulent macro conditions. As applications blossom and developer tools emerge to improve the value apps can extract from the models, the dualism between apps and infrastructure comes into sharper focus for the present moment in AI.
Why is it that apps come first in the cycle, and not infrastructure first? One reason is that it doesn’t make sense to create infrastructure until there are apps asking you to solve their infrastructure problems. How do you know that the infrastructure you are building solves a real problem until you have app teams that you are solving for?
Bob Noyce’s prescience was in making the infrastructure cheap enough for applications to thrive, which in turn drove economies of scale in infrastructure production. The wisdom in Bob Noyce’s pricing decision can also be seen in how several foundational model companies are rationalising the trade-off between monetisation and model improvement through feedback; OpenAI’s daily compute costs for ChatGPT are negligible when the stakes are as high as the pursuit of AGI.
In the LLM era, applications have achieved sufficient traction to drive an influx of capital into infra tooling for developers to build the right form factors for even wider adoption. Ordinarily, the synchronicity between apps and infrastructure is difficult to measure, but the causality observed between app developers’ pain points and the creation of LLMOps categories like vector databases and orchestrators is suggestive of a much tighter feedback loop.
What I’m Reading
The State of Autonomous AI Agents
A dear friend of mine, Dean Meyer of Vine Ventures, unpacks the current limitations of autonomous agents, as well as their remarkable future potential.
5 Contrarian AI Theses For Early Stage Investors
One of Rob May’s most intriguing theses is that agents will upend customer acquisition channels: ‘If you like a company because of their PLG or community driven GTM, it’s possible those are irrelevant in a few years.’
Intercom’s challenge of balancing its now two competing subscription and usage business models will make for an interesting case study at the app layer.
92% of IT ROI has a clear P&L impact, with the biggest value creation being in labour productivity.
Emerging Architectures for LLM Applications
a16z decompose the current stack of LLM applications built upon foundational models with access to internal and external data.