OpenAI enthroned as the Levi's of Tech
OpenAI has launched its ChatGPT API service, powered by the gpt-3.5-turbo model. It is priced at $0.002 per 1,000 tokens and can be used for non-chat applications as well. Early adopters of the ChatGPT API include Shopify, Instacart, Quizlet, and Snap. This is another cornerstone moment for software. The AI-native-era gold rush has just begun.
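To make the launch concrete, here is a minimal sketch of what a request to the ChatGPT API looks like, assuming the chat-completions request shape documented at launch. Only the JSON body is assembled here; actually sending it requires an OpenAI API key and an HTTP call, which are omitted.

```python
import json

def build_chat_request(user_message, system_prompt="You are a helpful assistant."):
    """Assemble the JSON body for a gpt-3.5-turbo chat completion request.

    This only builds the payload; no network call is made.
    """
    return {
        "model": "gpt-3.5-turbo",
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_message},
        ],
    }

body = build_chat_request("Summarize the gold rush in one sentence.")
print(json.dumps(body, indent=2))
```

Note that even "non-chat" tasks (summarization, classification, extraction) are expressed through the same messages format, which is part of why a single priced endpoint can serve so many applications.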
Levi's was the actual winner of the gold rush, making a fortune by supplying miners with the tools they needed to extract gold while never getting involved in the competition itself. They were the arms dealer in this silent war. AWS was the latest Levi's of the tech industry, after telecom companies like AT&T. Just as the gold rush created massive demand for supplies like jeans and hammers, the digital transformation has created enormous demand for cloud computing services like AWS. OpenAI has just made it official that they will be the next Levi's.
The rise of AI-natives
It is worth observing how the previous Levi's, AWS, reshaped the tech landscape; the advent of cloud computing transformed how we approach software engineering. Some tectonic shifts:
- Fewer expensive On-Premise Engineers — Gone are the days when we needed on-premise system engineers to configure servers and manage infrastructure. We now deploy applications without worrying about the underlying hardware.
- Shift from Pets to Cattle — In the early days of computing, companies treated their servers like pets: each one got a unique name and individual time and attention. As server fleets grew, companies began treating them like cattle, investing in automation tools that could manage large numbers of resources at once. This shift enabled companies to scale up their infrastructure rapidly without sacrificing reliability or performance.
- Rise of Microservices and Serverless — AWS made it easy to spin up and manage small, independent services, leading to the emergence of microservices architecture. This architecture enables developers to build complex applications out of small, modular services, improving reliability and performance. Going further, AWS Lambda, introduced in 2014, let developers build and run applications without managing servers at all.
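The serverless shift above can be sketched with a minimal AWS Lambda-style handler: the platform invokes `handler(event, context)` on each request, and no server is provisioned or managed by the developer. The event shape below (an API Gateway-like `body` field) is an assumption for illustration.

```python
import json

def handler(event, context):
    """Minimal Lambda-style handler: greet the name carried in the request body."""
    payload = json.loads(event.get("body") or "{}")
    name = payload.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

# Local invocation, simulating what the platform does on each request:
response = handler({"body": json.dumps({"name": "miner"})}, None)
print(response["body"])
```

The entire operational surface is this one function; scaling, routing, and the underlying hardware are the platform's problem, which is precisely the Levi's-style value proposition.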
So what will happen now with AI-natives?
- Fewer expensive Data Engineers & ML Engineers — The rise of AI-native platforms makes it easier for non-experts to build and deploy AI models and reduces the need for dedicated AI engineers in many cases.
- Shift from building models from scratch to using pre-built models — Just as AWS provides pre-built infrastructure components, AI-native platforms provide pre-built machine learning models that can be customized and trained using your data.
- Rise of automation tools for building and deploying AI models — Just as automation tools enabled companies to manage large numbers of servers and resources in the cloud, AI-native platforms are allowing companies to automate building and deploying AI models.
- Shift from monolithic AI applications to microservices-based AI architectures — Just as microservices architecture enabled companies to build complex applications from small, modular services, microservices-based AI architectures are allowing companies to build complex AI applications from small, modular AI models.
- Democratization of AI development and deployment — As AWS democratized cloud computing, AI-native platforms democratize AI development and deployment.
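The microservices-based AI shift can be sketched as a pipeline of small, composable model services. This is a hypothetical illustration: each function below stands in for an independently deployed model service (in practice each call would be an HTTP request to a hosted model endpoint, such as a pre-built API model).

```python
def summarize(text: str) -> str:
    # Stand-in for a summarization model service: keep the first sentence.
    return text.split(".")[0] + "."

def classify_sentiment(text: str) -> str:
    # Stand-in for a sentiment model service: a trivial keyword heuristic.
    return "positive" if "great" in text.lower() else "neutral"

def pipeline(document: str) -> dict:
    """Compose two modular AI services into one application."""
    summary = summarize(document)
    return {"summary": summary, "sentiment": classify_sentiment(summary)}

result = pipeline("The API launch was great. Adoption followed quickly.")
print(result)
```

Each stage can be scaled, swapped, or upgraded independently, which is the same modularity argument that drove microservices in the cloud era.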
One outstanding question remains: will OpenAI become another dumb pipe?