Shreshta Shyamsundar is a Distinguished Technologist at Infosys. In this article, he discusses how API strategy is being, and will be, influenced by Generative AI.
If I were to roll back, say, 15 years or so: AWS started in 2006, the iPhone was launched in 2007, and GCP followed suit in 2008. The iPhone launch led to an explosion in people consuming content digitally on their mobile devices. Some of these trends initially took time to take hold because consumers were hesitant to share their information online and to use these services on their phones. But people got over it, and the result was the boom we are still seeing today.
After 2011, on the back of the global financial crisis, organizations were cash-strapped and looking for cost-effective IT solutions. AWS and GCP caught up, and we rode the cloud wave from around 2012; it continues to grow exponentially even today. With the cloud, too, there was an element of hesitation: customers were wary of putting their prized data onto an Internet-facing public cloud, regardless of the security capabilities that cloud providers and other hyperscalers offered.
Similarly, what we are seeing now is an AI era. Between 2012 and 2017, the large cloud companies invested heavily in AI. Google published its Transformer architecture in 2017, which sparked enormous interest in natural language processing. AI is now being consumerized, and organizations that have done digital and cloud transformation well will leapfrog into the AI era.
So, we are now in that AI world. Language models have become hugely sophisticated and can handle much more information than they could even seven or eight months ago. We started with models that supported 1.5 billion parameters, and we are already up to half a trillion, so they can process far more information in parallel than before.
More than 35 products based on LLMs have been launched in the last seven months.
AI adoption is evolving from generalized to specialized capabilities. The five trends are –
Consumer personal AI assistants – OpenAI ChatGPT, etc.
Specialized AI apps and assistants – GitHub Copilot, Adobe Firefly, etc.
Custom AI apps using closed models and APIs – OpenAI GPT-4, Azure OpenAI
Custom AI apps using fine-tuned open models – CodeGen, BLOOM
Industry-specific AI apps using specialized pre-trained models – BloombergGPT
This is the direction in which we see AI being adopted, in a generic sense, across multiple organizations. In each of these phases, AI is beginning to disrupt engineering and getting companies to look at it in a new light, putting pressure on both the time and the cost to deliver new products.
This will affect, and likely get in the way of, the traditional API strategy that API-first companies have today.
Let us look at the example of a telco. A customer has ordered a new mobile phone with mobile service, broadband, and so on. From a strategy point of view, the business outcome needs to be delivered through APIs: the consumer should be able to track their orders on their phones or devices. This is one part of the strategy.
Now, you bring in the commercial team to identify any opportunity for monetization; for example, you can charge customers for an SMS when their order is ready. In parallel, you are shaping your API products so that the right information is shared with the consumer, structured so that consumers are not confused about when the pieces of the product suite they have ordered will arrive, and in what sequence.
On the design side, once you have decided what you are looking for and what kind of experience you would like to deliver through these APIs, you engage the design team, who work with governance, security, and developer experience to identify the appropriate architectural style for the product. This also covers the API's lifecycle and details regarding design, non-functional requirements, and performance.
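To make the example concrete, here is a minimal sketch of what an order-tracking contract for such a telco bundle might look like. All of the type names, fields, and the endpoint path are illustrative assumptions rather than any particular telco's real API.

```typescript
// Illustrative sketch of an order-tracking contract for a telco bundle.
// Every name here (TrackOrderResponse, /orders/{orderId}/status, etc.)
// is an assumption for the example, not a real telco API.

interface OrderLineStatus {
  product: "mobile_phone" | "mobile_service" | "broadband";
  status: "ordered" | "provisioning" | "shipped" | "active";
  expectedDate?: string; // ISO 8601 date, present only once it is known
}

interface TrackOrderResponse {
  orderId: string;
  lines: OrderLineStatus[]; // one entry per product in the ordered bundle
  notifyBySms: boolean;     // the monetizable "SMS when ready" add-on
}

// The shape a consumer-facing endpoint such as GET /orders/{orderId}/status
// might return, sequenced so the consumer can see what arrives when.
const example: TrackOrderResponse = {
  orderId: "ORD-1042",
  lines: [
    { product: "mobile_phone", status: "shipped", expectedDate: "2023-09-15" },
    { product: "broadband", status: "provisioning" },
  ],
  notifyBySms: true,
};
```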
For all these strategy elements to work, three key tenets are important –
- Establish solid, clear communication across these teams, so that everyone knows they are chasing one common goal.
- Each of these teams, sub-teams, or roles needs clarity about what is expected of them, including their responsibilities and accountabilities.
- Each team must have the necessary operational-level agreements in place and respect them individually, so that everyone has adequate time to do their job as well as they can.
Regulatory scrutiny is going to be important with AI. It is important to ensure a clear chain of command as the API asset is built, tested, and delivered, so that the audit trail of that asset is absolutely clear. From a regulatory standpoint, we must have an appropriate, electronically verifiable audit trail for these assets.
Some of your rules and policies will need to vary if a machine consumes your APIs. Customers have heightened expectations and want to be delighted by the experience, so customer delight sits at the top of what organizations want to deliver. Being able to offer specific add-ons, for example an estimate of when products are likely to ship or arrive at the doorstep, is therefore an important element of customer delight, even when the system does not have those dates handy. You need to make predictions and pass that information on to the end customer.
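As a sketch of that idea, the snippet below shows one way an API response could be enriched with a predicted delivery date when the order system has no confirmed date. The field names and the simple lead-time heuristic are assumptions; in practice the prediction would come from whatever model or service the team actually trusts.

```typescript
// Sketch only: enrich an order line with an estimated delivery date when the
// system of record has no confirmed date. The lead-time heuristic stands in
// for a real prediction model or service.

interface OrderLine {
  product: string;
  confirmedDeliveryDate?: string; // present only when the system knows it
}

interface EnrichedOrderLine extends OrderLine {
  estimatedDeliveryDate: string;
  estimateSource: "confirmed" | "predicted";
}

function enrichWithEstimate(line: OrderLine, orderedAt: Date): EnrichedOrderLine {
  if (line.confirmedDeliveryDate) {
    return {
      ...line,
      estimatedDeliveryDate: line.confirmedDeliveryDate,
      estimateSource: "confirmed",
    };
  }
  // Hypothetical fallback: assume a typical lead time per product category.
  const leadTimeDays = line.product === "broadband" ? 14 : 5;
  const estimate = new Date(orderedAt.getTime() + leadTimeDays * 24 * 60 * 60 * 1000);
  return {
    ...line,
    estimatedDeliveryDate: estimate.toISOString().slice(0, 10),
    estimateSource: "predicted", // flag the guess so the consumer can present it honestly
  };
}
```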
Our API strategy needs to be malleable enough to survive regulatory scrutiny, and it needs to be reviewed and reworked as APIs are increasingly consumed by machines.
Let us envision a world in which corporations drive API strategy top-down.
AI trends driving API strategy –
- AI pair programmers are now mainstream – Teams use productivity-enhancing AI tools for code completion and generation, including automatically generated software tests as part of their codebase.
- AI will democratize API building for non-technical staff – AI-powered platforms help build APIs, assess API quality against specifications, and generate and amend standard API documentation, lowering the entry barrier for API development.
- The pressure to deliver faster and cheaper will increase with AI – With an increased automation quotient throughout the SDLC, teams will be pressured to deliver API products faster and at a lower cost.
- AI can now enable autonomous integrations – Platforms can consume standard documentation and auto-generate software code to integrate services and deliver simple business outcomes (a sketch of this follows the list).
- Emphasis on API responses being context-aware – AI-driven APIs can analyze contextual information to adapt their behavior and responses accordingly and deliver personalized content, recommendations, or services.
- Inadequately secure code, lax security policies governing APIs, and other NFRs ignored in the AI race – AI helps deliver a product MVP swiftly; however, it can overlook secure coding practices and the appropriate security policies along the way.
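As an illustration of the autonomous-integration trend above, the sketch below hands an OpenAPI document to a language model and asks it to draft client code. It assumes the OpenAI Node SDK (the npm "openai" package); the prompt, model name, and file path are illustrative, and any generated draft would still need human review, tests, and a secure-code audit.

```typescript
// Sketch: ask a language model to draft integration code from an OpenAPI spec.
// Assumes the OpenAI Node SDK; the prompt and model choice are illustrative.
import { readFile } from "node:fs/promises";
import OpenAI from "openai";

async function draftIntegration(specPath: string): Promise<string> {
  const spec = await readFile(specPath, "utf8");
  const client = new OpenAI(); // reads OPENAI_API_KEY from the environment

  const completion = await client.chat.completions.create({
    model: "gpt-4",
    messages: [
      { role: "system", content: "You generate TypeScript clients from OpenAPI specifications." },
      { role: "user", content: `Generate a typed, fetch-based client for this spec:\n\n${spec}` },
    ],
  });

  // The draft is a starting point, not a finished integration.
  return completion.choices[0].message.content ?? "";
}
```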
Implications for API engineering teams –
- API debugging and testing will become even more important to get right – Generative AI will augment these processes and decrease the time developers take to get productive.
- An AI-first talent strategy will inform staffing – all staff should be made aware of AI's impacts and influences, some should be trained to build on and integrate with AI services and APIs, and a few should build AI services that improve both the quality and the speed of delivery.
- The API engineering teams will insist on documentation in markdown format covering, at a minimum, an introduction, authentication, endpoints/methods, request and response formats, parameters and query strings, error handling, versioning, security policies, pagination, and a change log (a simple check for this is sketched after this list).
- Context-aware capabilities such as the Google Awareness API should be considered part of an API platform offering; they combine signals and cues to inform relevant API responses to the customer. In addition, teams must run a process to identify strategic, reusable APIs so that such offerings are used widely and appropriately without duplicating effort.
- Secure code audits must be carried out routinely while creating new or modifying existing APIs, especially with AI assistants.
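As a rough illustration of how that documentation requirement could be enforced, here is a small check that flags missing sections in a markdown API doc. The exact section names and the heading convention are assumptions about a team's local standard.

```typescript
// Sketch: verify that an API doc in markdown contains the minimum sections.
// Section names and the heading convention are assumed, not a standard.
import { readFile } from "node:fs/promises";

const REQUIRED_SECTIONS = [
  "Introduction", "Authentication", "Endpoints", "Request and Response Formats",
  "Parameters and Query Strings", "Error Handling", "Versioning",
  "Security Policies", "Pagination", "Change Log",
];

async function missingSections(docPath: string): Promise<string[]> {
  const doc = await readFile(docPath, "utf8");
  // Treat every markdown heading line as a section title.
  const headings = doc
    .split("\n")
    .filter((line) => line.trimStart().startsWith("#"))
    .map((line) => line.replace(/^\s*#+\s*/, "").toLowerCase());

  return REQUIRED_SECTIONS.filter(
    (section) => !headings.some((heading) => heading.includes(section.toLowerCase()))
  );
}

// Usage: report which required sections a hypothetical orders API doc lacks.
missingSections("docs/orders-api.md").then((missing) =>
  console.log(missing.length ? `Missing sections: ${missing.join(", ")}` : "Doc covers all required sections")
);
```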
To summarize, regardless of how much commoditization happens in software engineering and coding, API design and architecture will continue to be in the human domain. We may have AI-aided bots that help us collaborate on design reviews and generate API documentation. We must ensure that our API strategies reflect that, allow for AI, and embrace the possibility of what AI can do for us, and we must have governance mechanisms in place.