Inside the API machine room

APIs are the ‘gluing bond’ and connectivity channel between different software application parts and components… so, what tools do we need to build one, what design principles govern their creation and by what measure can we judge an API’s effectiveness and suitability for purpose?


Modern software applications exist in an essentially connected space. That connectivity manifests itself as a bridge not just from app to users, not just to teams or groups, not just to databases and the wider expanses of the web itself, but to other applications and smaller application services.

If this were some form of 20-questions quiz, then you’d win immediate points for realizing that we’re talking about the use and prevalence of Application Programming Interfaces (APIs).

Now considered to be the gluing bond between different software application parts and components, APIs work by defining how a programmer writes a program (or program component) so that it can request services from an operating system (OS) or another application.

In order to work, APIs have a required syntax and are exercised through ‘function calls’: instructions issued by a program or a script, written in high-level language (i.e. abstracted so that it is closer to human language) and composed of verbs and nouns.
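As a simplified concrete illustration, here is a script requesting a service through Python’s standard library, which wraps the underlying operating system API in exactly this verb-and-noun style:

```python
import os

# The script requests a service from the operating system through a
# function call: the "verb" is the operation (list), the "noun" is the
# resource it acts on (the current directory, ".").
entries = os.listdir(".")

# The OS answers through the same interface with structured data.
print(type(entries).__name__)  # prints "list"
```

The caller never touches the OS internals; it only speaks the interface’s agreed syntax, which is the whole point of an API.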

If that’s API 101 dealt with, then what technologies are driving API design, creation and management?

A hand in the API pickle jar

Intel has long had a hand in the API pickle jar. The company owned API management specialist Mashery for just over two years before it was sold onwards during a refocus on its own in-house API expertise. This winter of 2020 sees Intel conclude what it calls a ‘multi-year journey’ towards a unified software architecture with the release of the Intel oneAPI Toolkits.

This is a product set designed to allow developers to combine the power of more than just the Central Processing Unit (CPU) in their applications. Intel says that as the world moves into an era of billions of intelligent devices and an exponential growth of data, a shift in focus is required from CPUs alone to a mix of architectures across CPUs, discrete graphics processing unit (GPUs), Field Programmable Gate Arrays (FPGAs) and other accelerators.

Intel collectively describes this as its ‘XPU’ vision, with the X presumably standing for user eXperience, for ‘X’ as in any processor or accelerator architecture, or both.

“Now is a key moment in our ambitious oneAPI and XPU journey. With the gold release of our oneAPI Toolkits, we have extended the developer experience from familiar CPU programming libraries and tools to include our vector-matrix-spatial architectures,” said Raja Koduri, Intel senior vice president, chief architect and general manager of architecture, graphics and software.

This era of computing also requires a comprehensive software stack. Developers will be able to access a common, open and standards-based programming model across Intel XPUs with Intel oneAPI Toolkits.

So much for connectivity, processor ubiquity and agnostic architectural inclusivity then. If these are the foundational cornerstones of API tooling, what kind of overalls do you need to enter the API machine room, where are the spanners and… come to think of it, what time is lunch?

Into the API machine room

The truth is, whatever tools, platform and conduits we use to deal with APIs, they don’t just happen automatically. Instead, it takes a purist approach to architectural planning and software code design, development and testing to create APIs that will be of tangible use in the real world.

Calvin Fudge, API product marketing director at software development and quality tools provider SmartBear, says that API design as a whole starts with gathering input from all relevant stakeholders – technical, business, internal, external... and especially API consumers.

Why? Because a bad user experience while consuming an API will lead to an endless queue of support calls followed by a bad reputation, which is all it takes to make an API service appear unreliable. This being 2020, APIs are a little like a badly built web page i.e. users move on in a matter of seconds if they don’t get what they need.

“While an API strategy might include choosing the right technology stack and building an appropriate data-distribution model, there are a lot of finer details that can get overlooked. It’s important to plan before you start to implement your API. This is where a design-first approach is useful and utilizing the power of a RESTful API description format like the OpenAPI Specification (formerly Swagger) is important. The OpenAPI Specification is for machine-readable interface files for describing, producing, consuming, and visualizing RESTful web services,” said Fudge.
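To make “machine-readable interface file” concrete, here is a minimal sketch of an OpenAPI 3.0 definition – built as a Python dictionary for brevity (real-world specs are usually written as YAML or JSON files); the pet-store endpoint is a hypothetical example, not part of any real API:

```python
import json

# A minimal OpenAPI 3.0 description of one endpoint, expressed as a dict.
spec = {
    "openapi": "3.0.3",
    "info": {"title": "Pet Store (hypothetical example)", "version": "1.0.0"},
    "paths": {
        "/pets": {
            "get": {
                "summary": "List all pets",
                "responses": {"200": {"description": "A list of pets"}},
            }
        }
    },
}

# Because the definition is plain data, tooling can consume it directly:
# documentation generators, mock servers and test harnesses all read the
# same contract.
print(json.dumps(spec, indent=2))
```

The value of the format is precisely that the same document serves humans (documentation) and machines (code generation, mocking, validation) alike.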

In the 2020 State of API Survey conducted by SmartBear, 82% of the survey participants cited OpenAPI as their API standard of choice for defining APIs.

An API’s design is the blueprint for what the API wants to achieve. It gives a comprehensive overview of all the endpoints and CRUD (create, read, update, delete) operations associated with each of them. This means that an effective API design can help prevent complicated configurations, enforce adherence to naming schemas within classes and avoid a host of other issues that can keep developers and other systems operations staff up for days.
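In REST terms, those four CRUD operations conventionally map onto HTTP methods. A rough sketch of that mapping, using a hypothetical /pets endpoint for illustration:

```python
# Conventional CRUD-to-HTTP mapping for a REST resource. The /pets
# endpoint in the comments is illustrative, not a real API.
CRUD_TO_HTTP = {
    "create": "POST",    # POST   /pets      -> add a new pet
    "read":   "GET",     # GET    /pets/{id} -> fetch one pet
    "update": "PUT",     # PUT    /pets/{id} -> replace a pet
    "delete": "DELETE",  # DELETE /pets/{id} -> remove a pet
}

def http_method(operation: str) -> str:
    """Return the HTTP method conventionally used for a CRUD operation."""
    return CRUD_TO_HTTP[operation.lower()]

print(http_method("create"))  # prints "POST"
```

Sticking to this convention is one of the easiest ways a design keeps endpoints predictable for consumers.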

“The design process will also help you think through exactly how your data will be distributed and how your core product will work. Using something like the OpenAPI Specification provides a machine-readable definition of the API that can be used as a proxy for the API to run other tasks in parallel, i.e. testing, virtualization and later to automate building integrations in a CI/CD pipeline,” explained SmartBear’s Fudge.
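One way a machine-readable definition can stand in for the API itself is request validation: a test harness can check calls against the contract before any server exists. A minimal sketch, assuming an OpenAPI-style “paths” layout and hypothetical endpoints (literal paths only; real tooling also matches templated segments like {id} against concrete values):

```python
# Sketch: validate a (method, path) pair against an OpenAPI-style
# definition without running a real server. The layout mirrors the
# OpenAPI "paths" object; the endpoints themselves are hypothetical.
spec = {
    "paths": {
        "/pets": {"get": {}, "post": {}},
        "/pets/{id}": {"get": {}, "delete": {}},
    }
}

def is_allowed(method: str, path: str, spec: dict) -> bool:
    """True if the contract defines this operation on this path."""
    operations = spec["paths"].get(path, {})
    return method.lower() in operations

print(is_allowed("GET", "/pets", spec))    # prints "True"
print(is_allowed("PATCH", "/pets", spec))  # prints "False"
```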

These days, there are two approaches to building APIs:

  • Code-first: Based on an application’s business plan, the API is coded directly, and from that point a human- or machine-readable document, such as a Swagger document, can be generated.
  • Design-first: The plan is converted to a human- and machine-readable contract, such as an OpenAPI Specification, from which the code is built.
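The code-first direction can be sketched in miniature: routes are registered in code, then a definition is generated from them. The decorator and registry below are a toy illustration, not a real framework:

```python
# Toy code-first sketch: handlers registered in code, then a minimal
# OpenAPI-style "paths" object generated from the registry. The route()
# decorator and registry are illustrative, not a real framework.
routes = {}

def route(method: str, path: str):
    """Register a handler so a definition can be generated from it later."""
    def decorator(func):
        routes.setdefault(path, {})[method.lower()] = func.__doc__
        return func
    return decorator

@route("get", "/pets")
def list_pets():
    """List all pets"""

def generate_paths() -> dict:
    """Emit an OpenAPI-style paths object from the registered routes."""
    return {
        path: {method: {"summary": doc} for method, doc in ops.items()}
        for path, ops in routes.items()
    }

print(generate_paths())
```

Design-first simply runs the arrow the other way: the contract comes first and stubs or scaffolding are generated from it.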

Documentation combats fragmentation

Fudge insists that documentation is crucial for building the interface that lets an API be consumed. In many cases, comprehensive documentation is done only after the API’s resources and response-request cycles are mapped out. A solid initial structure makes documenting the API faster and less error prone for the people responsible for handling documentation.

“One of the most challenging aspects that organizations face with any API initiative involves standardization, or the act of designing consistent methods across all APIs. Designing APIs against a set of internal or industry standards makes the APIs consistent and minimizes the learning curve for developers. Consistent, good design makes APIs easy to use and reinforces quality across the API development lifecycle – so I would point to Swagger and SwaggerHub,” said Fudge.

Swagger was created in 2010 by SmartBear as a technology for creating RESTful APIs. In 2015, SmartBear donated Swagger to the Linux Foundation, establishing the OpenAPI Specification. Swagger has since evolved into one of the most widely used open source tool sets for developing APIs with the OpenAPI Specification.

SwaggerHub is an integrated API development platform built for teams to drive standardization and secure collaboration throughout the API lifecycle. Where Swagger is for the individual developer, SwaggerHub was designed to foster API collaboration and standardization across multiple teams.

APIs for the next decade

Taking stock of the API pot then, in order for APIs to perform a useful function in the hyper-connected web-cloud-machine world, they do need to exhibit a few core characteristics in their DNA.

In addition to design-first openness, detailed documentation diligence and adherence to industry standards, APIs need to be created (and here’s the tough part) by technical gurus, but with an essential appreciation for the business use case and user requirement they must deliver upon in the real world. Okay, it’s the virtual world, but you get the point. APIs are here to stay; life will be easier if we learn to connect.