At //BUILD 2016, Microsoft announced Microsoft Cognitive Services, its foray into the cognitive domain currently dominated by IBM Watson. Having some knowledge of the cognitive computing space and seeing how it will revolutionise the way applications are created, I was toying with the idea of a completely new way of consuming APIs when they announced “Conversation as a Framework”, providing a bot framework and platform to create chat bots.
What if a developer could consume REST APIs (or any APIs, for that matter) in natural language, as if talking to a chat bot?
Consuming APIs like talking to a bot
Consuming APIs today is very structured: there is an endpoint that speaks the standard HTTP verbs – GET, POST, PUT, DELETE – with JSON in the request and response bodies as the data structure for passing data in and out.
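To make that concrete, here is a minimal sketch of the structured way of calling a hypothetical todo API (the endpoint and field names are invented for illustration):

```python
import json

# A hypothetical todo-list API, expressed the "structured" way:
# an endpoint, an HTTP verb, and a JSON body.
BASE_URL = "https://api.example.com/todos"  # invented endpoint

def build_create_request(title, due):
    """Assemble the pieces of a POST /todos call: verb, URL, JSON body."""
    body = json.dumps({"title": title, "due": due})
    return "POST", BASE_URL, body

method, url, body = build_create_request("Buy shampoo", "3pm today")
```

Every consumer of the API has to know this exact shape in advance – the verb, the path, and the field names – which is precisely the friction a conversational interface could remove.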
With a chat-bot-style interface, I postulate that it is possible to call an endpoint using standard NLP concepts like actions, intents and entities with a one-to-one mapping. I further postulate that, with more complex sentence structures, the developer can interact with the set of APIs in ways that would not be possible without natural language.
Having a chat bot call the API solves a few issues developers face while consuming APIs:
- Discovering the API endpoints.
- Learning how to use the API endpoints.
- Constructing the data and calling the API endpoints.
So is it possible? Of course it is
So I delved a little into the Language Understanding Intelligent Service (LUIS) and the Bot Framework to find out how easy it would be to create something like this. LUIS is similar to the Watson Natural Language Classifier, but with a UI that helps build and train the system and a more complete set of APIs to configure it, which is what makes this idea possible.
The basic fundamentals of LUIS are:
- Action maps to the endpoint of the API call.
- Entity maps to the name of the data to send to the endpoint.
- Model maps to the data type.
- Parameters of the Entity map to the data itself.
And with that, we have pretty much mapped everything required to call the API.
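As a rough sketch of that mapping (the action names, entity names and dictionary shapes here are all invented, not real LUIS output), a parsed utterance could be translated into an HTTP call like this:

```python
# Action name -> (HTTP verb, endpoint) -- the first mapping above.
LUIS_ACTIONS = {
    "CreateTodo": ("POST", "/todos"),
    "ListTodos": ("GET", "/todos"),
}

def to_api_call(action, entities):
    """Map a parsed utterance (action + entities) onto an API call.

    Each entity name becomes the data field name, and each entity's
    parameter value becomes the data itself.
    """
    method, path = LUIS_ACTIONS[action]
    payload = {name: value for name, value in entities.items()}
    return {"method": method, "path": path, "json": payload}

call = to_api_call("CreateTodo", {"title": "Buy shampoo", "due": "3pm today"})
```

The point of the sketch is that nothing beyond this dispatch table is needed once the four mappings hold: the NLP layer and the REST layer line up one-to-one.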
Creating a new API platform using NLP
So here’s where the developer value comes in. There’s a lot of plumbing involved in creating this platform. The new platform will provide the following:
- Read the REST API endpoints and automatically define the Actions, Entities and Models.
- Create a standard dialog flow that promotes discoverability of the APIs, and the usage from the documentation.
- The ability to generate a “proposed” sentence to call the API, along with the equivalent standard REST API call in various programming languages.
- More complex API calls through chaining and flow control.
Training of additional models will happen through LUIS if required.
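The first of those features could be sketched like this: walk a minimal Swagger-style description of the API (the spec shape below is simplified and invented for illustration) and derive the Actions, Entities and Models from it automatically:

```python
# A tiny, invented stand-in for a Swagger/OpenAPI-style description.
SPEC = {
    "/todos": {
        "post": {"operationId": "createTodo",
                 "params": {"title": "string", "due": "datetime"}},
        "get": {"operationId": "listTodos", "params": {}},
    },
}

def derive_actions(spec):
    """Derive LUIS-style Actions (with Entities and Models) from the spec."""
    actions = {}
    for path, verbs in spec.items():
        for verb, op in verbs.items():
            actions[op["operationId"]] = {
                # Action -> endpoint of the API call.
                "endpoint": (verb.upper(), path),
                # Entity name -> Model (data type).
                "entities": dict(op["params"]),
            }
    return actions

actions = derive_actions(SPEC)
```

In a real implementation the same walk would also seed LUIS with example utterances per action, which is where the dialog-flow and discoverability features would hang off.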
Take, for example, a set of APIs for a Todo list.
- You can first discover the set of APIs by asking a simple question – “What can I do?” or “Give me a list of APIs”.
- Next, you can ask about a particular API by asking – “How do I get the list of todos?” or “I want to create a todo item.” – and it should be able to get you started with using the API.
- After you’re done discovering, you can say – “I’m done. What’s the final command?” or something similar – and it should provide both the NLP way of using the API, like “Enter a todo item called ‘Buy shampoo’ and get it done by 3pm today”, and/or the exact code in whatever programming language to call the API.
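That last step – turning the conversation back into both a sentence and the calling code – might look roughly like this (the helper and both output formats are hypothetical):

```python
def final_command(action, entities):
    """Emit the NLP form and an equivalent code snippet for a parsed intent."""
    if action == "CreateTodo":
        sentence = (f"Enter a todo item called '{entities['title']}' "
                    f"and get it done by {entities['due']}")
        # The code form is just a string the platform hands back to the
        # developer; the endpoint URL here is invented.
        code = (f"requests.post('https://api.example.com/todos', "
                f"json={entities!r})")
        return sentence, code
    raise ValueError(f"unknown action: {action}")

sentence, code = final_command(
    "CreateTodo", {"title": "Buy shampoo", "due": "3pm today"})
```

Generating other target languages would just be more templates keyed off the same (action, entities) pair.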
These are just my initial thoughts about this. What do you think?