IBM’s Watson question answering supercomputer is an impressive bit of kit. The machine has competed very well against humans in many areas, including on the game show Jeopardy. IBM has announced a new API that makes the Watson question answering machine available as a service.
That means that by using the API, app developers can incorporate Watson’s ability to understand natural language and answer spoken questions into their apps. IBM says that developers can have their apps make a REST API call to the new IBM Watson Developers Cloud.
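To make the idea concrete, here is a minimal sketch of what such a REST call might look like from an app. The endpoint URL, payload fields, and auth header are illustrative assumptions, not IBM's published API; only the general shape (an authenticated HTTP POST carrying a question as JSON) reflects what the announcement describes.

```python
import json
import urllib.request

# Placeholder endpoint -- the real IBM Watson Developers Cloud URL
# and request schema would come from IBM's documentation.
WATSON_URL = "https://example.com/watson/v1/question"

def build_question_request(question_text, api_key):
    """Build (but do not send) an HTTP request asking Watson a question.

    The JSON structure and the bearer-token auth scheme below are
    hypothetical, chosen only to illustrate a typical REST call.
    """
    payload = json.dumps(
        {"question": {"questionText": question_text}}
    ).encode("utf-8")
    return urllib.request.Request(
        WATSON_URL,
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer " + api_key,  # placeholder auth
        },
        method="POST",
    )

req = build_question_request("What gear do I need for a winter hike?", "demo-key")
print(req.get_method())  # POST
```

The point is that the app only formats a question and parses the answer; all of the machine-learning heavy lifting happens on IBM's side of the call.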
Rob High, IBM CTO for Watson, said:
It [the API] doesn’t require that you understand anything about machine learning other than the need to provide training data.
With the new API, anyone capable of making an app could have it tap into Watson’s massive brain without having to know anything about the tech behind the scenes. IBM’s main goal with the new API is said to be building a large community of developers in the world of cognitive computing.
[Cognitive computing is] what we believe is the dominant form of computing in the future. We’ve come to the conclusion that this is too big and important to hold to ourselves.
Several Watson cloud platform apps are set for an early 2014 release. The apps include a personal health assistant from Welltok and a tool for recommending patient treatments from MD Buyline. A tool that acts as a personal shopper for online shopping sites is also coming from Fluid.
That personal shopper app will be used on the website of The North Face to give shoppers dynamic text feedback and images based on their questions. If a user types in a question about what they need for a hike in a particular park at a specific time of year, the website will be able to answer with specific products. AT&T previously released an API that tapped into Watson’s speech capabilities.