Recently, I’ve been working on a Skype bot and, in order to add more intelligence to it, I have started to investigate LUIS (Language Understanding Intelligent Service). In this article I would like to share my experience working with LUIS. I’m going to guide you through creating a natural language understanding bot using the Microsoft Bot Framework and LUIS, and then testing it locally with a powerful tool for building bots called the “Bot Framework Emulator”.
Let’s suppose that, as part of a “Smart House” project, we want to implement the user-computer interaction using LUIS. First of all, we need to define our intents and entities. Let’s say that we want to have three intents to begin with (we can add more later on):
- Turn the lights on in a certain room;
- Turn the lights off in a certain room;
- Set the temperature to a certain level.
And the entities for these intents would be “room” and “temperature”. Before we create the application, let’s go through some definitions.
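To summarize, here is the whole plan expressed as a plain JavaScript object. This is an illustration only, not actual LUIS configuration syntax; the names are the ones we will use throughout the article:
[code language="javascript"]
// Illustration only: a summary of the model we are about to train in LUIS.
var smartHouseModel = {
    intents: ['turnLightOn', 'turnLightOff', 'setTemperature'],
    entities: {
        room: 'custom entity, labeled manually in the utterances',
        temperature: 'prebuilt entity, labeled automatically by LUIS'
    }
};
[/code]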
What is a Bot?
A bot is a web service that interacts with users in a conversational format. The conversation can be started from different channels such as Skype, Facebook, or Slack. To give the bot more human-like senses, Microsoft Cognitive Services can be used: LUIS for natural language understanding, Cortana for voice, and the Bing APIs for search.
What is LUIS?
LUIS is a Microsoft service which allows you to create intelligent applications that let end users perform actions using natural language. LUIS is designed to let you very quickly deploy an HTTP endpoint that takes the utterances (phrases given by the user) you send it and processes them in terms of the intention behind them and the entities they affect.
Intent – something the user wants; often it is an action they want to perform.
Entities – the parameters you define during training and then expect the user’s message to contain.
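To make this concrete, when a trained model receives an utterance, the JSON it returns contains the recognized intent(s) and entities. Here is a rough illustration shaped like a classic LUIS response; the exact field names depend on the LUIS API version, and the scores and indexes below are made up:
[code language="javascript"]
// Rough illustration of what LUIS returns for an utterance (values are invented).
var sampleLuisResponse = {
    query: 'turn the bedroom light on',
    topScoringIntent: { intent: 'turnLightOn', score: 0.97 },
    entities: [
        { entity: 'bedroom', type: 'room', startIndex: 9, endIndex: 15, score: 0.92 }
    ]
};
[/code]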
LUIS application
Go to luis.ai, sign in with your Microsoft account and create a new application. After you have created the application you should be able to see the LUIS dashboard:
It’s time to define our intents and entities. Go ahead and choose “Intents” from the left panel and then hit the “Add Intent” button. A modal window opens in which you can define your first intent, i.e. “turnLightOn”:
Now that we have the intent defined, let’s define an entity for it as well, by choosing “Entities” from the left panel and then hitting “Add custom entity”:
Having the intent and the entity, we can add utterances and label the entities in them. This can be done by choosing “Intents” from the left panel, selecting the intent name (“turnLightOn” in this case), typing a new utterance, and pressing Enter. After that you can label the entity by selecting the words in the utterance and choosing an entity from the list (Fig. 4).
In the same way we add the other two intents: “turnLightOff” and “setTemperature”. For the “setTemperature” intent we are going to use a prebuilt entity called “temperature”. We can add it the same way we added the custom entity, except this time we choose “Add prebuilt entity” and then select “temperature” from the list. Labeling of prebuilt entities is done automatically, as in Fig. 5 (the initial utterance is “set the temperature to 20 deg C”):
By adding more utterances to our application, we increase the accuracy of intent and entity predictions, so let’s go ahead and add some more. Here are a few examples of utterances that can be added:
- turnLightOn:
- turn the bedroom light on;
- make sure the light is on in the bathroom;
- turnLightOff:
- can you turn off the kitchen light for me;
- turn light off please in the basement;
- setTemperature:
- adjust the temperature to 24 degrees;
- change the temperature to 19 deg.
Once we have added more utterances, we can publish our application by clicking “Publish App” on the left panel and then the “Publish” button. After a few moments, you will get an endpoint URL that makes your model available as a web service. Copy this URL; we are going to use it later on in our bot application.
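If you want to sanity-check the published model outside of the bot, you can call the endpoint directly: it is a plain HTTP GET that takes the utterance in the q query parameter. Here is a minimal sketch using Node’s built-in https module; the URL is whatever you copied from the “Publish App” page, and the check for an existing q parameter is just a precaution since the exact shape of the copied URL can vary:
[code language="javascript"]
var https = require('https');

// Placeholder: paste the endpoint URL you copied after publishing.
var luisUrl = process.env.LUIS_URL;

// Add (or fill in) the 'q' parameter with the utterance we want to test.
var url = luisUrl + (luisUrl.indexOf('q=') === -1 ? '&q=' : '') +
          encodeURIComponent('turn the bedroom light on');

https.get(url, function (res) {
    var body = '';
    res.on('data', function (chunk) { body += chunk; });
    res.on('end', function () {
        console.log(JSON.parse(body)); // intents and entities detected by LUIS
    });
});
[/code]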
Building the Bot
We need a couple of dependencies in our Node.js application. Get the BotBuilder and Restify modules using npm:
[code language="javascript"]
npm install botbuilder --save
npm install restify --save
[/code]
Here is the basic outline of the Node.js application:
[code language="javascript"]
var restify = require('restify');
var builder = require('botbuilder');

// Setup Restify server
var server = restify.createServer();
server.listen(process.env.port || process.env.PORT || 3978, function () {
    console.log('%s listening to %s', server.name, server.url);
});

// Create chat bot
var connector = new builder.ChatConnector({
    appId: process.env.MICROSOFT_APP_ID,
    appPassword: process.env.MICROSOFT_APP_PASSWORD
});
var bot = new builder.UniversalBot(connector);
server.post('/api/messages', connector.listen());
[/code]
It’s pretty straightforward: we create a ChatConnector, providing it our bot credentials, which can be obtained by registering a new bot. Just a few lines of code are needed to set up the Restify server and hook the bot up to it.
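Note that for local testing with the Bot Framework Emulator you typically don’t need registered credentials yet. A small sketch, assuming the MICROSOFT_APP_ID and MICROSOFT_APP_PASSWORD environment variables are simply not set while developing locally:
[code language="javascript"]
// Local development sketch: with no environment variables set, the connector
// falls back to empty credentials, which is usually enough for the emulator.
var connector = new builder.ChatConnector({
    appId: process.env.MICROSOFT_APP_ID || '',
    appPassword: process.env.MICROSOFT_APP_PASSWORD || ''
});
[/code]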
The BotBuilder SDK is tightly integrated with LUIS. For Node.js, we simply have to create a LuisRecognizer pointed at our model and then pass that recognizer into an IntentDialog:
[code language="javascript"]
// Create a LUIS recognizer that points at our model and add it as the root '/' dialog.
var recognizer = new builder.LuisRecognizer(process.env.LUIS_URL); // the endpoint URL we copied earlier
var dialog = new builder.IntentDialog({ recognizers: [recognizer] });
bot.dialog('/', dialog);
[/code]
In order to handle LUIS intents, we can add handlers to the dialog. A handler can be a function or a sequence of functions called a “waterfall”:
[code language="javascript"]
dialog.matches('turnLightOn', [
    function (session, args, next) {
        // Resolve and store any entities passed from LUIS.
        var room = builder.EntityRecognizer.findEntity(args.entities, 'room');
        session.dialogData.intentScore = args.score;
        // Prompt for the room if LUIS didn't find one.
        if (!room) {
            builder.Prompts.text(session, 'What is the room name?');
        } else {
            session.dialogData.room = room;
            next();
        }
    },
    function (session, results) {
        var room = session.dialogData.room;
        if (results.response) {
            session.send(`The light is turning on in the **${results.response}** { Intent score: ${session.dialogData.intentScore} }`);
        } else {
            session.send(`The light is turning on in the **${room.entity}** { Intent score: ${session.dialogData.intentScore}; Entity score: ${room.score} }`);
        }
    }
]);
[/code]
Whenever the bot receives a message, our LUIS model is automatically called by the SDK and the handler for the top-scoring intent is triggered. The first function in the waterfall looks for the “room” entity. If it finds one, we simply call the next function in the waterfall. If it’s missing, we use the built-in text prompt dialog to ask for a value, which is then passed to the next function in the waterfall.
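The handler for “setTemperature” follows the same pattern; the only difference is that it looks for the prebuilt temperature entity instead of our custom “room”. Here is a sketch: I’m assuming the prebuilt entity arrives with the type name 'builtin.temperature', so check the entity types that args.entities actually contains in your model and adjust if needed:
[code language="javascript"]
dialog.matches('setTemperature', [
    function (session, args, next) {
        // Assumption: the prebuilt entity is exposed as 'builtin.temperature'.
        var temperature = builder.EntityRecognizer.findEntity(args.entities, 'builtin.temperature');
        if (!temperature) {
            builder.Prompts.text(session, 'What temperature would you like?');
        } else {
            session.dialogData.temperature = temperature;
            next();
        }
    },
    function (session, results) {
        var temperature = session.dialogData.temperature;
        if (results.response) {
            session.send(`Setting the temperature to **${results.response}**`);
        } else {
            session.send(`Setting the temperature to **${temperature.entity}**`);
        }
    }
]);
[/code]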
Besides the handlers for the intents we defined, we also need a handler for the None intent. It is triggered whenever LUIS doesn’t recognize any of our defined intents:
[code language="javascript"]
dialog.onDefault(
    function (session, args, next) {
        session.send("I'm sorry, I didn't understand.");
    }
);
[/code]
Testing with Bot Framework Emulator
To test the bot locally, run the application, open the Bot Framework Emulator, and connect it to the bot’s endpoint (http://localhost:3978/api/messages in our case). As you can see from the image above, LUIS has correctly determined the intents and entities with a certain degree of confidence, which can always be increased by adding more utterances to the system.
By using LUIS with the Bot Framework you can very quickly create interactive bots that understand what you want and reply accordingly.