Empowering Your Assistant with Entities

Here at SitePoint, we’ve looked at the basics of setting up your own personal assistant using Api.ai and delved further into intents and context. In this article, we’ll be going one step further in the process, teaching our assistant completely custom concepts using entities.

What is an Entity?

An entity is a concept we want our personal assistant to understand when it is mentioned by the user in conversation. Each entity has a range of values and properties that contain the terms the assistant will need to understand to respond to this concept.

There are three types of entities in Api.ai:

  • System – entity types defined by Api.ai, such as date, color, email, number and so on, which Api.ai already understands. You can find a full list of these entities in Api.ai’s documentation on System Entities.
  • Developer – entities which we create for our individual needs. These are what we will be focusing on in this article.
  • User – entities created for individual users while they use the assistant, which can be generated by the Api.ai API for use within a single session. We won’t be covering these in this article, but if there’s enough reader interest, we might explore them in future!

Api.ai’s pre-defined domains (see our very first article on this topic) are examples of a whole range of pre-built entities that also come with pre-built intents for how to access them. When we add entities into our assistant, we are expanding it into areas that domains do not currently cover and training our assistant to do something unique to our personal needs.

For example, an entity of “superhero” is not something Api.ai knows about. We could train our assistant to understand a range of superheroes and their various names — “Superman”, “Batman”, “The Flash”, “Green Lantern”, “Wonder Woman”, “Santa” and so on. It could then understand that these are specific concepts we want to trigger actions with, such as contacting these heroes via an API when villains strike and we say things like “We need The Flash!”.

We can also teach our assistant synonyms for each of these, so that alongside names like “Superman”, it understands that Superman is also known as “Kal-El”, “The Man of Steel”, “Supes” and “Smallville”. If we use a different name in the spur of the moment (or someone else requests help from our assistant and calls them something else), help from our hero will still come!

While I’d have loved to keep that entity example going for the whole article, I thought it might be best to focus on a more realistic example in the demo itself! In our demo, we will teach our assistant to understand one important metric I get from my Jawbone Up — sleep. The end goal is for our assistant to understand statements like “How many hours of sleep did I get last night?” and “How much deep sleep did I get last night?”.

In this article, we will look at the first step of this process — setting up the entities required for our assistant to understand these statements. In a follow-up article, we will look at connecting our assistant web app to third-party APIs to give it the information it needs to respond.

Creating a New Entity

To create a new entity, we open the Api.ai console and go to the “Entities” page using the menu on the left. We then create an entity by clicking either “Create Entity”, the plus symbol on the “Entities” menu item, or the “Create the first one” link which appears for those who have yet to create an entity:

Creating a new entity

In the page which appears, we enter our entity name. A common convention is to write this in lowercase with words separated by dashes. We call our new entity “sleep”. We leave “Define synonyms” checked and enter one term in the section below it — “sleep”. We can also add synonyms, so we cover a few more options by entering “rest”, “doze” and “shut-eye” next to “sleep”. You add each synonym by pressing either the Enter key, Tab key or semicolon (;) key. When done, click Save:

The New Entity options

If we return to the “Entities” page, our new entity is shown with the name we will use to access it — @sleep.

Our saved sleep entity

Using Our Entity in an Intent

We now need to create a new intent that will train our personal assistant to recognize the sentences which trigger our sleep related requests. We start by heading to the “Intents” page and creating a new intent.

On our new intent page, we include our entity within “User Says” statements like so — @entity-name:alias. In the case of our sleep entity, we call it @sleep:sleep (the second parameter is the alias, which can later be used as $sleep, though that is a bit beyond the scope of this article). When we include our entity within a user statement like “How many hours of @sleep:sleep did I get last night?”, it is automatically added to the parameter section below it:

Referring to our entity in our intent

Above those parameters, we have a field for the action name. This is the name that will be passed to our web app to show what Api.ai thinks the user wants to do. We name our action “sleepHours”:

Naming our action sleepHours

We then can add a variety of different ways to say the same sort of statement, just as we did in the previous article on creating intents:

Variations on our user says statements

To finish up our intent, we set up some responses to our intent about sleep hours. The assistant within Api.ai cannot look up the stats itself; we will need our own web app for that. However, it is nice for the assistant to at least keep up the illusion that it is doing all the work. To do this, our responses say things like “I’ll retrieve your sleep stats for you now, one moment!” and “Looking up your sleep hours now.” This also gives our web app a bit of time to retrieve that data.

Our potential speech responses

Once we have our responses defined, we click Save!

One thing you might have noticed when scrolling up is that Api.ai has also automatically recognized “night” as a built-in $time-period system entity. This is because we have the “Machine learning” feature turned on. It can be quite a neat feature; however, in our particular situation it isn’t quite as helpful.

Machine learning time period recognition

If we test out our new intent by saying “How many hours of rest did I get last night?”, our assistant now returns a correct speech response and the action of “sleepHours” ready for our web app to respond. This is exactly what we want!

Our assistant returning sleepHours successfully

Expanding Our Entity

We have a working entity that lets our assistant understand when we want to look up how many hours of rest we’ve had, but the entity is still quite simple. Sleep is just sleep. Rest. Shut-eye. In reality, there are specific types of sleep a user might ask about. What if the user asks “How many hours of REM sleep did I get last night?” “REM sleep”, “deep sleep” and “light sleep” are all different types of “sleep” that should be understood by our sleep entity. We will add those in.

We return to the Entities page and open the @sleep entity. Underneath “sleep” and its synonyms of “sleep, rest, doze, shut-eye”, we add new types of sleep such as “REM sleep” (also just called “REM”), “deep sleep” and “light sleep”. We include these as new rows, as they have distinct meanings and aren’t exactly the same as the generic term of “sleep”. To add a new row, click “Add a row”. Once we have added our new forms of sleep, we click “Save” to save the changes:

Adding different types of sleep
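Conceptually, the expanded entity is just a mapping from canonical values to their synonyms. As a rough illustration (the object shape and function below are our own sketch for this article, not Api.ai’s actual API), it could be modelled in JavaScript like this:

```javascript
// A sketch of the expanded @sleep entity: each row maps a canonical
// value to the synonyms that should resolve to it.
const sleepEntity = {
  name: "sleep",
  entries: [
    { value: "sleep", synonyms: ["sleep", "rest", "doze", "shut-eye"] },
    { value: "REM sleep", synonyms: ["REM sleep", "REM"] },
    { value: "deep sleep", synonyms: ["deep sleep"] },
    { value: "light sleep", synonyms: ["light sleep"] }
  ]
};

// Resolve any synonym (case-insensitively) to its canonical value,
// or null if the term isn't part of the entity.
function resolveSleepType(term) {
  const entry = sleepEntity.entries.find(e =>
    e.synonyms.some(s => s.toLowerCase() === term.toLowerCase())
  );
  return entry ? entry.value : null;
}
```

So, for example, resolveSleepType("REM") gives back the canonical value "REM sleep" — which is essentially what Api.ai does for us when it matches a synonym in a user’s sentence.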

If we now try a more specific sentence, like “How many hours of REM did I get last night?”, our assistant still recognizes the request as “sleepHours” but also includes the sleep parameter of “REM sleep” to tell our web app which particular type of sleep the user is asking about:

Testing our assistant by asking them about REM sleep

If you click the “Show JSON” button underneath, you’ll see where the power of this truly comes into play. All of this information is returned to our web app in an easy-to-interpret JSON response that looks like so:

{
  "id": "7438b5d8-981e-43f4-8a5f-e10be158bab4",
  "timestamp": "2016-02-06T01:19:45.271Z",
  "result": {
    "source": "agent",
    "resolvedQuery": "How many hours of REM did I get last night?",
    "action": "sleepHours",
    "actionIncomplete": false,
    "parameters": {
      "sleep": "REM sleep"
    },
    "contexts": [],
    "metadata": {
      "intentId": "25d04dfc-c90c-4f55-a7bd-6681e83b45ec",
      "intentName": "How many hours of @sleep:sleep did I get last night?"
    },
    "fulfillment": {
      "speech": "Looking up your sleep hours now."
    }
  },
  "status": {
    "code": 200,
    "errorType": "success"
  }
}

The most important bits of which are the action name that has been requested and the parameters for that action:

"action": "sleepHours",
"parameters": {
  "sleep": "REM sleep"
}

This is what we will use in our next article to build up our web app’s response to these queries.
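To give a rough idea of what that will involve: the web app only needs to read the action and parameters out of the JSON above to decide how to respond. Here is a minimal, hypothetical sketch (the handleApiAiResult function is our own illustration, not part of Api.ai, and a real app would look the stats up via the Jawbone API):

```javascript
// A minimal sketch of routing an Api.ai response in a web app.
// `response` is the parsed JSON object shown above.
function handleApiAiResult(response) {
  const result = response.result || {};
  const params = result.parameters || {};

  if (result.action === "sleepHours") {
    // The "sleep" parameter tells us which kind of sleep was asked
    // about, e.g. "sleep", "REM sleep", "deep sleep" or "light sleep".
    const sleepType = params.sleep || "sleep";
    return "Fetching your " + sleepType + " hours now.";
  }

  return "Sorry, I don't know how to handle that action yet.";
}
```

Passing in the JSON response from our “How many hours of REM did I get last night?” test would route to the “sleepHours” branch with a sleep type of “REM sleep”.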


Incredibly, after three articles looking into Api.ai, we have still just scratched the surface of what is possible with the platform! Entities can contain other entities (both system entities and your own developer entities), we can set up intents which require certain information and prompt the user if they do not provide it, we can use our previously mentioned entities in a conversation using their $alias… and so on!

In an upcoming article at SitePoint, we will look at adding functionality to the web app we created in our earlier article on How to Build Your Own AI Assistant Using Api.ai that pulls in Jawbone data to give us real answers to these queries!

