How to Build Your Own AI Assistant Using Wit.ai

The world of artificially intelligent assistants is growing. Siri, Cortana, Alexa, Ok Google, Facebook M: all the big players in technology have their own.


However, many developers do not realise that it is quite easy to build your own AI assistant too! You can customise it to your own needs, your own IoT connected devices and your own custom APIs. The sky is the limit.

Important Update (20th Nov 2016): it appears that Wit.ai now charges for access to their pre-built domains, so access to a lot of the earlier available info will now require signing up to one of their subscriptions. The API has also changed, so previously built bots using this guide may no longer work. I have updated the guide below where I could, to keep it up to date for those starting the process today.

Late last year, I put together a guide on five simple ways to build artificial intelligence in 2016, where I covered a few of the simple options out there for building an AI assistant.

Exhaustive list of possible grain values – Chatbot's Life

I was working on my personal assistant Fragments and needed some date parsing, so I went to Wit.ai.

The Wit entity datetime has a very helpful property called grain, which tells us the precision with which the time was described by the user. For example, "I met Rachel yesterday" has a grain of day, whereas "I met Rachel yesterday at 8" has a grain of hour. Similarly, "I met Rachel yesterday at 4:32" has a grain of minute, and so on.

Here is why you should care about grain: suppose the user sends the message "Our school's Founders' Day is on Tuesday next week". An absolute datetime, precise to the second, is not relevant in all conversations; grain tells you how precisely to present the time back to the user.
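To make this concrete, here is a minimal sketch of grain-aware formatting with java.time. The helper name formatForGrain and the output patterns are my own assumptions for illustration, not part of the Wit API:

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.util.Locale;

public class GrainFormatter {
    // Hypothetical helper: formats a parsed datetime according to the
    // Wit.ai grain value ("day", "hour", "minute", ...). The patterns
    // are illustrative choices, not anything Wit prescribes.
    public static String formatForGrain(LocalDateTime value, String grain) {
        switch (grain) {
            case "minute":
                return value.format(DateTimeFormatter.ofPattern("d MMM yyyy, HH:mm", Locale.ENGLISH));
            case "hour":
                return value.format(DateTimeFormatter.ofPattern("d MMM yyyy, ha", Locale.ENGLISH));
            case "day":
            default:
                // Coarser grains (day, week, month, year) just show the date here.
                return value.format(DateTimeFormatter.ofPattern("d MMM yyyy", Locale.ENGLISH));
        }
    }

    public static void main(String[] args) {
        LocalDateTime t = LocalDateTime.of(2016, 12, 27, 0, 0);
        System.out.println(formatForGrain(t, "day"));    // 27 Dec 2016
        System.out.println(formatForGrain(t, "minute")); // 27 Dec 2016, 00:00
    }
}
```

In a real assistant you would branch on the grain returned with the datetime entity before rendering it back to the user.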

Depending on the grain, you can then decide how much of the datetime to show:

    if (grain.equals("month")) {
        // print only the date
    } else if (grain.equals("hour")) {
        // print only up to the hour, like 27th Dec, 2016, 12am
    } else if (grain.equals("minute")) {
        // print up to the minute, like 27th Dec, 2016, 12:00am
    }

I couldn't find an exhaustive list of all possible grain values anywhere in the Wit or Duckling documentation. The possible values are: second, minute, hour, day, week, month and year.

Wit — Quick start

In this section, we will show you how to build your first conversational app. Sign up with GitHub or Facebook: go to the home page and sign in with your GitHub or Facebook account.

Voila! You now have access to the Wit console, the go-to site to configure and train your app. Start by thinking about what kind of conversations people will have with your bot. Let's build a simple bot that gives weather forecasts: click Create Story. In this story, we have taught several things to Wit, including (steps 2 to 5) the natural language sentence "What's the weather in Rome?"

Testing the story in the Chat window: let's test this first story. Click on the robot at the bottom-right of the console to open the Chat window and type "how is the weather in Paris?"

Adding a branch to the story: what happens if the user says "What's the weather" and forgets to provide a location? Click on the branch icon next to forecast in the story.
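Under the hood, the Chat window is sending your utterance to Wit's /message endpoint. Here is a sketch of the same query built from Java; WIT_TOKEN is a placeholder for the server access token from your app's settings, and the request still needs to be sent with an HttpClient:

```java
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpRequest;
import java.nio.charset.StandardCharsets;

public class WitQuery {
    // Builds a GET request against Wit.ai's /message endpoint.
    // The caller sends it with java.net.http.HttpClient and parses
    // the JSON body (intents, entities) that comes back.
    static HttpRequest buildMessageRequest(String utterance, String token) {
        String q = URLEncoder.encode(utterance, StandardCharsets.UTF_8);
        return HttpRequest.newBuilder(URI.create("https://api.wit.ai/message?q=" + q))
                .header("Authorization", "Bearer " + token)
                .GET()
                .build();
    }

    public static void main(String[] args) {
        HttpRequest req = buildMessageRequest("how is the weather in Paris?", "WIT_TOKEN");
        System.out.println(req.uri());
    }
}
```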

By creating a branch in our story, we can handle the case where the user omits the location and have the bot ask for it instead.

Wit — Recipes

Natural Language Processing (NLP) allows you to understand the meaning of a user input.

You may need it as a stand-alone layer to parse text or speech into structured data. You will also need NLP to build a conversational app that can understand the user query and extract meaningful information out of it.

Categorize the user intent

Problem: you want to understand what your end-user wants to perform, for example: ask about the weather, book a restaurant, open the garage door. The problem is that there are millions of different ways to express a given intent, such as "What is the weather in Paris?"

Recipe: go to the Understanding tab in the Console. In the Try out an expression field, type a sentence followed by Enter, for example "What is the weather?"
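Once the app is trained, the /message endpoint returns the categorized intent in a JSON response. The sample payload and the string-scanning helper below are illustrative assumptions only; in real code you would use a proper JSON parser:

```java
public class IntentExtractor {
    // Naive illustration: pulls the first intent name out of a
    // Wit-style JSON response without a JSON library. Do not use
    // this in production; reach for a real parser instead.
    static String firstIntent(String json) {
        int i = json.indexOf("\"name\"");
        if (i < 0) return null;
        int start = json.indexOf('"', json.indexOf(':', i) + 1) + 1;
        int end = json.indexOf('"', start);
        return json.substring(start, end);
    }

    public static void main(String[] args) {
        // Hypothetical response for "What is the weather in Paris?"
        String sample = "{\"intents\":[{\"id\":\"1\",\"name\":\"get_weather\",\"confidence\":0.99}]}";
        System.out.println(firstIntent(sample)); // get_weather
    }
}
```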

Of course, it's not really useful to extract an intent if there is only one possible value.

Extract date and time

Wit.ai Explained — Part 1 — Bot Engine, Stories, and Actions

I've been experimenting with bots for a couple of years now, and I've gone through IBM Watson's Dialog API and Wit.ai for NLU as a service.

Somehow, I keep coming back to Wit.ai.

What Wit Used to Do

For the longest time, what Wit.ai did for you was provide an API that takes your text or voice input and returns intents and entities. Intents are pretty self-explanatory: an intent is simply what the user intends to do. Depending on the intent and entities you get from user input, your application can take actions or ask more questions to fulfill the user's request.

What Wit Does Now

However, sometime in April, Wit.ai released something they call Bot Engine, and it changed everything. There is now a new paradigm called "Stories". I'm going to show you all the new things Stories brings to the table and explain how they are used.
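The action-taking side of that loop can be sketched as a simple dispatch on the extracted intent. The intent and entity names here ("get_weather", "location", "book_restaurant") are illustrative assumptions, not anything Wit defines for you:

```java
import java.util.Map;

public class Dispatcher {
    // Sketch: given an intent name and its entities, either take an
    // action or ask a follow-up question to fill the missing slot.
    static String handle(String intent, Map<String, String> entities) {
        switch (intent) {
            case "get_weather":
                String where = entities.get("location");
                return where == null
                        ? "Which city do you want the forecast for?"
                        : "Fetching the forecast for " + where + "...";
            case "book_restaurant":
                return "Looking for a table...";
            default:
                return "Sorry, I didn't understand that.";
        }
    }

    public static void main(String[] args) {
        System.out.println(handle("get_weather", Map.of("location", "Rome")));
        System.out.println(handle("get_weather", Map.of()));
    }
}
```

The same pattern scales to however many intents your app is trained on; only the slot-filling questions change.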