
api-ai/apiai-weather-webhook-sample - a sample webhook implementation in Python (Deploy to Heroku). This is a really simple webhook implementation that takes classification JSON (i.e. the JSON output of the /query endpoint) and returns a fulfillment response. More info about webhooks can be found in the Webhook docs. What does the service do? It's a weather information fulfillment service that uses the Yahoo! Weather API. The service takes the geo-city parameter from the action, performs geolocation for the city, and requests weather information from Yahoo!
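The shape of such a webhook can be sketched in a few lines of Python. This is a minimal illustration, not the repository's actual code: the function names are hypothetical, and the weather lookup is stubbed out; only the request/response JSON shapes follow the legacy API.AI v1 webhook format.

```python
def make_fulfillment(speech_text, source="weather-webhook"):
    """Pack a result into the webhook-compatible response JSON
    expected from a legacy (v1) API.AI fulfillment."""
    return {
        "speech": speech_text,        # spoken/voice response
        "displayText": speech_text,   # text shown on chat surfaces
        "source": source,
    }

def handle_request(body):
    """Handle the classification JSON POSTed by API.AI."""
    # The recognised parameters live under result.parameters.
    city = body.get("result", {}).get("parameters", {}).get("geo-city", "")
    # A real service would geolocate `city` and call the weather API here.
    return make_fulfillment(f"It is sunny in {city}")
```

A request whose `geo-city` parameter is "Paris" would thus produce a response whose `speech` field reads "It is sunny in Paris".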

The service packs the result in the webhook-compatible response JSON and returns it to API.AI.

Empower your home automation experience with AI, NLP and machine learning. What is API.AI? API.AI is a platform that empowers mobile and desktop applications with multilingual voice recognition, natural language understanding, and text-to-speech technologies. It allows developers to create intelligent applications and include natural language interfaces in their products. Users of such applications can create, customise and teach chatbots or “smart agents” using natural language understanding and voice recognition to interact with them.

As an interface to API.AI, Beecon+ provides you multiple input and output methods to listen to, train, talk to and get information from your agent based on what you do, when you do it and where you are. API.AI expands Beecon's actions, voice commands, proximity and location events with the aid of personal agents that learn, assist and communicate with you in your own language. When combined with home automation, smart agents can perform a myriad of autonomous tasks based on proximity, location, manual actions or voice commands. Get started.

API.AI vs Wit.ai: a comparison. This post concerns Greenhouse Group Labs, an innovation program for students established by Greenhouse Group. Labs is an ideal opportunity to test the latest technologies available, while allowing talented young individuals to explore them deeply and come up with groundbreaking solutions. Yesterday's big news in the world of chatbots was Google acquiring API.AI, a company which allows developers to integrate natural language processing and understanding into their applications. Think of it like talking to someone, but instead, that someone is now your phone, and it actually understands what you're saying. Services like API.AI and Wit.ai are capable of processing human speech patterns and filtering useful data like intent and context from it.

Now that API.AI is owned by Google, the battle between API.AI and Wit.ai will probably intensify, as Wit.ai has been Facebook's property since January 5th of last year. But now, no more delays! Cheers!

API.AI: Let's create a Movie ChatBot in minutes – Chatbot's Life. AutoVoice - Natural Language from API.AI. Build a working SMS chat-bot in 10 minutes – Chatbot's Life. Let's build an SMS chat-bot using API.AI, Twilio and PythonAnywhere in about 10 minutes. No server setup, ~50 lines of code, $1.00. Ingredients we'll need: an API.AI account, a Twilio account, and a PythonAnywhere account. Once you've registered for all 3 of the above, we can begin. Please follow the steps in order, as the Twilio setup requires our web app to respond.

Create an ‘agent’ on API.AI and call it ‘MyAgent’. Click ‘Domains’ to import a few conversational domains: “Small Talk” and “Wisdom”. Click on “Intents” and let's create a simple intent and its response: 42. Click “Save”, then test it by using “Try it now…” in the upper right. We now have a simple chat-bot. To call it from our code we'll need its “Client access token”.
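Calling the agent from Python then amounts to one authenticated POST. The sketch below targets the legacy API.AI v1 /query endpoint with the client access token; treat the endpoint version string and field names as assumptions to check against the current docs, since the API has changed over time.

```python
import json
import urllib.request

API_AI_URL = "https://api.api.ai/v1/query?v=20150910"  # legacy v1 endpoint

def build_query(text, session_id, token):
    """Build the HTTP request for API.AI's /query endpoint."""
    payload = json.dumps(
        {"query": text, "lang": "en", "sessionId": session_id}
    ).encode()
    return urllib.request.Request(
        API_AI_URL,
        data=payload,
        headers={
            "Authorization": f"Bearer {token}",  # the agent's client access token
            "Content-Type": "application/json",
        },
    )

def ask_agent(text, session_id, token):
    """Send the user's text and return the agent's spoken reply."""
    with urllib.request.urlopen(build_query(text, session_id, token)) as resp:
        body = json.load(resp)
    return body["result"]["fulfillment"]["speech"]
```

Wired into a Twilio SMS webhook, `ask_agent(incoming_sms_text, sender_number, TOKEN)` would produce the reply to send back.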

Building a Google Action Using API.AI – Bot Tutorials. After buying my Google Home at launch, I knew instantly that I wanted to be able to build conversation actions with it. Unfortunately, in the beginning there was very limited functionality, and the best I was able to do was integrate my smart plug with IFTTT. Luckily we now have a lot more support, and along with the Actions API, Google offers a service called API.AI. This allows us to build our conversation actions much more easily, and today we're going to build a simple conversation to learn when the bus is coming. So first let's create our API.AI account and start our first “agent”. I named my agent “BusTracker”. After that we are brought to a screen with the default “intents”. Intents are how we build the interaction between what a user says and how the system responds.
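The idea behind an intent can be illustrated with a toy sketch. API.AI's real matching is machine-learned, not exact-match, and the phrases and responses below are made up for the BusTracker example; this only shows the mapping from example user phrases to a response.

```python
# Toy illustration of intents: each intent pairs example user phrases
# with a response. (API.AI generalises beyond the training phrases;
# this sketch only does exact matching on lowercased text.)
INTENTS = {
    "welcome": {
        "phrases": ["hi", "hello", "hey"],
        "response": "Hi! Ask me when the bus is coming.",
    },
    "next_bus": {
        "phrases": ["when is the bus coming", "next bus"],
        "response": "The next bus arrives in 5 minutes.",
    },
}

def match_intent(user_text):
    """Return (intent_name, response) for the user's text."""
    text = user_text.strip().lower()
    for name, intent in INTENTS.items():
        if text in intent["phrases"]:
            return name, intent["response"]
    return "fallback", "Sorry, I didn't get that."
```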

Let's click on the “Default Welcome Intent” to edit it and give our agent some default start statements. Now let's go enable our Google Assistant integration. Very basic, but we're off to a good start!

Building Chatbots with API.AI and GRAKN.AI – GRAKN.AI. Building Conversational Chatbot Interfaces with API.AI - Google Slides. Building Google Action with JavaScript – Daniel Gwerzman. It's Time to Take Some Action on Google Action (…pun intended 😀). At Google I/O 2017, the company showed off Actions on Google (or Google Action), and announced a Developer Challenge to create new Actions for Google Assistant on Google Home.

Since this is still relatively new tech, I'd like to share a short tutorial with you about how you can create your very own actions. I will use Dialogflow (a.k.a. API.AI), a Conversational User Experience Platform, to help us create an agent/bot that we can communicate with in human language (voice or text), and Node.js to build out the webhook/server side. Our New Action: Math Trainer! The idea for this action came to me when Google Home joined our family. The kids immediately adopted “her” and started communicating with “her” on a daily basis. Since we homeschool our children, many of the requests to Google Home were educational.

With that in mind, we can get started: defining our Action, building the Math Trainer, creating Intents. Congratulations!

Building Rich Cross-Platform Conversational UX with API.AI (Google I/O '17). Building your Conversational Bot with Go and API.AI. With all the hype of conversational bots making the rounds, it's good to know what is involved and how much effort it takes to create a simple conversational bot. Here let's build a bot and integrate it with Messenger. To keep it simple, we will build FruitBot, which orders food and lets you check on its delivery status. Creating the Bot in API.AI: log in to API.AI and create a New Agent. Before we begin, let's draft out some conversation flow and understand what we are going to build:

me: hi
bot: hi, welcome to fruit bot!
me: what is the status of my order
bot: your order will be delivered by tomorrow 9:00 pm

Not the greatest of bots, but this will do for this tutorial. Intents: bots are mainly composed of 3 components.

Greeting: Hi, Hello, Hey
AddFruit: I would like apples today
Checkout (Yes/No): yes
Status: what is the status of my order

The Greeting intent seems straightforward. To build AddFruit, though, we need to identify the fruits, so we need to create an entity which can recognize them: a Fruits entity, plus a context.

Chatbot Tutorial - API.AI webhook NodeJS - echo sample app - Code and Demo. Creating a NodeJS based Webhook for Intelligent Bots. Creating a Simple Facebook Messenger AI Bot with API.AI in Node.js – GirlieMac Blog. Hey, happy new year!!! Previously, I created an HTTP Status Cats bot for Slack (and its tutorial on Medium), and this time I tried Facebook Messenger with some interesting 3rd-party APIs, and I decided to give API.AI a try. As you may have heard, API.AI, which was recently acquired by Google, provides a conversational platform for natural language processing, and it allows us to create bots easily.
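The Fruits entity idea can be sketched as follows. In API.AI an entity maps synonyms to a canonical value, and intent parameters are filled from entity matches; the fruit list here is an invented stand-in, and the matching is a crude word-level approximation of what the platform does for you.

```python
# Toy sketch of the Fruits entity: canonical value -> synonyms.
FRUITS = {
    "apple": ["apple", "apples"],
    "banana": ["banana", "bananas"],
    "orange": ["orange", "oranges"],
}

def extract_fruits(user_text):
    """Return the canonical fruit values mentioned in the text."""
    words = [w.strip(".,!?") for w in user_text.lower().split()]
    found = []
    for canonical, synonyms in FRUITS.items():
        if any(w in synonyms for w in words):
            found.append(canonical)
    return found
```

So the AddFruit phrase "I would like apples today" yields the parameter value `apple`.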

Writing apps with these services isn't hard; however, it requires some time reading the docs to figure out how to set them up, so I would like to share my experiences in this tutorial so that hopefully you can write your bot in less time. There are two major parts: setting up a Facebook Messenger app and writing the webhook, and using the Small Talk domain and creating custom Intents. My step-by-step instructions use Node.js, so if you'd like to follow the how-to, make sure Node.js is installed on your machine. The source code (on the tutorial-01 branch) is on GitHub. 1. Writing a Webhook with Express.js 2.

DialogFlow (API.AI) tutorial: Getting started with webhooks on Heroku - Mining Business Data. Ebook. Exploring Dialogflow: Understanding Agent Interaction. Dialogflow is a powerful tool that allows us to create conversational tools without the complications of needing to handle natural language processing ourselves. But before we dive into the platform, it's important to understand all of the different concepts that tie together to create the conversational agents we can build.

When I started exploring the platform I jumped in without knowing what was what, so in this article I want to quickly run through each of the concepts to help provide some foundational understanding of the platform. Just as you would say hello to a friend before conversing with them, invoking an agent on the Actions platform works the same way: it kicks off the experience with our agent in a conversational manner. At this point the user is requesting to speak to our agent, and this invocation is detected using the recognisable terms that we define in the Dialogflow console.

Handling Permissions with DialogFlow and Actions on Google. The “user_info” intent: this intent will be triggered “automagically” when the user responds to the question: “To locate you, I'll just need to get your street address from Google. Is that ok?”. In fact, when Actions on Google asks the question and the user responds “yes” or “no” (grants or declines), Actions on Google sends an event called “actions_intent_PERMISSION” to DialogFlow. We'll use that event to trigger this particular intent. In the application, we'll register the “user_info” action and make sure to check whether the user has granted or declined the permissions.
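The grant/decline check can be sketched like this. The field names below (`originalRequest.data.inputs[].arguments[]` carrying a `permission_granted` argument) are taken from the legacy v1 Actions on Google webhook format and should be verified against the current request shape; the function itself is hypothetical, not from the article.

```python
def permission_granted(webhook_request):
    """Check whether the user granted the requested permission.

    Assumes the legacy (v1) Actions on Google request shape, where the
    grant arrives as a `permission_granted` argument on the input that
    carried the actions_intent_PERMISSION event.
    """
    inputs = (webhook_request.get("originalRequest", {})
                             .get("data", {})
                             .get("inputs", []))
    for inp in inputs:
        for arg in inp.get("arguments", []):
            if arg.get("name") == "permission_granted":
                return arg.get("text_value") == "true"
    return False  # no permission argument found: treat as declined
```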

That's it! You can test your action in the Actions Simulator. Bonus: Did you know?

Hands-on with API.AI & Google Assistant: Writing your first Conversation Agent. This is an update to my previous API.AI workshop material that was published here. This update takes into consideration changes in the development flow and a much closer integration between API.AI and Google Cloud Platform. Feel free to use this material as part of your workshop. Let a thousand actions bloom! You: “Ok Google, how do we write our first action for Google Assistant?” Me: Let's start. Prerequisite: sign up for Google Cloud Platform (GCP). In case you have an existing GCP account, you are good to go. Overview and Purpose: this hands-on lesson provides a step-by-step guide to creating a Feedback Agent using natural conversation. Sample Conversation: before we begin with API.AI, we can take a look at a sample conversation. The conversational flow that we shall come up with is given below, with a sample user, User1, and the Feedback Agent. User initiates the conversation… Feedback Agent: Hi!

User1: I would like to leave some feedback. User1: I went to your Mumbai resort. Optional: Hands-on with API.AI & Google Assistant: Writing your first Conversation Agent.

How to Build Your Own AI Assistant Using API.AI. The world of artificially intelligent assistants is growing: Siri, Cortana, Alexa, Ok Google, Facebook M. All the big players in technology have their own. However, many developers do not realise that it is quite easy to build your own AI assistant too! You can customise it to your own needs, your own IoT-connected devices, your own custom APIs; the sky is the limit. Important Update (20th Nov 2016): It appears that API.AI now charges for access to their pre-built domains, so access to a lot of the earlier available info will now require signing up to one of their subscriptions. The API has also changed, so previously built bots using this guide may no longer work. I've updated the guide below where I could to get it up to date for those starting the process today.

Late last year, I put together a guide on five simple ways to build artificial intelligence in 2016, where I covered a few of the simple options out there for building an AI assistant. What is API.AI? The Code.

Singlish Marathon Chatbot with NodeJS, AWS and API.AI. With the popularity of chatbots and AI, many tech giants have released APIs to allow developers to build their own chatbots / AI / recommendation engines easily, without any knowledge of machine learning.

Amazon released Lex, while Facebook and Google acquired Wit.ai and API.AI respectively. To brush up my knowledge of NodeJS, AWS and AI toolkits, I decided to build my own chatbot. The chatbot should be able to return users a list of running races, with a link to the event website and the date of each race. And being built in Singapore, it would be great to have some Singapore slang built in.

I first dabbled with Wit.ai; however, I found that its learning curve is steeper than API.AI's, as training the bot and defining entities and intents are not very intuitive. (There is a very useful comparison of Wit.ai, API.AI and Microsoft LUIS on Stack Overflow.) Wit.ai uses terms such as “Bot Engine” and “Stories” to train the bot. API.AI, on the other hand, was very intuitive.

Slot Filling – the easiest way to build a dialog for information collection · API.AI. Tutorial: Getting Started with Google Actions with API.AI. Chatbots are all the rage these days. At a high level, they can be rule-based or AI-based. The general consensus is that AI-based chatbots are hard to write, but a combination of NLP and machine learning in your chatbot can take it to the next level, beyond the current rule-based chatbots that are very rigid, though they do perform their intended action. Google opened up the Google Assistant platform for developers in December, and currently the platform supports building out Conversation Actions for the Google Home device.

It is widely expected that the same Actions will eventually be available across Google's other devices and applications. In my latest tutorial, published on ProgrammableWeb, I provide an introduction to how you can get started with Google Actions using an excellent platform, API.AI, which Google acquired last year. The tutorial covers the following: Tutorial Link: I wish you a happy time writing your first Google Action and getting introduced to API.AI.

Tutorial: Build an AI Assistant with API.AI and Amazon Lambda - RaizException – Raizlabs Developer Blog. Tech giants are betting big on conversational interfaces; Facebook acquired Wit.ai, Google acquired API.AI, and Amazon announced Lex.

These services all make it easy to parse user intention from natural language. In this tutorial, we'll demonstrate how to connect a conversational interface with a third-party API. These steps will enable you to build rich experiences for Google Home, Amazon Echo, Microsoft Cortana, Facebook chatbots, Slack bots, or another AI assistant. For this example, we'll build a Google Home action that tells us the height of Star Wars characters. 1. Sign up for API.AI. Name: StarWars. Description: A conversational assistant that can tell you the height of various Star Wars characters. Click “Save.” 2. An intent describes something a user is asking for. Add a new intent by clicking the “+” next to “Intents.” Populate the intent. Notice how the Height Intent is correctly identified, even though we didn't train that exact wording. 3. Sign up for Amazon AWS. 4. 5. 6.

Using API.AI with Microsoft Bot Framework.
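The core of the Height Intent fulfillment can be sketched as a pure function. This is an illustration, not the tutorial's Lambda code: the `character` parameter name and the hardcoded heights are assumptions, and a real fulfillment would fetch heights from a third-party Star Wars API instead of a dict.

```python
# Hypothetical stand-in for the third-party API lookup.
HEIGHTS_CM = {"luke skywalker": 172, "darth vader": 202, "yoda": 66}

def height_fulfillment(webhook_request):
    """Answer the Height Intent with a webhook-compatible response."""
    character = (webhook_request.get("result", {})
                                .get("parameters", {})
                                .get("character", "")).lower()
    height = HEIGHTS_CM.get(character)
    if height is None:
        speech = f"Sorry, I don't know how tall {character} is."
    else:
        speech = f"{character.title()} is {height} centimeters tall."
    # Legacy v1 fulfillment response shape.
    return {"speech": speech, "displayText": speech, "source": "starwars-webhook"}
```

Asking “How tall is Yoda?” would then come back as “Yoda is 66 centimeters tall.”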