Emojis as Content within Chatbots and NLPs

Emojis are more than just a fun visual graphic πŸ™Œ; they convey meaning. And within a chatbot, an emoji can be used by both the bot πŸ–₯πŸ’ and the bot user πŸ‘© to communicate concrete ideas πŸ’‘. Let's look at how emojis can be used by bots and, more interestingly (IMHO), by users.

What is an Emoji?

I am sure you know what an emoji is in the visual sense 😊. But did you know emojis are encoded in the Unicode Standard? Unicode has included emojis since 2010, which enables them to be widely used.

Emoji is a standardised set of characters that is available on iOS, Android, Windows and OS X. While the artwork for each emoji character varies by platform, the meaning of each symbol remains the same.

-- Emojipedia

What does that mean?

Emojis are equivalent to letters in Unicode, so they are standardized. While they look different on each platform (much the way letters look different in different fonts), they are sent and received by chatbots as text and not as images. This is an important difference (text β‰  πŸ–Ό).

Character                        Unicode    Apple       Google      Facebook
A (Latin Capital Letter A)       U+0041     n/a         n/a         n/a
Ξ” (Greek Capital Letter Delta)   U+0394     n/a         n/a         n/a
πŸ˜ƒ (Grinning Face Emoji)         U+1F603    (artwork)   (artwork)   (artwork)
πŸ‘ (Thumbs Up Sign)              U+1F44D    (artwork)   (artwork)   (artwork)

Emojis Sent from Chatbots

Let's face it, chatbot messages can be boring. If all we see from a chatbot is a series of text messages, it is hard to stay engaged. Along with other visual elements (e.g., Messenger's Button Template and Slack's Interactive Buttons), emojis can help create a more compelling visual experience for the user.

The I will Vote chatbot makes good use of emojis to add some personality to the bot.

From a technical standpoint, adding emojis is simple, since they are just text. Just copy and paste the emoji into your NLP or hard-coded responses, and you are done!
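As a minimal sketch (the dictionary, function name, and the specific emojis are my own, not part of any chatbot framework), hard-coded responses with emojis need no special handling:

```python
# Hypothetical hard-coded bot responses; each emoji is pasted in as plain text.
RESPONSES = {
    "welcome": "Hi there! πŸ‘‹ Is your day going well?",
    "positive": "I'm glad to hear that. πŸ™Œ",
    "negative": "I'm sorry to hear that. πŸ˜”",
}

def reply(key: str) -> str:
    # No encoding tricks or image attachments needed -- it is all just text.
    return RESPONSES[key]

print(reply("welcome"))  # Hi there! πŸ‘‹ Is your day going well?
```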

Emojis Sent to Chatbots by Users

The trickier and more interesting (to me πŸ˜‰) part about emoji usage is how to handle an emoji sent by the bot user. Since most of us use emojis (92% of us, according to Emogi), we need to start thinking about how a chatbot will understand an emoji.

Let's start with a simple example of teaching an NLP (Natural Language Processor) to understand a yes/no response. I will use Facebook Messenger for the chatbot and Api.ai for the NLP.

Setup of Api.ai

First, create two entities to handle the yes/no answers.

Entity: @Answer-Yes

Entity: @Answer-No
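Sketched out as plain data (the field names mirror Api.ai's entity format, and the exact synonym lists here are illustrative, not a dump of a real agent), the two entities look roughly like this:

```python
# Rough sketch of the two entities: a reference value plus its synonyms.
answer_yes = {
    "name": "Answer-Yes",
    "entries": [
        {"value": "Yes", "synonyms": ["yes", "yeah", "yep", "sure"]},
    ],
}

answer_no = {
    "name": "Answer-No",
    "entries": [
        {"value": "No", "synonyms": ["no", "nope", "nah"]},
    ],
}
```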

Then we modify the Default Welcome Intent and add two new Intents: Answer-Positive and Answer-Negative.

Intent: Default Welcome Intent

Intent: Answer-Positive

Intent: Answer-Negative
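The three intents can be summarized as a small table (a Python sketch of my own; the trigger and response texts match the console tests that follow):

```python
# Summary of each intent: what triggers it and how it responds.
intents = {
    "Default Welcome Intent": {
        "user_says": ["hi"],
        "response": "Is your day going well?",
    },
    "Answer-Positive": {
        "user_says": ["@Answer-Yes"],   # fires when the Yes entity matches
        "response": "I'm glad to hear that.",
    },
    "Answer-Negative": {
        "user_says": ["@Answer-No"],    # fires when the No entity matches
        "response": "I'm sorry to hear that.",
    },
}
```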

Test in Api.ai

Now that Api.ai is set up, it is time to test within the api.ai console.

Test 1: Positive Response

The first test will confirm that the agent understands a positive/yes answer.

  • In Try it now..., type: hi
  • Response: Is your day going well?
  • In Try it now..., type: yes
  • Expected:
    • Response: I'm glad to hear that.
    • Intent: Answer-Positive
  • note: the Parameter value is Yes
 

Test 2: Negative Response

The next test is to confirm that the negative/no answer is working correctly.

  • In Try it now..., type: hi
  • Response: Is your day going well?
  • In Try it now..., type: no
  • Expected:
    • Response: I'm sorry to hear that.
    • Intent: Answer-Negative
 

Test 3: Emoji - Fail

Finally, it is time to confirm that api.ai is unable to understand an emoji. For those into TDD, this is the Red/Fail test.

  • In Try it now..., type: hi
  • Response: Is your day going well?
  • In Try it now..., paste: πŸ‘
  • Expected:
    • Response: I'm not sure I follow.
      • or a similar message
    • Intent: Default Fallback Intent
 

Update Api.ai Entity

The next step is to train the NLP to understand the emoji. This only requires adding πŸ‘ as a synonym in the @Answer-Yes Entity.

 

Final Testing

Now that the emoji πŸ‘ is added to the entity, it should work in api.ai and in the chatbot. On to the final tests.

Api.ai Test 4: Emoji - Success

Repeating Test 3 again in the api.ai console shows that the agent now understands the emoji.

  • In Try it now..., type: hi
  • Response: Is your day going well?
  • In Try it now..., paste: πŸ‘
  • Expected:
    • Response: I'm glad to hear that.
    • Intent: Answer-Positive
  • note: the parameter value is Yes; this is because πŸ‘ is a synonom for Yes in @Answer-Yes Entity
 

Test 5: Facebook Messenger

Since it all works inside api.ai, it is time to do the last test within Facebook Messenger.

  • Open the chatbot in Facebook Messenger
  • type: hi
    • response: Is your day going well?
  • type: yes
    • repsponse: I'm glad to hear that.
  • type: hi
    • response: Is your day going well?
  • type: no
    • repsponse: I'm sorry to hear that.
  • type: hi
    • response: Is your day going well?
  • paste or select from emojis: πŸ‘
    • repsponse: I'm glad to hear that.
    • success!

Additional Thoughts

To wrap up, emojis can be included in the content your chatbot sends to users, and chatbots can be trained to understand emojis.

Easy Part - Use Emojis in Content

If the personality of the chatbot allows it, I suggest using emojis. Looking at line after line of content can be confusing and boring. Emojis can add some nice visual cues and flair.

Hard Part - Understanding Emojis from User

By their nature, emojis are imprecise. So when asking a question, some emoji answers will be very clear and others will not:

Would you like me to submit your order?

  • Easier to understand:
    • πŸ‘ πŸ‘Œ
    • πŸ‘Ž
  • Harder to understand:
    • ⏰🐷✈️ (when pigs fly)

So depending on your audience, understanding emoji input may or may not be a priority. However, from a technology standpoint, chatbots and NLPs are ready.