Introducing TJBot – An open source maker kit connected to Watson Services.


So, the past few days (months, actually) have been spent preparing for the Watson Developer Conference, where a really special project was unveiled – TJBot. I have had the incredible good fortune of working as the technical lead (software) in creating TJBot, and we are all super excited to share this project with the entire open source community. The project is the brainchild of my colleague Maryam Ashoori, and we worked with an amazing industrial designer colleague of ours (Aaron) who helped design TJ!

TJBot is an open source project designed to help you access Watson Services in a fun way. You can 3D print it or laser cut it, then use one of its recipes to bring it to life! #TJBot

TJBot can be laser cut from cardboard/chipboard (the designs are open source and can be downloaded here). You can also 3D print it (download the 3D files here). On the inside, TJ has:

  • A Raspberry Pi 3
  • A USB microphone
  • A Raspberry Pi camera in its left eye
  • An RGB LED on its head
  • A Bluetooth speaker

See the video render below to get an idea of how TJ is assembled from laser-cut cardboard.

TJBot and Embodied Cognition

The field of embodied cognition suggests that human decision-making and reasoning are a function of our thought process (our brains), our bodies, and the physical world. When we think and act, our physical body and our environment often act as integral components or extensions of this process. A project like TJBot, which integrates IBM Watson cognitive services, provides an opportunity to study and implement embodied cognition within everyday objects. Democratizing this process and bringing it within reach of the entire community will allow us to explore a diverse spectrum of community-generated use cases, as well as study the dynamics of interaction with such devices. At the Watson Developer Conference, IBM also released an experimental middleware application – Project Intu – to integrate various cognitive services, in essence simplifying the implementation of embodied cognition.

What can you do with TJBot?

Starting today, sample code showing how to program your TJBot is available on GitHub in the form of recipes.

Recipes are step-by-step instructions that help you connect your TJ to Watson services. The recipes are designed to run on a Raspberry Pi. You can either run one of the available recipes or create your own.
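
To give you a feel for the shape a recipe takes before you open the repository, here is a minimal sketch in Node.js. The function names below (readMicrophone, callWatsonService, actOnResult) are placeholders I'm using purely for illustration – they stand in for the SDK and hardware calls the real recipes make, and are not code from the recipes themselves.

    // Hypothetical skeleton of a TJBot recipe: gather some input,
    // send it to a Watson service, and act on the result through
    // TJ's hardware on the Raspberry Pi.

    function readMicrophone() {
      // Placeholder: capture audio from the USB microphone.
      return Promise.resolve('fake audio buffer');
    }

    function callWatsonService(input) {
      // Placeholder: send the input to a Watson service (e.g. Speech to Text)
      // using your service credentials and resolve with its response.
      return Promise.resolve({ transcript: 'turn the light yellow' });
    }

    function actOnResult(result) {
      // Placeholder: drive the LED, speaker, or camera based on the response.
      console.log('Watson heard: ' + result.transcript);
    }

    readMicrophone()
      .then(callWatsonService)
      .then(actOnResult)
      .catch(function (err) { console.error(err); });

The real recipes follow this same sense – send to Watson – act pattern, with the placeholders replaced by actual microphone, Watson SDK, and GPIO code.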

Today, we have created recipes to:

  • Make TJBot respond to emotions. The RGB LED on TJBot’s head changes color based on the public sentiment of a given topic on Twitter. The recipe connects to the Twitter API to fetch tweets and runs Watson Tone Analyzer to identify the overall sentiment. For example, you could program TJBot to track the real-time social sentiment of a major awards show, like the #Emmys (a rough sketch of this flow appears after the list).
  • Use your voice to control TJBot. Using your voice, you can give TJBot basic commands. For example, you could ask TJBot to “turn the light yellow”, and it will change the color of its light. TJBot uses the Watson Speech to Text API to transcribe, analyze, and understand what you are saying.
  • Chat with TJBot. Using three Watson APIs, this recipe creates a “talking” bot. In a three-step process, the Watson Speech to Text API converts your voice to text, Watson Conversation processes the text and calculates a response, and Watson Text to Speech then converts the response back to audio, allowing TJBot to reply. Based on how you program your Raspberry Pi, you can talk to TJBot about anything from the weather to your favorite TV show.
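
As promised above, here is a rough sketch of the first recipe's flow: tweets in, LED color out. The fetchTweets, scoreSentiment, and setLedColor functions are hypothetical stand-ins for the Twitter API call, the Watson Tone Analyzer call, and the LED code in the actual recipe – only the overall flow is meant to match.

    // Hypothetical sketch of the sentiment recipe's flow:
    // fetch tweets about a topic, score their overall sentiment,
    // and map that score to an LED color.

    function fetchTweets(topic) {
      // Placeholder: the real recipe pulls recent tweets via the Twitter API.
      console.log('Fetching tweets about ' + topic);
      return Promise.resolve([
        'I love #Emmys night!',
        'That acceptance speech was amazing',
        'Not a fan of the host this year'
      ]);
    }

    function scoreSentiment(tweets) {
      // Placeholder: the real recipe sends the text to Watson Tone Analyzer
      // and derives an overall sentiment score from its response.
      return Promise.resolve(0.7); // pretend 0 = negative, 1 = positive
    }

    function sentimentToColor(score) {
      if (score > 0.6) return 'green';   // mostly positive
      if (score < 0.4) return 'red';     // mostly negative
      return 'yellow';                   // mixed
    }

    function setLedColor(color) {
      // Placeholder: the real recipe drives the RGB LED on TJ's head.
      console.log('LED set to ' + color);
    }

    fetchTweets('#Emmys')
      .then(scoreSentiment)
      .then(function (score) {
        setLedColor(sentimentToColor(score));
      });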

The above recipes are written in Node.js (JavaScript) and are designed to help you get started. In reality, you can do pretty much anything. I encourage you to put together your own bot and start experimenting with use cases that take advantage of all the sensors within TJ! Want to contribute a recipe? Upload your code to GitHub, add its link to the featured TJBot page, and submit a pull request.

Reaction so far

In two words – overwhelmingly positive. Most people I spoke with at the conference were particularly engaged by TJ’s humanoid design features, and words like “cute”, “funny”, and “adorable” were used copiously. Some Twitter reactions too:

and of course a few people who just wanted to have one:


Open Questions

What issues arise as individuals attempt to interact with cognitive objects using natural language and lengthy conversations? What factors drive or hinder sustained engagement? What design patterns improve this kind of human-machine interaction? These are open questions we can learn about from the community as more developers create apps for TJBot.

Leaving you with a picture of a laser-cut TJBot!


Now... isn’t he cute?

