Summit Recap

As always, Atlassian exceeded expectations for this year's Summit. Back at the San Jose Convention Center, space was not an issue this year. The venue was huge and included a DJ and custom screen printing areas. Even the food was to die for: fully catered breakfast and lunch! The Bash was just as amazing. Atlassian rented out San Pedro Square Market, which is a block of restaurants, bars, and entertainment. The Bash featured a DJ, plus a band for karaoke once the night got started. Addteq was a Gold sponsor this year and even had the honor of sponsoring the Welcome Reception! Even our booth looked incredible, sticking to the Continuous Everything theme that we created from our infographic!



Addteq team members started the week off right with a team-building sailing exercise around the San Francisco Bay! Definitely an awesome experience for all of us! Between the amazing conference floor and the after parties hosted by Addteq and Atlassian, Summit was filled with helpful information and awesome entertainment! There are even planned changes to the partner program. Stay tuned for more information on how Addteq will be certifying our employees!

I could go on for days, but this year's Summit was definitely one for the books. Luckily, Atlassian spares no detail and made sure to record every session! Be sure to check out the link below to get a deeper, inside look at the 2016 Atlassian Summit.

Watch all of the sessions here.

Rock, Paper, Scissors with Amazon Echo

With the 2016 Atlassian Summit quickly approaching and the huge success of the Dartboard with HipChat Integration (Atlassian Summit 2015), Addteq was looking for another show-stopper to wow the crowds at our conference booth. We had also been invited to talk about our Amazon Alexa and Confluence Integration at the Atlassian ShipIt Live, so naturally we had to include Alexa in our booth presentation. But what could we possibly do? What would allow us to integrate Alexa and an Atlassian product, and also interact with each individual at the conference? Well, Rock Paper Scissors, of course!!

Now that we know what we want to play, let's talk about the setup. It starts with a custom HipChat plugin that handles all of the players' interactions, allowing each user to add their name, email, and Twitter handle to a player queue. The HipChat room also provides a scoreboard showing the highest-scoring players. Next in line is the Amazon Echo: Alexa lets players actually play the game, with AWS Lambda functions and a Firebase database handling the game states behind the scenes. Finally, we have a web interface that receives all of the events from Firebase and shows users a fun, animated interface built with Phaser JS, which provides feedback on the current game state.
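To make the player queue concrete, here is a minimal sketch of the queue logic the plugin maintains. The function and field names are ours for illustration, not the plugin's actual API:

```javascript
// Minimal sketch of the player queue stored in Firebase's userQueue
// collection. Names here are illustrative, not the plugin's real API.
const userQueue = [];

// Called when a player submits the sign-up form in HipChat.
function addPlayer(name, email, twitter) {
  userQueue.push({ name, email, twitter, joinedAt: Date.now() });
}

// Called by the Lambda function when Alexa starts a new game:
// removes and returns the player who has been waiting longest.
function nextPlayer() {
  return userQueue.shift(); // undefined when the queue is empty
}
```

In the real setup these reads and writes go through Firebase rather than an in-memory array, so the HipChat plugin, the Lambda function, and the web interface all see the same queue.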

Atlassian HipChat

We created the HipChat plugin using the new Atlassian HipChat Connect framework. The Connect framework lets you build add-ons that extend HipChat apps by providing lots of design and functional options. We used the HipChat modal dialog to create a form where users can sign up for the game. On submitting the form, the plugin stores the information in the Firebase userQueue collection. The plugin also has options to see the list of users in the queue and the top scores in the game, using the glance and sidebar features. The glance for the user queue shows the next user in the queue, and clicking on the glance opens a sidebar showing the entire queue. Similarly, the Top Score glance shows the highest scorer, and clicking it opens a sidebar with the top 8 users by score. There are also click actions to show a user's past game history and an option to remove users from the queue. There's plenty of room for more features as the game evolves.
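For a rough idea of how these pieces are declared, a HipChat Connect add-on describes its dialogs, glances, and sidebars in a capabilities descriptor. The sketch below shows the general shape; every key, name, and URL is made up for illustration, so check the HipChat Connect documentation for the full schema:

```javascript
// Illustrative HipChat Connect descriptor for a plugin like ours.
// All keys, names, and URLs are hypothetical placeholders.
const descriptor = {
  name: 'Rock Paper Scissors',
  key: 'com.example.rps', // hypothetical add-on key
  links: { self: 'https://example.com/descriptor' },
  capabilities: {
    hipchatApiConsumer: { scopes: ['send_notification', 'view_room'] },
    // A modal dialog hosting the sign-up form
    dialog: [{
      key: 'rps.signup',
      title: { value: 'Join the game' },
      url: 'https://example.com/signup'
    }],
    // A glance surfaces the next player in the room's right panel
    glance: [{
      key: 'rps.queue',
      name: { value: 'Next player' },
      queryUrl: 'https://example.com/glance/queue',
      icon: {
        url: 'https://example.com/icon.png',
        'url@2x': 'https://example.com/icon@2x.png'
      },
      target: 'rps.queue.sidebar'
    }],
    // Clicking the glance opens this sidebar with the full queue
    webPanel: [{
      key: 'rps.queue.sidebar',
      name: { value: 'Player queue' },
      url: 'https://example.com/sidebar/queue',
      location: 'hipchat.sidebar.right'
    }]
  }
};
```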

Amazon Echo/Alexa, AWS Lambda & Firebase

Like most Alexa skills, there is an invocation name that becomes the trigger, or "wake word," for Alexa to start the skill. Ours is called Brobot. Any information provided after the wake word becomes the input for the backend Lambda function, which executes and returns the appropriate response. Most Alexa skills are single-exchange conversations: there is a request from the user, and Alexa responds with the output. The Alexa Skills Kit provides an option to extend inputs and responses into multi-turn conversations using the Session parameter, which stores the context of the previous exchange.
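To show how session attributes carry context between turns, here is a stripped-down sketch of an Alexa request handler. The intent, slot, and attribute names are our own placeholders, not the skill's actual code:

```javascript
// Sketch of a multi-turn Alexa handler. Intent/slot/attribute names are
// illustrative. The sessionAttributes we return come back to us on the
// user's next utterance, which is how context survives between turns.
function handleRequest(event) {
  const session = (event.session && event.session.attributes) || {};

  if (event.request.type === 'LaunchRequest' || !session.started) {
    // First turn: start a round and keep the session open.
    return buildResponse('Okay, rock, paper, scissors... shoot!',
                         { started: true }, /* endSession */ false);
  }

  // Later turns: a slot carries the player's spoken choice.
  const choice = event.request.intent.slots.choice.value;
  if (!['rock', 'paper', 'scissors'].includes(choice)) {
    // Re-prompt and keep the session (and its context) alive.
    return buildResponse('Please choose rock, paper, or scissors.',
                         session, false);
  }
  return buildResponse('You played ' + choice + '.', session, true);
}

function buildResponse(text, attributes, endSession) {
  return {
    version: '1.0',
    sessionAttributes: attributes,
    response: {
      outputSpeech: { type: 'PlainText', text },
      shouldEndSession: endSession
    }
  };
}
```

Setting `shouldEndSession: false` is what keeps the microphone open for the player's answer instead of ending after a single exchange.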

The conversation shown below is a game of Rock, Paper, Scissors. The user starts the game by saying "Alexa, ask Brobot to play rock paper scissors." This invokes our Lambda function, which pulls the next user from the queue in Firebase and returns a response to Alexa: "Hello <user name>, are you ready? Okay, rock, paper, scissors- shoot." After saying shoot, Alexa waits for a response from the user: rock, paper, or scissors. If the user responds with anything else, Alexa replies with "Please choose a valid option between rock, paper and scissors." If the user's choice is valid, the Lambda function uses an algorithm that makes a choice based on the history of choices made by other users in the past. This makes the game more interesting and harder to win than simply picking a random choice among the three. The Lambda function compares the user's choice with the choice returned by the algorithm, then declares the winner by having Alexa announce it.
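The round-scoring step the Lambda function performs boils down to the classic comparison (the function name here is ours, for illustration):

```javascript
// Sketch of the round-scoring step: rock beats scissors,
// scissors beats paper, paper beats rock.
const BEATS = { rock: 'scissors', scissors: 'paper', paper: 'rock' };

// Returns 'user', 'brobot', or 'tie' for one round.
function decideRound(userChoice, brobotChoice) {
  if (userChoice === brobotChoice) return 'tie';
  return BEATS[userChoice] === brobotChoice ? 'user' : 'brobot';
}
```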

Firebase was used as the centralized database for the entire game. This included the following objects:

  • userQueue object- the list of users waiting to play the game
  • currentUser object- tracks the user currently playing, as well as each round's results
  • gameHistory object- records the results of every game played; this feeds the machine learning algorithm, making the game harder the more people play it
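The post doesn't publish the actual algorithm, but a simple history-driven strategy along the lines described (counter the move players have favored in gameHistory) could look like this sketch, which is our approximation rather than the real Lambda code:

```javascript
// Illustrative history-based strategy: find the move players have thrown
// most often in gameHistory and play the move that beats it. This is our
// approximation; the actual Lambda algorithm isn't published.
const COUNTER = { rock: 'paper', paper: 'scissors', scissors: 'rock' };

function chooseMove(gameHistory) {
  const counts = { rock: 0, paper: 0, scissors: 0 };
  for (const round of gameHistory) counts[round.userChoice] += 1;

  // Most frequent historical choice (ties broken by key order).
  const favorite = Object.keys(counts)
    .reduce((a, b) => (counts[b] > counts[a] ? b : a));

  return COUNTER[favorite]; // play the move that beats the crowd favorite
}
```

Because every finished round is appended to gameHistory, the favorite shifts as more people play, which is what makes Brobot harder to beat over time.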

Enter Brobot & Phaser JS

The final piece of the puzzle was giving the audience a nice visual to accompany the gameplay. The interface we decided on was an animated robot called Brobot. Spriter and TexturePacker were used to create the animations and manage the sprite sheets, exporting them as a texture atlas for use in Phaser JS. Phaser JS is a complete HTML5 game framework and was used in tandem with Firebase to show different animations depending on the game state.


Using the currentUser Firebase object as a reference, along with Firebase's built-in events feature (to watch for changes within the currentUser object), the web interface could access the current game state at all times. It could determine when to be in an idle state (the currentUser object is empty) and when to be in the play state, and it could determine the result of each round, since this was continuously passed to Firebase from the Lambda function. Once a Firebase event captured the state of the currentUser object, it was passed into the Phaser game framework as game variables. These variables controlled Brobot's movements and updated all of the values on the page, such as the current game's results, the name and Twitter info of the current user, and the user's final score when the game was over.
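The mapping from the currentUser object to the interface's state can be sketched as below. In the real app this function would run inside a Firebase value-change listener; the field names are illustrative:

```javascript
// Sketch of the state mapping the web interface performs whenever
// Firebase reports a change to currentUser. Field names are illustrative.
function gameStateFrom(currentUser) {
  if (!currentUser || !currentUser.name) {
    return { state: 'idle' }; // nobody playing: Brobot idles
  }
  return {
    state: 'play',
    player: currentUser.name,
    twitter: currentUser.twitter,
    // Last round's result drives the win/lose/tie animation.
    lastResult: currentUser.lastResult || null,
    score: currentUser.score || 0
  };
}
```

The resulting object is what gets handed to Phaser as game variables, driving Brobot's animations and the on-page scoreboard.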

Want to Use Our Integration for Your Own Project?

ShipIt Live 

This integration was able to get the attention of Atlassian! So much so that we were invited to speak at the coveted ShipIt Live presentation at Summit, where teams work together to hack on a project. Similar to the process above, we connected Atlassian Confluence and Amazon Echo in order to access Confluence's knowledge base with voice alone. You can read more about our integration and the background information on the project here.

Watch the full ShipIt Live presentation below!