
Elementum

Situation Room Remote Control

Turning a tablet into a remote control for walls of data

Role / Interaction Designer

Company / Elementum

Situation Room is a product that lets customers create large-scale video walls to display data from Elementum apps. I designed a tablet remote control app (both a native iOS version and a responsive web version) that lets our customers set up their own video wall and then reconfigure it as needed. Users can also interact with the screens on the wall via the remote control app, pausing a screen when they see something compelling, or fast-forwarding and rewinding.

 

THE CHALLENGE

  • The remote had to allow users to easily set up, configure, or edit a video wall. We needed to support full CRUD operations for an entire wall or for a single screen within a wall (see the data-model sketch after this list).

  • The remote also needed to function as a traditional remote control, allowing users to operate video walls once they were set up.

  • The remote design needed to be independent of the screen content designs. Content creation was an ongoing initiative, so the remote needed a flexible design, usable by all customers, that could accommodate any kind of content as it was dreamed up.
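To make that scope concrete, the sketch below shows roughly the kind of wall/screen model and CRUD surface the remote had to cover. The TypeScript names (VideoWall, WallScreen, WallApi) and fields are illustrative assumptions, not Elementum's actual data model or API.

```typescript
// Hypothetical wall/screen model -- illustrative only, not Elementum's real schema.
interface WallScreen {
  id: string;
  position: number;          // index of the physical display within the wall
  contentId: string | null;  // which content screen is assigned, if any
}

interface VideoWall {
  id: string;
  name: string;
  screens: WallScreen[];
}

// The remote had to cover full CRUD for whole walls and for individual screens.
interface WallApi {
  createWall(name: string, screenCount: number): Promise<VideoWall>;
  getWall(wallId: string): Promise<VideoWall>;
  updateWall(wallId: string, changes: Partial<Pick<VideoWall, 'name'>>): Promise<VideoWall>;
  deleteWall(wallId: string): Promise<void>;

  assignContent(wallId: string, screenId: string, contentId: string): Promise<WallScreen>;
  clearScreen(wallId: string, screenId: string): Promise<WallScreen>;
}
```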

THE CONSTRAINTS

  • The remote started as a responsive web app and was later developed as native Android and iOS apps, so every feature and interaction needed to be supported by, and feel natural on, all three platforms.

  • The app needed to launch with a very lightweight feature set but remain extensible, so we could add features such as fast-forward, pause, and zoom as we had bandwidth (see the command sketch after this list).
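One way to satisfy both constraints is a small discriminated union of remote commands shared by all three clients, where each new playback feature is simply a new variant. The sketch below assumes a JSON-over-HTTP command endpoint; the command names and the /walls/.../commands route are hypothetical, not the shipped protocol.

```typescript
// Hypothetical remote-command protocol -- a sketch, not the shipped implementation.
type RemoteCommand =
  | { kind: 'assignContent'; screenId: string; contentId: string }
  | { kind: 'clearScreen'; screenId: string }
  // Playback features can be added later as new variants:
  | { kind: 'pause'; screenId: string }
  | { kind: 'resume'; screenId: string }
  | { kind: 'seek'; screenId: string; offsetSeconds: number } // fast-forward / rewind
  | { kind: 'zoom'; screenId: string; level: number };

// Sending a command from any client (web, iOS, Android) reduces to posting JSON,
// so each platform keeps its own native UI on top of the same lightweight protocol.
async function sendCommand(wallId: string, cmd: RemoteCommand): Promise<void> {
  await fetch(`/walls/${wallId}/commands`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(cmd),
  });
}
```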

THE SOLUTION

  • I worked with two other interaction designers during a company hackathon to come up with the basic app concepts. We chose thumbnail images of content screens that the user could drag and drop into a preview pane to assign content to screens. This was intuitive for less technical users and allowed us to add more content screens as needed.

  • Post-hackathon user research indicated that users wanted to use the app as a remote control for screens, so I designed remote control features using similar drag-and-drop interactions.

  • Initial designs allowed users to assign content to a screen and delete content, but not to interact with the content once it was displayed. We used a "content carousel" to cycle through datasets on a single content screen until we could implement fast-forward, pause, and rewind (sketched after this list). This was not ideal, but it did let us display large amounts of data at the level of granularity users wanted.
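A rough sketch of how the web version might wire up the thumbnail drag-and-drop and the interim content carousel is shown below. The handler names, the assignContent call, and the rotation interval are assumptions for illustration, not the production code.

```typescript
// assignContent is assumed to exist elsewhere in the app (see the CRUD sketch above).
declare function assignContent(wallId: string, screenId: string, contentId: string): Promise<void>;

// Thumbnails carry the content id; dropping one on a preview pane assigns that
// content to the corresponding wall screen.
function onThumbnailDragStart(event: DragEvent, contentId: string): void {
  event.dataTransfer?.setData('text/plain', contentId);
}

function onPreviewPaneDrop(event: DragEvent, wallId: string, screenId: string): void {
  event.preventDefault();
  const contentId = event.dataTransfer?.getData('text/plain');
  if (contentId) {
    void assignContent(wallId, screenId, contentId);
  }
}

// Until pause/fast-forward/rewind existed, a screen simply cycled through its
// datasets on a timer -- the "content carousel". Returns a function that stops the rotation.
function startCarousel(datasetIds: string[], show: (id: string) => void, intervalMs = 30000): () => void {
  let index = 0;
  show(datasetIds[index]);
  const timer = setInterval(() => {
    index = (index + 1) % datasetIds.length;
    show(datasetIds[index]);
  }, intervalMs);
  return () => clearInterval(timer);
}
```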


 

Low Fidelity Wireframes & Prototyping

We started the hackathon with a sketching session where we converged on a basic idea. Our team then wireframed low-fidelity screens (see below) using sample images from our existing apps, both those in production and those in development. We then prototyped the basic app flow in InVision.

 

Concept Mapping

Our hackathon concept was well received enough that we got buy-in to move forward with the remote app project. I used concept maps to define all the additional flows needed to turn the app into a usable MVP.

 

Sketching Layout Concepts

After deciding on the required functionality and a basic user flow, I went through another round of sketching, focusing on different layouts and navigation patterns.

 

High Fidelity Mockups & Final Designs

Our final MVP designs are below. The first four (shown on the iPad) are the screens from our first customer release. After getting feedback from that customer, we continued to tweak the visual design, microcopy, font sizes, and so on.