First Person Driving with a Wheel

I’ve hacked together a first-person driving experience that uses a natural steering wheel to control an RC car. The experience is a lot like playing a kart racing game IRL. It’s probably easier to understand if you see it in motion. Check it out:

[Animation: AlexDrive.gif]

I got the idea for this project while watching my three-year-old son play with his radio-controlled toy cars. They all use the conventional two-joystick layout: the left stick is isolated to vertical movement to control the motor, and the right stick is isolated to horizontal movement to control the turning direction of the front wheels.

As a toddler, my son tends to only meaningfully control one stick at a time. It’s probably too sophisticated for him to infer that steering the car along a diagonal means pressing the left stick forward while feathering the right stick. What’s interesting, though, is that he has no such difficulty driving his big power wheel. Same operations, but a different interface: it has a natural steering wheel, a gas pedal, and a shifter to control direction. It got me thinking: what if RC cars could be controlled by natural steering wheels too?

I quickly realized that having a more natural controller isn’t enough. Without locking my son’s orientation to the car’s, I would just be recreating the struggle I had with the arcade game Super Sprint, where I could never tell whether turning the wheel left would move my car screen-left, screen-right, up, or down. Luckily, with the rise of drone racing, I could simply mount an FPV camera on the car and either place the monitor as if it were a windshield or wear it directly on my face.


The combination of a natural steering wheel controller and visual feedback directly from the car itself makes for a really compelling experience. The perspective is really novel and the scale magnifies the intensity of speed. It can feel like travelling at 300 mph, but at the same time, everyone who has picked it up can deftly control the car within minutes. It’s a lot more approachable than a drone. Who knows, with autonomous cars on the horizon, maybe this is how the joy of driving is preserved?

I had a lot of fun making, breaking, and remaking this rig. I learned a lot about electronics in the process and it’s been really fun sharing the experience with friends, and now with you!

Hacking an RC Car

As a test, I decided to hack an existing piece of consumer electronics and control it with an Arduino. I took apart my son’s $10 toy RC car and touched a wire from the battery’s positive terminal to random points on the circuit board until I isolated which circuits control the steering and the motor. Once I located them, I soldered wires to those points and connected them to pins on an Arduino.



My Arduino happens to be connected to a PS2 shield, so I made some slight modifications to my PS2 controller sketch to digitalWrite the appropriate pins to HIGH. Here’s the sketch:

#include "Shield_PS2.h"

PS2 ps2=PS2();
unsigned long time;
int fwPin = 8; // soldered to the car's forward-drive circuit
int bwPin = 9; // soldered to the car's backward-drive circuit

void setup() {
  ps2.init(9600, 2, 3);
  pinMode(fwPin, OUTPUT);
  pinMode(bwPin, OUTPUT);
  
  Serial.begin(9600);
  Serial.println("PS2 Shield initialized.");
}

void loop() { 
  // Button Status: 1 = not pressed, 0 = pressed
  // SQUARE
  if (ps2.getval(p_square) == 0) {
    printTime();
    Serial.println("SQUARE");
    digitalWrite(bwPin, HIGH);
  }
  else {
    digitalWrite(bwPin, LOW);
  }

  // CROSS
  if (ps2.getval(p_cross) == 0) {
    printTime();
    Serial.println("CROSS");
    digitalWrite(fwPin, HIGH);
  }
  else {
    digitalWrite(fwPin, LOW);
  }
}

void printTime () {
  time = millis();
  Serial.print("(");
  Serial.print(time);
  Serial.print(") ");
}

Arduino + Dualshock2

I’m working on a new project that could use some kind of game controller. I’ve got a few old PS2 controllers lying around, so I got really excited when I stumbled on Bill Porter’s PS2X library. It looked pretty straightforward, and I was itching for excuses to practice soldering, so I took apart an old PS2 adapter for its female connector and wired it up to an Arduino Uno.

[Image: IMG_0822.JPG]

For reasons still unknown to me, I just couldn’t get the library to recognize a connected controller. I was powering the device just fine but data wasn’t making it to the Arduino or it wasn’t being interpreted correctly. (If any of you got it to work, I’d love to know how!)
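
For anyone retracing my steps, here’s roughly the hookup sketch I was attempting, following the PS2X library’s documented pattern. The pin assignments are just the ones I happened to try; treat this as a sketch of the approach that, for me at least, never got past the error branch:

#include <PS2X_lib.h>

PS2X ps2x;
int error = 0;

void setup() {
  Serial.begin(57600);

  // config_gamepad(clock, command, attention, data, pressures, rumble)
  // returns 0 once a controller is found and configured
  error = ps2x.config_gamepad(13, 11, 10, 12, false, false);

  if (error == 0) {
    Serial.println("Controller found and configured.");
  } else {
    // the state I could never get past
    Serial.print("No controller found, error code: ");
    Serial.println(error);
  }
}

void loop() {
  if (error != 0) return;

  ps2x.read_gamepad(false, 0); // poll the controller, no vibration
  if (ps2x.Button(PSB_SQUARE)) {
    Serial.println("SQUARE pressed");
  }
  delay(50);
}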

Anyhow, long story short: I ordered a $19 PS2 Shield and you know what? It was worth every penny. It’s really well documented and even supports hot-swapping controllers. I highly recommend it. Here it is in action with my debug sketch:

[Image: IMG_0826.JPG]
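
For reference, a minimal version of that debug sketch, using the same Shield_PS2.h calls as the driving sketch above, might look like this (showing only the two buttons that sketch cares about):

#include "Shield_PS2.h"

PS2 ps2 = PS2();

void setup() {
  ps2.init(9600, 2, 3); // same baud rate and RX/TX pins as the driving sketch
  Serial.begin(9600);
  Serial.println("PS2 Shield debug sketch ready.");
}

void loop() {
  // getval returns 1 when a button is not pressed, 0 when pressed
  Serial.print("SQUARE: ");
  Serial.print(ps2.getval(p_square));
  Serial.print("  CROSS: ");
  Serial.println(ps2.getval(p_cross));
  delay(250);
}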

A VR Cycling Experience for $40

I’ve made an Arduino thing that can wirelessly talk to a mobile device over BLE and can now meter the revolutions of a wheel with an optical tachometer. I’m using these two hardware features to make a virtual reality (VR) cycling experience and I’ve got a working demo to share! Here’s how this works:

First, the Arduino thing is positioned to point at the back tire of a stationary bike. (I’m using a mountain bike on an indoor trainer, but the beauty of this non-invasive approach is that you could use it with treadmills, ellipticals, rowing machines, or anything else that has a looping/revolving surface.)

[Image: IMG_5155.JPG]

A strip of paper is taped to the tire. Each time the wheel makes a complete rotation, the Arduino detects the strip passing by and sends a wireless message to the mobile phone.

The mobile phone is placed in a viewer that is strapped to my face. (I used a $10 headset that more comfortably fits the iPhone 6S Plus, but any ol’ viewer will do.)

[Image: Screen Shot 2016-01-23 at 2.30.54 AM.png]

In software, a virtual bike is created to travel through a virtual environment. The virtual bike only nudges forward when the app receives a message from the Arduino reporting that the physical bike’s wheel has made a complete rotation. We are effectively mapping the physical action of pedaling to movement in the virtual space, turning the bike into an oversized game controller.

[Image: Screen Shot 2016-01-23 at 1.23.26 AM.png]

The virtual environment is constructed in Unity. To render for virtual reality, I’m using Google Cardboard, a free software SDK for mobile VR. The Cardboard SDK for Unity is drop-dead simple: just drag and drop a prefab into your scene and you’ll instantly have a stereo camera rig to manipulate.

[Image: Screen Shot 2016-01-23 at 1.34.51 AM.png]

This camera rig is set to move along a spline path whenever the app receives a BLE ping from the Arduino. That’s pretty much all it takes! So there you have it, a VR cycling experience for a whopping total of $40:

  • Arduino thing? $30.
  • Mobile VR headset? $10.
  • Software? Priceless.

Yes it’s DIY, but to put things in perspective, take a look at the current offerings on the market today:

  • Peloton: $2,000 bike with a built-in touchscreen monitor. No VR. Pedaling does not affect the video screensaver 😦
  • Virzoom: $300 bike ($600 Oculus + ~$2,000 PC not included)
  • Ciclotte: $10,700 bike?!

[Image: Screen Shot 2016-01-24 at 1.25.26 AM.png]

This was a fun little personal project and I’ve learned what I set out to learn, so this is where I move on. But if you’re going to try something similar, here are some things I’d consider in retrospect: it wouldn’t be too much of a stretch to have the physical bike steer the direction of the virtual bike, so why not give the player total freedom to explore? Also, does the virtual bike have to be a bike? I’ll just leave this right here: http://goo.gl/fRWY6Z.

For the interested Makers out there, below is a schematic of the IR sensor along with the Arduino code. Tinker away!

[Schematic: IRsensor_bb.png]

…and here’s my Arduino sketch:

#include <SPI.h>
#include "Adafruit_BLE_UART.h"

// nRF8001 pins: SCK:13, MISO:12, MOSI:11, REQ:10, ACI:X, RST:9, 3Vo:X
#define ADAFRUITBLE_REQ 10
#define ADAFRUITBLE_RST 9
#define ADAFRUITBLE_RDY 2
Adafruit_BLE_UART uart = Adafruit_BLE_UART(ADAFRUITBLE_REQ, ADAFRUITBLE_RDY, ADAFRUITBLE_RST);
boolean connection = false;

int irPin = 7;        // powers the IR emitter LED
int irSensorPin = 5;  // reads the phototransistor (LOW = reflection detected)
int testLEDPin = 4;   // gives visual feedback on detection
unsigned long tripTime = 0;   // timestamp of the current detection
unsigned long lastTrip = 0;   // timestamp of the previous detection
unsigned long tripBetween;    // ms between detections = ms per revolution
boolean detectState = false;
boolean lastDetectState = false;

void setup(void)
{
  Serial.begin(9600);

  pinMode(irPin, OUTPUT);
  pinMode(irSensorPin, INPUT);
  pinMode(testLEDPin, OUTPUT);

  uart.setDeviceName("YanBLE"); /* define BLE name: 7 characters max! */

  uart.setRXcallback(rxCallback);
  uart.setACIcallback(aciCallback);
  uart.begin();
}

void loop()
{
  pollIR(); // IR sensor
  uart.pollACI(); // BLE
}

void pollIR() {
  digitalWrite(irPin, HIGH);

  if (digitalRead(irSensorPin) == LOW) {
    detectState = true;
    if (detectState != lastDetectState) {
      // run the first time reflection is detected
      tripTime = millis();
      tripBetween = tripTime - lastTrip; // ms for one full wheel revolution
      lastTrip = tripTime;
      Serial.print("ms per revolution: ");
      Serial.println(tripBetween);

      Serial.println("message sent via BLE");
      if (connection == true) {
        sendBlueMessage("1"); // dummy data; any value works, we just need to ping the app
      }

      lastDetectState = true;
    }
    else {
      // here we are seeing the same reflection over several frames
      // turn test LED on to give visual indication of a positive reflection
      digitalWrite(testLEDPin, HIGH);
    }
  }
  else {
    detectState = false;
    lastDetectState = false;
    digitalWrite(testLEDPin, LOW);
  }
}

/**************************************************************************/
/*!
  BLE-related functions below this point
*/
/**************************************************************************/
void aciCallback(aci_evt_opcode_t event)
{
  // this function is called whenever select ACI events happen
  switch (event)
  {
    case ACI_EVT_DEVICE_STARTED:
      Serial.println(F("Advertising started"));
      break;
    case ACI_EVT_CONNECTED:
      Serial.println(F("Connected!"));
      connection = true;
      break;
    case ACI_EVT_DISCONNECTED:
      Serial.println(F("Disconnected"));
      connection = false;
      break;
    default:
      break;
  }
}

void rxCallback(uint8_t *buffer, uint8_t len)
{
  // this function is called whenever data arrives on the RX channel
}

void sendBlueMessage(String message) {

  uint8_t sendbuffer[20];
  message.getBytes(sendbuffer, 20);
  uint8_t sendbuffersize = min(20, message.length());

  Serial.print(F("\n* Sending -> \"")); Serial.print((char *)sendbuffer); Serial.println("\"");

  // write the data
  uart.write(sendbuffer, sendbuffersize);

}

Measuring Speed with Infrared Light

Since the last progress check, I managed to get an Arduino to talk wirelessly with a custom iOS app via Bluetooth. Outside of the technical feat, it wasn’t a particularly interesting experience, so I’ve been spending a lot of time rummaging through electronics shops looking for a fuller, funner destination for this project. Today’s update is a pit-stop towards one such destination, which will require the ability to measure the speed of a bike, or more specifically, the time it takes a wheel to make a complete revolution. I added the requirement that whatever measurement thingy I build could easily be added to any kind of bike (ie mountain bikes, touring bikes, cruiser bikes, cycle trainers, horizontal trainers, tricycles, big wheels, etc) in a non-invasive way.

With that in mind, I started looking at optical ways of measuring speed. The general idea is this: shine an invisible infrared light at a rotating object and count the beam breaks with a special IR sensor. A common use is measuring a CPU fan’s RPM by placing an IR light on one side of the fan and an IR sensor on the other, then counting how often the beam is broken (divided by the number of fan blades). With something as large as a bicycle wheel, it’s impractical to mount a light and a sensor on opposite sides, so my plan was to measure reflection instead of a break-beam. In theory, I could keep both light and sensor on the same side so long as I could get a consistent reflection from a registration point on the wheel once per revolution.
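
To make the math concrete: once the registration point can be detected, the revolution time is just the difference between successive detection timestamps. Here’s a minimal sketch of that calculation, with the detection itself and serial setup omitted and a hypothetical onReflection() hook standing in for the sensor logic:

// One detected reflection = one full wheel revolution.
unsigned long lastTrip = 0;

void onReflection() {
  unsigned long now = millis();
  if (lastTrip > 0) {
    unsigned long msPerRev = now - lastTrip; // time for one revolution
    float rpm = 60000.0 / msPerRev;          // convert to revolutions per minute
    Serial.print("ms/rev: ");
    Serial.print(msPerRev);
    Serial.print("  RPM: ");
    Serial.println(rpm);
  }
  lastTrip = now;
}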

I started building an optical tachometer from scratch, but midway through I came across this handy little $2 reflective IR sensor on Adafruit. It’s an LED and a phototransistor built into a small plastic case that aims them both at a shared focal point.

[Image: reflectiveIRSensor.jpg]

I found that the sensor was not terribly accurate on its own and gave different readings in different lighting environments. Sometimes it had the desired effect of only picking up reflections from white/shiny surfaces, but other times I would get false positives from my hand or even matte black surfaces (which would be a problem for a bike tire). To fix this, I added a potentiometer to calibrate the sensor’s sensitivity to the environment. I also added an LED to give visual feedback whenever the sensor detected an IR reflection. These two additions worked out really well!
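
My circuit does this tuning in hardware (see the schematic at the end of this post), but the same calibration idea expressed in software would read the phototransistor and the pot on analog pins and compare them. The pin assignments below are assumptions for illustration:

// Hypothetical software take on the calibration: the pot sets the threshold.
int sensorPin = A0;  // phototransistor output (assumed wiring)
int potPin = A1;     // calibration potentiometer (assumed wiring)
int ledPin = 4;      // feedback LED

void setup() {
  pinMode(ledPin, OUTPUT);
}

void loop() {
  int reading = analogRead(sensorPin); // 0-1023; shifts with reflected IR
  int threshold = analogRead(potPin);  // turn the pot to tune per environment
  // light the LED when the reading crosses the pot-set threshold
  // (whether that means above or below depends on your divider wiring)
  digitalWrite(ledPin, reading > threshold ? HIGH : LOW);
}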

[Image: IMG_4974]

Next, I needed a way to position the tachometer (and get it out of my messy breadboard situation), so I arranged all the components on a small perf board. For my first time doing anything off the breadboard, and with no prior electronics experience, I think it turned out pretty well! Honestly, I was just pleasantly surprised it still worked.


[Image: opticalTachometer]

For the connection to the Arduino Uno, instead of using conventional 22 gauge wire, I naively bought and used 18 gauge, which is thicker and harder to plug into the standard pin holes. It turns out the thickness was a good thing: the wire is strong enough to hold whatever shape you bend it into, letting me aim the sensor in precise orientations like a gooseneck lamp. So I ran with it and wrapped it in heat shrink. Here it is with the Arduino board and BLE breakout as a consolidated thingy:

[Image: opticalTachometer2]

If I were to iterate on the hardware, it’s very possible this could be a quarter of its size or less. There seem to be quite a few mini Arduino boards with built-in BLE now, and with more practice I could definitely pack the components much more tightly. It’s really tempting to try because I think it could be a really fun, focused challenge, but I’ve got to move along.

Next update: Software

I Finished a Drawing For Once!

[Image: firemanjd_1000]

I’m constantly scribbling loose, ugly things on notepads and whiteboards to communicate ideas, but it’s been a long, long time since I last actually finished a proper one. I recently picked up a shiny new Surface Pro 4, so I thought I’d end that sad streak with something fun.

For the drawing above, I tried an approach I picked up from Marc Brunet’s Drawing & Coloring Techniques tutorial. I learned a lot about 2D rendering from a compositing approach, and I thought I’d share the process I used here.

Sketching

I decided pretty early on that I wanted to do a caricature of my son. I took this photo a while back. It’s not the clearest photo in the world, and I caught him in a blurry mid-expression, but every time I see this picture I can hear him noisily rolling around the house, so I wanted to try and capture some of that. On a technical level, I like that there’s a good balance of organic and hard surfaces to get my practice on.

[Image: firemanjd_reference]

Normally I would go straight into construction with lines, but this time I started by blocking out the rough volume. On a separate layer I did a light sketch to try to find the lines. My natural inclination is to get more and more refined at this early stage, which inevitably trips me up later when I’m trying to trace over my fat, messy lines. This time, I tried to restrict myself to construction. When I was ready to move on, I set this layer to a very low opacity and started doing the more final, clean lines. I went through several different facial poses and stylistic caricatures and finally settled on a stern expression he often sports while zipping around on his firetruck.

[Image: firemanjd_sketching]

Form

With clean lines established, I started building out the render layers below them, beginning with the ambient occlusion pass. Normally I’d try to build out dark and light values from somewhere in the middle, but this time I tried to think about the piece in distinct render passes. The ambient occlusion pass defines the form with a nonspecific light source. In general, the goal is to paint in darkness wherever it’s difficult for light photons to enter (ie where two surfaces meet to form creases). Instead of trying to manage all the lights in the scene at once, this step focuses specifically on adding dimension and form to the 2D line drawing. (For more information, here’s a great tutorial on painting AO.)

On separate layers, I also filled in flat colors for each of the major surfaces (hat, face, jacket, gloves, pants, boots, firetruck red, firetruck chrome, tires). This is the diffuse/albedo map. These layers also end up functioning as a quick selection set for later polish. When the two passes are multiplied against each other, you get a rendering that has depth and is ready to be lit.
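
(If you’re curious what multiply actually computes: per channel, result = base × blend ÷ 255, so the white areas of the AO pass leave the colors untouched while the darker areas darken them proportionally.)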

[Image: firemanjd_form]

Lighting

I’m not great at simulating physical light or holding a mental model of the 3D scene, so I decided to keep things as simple as possible by limiting the scene to just two main sources of light. The first is a blue atmospheric light coming generally from above to simulate the sky: strongest on his hat, weakest as we progress down to the truck. On a layer set to the blending mode “screen”, I used a very large, very soft brush at low opacity and gently wiped in gradients over any surface that appeared to be pointing up.

Next, I painted in a soft yellow key light running from the upper left corner down to the bottom right corner of the canvas. This layer was also set to “screen”. I erased away any areas that were blocked by other elements (ie the screen-right arm blocking light received by the screen-right leg). Normally I would actively add shadow to the scene, but this process has you subtract it from the separate light passes instead.

After that, I made another layer to paint in bounce light/global illumination for color bleed, like the red hat bouncing red photons onto the nearest areas of the forehead, or yellow spilling onto the nearby clothing and truck. The differences are subtle, but with each pass they really add up and the piece feels more and more rendered.

Finally, I created a separate layer set to “normal” to manually paint in any specular hot spots that I couldn’t achieve with the previous blend modes.

[Image: firemanjd_lighting]

Polish

Lastly, I made some detail adjustments to things like the hair and eyes, pumped up the saturation on the diffuse/albedo layer, and roughed in a vague bokeh backdrop. All in all, it took me about 5 hours over a few days. I really enjoyed this way of building up a painting and I think I’ll stick with it for future drawings. I hope you found the explanation helpful!

[Image: FiremanJD_process]

Arduino and Unity: Talking!

In the last update, I used a free app called Bluefruit LE Connect to facilitate the wireless communication between an iPhone and an Arduino over BLE. It’s a great little app for testing, and you could conceivably use it in your project if you don’t require a custom interface, but if you need more control over the software side of things, you’re going to want to write your own app. I’m ultimately curious about having a game talk to custom hardware, so I made my next goal to develop an app in Unity that talks to the Arduino through BLE.

Unity doesn’t have any native, built-in methods to handle BLE connections, but fortunately, that’s where the Unity Asset Store comes in. I found a great package called Bluetooth LE for iOS and Android. It’s lightweight, with clear examples and well-written documentation, but it’s the developer support alone that makes the $10 a complete steal. Here’s why: testing BLE is cumbersome. It’s not something you can simulate in the Unity IDE, so you end up creating a build each time you want to try something out on the mobile device. And each time Unity compiles to Xcode, it recompiles everything, so you sit through ~5-minute builds while fumbling around for the one line of code that might be wrong. I was banging my head against the wall several times, but Tony patiently looked at my C# scripts and Xcode logs and helped clue me in on more than one occasion. Do check it out!

[Image: UnityBLE_ConnectedScreen]
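
In case it helps anyone get started, the Arduino side of my tests is just the same nRF8001 setup from my other posts. A minimal echo sketch along these lines gives the Unity app something to talk to:

#include <SPI.h>
#include "Adafruit_BLE_UART.h"

// same pin mapping as my other nRF8001 sketches
#define ADAFRUITBLE_REQ 10
#define ADAFRUITBLE_RST 9
#define ADAFRUITBLE_RDY 2
Adafruit_BLE_UART uart = Adafruit_BLE_UART(ADAFRUITBLE_REQ, ADAFRUITBLE_RDY, ADAFRUITBLE_RST);

void rxCallback(uint8_t *buffer, uint8_t len)
{
  // echo whatever the app sent straight back so Unity can verify the link
  uart.write(buffer, len);
}

void setup()
{
  Serial.begin(9600);
  uart.setRXcallback(rxCallback);
  uart.begin();
}

void loop()
{
  uart.pollACI(); // service BLE events
}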

Fumbling Around with Arduino and BLE

I’ve been messing around with the little microcontroller for a few weeks now, on and off, and as cool as buttons, switches, and knobs are, it’s easy to imagine how convenient a wireless interface for the Arduino would be. With all the wearables and peripherals using BLE (Bluetooth Low Energy) these days, I ordered a breakout without much of a game plan other than to fumble my way towards connecting my iPhone to the Arduino.

I picked up the Adafruit nRF8001 Bluefruit LE breakout and followed their super clear, n00b-friendly walkthrough of how to hook everything up to an Arduino UNO. You can see my ugly first-time solder job below. Luckily it still works, but I definitely charred the board. (I really should have bought some junk components to practice on first!)

[Image: Screen Shot 2015-11-27 at 11.53.02 PM]

Once I had everything connected, I downloaded Adafruit’s free Bluefruit LE iOS app (with an Apple Watch counterpart – w00t!) to connect my iPhone to the Adafruit BLE breakout/Arduino UNO. The app has three main modes of interaction:

  • UART is like a chatbox that allows you to send strings between the iOS device and the Arduino’s Serial Monitor.
  • Pin I/O provides an interface for you to control and monitor live changes to the Arduino pins.
  • Controller exposes sensor data (ie accelerometer) and provides two touch-based interfaces.

The Controller mode was particularly interesting to me because one of the two touch-based interfaces is a color picker. With that in mind, I decided that for this first test, I would try to keep things simple and control an RGB LED with the already-provided color picker.

Using the example sketch echoDemo, I was able to successfully advertise and connect with the iPhone, but then I hit a little speed bump.

[Image: RGB_byteReadOut.png]

The app was written so that data submitted from the color picker is formatted with the prefix “!C”. The printout spits out each byte cast as a character, so I thought: “Nice! All I need to do is parse the obscure sequence ?k^ and interpret those characters into individual RGB values to send to the LED!” Well, it turns out that was a flawed and needlessly convoluted approach: all I had to do was cast the raw bytes directly to integers. That took me a lot longer to realize than I’d like to admit, but hey, lesson learned. Also, it turns out someone had already written a sketch specifically for parsing messages sent from Bluefruit LE. (Strangely, you won’t find it linked amongst all of Adafruit’s great documentation. It’s super useful if you plan on using Bluefruit LE as your primary interface.)
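
Concretely, the fix boils down to a few lines in the RX handler. The packet layout here (‘!’ and ‘C’ followed by one byte each for red, green, and blue) matches the printout above, and the tween duration passed to setColor is just an example value:

void rxCallback(uint8_t *buffer, uint8_t len)
{
  // color picker packets arrive as: '!' 'C' <red> <green> <blue> [checksum]
  if (len >= 5 && buffer[0] == '!' && buffer[1] == 'C') {
    int red   = buffer[2]; // cast the raw bytes directly to integers
    int green = buffer[3];
    int blue  = buffer[4];
    setColor(red, green, blue, 5); // setColor is the snippet at the end of this page
  }
}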

So, with access to the individual RGB values, I wrote a little function that gently transitions the LED to its new colors over a variable time. You can find that code snippet at the end of this page. Thanks for making it this far!

[Image: IMG_4380.jpg]

Success! = {69,255,22}


// PWM pins driving the RGB LED (pin numbers assumed; match them to your wiring)
int redPin = 9;
int greenPin = 10;
int bluePin = 11;

int currentColor[3] = {0,0,0};

void setColor(int red, int green, int blue, int tweenTime)
{
  int r,g,b;

  //start from the current red and step toward the desired red
  r = currentColor[0];
  while (r != red) {
    if (r < red) {
      //if the desired red is larger than the current, increment the current until it reaches the desired value.
      r++;
    } else if (r > red) {
      //if the desired red is less than the current, decrease the current until it reaches the desired value.
      r--; 
    }
    analogWrite(redPin, r);
    delay(tweenTime);
  } //end while

  //do the same for green
  g = currentColor[1];
  while (g != green) {
    if (g < green) {
      g++;
    } else if (g > green) {
      g--;
    } //end if
    analogWrite(greenPin, g);
    delay(tweenTime);
  } //end while

  //do the same for blue
  b = currentColor[2];
  while (b != blue) {
    if (b < blue) {
      b++;
    } else if (b > blue) {
      b--;
    }
    analogWrite(bluePin, b);
    delay(tweenTime);
  } //end while

  //print to verify that the current colors are now the desired colors
  Serial.print("* Color Set To: ");
  Serial.print(r); Serial.print(", ");
  Serial.print(g); Serial.print(", ");
  Serial.println(b);

  //set currentColor to the new color
  currentColor[0] = r;
  currentColor[1] = g;
  currentColor[2] = b;
}

Tinkering with Arduino

I bought an Arduino Uno and I’ve been slowly making my way through the tutorials, learning about circuits. I started to stray off the beaten path and made a strange timer thing that prints out button presses on an LCD screen with a few custom characters (arrows). It’s not the most exciting thing in the world, but I’m ecstatic it actually works!

My n00b sketch:

#include <LiquidCrystal.h>
LiquidCrystal lcd(12, 11, 5, 4, 3, 2);
const int buttonPin = 8;
int buttonState = 0;
int lastButtonState = 0;
int pressCount = 0;
float timer = 0;
int padding = 0;

//this is a custom arrow up symbol
byte newCharUp[8] = {
 B00000,
 B00100,
 B01110,
 B10101,
 B00100,
 B00100,
 B00100,
 B00000,
};

//this is a custom arrow down symbol
byte newCharDown[8] = {
 B00000,
 B00100,
 B00100,
 B00100,
 B10101,
 B01110,
 B00100,
 B00000,
};

void setup() {
  //assign custom characters to (char) 0 and 1
  lcd.createChar(0, newCharUp);
  lcd.createChar(1, newCharDown);
  
  pinMode(buttonPin, INPUT);
  lcd.begin(16, 2);
  Serial.begin(9600); // needed for the digit-counting trick in loop()
}

void loop() {
  buttonState = digitalRead(buttonPin);
  
  if (buttonState != lastButtonState) {
    lcd.clear();
    lcd.print("Button Press: ");
    lcd.print(pressCount);
    if (buttonState == HIGH) {  
      lcd.setCursor(0, 1);
      lcd.print((char)0);
      lcd.print("UP");
    } else {
      lcd.setCursor(0, 1);
      lcd.print((char)1);
      lcd.print("DOWN");
    }
    lastButtonState = buttonState;
    if (buttonState == HIGH) {
      pressCount++;
    }
  }

  if (buttonState == LOW) {
    // Serial.println returns the number of characters written, including the
    // trailing "\r\n"; subtracting 2 leaves the timer's visible digit count,
    // which we use to right-justify it on the screen
    int digit = Serial.println(timer);
    padding = digit - 2;
    lcd.setCursor(15-padding,1);
    lcd.print(timer);
    lcd.setCursor(15,1);
    lcd.print("s");
    delay (100);
    timer+=0.1;
  } else {
    timer = 0;
  }
}

A Next-Gen TV Interface Concept

We all know the interface for TV is about due for a major upgrade. Clicking your way through long lists and typing on a remote control is a chore that has only been made marginally better by app-based remote controls. But even with these software remotes, the interaction is still inherently indirect and functions like an awkward trackpad. What you really want is to remove the abstraction and touch the TV screen directly… but without having to leave your couch.

To that end, I’ve been toying with an idea that would allow for just that. This concept demonstrates what the interaction could be like if TV apps could synchronize state and mirror their content to a mobile app, all while accepting touch input. I’m using the Apple TV as a starting point because it’s already got everything lined up for this to happen, and it already demonstrates a fraction of this potential with the Remote app and AirPlay.

Some takeaways I’ve discovered through this conceptual exercise:

  • We can (and should!) still support dedicated physical remotes. You will notice that all AppleTV screens can still be navigated with the standard up/down/left/right controls. This method acts as a complement to the tried and true clicker.
  • TV apps need to be designed with the mobile abbreviation in mind. The Hulu Plus TV app’s tab-based format presented a challenge to the mobile abbreviation.
  • The two representations of content should be as similar as possible to maintain the 1:1 perception. If you veer too far away, the user has to learn two different interfaces.
  • The user should not have to have all of the TV apps already installed on his/her device. By firing up a central app, all content on the TV should mirror to your handheld device.
  • Either the breakout box (Apple TV in this case) needs to act as the server to push content down directly to each mobile device or it could simply synchronize each client’s individual requests to the cloud.
  • AirPlay/Casting is one-way, pushing content from a mobile device to the TV. It was helpful for me to think about this concept as a two-way cast.
  • Particularly for second-screen content (like viewing screen-relevant IMDb info), there’s no reason why the mobile app couldn’t be used by multiple users simultaneously.
  • Apple’s probably already thinking along these lines as demonstrated in this patent filing here: http://goo.gl/RzLhIO