The NYTimes is running an article about life hackers. This quote sums it up perfectly: “Information is no longer a scarce resource – attention is”. A large portion of my HCI classes were spent discussing how to notify people without actually interrupting them (a.k.a. attentive user interfaces). It’ll end up being one of the larger problems facing the computer industry for at least the next 10 years.
We’ve reached a threshold in terms of information availability and the devices we use to interact with that information, be it iPods, cellphones, laptops, whatever. What we need are devices that work for us. A microwave is pretty convenient, but wouldn’t life be a lot easier if your frozen dinner had an embedded RFID chip that told the microwave exactly how to cook it? What about that punk kid on a skateboard listening to music? Should you hit him with your car, or have it send a message to his iPod? The ability to invade music devices probably isn’t that smart, but you get the idea.
The Human Media Lab at Queen’s does a lot of work with gaze and speech recognition. For a fourth-year class, we had to use a webcam to track a red ball and map its position to mouse co-ordinates (i.e. simulating infrared iris tracking). It was one of the cooler projects that I worked on at school. If you’re taking a CS degree at Queen’s, I highly recommend taking the two HCI courses. The lab also has a weblog, but it only sees sporadic updates.
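The core of that project boils down to two steps: find the centroid of the reddish pixels in each frame, then scale that point from camera resolution up to screen resolution. Here’s a rough sketch of the idea in plain Python (the function names and the nested-list frame format are mine for illustration; the real project would grab frames from a webcam library rather than a toy array):

```python
def find_red_centroid(frame, threshold=150):
    """Return the (x, y) centroid of strongly red pixels in a frame.

    `frame` is a list of rows, each row a list of (r, g, b) tuples.
    Returns None if no red pixels are found.
    """
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, (r, g, b) in enumerate(row):
            # Crude color threshold: lots of red, little green/blue.
            if r > threshold and g < 100 and b < 100:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))


def to_screen(point, cam_size, screen_size):
    """Scale a camera-space point to screen (mouse) co-ordinates."""
    cx, cy = point
    cam_w, cam_h = cam_size
    screen_w, screen_h = screen_size
    return (int(cx * screen_w / cam_w), int(cy * screen_h / cam_h))


# Tiny 4x4 "frame" with a single red pixel at column 2, row 1.
RED, BLACK = (255, 0, 0), (0, 0, 0)
frame = [[BLACK] * 4 for _ in range(4)]
frame[1][2] = RED

centroid = find_red_centroid(frame)                 # (2.0, 1.0)
print(to_screen(centroid, (4, 4), (1024, 768)))     # (512, 192)
```

In practice you’d smooth the centroid across frames so the cursor doesn’t jitter, but the mapping itself really is that simple.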
For years, we’ve talked about Swiss-army-style devices that can do almost anything (cellphone/camera/pda/whistle/discoball), but we tend to limit our focus to handhelds. Eventually, we’ll start seeing more things along the lines of the new iMacs from Apple, with an integrated remote and camera. The cameras will allow programmers to do things like eye tracking and gaze recognition, although they also open up the possibility of a malicious virus or trojan that steals retina data or takes rogue pictures.
Regardless, I just want my hovercar.