About…six years ago, we used to make fun of our boss because he always talked about “ubiquitous computing.” At the time, it was the catchphrase, thrown around like it meant something more than just a yet-to-be-realized utopian idea.
The idea that computing technology (define that as you wish, but I think most people will get within striking range of the same definition) would become so common that everyone had it everywhere was kind of the holy grail concept a while back. It was logical – a generation of users that grew up with technology everywhere, devices getting smaller and more powerful by the day, and more and more connectivity as wireless networks popped up here and there. But we weren’t there yet. Phones were not yet smart, computers were still too slow, and the best we had were the few local wireless networks around. Broadband cards weren’t even an option then (and the speed on those networks would have been unusable anyway).
We are almost there. First, the iPhone got us 99% of the way. Most people who know me would find it surprising that I’d give so much credit to an Apple product (I’m not against them, but I dislike the fanboy atmosphere that surrounds their products and the company). But even if you look at Android phones, Symbian phones, WebOS devices…they all started from the iPhone: an incredibly powerful device that also had an everyday purpose – cell phone – that made it simple, logical, and easy to always have around. With faster network speeds, Wi-Fi options, and increased competition, the features of smartphones in general have reached the point where I do a lot of just plain web surfing on my phone. Yes, I use Google Reader to keep up with RSS feeds, and another app to find restaurants, etc., but sometimes I just do a Google (voice) search and see what happens. I use the phone like a computer.
Many would say that with these smartphones we’re already at the point of everywhere computing. However, the last 1% that I leave out is a critical one: how do we connect all of these devices and the content that they provide – and, more importantly, the content that we create – together into one big mesh that is our “work”?
Right now, I use dropbox.com and box.net to bring a lot of my stuff together. I have the apps on my phone, the files on my computers and online. I work on a file on my computer at my desk at home, e-mail it to colleagues from my phone while on the train to work, then open it up on my other computer at work to get more done. My address book is all Google now. I am thwarted in my efforts to keep it up to date with work contacts (to keep it truly my “one” address book), but overall all of my contacts are there, in Google. With the right software on the server side, I can now look at my work calendar and e-mail on my phone and any computer. I can tie the calendar in with Google Calendar (kind of) and then view multiple calendars at once to know all of the things going on in my oh-so-busy life.
As I look at students today, here at the law school, working with tablets and doing some really innovative stuff with organizing large amounts of data, part of me says “it’s really here! Now we have these ultra-portable, highly usable mobile devices! Ubiquity!”
But then I see someone e-mail a file to themselves so they can work on it later. Or they have to refer to notes on the iPad while writing a full brief on the computer. That is a massive disconnect, and one which we, as technology staff in higher ed, must remedy.