Ba Da Bing!
It's cold in hell today. Well, in my private corner of it anyway.
That's because my default home page - across all my browsers - was just changed to Microsoft's Bing.com.
In my world - that's really big news. I have friends who have responded with utter disbelief.
For the last 24 years I have been, you might say, generally anti-Microsoft. Or rather - I wasn't impressed with this company that had defaulted, and then bullied its way, into ubiquity slightly ahead of the availability of vastly better-designed systems (cough - Mac OS - cough).
Yes, of course I was, and to a large degree still am, an Apple fanboy. And yet when I think about the companies I would prefer to have rule the universe, I have always thought Google would make a slightly more benevolent ruler than either of the other two.
Over the last 24 years I repeatedly asserted that the day Microsoft developed a product better than Apple's, and later Google's, I would have no problem adopting it. And of course that was easy to say, because such a thing had never happened. Like, ever.
But for the last month I have been trying Bing, and guess what: it doesn't suck. In fact, it doesn't suck so much that it's actually really great. Dare I say - the greatest Internet search engine available today.
For over 10 years, Google had held the status of top subject in my private Internet kingdom - the first logo I saw every morning, and the most-used Internet tool every day. But all that changed today.
Using Bing, it's pretty obvious that search results are more relevant, videos are more immediate, dynamic and easy to navigate, and images are more relevant, numerous and easy to view. And, gird yourself, Google, I'm about to utter an alien phrase... it's cooler.
Using Bing, I realized that Google, the search engine, just slipped, unceremoniously, into the bottom half of the hourglass as an artifact of a previous time. A time when aesthetics necessarily fell by the wayside in favor of functionality and conservative technical etiquette. Business models had to actually work after the bubble burst - imagine that. And the growing tidal wave of newbie mom-and-pop internet users were still a little confused by all them thar buttons and interwebs and emails and such. Google's child-like branding and minimalist (read: mundane) approach to interface design and aesthetics made the company and its site friendly and accessible. ...Back then.
However, today, Google's obvious repulsion against anything remotely related to aesthetic beauty or adventurous U.I. has left it with all the design gravitas of a pocket calculator. Yeah, it works, but there is no joy in using it; it's not delightful, it's not cool.
As a Google corporate outsider it's hard to tell how much of Google's home page (and logo) - which has changed glacially in the last decade - was initially accidental or the result of advanced calculation, but in either case it worked at that time, and it's unlikely that anyone inside Google has been willing to take responsibility for messing with that success by fundamentally refreshing the product's appearance and behavior.
"DON'T TOUCH IT!" is the more likely conscience on the primary-colored campus.
But technology runs to commodity. And one day you wake up and the only difference between two competing products is aesthetics and an implied lifestyle.
In hindsight, "change it" is something Google needed to do some time ago. Embracing the risk, reward and uncertainty of great design would have given the company a chance. It might have pre-empted Microsoft's bid altogether. But you don't write algorithms to do that; you employ artists and unconventional U.I. experts, you trust their intuition and taste, and you delegate to them some directorial control. You don't drown them in statistics, limitations and testing. That procedurally kills good design.
Look at Apple - the poster child of industrial design and aesthetics working hand-in-hand with great technology.
Apple gets it. Pretty much always has. Except for maybe when Gil Amelio was there. And it's not like they don't do consumer testing. They do - but they value great design. And Google could learn a few things about consumers and marketing from the design powerhouse, if they would just pull away from the ones and zeros long enough to appreciate organic, intuitive creativity. But alas, outside the occasional visiting artist who is paid to perform during the lunch break (the videos we have all envied), Google does not seem to have any idea how to incorporate the intuitive creative sensibility into its products in a meaningful way.
I'm not saying Bing is some design nirvana - it's far from it. It even shares many similarities with Google. And I'm not saying that it is so advanced that Bing can't be unseated, but for now, it's just better than Google. And in the small, small world of search engine powerhouses, that's all that matters.
I will add that it appeared to be a rather unbecoming defensive move when Google announced an operating system initiative - just as the obvious superiority of Bing's search over Google's was settling into the Internet stream of consciousness. Perhaps a bid to steal some of Microsoft's thunder - or to keep them feeling the pressure of an inferiority complex that should be pretty well entrenched at Microsoft by now.
For now, Google's well-documented subservience to testing and data, and its aversion to artistic intuition, has done it this one infinitesimally small disservice: it has turned at least one staunch Apple fanboy and Google advocate into a Microsoft convert.
I'm writing this on a Mac. One that has never revealed a positive thing about Microsoft. And I still want Google, with its slightly more trustworthy corporate mission to "do no evil", to ultimately rule the technical universe. And yet...
Bing is now my home page.
Sorry Google, you have some work to do, Buddy.
The Digital Dark Ages
I have been developing Interactive work for over 15 years, and sadly, my son may never see any of it. That's because we are living in what future generations will undoubtedly call: The Digital Dark Ages.
This all came to a head with renewed force for me a few weeks ago, when an interactive agency contacted me as part of a vendor pitch. They were very proud of themselves for having "innovated a brand new kind of banner ad" - one that allowed the user to interact with the brand/store/product within the banner itself, all without leaving the page the banner was on. They went on to imply that it was the first time this had ever been done, and wasn't it a brilliant solution.
I generally agreed that it was the right direction - well, righter than the static alternative - except that it had been done before, and frankly, many times. I know because my old company, Red Sky Interactive, did it, to name one. A lot. And as far back as 10 years ago. And it worked then.
This isn't the first time I've come across such a disconnect from past efforts, especially in advertising. It seems to me that advertisers "discover" the same basic, big ideas a couple of times each decade. And each time it's hailed as a "truly innovative solution" all over again, as if it hadn't happened the first time. This doesn't just happen with banner ads either, but with all sorts of basic interactive principles, interface techniques, and solutions based on newly observed user behavior. I honestly don't think this is a case of selective memory; to their credit, I think they truly believe they invented the idea. In part because they probably had to. Redundant though it may have been.
---
Many years ago I was working on a project and needed to reference what I recalled was some aesthetically innovative interactive work in its time. I had the CD-ROM on my bookshelf - "The Dark Eye". It was an awesome piece of work, created by animator Doug Beswick, and featured really ground-breaking components, including beautifully designed stop-motion puppets. The packaging still looked awesome. It looks innovative even by today's standards. It was created in 1995, and when I attempted to run it... "the application that created it could not be found". I realized with some degree of concern that I had created a fair number of projects around that time, and before. I had saved those old interactive projects - all manner of files, dutifully copied and transferred and burned from machine to machine over the years - because they represented the bulk of my own body of work, and contained ideas and experiences that I wanted to keep for posterity. Many were first-of-their-kind innovations that won coveted awards and in some cases set industry bars.
I held my breath and double-clicked one of the pieces I was most proud of, and discovered that neither could its application be found. I tried every way at my disposal to open it, and only then fuzzily remembered that I'd created it with a program called "Video Works II" - long before its name was changed to Macromind Director - which was incidentally before the company changed its name to Macromedia, before the popular Internet, most certainly before Shockwave plugins, not to mention the arrival of Flash and its subsequent acquisition by Adobe. Needless to say, I no longer had the tools I'd created the piece with.
The implications slowly setting in, I rapidly double-clicked, and watched in breathless horror as project after coveted project sadly faded into digital abstraction - unreadable data - like film trapped on a reel. The only way they might see the light of day again would be if I went to tremendous effort to, technologically, go back in time and bring them forward with me, version by version, adjusting code along the way. The most recent of the "lost" pieces were roughly 5 years old.
That's the day I decided I lived in the Digital Dark Ages.
I believe that future generations will look back at these days and - except for the efforts of those few who are trying to "archive" portions and thin, top layers of the Internet - will have little idea of what was actually happening in Interactive media today. There will simply be a hole in our history, and no physical artifacts to remember it by. Lessons will be lost, only to be relearned. When you consider the mass of interactive work being created daily, it's unreasonable to think that all of that innovation will be effectively captured en masse and stored in a form that can be meaningfully revisited across a changing medium.
Our language, messages and artwork are only made possible through tools and platforms that will relentlessly evolve out from under our work. Confronted with this scenario, a surprising number of people have suggested "videotaping my work for posterity". But to me - an Interactivist - that entirely defeats the purpose. This is interactive work. You haven't experienced it unless you interact with it. Frankly, at the moment, it's interactive work that requires a mouse and keyboard. But even this hardware - the mouse - is on its way out. If we don't purposefully pursue a solution, we will need to admit that it's okay to let our place in history diminish with our work.
When I created that work, I had imagined, years from now, finding myself contemplating my waning life, but being able to look back at the great work I'd created. To show my son. I'd hoped, naively, that like the painters, sculptors, writers and film-makers of the past, my work might persist for future generations, and maybe even serve as a touch point now and then. I see now that that isn't likely for any of us.
There are a few possible solutions to this issue:
Update. Commit to regularly upgrading work, advancing it onto new platforms. This would require a scheduled effort and frequent re-coding. As platforms change, creators will have to rethink interface elements. Admittedly, this solution becomes exponentially more difficult over time.
Emulate. It may yet be possible - and hopefully will be in the future - to load any OS and software configuration from the past into what will undoubtedly be very capable computing environments. Hardware will have to be emulated as well... which poses some interesting design challenges, but hey - I can run Windows on my Mac, so maybe this isn't too far-fetched. I expect this is still a way off, however.
Museum. A museum of old systems/platforms could potentially display key work to future audiences. And I'll admit, that's how I view some of my work today. Unfortunately this does not extend well, and is restricted by physical limitations.
Let go. It now appears to me that, as Interactivists, we may be working much closer to live performance than we had ever imagined. Technology is merely our stage. Perhaps we need to cozy up to that idea, and walk in with our eyes wide open. The illusion of "persistent content" comes with the ability to "Save", "Duplicate" and "Burn". But in fact, Interactive work rests on a flowing stream of technology - a stream that ultimately carries it away, even while traditional media persists.
There is a fifth option: development of the Human Computer Interface Preservation Society. This effort is underway, and we will announce details as they become available.
In the meantime, interactive media - and more specifically, the language of interactivity - is still hovering in an awkward adolescent stage, a position it's been in for over a decade. The most expedient way we'll move beyond this state is if the innovative efforts of our current crop of talent, industry creatives and engineers more decisively build off of what was done before - not replicate it.
My advice to younger interactive developers: find and interact with a seasoned mentor (or several). They're out there, and I'm sure you'll find them willing to recall hidden efforts. Unlike in any other "recorded medium", the Charlie Chaplins, the Leonardo DaVincis, the relative "masters" of interactive media are still alive today, and for better or worse, the best, most complete source of information on the subject rests with them, not on the net in circulation. At least in the short term, it's the only way we can effectively build off the innovation and invention that came before us.
Tooth Hackers & The Ultimate Technology
Some time ago, I found myself thinking about all our amazing technical advances - especially those that beg moral questions- and I began a journey that changed the way I approach technology, and changed how I think of humanity... and headphones.
"Should we be doing that?" I thought.
Should we be cloning humans? Developing implantable chips, artificial intelligence or nanotechnology that may someday advance beyond our control? Will our technology unquestionably remain at our service? Will its advance really improve our odds of survival, or will it just change them?
Is technology good?
Virtually every really bad doomsday movie launched from this string of questions. But even so, there are few certainties in life. Death being one. And, I need to add one other absolute certainty to that short-list:
- Man-made technology fails.
I have never used a technology that was perfect. It always breaks - it always reveals vulnerabilities - it always, always fails at some point. The safety features have safety features, and yet they still experience absolute breakage, and miscalculation, and breaches, and failures. We humans have never - ever - created a technology that does not ultimately fail in totality.
Oh, and headphones suck.
When I was 12 I got my first Walkman. That's back when it was the Walkman. If your family owned a B&W TV, then I bet you remember this moment too - trying it in the store and putting those small headphones to your ears and being stunned at the audio quality. It really was rich and vibrant. A huge improvement over the big ostrich egg headphones of the previous decade. A few weeks ago it occurred to me that the headphones I have attached to my computer today are roughly identical to the pair that came bundled with my Walkman in the early 80s. Actually, my new ones are a little clunkier. That was almost 30 years ago. 30 years.
I mean, I see people walking down the street today with headphones on, wires dangling, twisted, draped into some inner pocket, and the whole thing looks so... a-really-long-time-ago-ish. Definitely not futuristic. Definitely not the audio equivalent of, say, the iPhone. Oh, so now you can shove them in your ear. Hi-tech.
And then there's the blocky blinky blue wireless light that the really important high-powered executives opt for. Cyborg Lawyering their way through lunch. As an aside - is there anything more passively annoying than those guys that leave their little blinky Bluetooth headsets hanging over their ears when they're not even talking to anyone? Eesh. It's always guys in suits with the WSJ. The look-at-my-cell-phone-attached-to-my-belt-guy, ten years later. "No no, you look really cool."
Anyway, then I read about a chip that could be implanted into my tooth, like a filling, and this chip would receive a WiFi signal, vibrate my jaw bone, which is very, very close to my inner ear, and I would hear crystal clear music, and make invisible phone calls. My first thought was that the brand "Bluetooth" was wasted on the current state, and that blinky, blue teeth might be kind of cool at a concert. But my second thought was that this type of implant must be the inevitable advancement of headsets - the shedding of a "thing" that I need to carry altogether. And maybe that's still right. Seems like a logical progression. I mean, I would never do it, but I'm a technical immigrant. My son, who was born the same month as the first iPhone? He will, despite my protests.
And that's where these two strings reconnect for me - that chip in your tooth is going to go bad. Or worse - maybe some complete ass with a good sense of humor decides to hack it. You know, hacking isn't something you can stop. If it is decided that a thing should be hacked, it will be. And someone will most certainly wish to hack all the literal blueteeth that all the futurey people use to listen to their iThings. I imagine a large percentage of the population suddenly doubling over in pain as that scene from Superman The Movie involuntarily blasts through their jaws. "Only one thing alive with less than four legs can hear this frequency, Superman..."
My son will still get one. ...and yeah, that was Lex Luthor.
But this thread caused me to realize that all of this - is inevitable. Technical development does not stop. It can't, because it's flawed. Or rather, we are. And we have to fix it, or us rather. Because technological development is an inexorable part of being human - a primal, fundamental outgrowth of tool-use, our instinctual drive to decrease pain and seek pleasure as a means to survival - linked to our very biology. Our minds are tools that we can't turn off or put away, and with reason, and with creativity, comes the ability to envision improvements in our condition. It's not limited to culture or time.
We all contribute to the advance of technology - with every thermostatic adjustment, every new pair of shoes, and how much Air makes them soft enough? - we continuously try to improve our condition through the use of our tools, no matter where on Maslow's Hierarchy of Needs we sit. It is the very basis of human existence, and our life's activity until death, and if only we could put that off a little longer, and then maybe a little longer still, and you quickly find yourself wondering where all the advancement ends.
At what point have we achieved perfection, such that no further technical development is necessary? Incidentally, the answer to that can be found embedded deeply in virtually every religion.
When we live forever, in eternally-increasing ecstasy. The ideal state. Then we'll be done.
Until we reach that state of being, you know, we will always see room for improvement in our current technology.
Can we stop the advancement of technology? To consider such an idea is to contemplate the end of humanity. There is no line separating human from technology. And there is no line separating technical advancement from survival of the individual, or the species.
As we survive, we use technology. As we imagine, we advance technology.
Should we be doing that? I don't think we have a choice. The advance of technology is a law of humanity.
Technology is not good or bad. It is us.