Why do we have to reinvent the wheel every few years?
One of the problems we as technologists have is that we sometimes get too excited over the new “new thing”. We forget that the old “new thing” we just got over is finally doing what it should, and that the new “new thing” really doesn’t hold much value for our solution. We fantasize about what it would mean to streamline performance, enhance resolution, or scale to zillions of users, but we forget to ask: “Does the solution require this feature? Does the business context – budget, schedule, ROI, etc. – enable us to do this? Is the risk involved worth the reward?”
This is the issue facing many engineers who have migrated from engineering positions into managerial positions. It’s also a challenge when managing talented technologists. Naturally, their excitement and passion are for learning and adopting new technologies; this is their form of “continuous improvement”. From a business perspective, however – and returning to the metaphor in the title – getting from point A to point B does not require evolving from a car into a teleporting machine. For one thing, cars have been around for a while and almost everyone has a license. Teleporters, well, they have only been around in movies (oh, and video games). So, really, do we need to risk our particles being “lost in space” just to get from point A to point B in a split second? Or could we settle for doing it in a few minutes?
This analogy rings true for many of the evolutionary and revolutionary cycles we witness almost daily in the field of technology. Don’t get me wrong: I am all for innovation and new technology. But I prefer to adopt and integrate technology with inherent value to my business (or my customer’s business) rather than just for its “coolness”.
I’d like to bring this into perspective for anyone who’s reading this and thinking, “Ok, what’s his point?”. Those of us who endured – and are still living through – the rapid evolution of the web found ourselves constantly reinventing the wheel. We adopted, integrated, and threw away new technologies at breakneck speed. We saw tens of programming languages, application development frameworks, development patterns, and complementary technologies come and go. Yet, throughout all this, the only major change in what our users require from their business apps is that they love the accessibility (almost anywhere with an internet connection) and performance (no nonsense, no installation, seemingly endless processing power) of browser-based applications. Business applications still have data-entry forms. They still have multi-record, data-driven views; they still have multi-window interfaces, menus, and tabs, just as they’ve always had (well, at least since we all more or less settled down with Windows).
But in order to enable browser-based applications to provide this level of usability and functionality, we have invented, adopted, surrendered to, or thrown out a number of promising technologies: Java applets, server-based computing solutions, Flash, AJAX, HTC components, web controls, etc.
My ultimate question is: Why? Why do we have to change our ways and throw out our best assets, skills, and experience just for the sake of new technology? Why can’t we teach our old dogs new tricks? Why do we have to shift from “applications” to “web sites” to “web applications” until the next “new thing” arrives? Can’t we just keep developing applications with tried and tested methods, tools, and technologies, and simply have them “magically” leverage these new technologies?
Why can’t new technology manifest itself in the “plumbing” of the web rather than in the forefront – at least from the perspective of business application development? Well, these are the questions I am committed to trying to answer in my new venture, Gizmox, developers of the Visual WebGui platform. I’d love to hear your experiences, so feel free to comment.