Jul 24, 2007
I recently listened to a podcast in which Matt Mullenweg, one of the WordPress founders, explained that he and other developers try to push out new releases of WordPress as quickly as possible -- every 90 or 120 days. He said users like an accelerated development cycle. Software that stagnates, updated only once every year and a half, loses its appeal to users.
I like change, even when it's not always for the better. Change is evidence of a creative, active mind experimenting and engaging in life. Because the IT industry thrives on rapidly changing technologies, those who work in IT should embrace change rather than fight it.
For some, change holds the promise of a more intriguing life. In a report titled "2007 Ten-Year Forecast" from the Institute for the Future, the authors begin:
We suspect that, like migrants, we humans are all beginning a long trek through strange territories we have only rumors of. The rumors come in many forms from many sources, some more reliable than others.
We hold the hand of technology as it leads us through new, strange lands. Intriguing rumors of what we'll find (cyborg cultures, apocalypses) compel us forward.
In an article on how technology is changing IT, CNN writer Nora Isaacs says,
According to the experts, the next century looks like an airplane runway that never ends, with the technology careening faster and faster toward an unknown terrain.
Unlike more predictable fields, technology moves so quickly that its developmental pace is exponential. Whereas some fields have changed little in hundreds of years, you can barely predict how technology will transform society in 20 years. The future is an unknown.
Raymond Kurzweil, a popular futurist author who believes an event of profound human change, known as the Singularity, will transform the fabric of humanity, begins his essay on the law of accelerating returns with the following:
An analysis of the history of technology shows that technological change is exponential, contrary to the common-sense 'intuitive linear' view. So we won't experience 100 years of progress in the 21st century—it will be more like 20,000 years of progress (at today's rate). The 'returns,' such as chip speed and cost-effectiveness, also increase exponentially. There's even exponential growth in the rate of exponential growth. Within a few decades, machine intelligence will surpass human intelligence, leading to the Singularity—technological change so rapid and profound it represents a rupture in the fabric of human history. The implications include the merger of biological and nonbiological intelligence, immortal software-based humans, and ultra-high levels of intelligence that expand outward in the universe at the speed of light.
Change is not linear. Those who experience stress at today's changes will hardly survive when the pace is tripled or magnified twenty-fold.
Paul Saffo, a well-known figure in future studies, explains that every 30 years or so, a new technology transforms society. In the early part of the 20th century, the transforming technology was chemistry (medicines, plastics). Then developments in physics followed (atomic energy, circuits). Now information technology is transforming society. In the 21st century, the change will be biological.
For some reason, many technical writers have an aversion to new technologies. So often I hear people ask questions like,
What problem did you have that caused you to seek out and implement X technical solution? (As if you can only explore a new technology as a solution to an existing problem.)
How will this change things for the better, making it easier and more beneficial for the users? (As if emerging technologies must only have proven, tangible user-benefits before they can be considered.)
These are good questions, but with emerging technologies it's not always possible to know the consequences or benefits. Particularly with wikis and blogs, it's hard to see the larger effect they will have. When the microchip was first developed, an IBM engineer asked, "But what . . . is it good for?" (1968).
The history of the computer is full of pessimistic reactions to its importance. In 1977, the founder of Digital Equipment Corporation said, "There is no reason anyone would want a computer in their home."
Exactly what we're moving towards isn't always clear, but it's intriguing. Christopher Meyer and Stanley Davis explain that "everything is becoming electronically connected to everything else: products, people, companies, countries, everything."
As information workers in IT, we should keep in mind the importance of change within our industry. I've seen incredible resistance to new tools (such as with the RoboHelp/Flare debate) simply because some tech writers are resistant to change. (In Keirsey speak, they are "guardians" rather than "innovators.") However, we should be the earliest adopters of technology, exploring and analyzing its potential uses.
Skeptics of new technology are quick to point out the disadvantages and potential problems associated with emerging technologies. Hollywood dystopias dramatize the paradox of progress: each step forward is often also a step into a nightmare. One writer explains:
"Rarely does a new technology solve a problem without some type of external byproduct that we can't envision initially" (Joe Vanden Plas, Wisconsin Technology Network).
Even with failures and drawbacks, we keep pushing forward, experimenting with new ideas and techniques. Technology creates change, and the change forces new technology (Isaacs). The cycle continues and feeds on itself.