Introduction
There's been lots of speculation about Twitter and what it means to the modern technologist. I've found some of it pretty insightful and some of it misinformed. I use Twitter. A bunch. Not as much as some, but enough. I like it.
The Best Defense...
I don't intend to defend Twitter, because I do not believe it needs defending.
This post is not intended to change your mind about Twitter. I'm not trying to convince you that you should or shouldn't use it - I'm not telling you "you're missing out" if you're not out there building your Follower Count each and every day. Honestly, I don't know if you're missing out or not. So I'm also not telling you that you aren't missing out.
You may, in fact, be missing out.
All that follows is my opinion: an opinion formed in the context of my experiences with the industry, first as a hobbyist and then as a professional, since 1975.
What Twitter Isn't...
Twitter isn't micro-blogging.
Twitter isn't slow IM.
Then What Is It?
That, I think, is an excellent question! I'm not going to try to answer it all at once. If you simply cannot bear to wade through my attempts at logic, you can skip to the "Andy's Answer" section to read my answer.
First, A Bit Of History:
In the beginning, there was ENIAC. By today's standards, ENIAC was large and clunky. Other systems followed ENIAC, most of them very large machines that filled rooms and entire buildings with less computing power than almost any modern computing device.
One problem with these machines was time. Scientists, engineers, and normal people had to schedule time on these machines. They had to physically travel to the computer to load and then execute programs. There was usually a single terminal, and it was in the room (or one of the rooms) with the computer.
So terminals were added.
Now this may seem like a small change, but it was huge at the time. One of the effects of the terminal was that it created an external architecture. This architecture evolved as terminals became smarter, and then the workstation came to be. Eventually, workstations became connected to larger computers and the architecture evolved again.
In my wee mind, I imagine this as a tectonic process. New (emerging) architectures grow the way divergent plate boundaries do. Magma is the medium and engine of tectonics; ideas are the medium and engine of architectures. As the pressure of new ideas spreads existing architectures apart, new layers are formed.
Returning to my (spotty) history of architecture: demand for more computing time drove parallel architectures (terminals), then more intelligent terminals, while another layer of smaller personal computers grew alongside - personal computers that eventually replaced terminals in client-server architectures.
Architectures continued to evolve into n-tier, which eventually evolved into the modern cloud. It hasn't been clean or easy, but it's here - if only in its infancy.
Whenever architectures evolve, they do so organically and naturally. That's one reason I like to look for natural and organic processes for analogies. In the real world, nothing grows in smooth lines or gently increasing curves. In real life, things crack and expand and bifurcate, and there's entropy and cycles and fractals all over the place.
Andy's Answer
Twitter is something completely new.
It is a new layer in the architecture. Developers are finding all sorts of small, niche-filling, cool things to do with it. It's simple. It's elegant. It's one of the new things to pop out of the idea magma to build the latest part of the zeitgeist continent.
Classify it otherwise if you will, but (in my opinion) you do so at your own risk.
Conclusion
The preceding words are merely my opinion. I welcome your thoughts.
:{> Andy