And the web is the native infrastructure for information dissemination, exchange, and sharing.
Old habits die hard

Most software developers in circulation today cut their teeth learning how to implement computing algorithms. Something akin to: "here is a jumbled sequence of numbers, write a computer program that will sort them in ascending order".
The emphasis in such an exercise is always on exhibiting a reasonable amount of direct control over the execution logic. Would-be programmers have been groomed for generations to master the skill of gaining full control over the stream of instructions the CPU executes. In other words, to them, being a computer programmer or software developer implies having full control over every aspect of the processing.
It's a single point of human control, just as there is a single point of machine control (the Central Processing Unit, or CPU).
But in the world of the web, such notions of centralized control are not only ridiculous, they are downright harmful. Still, armies of CPU-trained programmers have now been employed and deployed to develop software that will run on the web.
Is it any wonder, then, that such 'central point of control' minded individuals are creating fully dysfunctional web sites? Indeed, old habits do die hard.
New habits are slow to emerge

What the situation described above means is that the existing workforce dedicated to developing software is pretty much a write-off. Because old habits die hard, it will likely be harder to unlearn the old habits and relearn new ones than to start from a clean slate.
However, given the vast demand for web-based software, we must work toward some sort of compromise in an attempt to bring the crusty old 'central point of control' development workforce into the web fold. For that reason, we need to keep explaining the difference between the Central Processing Unit software architecture and the Resource Oriented Architecture.
Transitioning from doing to managing

One of the hardest things to do in life is to make a successful transition from doing something to managing that same thing. The temptation to roll up one's sleeves while managing the process is simply too strong to resist. That's why not many people end up being good management material.
In the world of software development, right now there is a huge demand for people who are capable of abstaining from rolling up their sleeves and instead delegating the responsibilities to the 'workforce' out there.
What does that mean? In more technical parlance, it's been proven that once software developers get trained in making procedure calls when attempting to process some information, they tend to get addicted to that gateway drug. Just as everything looks like a nail to a person with a hammer, every information processing challenge looks like a string of procedure calls to a programmer who has learned how to make them. This algorithmic frame of mind, once contracted, is very hard to get rid of. And in order to become an efficient web developer, each and every procedure-besotted person must learn to let go of the procedure call driven world view.
Transitioning from local to distributed procedure calls

Most people learn how to program computers by implementing local procedure calls. What that means is that their world view is CPU-centric: the executable code sits, in its entirety, on a local box, usually right next to the developer's desk. The developer then intuitively grasps the world as consisting of a single Central Processing Unit which acts as a traffic cop, deciding who gets to do what and when. The 'who' in the previous sentence refers to a procedure. Typically, a programmer will modularize his code into a fairly large number of procedures. These procedures are called at opportune moments, perform the intended processing, and then recede into the background.
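The single-CPU, traffic-cop style of decomposition described above can be sketched in a few lines. This is a hypothetical example (the procedure names are mine), using the very sorting exercise mentioned earlier:

```python
# A typical single-machine program: all logic lives in local procedures,
# and a single thread of control decides who runs, and when.

def read_numbers(raw):
    """Parse a jumbled sequence of numbers out of a string."""
    return [int(token) for token in raw.split()]

def sort_ascending(numbers):
    """The classic exercise: sort the numbers in ascending order."""
    return sorted(numbers)

def format_report(numbers):
    """Render the result for display."""
    return ", ".join(str(n) for n in numbers)

def main(raw):
    # The 'traffic cop': one point of control calls each procedure
    # at the opportune moment, then lets it recede into the background.
    numbers = read_numbers(raw)
    ordered = sort_ascending(numbers)
    return format_report(ordered)

print(main("5 3 9 1"))  # 1, 3, 5, 9
```

Everything here is local and synchronous; the programmer's control over the flow of execution is total, which is precisely the habit that does not survive contact with the network.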
This is the prescribed way of teaching newcomers how to develop software. It willfully ignores real-world concerns, such as the fact that there is very little use for the 'desert island', 'single sink' scenario, in which a computer user sits alone, engaged in a 'glass bead game' with the computer. In the world of business at least (and also in the world of entertainment), multiple users are a must. Meaning, CPUs and people will be networked (999 times out of 1,000).
There inevitably comes a time when a would-be software developer realizes that a transition from a single CPU to multiple CPUs connected via a network is unavoidable. At that point, the newbie is usually struck by the immense complexity such a transition entails. And it is at that point that newbies all over the world decide to drag the procedure call world view into the networked world.
Now the problem of control in a multi-CPU environment arises. Who's in charge here? It used to be that a single human mind was in full control of the execution of computing instructions, via the single CPU. Now, all of a sudden, multiple CPUs distributed across vast networks enter the equation. The remote nodes start posing a problem when it comes to controlling the proceedings.
The only way for such developers to try to tame this complexity is to wiggle in their chairs and fabricate some harebrained scheme for enabling remote procedure calls (RPC). Various mind-boggling protocols and frameworks get implemented (CORBA, RMI, DCOM, SOAP, SOA, and so on). All of them are incredibly arbitrary, non-compelling, frighteningly complex, and brittle.
As we can see, the transition from making local, simple-minded procedure calls to making distributed, remote procedure calls never goes well. In the end, it inevitably results in fragile, brittle software that requires constant duct-taping and chewing-gumming to stay afloat.
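The brittleness complained about here comes from coupling: the client-side stub must mirror the server's procedure signatures exactly. A minimal sketch of that coupling, with the 'remote' server simulated in-process (real systems would route this through CORBA/RMI/SOAP machinery; the class and method names are illustrative, not from any real framework):

```python
# The RPC worldview: the client holds a stub whose methods mirror
# procedures living on a remote machine.

class OrderServer:
    """The server's arbitrary, unilaterally invented protocol."""
    def create_order(self, customer_id, item, qty):
        return {"id": 1, "customer": customer_id, "item": item, "qty": qty}

class OrderClientStub:
    """Client-side stub, coupled to the exact procedure names and
    parameter lists above. Rename a parameter, reorder an argument,
    or rev the server's interface, and every deployed client breaks."""
    def __init__(self, server):
        self._server = server  # stands in for the network transport

    def create_order(self, customer_id, item, qty):
        # In a real RPC system this call would be marshalled, shipped
        # across the wire, and unmarshalled on the other end.
        return self._server.create_order(customer_id, item, qty)

stub = OrderClientStub(OrderServer())
order = stub.create_order(42, "widget", 3)
print(order["qty"])  # 3
```

The local illusion is seductive, which is exactly why it spreads: the remote call looks identical to a local one, while hiding latency, partial failure, and the version skew that makes such systems fragile.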
Who's going down with the ship?

It is obvious to even a casual observer that the majority of software projects, today as in the past, fail miserably. As the complexity of business scenarios continues to increase at a frightening rate, this rate of failure will only skyrocket.
At this pace, it won't be long before the software development industry turns into another Titanic, if the majority of developers continue to charge down the remote procedure call path. Developers who refuse to go down with the ship have only one option -- abandon the RPC ship and hop into the ROA boat.
But as we've already seen, the transition from navigating a huge, unwieldy ship to rowing a tiny, nimble ROA canoe is proving more challenging than most developers are prepared to bear. That's why a major shakeout must lie ahead of us, one that will separate the monolithic, totalitarian software practices from the nimble, distributed ones. And ROA practices certainly stand on the opposite side from monolithic remote procedure call practices.
How to make the transition?

It seems to me that, in making the transition from RPC to ROA, the biggest stumbling block for entrenched software developers is that ROA proposes erecting an impenetrable barrier between the client and the resources on the server. In the bad old RPC world, such barriers are unheard of: if the client is privy to the protocol that the service-oriented server has arbitrarily and unilaterally made up, the client can intimately manipulate the state of the server.
No such thing is ever possible in the ROA world. And that's the biggest put-off for the honest-to-god procedural crowd. They simply don't seem capable of conceiving a world in which it would not be possible to 'crack the code', learn the protocol, and then rule the server on the other end.
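What stands behind that barrier is the uniform interface: in ROA, the client never invokes the server's procedures; it can only exchange representations of resources, identified by URIs, through a small, fixed set of verbs. A minimal in-process sketch (the store class is hypothetical; the two methods echo HTTP's GET and PUT):

```python
# The ROA alternative: no server procedures are exposed. The client
# manipulates representations of resources through a small, uniform
# interface, addressed by URI.

class ResourceStore:
    """A toy resource space exposing only two uniform operations."""
    def __init__(self):
        self._resources = {}

    def GET(self, uri):
        # Retrieve a representation of the resource's current state.
        return self._resources.get(uri)

    def PUT(self, uri, representation):
        # Replace the resource's state with the supplied representation.
        self._resources[uri] = representation
        return representation

store = ResourceStore()
store.PUT("/orders/1", {"item": "widget", "qty": 3})
print(store.GET("/orders/1")["item"])  # widget
```

Note that there is no protocol to 'crack': whatever the server does internally, the client's vocabulary never grows beyond the uniform verbs, so server internals can change freely without breaking any client.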
So in order to make a successful transition to ROA, these software dictators must learn how to abdicate their dictatorial thrones. What are the chances of that happening en masse any time soon?
Slim to none? What do you think?