August 19, 2005

Modern disruptive technologies in enterprise software

Jason Hunter seems to believe that Ruby on Rails is a disruptive technology that will displace Java on the web tier.

RoR is certainly a very productive approach to building web sites, but it baffles me why people so often confuse "productive web framework" with "platform to run and operate an enterprise application". I suppose RoR may be disruptive to other web frameworks and technologies, but let's first recognize that Java is *not* the only one, and probably isn't the primary one. PHP and ASP.NET are pervasive.

It is completely unclear whether the RoR disruption (assuming it is a disruption, which has nothing to do with someone's blog entry, and everything to do with how the market reacts) will affect only the web framework and perhaps the JSP/servlet container market, or the entire J2EE application server market. I would believe the former, but have a hard time believing the latter. Jason seems to think RoR is targeted at replacing application servers and distributed transaction processors: "Like all disruptive technologies, it'll only get better. It will scale better. It will add two-phase commits and fancy message queues."

It is unbelievably frustrating to see it suggested that these features are in any way comparable to a web framework in terms of engineering effort, or that they are somehow sideshow features. Perhaps they are to an average web site, but that again assumes web sites will be the primary application for the foreseeable future. It takes hundreds of man-years of effort to build these kinds of things.
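
To make the scale argument concrete, here is a minimal sketch of only the happy path of a two-phase commit coordinator, in Java, using made-up names (Participant, Coordinator) rather than any real transaction manager's API. Everything it omits -- durable logging, crash recovery, timeouts, heuristic outcomes, XA interoperability with databases and message queues -- is precisely where the man-years go.

    // Illustrative sketch only: names are hypothetical, not any product's API.
    // A production transaction manager must also handle durable logging,
    // crash recovery, timeouts, heuristic outcomes, and XA interoperability.
    import java.util.Arrays;
    import java.util.List;

    interface Participant {
        boolean prepare();   // vote: true means "ready to commit"
        void commit();
        void rollback();
    }

    class Coordinator {
        // Phase 1: ask every participant to prepare.
        // Phase 2: commit only if all voted yes, otherwise roll everyone back.
        boolean execute(List<Participant> participants) {
            boolean allPrepared = true;
            for (Participant p : participants) {
                if (!p.prepare()) {   // any "no" vote aborts the transaction
                    allPrepared = false;
                    break;
                }
            }
            for (Participant p : participants) {
                if (allPrepared) p.commit();
                else p.rollback();
            }
            return allPrepared;
        }
    }

    public class TwoPhaseCommitSketch {
        public static void main(String[] args) {
            Participant ok = new Participant() {
                public boolean prepare()  { return true; }
                public void commit()      { System.out.println("committed"); }
                public void rollback()    { System.out.println("rolled back"); }
            };
            System.out.println(new Coordinator().execute(Arrays.asList(ok, ok)));
        }
    }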

The disruptive technology argument Jason is using is similar to the one Microsoft makes about Windows over Unix or Linux -- Windows has everything Unix/Linux has, only with better performance, productivity, and manageability. Yet Solaris, HP-UX, AIX, etc. are all still around in spades, and Linux seems like it may trump all of them with its own disruption.

Besides web frameworks, there are many disruptions on the horizon. The intense interest I see in integration technologies and web services, for example, is re-emphasizing the importance of high-speed, reliable messaging and data transformation and routing -- without having to write Java code. Another disruption is what I would call the "process & operations revolution", or "grid computing". Grids indicate a re-focus (which we lost in the PC era) on how to reliably handle the process of software development, provisioning hardware in a utility-based fashion, promotion/rollback of all changes, troubleshooting, monitoring, and diagnostics. This is arguably a major reason why Oracle rules the database world, and I think it may serve to hold off startup frameworks, languages, and platforms from capturing application server market share from the incumbents. It also intersects with, and is a necessary condition to support, SOA as another potential disruption -- which has much less to do with web services than with the drive to evolve from projects to product lines, and from applications to more manageable & re-usable services.

There are also many opportunities for incumbent vendors to start their own disruptions, or to adopt scripting languages and incorporate them into their platforms. There's already a trend to use Jython as an administrative scripting language in the BEA WebLogic community, for example.

Perhaps another way to look at the current environment is this: the past 15 years have seen developers as the driving force in IT: first the Windows developer base, then the Java developer base. I would claim that the open source movement has fragmented developer opinions so much between .NET, Java, and the "scripting language du jour" that the next major disruption in IT will not necessarily be developer-led. There's too much cacophony. I think it might be (for lack of a better term) "architect-led" or "infrastructure-led".

The focus on declarative configuration in modern frameworks (whether AOP or IoC or attribute metadata) is an indicator of this drive -- the next step is to disentangle the knowledge required to understand the chorus of frameworks and allow specialist roles to emerge, while an "architect" (in the "broad+deep developer" sense of the term, not the UML-junkie sense) ensures all the appropriate pieces are chosen, and the appropriate roles are filled by the people who can best do the work.
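
As a rough sketch of what attribute-metadata-driven configuration looks like (the annotation and the tiny "container" below are hypothetical, not any particular framework's API): the business code merely declares the policy it wants, and a specialist-owned piece of infrastructure decides how that policy is carried out.

    // Illustrative sketch only: a made-up @Transactional marker and a minimal
    // reflective "container". The point is the separation of roles -- the
    // business code states *what* it needs, the infrastructure supplies *how*.
    import java.lang.annotation.*;
    import java.lang.reflect.*;

    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.METHOD)
    @interface Transactional {}   // declarative marker: "run me in a transaction"

    interface AccountService {
        @Transactional
        void transfer(String from, String to, int amount);
    }

    class AccountServiceImpl implements AccountService {
        public void transfer(String from, String to, int amount) {
            System.out.println("moving " + amount + " from " + from + " to " + to);
        }
    }

    class Container {
        // The infrastructure reads the metadata and wraps the call accordingly;
        // the business code never mentions transactions at all.
        @SuppressWarnings("unchecked")
        static <T> T wrap(final T target, Class<T> iface) {
            return (T) Proxy.newProxyInstance(iface.getClassLoader(),
                new Class<?>[] { iface },
                new InvocationHandler() {
                    public Object invoke(Object proxy, Method m, Object[] args) throws Throwable {
                        boolean tx = m.isAnnotationPresent(Transactional.class);
                        if (tx) System.out.println("begin transaction");
                        try {
                            Object result = m.invoke(target, args);
                            if (tx) System.out.println("commit");
                            return result;
                        } catch (InvocationTargetException e) {
                            if (tx) System.out.println("rollback");
                            throw e.getCause();
                        }
                    }
                });
        }
    }

    public class DeclarativeConfigSketch {
        public static void main(String[] args) {
            AccountService svc = Container.wrap(new AccountServiceImpl(), AccountService.class);
            svc.transfer("checking", "savings", 100);
        }
    }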

Anyway, the computing industry has a hard time accepting, en masse, a new platform or language technology. Java was the fastest-adopted development platform in the history of computing for one reason: the Internet took off at that exact time. Before that, Windows was the fastest-adopted platform because it was the first GUI for PCs to achieve mass-market acceptance. It would take a major user-centred shift to bring about another language & platform revolution. Until that time, the cacophony will reign.

Posted by stu at August 19, 2005 07:28 PM