A few days ago I was reading The Cathedral and the Bazaar: Musings on Linux and Open Source by an Accidental Revolutionary (abbreviated CatB), an essay by Eric S. Raymond on software engineering methods, based on his observations of the Linux kernel development process and his experiences managing an open source project, fetchmail.
At the end I found a really interesting conclusion, which I am posting here.
Some programmers worry that the transition to open source will abolish or devalue their jobs. The standard nightmare is what I call the "Open Source Doomsday" scenario. This starts with the market value of software going to zero because of all the free source code out there. Use value alone doesn't attract enough consumers to support software development. The commercial software industry collapses. Programmers starve or leave the field. Doomsday arrives when the open-source culture itself (dependent on the spare time of all these pros) collapses, leaving nobody around who can program competently. All die. Oh, the embarrassment!
We have already observed a number of sufficient reasons this won't happen, starting with the fact that most developers' salaries don't depend on software sale value in the first place. But the very best one, worth emphasizing here, is this: when did you last see a software development group that didn't have way more than enough work waiting for it? In a swiftly changing world, in a rapidly complexifying and information-centered economy, there will always be plenty of work and a healthy demand for people who can make computers do things—no matter how much time and how many secrets they give away.
For purposes of examining the software market itself, it will be helpful to sort kinds of software by how completely the service they offer is describable by open technical standards, which is well correlated with how commoditized the underlying service has become.
This axis corresponds reasonably well to what people are normally thinking when they speak of "applications" (not at all commoditized, weak or nonexistent open technical standards), "infrastructure" (commoditized services, strong standards), and "middleware" (partially commoditized, effective but incomplete technical standards). The paradigm cases today in 2000 would be a word processor (application), a TCP/IP stack (infrastructure), and a database engine (middleware).
The payoff analysis we did earlier suggests that infrastructure, applications, and middleware will be transformed in different ways and exhibit different equilibrium mixes of open and closed source. It also suggested the prevalence of open source in a particular software area would be a function of whether substantial network effects operate there, what the costs of failure are, and to what extent the software is a business-critical capital good.
Infrastructure (the Internet, the Web, operating systems, and the lower levels of communications software that has to cross boundaries between competing parties) will be almost all open source, cooperatively maintained by user consortia and by for-profit distribution/service outfits with a role like that of Red Hat today.
Applications, on the other hand, will have the most tendency to remain closed. There will be circumstances under which the use value of an undisclosed algorithm or technology will be high enough (and the costs associated with unreliability will be low enough, and the risks associated with a supplier monopoly sufficiently tolerable) that consumers will continue to pay for closed software. This is likeliest to remain true in standalone vertical-market applications where network effects are weak. Our lumber-mill example earlier is one such; biometric identification software seems likeliest, of 1999's hot prospects, to be another.
Middleware (like databases, development tools, or the customized top ends of application protocol stacks) will be more mixed. Whether middleware categories tend to go closed or open seems likely to depend on the cost of failures, with higher cost creating market pressure for more openness.
To complete the picture, however, we need to notice that neither "applications" nor "middleware" are really stable categories. Earlier we saw that individual software technologies seem to go through a natural life cycle from rationally closed to rationally open. The same logic applies in the large.
Applications tend to fall into middleware as standardized techniques develop and portions of the service are commoditized. (Databases, for example, became middleware after SQL decoupled front ends from engines.) As middleware services are commoditized, they will in turn tend to fall into the open-source infrastructure—a transition we're seeing in operating systems right now.
In a future that includes competition from open source, we can expect that the eventual destiny of any software technology will be to either die or become part of the open infrastructure itself. While this is hardly happy news for entrepreneurs who would like to collect rent on closed software forever, it does suggest that the software industry as a whole will remain entrepreneurial, with new niches constantly opening up at the upper (application) end and a limited lifespan for closed-IP monopolies as their product categories fall into infrastructure.
Finally, of course, this equilibrium will be great for the software consumers who are driving the process. More and more high-quality software will become permanently available to use and build on instead of being discontinued or locked in somebody's vault. Ceridwen's magic cauldron is, finally, too weak a metaphor—because food is consumed or decays, whereas software sources potentially last forever. The free market, in its widest libertarian sense including all un-coerced activity whether trade or gift, can produce perpetually increasing software wealth for everyone.
This excerpt is taken from: http://www.catb.org/~esr/writings/cathedral-bazaar/magic-cauldron/ar01s1...