Importance Of RAID Technology

RAID is an advanced storage technology that also plays a central role in lost-data recovery. In the event of a failure, it is highly valuable because it enables users to recover the RAID array. The technology needs a substantial amount of hard disk space to work properly. In simple terms, it works like this: at the lower level, the data is divided into mirrored pairs; in the next stage, the controller selects a member from each mirrored pair and stripes the data into a new logical volume. There is a good reason this technology is gaining popularity with each passing day: plenty of devices, such as NAS and SAN systems, external hard drives, controllers, and servers, use this array level. With the help of RAID recovery, you can recover data stored on desktops and laptops as well as SQL and Exchange servers. How much data can be recovered depends on two factors: the capacity of the disk and the skill of the RAID data recovery professional. The professional you hire must be capable of carrying out a detailed analysis of all the drives and of determining whether any physical damage has occurred.
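
The mirror-then-stripe arrangement described above can be sketched in a few lines of Python. This is a simplified toy model, not a real controller; the drive names, block labels, and two-pair layout are illustrative assumptions:

```python
# Toy model of a four-drive RAID 10 layout: data blocks are striped
# across two mirrored pairs (RAID 0), and each pair keeps two copies
# of every block it holds (RAID 1).

def raid10_write(blocks, num_pairs=2):
    """Distribute data blocks round-robin across mirrored pairs."""
    pairs = [{"drive_a": [], "drive_b": []} for _ in range(num_pairs)]
    for i, block in enumerate(blocks):
        pair = pairs[i % num_pairs]    # striping: alternate pairs
        pair["drive_a"].append(block)  # mirroring: write both drives
        pair["drive_b"].append(block)
    return pairs

def raid10_read(pairs, failed=()):
    """Reassemble the stripe, reading each block from whichever
    mirror in its pair is still healthy."""
    total = sum(len(p["drive_a"]) for p in pairs)
    out = []
    for i in range(total):
        pair_idx, blk_idx = i % len(pairs), i // len(pairs)
        drive = "drive_b" if (pair_idx, "drive_a") in failed else "drive_a"
        out.append(pairs[pair_idx][drive][blk_idx])
    return out

array = raid10_write(["b0", "b1", "b2", "b3"])
# One drive in each pair can fail and every block is still readable.
assert raid10_read(array, failed={(0, "drive_a"), (1, "drive_a")}) == ["b0", "b1", "b2", "b3"]
```

This also shows why recovery is tractable: as long as one member of each mirrored pair survives, the full stripe can be reassembled.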

RAID Technology Safeguards Your System Against Drive Failure

RAID was originally defined as an array of disks that lets a user obtain higher levels of storage reliability from less reliable disk components. Because of the complexity of the technology behind it, it is usually safest to let qualified professionals handle any recovery, such as RAID 10 recovery. Beyond the complexity, users who employ this technology have to adhere to certain other standards, most notably cleanliness of the air: dust is particularly bad for any disk, and a small speck can cause irreversible damage. For this reason, recovery rooms are kept free of dust and properly ventilated. RAID 10 is a combination of RAID 1 and RAID 0, implemented at the hardware level by a special controller. Even if the controller loses power, the data on the drives can still be recovered. The controller works by striping data across mirrored disk pairs: the striping is the RAID 0 layer, and the duplication within each pair is the RAID 1 layer. It sounds simple, but many details have to be handled correctly to make it work, so RAID 10 recovery is not a job for just any data recovery company; you must choose one that is qualified in this area. Using RAID technology safeguards your system against drive failure; however, should the inevitable happen, contact a professional to provide RAID 10 recovery services.

The Safety Of RAID 10 Recovery Software

Most people see the prospect of using data recovery services as a dilemma, because the malfunctioning hard disks usually hold a great deal of very important information. In a world where more and more people worry about somebody stealing their information and using it for malicious purposes, you need to be sure your data stays safe even as you work to get it back. RAID 10 recovery is a safe bet for your data. Proper safety provisions ensure that nobody else has access to files that are important to you. The connection between the data on the computer and the RAID 10 recovery software is also made as secure as possible, hidden from those who may have malicious intentions; the link is typically secured with encryption over SSL so that data cannot be siphoned off easily. RAID 10 uses at least four drives and is hence a better performer than the lower RAID levels, which explains why people so often keep their sensitive information on such arrays. For better security of your data, the recovery process can be carried out as a remote solution, and no fragment of your files is retained by the recovery team.

What Is RAID 10 Recovery All About?

RAID 10, also known as RAID 1+0, consists of at least four drives. RAID 10 systems are relatively inexpensive and reliable, but there are conditions under which they fail. There are many causes of RAID 10 failure, such as corrupted, damaged, or lost data; multiple drive failures; hardware faults; and a few others. A major fault arises when the mirroring itself fails. RAID 10 recovery techniques recover the data lost to such disk failures. Since irregularities in one disk can affect the functioning of the others, it is recommended that no new data be written to a RAID 10 system once a failure occurs. There are several techniques for data recovery in RAID 10 systems. The recovery is carried out in stages and succeeds when the volume layout and the file-system recovery algorithm are known; the recovered data can then be consolidated onto a single disk. RAID 10 recovery techniques excel at retrieving critical data from RAID 10 arrays. There are professional companies with state-of-the-art facilities where you can rest assured of getting your data back safely and securely. Through data recovery experts, you can get the data you need with the use of specialized equipment. Dedicated RAID 10 recovery labs have been developed with customized tools and the latest techniques to recover your vital data after a failure.


Explaining RAID Technology


Using RAID technology, you can help keep your data safe from drive failures, and when data is lost you can recover RAID 0, 5, or 10 arrays. The basic idea of RAID is to replicate and divide your data among several physical hard drives, which the operating system reads as a single disk; this is called a RAID array. Even so, you need to take proper precautions to keep your data safe. Data loss can be caused by many things, such as system overheating, the burning out of one or more physical drives, or virus infection. When the cause is known, it becomes easier for professionals to recover the RAID. In the event of data loss, you should immediately contact a RAID data recovery specialist, who knows how to handle the drive or drives correctly to prevent further damage. RAID data recovery specialists know what must be done to get critical data back quickly; it takes more than a regular maintenance technician to carry out RAID data recovery. RAID data recovery professionals are highly qualified individuals who recover failed arrays. When hired, they treat your job as the highest priority and attend to your needs quickly. Every business depends heavily on its data, as it is an integral part of operations.

The Magic Solution Within RAID

A power outage, among other causes, can inflict logical damage on your hard drive and crash it. Logical damage calls for a software-based solution, while physical damage is repaired by replacing the damaged part of the disk so that it can be read again. RAID retrieval services are among the best in the market. Originally, the intention behind RAID was to act like a single, huge, reliable, and efficient storage vessel, and very many companies rely on RAID for protecting huge amounts of their data. When RAID fails, you stand to lose a lot of data, and in some instances, such as reformatting the drive you intend to recover, you can lose all of it and never get it back. Make sure that every step you take does not put your data at risk, for any wrong move can cost you data that may be priceless. If all is not well, consult an expert, and you can rest assured of having your data back safe and sound.

Factors To Consider Before Doing A Recover RAID

Recovering a RAID array demands attention, since it is a delicate process. Before carrying out the recovery, you should determine the extent of the damage to your disk. For example, a disk that has been through a fire cannot be handled the same way as one that has simply stopped working; the burnt one requires more attention and expertise. When you realize that your disks are not functioning as required, the first thing to do is package them safely. If you have not had basic training in this, switch the machine off at the power source and call your expert; this helps prevent further data loss. Mishandling your disks, especially after they show signs of losing data, can damage them further. As you prepare for the recover RAID procedure, you should also decide whether you will use software or a manual procedure. Thanks to improved technology, you can easily benefit from the latest software: download it from the internet or follow the online wizard through the whole process. Be careful to use software that is compatible with your OS and machine, and choose one with a simple user interface so that you can easily follow the procedures.

How Important Is RAID Recovery?

For any business owner, having customers enjoy fast service and easy access to data is the number one goal. For this to be possible, however, one must invest in a RAID server. A RAID server takes multiple disks and combines them to enhance overall system performance; such an arrangement is low-cost and reliable over the long term. But like any other computer system, it is bound to fail at some point. Due to the arrangement of the drives, recovering RAID data often requires the help of professionals. To recover RAID data after damage of any sort, the professionals first evaluate the system to identify the issue. They then usually release an evaluation report that gives the findings of their diagnosis and their recommendations. This part of the recovery process is typically free of charge and allows you, the user, to choose from the options given. Only with your approval to continue will the RAID data recovery specialists tackle the issue and ensure you get your data back. For those who want to try things on their own, there is RAID recovery software with easy-to-follow steps, but this is only useful where the damage is minimal and the data is not critical.


Factors To Consider Before Choosing Recover RAID Software


Recovering a RAID array demands a high level of expertise, since it is a very sensitive process that determines whether you get your lost data back. With this in mind, you need to settle for software that is licensed. Many companies are releasing recovery software, and some of it is fake; software that is not genuine is harmful to your disks and can cause more harm and data loss rather than the reverse. While choosing RAID recovery software, ensure it carries a mark of quality. Check that the company that released it is well known for its software; if you have never heard of the company, look it up in a search engine, learn more about it, and read its reviews. What customers write about a company is very important, as it gives you both the positive and negative indicators. You also need to consider the price: it should be affordable, which you can confirm by going through the price lists of other companies. Another very important consideration is compatibility with your machine and operating system. Remember, software that is not compatible with your OS cannot work for you.

Tips On Choosing An Expert To Recover RAID

Carrying out a recover RAID process is a serious undertaking, and there are factors to consider before choosing a recover RAID expert. First, check their certification. This is very important for the sake of your machine and data, so before you hire an expert, ensure that they have a good amount of experience in the job; the expert largely determines whether you get your data back, since they hold the key to it. Second, ensure that they are licensed. This will keep you away from fraudsters who may want to steal your money in the name of trying to recover your data. If the experts have a company website, go through it thoroughly: look up their profile and review it systematically. Also ensure that the RAID recovery expert makes you a priority; however busy they may be, they should prioritize every customer and be available whenever they are needed. The expert should also have a good reputation among a good number of people and should have built some level of confidence in the market. This will give you the confidence to place your job in their hands. Choose an expert who listens keenly to your needs.

Easiest Data Recovery Method

You might have had a problem with your computer, but that does not mean all is lost: you can still regain access to your data and recover the RAID, rectifying both physical and logical damage. Faults occur routinely when working with computers, bringing hardship in accessing your data. RAID is an acronym for Redundant Array of Inexpensive Disks, meaning an economical way of writing data across disk drives, and it is one of the most common arrangements used for data recovery. Recovering a RAID is among the cheapest ways of solving your problems, since you save on cost, and in case one of the disks gets damaged, the retrieval capability is indispensable. These are some of the best options in the market today for getting your data back on track. As we all know, data is very important in any person's life: some of it is critical to the running of a business, and some matters for private reasons too. Having a good service provider to recover a RAID always comes in handy.

RAID Server Specialists

If you have ever had to recover RAID data from a server, you will appreciate that this is work best carried out by professionals; if it is not, you will surely not enjoy the experience. When RAID servers fail, it usually means that one of the drives has failed or is damaged and needs attention. Such drives store large amounts of data, mostly important files and documents belonging to businesses and enterprises. You cannot cut corners when it comes to recovering RAID data, because only experienced recover RAID experts know how to diagnose issues and get back the critical data. If you have hired the right people to recover RAID data for you, the professional company will treat the issue with the seriousness it deserves and make sure everything is sorted out. The professionals will recognize that much of your company's operations depend on the data and will do everything in their power to get it back for you. In many cases this service will not come cheap, but since the data is not for home use or personal reasons, this may not be an issue. If the problem is physical, it is usually the less serious case: data recovery after physical trouble is not time-consuming. If there is no physical damage, however, the data has to be transferred to different servers before the diagnosis and recovery process can begin.


Services That Help With Clicking Hard Drives

Hard disks have a File Allocation Table (FAT) that maintains details of all files at any point in time. Data recovery software scans this table to locate and retrieve deleted files. In the event of data loss, you are likely to retrieve all of it, provided you start the standard emergency data recovery procedure immediately.
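
As a rough illustration of how such software works, here is a toy Python model of an allocation-table scan. The table layout, cluster contents, and field names are invented for illustration and do not match the real on-disk FAT format:

```python
# Toy model of FAT-style recovery: a directory table maps file names
# to a chain of cluster numbers. "Deleting" a file only flags the
# entry, so recovery software can walk the table and re-read the
# clusters as long as they have not been overwritten.

clusters = {0: b"Hel", 1: b"lo ", 2: b"wor", 3: b"ld!"}

directory = [
    {"name": "notes.txt", "chain": [0, 1], "deleted": False},
    {"name": "report.doc", "chain": [2, 3], "deleted": True},  # flagged deleted
]

def recover_deleted(directory, clusters):
    """Return {name: bytes} for deleted entries whose clusters survive."""
    recovered = {}
    for entry in directory:
        if entry["deleted"] and all(c in clusters for c in entry["chain"]):
            recovered[entry["name"]] = b"".join(clusters[c] for c in entry["chain"])
    return recovered

print(recover_deleted(directory, clusters))  # {'report.doc': b'world!'}
```

The model also makes the urgency clear: recovery only succeeds while the clusters still hold the old data, which is why writing anything new to the disk reduces the chance of success.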

Hard drive clicking noises can be deadly!


On the internet, a lot of freeware is available that is capable of performing file and data recovery. These products are fine for ordinary data. However, if the data you have just lost is of critical importance, shut down the system without wasting any time, and then seek professional help from a specialist firm to ensure the safety of the critical data.

While attempting to retrieve lost data with recovery software, there is a chance of losing the data permanently, because a simple mistake might damage the files. There are, however, precautionary steps that can save you the high cost of professional recovery. For safety and protection, it is always wise to keep several backups of your important files by saving them on more than one computer and on CDs and DVDs.

Data recovery is very important, as you do not want to lose your valuable data for any reason. There are many services that can help retrieve data when you hear your hard drive clicking. Services offered include hard drive recovery; Apple Mac and laptop recovery; and USB and digital media recovery. With all these services, you can rest assured of getting back to business in no time, with all your data recovered by a good service provider. Data loss was once a nightmare, but with such competent companies, you need not fear it anymore.

These services have become very swift because of the competition in the market. The competition is so tight that most service providers have pulled up their socks to meet the demands of the large client base out there.

Technology is rarely faultless: breakage, mechanical failure, and data loss are common. With services such as emergency data recovery, you will be able to continue with your tasks without the setbacks that can come with loss of information.

Factors To Consider When Choosing Emergency Data Recovery Software

So you realize that you accidentally formatted your disk and have lost all of your data. You stand there not knowing where to start, since this was very important data for your business. When you choose software to recover your data, ensure that your operating system is compatible with the software you are about to use; otherwise, you will get no results.

Another thing to consider carefully is that the lost data must be in a format compatible with your software, so read the manufacturer's recommendations on data retrieval. High-quality software should focus on the data on the platters; it should not in any way attempt to repair your hard disk, as that would cause more harm than good to the data and your disk structure.

Also, when you choose software, ensure that it is able to boot the computer. Sometimes your computer may be attacked by a virus that damages the operating system; this is the point at which the software comes in handy, since you can use it to boot the computer without much difficulty.

It can be a frightening situation when you turn your computer on and discover that all your important data may be at risk because your hard disk is clicking. It can trigger panic, anxiety, and frustration. But why let it happen when you can prevent it? Data protection methods play an important role in preventing such mishaps, and with the help of a solid hard drive recovery company, you can recover data in the event of a natural disaster or virus threat. No matter the reason for data loss, these services are effective in all situations.

Having the contact details of a data recovery professional is very important for individuals and companies. It becomes even more important if your business collects sensitive data such as personal information, Social Security numbers, or bank routing numbers; if so, make this a priority. Professional recovery comes into play in the event of damage or an infection that renders the hard disk unreadable. It is the best way to retrieve files and other material from a digital storage device.

Among many precautions, one important option is to store backup data in a separate place, away from the computer. This way, the data can be recovered even if the original computer is damaged.


MKLinux Just Never Took Off

There are many reasons to recommend a Mac. Plug and Play made Mac hardware setup trivial long before it became a hyped technology on other platforms. The computer’s built-in networking support ensures that connecting Macs with both AppleTalk and TCP/IP is quick and simple.

However, the ease with which you can construct Mac peer networks creates management problems. Tracking different versions of hundreds of files distributed over many machines is tedious. Maintaining security and performing backups are a real nightmare.

Centralizing the important data on a single server was the logical solution. However, as much as I like the Mac OS, it is not robust enough to provide file-system security or manage quotas and resources such as the Web, e-mail, and name servers.

Organizations that have made an investment in Mac hardware may legitimately wonder what is the best option when they need a server that provides these functions, yet integrates seamlessly into the existing network. Can these services still be provided by a Mac? The answer, as I discovered, is yes.

Linux on the PowerPC

While I was researching these issues for a small Mac network, I came across the MkLinux OS–Linux for the Power Mac. MkLinux began its life in 1995, when Apple began supporting a project by the Open Group’s Research Institute to port this freely distributable Unix-like OS to the Power Mac.

Both MkLinux and the BeOS lead the trend to open up the Mac platform to alternative OSes.

In a departure from the monolithic kernel design of other Linux distributions, MkLinux runs natively on top of the Open Group Mach (PMK 1.1) microkernel, which itself is derived from Carnegie Mellon University’s Mach 3.0 microkernel. The Mach microkernel performs only a small number of functions. Among these functions are low-level hardware I/O, interprocess communications (IPC), memory management, and scheduling.

These services provide an abstract layer onto which you can port other OSes. A server is a Mach process that gives the OS its “personality” and provides higher-level functions such as file-system and network support, as shown in the figure “MkLinux Architecture.”

MkLinux thus runs as a Linux server. That is, Linux runs as a Mach process that contains an orthodox Linux kernel, which is modified to use low-level Mach services. To improve performance, the Linux server can reside in the same address space as the Mach kernel.

Installing MkLinux

MkLinux runs on most Power Macs, including early NuBus-based machines (6100, 8100, and 9100), first- and second-generation PCI models (7100, 7200, 7300, 7500, 7600, 8500, 8600, 9500, and 9600), some PowerBooks (2400, 3400, 5300, and G3), and the latest G3 Power Macs. A multiprocessor kernel is available that supports Apple dual-processor machines and clones, as well as DayStar Digital’s two-way 604e CPU upgrade card.

Installing Linux on any platform is not for a novice. Much of the MkLinux installation is automated, but some knowledge of networking, SCSI, and disk partitioning is required for things to go smoothly. I installed MkLinux on an external 1-GB SCSI hard drive attached to a Power Mac 7600. It takes only two partitions to install MkLinux (one to hold the Linux file system, and the other for swap space), but four or more are commonly used because they provide better flexibility.

Although Apple provides a functional disk-partitioning program, offerings from FWB and LaCie are more sophisticated and let you resize partitions without reformatting. If you are willing to forgo a GUI, MkLinux has a serviceable, if somewhat unfriendly, character-mode disk utility called pdisk. I created a 70-MB "/" (root) partition, a 32-MB swap partition, and a 100-MB partition for /home, which leaves the remaining 798 MB for /usr. Note that disk-partitioning software offers new and exciting opportunities to junk your data; backups are essential.

Setup begins by installing a MkLinux Control Panel that selects MkLinux or the Mac OS as the default OS at boot-up. The Mach kernel is put in the Extensions folder, and a folder containing the Mach server is placed in the root directory of your bootable Mac partition (you can remove it after installation). These steps were sufficient to bootstrap MkLinux.

Rebooting automatically starts the installation program. After specifying which disk partitions should hold different parts of the file system, I was presented with a list of “packages” to install. Packages are compressed binary archives that contain all the files necessary to implement a particular OS service or user application. You can install packages from a distribution CD, over the Internet from an FTP server, from an NFS mount, or from a local hard drive. Because the MkLinux distribution is nearly 300 MB in size, the CD distribution makes sense.

Packages necessary to run a basic MkLinux system (including the X11R6 windowing system) are preselected for you, so accepting the defaults is a wise move. Installing other packages later is easy. I installed some additional packages, including developer tools (GNU C, C++, and FORTRAN compilers) and both HTTP and FTP servers.

The excellent RedHat Package Manager (RPM) performs the installation by expanding packages and copying the contents to their appropriate places. The RPM system maintains a database of installed packages, thereby providing a useful version and dependency control system. Supplying network information, a name for my “new” machine, and a root password completed the installation. After rebooting, I had a fully functioning MkLinux server.

Speaking AppleTalk

Getting my MkLinux server running was one thing; making it useful on an AppleTalk network was another. Mac users like to access servers via the Chooser, and this convenience can be provided easily if you install Netatalk.

Netatalk is a kernel-level implementation of the AppleTalk Protocol Suite for Unix systems running over Ethernet. It is available as either source code or an RPM package and is part of the MkLinux distribution. It includes support for routing AppleTalk, serving Unix and AFS file systems over AFP (AppleShare), serving Unix printers, and accessing AppleTalk printers.

Once installed, Netatalk made the MkLinux server appear like any other Mac on the network. Mac users with an account on a MkLinux server running Netatalk are able to log in and mount their home directories as network drives. Clever file translation ensures that folder attributes, file icons, and their program associations are preserved on the MkLinux file system.

If you need to transfer files between HFS and MkLinux partitions, there are a series of “h” utilities to help you. These mimic their Linux counterparts but operate on a local HFS partition. If you need complementary functions, an excellent shareware utility by Michael Pollet called LinuxDisks allows file transfer to and from MkLinux partitions from within the Mac OS.


Fixed Income And Derivatives Integrated – The Beginning Of The End

Not so long ago, the fixed income and derivatives markets were worlds apart. Banks had separate groups to deal with each. Under separate managers, these groups developed their own practices, cultures and computer systems.

But over the past few years, these worlds have collided. Bond traders now hedge their deals with swaps and swaptions, or even caps and floors. A number of securities now straddle the two worlds, such as callable, puttable and convertible bonds, and are widely traded. As fixed income and derivatives increasingly overlap, banks are merging the separate groups under a single manager. But they are finding it less easy to integrate the different computer systems.

The US investment banks, as usual, led the way in making the change. Major players such as JP Morgan, Merrill Lynch and Chemical (now part of Chase) pioneered the integration of fixed income and derivatives in the early 1990s. Deutsche Bank led the way in Europe, followed more recently by Union Bank of Switzerland, WestLB and others.

What are the reasons for these worlds merging? Paramount is the banks’ desire to manage their risks more effectively. Banks do not want to have a bond and its hedge in two different places. Besides, banks have realised that behind the differences in structures and terminology, bonds and swaps are essentially the same instruments. The separation of the two worlds is more historical than logical.

Another thing that kept the worlds apart was the hunt for arbitrage opportunities. Traders looking for profits from price discrepancies tend to keep their eyes narrowly focused. But more recently the increasing efficiency in the markets has been steadily reducing these opportunities. This has forced traders to lift their eyes and look for profits beyond the narrow horizons of their traditional markets.

At the same time as arbitrage opportunities have been evaporating, margins on fixed income deals have narrowed as interest rates have fallen across much of the developed world. This, combined with a decline in volumes, has meant that traders have had to devise more sophisticated deals to generate profits. Many have switched their attention to trading credit risk where the margins are greater. For instance, traders are buying corporate bonds in the emerging markets, hedging away their interest rate risks with derivatives and trading the credit risk.

Meanwhile, customers have become more sophisticated too. They now know that they can hedge their risks by repackaging their assets in other forms, such as swapping fixed income coupons for floating rates or other strategies. Clients want a single point of access to get solutions, even when they combine cash and derivatives.

These factors – the desire to manage risk more effectively, declining margins and arbitrage opportunities, and the growing sophistication of customers – are driving the merger of fixed income and derivatives. Under this pressure, the traditional world of the bond markets is fast giving way to the more innovative, flexible and risk-aware world of derivatives.

But while the logic for merging fixed income and derivatives activities at the organisational and risk management levels may be overwhelming, it is not as easy to accomplish the integration at the systems level.

In the past the different requirements, in terms of instrument structures and trading volumes, meant that the fixed income and derivatives groups used quite distinct computer systems to support their activities.

Fixed income products are generally standard instruments that are traded in volume. A trader can make a huge number of bond trades per day. A fixed income system, therefore, must be able to manage volume trading. At the same time, because these trades are generally in a small set of instruments, the system has relatively few cashflows to manage. Each new bond traded is simply another instance of cashflows already calculated and stored in the system.

A swaps system, on the other hand, does not have to deal with the same volume of trades, but it must manage many more cashflows. Swaps and swaptions, now commonly used as a means of hedging interest rate risks, are unique instruments and are traded in much lower volumes. Because each swap is unique, the system must calculate and store its cashflows separately. The database must have the capacity to manage the explosion of cashflow data.
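
The contrast can be sketched in Python. This is a simplified toy model intended only to show why each swap produces its own cashflow rows; the equal accrual periods and the absence of day-count conventions are assumptions made for brevity:

```python
# Sketch of why a swaps book stores far more cashflow rows than a bond
# book: every swap generates its own unique schedule, while thousands
# of bond trades can all reference one shared instrument template.

def fixed_leg_cashflows(notional, fixed_rate, years, freq=2):
    """Generate the fixed-leg cashflows of a vanilla swap
    (simplified: equal periods, no day-count conventions)."""
    period = 1.0 / freq
    return [
        {"t": (i + 1) * period, "amount": notional * fixed_rate * period}
        for i in range(years * freq)
    ]

# Two swaps with different terms: each needs its own stored schedule.
swap_a = fixed_leg_cashflows(10_000_000, 0.050, years=5)
swap_b = fixed_leg_cashflows(25_000_000, 0.047, years=7)
print(len(swap_a), len(swap_b))  # 10 14
```

A bond system, by contrast, stores one schedule per instrument and only a reference to it per trade, which is why the two data models diverge so sharply.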

Most organisations in the past have designed their systems for one world or the other – the high volume trading of securities or the cashflow complexities of derivatives. Because these differences often dictated the underlying architecture of the systems, banks cannot simply reconfigure them for the new task. The data model for a bond system is usually fundamentally different from the data model of a derivatives system. Few thought to design their data models to handle both.

Building an integrated fixed income and derivatives system is not a simple matter. Many banks have already learnt the hard way the problems of specifying, building and maintaining a system for derivatives. Fixed income adds another layer of complexity to an already complicated task.

But an integrated system, especially if it operates in real time, has many advantages. It more accurately reflects the way a merged organisation works. It also enables managers to consolidate bond and derivative risks and to run various forms of analysis across the combined portfolios, such as present value, value-at-risk or scenario simulations.
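As a rough illustration of this kind of cross-portfolio analysis, the sketch below computes present value uniformly over bond and swap cashflows. The desk names and cashflow figures are invented, and a flat discount rate stands in for the full yield curve a real system would use; the point is that one valuation routine can run across the merged book.

```python
# Present value over a mixed book of bonds and swaps.
# A flat discount rate is assumed for simplicity; a real
# system would discount each cashflow off a full curve.

def present_value(cashflows, rate):
    """Discount (year, amount) pairs at a flat annual rate."""
    return sum(amount / (1 + rate) ** t for t, amount in cashflows)

# A bond: 5% coupon on 100 face for 3 years, plus redemption.
bond_flows = [(1, 5.0), (2, 5.0), (3, 105.0)]

# A swap's net fixed-vs-floating flows (illustrative numbers).
swap_flows = [(1, 120_000.0), (2, -40_000.0), (3, 80_000.0)]

book = {"govt desk": bond_flows, "swaps desk": swap_flows}

# One analysis runs uniformly across the combined portfolio.
rate = 0.05
total_pv = sum(present_value(flows, rate) for flows in book.values())
```

The same uniform treatment of cashflows is what makes consolidated value-at-risk or scenario simulation possible: shock the curve once and revalue everything, rather than running two incompatible analyses and trying to add the results.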

An integrated system can allow bond dealers to price with the swaps curve, updated by the swaps desk. It can enable repo dealers to see the bond inventory. It enables sales desks to see quotes from dealers. And it means that the system has the potential to show profit and loss for any user, desk or book, calculated in any of a number of ways, including by portfolio, counterparty or instrument, so traders and managers can always be aware of their profitability.
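The "profit and loss calculated in any of a number of ways" point amounts to rolling the same trade records up along different dimensions. A minimal sketch, with wholly invented trade data and field names, might look like this:

```python
from collections import defaultdict

# One set of trade records rolls up by desk, counterparty
# or instrument. All figures and names are illustrative.
trades = [
    {"desk": "bonds", "counterparty": "Bank A", "instrument": "UKT 8% 2015", "pnl": 12_000.0},
    {"desk": "bonds", "counterparty": "Bank B", "instrument": "UKT 8% 2015", "pnl": -3_000.0},
    {"desk": "swaps", "counterparty": "Bank A", "instrument": "5y IRS", "pnl": 7_500.0},
]

def pnl_by(trades, key):
    """Aggregate P&L along any chosen dimension."""
    totals = defaultdict(float)
    for t in trades:
        totals[t[key]] += t["pnl"]
    return dict(totals)

by_desk = pnl_by(trades, "desk")          # {'bonds': 9000.0, 'swaps': 7500.0}
by_cpty = pnl_by(trades, "counterparty")  # {'Bank A': 19500.0, 'Bank B': -3000.0}
```

Because bond and swap trades sit in one store, the counterparty view naturally spans both desks – precisely the consolidated visibility a split system cannot give.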

The 1990s have seen the traditions of fixed income collide head on with the rocket science of derivatives. By merging the two, banks are better able to respond to their customers’ needs and to exploit profitable opportunities while managing the risks of today’s fast and complex markets. But to support this consolidated approach, banks need an integrated fixed income and derivatives system that reflects the new world order.


Open Source Moves Mountains

The Python language may be less well-known, but it is widely appreciated by Web cognoscenti. And Sendmail, the most popular E-mail server program, also belongs to this select band, as does BIND, the main domain name system server software.

The previous cultural divide between the free software community and business users was bridged in the most dramatic way by Netscape’s announcement in January that it would not only be giving away its forthcoming Communicator product, but making the source code available. That is, anyone would be able to take the program, modify it and then use it for any purpose.

In fact this free availability of the source code – rather than the zero price-tag – is the key defining characteristic of all the free products mentioned above. It is what makes such software so powerful. By throwing open the development process a kind of virtual programming team is created that potentially encompasses anyone on the Internet. In particular, bug-testing is carried out automatically on a huge scale, often resulting in greater reliability than commercial products.

This approach has been dubbed Open Source by its leading theorist, Eric Raymond. His essay The Cathedral And The Bazaar, an analysis of how the Open Source movement works – and why it is so successful – apparently played an important part in convincing Netscape to take the unprecedented step of opening up its software development process.

Of course, cynics will argue that such a move was simply a desperate last gamble by a company that has been comprehensively outflanked by Microsoft, not least by giving away its browser from the start. And it is no doubt true that Netscape would never have countenanced such a risky move without this prodding from its rival.

But there is increasing evidence that Netscape is indeed tapping into an important movement that could well see a major reversal of its decline in the browser arena.

One indicator is the extremely positive response the company has received from the wider Internet community. As well as the several hundred thousand copies of the source code the company claims have been downloaded, there are a number of major projects and sites supporting the Mozilla movement.


For example, Netscape was unable to release all the cryptographic code in its browser because of US export regulations. But the Australian-based Mozilla Crypto team succeeded in writing their own version 15 hours after Netscape released its code. Another group is working on Jazilla, a Java-based version of the browser.

Netscape’s Mozilla has also acquired XML capabilities overnight through James Clark’s expat program.

The Open Source movement is gaining other adherents: recently Corel said it would be releasing all the code for a toolset for a forthcoming Linux-based network computer. It is not hard to see many of Microsoft’s other hard-pressed competitors embracing this form of guerrilla software development.