From the preceding discussion of the contrasting factors between grid computing and cloud computing, it is clear that the choice is not a simple matter of one over the other; the trade-offs concern which paradigm's strengths best match a given workload. Cloud computing appears better suited to businesses looking to derive value from their IT operations in a streamlined fashion, and the agility that comes with consuming services from the cloud complements its scalability. The grid computing paradigm, on the other hand, has traditionally been the arena of funded scientific research, although instances of its use in biomedical, financial, and industrial research are emerging. It now finds applications in weather modeling and weapons test simulations. In fact, web serving (serving website content to users located all over the world) is an example of a commercial application that benefits from grid infrastructure.
Both computing paradigms are revolutionary, yet both remain immature. Their scalability is as promising as their ability to provide resources on demand, but each is still working to overcome its inherent weaknesses and emerge as a viable commercial option for businesses. Experts across the board agree that while cloud computing will not replace grids, the two may well merge, and a few even envision the possibility of a World Wide Grid!