Wednesday, July 27, 2011

What is virtualization?

Virtualization is the creation of a virtual (rather than actual) version of something, such as an operating system, a server, a storage device or network resources.
You probably know a little about virtualization if you have ever divided your hard drive into different partitions. A partition is the logical division of a hard disk drive to create, in effect, two separate hard drives.
Operating system virtualization is the use of software to allow a piece of hardware to run multiple operating system images at the same time. The technology got its start on mainframes decades ago, allowing administrators to avoid wasting expensive processing power.
In 2005, virtualization software was adopted faster than anyone, including the experts, had imagined. Virtualization is making inroads in three areas of IT: network virtualization, storage virtualization and server virtualization.
  • Network virtualization is a method of combining the available resources in a network by splitting up the available bandwidth into channels, each of which is independent from the others, and each of which can be assigned (or reassigned) to a particular server or device in real time. The idea is that virtualization disguises the true complexity of the network by separating it into manageable parts, much like your partitioned hard drive makes it easier to manage your files.
  • Storage virtualization is the pooling of physical storage from multiple network storage devices into what appears to be a single storage device that is managed from a central console. Storage virtualization is commonly used in storage area networks (SANs); a toy sketch of the pooling idea follows this list.
  • Server virtualization is the masking of server resources (including the number and identity of individual physical servers, processors, and operating systems) from server users. The intention is to spare the user from having to understand and manage complicated details of server resources while increasing resource sharing and utilization and maintaining the capacity to expand later.
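The common thread in all three is a translation layer that presents many physical resources as one logical resource. As a rough illustration of the storage case, here is a toy Python sketch (the StoragePool class is invented for this example; real SANs do block-level mapping far more elaborately):

    # Toy model of storage pooling: several fixed-size "devices" are
    # presented as one flat address space, and the pool translates
    # logical block addresses to (device, offset) pairs behind the scenes.
    class StoragePool:
        def __init__(self, device_sizes):
            # device_sizes: number of blocks on each physical device
            self.device_sizes = device_sizes
            self.devices = [bytearray(size) for size in device_sizes]

        @property
        def capacity(self):
            # The user sees a single device of the combined size.
            return sum(self.device_sizes)

        def _locate(self, block):
            # Translate a logical block address into (device, offset).
            for dev, size in enumerate(self.device_sizes):
                if block < size:
                    return dev, block
                block -= size
            raise IndexError("logical block out of range")

        def write(self, block, value):
            dev, off = self._locate(block)
            self.devices[dev][off] = value

        def read(self, block):
            dev, off = self._locate(block)
            return self.devices[dev][off]

    pool = StoragePool([100, 250, 150])   # three devices, 500 blocks total
    pool.write(320, 0xFF)                 # quietly lands on the second device
    print(pool.capacity, pool.read(320))  # -> 500 255

The caller addresses one 500-block device; which physical device actually holds a given block is exactly the detail being masked.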
Virtualization can be viewed as part of an overall trend in enterprise IT that includes autonomic computing, a scenario in which the IT environment will be able to manage itself based on perceived activity, and utility computing, in which computer processing power is seen as a utility that clients can pay for only as needed. The usual goal of virtualization is to centralize administrative tasks while improving scalability and workload management.

What is World Wide Web (WWW)?

A technical definition of the World Wide Web is: all the resources and users on the Internet that are using the Hypertext Transfer Protocol (HTTP).
A broader definition comes from the organization that Web inventor Tim Berners-Lee helped found, the World Wide Web Consortium (W3C):
"The World Wide Web is the universe of network-accessible information, an embodiment of human knowledge."

What is Web 2.0 (or Web 2)?

Web 2.0 (or Web 2) is the popular term for advanced Internet technology and applications including blogs, wikis, RSS and social bookmarking. The two major components of Web 2.0 are the technological advances enabled by Ajax and other new applications such as RSS and Eclipse, and the user empowerment that those advances support.

Tim O'Reilly is generally credited with coining the term, following a conference dealing with next-generation Web concepts and issues held by O'Reilly Media and MediaLive International in 2004. O'Reilly Media has subsequently been energetic about trying to trademark "Web 2.0" and holds an annual conference of the same name. There is, however, some dispute about whether O'Reilly is responsible for the original coinage. Joe Firmage, for instance, used "Web 2.0" to describe using the World Wide Web as a platform in 2003.

One of the most significant differences between Web 2.0 and the traditional World Wide Web (retroactively referred to as Web 1.0) is greater collaboration among Internet users, content providers, and enterprises.

Originally, data was posted on Web sites, and users simply viewed or downloaded the content. Increasingly, users have more input into the nature and scope of Web content and in some cases exert real-time control over it. For example, multiple-vendor online book outlets such as BookFinder4U make it possible for users to upload book reviews as well as find rare and out-of-print books at minimal prices, and dynamic encyclopedias such as Wikipedia allow users to create and edit the content of a worldwide information database in multiple languages. Internet forums have become more extensive and have fed the proliferation of blogging, and the dissemination of news has evolved into RSS syndication.
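
That last step is easy to make concrete: an RSS feed is just an XML document with a well-known structure, which is what lets any reader, aggregator or script consume syndicated news. A small sketch, using a made-up two-item feed:

    # Parse a (fabricated) RSS 2.0 feed with the Python standard library.
    import xml.etree.ElementTree as ET

    feed = """<?xml version="1.0"?>
    <rss version="2.0">
      <channel>
        <title>Example News</title>
        <item><title>First headline</title><link>http://example.com/1</link></item>
        <item><title>Second headline</title><link>http://example.com/2</link></item>
      </channel>
    </rss>"""

    root = ET.fromstring(feed)
    for item in root.iter("item"):
        print(item.findtext("title"), "-", item.findtext("link"))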

There is no clear-cut demarcation between Web 2.0 and Web 1.0 technologies, hardware and applications. The distinction is, to a large extent, subjective. Here are a few characteristics often noted as descriptive of Web 2.0:

  • blogging
  • Ajax and other new technologies
  • Google Base and other free Web services
  • RSS-generated syndication
  • social bookmarking
  • mash-ups
  • wikis and other collaborative applications
  • dynamic as opposed to static site content (a sketch of this contrast follows the list)
  • interactive encyclopedias and dictionaries
  • ease of data creation, modification or deletion by individual users
  • advanced gaming
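
The "dynamic as opposed to static" distinction, in particular, fits in a few lines. A static server returns the same stored file for every request; the hypothetical handler below (Python standard library only) builds the page at request time instead:

    # Serve dynamically generated content: the body differs per request.
    import datetime
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class DynamicHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Built at request time, so every visit sees a fresh page.
            body = f"<html><body>Generated at {datetime.datetime.now()}</body></html>"
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(body.encode())

    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), DynamicHandler).serve_forever()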

Critics of Web 2.0 maintain that it makes it too easy for the average person to affect online content and that, as a result, the credibility, ethics and even legality of Web content could suffer. Defenders of Web 2.0 point out that these problems have existed ever since the infancy of the medium and that the alternative -- widespread censorship based on ill-defined elitism -- would be far worse. The final judgment concerning any Web content, say the defenders, should be made by end users alone. Web 2.0 reflects evolution in that direction.

Some industry pundits are already claiming that Web 2.0 is merely a transitional phase between the early days of the World Wide Web's existence and a more established phase they're calling Web 3.0.