Google’s (GOOG) Gmail system, which serves millions of customers around the world, shut down yesterday. The outage may hurt the company’s efforts to market business applications that include e-mail. Google prides itself on the reliability of its software, at least according to its marketing pitch. It is not clear why Google had the problem. The company operates thousands of servers, some of which may have developed trouble, or an outside programmer, a hacker, may have introduced a bug into the system to make it collapse.
Twitter, the micro-blogging service used by tens of millions of people, went off-line last month. It blamed the trouble on a malicious programmer in Russia who was trying to shut down the account of a user in neighboring Georgia. It is astonishing that the local actions of a small number of programmers can bring an entire Internet service to its knees, but that appears to have been the case.
Two months ago, hackers, probably from North Korea, were able to shut down or slow access to a number of major websites in South Korea and the US, another example of the Internet’s vulnerability.
The Internet as it is set up now operates at three levels. The first is the servers that store data and content for use by customers. Google and other large companies keep massive server farms. These are well-protected from outside attackers by sophisticated software, but clearly that protection does not always work, and it may fail more and more often as hackers get more skillful.
The second part of the Internet is the “pipes” that carry data, video, e-mail messages, and text to customers from companies such as Google, Hulu, Yahoo!, and Twitter, and from the millions of websites that make up Internet content and services. Some of these “pipe” providers, specifically telecom and cable companies, are already complaining that the amount of data they have to move is rising too rapidly for them to accommodate. Video files, which are particularly large, have posed significant problems for cable companies, and those companies have asked Washington for permission to charge more to customers who use the Internet to transfer large files. So far, the government has turned those requests down.
The last piece of the Internet is the end user: the companies and consumers whose PCs collect and parse all the data that comes in through the global web. Perversely, these PCs are part of the problem not only because they can request such tremendous amounts of data; they are also the tools with which hackers build the malicious code they send out to compromise the servers at firms including Twitter and Google. PCs may be passive in receiving data, but they can be active in destroying the data transport system at the core of the web’s operations.
The problem of online outages is as old as the Internet itself. AOL’s dial-up service went down regularly in the mid-1990s because the company did not have enough modems to keep up with demand, causing its system to periodically collapse.
The Internet is not invulnerable, as most people who use it assume. It may not even be reliable. Outages are taking major online services offline with increasing frequency, and the North Korean and Russian incidents show that a very small group of people can disrupt very large systems.
The Internet as the public has known it and used it for the last decade may not be the Internet of the future. The system is getting old and rickety, particularly given the volume of commerce it has to accommodate, and the prophylactic software that was meant to protect the web is becoming less effective. Like anything else that is used regularly, whether it is a car, a light bulb, or a PC, the Internet is going to have to be patched and upgraded more often now. It won’t work every hour of every day anymore.
Douglas A. McIntyre