Web Hosting - Sharing A Server
Matt Williams - 20th April, 2010
Things To Think About
You can often get a substantial discount on web hosting fees by sharing a server with other sites. Or you may
have multiple sites of your own on the same system. But just as sharing a house has benefits and drawbacks, so
does sharing a server.
The first consideration is availability. Shared servers get rebooted more often than stand-alone systems, and that
can happen for multiple reasons. Another site's software may cause a problem, or make a change, that requires a
reboot. While that's less common on Unix-based systems than on Windows, it still happens. Be prepared for more
scheduled and unplanned outages when you share a server.
Load is the next, and more obvious, issue. A single pickup truck can only haul so much weight. If the truck is
already half-loaded with someone else's rocks, it will not haul yours as easily.
Most websites are fairly static. A reader hits a page, then spends some time skimming it before loading another.
During that time, the server has capacity to satisfy other requests without affecting you. All the shared resources
- CPU, memory, disks, network and other components - can easily handle multiple users (up to a point).
But all servers have inherent capacity limitations. The component that processes software instructions (the CPU)
can only do so much. Most large servers will have more than one (some as many as 16), but there are still limits to
what they can do. The more requests they receive, the busier they are. At a certain point, your software request
(such as accessing a website page) has to wait a bit.
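That wait grows much faster than you might expect as the CPUs get busier. The sketch below uses the classic single-queue (M/M/1) formula from queueing theory to illustrate the effect; the formula and the 10 ms service time are illustrative assumptions on my part, not figures from any real server.

```python
# How long does a request wait as a shared CPU gets busier?
# Classic M/M/1 queueing result: mean time in system T = S / (1 - rho),
# where S is the service time on an idle machine and rho is utilisation.
# The numbers here are illustrative assumptions, not measurements.

SERVICE_TIME_MS = 10.0  # assume a request needs 10 ms of CPU on an idle box

def mean_response_ms(utilisation: float) -> float:
    """Average response time for a single queue at the given utilisation."""
    if not 0 <= utilisation < 1:
        raise ValueError("utilisation must be in [0, 1)")
    return SERVICE_TIME_MS / (1.0 - utilisation)

for rho in (0.10, 0.50, 0.80, 0.90, 0.95):
    print(f"{rho:.0%} busy -> {mean_response_ms(rho):6.1f} ms per request")
```

At 50% utilisation the shared CPU merely doubles the response time, but at 95% it is twenty times slower. That is why a shared server that feels fine today can degrade sharply when a few more busy sites are added.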
Memory on a server works in a similar way. It's a shared resource and there is only so much of it. As it gets
used up, the system parcels it out, letting one process use some, then another, in turn. Sharing it that way
causes delays, and the more requests there are, the longer the delays. You may experience that as waiting for a
page to appear in the browser or for a file to download.
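On a Linux server, how much of that shared memory remains can be read from the /proc/meminfo file. The sketch below parses that file's format; the field names are real, but to keep it runnable anywhere the sample numbers are invented and stand in for the live file.

```python
# Parse the key lines of Linux's /proc/meminfo to see how much of a
# server's shared memory is still available. The parser is real; SAMPLE
# stands in for open("/proc/meminfo").read() so the sketch runs anywhere,
# and its numbers are invented for illustration.

SAMPLE = """\
MemTotal:        8167848 kB
MemFree:          214900 kB
MemAvailable:    3521120 kB
Buffers:          312456 kB
"""

def parse_meminfo(text: str) -> dict:
    """Return a mapping of field name -> kilobytes."""
    info = {}
    for line in text.splitlines():
        name, value = line.split(":", 1)
        info[name] = int(value.split()[0])  # drop the trailing "kB"
    return info

info = parse_meminfo(SAMPLE)
pct_available = 100 * info["MemAvailable"] / info["MemTotal"]
print(f"{pct_available:.0f}% of memory still available")
```

MemAvailable, rather than MemFree, is the figure to watch: it estimates how much memory new work can claim without pushing the system into the delays described above.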
Bottlenecks can also appear outside the server itself, in the components connected to it. Network components get
shared among multiple users along with everything else, and as with those other resources, the more requests
there are (and the longer each one ties a component up) the longer the delays you notice.
The only way to get an objective look at whether a server and its connected network have enough capacity is to
measure and test. All systems can report how much of each resource is being used.
Most can compile that information into some form of statistical report. Reviewing that data allows for a rational
assessment of how much capacity is being used and how much is still available. It also allows a knowledgeable
person to make projections of how much more sharing is possible with what level of impact.
Request that information and, if necessary, get help in interpreting it. Then you can make a cost-benefit decision
based on fact.
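On a Unix-style server, Python's standard library can already pull a few of these numbers itself. The sketch below compiles a minimal utilisation snapshot (load average, CPU count, disk usage); the report layout is my own invention, and it is a starting point for such a review, not a substitute for the host's full statistical reports.

```python
# Compile a minimal utilisation snapshot for a shared Unix server.
# os.getloadavg() and os.cpu_count() are standard library calls; the
# report format is invented for illustration, not any host's standard.
import os
import shutil

def capacity_snapshot(path: str = "/") -> dict:
    load1, load5, load15 = os.getloadavg()  # runnable processes, averaged
    cpus = os.cpu_count() or 1
    disk = shutil.disk_usage(path)
    return {
        "cpus": cpus,
        # load per CPU above 1.0 means requests are queueing for processors
        "load_per_cpu_1min": round(load1 / cpus, 2),
        "load_per_cpu_15min": round(load15 / cpus, 2),
        "disk_used_pct": round(100 * disk.used / disk.total, 1),
    }

if __name__ == "__main__":
    for key, value in capacity_snapshot().items():
        print(f"{key:>20}: {value}")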
Top of page