
How to increase the visitor traffic your website can cope with

31st January 2024
Reading time: 7 minutes
GWS Team

There are two main areas of consideration that affect the amount of simultaneous web traffic your website can handle.

The first lies in the technical configuration of your hosting service. To some extent you get what you pay for in this respect, so it is worth shopping around for a hosting package that meets the needs of your website now and in the near future, allowing for expected growth in web traffic and active users, but without paying more than you actually need to meet those needs.

All other factors being equal, a well-configured dedicated server with a recent hardware specification including multiple processor cores running at a high speed would give you the fastest performance.

A Virtual Private Server (VPS) represents the more affordable middle ground: you get a dedicated share of the resources on a server you share with other accounts. This tends to give reasonable page access times, provided that the number of users you have does not exceed the resources allocated to your VPS.

Shared hosting, the cheapest generic hosting option, tends to be the slowest and least reliable at coping with high levels of demand, because at all times your website is directly competing for server resource with every other website hosted on the same server.

The second consideration lies in factors that you can control, working within the resource limits set by your hosting provider.

We look at several of these under separate headings below.

Reducing Resource Sizes

Every resource accessed by users in connection with your website will require server processor time, hard drive space and memory proportionate to its size. This means that for any given level of hosting resources, you can accommodate more visitor traffic with smaller resources.

Media resources such as images, audio and video clips, animations and other graphical content, including PDF documents that contain it, can be very resource-hungry. Suitable compression should be applied to optimise all such content for web usage, while maintaining visual and (where relevant) sonic quality levels that are appropriate for your users’ demands and your brand image. There is always a trade-off between speed and quality, but except in the most demanding mission-critical contexts, where the highest quality is needed at any cost, sophisticated image and audio compression algorithms can very significantly reduce the size of graphical and audio-visual resources with negligible impact on user experience.

Image data compression can be carried out in Photoshop or Photoshop Elements on a sliding quality scale. If you have access to one of these programs, try reducing the quality level from 10 to 8 or 9 and see if the result is still visually acceptable. If it is, you should find a significant space saving.

There are also free web tools for image compression. We have frequently used TinyPNG, which tends to reduce PNG and JPEG files by between half and four-fifths of their original size with no loss of visual quality that a typical web viewer would clearly perceive, although a close side-by-side comparison of images before and after compression might reveal differences in fine colour gradation that would be apparent to a discerning designer, photographer or artist. The free TinyPNG service does not accept source images over 5MB, but the paid service carries no such limit. A host of user-recommended alternatives to TinyPNG, including Gumlet, ImageKit, Quicq and Cloudimage, some of which are paid services, is covered in this third-party article.

Where images are concerned, compression is not the only way to reduce size. Before applying compression, it is also wise to scale down any images that are larger than the screen space in which you plan to display them, so that they do not consume any more pixel data than necessary. If your website displays a particular image in a window that is only 1000 pixels across, there is no point in uploading a 3000-pixel-wide version of that image: you can beneficially first reduce it to one third of its original dimensions.
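To make the saving concrete, here is a minimal sketch in plain Python (the function name and figures are illustrative) showing how pixel data scales with image dimensions:

```python
def pixel_data_ratio(orig_width, orig_height, new_width):
    """Return the fraction of pixel data kept after scaling an image
    down to new_width, preserving the aspect ratio."""
    scale = new_width / orig_width
    new_height = round(orig_height * scale)
    return (new_width * new_height) / (orig_width * orig_height)

# A 3000x2000 image shown in a 1000-pixel-wide slot: scaling the width
# to one third keeps only about one ninth of the original pixel data.
print(pixel_data_ratio(3000, 2000, 1000))  # ≈ 0.111
```

Because pixel data grows with width times height, halving both dimensions quarters the data, and reducing them to a third cuts it to a ninth, before any compression is even applied.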

PDF files based on images can also be compressed to different standards. Adobe Acrobat offers a wide range of compression options for scanned documents and images, as well as for overall documents. These are worth experimenting with to establish where the optimal balance lies for your needs between file size and visual quality. Excessive compression can lead to grainy-looking text and images that are unsatisfying to the user. For free downloads of materials that do not directly visually represent your brand, steeper compression settings may be acceptable. For paid downloads, you should prioritise image quality more, so that your customers are not left with a cheap and shoddy-looking file that they resent having paid for.

Fixing slow pages

Some pages on websites are very slow to load. Normally these are pages that are complex in their construction and have to run many database queries, or manipulate a lot of data, to build the page. This issue was more common before modern content management systems, when custom code on certain pages made them much slower than others.

A slow page in the admin area of your website may not be a problem. A slow page on the public part of your website, however, can massively reduce the number of simultaneous visitors your website can support.

If there are one or more pages that are much slower to load than others, then it is worth asking your web developers to focus on optimising those to improve the overall site performance and enable it to support more visitors.

With WordPress websites, you will tend to find that some plug-ins will slow down every single page on the site, so the problem tends to be global.


Website hosting can typically deliver a certain number of pages to a certain number of users in a given period of time, assuming the requests are spread out evenly and don’t include any pages that are particularly slow to load.

While you can certainly pay for more expensive hosting to allow your website to deliver more pages per second, one simple way to increase the number of users your website supports is to add caching. This normally means that the web page delivered to the user is ‘cached’ in a finished form, reducing the work that goes into constructing and delivering it by 80-90%, with a correspondingly large increase in the number of page requests, and therefore visitors, that the site can support. A variety of caching plug-ins is available for WordPress sites, so if you use WordPress, check with your web developers that your site has a caching plug-in installed and configured correctly. Many other content management systems also have caching built in, so if you use another CMS such as Drupal, check with your web developers that caching is enabled and working.
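Conceptually, a page cache works along these lines. This is a hypothetical sketch in Python, not the API of any particular caching plug-in; the class name and the 300-second time-to-live are illustrative:

```python
import time

class PageCache:
    """Minimal in-memory page cache with a time-to-live (TTL)."""

    def __init__(self, ttl_seconds=300):
        self.ttl = ttl_seconds
        self.store = {}  # url -> (rendered_html, timestamp)

    def get(self, url, render):
        """Return a cached page if it is still fresh; otherwise
        render the page (the expensive step) and cache the result."""
        entry = self.store.get(url)
        if entry and time.time() - entry[1] < self.ttl:
            return entry[0]              # cache hit: no rendering work
        html = render(url)               # cache miss: build the page
        self.store[url] = (html, time.time())
        return html

cache = PageCache(ttl_seconds=300)
page = cache.get("/about", lambda url: "<html>" + url + "</html>")
```

The expensive `render` step (database queries, template assembly) runs only on a cache miss; every request within the TTL is served from memory, which is where the 80-90% saving comes from.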

One thing to be aware of is that caching typically doesn’t work when you are logged in to a website as an editor or a member, so the main benefit of caching will be for anonymous visitors.

Browser caching can also be significant. The first time someone visits a page on your website, the browser cache will be empty, but it populates with each page visit, and this can significantly speed up second and subsequent visits, as long as your site provides sensible directives to browsers about how long elements should be cached for. This tends to be overlooked, but it is worth considering, as it improves the browsing experience from the second page view onwards for an individual visitor.
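As a rough sketch of what "sensible directives" might look like, the snippet below maps file types to Cache-Control header values. The specific lifetimes are illustrative examples, not recommendations for any particular site:

```python
# Illustrative Cache-Control values: long-lived, versioned static assets
# can be cached aggressively, while HTML pages should expire sooner.
CACHE_HEADERS = {
    "static": "public, max-age=31536000, immutable",  # ~1 year for versioned CSS/JS/images
    "html": "public, max-age=600",                    # 10 minutes for pages
}

def cache_control_for(path):
    """Pick a Cache-Control value by file extension (simplified)."""
    static_exts = (".css", ".js", ".png", ".jpg", ".woff2")
    kind = "static" if path.endswith(static_exts) else "html"
    return CACHE_HEADERS[kind]

print(cache_control_for("/assets/site.css"))  # public, max-age=31536000, immutable
```

Long lifetimes are safe for static assets only if their filenames change when their content changes (for example, a version hash in the name); otherwise visitors may see stale files.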

Content Delivery Networks

Part of the speed of a website is determined by latency – the time it takes the request to travel to the server hosting the website, and the time it takes the response to travel back to the user’s browser. Traditionally, hosting in the USA would lead to more latency if the user was requesting pages from the UK, because of the time it took traffic to cross from a US data centre over the Atlantic to the UK, so websites would appear to be loading more slowly.

One way to resolve this is to use a hosting facility in the main target market you are serving. However, a better way to resolve this, especially if you have multiple geographically separated target markets, is to use a content delivery network.

This directs requests for your website to access points around the globe, using the one that is best connected to where the visitor is accessing the internet. This can significantly reduce latency when viewing a website that is hosted on a different continent, or when the internet connectivity from a country is poor. This can mean the difference between a tolerable experience when browsing your website and an unacceptable one.

A content delivery network typically also takes the weight off your website hosting, by caching potentially half of the data delivered to visitors so that it does not have to be re-requested from the server, which can double the number of visitors your website can support.
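The arithmetic behind that claim can be sketched as follows: if a CDN serves some fraction of requests from its edge cache, the origin server only sees the remainder, so its effective capacity is multiplied by 1 / (1 - fraction). The function name is illustrative:

```python
def capacity_multiplier(offload_fraction):
    """If a CDN serves this fraction of requests from its edge cache,
    the origin can support 1 / (1 - fraction) times as many visitors."""
    if not 0 <= offload_fraction < 1:
        raise ValueError("offload_fraction must be in [0, 1)")
    return 1 / (1 - offload_fraction)

print(capacity_multiplier(0.5))  # 2.0 -> caching half the traffic doubles capacity
print(capacity_multiplier(0.8))  # roughly 5x
```

Note how the benefit compounds at high offload rates: going from 50% to 80% cached traffic raises origin capacity from 2x to roughly 5x.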

Another benefit of CDNs is that they can be used to discourage or block ‘bot’ activity – robots or scripts that request pages on your site to index them or to scan for vulnerabilities, such as forms that can be used to send spam or opportunities to inject malware. Bot activity can put a significant load on your website and so reduce the number of real (human) visitors that it can support.

Cloud Storage

You can also help to maintain the speed and visitor-handling capabilities of your website by uploading larger resources such as video, audio and large PDF files to separate cloud storage such as Amazon S3. This gives you access to a separate server resource for delivering those large files, without impacting your primary web host’s handling of regular visitor volumes while a small number of users download those large files. Videos and image-based PDF files (for example, scanned books) are particularly likely to increase your bandwidth requirements, and unless they are behind a registration wall, you may find they are linked from many other websites as well as your own.

Cloud storage of this kind does incur extra charges, as it is a separate service from your regular web hosting, but it can be surprisingly affordable as an add-on to your main web hosting service, with providers such as Amazon S3 offering pay-as-you-go pricing at low rates per gigabyte of data storage and data transfer, billed on actual usage. Although such a pricing model might seem to risk astronomical bills if a surge of users starts downloading your files, you can guard against this by setting your own hard limits per user and/or per month, or offset it entirely by putting high-quality file downloads behind a paywall that charges the user for them.
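A back-of-the-envelope estimate of such a bill is simple to compute. The rates below are purely illustrative placeholders, not current prices for Amazon S3 or any other provider:

```python
def monthly_cloud_cost(storage_gb, transfer_gb,
                       storage_rate=0.023, transfer_rate=0.09):
    """Estimate a pay-as-you-go monthly bill in dollars.
    The default per-GB rates are illustrative assumptions only,
    not actual prices for any provider."""
    return storage_gb * storage_rate + transfer_gb * transfer_rate

# e.g. 50 GB stored and 200 GB transferred in a month:
print(round(monthly_cloud_cost(50, 200), 2))  # 19.15
```

As the example suggests, transfer usually dominates the bill for download-heavy sites, which is why limiting or monetising large downloads matters more than trimming storage.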

Want to succeed online?

GWS Media have over two decades of experience in all areas of online marketing work, from logo design and website design and development to copywriting, content marketing, SEO and email marketing. Contact us today for a chat about your project, and we'll be delighted to see if we can help.
