Resistance is Fertile

If you don’t have a great deal of time to dedicate to the operation of a web site, then it’s a good idea to seek out the best provider for the intended purpose. Seek security, resilience, reliability, bandwidth and space; probably in that order.

The primary goals of a web site are availability and integrity. You get neither if the web site or its server environment is not secure.

There are a number of “bots” (automated pieces of software, “robots”) of differing flavours roaming the net, looking for vulnerable sites. If, for example, your web site looks like it runs PHP, a popular language for implementing web sites quickly and easily, a horde of bots will try to take over your site. It doesn’t matter whether it’s high-profile or not. The objective of the bots is mainly to occupy space and to consume resources at your cost, almost always for their operators’ profit, usually by unlawful means.

Some bot networks have in the past been used to attack and to deny service to high-profile sites because of the views expressed on the sites or by their owners. Such attacks consume all the connectivity resources of the victim’s server but are relatively rare. These DDoS (Distributed Denial of Service) attacks can and do employ bots operating from other sites which have been taken over to form a “botnet”; often simply for profit.

There are two basic types of web sites/pages:

  • Static pages are those which contain all the layout and the details you see on the web page in a fixed manner. It’s almost like leafing through a book.
  • Dynamic pages draw their layout and page contents from programs, “flat” data files and possibly one or more database engines. Their behaviour can change with every visit and with the type of visitor, as the sketch below illustrates.
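
To make the distinction concrete, here is a minimal sketch using Python’s standard library; the page content and paths are invented for illustration. The “static” path returns the same bytes on every visit, while the other path assembles its page afresh for each request.

    # Sketch of the static/dynamic distinction (illustrative only; a real
    # site would sit behind a hardened production server).
    from datetime import datetime
    from http.server import BaseHTTPRequestHandler, HTTPServer

    STATIC_PAGE = (b"<html><body><h1>About us</h1>"
                   b"<p>Fixed text, the same for every visitor.</p></body></html>")

    class DemoHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            if self.path == "/static":
                body = STATIC_PAGE                       # identical on every visit
            else:
                # Assembled per request: depends on the time and the visitor.
                body = ("<html><body><p>Hello %s, it is %s.</p></body></html>"
                        % (self.client_address[0], datetime.now())).encode()
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("127.0.0.1", 8080), DemoHandler).serve_forever()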

Dynamic content is easy for users, programmers and administrators to change and to maintain. Alas, that’s also true for attackers. The programming languages and database engines that provide the dynamic content are multi-edged tools, often cutting in unexpected ways. It requires insight, experience and skill on the part of the programmer to avoid handing those tools over to the attackers.
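
As one illustration of that multi-edged nature, not drawn from any particular CMS: if visitor input is pasted straight into an SQL query, the database will happily execute whatever the visitor wrote, whereas a parameterised query treats the input purely as data. A small sketch, with an invented table:

    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE pages (slug TEXT, body TEXT)")
    db.execute("INSERT INTO pages VALUES ('home', 'Welcome')")

    visitor_input = "home' OR '1'='1"                 # what an attacker might send

    # Dangerous: the visitor's input becomes part of the SQL itself.
    unsafe = "SELECT body FROM pages WHERE slug = '%s'" % visitor_input
    print(db.execute(unsafe).fetchall())              # returns rows it should not

    # Safer: the input is passed as data, never as SQL.
    safe = "SELECT body FROM pages WHERE slug = ?"
    print(db.execute(safe, (visitor_input,)).fetchall())   # returns nothing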

I have a small site of static pages with some company information that’s not linked from anywhere. It still provides some tarpits (places where, should one step in, one gets stuck) which will keep thousands of bots entertained for days without significantly impacting my uplink bandwidth or the web server.
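
How such tarpits are built isn’t described here, but one common approach, sketched below purely as an assumption and not as my actual setup, is a listener that accepts a connection and then dribbles out a never-ending response one byte at a time, so the bot ties up its own resources rather than mine.

    # A toy tarpit: accept the connection, then feed the client one
    # meaningless byte every few seconds.
    import socket
    import threading
    import time

    def drip(conn, delay=10):
        try:
            conn.sendall(b"HTTP/1.1 200 OK\r\nContent-Type: text/html\r\n\r\n")
            while True:
                conn.sendall(b".")            # one byte at a time, forever
                time.sleep(delay)
        except OSError:
            pass                              # the bot finally gave up
        finally:
            conn.close()

    def tarpit(host="0.0.0.0", port=8081):
        srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((host, port))
        srv.listen(64)
        while True:
            conn, _addr = srv.accept()
            threading.Thread(target=drip, args=(conn,), daemon=True).start()

    if __name__ == "__main__":
        tarpit()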

A site with static pages is still vulnerable via bugs in the server engine. Best practice assumes that all software has bugs, so even the server engine must be “boxed in” within the physical server to prevent it from reaching resources beyond what it needs to operate. If the web server is running on a virtual server, that adds a potential layer of protection and the ability to switch quickly to a new virtual server.
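
“Boxing in” can mean many things (chroot jails, containers, mandatory access control, a separate virtual server). The sketch below shows one old-fashioned Unix variant, purely as an illustration and with a hypothetical jail directory: bind the listening socket, then confine the process to the jail and give up root before it serves anything.

    # Must be started as root on a Unix system; a sketch of the principle,
    # not a complete hardening guide.
    import os
    from http.server import HTTPServer, SimpleHTTPRequestHandler

    JAIL = "/srv/www-jail"        # hypothetical directory holding only the site files
    UNPRIVILEGED_UID = 33         # e.g. www-data on many Debian-style systems
    UNPRIVILEGED_GID = 33

    def box_in():
        os.chroot(JAIL)           # the process can no longer see the rest of the disk
        os.chdir("/")
        os.setgid(UNPRIVILEGED_GID)   # drop the group first,
        os.setuid(UNPRIVILEGED_UID)   # then give up root itself

    if __name__ == "__main__":
        # Bind port 80 while still root, then lock the process down.
        server = HTTPServer(("0.0.0.0", 80), SimpleHTTPRequestHandler)
        box_in()
        server.serve_forever()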

It should go without saying that one mustn’t automatically bring up another virtual server with the same vulnerabilities, because it will take an attacker at most a few minutes to “own” the new one. If you have virtual servers, then prepare one that has just the simplest-possible web server and a “Down for maintenance” static page to show visitors. Publish that static server while investigating why the dynamic server was compromised.
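
That stand-in really can be the simplest possible thing; a throwaway sketch (the page text is obviously just an example):

    # Every request gets the same static "Down for maintenance" page while
    # the compromised server is being examined.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    PAGE = (b"<html><body><h1>Down for maintenance</h1>"
            b"<p>Back shortly.</p></body></html>")

    class MaintenanceHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            self.send_response(503)                      # "service unavailable": temporary
            self.send_header("Content-Type", "text/html")
            self.send_header("Retry-After", "3600")
            self.send_header("Content-Length", str(len(PAGE)))
            self.end_headers()
            self.wfile.write(PAGE)

    if __name__ == "__main__":
        HTTPServer(("0.0.0.0", 8080), MaintenanceHandler).serve_forever()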

Static pages can still provide dynamic behaviour via CGI (the Common Gateway Interface), which makes the web server potentially as vulnerable as if a real user were logged in at the server. So the programming and configuration of the dynamic components have to be very, very meticulous and must always assume the worst. The same holds for PHP and similar sites.
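
A CGI program receives whatever the request hands it, via environment variables and standard input, so “assume the worst” means rejecting anything that does not match a strict whitelist. A minimal sketch; the parameter name and whitelist are invented for the example.

    #!/usr/bin/env python3
    # Everything arriving with the request is treated as hostile until
    # proven otherwise.
    import os
    import re
    import sys
    from urllib.parse import parse_qs

    ALLOWED_PAGE = re.compile(r"^[a-z]{1,20}$")   # whitelist: short lowercase names only

    def main():
        query = parse_qs(os.environ.get("QUERY_STRING", ""))
        page = query.get("page", ["home"])[0]

        sys.stdout.write("Content-Type: text/html\r\n\r\n")
        if not ALLOWED_PAGE.match(page):
            # Reject anything outside the whitelist rather than trying to
            # "clean it up".
            sys.stdout.write("<html><body><p>Invalid request.</p></body></html>")
            return
        # 'page' is now known to be a harmless token, safe to use in a lookup.
        sys.stdout.write("<html><body><p>You asked for the '%s' page.</p></body></html>" % page)

    if __name__ == "__main__":
        main()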

Unfortunately, security is only a minor chapter buried deep within the programming and user guides of CMSs (Content Management Systems) and their respective languages, rather than the first; a practice which fails to emphasise and to support the primary goals of a web site: availability and integrity.

Maintaining a site of static pages that presents up-to-date information is an order of magnitude more effort than using a CMS and a web server supporting dynamic content. Even so, the balance between the effort of managing static content and the ongoing effort of defending a dynamic server against attacks seems to be shifting. If you need dynamic content, then your decision is largely made; unless you want to implement dynamic content on static pages via the CGI without a full CMS.

“Good” server management accepts that a successful attack is inevitable, and so begins with the provision of backups and/or checkpoints. There are also provisions for redundancy (clouds): should one server become the focus of a DDoS attack, one has the option of trying to exhaust the attacker’s resources (very difficult if their botnet owns hundreds of other computers on the Internet).
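
What a “checkpoint” amounts to will vary; one simple reading, sketched below with hypothetical paths, is a dated copy of the site files plus a checksum manifest, so that after an incident one can both restore the site and see exactly which files were altered.

    # A bare-bones checkpoint: copy the site into a dated snapshot directory
    # and record a SHA-256 manifest.  Paths are hypothetical; real backups
    # belong on a separate machine.
    import hashlib
    import shutil
    from datetime import datetime
    from pathlib import Path

    SITE_ROOT = Path("/srv/www")                  # hypothetical live site
    BACKUP_ROOT = Path("/backup/www")             # hypothetical backup area

    def checkpoint():
        stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
        snapshot = BACKUP_ROOT / stamp
        shutil.copytree(SITE_ROOT, snapshot)      # the restorable copy

        manifest = []
        for path in sorted(snapshot.rglob("*")):
            if path.is_file():
                digest = hashlib.sha256(path.read_bytes()).hexdigest()
                manifest.append("%s  %s" % (digest, path.relative_to(snapshot)))
        (snapshot / "MANIFEST.sha256").write_text("\n".join(manifest) + "\n")
        return snapshot

    if __name__ == "__main__":
        print("checkpoint written to", checkpoint())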

Laws prohibit a counter-strike against attacking systems, so you can’t “shoot back”. Law-abiding people have only the massive inertia of ill-equipped authorities to help them stop the attacks and bring the instigators to justice. If one’s site is “hacked”, then the chain of evidence must be preserved, meaning that the victim’s system has to be taken offline until an “image” can be officially taken. If the web site is significant, a “spare” server has to be put in its place. Redundant/cloud services can help here.

The black hats (the bad guys) understand the legal burden imposed on those who wish to pursue them through law enforcement. They also understand that most people therefore won’t pursue the matter and would rather just keep repairing the damage.

ISPs (Internet Service Providers) are largely ineffective at dealing with attacks unless they originate within the same country, the same ISP or another cooperative ISP. They are bound by the same laws as individuals and can act directly only against their own customers, and then only to the extent of a breach of contract.

The focus of an ISP is to make money by providing connectivity. Keep that in mind if you run your own site, whether in-house or hosted at the ISP.