Archive for the 'FAQ' Category

Definition: Shared Hosting

Sunday, January 28th, 2007

Shared hosting means a website is hosted on a server that also hosts other websites, typically dozens if not hundreds of them.

Every website needs a certain amount of hard disk space for storing files (e.g. HTML pages, JPEG images, PHP or other scripts). It also needs a certain amount of processing from the server’s CPU, and it needs a certain amount of bandwidth to send webpages out to the website’s visitors over the Internet.

The amount of hard disk space needed is simply a function of the size and complexity of the site. The computer processing load and bandwidth are a function of the number of visitors to the site and whether it has a lot of image files or a lot of programmatic functions (Imagine how many financial transactions are being handled by Amazon at any given moment).

The vast majority of websites on the Internet are neither large enough nor busy enough to justify having their own dedicated server. So the vast majority of websites use shared hosting.

Shared hosting has the following advantages and disadvantages:

Advantages:

  • Less expensive: because one is sharing the cost of the server with many other websites, the cost per website is much reduced. A typical cost for a shared hosting account would be $10 per month; a dedicated server would cost on the order of $300 per month.
  • Preconfigured: the company providing the shared account will have provided the tools and capabilities needed by a typical website, including a control panel, email services, web statistics, and a means of installing various web applications (e.g. blogs, image galleries, etc.). With a dedicated server, it is assumed that you want to, and are capable of, configuring the server yourself.
  • Security: the company providing the shared hosting server will be responsible for maintaining a secure environment, and making sure that the operating system is properly patched. With a dedicated server, you will get some support from the hosting service but you will share the responsibility of keeping the server secure.
  • Reliability: because the shared hosting server is hosting dozens of websites, it is in the interest of the hosting service to ensure that the server stays operational and that it is returned to service ASAP if there is a failure. With a dedicated server, the hosting company will certainly assist you with problems but, since you will be the only customer involved, they may not give you priority if there are problems on their other servers.
  • Availability: there are literally thousands of companies offering shared hosting, which is one of the reasons the cost is so affordable.

Disadvantages:

  • Flexibility: because a shared hosting server is pre-configured for the “average” website, one may find that it lacks certain capabilities that one needs… And that it may be difficult or impossible to add them. Typically this will be in the area of scripting languages. For example, most shared hosting will not have Python or Ruby integrated with the webserver… Or the shared hosting server may not use the latest version of PHP.
  • Security: even though the hosting company takes responsibility for overall security, the fact that there are dozens of websites on the server means that each one constitutes a potential vulnerability. The entire shared hosting server (and all the sites it hosts) can be brought down by a vulnerability on a single site.
  • Reliability: as mentioned above, there are thousands of companies offering shared hosting at very low prices but finding a reliable hosting company can be very challenging.

On balance, of course, shared hosting is the best option for most websites. It is important to find a reliable hosting service, and to review the various packages offered to make sure they meet your needs.

Definition: Extranet

Friday, January 26th, 2007

An Extranet is a website that is present on the Internet but is only available to authorized individuals. It is typically protected by a mechanism that requires a visitor to provide an authorized login id and corresponding password. The Extranet may also require that visitors log in over an encrypted connection.

The users of an Extranet are typically employees, customers, or suppliers with a relationship to the organization operating the Extranet.

See also Intranet.

Definition: Intranet

Friday, January 26th, 2007

An Intranet is a website residing on a computer on a local area network or LAN. It is usually not accessible from the outside world (i.e. the rest of the Internet) either because the local network is not connected to the Internet or, more commonly, because it is protected behind a firewall.

An Intranet is typically used to provide information to employees or members of the organization owning the local area network. The Intranet might be used to post company announcements and policies or run web applications (e.g. shared calendars or CRM software).

A common practice in small businesses is to take an obsolete Windows PC, install Linux, and set it up as an inexpensive web server for the Intranet.

See also Extranet.

Search Engine Optimization: There are No Free Lunches

Friday, January 12th, 2007

I don’t think there is any area of the web which has more snake oil being peddled than the area of Search Engine Optimization, aka SEO.

I get emails every day from outfits guaranteeing that I will get a top-ten listing on Google, Yahoo, and MSN… All I need to do is sign up with these guys and pay them a lot of money. Amazing… they don’t know what I am selling, what my website content looks like, or how much competition I have… But they can still guarantee me a top-ten listing. How can they do that?

The answer, of course, is they can’t. This is the web-age equivalent of the old UHF TV ads for veggiematics or some other poorly manufactured household product. You will pay them some amount (and it will vary all over the map… from a few bucks to thousands of dollars depending on how dumb you are) and, wonder of wonders, you won’t get a top-ten listing on Google, Yahoo, or MSN. The only difference between the ten-dollar haircut and the thousand-dollar one will be the ingenuity and creativity of their explanations of what went wrong.

So now you are both ripped off and ticked off… Are you going to sue them? If you only paid a few hundred it’s not worth it… If you paid them thousands then either they’re long gone to some South American country or they’ve put half their ill-gotten gains into hiring fancy lawyers to defend themselves.

The actual ranking algorithms of the major search engines are closely guarded secrets and there is a whole industry of folks reading tea leaves trying to reverse-engineer them.

But I think one can establish some pretty good guidelines based on simple common sense. So here are Denholm’s Common Sense Laws of Search Engine Optimization:

  1. The major search engines are genuinely trying to produce search results that are as useful as possible to their user base.
  2. What makes a website useful to their user base is content.
  3. The only thing that search engine spiders and crawlers can analyze is text. So it is only text (and not images) that matters.
  4. Apart from the content itself, the only thing that will matter to search engines in terms of ranking will be the number and quality of other sites that link to yours.
  5. The only additional thing you can do is make sure that your content is properly indexed using meta tags and h1, h2, h3 heading elements.
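Law #3 can be illustrated with a toy sketch (the class name and sample markup here are invented for illustration): a parser that keeps only the text of a page, plus the alt text of images, is a rough approximation of what a spider “sees”:

```python
from html.parser import HTMLParser

class TextOnlySpider(HTMLParser):
    """A toy model of a crawler's view of a page: it keeps text
    (and image alt text) and ignores everything else."""
    def __init__(self):
        super().__init__()
        self.words = []

    def handle_data(self, data):
        # Visible text between tags is what gets analyzed
        self.words.extend(data.split())

    def handle_starttag(self, tag, attrs):
        # The image file itself is invisible; only its alt text counts
        if tag == "img":
            alt = dict(attrs).get("alt", "")
            self.words.extend(alt.split())

spider = TextOnlySpider()
spider.feed('<h1>Sea Kayaks</h1><img src="boat.jpg" alt="touring kayak"><p>On sale now.</p>')
print(spider.words)  # ['Sea', 'Kayaks', 'touring', 'kayak', 'On', 'sale', 'now.']
```

Note that the image file `boat.jpg` contributes nothing on its own; only its alt text survives — which is why text content and alt attributes matter and pure imagery does not.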

Any attempt to “game the system” will only have temporary success and will, ultimately, backfire once the search engines figure out how the gaming works. Note that I am talking about “natural ranking”, not sponsored links or pay-for-click. The latter are, indeed, available to the highest bidder. But it is the natural rank that matters. Most search engine users either knowingly or instinctively ignore sponsored and pay-for-click listings.

So, the bottom line is that you want to have your website filled with as much useful text content as possible. Use meta-tags and h# tags to highlight for the search engines which keywords best describe your content. Get authoritative incoming links that tell the search engines that others find your content of value. Do not establish reciprocal links with sites which have no logical connection to yours. That is merely the latest form of gaming the system and it will not help… And it probably hurts.
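As a rough sketch (the business, titles, and keywords here are all invented), the kind of indexing described above looks like this in a page’s HTML:

```html
<head>
  <!-- The title and meta tags tell the search engines what the page is about -->
  <title>Sea Kayaks and Paddling Gear | Example Kayak Shop</title>
  <meta name="description" content="Sea kayaks, paddles, and gear from major manufacturers.">
  <meta name="keywords" content="kayak, sea kayak, paddles, kayaking gear">
</head>
<body>
  <!-- h1/h2 headings highlight the keywords that best describe the content -->
  <h1>Sea Kayaks</h1>
  <h2>Touring Kayaks</h2>
  <p>Our touring kayaks are suited to multi-day coastal trips…</p>
</body>
```

The headings and meta tags don’t replace the content; they just flag for the search engines which of your keywords matter most.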

Good incoming links are likely to be those from professional organizations, chambers of commerce, trade groups, large companies, government agencies, universities, etc. One of our customers is a kayak dealer. He has incoming links from about 20 kayak manufacturers. My belief is that has a serious, positive impact on his search engine ranking.

And even if you have all the above in place, your ranking will still depend on how many other websites are competing in the same conceptual space and how good their content is, and their incoming links, and their keyword indexing.

There are no magic bullets and anyone who claims they can guarantee a high search engine ranking on the major search engines is blowing smoke.

That is not to say that you shouldn’t try to optimize your site for search engine visibility. I frequently see sites that seem to have been designed for stealth instead of visibility.

Apart from having good content, good indexing of said content, and incoming links from good sites that have a logical content-based reason to link to you, there are some things that you can do and some things you can avoid:

  • Do not create entire sites using Macromedia Flash (or any other dynamic architecture such as PHP/MySQL, ColdFusion, or ASP). The search engines cannot be bothered to figure out how to analyze websites where the navigation is dynamic, and content delivered dynamically will be ignored. The sole exception is where your dynamic mechanism uses mod_rewrite (or an equivalent) to mimic static links. (Note that Flash is fine for individual animated images… Just don’t build the site navigation with it.)
  • Do write compliant, valid HTML or XHTML and do validate your HTML/XHTML against the W3C validation engines. The search engine spiders and ranking algorithms depend on your code being correctly written in order to analyze your site. If your code is poorly written and incorrect, they will ignore your site. Note that your site can look fine to a human viewer using a browser and still be “broken” from a spider’s point of view.
  • Do use alt-tags if your site is based on image content. The search engines “like” images but the alt-tags are the only way they “understand” what the images represent.
  • Do submit your websites to the search engines when you first create them and after major changes or updates.
  • Do promote your website address in non-web channels. Use it in print advertising, on your letterhead, on business cards, and in press releases. This does not help directly with your ranking but it does increase traffic to your site… And a surprising number of people actually search for a web address rather than enter it directly in the browser.
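For the mod_rewrite exception mentioned in the first point above, a dynamic site can present static-looking links with a rule in an Apache .htaccess file. A minimal sketch (the paths, script name, and parameter are hypothetical):

```apache
RewriteEngine On
# Map a static-looking URL such as /products/kayaks.html
# to the dynamic script that actually generates the page
RewriteRule ^products/([a-z]+)\.html$ /catalog.php?category=$1 [L]
```

To the spider, `/products/kayaks.html` looks like an ordinary static page, so it gets crawled and indexed like one.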

Problems with spam filters…

Wednesday, January 10th, 2007

These last few months I have noticed an increasing problem with legitimate emails getting blocked by poorly conceived and configured spam filters.

It appears that many ISP’s (Internet Service Providers) are responding to complaints by their customers (about the torrent of spam they receive) by modifying and reconfiguring their spam filtering technology.

Unfortunately, these responses have not been well thought out or well implemented.

I will provide a couple of examples but first I need to define some terms:

  • User PC – the personal computer used by an end-user to send or receive an email
  • Email Client – the software on the User PC used to compose and read email
  • Email Server – the server used by multiple users to handle both incoming and outgoing emails
  • ISP – a provider of internet connectivity to individuals and small businesses (includes big phone companies like Verizon, big cable TV operators like Comcast, and smaller local providers).
  • Spammer – the bad guys, typically using someone else’s hijacked PCs or servers to broadcast thousands of undesirable junk emails to huge lists of email addresses

Example #1

I recently sent an email to a customer (a firm of architects) and had it bounced back to me as spam by the firm’s ISP. In this case the ISP was a small, local outfit, but a lot of the big guys are doing the same thing.

I created the email on my Windows XP computer using my email client (Mozilla Thunderbird). My email client is configured to use my address at my own domain as the “From” address. It is also configured to send email via an email server provided by the hosting service I use to host my site. That email server’s name belongs to the hosting company’s domain, not mine.

Why was my legitimate email bounced back to me? The local ISP analyzed the header information of my email and found that the email server I was using was not part of the same domain as my email address and decided that this meant my email was spam.

My “From” address was at my own domain, which is different from the domain of the email server I sent through. The email header also includes the numeric IP address of the originating SMTP email server, which would also be associated with my hosting service rather than my domain.

Now, it is true that essentially all spam is sent from SMTP servers which are not associated with their purported “From” or “Reply-to” email addresses. But it is also true that the vast majority of small to medium size businesses and organizations do not have their own SMTP email servers. So this policy of blocking emails where the “From” domain is not associated with the originating email server means that a lot of legitimate emails are being blocked.
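To illustrate the mismatch (all of the names and the IP address below are made up), the relevant part of such a message’s headers looks roughly like this:

```text
Received: from mail.hostingprovider-example.com (203.0.113.25)
        by mx.recipient-isp-example.net; Wed, 10 Jan 2007 09:15:02 -0500
From: "J. Denholm" <john@mydomain-example.com>
```

The “From” domain (mydomain-example.com) has no visible connection to the server that actually delivered the message (mail.hostingprovider-example.com), and that mismatch is exactly the pattern these filters flag.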

In this particular instance, I called up my customer on the phone and explained the situation and asked for his fax number… And then I faxed him what I would have otherwise emailed.

Note that this means that this firm of architects is having some portion of its legitimate incoming email blocked. If the sender is not that motivated, he or she may just shrug and “walk away” and some potential business is lost.

The big email providers such as Yahoo, MSN/Hotmail, and Google/GMail follow a similar practice. They don’t block unassociated emails outright, but they do automatically put them in the end-user’s spam or bulk folder.

SPF Record

There is a partial work-around but it’s a little complicated and very few small or medium size businesses know enough to apply it.

The partial work-around is called the SPF Record. SPF stands for Sender Policy Framework, and you can find out more about it on the SPF project’s website. But in essence, the SPF record is part of a domain’s DNS records, and it provides a list of domains and servers that can legitimately send email associated with that domain.
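As a sketch (the domain and server names here are invented), an SPF record is published as a DNS TXT record for the domain and might look like:

```text
mydomain-example.com.  IN  TXT  "v=spf1 a:smtp.hostingprovider-example.com include:thirdparty-example.com -all"
```

Reading it left to right: `a:` authorizes a specific named server, `include:` pulls in another domain’s list of authorized senders (useful for a third-party mailer), and `-all` says everything else should fail the check (`~all` is the softer “mark as suspicious” variant).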

So I set up an SPF Record that included the email server information for my hosting provider’s email server (and a third party email server I use). I waited a few hours and then sent test emails to accounts I have on Yahoo, MSN, and GMail… And it worked. None treated my emails as spam.

I then sent a test email to my architect customer and it also got through fine.

To summarize… Once the SPF record is implemented, an ISP (such as Yahoo) receiving an email purporting to be from my domain can check that domain’s DNS records and confirm that the SMTP server I used was authorized to send its emails.

Example #2

My hosting service is very good about ensuring that their servers are not used for spamming. This is critical because each of their servers hosts dozens of domains and they all share a single email server. If any one of those domains is sending spam (either deliberately or because they were hacked) then all mail from that email server gets blocked or labeled as spam by the big ISP’s.

But recently a new problem has arisen with one of the larger ISP’s (Comcast).

The scenario is that someone hosting their domain with the hosting service decides to forward their email (sent to an address at their own domain) to an email address they have on Comcast (i.e. an @comcast.net address). Presumably Comcast is the ISP for either their small business or their home computer.

This should not be a problem but it is… Because Comcast views any spam that is being forwarded as being generated by the forwarding server, and Comcast then blocks said forwarding server. This is causing such a problem that the hosting service has banned anyone from forwarding email to Comcast addresses. This is, of course, hurting Comcast’s own customers… Some are finding that they are not allowed to forward emails to their own Comcast email accounts… And even more are having legitimate emails sent to them blocked.

I have a number of customers with Comcast addresses, and I now send emails to them via my GMail account. That seems to work fine, even though I am sure a lot of spam gets forwarded via GMail. Perhaps Comcast is scared of blocking emails from an entity as large as Google.