

Abandoning Scripts for Static Pages

Feb 2015

So a site I developed got front-paged on Reddit. Twice.

I’d like to give the story of each occurrence…

When I finished developing my site, I did a bit of load-testing. My conclusion: “This seems OK. Maybe it needs a little help.” I wrote a crude caching system to help performance. Shortly afterwards, the site got front-paged on Reddit and choked under the heavy traffic. I turned off my caching because it was just making things worse. I had the cache clean itself at the end of the day; caches go stale and need to be cleaned, but end-of-day cleanup doesn’t help when the cache grows to over 100 MB in just a couple of hours. The site limped along, and complaints of slowness were fairly frequent.
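For reference, the crude cache was roughly this shape. This is a minimal sketch in JavaScript rather than my original PHP, and the names are made up, but it shows the flaw: entries are only purged by a daily cleanup, so a traffic spike grows the cache far faster than the cleanup can reclaim it.

```javascript
// Sketch of a naive page cache (hypothetical names; the original was PHP).
// Entries are only purged by a once-a-day cleanup job, so under a traffic
// spike the cache grows without bound between cleanups.
class NaiveCache {
  constructor() {
    this.entries = new Map(); // key -> { body, createdAt }
  }
  get(key) {
    const hit = this.entries.get(key);
    return hit ? hit.body : null;
  }
  set(key, body) {
    this.entries.set(key, { body, createdAt: Date.now() });
  }
  // Runs once a day; anything older than maxAgeMs is dropped.
  dailyCleanup(maxAgeMs) {
    const now = Date.now();
    for (const [key, entry] of this.entries) {
      if (now - entry.createdAt > maxAgeMs) this.entries.delete(key);
    }
  }
}
```

A per-entry TTL checked on every `get`, or a hard size cap, would have bounded the growth; a daily sweep does not.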

After that experience, I decided to rewrite the entire site from scratch with a focus on performance. When it was finished, the SECOND time it hit the Reddit front page, the site did not even blink. Perfect performance. Zero complaints. This is on the same $15/mo hosting I use for all my sites.

How did I do this? It was a very simple architectural decision. I designed the system so it does not matter if 1 person or 1,000 people connect and use the site. Here is how I did it:

  • I ripped out as much PHP as possible and replaced it with JavaScript.
  • I ripped out the database entirely. I was only using it to log stats anyway. Server logs will do.

The idea is very simple. Instead of having 1 CPU doing 1,000 calculations (my server), I now have 1,000 CPUs each doing 1 calculation (the users’ web browsers). The only thing my server does now is decide which files to give to the browser. Once the files are delivered, the browser disconnects, and my server can move on to the next user. Apache is extremely efficient at this task.
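To illustrate the kind of work that moves from PHP on the server to JavaScript in the browser, here is a hedged example (the function and numbers are hypothetical, not from my actual site):

```javascript
// Hypothetical example: a monthly loan-payment calculation that used to run
// server-side now runs in each visitor's browser. The server only ships this
// file as static JavaScript; every user brings their own CPU.
function monthlyPayment(principal, annualRatePct, years) {
  const r = annualRatePct / 100 / 12; // monthly interest rate
  const n = years * 12;               // number of payments
  if (r === 0) return principal / n;  // zero-interest edge case
  return (principal * r) / (1 - Math.pow(1 + r, -n));
}
```

Calling `monthlyPayment(200000, 5, 30)` gives about $1,073.64 per month, computed entirely on the client. The server never sees the request.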

After seeing my site’s breathtakingly dramatic performance improvement, I have become a very big fan of using my server as little as possible. Maybe it is because I am on a cheap host, but I now consider my server the choke point for everything. I try to help it as much as possible. I set timers in my code, and if a routine takes too long, I try to rewrite it. If I find a calculation that can be ripped out and put into client-side JavaScript, I rip it out. If a big file is a slow download, I move it to Amazon S3. And so on…
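The timing habit can be as simple as wrapping a routine and logging whenever it blows its budget. A sketch (the budget and names are made up, not from my code):

```javascript
// Sketch: wrap a routine with a timer and flag slow calls for rewrite.
// The budget threshold and names are hypothetical.
function timed(name, budgetMs, fn) {
  return function (...args) {
    const start = Date.now();
    const result = fn(...args);
    const elapsed = Date.now() - start;
    if (elapsed > budgetMs) {
      console.warn(`${name} took ${elapsed} ms (budget ${budgetMs} ms) - candidate for a rewrite or a client-side move`);
    }
    return result; // behavior is unchanged; only the timing is added
  };
}
```

Usage might look like `const renderPage = timed('renderPage', 50, buildPageHtml);` — anything that keeps tripping the warning goes on the rewrite list.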

My most popular site is pure static HTML with light JavaScript. The site is dynamic with PHP when I develop it, but then I “compile” it to raw HTML when I upload to my public server. Both Google and my users thank me. Google gives a bonus to fast sites, and users like having fast downloads (from Amazon S3). I did not get as many visitors when that site was heavy with PHP.

PHP is great for dynamic content, but it is slow compared to serving static files. Try to avoid it if your site is 90% static.



Dan Nagle is a software developer and the founder of NagleCode. His apps have been downloaded hundreds of thousands of times and have been featured all over the internet. He resides in Huntsville, AL.


Copyright © NagleCode 2007 - 2018.