So a site I developed got front-paged on Reddit. Twice.
I’d like to give the story of each occurrence…
When I finished developing my site, I did a bit of load-testing. My conclusion: “This seems OK. Maybe it needs a little help.” So I wrote a crude caching system to help performance. Shortly afterwards, the site got front-paged on Reddit, and it choked under all the heavy traffic. I turned off my caching because it was just making things worse. I had the cache clean itself at the end of the day. Caches go stale and need to be cleaned, but end-of-day cleansing doesn’t help when the cache grows to over 100 MB in just a couple of hours. Anyway, the site limped along, and complaints of slowness were fairly frequent.
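In hindsight, the fix for that particular problem is to expire each cache entry individually rather than wiping everything once a day. Here is a minimal sketch of that idea as a per-entry TTL cache (written in JavaScript for illustration; the site itself was PHP, and the class and method names here are mine, not from the actual code):

```javascript
// Minimal per-entry TTL cache: each entry carries its own expiry time,
// so stale entries are evicted on access instead of the whole cache
// being flushed at the end of the day.
class TTLCache {
  constructor(ttlMs) {
    this.ttlMs = ttlMs;
    this.store = new Map();
  }
  set(key, value) {
    this.store.set(key, { value, expires: Date.now() + this.ttlMs });
  }
  get(key) {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (Date.now() > entry.expires) {
      this.store.delete(key); // evict the stale entry on access
      return undefined;
    }
    return entry.value;
  }
}

const cache = new TTLCache(60 * 1000); // entries live for one minute
cache.set('page:/home', '<html>…</html>');
console.log(cache.get('page:/home') !== undefined); // fresh entry found
```

With per-entry expiry, the cache never needs a scheduled cleanup pass, and a traffic spike can't pile up hours of stale data waiting for midnight.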
After that experience, I decided to rewrite the entire site from scratch with a focus on performance. When it was finished, the SECOND time it hit the Reddit front page, the site did not even blink. Perfect performance. Zero complaints. This is on the same $15/mo hosting I use for all my sites.
How did I do this? It was a very simple architectural decision. I designed the system so it does not matter if 1 person or 1,000 people connect and use the site. Here is how I did it:
The idea is very simple. Instead of having 1 CPU doing 1,000 calculations (my server), I now have 1,000 CPUs doing 1 calculation each (the users’ web browsers). The only thing my server does now is decide which files to give to the browser. When finished, the browser disconnects, and my server can move on to the next user. Apache is extremely efficient at this task.
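To make the idea concrete, here is a toy sketch of the pattern: the server ships this script as a plain static file, and every visitor's browser runs the calculation itself. The function is purely hypothetical (the original post doesn't say what the site computes); the point is that nothing here executes on the server:

```javascript
// Illustrative only: the server serves this file as-is. Each visitor's
// browser does the work, so 1,000 users means 1,000 CPUs each doing one
// calculation, instead of one server CPU doing 1,000.
function monthlyPayment(principal, annualRate, years) {
  // Standard amortization formula, evaluated client-side.
  const r = annualRate / 12;
  const n = years * 12;
  return (principal * r) / (1 - Math.pow(1 + r, -n));
}

// In a real page this would be wired to a form's submit handler;
// here we just run it once.
const payment = monthlyPayment(200000, 0.06, 30);
console.log(payment.toFixed(2));
```

The server's only job is to hand over the static HTML and JavaScript; all the per-user CPU cost lands on the user's own machine.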
PHP is great for dynamic content, but it is dog-slow. Try to avoid it if your site is 90% static.
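One way to act on that advice, assuming an Apache setup like the one described above, is to let Apache serve the static files directly with cache headers, keeping PHP out of the request path entirely. This is a hypothetical snippet, not the author's actual configuration:

```apache
# Hypothetical sketch: serve pre-built static files straight from Apache
# and let browsers cache them, so PHP never runs for static requests.
<IfModule mod_expires.c>
    ExpiresActive On
    ExpiresByType text/html "access plus 1 hour"
    ExpiresByType text/css "access plus 1 week"
    ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```

Once the browser has cached the files, repeat visitors may not even hit the server at all.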