Complexity and web document delivery.

On the drive home from picking up our kids from my parents' house tonight, I recalled an exercise we went through at AOL a year-and-a-half ago or so. We were in the midst of an SEO fire-drill to ensure our dynamically generated web content was as crawlable and indexable as it could be. Someone asked the obvious question: "how many pages/documents do we have?" It took about three weeks to come up with an estimate that could be reasonably explained and believed; give or take roughly 100 million documents, AOL had approximately 500 million pages it could potentially serve in response to an HTTP request.

While that number, whatever it actually was, has changed dramatically over the past year and a half, it's an interesting one. In the world of highly dynamic content, the number of pages that can be served is effectively infinite. That poses some interesting scaling problems (caching "dynamic" content, for example) as well as product problems: from a productivity standpoint, users don't want to interact with randomly generated content when they're trying to find something (StumbleUpon is entertainment for the most part).

Drawing the line between what should be expected contextual structure (the header at the top of a Google search results page, for example) and variable content is an art form.

I'm rambling... I just thought it was interesting to compare and contrast those 100-million-document-level product and technical challenges with Me.dium's relatively small website. Designing a tight, compelling, intuitive product is a fascinating challenge no matter what the size.

Jud Valeski
