"Let's adjust the learning parameter to .00125 and the momentum factor to .022." "Sure thing, but we'd better run that by Legal first." There's a recipe for success.
Perhaps in 40-50 years there will be a stable version of the World Wide Web that it makes sense to clamp down on with regulation, including search neutrality. Right now, however, the web is changing far too fast.
Looking backward, imagine what kind of standards would have been written for web search back when AltaVista was the best. What are the odds that a government effort would have taken a reasonable approach to ranking pages according to linkage patterns? How about deciding what counts as a keyword? Is Zeitgeist one word or two? How about the "did you mean" results? If such an attempt had succeeded, we wouldn't even know about these innovations today. They would have been squashed by regulation before anyone could try them.
Looking forward, imagine all the ways the web might change in the coming decade. What if more of the web moves into social spaces with severe privacy needs, such as Facebook and Orkut? What if more of the content uses rich media, as physical magazines do, and becomes harder to describe in plain text? What if things go the other way, and the web's information becomes a scattering of little sentences that are glued together on the fly for a particular user's settings?
The search neutrality project can only impoverish our Internet. I'm not clear on why the American government has jurisdiction over the "World" Wide Web, but to the extent they do, I hope they do the right thing and just go take a nap.