Tuesday, June 29, 2010

Wrapping code is slow on Firefox

UPDATE: Filed as bug 576630 with Mozilla. It would be great if this slowdown could be removed, because wrapping chunks of code in a function wrapper is a widely useful tool to have available.

I just learned, to my dismay, that adding a single layer of wrapping around a body of JavaScript code can cause Firefox to really slow down. That is, there are cases where the following code takes a second to load:

statement1
statement2
...
statement1000

Yet, the following equivalent code takes 30+ seconds to load:

(function() {
statement1
statement2
...
statement1000
})()


This is disappointing, because wrapping code inside a function is a straightforward way to control name visibility. If this code defines a bunch of new functions and vars, you might not want them all to be globally visible throughout a browser window. Yet, because of this parsing problem on Firefox, simply adding a wrapper function might not be a good idea.
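
Concretely, a top-level var or function declaration in a browser script becomes a property of window, while the same declarations inside a wrapper stay local to the wrapper. A tiny illustration (the names here are made up for the example):

var counter = 0;                  // at top level, this becomes window.counter
function bump() { counter++; }    // and this becomes window.bump

(function() {
  var counter = 0;                // inside a wrapper, these names are local
  function bump() { counter++; }  // and invisible to the rest of the page
})();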

After some investigation, it appears the problem only arises when a lot of functions are defined directly inside a single enclosing function. Adding another layer of wrapping gets rid of the parse-time problem. That is, the following parses very quickly:


(function() {
(function() {
statement1
...
statement10
})()
...
(function() {
statement991
...
statement1000
})()
})()


Of course, to use this approach, you have to make sure that the cross-references between the statements still work. In general this requires modifying the statements to install and read properties on some object that is shared among all the chunks.
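
For instance, if one chunk defines a function that a later chunk calls, that function can no longer be a plain local variable of the outer wrapper; it has to be installed on an object that both chunks can see. A minimal sketch of the idea (the shared object and the names on it are mine, not anything the generated code actually uses):

(function() {
  var shared = {};  // visible to every chunk, but not to the page's global scope

  (function(ns) {
    // Chunk 1 installs its definitions on the shared object...
    ns.greet = function(name) { return "hello, " + name; };
  })(shared);

  (function(ns) {
    // ...and chunk 2 reads them back off the same object.
    alert(ns.greet("world"));
  })(shared);
})();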

Example Code and Timings



I wrote a Scala script named genslow.scala that generates two files: test.html and module.html. Load the first page in Firefox, and it will load the second file into an iframe. Once all the code has loaded, an alert pops up reporting how long the load took.
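
I won't reproduce the generated pages here, but the timing side of test.html amounts to something like the following sketch (my own reconstruction from the description above, not the script's actual output):

var start = new Date().getTime();
var frame = document.createElement("iframe");
frame.onload = function() {
  // Fires once module.html has finished loading and parsing its scripts.
  alert("module.html loaded in " + (new Date().getTime() - start) + " ms");
};
frame.src = "module.html";
document.body.appendChild(frame);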

There are three variables at the top of the script that can be used to modify module.html. On my machine, I get the following timings:

default: 1,015 ms
jslink: 1,135 ms
wrapper: 34,288 ms
wrapper+jslink: 52,078 ms
wrapper+jslink+chunk: 1,188 ms

The timings are from Firefox 3.6.3 on Linux. The table reports only the first trial, but the pattern is robust across repeated reloads.

Wednesday, June 23, 2010

Mass Effect for Xb^H^H Windows

I just got Mass Effect for Windows, but after reading the README file, I fear for the computer it will be installed on:

Game Known Issues
-----------------
In Mass Effect you will occsaionally find elevators that connect different
locations. While riding in an elevator the game is loading significant amounts
of information and modifying data. We recommend against saving the game after
an elevator is activated until the player departs the elevator. Saving during
elevator trips can occasional cause unusual behaviors.

Okay, I can see that being hard to fix. Load/save systems are often tricky, and being between zones would only make it worse. It goes on, though:

Mass Effect does not run on a system using a GMA X3000 video card, a general
protection fault error appear after double clicking the start icon.

Um, wow. That's it? It just doesn't work if you have this card?

Mass Effect does not run optimally on the Sapphire Radeon x1550 series of video
cards. We recommend that Mass Effect is not played on a system with this video
card.

Or that one?

Mass Effect does not run optimally on the NVIDIA GeForce 7100 series of video
cards. We recommend that Mass Effect is not played on a system with this video
card.

That one, either? Methinks they should list the cards it does work with, and on the box, not in a README file.

Mass Effect does not run optimally on a computer with a Pentium 4 CPU with a
FSB below 800 MHz under Windows Vista. We recommend that Mass Effect is not
played on a system with this CPU and operating system combination.

Err, okay. This kinda goes along with "minimum system requirements".

The the NVIDIA 8800 Series of video cards can require significant time (30
seconds or more) to change resolutions. This is due to a required
recalculation of thousands of video shaders.

"Required". As if they couldn't have precomputed shaders for the 10-20 most common resolutions. As if any other game has this problem.

After reading this, I wasn't confident. Sure enough, I get a General Protection Fault on startup. As extra weirdness, it reports a "file not found" exception from within some graphics library.

Overall, I guess the developers made the Xbox version first and then made a half-hearted attempt to port it to Windows. If I'd realized how flaky it is, I probably would have passed on it.

Friday, June 18, 2010

Commutative? Associative? Idemflabitical?!

Guy Steele has a noble goal in mind for the Fortress library:

I have noticed that when I give talks about Fortress and start talking about the importance of algebraic properties for parallel programming, I often see many pairs of eyes glaze over, if only a bit. It occurs to me that maybe not everyone is conversant or comfortable with the terminology of modern algebra (maybe they had a bad experience with the New Math back in the 1960s, a fate I barely escaped), and this may obscure the essential underlying ideas, which after all are not that difficult, and perhaps even obvious to the average programmer when explained in everyday terms.


It's a good goal. Using technical terms often obscures the real point one is trying to make. Worse, technical terms often dress up a claim to sound like it says much more than it does. For all kinds of reasons, it is better to use well-known informal terms whenever they will work.

Nonetheless, I am not so sure about changing terms like commutative and associative in the Fortress library. It looks like a case where the hard part is not in the terminology, but in the underlying theory itself. Once a programmer understands the theory well enough to work with the library, they'll almost certainly know the standard formal terms anyway.
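
To make the underlying idea concrete, the reason a parallel-programming library cares about a property like associativity is that it is exactly what lets a reduction be computed in independent pieces and then recombined. A toy sketch, with explicit chunking standing in for real parallel execution:

// Reduce each chunk independently, then combine the partial results.
// Regrouping the work this way gives the same answer as a sequential
// reduce only when the operation is associative.
function chunkedReduce(items, op, identity, chunkSize) {
  var partials = [];
  for (var i = 0; i < items.length; i += chunkSize) {
    var acc = identity;
    for (var j = i; j < Math.min(i + chunkSize, items.length); j++) {
      acc = op(acc, items[j]);
    }
    partials.push(acc);
  }
  return partials.reduce(op, identity);
}

var nums = [1, 2, 3, 4, 5, 6, 7, 8];
var add = function(a, b) { return a + b; };
var sub = function(a, b) { return a - b; };
chunkedReduce(nums, add, 0, 3);  // 36, same as a sequential reduce
chunkedReduce(nums, sub, 0, 3);  // 36, but a sequential reduce gives -36:
                                 // subtraction is not associative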

A similar issue comes up with Scala, where writers of very flexible libraries end up working with rather complicated static types. In such cases, there is no getting around understanding how the Scala type system works. The Scala collections library is deep mojo.

That doesn't mean the complexity must leak out to users of the library, however. In both cases, the designers of core libraries must think hard, because they are working with deep abstractions. If the designers do well, then users of these libraries will find everything working effortlessly.

Saturday, June 12, 2010

Pinker defends electronica

Steven Pinker just wrote a great defense of electronic communication. The highlight for me:

For a reality check today, take the state of science, which demands high levels of brainwork and is measured by clear benchmarks of discovery. These days scientists are never far from their e-mail, rarely touch paper and cannot lecture without PowerPoint. If electronic media were hazardous to intelligence, the quality of science would be plummeting. Yet discoveries are multiplying like fruit flies, and progress is dizzying.

The same can be said for software engineering. Email and instant messaging give huge productivity increases. In a nutshell, they help people work together.

On the other hand, I don't agree with this part:

The effects of consuming electronic media are also likely to be far more limited than the panic implies. Media critics write as if the brain takes on the qualities of whatever it consumes, the informational equivalent of “you are what you eat.” As with primitive peoples who believe that eating fierce animals will make them fierce, they assume that watching quick cuts in rock videos turns your mental life into quick cuts or that reading bullet points and Twitter postings turns your thoughts into bullet points and Twitter postings.


I believe that what you spend your time mentally consuming strongly affects how you think about things and how you come at new things you encounter. However, we shouldn't blame the media, but the content. Watching Bruno and watching the Matrix get a person thinking about entirely different things, but they come through the same medium. Likewise, reading Fail Blog and reading Metamodern put the mind in entirely different places, even though they're both blogs.

Saturday, June 5, 2010

Evidence from successful usage

One way to test an engineering technique is to see how projects that tried it have gone. If the project fails, you suspect the technique is bad. If the project succeeds, you suspect the technique is good. It's harder than it sounds to make use of such information, though. There are too few projects, and each one has many different peculiarities. It's unclear which peculiarities led to the success or the failure. In a word, these experiments are natural rather than controlled.

One kind of information does shine through from such experiments, however. While they are poor at comparing or quantifying the value of different techniques, they at least let us see which techniques are viable. A successful project requires that all of the techniques used are at least tolerable, because otherwise the project would have fallen apart. Therefore, whenever a project succeeds, all the techniques it used must at least be viable. Those techniques might not be good, but they must at least not be fatally bad.

This kind of claim is weak, but the evidence for it is very strong. Thus I'm surprised how often I run into knowledgeable people saying that this or that technique is so bad that it would ruin any project it was used on. The most common example is that people love to say dynamically typed languages are useless. In my mind, there are too many successful sites written in PHP or Ruby to believe such a claim.

Even one successful project tells us a technique is viable. What if there are none? This question doesn't come up very often. If a few people try a technique and it's a complete stinker, they tend to stop trying, and they tend to stop pushing it. Once in a while, though....

Once in a while there's something like synchronous RPC in a web browser. The technique certainly gets talked about. However, I've been asking around for a year or two now, and I have not yet found even one good web site that uses it. Unless and until that changes, I have to believe that synchronous RPC in the browser isn't even viable. It's beyond awkward. If you try it, you won't end up with a site you feel is launchable.
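
For concreteness, synchronous RPC in the browser boils down to something like this use of the standard XMLHttpRequest API (the URL is just a placeholder):

// Passing false as the third argument to open() makes send() block
// until the response arrives; the page cannot repaint or respond to
// input while it waits.
var xhr = new XMLHttpRequest();
xhr.open("GET", "/some/service", false);
xhr.send(null);
var result = xhr.responseText;

That frozen UI during every call goes a long way toward explaining why.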

Friday, May 28, 2010

"Free" as in not really

Apple is apparently going to start rejecting GPL apps, but the reason isn't what I expected.

The sticking point was that the App Store's terms of service says that a piece of software downloaded from the store can only be used on five devices. But the FSF said that the terms of service impose numerous legal restrictions on the use and distribution of GNU Go that are forbidden by GPLv2 section 6:


So, the FSF considers it not redistributable enough that an application is available for free via the iPhone store. The quibble is roughly that the receiver of the software should be able to redistribute the code further for free, as opposed to just being told they can download it themselves from the App Store.

In general, the GNU license isn't all that "free" in any common definition of the word. It seems pretty darned free to me if anyone who has an iPhone at all is able to download the software, run it all they like, and even go grab the source code. It's hard for me to see this as anything other than the FSF trying to get negotiating leverage and make itself more important. The problem is that their efforts to gain power involve steps that run against their mission. To promote free software, they're seeking power, and to seek power, they're blocking the distribution of free software.

Open source is not for everyone. However, if you really want to give away software, it seems to me it should be given away in some simple, intuitive way. Either public domain it, or, if that seems too radical, use a clearly free license such as the MIT license.

Wednesday, May 19, 2010

Microsoft's revenge

CNET reports that Microsoft is suing an obscure company I have never heard of for stealing Microsoft's patented ideas. The patents in question?

The patents cover a variety of back-end and user interface features, ranging from one covering a "system and method for providing and displaying a Web page having an embedded menu" to another that covers a "method and system for stacking toolbars in a computer display."


These patents are about routine programming, not about novel ideas that deserve over a decade of exclusive use. They shouldn't have even been granted. Yet not only have they been granted, but similarly groundless patents have been upheld in the past. Who knows? Maybe this case will hold up, too.

I challenge anyone to read up on how patents are supposed to help the public, and then compare that to how they are actually working.