Profiling information is used to compute code coverage and a clustering strategy. The coverage data determines which functions are stubbed out and which are not, and the clustering strategy groups functions into batches, called clusters, that are downloaded together.
I tentatively believe this approach can produce a somewhat reasonable user experience. It has some unfortunate problems, though, because it is unpredictable which functions end up stubbed out. Any call to a stubbed-out function, whichever ones the compiler chooses, will result in a blocking network operation to download more code. Whenever this happens:
- Unrelated parts of the application are also paused, not just the parts that need the missing code.
- There is no way for the programmer to sensibly deal with a download failure. The entire application has to halt.
- There is no way to give a status update to the user indicating that a download is in progress.
These problems are fundamental. In practice, matters are even worse. On many browsers, a synchronous network download will lock up the entire browser, including tabs other than the one that issued the request. Locking an entire browser does not make for a good user experience. It does not make people feel like they are in good hands when they visit a web site.
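The blocking behavior described above can be sketched as follows. This is an illustrative stand-in, not code from any real system: `StubbedFunction` is hypothetical, and the sleep simulates the synchronous network fetch that downloads the missing code.

```java
// Sketch of the problem: a call to a stubbed-out function blocks the
// caller until the missing code has been downloaded. The names and the
// sleep are illustrative stand-ins for a synchronous network fetch.
class StubbedFunction {
    static void call() {
        downloadMissingCode();  // synchronous: nothing else runs meanwhile
        // ... run the real function body once it has arrived ...
    }

    static void downloadMissingCode() {
        try {
            Thread.sleep(200);  // stand-in for a blocking network download
        } catch (InterruptedException e) {
            throw new RuntimeException(e);
        }
    }
}

public class BlockingDemo {
    public static void main(String[] args) {
        System.out.println("calling a stubbed-out function...");
        StubbedFunction.call();
        // This line is reached only after the download has finished;
        // everything in between, the caller was frozen.
        System.out.println("resumed after the download completed");
    }
}
```

Nothing in the caller can observe or recover from the stall: there is no place to report progress and no place to handle failure, which is exactly the problem listed above.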
GWT avoids these problems by insisting that the application never blocks on a code download. A request for a code download returns immediately. If a network download is required, then its success or failure is indicated with an asynchronous callback. Until that callback is triggered, the rest of the application keeps running.
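The callback style just described can be sketched in Java. `LoadCallback` and `CodeLoader` below are illustrative stand-ins, not GWT's actual API, and the sleep simulates network latency; the point is only that the load request returns immediately and the outcome arrives asynchronously.

```java
import java.util.concurrent.CountDownLatch;

// Hypothetical callback interface: success or failure of the download
// is reported asynchronously, so the caller can handle either case.
interface LoadCallback {
    void onSuccess();
    void onFailure(Throwable reason);
}

class CodeLoader {
    // Returns immediately; the (simulated) download completes on
    // another thread and then triggers the callback.
    static void runAsync(LoadCallback callback) {
        new Thread(() -> {
            try {
                Thread.sleep(50);      // simulate network latency
                callback.onSuccess();  // code arrived: run deferred logic
            } catch (InterruptedException e) {
                callback.onFailure(e);
            }
        }).start();
    }
}

public class AsyncDemo {
    public static void main(String[] args) throws InterruptedException {
        CountDownLatch done = new CountDownLatch(1);
        CodeLoader.runAsync(new LoadCallback() {
            public void onSuccess() {
                System.out.println("deferred code is now available");
                done.countDown();
            }
            public void onFailure(Throwable reason) {
                // Unlike the blocking scheme, failure is something
                // the programmer can actually respond to.
                System.out.println("download failed: " + reason);
                done.countDown();
            }
        });
        // The rest of the application keeps running while the
        // download is pending.
        System.out.println("still responsive while downloading");
        done.await();
    }
}
```

Because the request returns at once, the application can show a status indicator while the download is in flight, and the `onFailure` branch gives the programmer a sensible place to handle a failed download.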