From ye olde Surfin' Safari:

One of the things we did was to optimize within opcodes. Many JavaScript operations are highly polymorphic - they have different behavior in lots of different cases. Just by checking for the most common and fastest cases first, you can speed up JavaScript programs quite a bit.
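
If I'm reading that right, the idea is just to order the type checks so the overwhelmingly common case pays the least. Here's a toy sketch of what an "add" opcode might look like inside an engine. This is made-up C++ for illustration, not WebKit's actual code: the JSValue representation and the opAdd/addSlowCase split are my own inventions, and real engines use tagged pointers or NaN-boxing rather than a variant.

```cpp
#include <cstdint>
#include <iostream>
#include <limits>
#include <string>
#include <variant>

// Toy value representation; real engines use tagged pointers or NaN-boxing.
using JSValue = std::variant<int32_t, double, std::string>;

// Slow path: the rare, expensive cases (doubles, string concatenation).
static JSValue addSlowCase(const JSValue& a, const JSValue& b) {
    auto toString = [](const JSValue& v) -> std::string {
        if (auto* s = std::get_if<std::string>(&v)) return *s;
        if (auto* i = std::get_if<int32_t>(&v)) return std::to_string(*i);
        return std::to_string(std::get<double>(v));
    };
    if (std::holds_alternative<std::string>(a) ||
        std::holds_alternative<std::string>(b))
        return toString(a) + toString(b);  // JS: number + string concatenates

    auto toDouble = [](const JSValue& v) -> double {
        if (auto* i = std::get_if<int32_t>(&v)) return double(*i);
        return std::get<double>(v);
    };
    return toDouble(a) + toDouble(b);
}

// The "add" opcode: test the most common, cheapest case before anything else.
JSValue opAdd(const JSValue& a, const JSValue& b) {
    if (auto* x = std::get_if<int32_t>(&a))
        if (auto* y = std::get_if<int32_t>(&b)) {
            int64_t sum = int64_t(*x) + int64_t(*y);
            if (sum >= std::numeric_limits<int32_t>::min() &&
                sum <= std::numeric_limits<int32_t>::max())
                return int32_t(sum);       // fast path: int + int, no overflow
        }
    return addSlowCase(a, b);              // everything else falls through
}

int main() {
    std::cout << std::get<int32_t>(opAdd(2, 3)) << "\n";                      // 5
    std::cout << std::get<double>(opAdd(1.5, 2)) << "\n";                     // 3.5
    std::cout << std::get<std::string>(opAdd(std::string("n="), 42)) << "\n"; // n=42
}
```

The point is the order of the branches: the int + int test runs first because it's both the cheapest check and the most common case in real programs, and everything weird gets shunted off to the slow path.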
How much of the speed improvement comes at the [admittedly, probably unnoticeable] expense of a few free processor cycles? That is, has the browser switched programming paradigms: from one where applications request processor time only when they "really need it", and spin like crazy then, to one where they exploit what looks like downtime to perform operations that aren't strictly needed yet, in exchange for speed the user will only perceive later?

Or, to put this yet another way: are applications becoming more selfish with the processor when nobody's looking? I've noticed that Chrome is pretty processor-intensive in the background. Much of that seems to be related to Flash ads, but why doesn't a Chrome window put itself to sleep when it isn't visible?

The move would mirror the way programs have increasingly become RAM hogs, in ways they never would have if RAM hadn't become cheap and plentiful (thanks, Steve). Moore's Law has mattered less and less of late, pushed along only by games, Vista, and iMovie, and it seems everyone is starting to jump on SETI@Home's excess-processor bandwagon.

Or I could have completely misunderstood what Hyatt was saying. Likely the latter.