I was watching an excellent video describing the iterations of the Angular compiler, and rabbit-holed a little on hidden classes.

The most enjoyable resource I found on this (and JavaScript optimization in general) was from mrale.ph:

[Falling back to the runtime to bullheadedly access properties from objects, without any context from our code,] is an absolutely valid way to implement property lookup, however it has one significant problem: if we pit our property lookup implementation against those used in modern JS VMs we will discover that it is far too slow.

Our interpreter is amnesiac: every time it does a property lookup it has to execute a generic property lookup algorithm, it does not learn anything from the previous attempts and has to pay full price again and again. That's why performance oriented VMs implement property lookup in a different way.

What if each property access in our program was capable of learning from objects that it saw before and apply[ing] this knowledge to similar objects? Potentially that would allow us to save a lot of time by avoiding [the] costly generic lookup algorithm and instead use a quicker one that only applies to objects of certain shape.

…

This optimization technique is known as Inline Caching and I have written about it before.

[emph and bracketed paraphrase mine]

It's worth a full read. And once you've got how hidden classes, polymorphism, and megamorphism work, you could probably arrive at exactly the same compiler optimization steps Angular's Tobias Bosch walks through in his video, above.
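To make that concrete before moving on, here's a minimal sketch in plain JavaScript (the names makePoint and getX are mine, not from the article): objects built with the same properties in the same order share a hidden class, so a property access that only ever sees that one shape can stay on the cached fast path.

```js
// Objects created with the same property order end up with the same
// hidden class (shape), so the engine can cache where `x` lives.
function makePoint(x, y) {
  return { x, y };            // every call here produces the same shape {x, y}
}

function getX(p) {
  return p.x;                 // this load is a single inline-cache site
}

// Feeding getX only one shape keeps that site monomorphic (the fast case).
let sum = 0;
for (let i = 0; i < 100000; i++) {
  sum += getX(makePoint(i, i + 1));
}

// Same properties, different insertion order: a *different* shape, so the
// same call site now has to handle two shapes (it becomes polymorphic).
sum += getX({ y: 2, x: 1 });
console.log(sum);
```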


Here's a quick bit on poly/mega/morphism from the same source, as I once again save you from googling, one resource at a time.

If we continue calling f with objects of different shapes its degree of polymorphism will continue to grow until it reaches a predefined threshold - maximum possible capacity for the inline cache (e.g. 4 for property loads in V8) - at that point [the] cache will transition to a megamorphic state.
...
In V8 megamorphic ICs can still continue to cache things but instead of doing it locally they will put what they want to cache into a global hashtable. This hashtable has a fixed size and entries are simply overwritten on collisions.

It's duck typing, all the way down, until you have too many ducks, at which point we default to a home-rolled bird almanac.
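To see that threshold in code, here's a sketch of a single load site being fed more shapes than a polymorphic inline cache will hold (readFoo and the object literals are mine for illustration; script can't observe the IC state directly, so this only demonstrates the access pattern).

```js
function readFoo(obj) {
  return obj.foo;             // one inline-cache site shared by every call below
}

// Five distinct shapes: same `foo`, different sets of extra properties.
const samples = [
  { foo: 1 },
  { foo: 2, a: 0 },
  { foo: 3, a: 0, b: 0 },
  { foo: 4, a: 0, b: 0, c: 0 },
  { foo: 5, a: 0, b: 0, c: 0, d: 0 },
];

// Once the site has seen more shapes than the cache's capacity (4 for
// property loads in V8, per the quote), it transitions to the megamorphic
// state and falls back to the shared global hashtable described above.
for (const obj of samples) {
  console.log(readFoo(obj));
}
```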
