
Lab 3b - JVM Interpreter / Optimization


Exercise 1 - speculative optimizations

In JavaScript, the qualifier of a function call can come from two places: either from a value on the stack (often a local variable) or from the global object.
If the qualifier value comes from the global object, we can try to use an optimization technique known as speculative optimization.
The idea is that, at a given call site, the qualifier may stay the same, for example when the function print is called in a loop. In that case, it is better to organize the generated code with a fast path and a slow path.
The fast path quickly checks that the function is print, so print can be called directly. The slow path does the normal, slow call.

The java.lang.invoke API has been designed especially to solve this kind of problem: if the method handles are constant, the JIT can compile a whole tree of method handles directly to assembly code.
Here the idea is to implement a speculative optimization, i.e. to install a guard in front of the call that checks that the function value taken as first argument (the value of the qualifier) does not change. As long as the function value does not change, the fast path can call it directly. If the function value changes at some point, the function is called through the slow path and the guard is removed, so all subsequent calls will no longer check whether the function is constant or not.
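
As a rough illustration, such a guard can be built with MethodHandles.guardWithTest. The sketch below is not the solution of the exercise: it assumes that the call site type is (qualifier, receiver, args...) -> Object like in the example further down, that JSObject exposes getMethodHandle() as hinted in that example, and the names checkQualifier, CHECK_QUALIFIER and speculativeGuard are made up.
  // A sketch only: build a guard that speculates that the qualifier stays the same function
  private static final MethodHandle CHECK_QUALIFIER;
  static {
    var lookup = MethodHandles.lookup();
    try {
      CHECK_QUALIFIER = lookup.findStatic(lookup.lookupClass(), "checkQualifier",
          methodType(boolean.class, Object.class, Object.class));
    } catch (NoSuchMethodException | IllegalAccessException e) {
      throw new AssertionError(e);
    }
  }
  
  private static boolean checkQualifier(Object expected, Object qualifier) {
    return expected == qualifier;  // cheap identity check on the function value
  }
  
  private static MethodHandle speculativeGuard(Object qualifier, MethodType type, MethodHandle slowPath) {
    var jsObject = (JSObject) qualifier;
    // fast path: call the speculated function directly; asType() comes first because
    // built-in functions are varargs collectors, then the qualifier parameter is added back
    var fastPath = MethodHandles.dropArguments(
        jsObject.getMethodHandle().asType(type.dropParameterTypes(0, 1)),
        0, Object.class);
    // test: compare the incoming qualifier with the function recorded here
    var test = MethodHandles.insertArguments(CHECK_QUALIFIER, 0, qualifier);
    // as long as the test succeeds the fast path runs, otherwise the slow path takes over
    return MethodHandles.guardWithTest(test, fastPath, slowPath);
  }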

Because we need to be able to change the target of the call site if the speculative optimization doesn't work (here, if the qualifier turns out not to be constant), we will use a MutableCallSite instead of a ConstantCallSite.
Here is an example of how to use a MutableCallSite:
  ...
  public static CallSite bsm_funcall(Lookup lookup, String name, MethodType type) {
    return new InliningCache(type);
  }
  
  private static class InliningCache extends MutableCallSite {
    private static final MethodHandle SLOW_PATH;
    static {
      var lookup = MethodHandles.lookup();
      try {
        SLOW_PATH = lookup.findVirtual(InliningCache.class, "slowPath", methodType(Object.class, Object.class, Object.class, Object[].class));
      } catch (NoSuchMethodException | IllegalAccessException e) {
        throw new AssertionError(e);
      }
    }
    
    public InliningCache(MethodType type) {
      super(type);
      // start in the slow path: bind slowPath to this call site and collect the trailing
      // arguments into an Object[] so that the type matches the call site type
      setTarget(SLOW_PATH.bindTo(this).asCollector(Object[].class, type.parameterCount() - 2));
    }
    
    private Object slowPath(Object qualifier, Object receiver, Object[] args) {
      // generic call: always correct, but goes through JSObject.invoke at each call
      var jsObject = (JSObject)qualifier;
      return jsObject.invoke(receiver, args);
      
      // var jsObject = (JSObject)qualifier;
      // var mh = jsObject.getMethodHandle();
    }
  }
  ...
     

  1. What does the code above do?
    Does it implement the speculative optimization?
  2. Modify the code to install a speculative guard that checks that the function value does not change. If the guard fails, install a new guard with the new qualifier.
    Note: built-in functions like print are special because they accept an unlimited (256 in practice) number of arguments. They are varargs collectors and can adapt themselves to several signatures. The adaptation is done by calling asType() before any other method (see the sketch after this list).
  3. Verify that the test callAUserDefinedFunctionWithTheWrongNumberOfArguments passes.
  4. How can we create a bi-morphic inlining cache, i.e. a cache that installs at most 2 guards with two different qualifiers before giving up and using asCollector + asType for all calls?
    Modify the implementation to implement such a bi-morphic inlining cache.
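
For the note of question 2 above, the sketch below shows the kind of adaptation meant by "calling asType() before any other methods"; the exact type of the method handle returned by getMethodHandle() is an assumption.
  // A sketch: built-ins like print are variable arity adapters (varargs collectors),
  // so asType() adapts them to the exact arity of the call site; it has to be called
  // before other combinators (dropArguments, bindTo, ...) because those return fixed
  // arity handles and the arity adaptation would be lost.
  var mh = jsObject.getMethodHandle();                     // e.g. (Object, Object[]) -> Object, varargs
  var adapted = mh.asType(type.dropParameterTypes(0, 1));  // now (receiver, arg0, ..., argN) -> Object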

Exercise 2 - speculative optimizations without a check

Instead of checking at each call whether the qualifier has changed, we can use a push approach instead of a pull approach.
The idea is to notify the method handles when something the speculative optimization relies on no longer holds.
The package java.lang.invoke has a class SwitchPoint that can switch off all the guards created on it when the assumption behind a speculative optimization is no longer true.
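
As a minimal, self-contained illustration of the mechanism (this is not lab code, it only uses java.lang.invoke.MethodHandles and java.lang.invoke.SwitchPoint):
  public static void main(String[] args) throws Throwable {
    var fastPath = MethodHandles.constant(String.class, "fast");
    var slowPath = MethodHandles.constant(String.class, "slow");
    var switchPoint = new SwitchPoint();
    var guarded = switchPoint.guardWithTest(fastPath, slowPath);
    
    System.out.println(guarded.invoke());   // "fast": no check is done at each call
    SwitchPoint.invalidateAll(new SwitchPoint[] { switchPoint });
    System.out.println(guarded.invoke());   // "slow": every guard created on the switch point is switched off
  }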

  1. Let's suppose that the environment cannot be modified; in that case, we can do better!
    The Rewriter knows whether a function call is a constant one or not, so instead of doing a speculative optimization, it is better to separate the two kinds of function calls by teaching the Rewriter to emit calls to two different invokedynamic bootstrap methods, depending on whether the function call is constant or not. The bootstrap method corresponding to a constant function call can then do the lookup itself, so the constant function is available at bootstrap time, and no guard is needed to know whether the function value is constant or not.
    Note: for a non-constant function call, the speculative code can be kept, because even if the function value comes from a value on the stack, it can still be considered a constant when the code is always called with the same function. For example, the following code uses a non-constant function call with a function that never changes (in this snippet):
      function foo(bar) {
        bar();
      }
    
      function baz() {
        print('baz');
      }
    
      foo(baz);
      foo(baz);
            
  2. Modify JSObject.java to add a field containing a SwitchPoint that is invalidated and replaced by a new one each time the method register is called (a sketch is given after this list).
    How can we use this SwitchPoint to invalidate the method handles each time the global object changes?
  3. Implement the new strategy.
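
To make question 2 more concrete, here is a possible shape for the change in JSObject.java. It is only a sketch: the exact signature of register and the accessor name getSwitchPoint are assumptions.
  // In JSObject.java (a sketch): one SwitchPoint per object, invalidated and renewed
  // each time register() (re)defines a property on this object
  private SwitchPoint switchPoint = new SwitchPoint();
  
  public SwitchPoint getSwitchPoint() {   // hypothetical accessor used by the bootstrap methods
    return switchPoint;
  }
  
  public void register(String key, Object value) {
    // ... existing code that stores the property ...
    SwitchPoint.invalidateAll(new SwitchPoint[] { switchPoint });  // switch off all call sites that
    switchPoint = new SwitchPoint();                               // speculated on the old binding
  }
A bootstrap method for a constant function call can then wrap its fast path with the SwitchPoint of the global object, for example switchPoint.guardWithTest(fastPath, fallback), and return a ConstantCallSite: nothing is checked at each call, and the call sites are deoptimized only when the global object changes.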