The main idea for this right now is to use `ExaModels` to generate the kernels. However, we still want to keep the front-end API as `JuMP` models, so we need some way of converting `JuMP` functions (which are written in terms of a single instance) to `ExaModels` structs that take batches as input.
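For concreteness, here is a hedged sketch of the kind of single-instance `JuMP` model the pipeline would take as input. The specific variables and constraints are hypothetical, purely for illustration:

```julia
using JuMP

# A single-instance model: one prediction in, one evaluation out.
# (Illustrative only -- models in actual L2O use cases will differ.)
model = Model()
@variable(model, x[1:2])
@constraint(model, unit_ball, x[1]^2 + x[2]^2 <= 1)
@objective(model, Min, (x[1] - 2)^2 + x[2]^2)
```

The conversion problem is then: given the functions in such a model, produce batched kernels that evaluate them over many instances at once.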
There already exists [InfiniteExaModels.jl](https://github.com/infiniteopt/InfiniteExaModels.jl), which can be used to generate an `ExaModel` from an `InfiniteOpt` model. That ends up being pretty close to what we need -- the "number of supports" in `InfiniteOpt` plays the role of our "batch size". However, I think it may be better to instead maintain our own single-instance `JuMP` function -> batched `ExaModels.SIMDFunction` pipeline. There are a few arguments for having our own pipeline:
- We'd have two fewer dependencies.
- If we use `InfiniteExaModels`, we either need to implement a `JuMP` -> `InfiniteOpt` conversion or force users to provide an `InfiniteOpt` model as input. That may not be as compatible with existing formulation libraries, though (a subset of) the `InfiniteOpt` API is actually quite nice for our purposes.
- It allows us greater flexibility in the structure of the generated `ExaModels.SIMDFunction`s, as well as in their evaluation. That can be important for e.g. Jacobians: we'd get huge block-diagonal ones using `InfiniteExaModels`, since `ExaModels` doesn't realize the batch instances are totally independent (in fact they are not, since they'd be tied together by an expectation in the objective). Similar story for the Hessian of the Lagrangian. Indexing can also be made nicer; under the `InfiniteExaModels` setup, we have to flatten the predictions and reshape the results.
- Doing the conversion at the function level rather than the model level aligns more closely with expected use cases in L2O. For example, the completion of the special cases in `L2ODLL` evaluates `JuMP` functions that are built during the decomposition. It is not obvious how we'd use `InfiniteExaModels` to generate kernels for those, but it is easy if we have a `JuMP` -> `SIMDFunction` pipeline.
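To make the batching idea concrete, here is a hedged sketch of how a batch dimension maps onto the `ExaModels` SIMD abstraction: the generator's iterate plays the role of the batch index (the analogue of `InfiniteOpt`'s supports). The constraint/objective expressions and batch size below are made up for illustration; this is not the proposed pipeline, just the target representation it would emit:

```julia
using ExaModels

N = 16            # hypothetical batch size
core = ExaCore()

# One copy of the decision variable per batch instance.
x = variable(core, N)

# The same single-instance constraint, replicated SIMD-style:
# each iterate i corresponds to one batch instance.
constraint(core, x[i]^2 - 1.0 for i in 1:N)

# Objective terms are summed over the batch, mimicking an expectation.
objective(core, (x[i] - 2.0)^2 for i in 1:N)

model = ExaModel(core)
```

Note this is exactly where the block-diagonal issue above shows up: `ExaModels` treats `i` as one flat SIMD dimension, so derivative structure across batch instances is not exploited unless we control how these `SIMDFunction`s are built and evaluated.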
I am prototyping some of this at https://github.com/klamike/BatchOptInterface.jl