Yes, the test you pointed to contains cases where an AOTInductor-compiled .so file can be wrapped with torch::CustomClassHolder
and thus works with an existing TorchScript-based inference solution. It is a showcase for users who currently rely on TorchScript for inference. If your inference system was not built on top of TorchScript, you can instead deploy the AOTInductor-generated .so file directly from plain C++.
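For the plain C++ path, here is a minimal sketch using the AOTIModelContainerRunnerCpu runner class. The model path "model.so" and the input shape {8, 10} are placeholders, and the exact header location and runner API may differ across PyTorch versions, so treat this as an illustration rather than a definitive recipe:

```cpp
// Minimal sketch: plain C++ inference with an AOTInductor-compiled .so.
// Assumes a PyTorch (libtorch) build that ships the AOTI runner headers;
// "model.so" is a placeholder for your compiled artifact.
#include <iostream>
#include <vector>

#include <torch/torch.h>
#include <torch/csrc/inductor/aoti_runner/model_container_runner_cpu.h>

int main() {
  // Inference-only mode: skips autograd bookkeeping.
  c10::InferenceMode mode;

  // Loads the AOTInductor-generated shared library and sets up the
  // model container for CPU execution.
  torch::inductor::AOTIModelContainerRunnerCpu runner("model.so");

  // The input shape {8, 10} is an assumption for illustration; use
  // whatever shapes the exported model actually expects.
  std::vector<torch::Tensor> inputs = {torch::randn({8, 10}, at::kCPU)};
  std::vector<torch::Tensor> outputs = runner.run(inputs);

  std::cout << outputs[0] << std::endl;
  return 0;
}
```

You would build this against libtorch as usual (linking torch, torch_cpu, and c10); the .so itself is loaded at runtime when the runner is constructed, so no TorchScript runtime is involved at all.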