Batchnorm training mode support in a minimal build #17103
Conversation
Thanks for the review @edgchen1 :)
This excludes batchnorm training mode support from standard onnxruntime releases; the regression was found in ort-nightly==1.16.0.dev20230821001. Can we revert that effect? There are exported inference models that run BatchNormalization specifically in training mode.
@edgchen1 @skottmckay @BowenBao under what conditions would you need the training batchnorm for inferencing?
AFAIK some models are set up to manipulate the running mean and running variance in particular ways. There is a separate issue, pytorch/pytorch#75252, discussing other export approaches, but it has no traction. It would be nice if the previous behavior could be restored for the 1.16 release. I hope the training minimal build would still include this op after #17270, so that the goal of this PR is not affected?
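To illustrate why some inference models run BatchNormalization in training mode: per the ONNX spec, when training_mode=1 the op normalizes with batch statistics and emits updated running statistics blended with a momentum factor. A minimal NumPy sketch of those semantics (function and variable names here are illustrative, not onnxruntime APIs):

```python
import numpy as np

def batchnorm_training_mode(x, running_mean, running_var, momentum=0.9, eps=1e-5):
    """Sketch of ONNX BatchNormalization with training_mode=1 (per-feature, axis 0 = batch)."""
    batch_mean = x.mean(axis=0)
    batch_var = x.var(axis=0)  # population variance, as in the ONNX spec
    y = (x - batch_mean) / np.sqrt(batch_var + eps)
    # Running statistics are updated and returned as extra outputs:
    new_running_mean = momentum * running_mean + (1.0 - momentum) * batch_mean
    new_running_var = momentum * running_var + (1.0 - momentum) * batch_var
    return y, new_running_mean, new_running_var
```

In inference mode, by contrast, the op only reads the stored running statistics; models that deliberately keep updating them at inference time are the case discussed above.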
Now that ONNX Runtime supports a minimal training build, BatchNormalization in training mode needs to be available even in a minimal build.
This PR removes the preprocessor macro BATCHNORM_INCLUDE_TRAINING_SUPPORT and replaces it with ENABLE_TRAINING_OPS.