
onnxruntime errors out due to ORT_ENABLE_BASIC optimization: Unexpected data type for Clip 'min' input of 11 #24158


Open
coffezhou opened this issue Mar 25, 2025 · 1 comment
Labels
stale issues that have not been addressed in a while; categorized by a bot

Comments


coffezhou commented Mar 25, 2025

Expected behavior

onnxruntime should run the model when using the optimization ORT_ENABLE_BASIC.

Actual behavior

When using the optimization ORT_ENABLE_BASIC, onnxruntime crashes.

Traceback (most recent call last):
  File "/home/carla/Documents/test_onnxruntime/0322/test.py", line 18, in <module>
    ort_session = onnxruntime.InferenceSession(
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/carla/anaconda3/envs/nnsmith-onnxruntime/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 472, in __init__
    self._create_inference_session(providers, provider_options, disabled_optimizers)
  File "/home/carla/anaconda3/envs/nnsmith-onnxruntime/lib/python3.11/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 544, in _create_inference_session
    sess.initialize_session(providers, provider_options, disabled_optimizers)
onnxruntime.capi.onnxruntime_pybind11_state.RuntimeException: [ONNXRuntimeError] : 6 : RUNTIME_EXCEPTION : Exception during initialization: /onnxruntime_src/onnxruntime/core/optimizer/relu_clip_fusion.cc:77 virtual onnxruntime::common::Status onnxruntime::FuseReluClip::Apply(onnxruntime::Graph&, onnxruntime::Node&, onnxruntime::RewriteRule::RewriteRuleEffect&, const onnxruntime::logging::Logger&) const Unexpected data type for Clip 'min' input of 11

To reproduce

Environment

OS: Ubuntu 20.04
onnxruntime: 1.22.0.dev20250320003

Steps to reproduce

This bug can be reproduced by the following code with the model in the attachment. The model passes onnx.checker.check_model. However, if ORT_ENABLE_BASIC is replaced with ORT_DISABLE_ALL, everything works fine.

import onnx
import onnxruntime

model_path = "model.onnx"
onnx_model = onnx.load(model_path)
onnx.checker.check_model(onnx_model)

sess_options = onnxruntime.SessionOptions()
sess_options.graph_optimization_level = onnxruntime.GraphOptimizationLevel.ORT_ENABLE_BASIC

ort_session = onnxruntime.InferenceSession(
    onnx_model.SerializeToString(), sess_options, providers=["CPUExecutionProvider"]
)

model.zip

Urgency

No response

Platform

Linux

OS Version

Ubuntu 20.04

ONNX Runtime Installation

Built from Source

ONNX Runtime Version or Commit ID

1.22.0.dev20250320003

ONNX Runtime API

Python

Architecture

X64

Execution Provider

Default CPU

Execution Provider Library Version

No response


This issue has been automatically marked as stale due to inactivity and will be closed in 30 days if no further activity occurs. If further support is needed, please provide an update and/or more details.

@github-actions github-actions bot added the stale issues that have not been addressed in a while; categorized by a bot label Apr 24, 2025