ZBLOG

```
trt_infer: ModelImporter.cpp:776: --- End node ---
trt_infer: ModelImporter.cpp:778: ERROR: builtin_op_importers.cpp:4890 In function importFallbackPluginImporter:
[8] Assertion failed: creator && "Plugin not found, are the plugin name, version, a...
```

This error comes from TensorRT's ONNX importer. When the parser encounters a node it cannot map to a built-in layer, it falls back to looking up a plugin in the plugin registry (`importFallbackPluginImporter`); the assertion fires when no matching plugin creator is found. This commonly happens when a model uses custom plugins that were never registered, or operations that TensorRT does not support natively.

To resolve this issue, you can try the following steps:

  1. Ensure that any custom plugins used by your model are built and registered with TensorRT before the model is parsed, and that the plugin library files are accessible at runtime.

  2. Check if there are any unsupported operations or layers in your model that require custom implementations. You might need to create and register appropriate plugin implementations for these operations.

  3. Verify that the plugin name, version, and namespace recorded in the model exactly match those reported by the registered plugin creator; the registry lookup is an exact string match on all three.

  4. Double-check that you are using compatible versions of TensorRT, CUDA, and cuDNN libraries.

  5. If you are using an older version of TensorRT, consider updating to the latest version as it may include fixes for known issues related to plugin importing.

  6. Review the documentation and examples provided by NVIDIA for developing and integrating custom plugins with TensorRT to ensure proper implementation.
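For steps 1 and 2, the key point is that registration must happen before `parseFromFile` is called. A minimal sketch of that order, where `libmy_plugins.so` is a placeholder name for your own plugin library:

```cpp
// Sketch: register built-in and custom plugins *before* parsing the ONNX model.
// "libmy_plugins.so" is a hypothetical placeholder for your plugin library.
#include <dlfcn.h>
#include <iostream>
#include "NvInfer.h"
#include "NvInferPlugin.h"

class Logger : public nvinfer1::ILogger {
    void log(Severity severity, const char* msg) noexcept override {
        if (severity <= Severity::kWARNING) std::cerr << msg << std::endl;
    }
};

int main() {
    Logger logger;

    // Register all of TensorRT's built-in plugins with the global registry.
    initLibNvInferPlugins(&logger, "");

    // Load a custom plugin library; its static initializers should call
    // REGISTER_TENSORRT_PLUGIN for each plugin creator it provides.
    void* handle = dlopen("libmy_plugins.so", RTLD_LAZY);
    if (!handle) {
        std::cerr << "Failed to load plugin library: " << dlerror() << "\n";
    }

    // Only after registration, create the builder/network and parse the model
    // with nvonnxparser::createParser(...) as usual.
    auto* builder = nvinfer1::createInferBuilder(logger);
    // ... build network, parse ONNX file, build engine ...
    return 0;
}
```

If you run `trtexec` instead of your own code, the equivalent of the `dlopen` call is the `--plugins=libmy_plugins.so` option.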

If none of these steps resolves the issue, provide more details about your setup, including the model architecture and the code that triggers the error, so that further assistance can be given.
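To check step 3 concretely, you can dump every plugin creator currently registered and compare the names, versions, and namespaces against the one named in the failing ONNX node. A small sketch:

```cpp
// Sketch: list all plugin creators in TensorRT's global plugin registry,
// so the expected plugin name/version/namespace can be compared by eye.
#include <iostream>
#include "NvInfer.h"
#include "NvInferPlugin.h"

int main() {
    int32_t numCreators = 0;
    auto* const* creators = getPluginRegistry()->getPluginCreatorList(&numCreators);
    for (int32_t i = 0; i < numCreators; ++i) {
        std::cout << creators[i]->getPluginName()
                  << " v" << creators[i]->getPluginVersion()
                  << " (namespace '" << creators[i]->getPluginNamespace() << "')\n";
    }
    return 0;
}
```

If the plugin the model needs does not appear in this list, it was never registered; if it appears under a different version or namespace, the exact-match lookup fails, which produces the same assertion.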


Content provided by the 0voice teaching AI assistant; the question was asked by a student.

Please credit the source when reposting: https://sdn.0voice.com/?id=3796
