CreateEmbeddedAI
Create a Media Node that performs AI inference using a generic ONNX model. The node takes a video stream, pre-processes it according to the configuration, and runs the specified model. Inference results are streamed back to the client.
Request Type: EmbeddedAIRequest (streamed)
| Field | Type | Repeated | Description |
|---|---|---|---|
| message | oneOf | | |
| ⮑subscription | | | |
| ⮑initial_config | | | |
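
As an illustration of the request side, the following is a minimal sketch of a request generator for the stream, assuming protoc-generated Python stubs in a module named `embedded_ai_pb2` and hypothetical nested message types for the two oneOf branches; only the `message` oneOf with its `subscription` and `initial_config` fields comes from the table above.

```python
# Sketch of a request generator for the streamed EmbeddedAIRequest messages.
# Module and nested message/type names are assumptions; only the top-level
# `message` oneOf (subscription / initial_config) is taken from the table above.
import embedded_ai_pb2  # hypothetical protoc-generated module


def request_stream():
    # First message: the initial configuration (model, pre-processing, etc.).
    yield embedded_ai_pb2.EmbeddedAIRequest(
        initial_config=embedded_ai_pb2.EmbeddedAIConfig()  # hypothetical type; fields not specified here
    )

    # Subsequent message: subscribe to the source video stream.
    yield embedded_ai_pb2.EmbeddedAIRequest(
        subscription=embedded_ai_pb2.EmbeddedAISubscription()  # hypothetical type; fields not specified here
    )
```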
Response Type: EmbeddedAIEvent (streamed)
| Field | Type | Repeated | Description |
|---|---|---|---|
| message | oneOf | | |
| ⮑node_id | | | |
| ⮑subscription_response | | | |
| ⮑inbound_context | | | |
| ⮑outbound_context | | | |
| ⮑inference_result | | | |
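
The sketch below shows how a client might open the stream and dispatch on the response oneOf. The channel address, the generated module `embedded_ai_pb2_grpc`, and the stub class name `MediaNodeStub` are assumptions; the event field names are taken from the table above.

```python
# Sketch of a client consuming EmbeddedAIEvent messages from the stream.
# Channel address and stub/service names are assumptions; the oneOf field
# names (node_id, subscription_response, inbound_context, outbound_context,
# inference_result) come from the table above.
import grpc

import embedded_ai_pb2_grpc  # hypothetical protoc-generated module


def run():
    with grpc.insecure_channel("localhost:50051") as channel:
        stub = embedded_ai_pb2_grpc.MediaNodeStub(channel)  # service name assumed
        # Bidirectional streaming call: send requests, iterate over events.
        for event in stub.CreateEmbeddedAI(request_stream()):
            which = event.WhichOneof("message")
            if which == "node_id":
                print("created node:", event.node_id)
            elif which == "subscription_response":
                print("subscription acknowledged")
            elif which == "inference_result":
                print("inference result:", event.inference_result)
            # inbound_context / outbound_context events carry stream context
            # and can be handled in the same way.


if __name__ == "__main__":
    run()
```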