spawned_endpoint_demo.exs launches the backend against an honest fake
llama-server fixture and prints the endpoint descriptor fields that would be
handed northbound to jido_integration or another consumer.
The example is truthful about what it demonstrates:
- the runtime path is really :spawned
- the endpoint publication contract is the real SelfHostedInferenceCore.EndpointDescriptor
- the example shows the northbound chat-completions URL and the published auth header
- the fixture avoids requiring a local model download or a live llama-server binary for the demo
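A northbound consumer of the published descriptor could look roughly like the sketch below. This is purely illustrative: the field names (:base_url, :auth_header) and the shape of the descriptor are assumptions for the sketch, not the actual SelfHostedInferenceCore.EndpointDescriptor contract.

```elixir
defmodule DemoConsumer do
  # Hypothetical sketch of a northbound consumer. Field names here
  # (:base_url, :auth_header) are illustrative assumptions, not the
  # real SelfHostedInferenceCore.EndpointDescriptor fields.
  def request_line(descriptor) do
    # A chat-completions URL is conventionally the base URL plus this path.
    url = descriptor.base_url <> "/v1/chat/completions"
    {header_name, _secret} = descriptor.auth_header
    "POST #{url} (auth via #{header_name})"
  end
end
```

A consumer like jido_integration would read the same fields off the real descriptor instead of a hand-built map.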
Run it with:
mix run examples/spawned_endpoint_demo.exs