
Examples

Ready-to-run code snippets for calling Routerly from any language. Every example connects to the same two endpoints:

Protocol    Endpoint                                    When to use
OpenAI      http://localhost:3000/v1/chat/completions   Default — use with any OpenAI-compatible SDK
Anthropic   http://localhost:3000/v1/messages           Use with the Anthropic SDK or when targeting Claude models

Common pattern

Every integration needs two values:

  • Base URL — http://localhost:3000/v1 (OpenAI) or http://localhost:3000 (Anthropic)
  • API Key — your project token: sk-rt-YOUR_PROJECT_TOKEN

Swap these two values into your existing code; nothing else needs to change.
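As a concrete illustration of the two-value pattern, here is a minimal sketch using only Python's standard library. The model name gpt-4o-mini and the helper names are illustrative assumptions, not something Routerly requires:

```python
import json
import urllib.request

BASE_URL = "http://localhost:3000/v1"      # OpenAI-compatible endpoint
API_KEY = "sk-rt-YOUR_PROJECT_TOKEN"       # project token from the dashboard


def build_request(messages, model="gpt-4o-mini"):
    """Build the POST request for /chat/completions (model name is illustrative)."""
    body = json.dumps({"model": model, "messages": messages}).encode()
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


def chat(messages):
    """Send the request and return the first choice's text."""
    with urllib.request.urlopen(build_request(messages)) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

With an official SDK the same change is just the client constructor's base_url and api_key arguments.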


Languages

Language                  OpenAI SDK             Anthropic SDK             Raw HTTP
JavaScript / TypeScript   openai (npm)           @anthropic-ai/sdk (npm)   fetch
Python                    openai (pip)           anthropic (pip)           httpx
Java                      —                      —                         java.net.http
Go                        go-openai              —                         net/http
C# / .NET                 Azure.AI.OpenAI        —                         HttpClient
PHP                       openai-php/client      —                         Guzzle
Ruby                      ruby-openai (gem)      —                         Net::HTTP
Rust                      async-openai (crate)   —                         reqwest
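For the Anthropic-protocol endpoint, the raw HTTP shape differs from the OpenAI one: the Anthropic convention puts the key in an x-api-key header, adds an anthropic-version header, and requires max_tokens. A standard-library sketch follows; the model name, and the assumption that Routerly accepts the project token via x-api-key, are illustrative:

```python
import json
import urllib.request


def build_messages_request(prompt, model="claude-3-5-sonnet-latest"):
    """Build a POST to /v1/messages (model name and auth header are assumptions)."""
    body = json.dumps({
        "model": model,
        "max_tokens": 256,  # required by the Anthropic messages format
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        "http://localhost:3000/v1/messages",
        data=body,
        headers={
            "x-api-key": "sk-rt-YOUR_PROJECT_TOKEN",
            "anthropic-version": "2023-06-01",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```

The Anthropic SDKs handle these headers for you; only the base URL and key need to change.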

Getting a project token

Create a token in the dashboard: Projects → select your project → Tokens → New Token.

Tokens start with sk-rt- and are shown once. Copy it before closing the dialog.
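Since the token is shown only once, it is common to keep it in an environment variable rather than in source code. A small sketch, assuming an illustrative variable name (ROUTERLY_API_KEY is not something the proxy mandates):

```python
import os


def load_token(env_var="ROUTERLY_API_KEY"):
    """Fetch the project token from the environment (variable name is illustrative)."""
    token = os.environ.get(env_var, "")
    if not token.startswith("sk-rt-"):
        raise ValueError(f"Set {env_var} to a project token (starts with sk-rt-)")
    return token
```

The prefix check catches the common mistake of pasting an upstream provider key instead of a project token.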


Next steps

  • Read the full LLM Proxy API reference to see all supported request parameters.
  • See Integrations for ready-made setup guides for tools like Cursor, Open WebUI, and LangChain.