Chat Completions

POST /chat/completions
curl --request POST \
  --url https://api.euron.one/api/v1/euri/chat/completions \
  --header 'Authorization: Bearer <token>' \
  --header 'Content-Type: application/json' \
  --data '
{
  "model": "<string>",
  "messages": [
    {"role": "user", "content": "<string>"}
  ],
  "max_tokens": 256,
  "temperature": 0.7,
  "stream": false,
  "top_p": 1,
  "frequency_penalty": 0,
  "presence_penalty": 0,
  "stop": "<string | array>",
  "n": 1,
  "tools": [
    {}
  ],
  "tool_choice": "auto",
  "response_format": {"type": "json_object"},
  "seed": 42
}
'

Request

POST https://api.euron.one/api/v1/euri/chat/completions

Headers

Header          Required   Description
------          --------   -----------
Authorization   Yes        Bearer YOUR_EURI_API_KEY
Content-Type    Yes        application/json

Body parameters

model
string
required
Model ID to use. See Models for the full list.
messages
array
required
Array of message objects representing the conversation. Each message has:
  • role — "system", "user", or "assistant"
  • content — The message text (string)
max_tokens
integer
Maximum number of tokens to generate. Defaults to model maximum.
temperature
number
Sampling temperature between 0 and 2. Lower = more deterministic. Default: 0.7.
stream
boolean
If true, returns a stream of Server-Sent Events (SSE). Default: false.
top_p
number
Nucleus sampling: only tokens in the top top_p probability mass are considered. Between 0 and 1. Default: 1.
frequency_penalty
number
Penalize repeated tokens. Range: -2.0 to 2.0. Default: 0.
presence_penalty
number
Penalize tokens already present. Range: -2.0 to 2.0. Default: 0.
stop
string | array
Up to 4 sequences where the API will stop generating.
n
integer
Number of completions to generate. Default: 1.
tools
array
List of tools (functions) the model can call. OpenAI function-calling format.
tool_choice
string | object
Controls tool usage: "auto", "none", or a specific tool object.
response_format
object
Force output format. Example: {"type": "json_object"} for JSON mode.
seed
integer
Seed for deterministic sampling (best-effort).
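
The tools, tool_choice, and response_format parameters above follow the OpenAI function-calling shape. A minimal sketch of a request body that declares one tool; the get_weather function name and its schema are illustrative assumptions, not part of this API:

```python
import json

# Hypothetical tool definition in OpenAI function-calling format.
# The function name ("get_weather") and its schema are illustrative only.
payload = {
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Weather in Paris?"}],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Get current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
    "tool_choice": "auto",  # let the model decide whether to call the tool
}

print(json.dumps(payload, indent=2))
```

Setting tool_choice to "none" disables tool calls; passing a specific tool object forces one.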

Examples

Basic chat

curl -X POST https://api.euron.one/api/v1/euri/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_EURI_API_KEY" \
  -d '{
    "model": "gpt-4o-mini",
    "messages": [
      {"role": "system", "content": "You are a helpful assistant."},
      {"role": "user", "content": "Explain AGI in 2 simple lines."}
    ],
    "max_tokens": 200,
    "temperature": 0.7
  }'
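
The same request can be built from Python with only the standard library. A minimal sketch, assuming the endpoint and headers documented above; replace YOUR_EURI_API_KEY with a real key and uncomment the send to actually call the API:

```python
import json
import urllib.request

API_URL = "https://api.euron.one/api/v1/euri/chat/completions"

def build_chat_request(api_key, model, messages, **params):
    """Build the POST request for the Chat Completions endpoint."""
    body = {"model": model, "messages": messages, **params}
    return urllib.request.Request(
        API_URL,
        data=json.dumps(body).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

req = build_chat_request(
    "YOUR_EURI_API_KEY",
    "gpt-4o-mini",
    [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain AGI in 2 simple lines."},
    ],
    max_tokens=200,
    temperature=0.7,
)
# To actually send (requires a valid key and network access):
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```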

Streaming

curl -X POST https://api.euron.one/api/v1/euri/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_EURI_API_KEY" \
  -d '{
    "model": "gemini-2.5-flash",
    "messages": [{"role": "user", "content": "Write a haiku about code."}],
    "stream": true
  }'
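
With stream set to true, the body arrives as SSE lines (see the Stream response section). A minimal sketch of parsing those lines in Python; the sample events are hard-coded here, while in practice they come from the HTTP response body:

```python
import json

def iter_stream_content(lines):
    """Yield content deltas from 'data: ...' SSE lines, stopping at [DONE]."""
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank keep-alive lines between events
        data = line[len("data:"):].strip()
        if data == "[DONE]":
            break
        chunk = json.loads(data)
        delta = chunk["choices"][0].get("delta", {})
        if "content" in delta:
            yield delta["content"]

# Sample events shaped like the documented stream response.
sample = [
    'data: {"id":"chatcmpl-1","choices":[{"delta":{"content":"AGI"},"index":0}]}',
    '',
    'data: {"id":"chatcmpl-1","choices":[{"delta":{"content":" stands"},"index":0}]}',
    '',
    'data: [DONE]',
]
print("".join(iter_stream_content(sample)))  # AGI stands
```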

Using Claude

curl -X POST https://api.euron.one/api/v1/euri/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_EURI_API_KEY" \
  -d '{
    "model": "claude-sonnet-4-6",
    "messages": [{"role": "user", "content": "Compare REST and GraphQL."}],
    "max_tokens": 500
  }'

Using Llama

curl -X POST https://api.euron.one/api/v1/euri/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_EURI_API_KEY" \
  -d '{
    "model": "llama-4-scout-17b-16e-instruct",
    "messages": [{"role": "user", "content": "What is Rust good for?"}],
    "max_tokens": 300
  }'

Response

{
  "id": "chatcmpl-58e31942-784a-4075-bfa9-0aad20692061",
  "object": "chat.completion",
  "created": 1744998577,
  "model": "gpt-4o-mini",
  "choices": [
    {
      "index": 0,
      "message": {
        "role": "assistant",
        "content": "AGI stands for Artificial General Intelligence..."
      },
      "finish_reason": "stop"
    }
  ],
  "usage": {
    "prompt_tokens": 272,
    "completion_tokens": 145,
    "total_tokens": 417
  }
}
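
The generated text lives at choices[0].message.content and token accounting under usage. A minimal sketch of pulling both out of a parsed response, using the sample body above:

```python
import json

# Sample response body from the documentation above.
response = json.loads("""
{
  "id": "chatcmpl-58e31942-784a-4075-bfa9-0aad20692061",
  "object": "chat.completion",
  "created": 1744998577,
  "model": "gpt-4o-mini",
  "choices": [
    {"index": 0,
     "message": {"role": "assistant",
                 "content": "AGI stands for Artificial General Intelligence..."},
     "finish_reason": "stop"}
  ],
  "usage": {"prompt_tokens": 272, "completion_tokens": 145, "total_tokens": 417}
}
""")

text = response["choices"][0]["message"]["content"]
used = response["usage"]["total_tokens"]
print(text)
print(f"tokens used: {used}")  # tokens used: 417
```

Check finish_reason as well: "stop" means natural completion, while "length" means the output was cut off at max_tokens.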

Stream response

When stream: true, the response is a series of SSE events:
data: {"id":"chatcmpl-...","choices":[{"delta":{"content":"AGI"},"index":0}]}

data: {"id":"chatcmpl-...","choices":[{"delta":{"content":" stands"},"index":0}]}

data: [DONE]

Errors

Code   Meaning
----   -------
400    Missing messages or model, invalid parameters
401    Invalid or missing API key
403    Daily token limit reached or insufficient wallet balance
429    Rate limited
500    Upstream model error
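
429 and 500 responses are transient, so clients commonly retry them with exponential backoff, while other 4xx errors should not be retried. A minimal sketch of that policy; the send callable is injected so the logic can be shown without a live endpoint:

```python
import time

RETRYABLE = {429, 500}  # rate limits and upstream errors are worth retrying

def post_with_retry(send, max_attempts=4, base_delay=0.5):
    """Call send() until it returns a non-retryable status or attempts run out."""
    for attempt in range(max_attempts):
        status, body = send()
        if status not in RETRYABLE:
            return status, body
        if attempt < max_attempts - 1:
            time.sleep(base_delay * (2 ** attempt))  # 0.5s, 1s, 2s, ...
    return status, body

# Fake sender: two rate-limited responses, then success.
responses = iter([(429, "rate limited"), (429, "rate limited"), (200, "ok")])
status, body = post_with_retry(lambda: next(responses), base_delay=0.0)
print(status, body)  # 200 ok
```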