TikHub-AI-Proxy
    • Overview (PLEASE READ)
    • Streaming API
    • OpenAI
      • OpenAI response
      • OpenAI embeddings
      • OpenAI audio transcription
      • OpenAI chat completion
    • Claude
      • Claude chat completion
      • Claude message
    • DeepSeek
      • DeepSeek chat completion
    • Sora
      • Sora video generation
      • Get Sora video status
      • Download Sora video content
      • Remix Sora video
      • List Sora videos
      • Delete Sora video
    • Gemini
      • Gemini content
    • Seedance
      • Seedance video generation
      • Retrieve Seedance task
    • Kling
      • Kling text-to-video
      • Retrieve Kling text-to-video task
      • Kling image-to-video
      • Retrieve Kling image-to-video task
    • Veo
      • Veo video generation
      • Fetch Veo video generation status
    • Schemas
      • ChatCompletionRequest
      • ChatMessage
      • Tool
      • ToolCall
      • ChatCompletionResponse
      • ChatCompletionChoice
      • ContentFilterResults
      • UsageInfo
      • EmbeddingRequest
      • EmbeddingResponse
      • TranscriptionRequest
      • TranscriptionResponse
      • ClaudeMessageRequest
      • ClaudeMessageResponse
      • VideoCreateRequest
      • VideoResponse
      • VideoRemixRequest
      • GeminiGenerateContentRequest
      • VideoListResponse
      • GeminiContent
      • VideoDeleteResponse
      • GeminiGenerationConfig
      • ResponseRequest
      • GeminiGenerateContentResponse
      • ResponseObject
      • SeedanceTaskRequest
      • SeedanceTaskCreateResponse
      • SeedanceTaskResponse
      • KlingText2VideoRequest
      • KlingImage2VideoRequest
      • KlingTaskResponse
      • KlingTaskDetailResponse
      • KlingTaskListResponse
      • VeoGenerateRequest
      • VeoOperationResponse
      • VeoFetchOperationRequest
      • VeoFetchOperationResponse

    ResponseRequest

    Request body schema for the OpenAI response endpoint. The example below lists the available fields with placeholder and default values; `"string"` and `0` mark placeholders to be replaced with real values.

    {
        "model": "gpt-4.1",
        "input": "string",
        "instructions": "string",
        "background": false,
        "conversation": "string",
        "include": [
            "string"
        ],
        "max_output_tokens": 0,
        "max_tool_calls": 0,
        "metadata": {
            "property1": "string",
            "property2": "string"
        },
        "parallel_tool_calls": true,
        "previous_response_id": "string",
        "prompt": {},
        "reasoning": {
            "effort": "low",
            "summary": "auto"
        },
        "service_tier": "auto",
        "store": true,
        "stream": false,
        "stream_options": {},
        "temperature": 1,
        "text": {
            "format": {
                "type": "text"
            }
        },
        "tool_choice": "string",
        "tools": [
            {}
        ],
        "top_logprobs": 0,
        "top_p": 1,
        "truncation": "disabled"
    }
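
    As a minimal sketch, a client might assemble only the fields it needs and let the rest fall back to the defaults shown above. The helper name `build_response_request` and the choice of which fields to pre-fill are illustrative assumptions, not part of the TikHub API itself; consult the Overview section for the actual endpoint URL and authentication details before sending anything.

    ```python
    import json

    def build_response_request(model, user_input, **overrides):
        """Assemble a ResponseRequest body.

        Only a few commonly used fields are pre-filled here (an assumption
        for illustration); any other field from the schema can be passed
        as a keyword override, e.g. max_output_tokens=256.
        """
        payload = {
            "model": model,
            "input": user_input,
            "stream": False,
            "temperature": 1,
            "top_p": 1,
            "truncation": "disabled",
        }
        payload.update(overrides)
        return payload

    # Build a request body and render it as JSON, ready to POST
    # to the proxy's OpenAI response endpoint.
    body = build_response_request("gpt-4.1", "Summarize this text.",
                                  max_output_tokens=256)
    print(json.dumps(body, indent=2))
    ```

    Setting `"stream": true` instead would switch the endpoint to the Streaming API behavior described earlier in these docs.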