TikHub-AI-Proxy
    • Overview (PLEASE READ)
    • Streaming API
    • OpenAI
      • OpenAI response
      • OpenAI embeddings
      • OpenAI audio transcription
      • OpenAI chat completion
    • Claude
      • Claude chat completion
      • Claude message
    • DeepSeek
      • DeepSeek chat completion
    • Sora
      • Sora video generation
      • Get Sora video status
      • Download Sora video content
      • Remix Sora video
      • List Sora videos
      • Delete Sora video
    • Gemini
      • Gemini content
    • Seedance
      • Seedance video generation
      • Retrieve Seedance task
    • Kling
      • Kling text-to-video
      • Retrieve Kling text-to-video task
      • Kling image-to-video
      • Retrieve Kling image-to-video task
    • Veo
      • Veo video generation
      • Fetch Veo video generation status
    • Schemas
      • ChatCompletionRequest
      • ChatMessage
      • Tool
      • ToolCall
      • ChatCompletionResponse
      • ChatCompletionChoice
      • ContentFilterResults
      • UsageInfo
      • EmbeddingRequest
      • EmbeddingResponse
      • TranscriptionRequest
      • TranscriptionResponse
      • ClaudeMessageRequest
      • ClaudeMessageResponse
      • VideoCreateRequest
      • VideoResponse
      • VideoRemixRequest
      • GeminiGenerateContentRequest
      • VideoListResponse
      • GeminiContent
      • VideoDeleteResponse
      • GeminiGenerationConfig
      • ResponseRequest
      • GeminiGenerateContentResponse
      • ResponseObject
      • SeedanceTaskRequest
      • SeedanceTaskCreateResponse
      • SeedanceTaskResponse
      • KlingText2VideoRequest
      • KlingImage2VideoRequest
      • KlingTaskResponse
      • KlingTaskDetailResponse
      • KlingTaskListResponse
      • VeoGenerateRequest
      • VeoOperationResponse
      • VeoFetchOperationRequest
      • VeoFetchOperationResponse

    ClaudeMessageResponse

    {
        "id": "string",
        "type": "message",
        "role": "assistant",
        "content": [
            {
                "type": "text",
                "text": "string",
                "id": "string",
                "name": "string",
                "input": {}
            }
        ],
        "model": "string",
        "stop_reason": "end_turn",
        "stop_sequence": "string",
        "usage": {
            "input_tokens": 0,
            "output_tokens": 0
        }
    }
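    As an illustration of how a decoded ClaudeMessageResponse might be consumed, the sketch below (the payload values and the helper name are illustrative, not part of the API) separates `text` blocks from `tool_use` blocks in the `content` array:

    ```python
    import json

    # Example payload mirroring the ClaudeMessageResponse schema above
    # (values are made up for illustration).
    raw = json.dumps({
        "id": "msg_123",
        "type": "message",
        "role": "assistant",
        "content": [
            {"type": "text", "text": "Hello!"},
            {"type": "tool_use", "id": "toolu_1", "name": "get_weather",
             "input": {"city": "Paris"}},
        ],
        "model": "claude-3",
        "stop_reason": "end_turn",
        "stop_sequence": None,
        "usage": {"input_tokens": 10, "output_tokens": 5},
    })

    def split_content(response: dict):
        """Return (text_parts, tool_use_blocks) from a message's content."""
        texts = [b["text"] for b in response["content"] if b["type"] == "text"]
        tools = [b for b in response["content"] if b["type"] == "tool_use"]
        return texts, tools

    resp = json.loads(raw)
    texts, tools = split_content(resp)
    ```

    A client would typically join the `text` parts for display and dispatch each `tool_use` block to the tool named in its `name` field, passing `input` as the arguments.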