Before you begin!
Visit the Wetrocloud console to get your API key. If you have trouble obtaining it, refer to this guide.

Introduction

We designed WetroCloud’s API to be largely compatible with OpenAI’s client libraries, making it easy to point your existing applications at WetroCloud and try our inference at 10X speed. We support multiple models from OpenAI, Anthropic, Meta, and others. Visit the WetroCloud supported models list to see which models are available.

Configuring OpenAI to Use WetroCloud API

To start using WetroCloud with OpenAI’s client libraries, ensure you have the OpenAI SDK installed, then pass your WetroCloud API key to the api_key parameter and change the base_url to https://api.wetrocloud.com/v1/openai:

1. Install the OpenAI SDK

First, install the OpenAI Python package.

pip install openai

2. Configuration

To use WetroCloud with OpenAI’s SDK, set the API base URL to WetroCloud’s endpoint and provide your API key.

import openai

client = openai.OpenAI(
    base_url="https://api.wetrocloud.com/v1/openai",
    api_key="<wetrocloud_api_key>"
)
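
The OpenAI Python SDK (v1.x) can also pick up its configuration from environment variables, which keeps credentials out of your source code. As far as we know it reads OPENAI_API_KEY and OPENAI_BASE_URL automatically; this is standard SDK behavior rather than a WetroCloud-specific feature, so double-check it against the SDK version you have installed.

import openai

# Assumes these are exported in your shell before running the script:
#   OPENAI_API_KEY=<wetrocloud_api_key>
#   OPENAI_BASE_URL=https://api.wetrocloud.com/v1/openai
# With no arguments, the SDK falls back to those environment variables.
client = openai.OpenAI()
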
3. Example: Chat Completion

Once the client is configured, send a chat completion request exactly as you would with OpenAI:

import openai

client = openai.OpenAI(
    base_url="https://api.wetrocloud.com/v1/openai",
    api_key="<wetrocloud_api_key>"
)

response = client.chat.completions.create(
    model="llama-3.3-70b",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is WetroCloud?"}
    ]
)

# The v1 SDK returns a response object, not a dict, so use attribute access
print(response.choices[0].message.content)
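
If WetroCloud’s OpenAI-compatible endpoint supports streaming (we have not verified this here, so treat the snippet below as a sketch), the SDK’s standard stream=True flag should work without further changes:

# Sketch: assumes the /v1/openai endpoint supports streamed responses
stream = client.chat.completions.create(
    model="llama-3.3-70b",
    messages=[{"role": "user", "content": "What is WetroCloud?"}],
    stream=True
)

for chunk in stream:
    # Each chunk carries an incremental piece of the assistant's reply
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
print()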

Benefits of Using WetroCloud with the OpenAI SDK

  • Same OpenAI Functions – keep your existing SDK calls, with inference at 10x speed.
  • No Code Changes – Just update base_url to WetroCloud’s endpoint.
  • Access to multiple models – Use the latest models from OpenAI, Anthropic, Meta, etc. with WetroCloud’s API key (see the sketch after this list).
  • Scalable & Cost-Effective – Optimized for performance at scale.
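
Switching between providers only requires changing the model parameter on the client configured in step 2. The non-Llama model IDs below are placeholders made up for illustration; take real identifiers from WetroCloud’s supported models list.

# "<an_openai_model_id>" and "<an_anthropic_model_id>" are hypothetical placeholders
for model_id in ["llama-3.3-70b", "<an_openai_model_id>", "<an_anthropic_model_id>"]:
    response = client.chat.completions.create(
        model=model_id,
        messages=[{"role": "user", "content": "Summarize WetroCloud in one sentence."}]
    )
    print(model_id, "->", response.choices[0].message.content)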

Great, that was easy!

For more details, visit WetroCloud’s website and WetroCloud’s Documentation.