
The Chat API, available through the PyGEAI Python SDK, lets you interact with Globant Enterprise AI to generate chat completions using configurable models and parameters. You can use the API in three different ways, depending on your preferred level of abstraction:

  1. Command Line
  2. Low-Level Service Layer
  3. High-Level Service Layer

1. Command Line

The geai chat completion command generates a chat completion from the provided model and messages. Flags such as --stream, --temperature, and --max-tokens let you customize the response.

geai chat completion \
  --model "saia:assistant:Welcome data Assistant 3" \
  --messages '[{"role": "user", "content": "Hi, welcome to Globant Enterprise AI!!"}]' \
  --temperature 0.7 \
  --max-tokens 1000 \
  --stream 1

Use a different API key alias for authentication:

geai --alias admin chat completion \
  --model "saia:assistant:Welcome data Assistant 3" \
  --messages '[{"role": "user", "content": "What is Globant Enterprise AI?"}]' \
  --temperature 0.5 \
  --max-tokens 500

Non-streaming response with additional parameters like frequency and presence penalties:

geai chat completion \
  --model "saia:assistant:Welcome data Assistant 3" \
  --messages '[{"role": "user", "content": "Can you explain AI solutions offered by Globant?"}]' \
  --temperature 0.6 \
  --max-tokens 800 \
  --frequency-penalty 0.1 \
  --presence-penalty 0.2 \
  --stream 0

Using tools and tool-choice to fetch weather data:

geai chat completion \
  --model "saia:assistant:Welcome data Assistant 3" \
  --messages '[{"role": "user", "content": "Please get the current weather for San Francisco."}]' \
  --temperature 0.6 \
  --max-tokens 800 \
  --tools '[{"name": "get_weather", "description": "Fetches the current weather for a given location", "parameters": {"type": "object", "properties": {"location": {"type": "string", "description": "City name"}}, "required": ["location"]}, "strict": true}]' \
  --tool-choice '{"type": "function", "function": {"name": "get_weather"}}' \
  --stream 1
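Hand-writing the inline JSON for --messages, --tools, and --tool-choice is error-prone. When scripting the CLI from Python, it can be safer to build the payloads as dictionaries and serialize them with the standard json module. The sketch below only assembles the argument list; actually invoking the CLI (for example with subprocess.run) is left out, and the argument assembly shown is an illustration, not an official helper.

```python
import json

# Build the tool definition as a Python dict and serialize it,
# instead of hand-writing the inline JSON string.
tools = [
    {
        "name": "get_weather",
        "description": "Fetches the current weather for a given location",
        "parameters": {
            "type": "object",
            "properties": {"location": {"type": "string", "description": "City name"}},
            "required": ["location"],
        },
        "strict": True,
    }
]
tool_choice = {"type": "function", "function": {"name": "get_weather"}}
messages = [{"role": "user", "content": "Please get the current weather for San Francisco."}]

# Arguments as they would be passed to the geai CLI, e.g. via subprocess.run(args).
args = [
    "geai", "chat", "completion",
    "--model", "saia:assistant:Welcome data Assistant 3",
    "--messages", json.dumps(messages),
    "--tools", json.dumps(tools),
    "--tool-choice", json.dumps(tool_choice),
]
print(args[-1])
```

Because json.dumps guarantees valid JSON, this avoids quoting mistakes that are easy to make when embedding the payloads directly in a shell command.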

2. Low-Level Service Layer

The ChatClient class provides a Low-Level Service Layer to generate chat completions. It supports both streaming and non-streaming responses and allows fine-grained control over parameters.

from pygeai.chat.clients import ChatClient

client = ChatClient()

response = client.chat_completion(
    model="saia:assistant:Welcome data Assistant 3",
    messages=[{"role": "user", "content": "What is Globant Enterprise AI?"}],
    temperature=0.5,
    max_tokens=500,
    stream=False
)
print(response)

Streaming response with tools:

from pygeai.chat.clients import ChatClient

client = ChatClient()

llm_settings = {
    "temperature": 0.6,
    "max_tokens": 800,
    "frequency_penalty": 0.1,
    "presence_penalty": 0.2
}

messages = [{"role": "user", "content": "Please get the current weather for San Francisco."}]

tools = [
    {
        "name": "get_weather",
        "description": "Fetches the current weather for a given location",
        "parameters": {
            "type": "object",
            "properties": {"location": {"type": "string", "description": "City name"}},
            "required": ["location"]
        },
        "strict": True
    }
]

tool_choice = {"type": "function", "function": {"name": "get_weather"}}

response = client.chat_completion(
    model="saia:assistant:Welcome data Assistant 3",
    messages=messages,
    stream=True,
    tools=tools,
    tool_choice=tool_choice,
    **llm_settings
)

for chunk in response:
    print(chunk, end="")

Using variables and thread ID:

from pygeai.chat.clients import ChatClient

client = ChatClient()

response = client.chat_completion(
    model="saia:assistant:Welcome data Assistant 3",
    messages=[
        {"role": "system", "content": "You are a helpful assistant for Globant Enterprise AI."},
        {"role": "user", "content": "What AI solutions does Globant offer?"}
    ],
    temperature=0.8,
    max_tokens=2000,
    presence_penalty=0.1,
    thread_id="thread_123e4567-e89b-12d3-a456-426614174000",
    variables=[{"key": "user_region", "value": "North America"}, {"key": "industry", "value": "Technology"}],
    stream=False
)
print(response)
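As the example above shows, the low-level variables parameter expects a list of key/value dictionaries. If your application keeps variables in a plain dict, a small helper can convert them; to_variables below is our own convenience function, not part of the SDK.

```python
def to_variables(mapping: dict) -> list:
    """Convert a plain dict into the list of {"key": ..., "value": ...}
    dictionaries expected by the low-level variables parameter."""
    return [{"key": k, "value": v} for k, v in mapping.items()]

variables = to_variables({"user_region": "North America", "industry": "Technology"})
print(variables)
# [{'key': 'user_region', 'value': 'North America'}, {'key': 'industry', 'value': 'Technology'}]
```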

3. High-Level Service Layer

The ChatManager class provides a High-Level Service Layer for generating chat completions. It does not support streaming responses but simplifies the process by using structured models like ChatMessageList and LlmSettings.

from pygeai.chat.managers import ChatManager
from pygeai.core.models import LlmSettings, ChatMessageList, ChatMessage

manager = ChatManager()

llm_settings = LlmSettings(
    temperature=0.5,
    max_tokens=500,
    frequency_penalty=0.2
)

messages = ChatMessageList(
    messages=[ChatMessage(role="user", content="Can you explain what Globant Enterprise AI does?")]
)

response = manager.chat_completion(
    model="saia:assistant:Welcome data Assistant 3",
    messages=messages,
    llm_settings=llm_settings
)
print(response)

Using tools to check weather and send an email:

from pygeai.chat.managers import ChatManager
from pygeai.core.models import LlmSettings, ChatMessageList, ChatMessage, ChatTool, ChatToolList

manager = ChatManager()

llm_settings = LlmSettings(
    temperature=0.7,
    max_tokens=1000,
    frequency_penalty=0.3,
    presence_penalty=0.2
)

messages = ChatMessageList(
    messages=[ChatMessage(role="user", content="Can you check the weather for New York and send an email summary?")]
)

tools = ChatToolList(
    variables=[
        ChatTool(
            name="get_weather",
            description="Fetches the current weather for a given location",
            parameters={
                "type": "object",
                "properties": {"location": {"type": "string", "description": "City name"}},
                "required": ["location"]
            },
            strict=True
        ),
        ChatTool(
            name="send_email",
            description="Sends an email to a recipient with a subject and body",
            parameters={
                "type": "object",
                "properties": {
                    "recipient": {"type": "string", "description": "Email address"},
                    "subject": {"type": "string", "description": "Email subject"},
                    "body": {"type": "string", "description": "Email content"}
                },
                "required": ["recipient", "subject", "body"]
            },
            strict=False
        )
    ]
)

response = manager.chat_completion(
    model="saia:assistant:Welcome data Assistant 3",
    messages=messages,
    llm_settings=llm_settings,
    tools=tools
)
print(response)

With variables and thread_id:

from pygeai.chat.managers import ChatManager
from pygeai.core.models import LlmSettings, ChatMessageList, ChatMessage, ChatVariable, ChatVariableList

manager = ChatManager()

llm_settings = LlmSettings(
    temperature=0.8,
    max_tokens=2000,
    presence_penalty=0.1
)

messages = ChatMessageList(
    messages=[
        ChatMessage(role="system", content="You are a helpful assistant for Globant Enterprise AI."),
        ChatMessage(role="user", content="What AI solutions does Globant offer?")
    ]
)

variables = ChatVariableList(
    variables=[
        ChatVariable(key="user_region", value="North America"),
        ChatVariable(key="industry", value="Technology")
    ]
)

response = manager.chat_completion(
    model="saia:assistant:Welcome data Assistant 3",
    messages=messages,
    llm_settings=llm_settings,
    thread_id="thread_123e4567-e89b-12d3-a456-426614174000",
    variables=variables
)
print(response)

With tool choice:

from pygeai.chat.managers import ChatManager
from pygeai.core.models import LlmSettings, ChatMessageList, ChatMessage, ChatTool, ChatToolList, ToolChoice, ToolChoiceObject, ToolChoiceFunction

manager = ChatManager()

llm_settings = LlmSettings(
    temperature=0.6,
    max_tokens=800,
    frequency_penalty=0.1,
    presence_penalty=0.2
)

messages = ChatMessageList(
    messages=[ChatMessage(role="user", content="Please get the current weather for San Francisco.")]
)

tools = ChatToolList(
    variables=[
        ChatTool(
            name="get_weather",
            description="Fetches the current weather for a given location",
            parameters={
                "type": "object",
                "properties": {"location": {"type": "string", "description": "City name"}},
                "required": ["location"]
            },
            strict=True
        ),
        ChatTool(
            name="send_notification",
            description="Sends a notification with a message",
            parameters={
                "type": "object",
                "properties": {"message": {"type": "string", "description": "Notification content"}},
                "required": ["message"]
            },
            strict=False
        )
    ]
)

tool_choice = ToolChoice(
    value=ToolChoiceObject(
        function=ToolChoiceFunction(name="get_weather")
    )
)

response = manager.chat_completion(
    model="saia:assistant:Welcome data Assistant 3",
    messages=messages,
    llm_settings=llm_settings,
    tool_choice=tool_choice,
    tools=tools
)
print(response)

Availability

Since version 2025-05.

Last update: August 2025 | © GeneXus. All rights reserved. GeneXus Powered by Globant