The Embeddings API, available through the Python SDK (PyGEAI), enables you to generate vector representations (embeddings) from different types of input, such as text and images.

You can use different LLM providers and their respective models for this purpose.

To achieve this, you have three options:

  1. Command Line
  2. Low-Level Service Layer
  3. High-Level Service Layer

1. Command Line

Use the following command line to generate embeddings (the -i flag can be repeated to pass multiple inputs):

geai emb generate \
 -i "<your_text_input>" \
 -i "<your_image_input>" \
 -m "<provider>/<model_name>"

Replace the placeholders with your desired values:

  • your_text_input: The text for which you want to generate an embedding. For example: "Help me with Globant Enterprise AI."
  • your_image_input: The image data, encoded appropriately (e.g., base64). For example:
"image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAIAAACQd1PeAAAAEElEQVR4nGK6HcwNCAAA//8DTgE8HuxwEQAAAABJRU5ErkJggg=="
  • provider/model_name: The provider and model to use for embedding generation. For example: "awsbedrock/amazon.titan-embed-text-v1"
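
Image inputs must be supplied in the "image/<subtype>;base64,<data>" form shown above. The snippet below is a minimal sketch of producing that string from an image file; encode_image_input is a hypothetical helper, not part of the SDK.

```python
import base64

def encode_image_input(path: str, mime_subtype: str = "png") -> str:
    """Read an image file and return it in the
    'image/<subtype>;base64,<data>' form used by the examples above.
    (Hypothetical helper; not part of PyGEAI.)"""
    with open(path, "rb") as f:
        data = base64.b64encode(f.read()).decode("ascii")
    return f"image/{mime_subtype};base64,{data}"
```

The returned string can then be passed as-is to the -i flag or to the input lists in the code examples below.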

2. Low-Level Service Layer

Use the following code snippet to generate embeddings using the Low-Level Service Layer:

from pygeai.core.embeddings.clients import EmbeddingsClient
from pygeai.core.services.llm.model import Model
from pygeai.core.services.llm.providers import Provider

client = EmbeddingsClient()

inputs = [
    "<your_text_input>",
    "<your_image_input>"
]

embeddings = client.generate_embeddings(
    input_list=inputs,
    model=f"{Provider.<provider>}/{Model.<provider>.<model_name>}",
    encoding_format=None,
    dimensions=None,
    user=None,
    input_type=None,
    timeout=600,
    cache=False
)

print(embeddings)

Replace the placeholders with your desired values:

  • your_text_input: Text for which you want to generate an embedding. For example: "Help me with Globant Enterprise AI"
  • your_image_input: Image data, encoded appropriately (e.g., base64). For example:
"image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAIAAACQd1PeAAAAEElEQVR4nGK6HcwNCAAA//8DTgE8HuxwEQAAAABJRU5ErkJggg=="
  • provider: LLM provider. For example: AWS_BEDROCK
  • model_name: Specific model from the provider. For example: AMAZON_TITAN_EMBED_TEXT_V1
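
The model argument is simply the provider identifier and model identifier joined with a slash, as in the CLI example ("awsbedrock/amazon.titan-embed-text-v1"). The sketch below illustrates how the f-string resolves, using stand-in enums; the actual Provider and Model members come from the SDK, and the enum values here are assumptions inferred from the CLI example.

```python
from enum import Enum

# Stand-in enums mirroring the shape of pygeai's Provider/Model
# (member values are assumptions based on the CLI example above).
class Provider(str, Enum):
    AWS_BEDROCK = "awsbedrock"

class AwsBedrockModel(str, Enum):
    AMAZON_TITAN_EMBED_TEXT_V1 = "amazon.titan-embed-text-v1"

model = f"{Provider.AWS_BEDROCK.value}/{AwsBedrockModel.AMAZON_TITAN_EMBED_TEXT_V1.value}"
print(model)  # awsbedrock/amazon.titan-embed-text-v1
```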

3. High-Level Service Layer

Use the following code snippet to generate embeddings using the High-Level Service Layer:

from pygeai.core.embeddings.managers import EmbeddingsManager
from pygeai.core.embeddings.models import EmbeddingConfiguration
from pygeai.core.services.llm.model import Model
from pygeai.core.services.llm.providers import Provider

manager = EmbeddingsManager()

inputs = [
    "<your_text_input>",
    "<your_image_input>"
]

configuration = EmbeddingConfiguration(
    inputs=inputs,
    model=f"{Provider.<provider>}/{Model.<provider>.<model_name>}",
    encoding_format=None,
    dimensions=None,
    user=None,
    input_type=None,
    timeout=600,
    cache=False
)

embeddings = manager.generate_embeddings(configuration)
print(embeddings)

Replace the placeholders with your desired values:

  • your_text_input: Text for which you want to generate an embedding. For example: "Help me with Globant Enterprise AI"
  • your_image_input: Image data, encoded appropriately (e.g., base64). For example:
"image/png;base64,iVBORw0KGgoAAAANSUhEUgAAAAEAAAABCAIAAACQd1PeAAAAEElEQVR4nGK6HcwNCAAA//8DTgE8HuxwEQAAAABJRU5ErkJggg=="
  • provider: LLM provider. For example: AWS_BEDROCK
  • model_name: Specific model from the provider. For example: AMAZON_TITAN_EMBED_TEXT_V1
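
Once embeddings are returned, a common next step is comparing vectors, for example with cosine similarity. The exact shape of the SDK's response object is not shown here; the sketch below assumes you have extracted two embedding vectors as plain lists of floats.

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors:
    dot(a, b) / (|a| * |b|), in the range [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy vectors standing in for two returned embeddings:
print(cosine_similarity([1.0, 0.0], [1.0, 0.0]))  # 1.0
```

Identical vectors score 1.0, orthogonal vectors 0.0; higher scores indicate more semantically similar inputs.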

Availability

Since version 2025-05.

Last update: December 2025 | © GeneXus. All rights reserved. GeneXus Powered by Globant