Bedrock Access Gateway

In the previous guide, we showed how to leverage Amazon Bedrock's Custom Model Import to deploy SEA-LION in the cloud. After importing the SEA-LION models, you can build applications with the AWS SDK. This guide describes an alternative method: building applications with the OpenAI-compatible APIs served by the Bedrock Access Gateway.

Prerequisites

The SEA-LION model is imported and available on Amazon Bedrock. For imported models, you are charged for model inference. Please refer to the Amazon Bedrock Pricing page for the latest information.

Please check that the following are installed on your development machine.

Local Installation

If you have already cloned the https://github.com/aisingapore/bedrock-access-gateway repository in the previous guide, skip to the next step. The repository is a fork that adds support for imported models; at the time of writing, the original repository supports foundation models only.
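
For example, to clone the fork:

  git clone https://github.com/aisingapore/bedrock-access-gateway.git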

Navigate to the src directory.
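
For example, if the repository was cloned into the default bedrock-access-gateway directory:

  cd bedrock-access-gateway/src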

Check the access keys and environment variables listed below; an example of setting them is shown after the list. Please ensure that AWS_REGION is set to the region where the model is imported.

  • AWS_ACCESS_KEY_ID

  • AWS_SECRET_ACCESS_KEY

  • AWS_SESSION_TOKEN

  • AWS_REGION
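
For example, the variables can be exported in the shell session that will run the gateway. The values below are placeholders, and AWS_SESSION_TOKEN is only needed when using temporary credentials:

  export AWS_ACCESS_KEY_ID=<your access key ID>
  export AWS_SECRET_ACCESS_KEY=<your secret access key>
  export AWS_SESSION_TOKEN=<your session token>
  # Use the region where the model is imported, e.g. us-west-2
  export AWS_REGION=<region where the model is imported>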

Run with Uvicorn

Before running the gateway, it is good practice to create a virtual environment to isolate the app. Follow these steps to create a virtual environment, or use your preferred tool.

Initialise the virtual environment.
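
For example, using Python's built-in venv module:

  python3 -m venv .venv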

Activate the virtual environment.
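
On Linux or macOS:

  source .venv/bin/activate

On Windows:

  .venv\Scripts\activate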

Install the packages.
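
A sketch, assuming the fork keeps the upstream layout with a requirements.txt in the src directory:

  pip install -r requirements.txt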

Start the gateway.
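
A sketch, assuming the fork keeps the upstream FastAPI entry point api.app:app; adjust the module path and port if yours differ:

  uvicorn api.app:app --host 0.0.0.0 --port 8000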

Run with Docker Compose

Alternatively, start the gateway with Docker Compose if it is installed.
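
A sketch, assuming the fork provides a Compose file; run it from the directory containing that file:

  docker compose up -d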

Test the APIs

List the models with the following curl script. The imported model appears in the list with an ID in the format arn:aws:bedrock:<AWS_REGION>:<ACCOUNT_ID>:imported-model/<MODEL_ID>.
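
A sketch, assuming the gateway is running locally on port 8000 with the upstream defaults of an /api/v1 base path and an API key of bedrock; substitute your own values if they differ:

  curl -s http://localhost:8000/api/v1/models \
    -H "Authorization: Bearer bedrock"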

Test the chat completion API with the following curl script. Replace <AWS_REGION>, <ACCOUNT_ID>, and <MODEL_ID> with the values from the imported model's ARN (Amazon Resource Name) shown in the model list.
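
A sketch under the same assumptions as above (local gateway on port 8000, /api/v1 base path, API key bedrock), with the placeholders kept for the imported model's ARN:

  curl -s http://localhost:8000/api/v1/chat/completions \
    -H "Authorization: Bearer bedrock" \
    -H "Content-Type: application/json" \
    -d '{
          "model": "arn:aws:bedrock:<AWS_REGION>:<ACCOUNT_ID>:imported-model/<MODEL_ID>",
          "messages": [{"role": "user", "content": "Tell me about Singapore."}]
        }'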

Demo

With the gateway, the demo from the previous guide can use the OpenAI Python library to integrate with the imported model. Follow the steps in that guide to set it up, then run the demo with the --api parameter.
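
For example, the OpenAI Python client typically reads its settings from environment variables, so the demo can be pointed at the gateway before it is launched. The key value below is a placeholder and must match the API key configured on the gateway:

  export OPENAI_BASE_URL=http://localhost:8000/api/v1
  export OPENAI_API_KEY=bedrock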
