DeployFrame Docs

OpenAI API Setup

Configure the OpenAI API for additional AI services in your platform

To enable advanced text and chat-based AI services in your platform, you need to configure OpenAI API access. This guide will walk you through obtaining an API key and adding it to your deployment.

Prerequisites

Before proceeding, ensure you have:

  • Deployed your core infrastructure successfully
  • Completed the AWS Bedrock setup (for image generation services)

Setting up OpenAI API Access

Create an OpenAI Account

  1. Visit OpenAI's website
  2. Click "Sign up" in the top right corner
  3. Create an account using your email, Google account, or Microsoft account
  4. Complete the verification process if required

If you already have an OpenAI account, you can simply log in and proceed to the next step.

Access the API Keys Section

  1. After logging in, click on your profile icon in the top right corner
  2. Select "API keys" from the dropdown menu
  3. You'll be taken to the API keys dashboard

If you can't see the API keys option, you may need to switch to the API platform by clicking "API" in the top navigation bar.

Generate a New API Key

  1. Click the "Create new secret key" button
  2. Give your key a descriptive name (e.g., "AI SaaS Platform")
  3. Click "Create secret key"
  4. Your new API key will be displayed; copy it immediately

OpenAI will only show your API key once. If you don't copy it now, you'll need to create a new one.

Add API Key to Environment Variables

  1. Open your .env file in the cdk directory
  2. Look for the OPENAI_API_KEY variable
  3. Set its value to your newly created API key:
# OpenAI API
OPENAI_API_KEY=sk-your-api-key-here

Replace sk-your-api-key-here with your actual OpenAI API key.
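Before redeploying, it can help to sanity-check the key format locally. Here is a minimal sketch; the prefix pattern is a heuristic (it also accepts project-scoped `sk-proj-` keys) and does not verify that the key is active:

```python
import os
import re

def looks_like_openai_key(key: str) -> bool:
    """Heuristic check: OpenAI secret keys start with 'sk-' and contain
    only URL-safe characters. This does NOT verify the key is active."""
    return bool(re.fullmatch(r"sk-[A-Za-z0-9_-]{20,}", key.strip()))

# Example: read the key the same way a deployment script might.
key = os.environ.get("OPENAI_API_KEY", "")
if key and not looks_like_openai_key(key):
    print("Warning: OPENAI_API_KEY does not look like an OpenAI secret key")
```

A check like this catches the most common mistakes (pasting extra whitespace, quoting the value, or copying a truncated key) before you spend time on a full redeploy.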

Redeploy Your Infrastructure

Now that you've added your OpenAI API key, redeploy your infrastructure to apply the changes:

cd cdk
cdk deploy --all --require-approval never

This deployment will update your Lambda functions with the OPENAI_API_KEY environment variable, allowing them to access OpenAI's models for various AI services.
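After redeployment, each Lambda function sees the key as an ordinary environment variable. A minimal sketch of how a handler might read it (the handler body here is illustrative, not the platform's actual code):

```python
import os

def handler(event, context):
    # The CDK deployment injects OPENAI_API_KEY into the function's environment.
    api_key = os.environ.get("OPENAI_API_KEY")
    if not api_key:
        # Failing fast makes a missing key obvious in CloudWatch logs.
        raise RuntimeError("OPENAI_API_KEY is not set; redeploy with the key in .env")
    # ... pass api_key to an OpenAI client and handle the request ...
    return {"statusCode": 200}
```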

Understanding OpenAI API Usage

The AI SaaS platform is configured to use GPT-4o mini by default for an optimal balance of performance and cost-efficiency. This model provides high-quality responses while keeping API costs reasonable.

Using the OpenAI API incurs costs based on your usage. Your AI SaaS platform manages these costs through the credits system, ensuring predictable expenses for both you and your users.
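As an illustration of how a credits system can turn per-token API costs into a predictable charge, here is a sketch; the rates below are made-up placeholders, not the platform's actual pricing:

```python
# Hypothetical example rates: these numbers are illustrative only.
CREDITS_PER_1K_INPUT_TOKENS = 1
CREDITS_PER_1K_OUTPUT_TOKENS = 3

def credits_for_request(input_tokens: int, output_tokens: int) -> int:
    """Round each token count up to the nearest 1K block, charge per block."""
    input_blocks = -(-input_tokens // 1000)   # ceiling division
    output_blocks = -(-output_tokens // 1000)
    return (input_blocks * CREDITS_PER_1K_INPUT_TOKENS
            + output_blocks * CREDITS_PER_1K_OUTPUT_TOKENS)

# e.g. a 2,500-token prompt with an 800-token answer:
# 3 input blocks * 1 + 1 output block * 3 = 6 credits
```

Charging in fixed credit increments, rather than passing through raw per-token costs, is what makes expenses predictable for users even as actual token usage varies per request.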

Verifying OpenAI Integration

After completing the deployment with your OpenAI API key, verify that the OpenAI-powered services are working correctly:

Access PDF Chat Service

  1. Navigate to your application URL
  2. Sign in to your account
  3. Go to the AI Services section
  4. Select the "Chat with PDF" service

Upload a PDF Document

  1. Click the "Upload" button in the PDF chat interface
  2. Select a PDF document from your computer
  3. Wait for the document to be processed

Processing time depends on the size and complexity of your PDF. Simple documents are typically processed within seconds.

Test PDF Chat Functionality

  1. Once the PDF is processed, ask a question about the content of the document
  2. For example: "What are the main points in this document?"
  3. Click "Send" or press Enter
  4. Verify that you receive a relevant response that references information from your PDF

A successful response indicates that your OpenAI API integration is working correctly. The service uses GPT-4o mini to analyze the document and generate responses based on its content.
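You can also verify the key outside the UI with a direct API call. A minimal sketch using the official `openai` Python package; the model name `gpt-4o-mini` is an assumption based on the platform's default, so adjust it to match your configuration:

```python
import os

def build_chat_request(prompt: str, model: str = "gpt-4o-mini") -> dict:
    """Build the keyword arguments for a minimal chat completion call."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 5,
    }

def smoke_test() -> str:
    """Send one tiny chat completion to confirm the key and model work."""
    # Imported lazily so build_chat_request works without the package installed.
    from openai import OpenAI
    client = OpenAI(api_key=os.environ["OPENAI_API_KEY"])
    response = client.chat.completions.create(
        **build_chat_request("Reply with the single word: ok"))
    return response.choices[0].message.content

# To run: export OPENAI_API_KEY, then call print(smoke_test())
```

If `smoke_test()` returns text without raising an authentication error, the key itself is valid, which narrows any remaining problem to the deployment rather than the OpenAI account.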

Check Credit Usage

After testing the PDF chat service, verify that credits were deducted according to the service pricing configuration.

Troubleshooting

If you encounter issues with the OpenAI-powered services, check the following:

  1. API Key Format: Ensure your API key starts with sk- and is properly formatted
  2. API Key Validity: Verify that your API key is active in the OpenAI dashboard
  3. Usage Limits: Check if you've hit any usage limits in your OpenAI account
  4. Lambda Logs: Check CloudWatch logs for any errors in the relevant Lambda functions
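For item 4, once you have pulled recent log lines from CloudWatch (for example with `aws logs tail` from the AWS CLI), a quick filter for error lines might look like this sketch (the marker strings cover common Lambda failure patterns but are not exhaustive):

```python
def find_error_lines(log_lines):
    """Return log lines that look like errors; Lambda runtime errors and
    Python tracebacks usually contain one of these markers."""
    markers = ("ERROR", "Task timed out", "Traceback")
    return [line for line in log_lines if any(m in line for m in markers)]

sample = [
    "START RequestId: abc-123",
    "[ERROR] AuthenticationError: invalid OpenAI API key",
    "END RequestId: abc-123",
]
# find_error_lines(sample) keeps only the [ERROR] line.
```

An `AuthenticationError` in the logs points back to items 1 and 2 above (a malformed or inactive key), while rate-limit errors point to item 3.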

Next Steps

With both AWS Bedrock and OpenAI API configured, your AI SaaS platform now has full access to all AI services. You can proceed to Verifying Your Deployment to test all services comprehensively.