OpenAI API Setup
Configure the OpenAI API for additional AI services in your platform
To enable advanced text and chat-based AI services in your platform, you need to configure OpenAI API access. This guide will walk you through obtaining an API key and adding it to your deployment.
Prerequisites
Before proceeding, ensure you have:
- Deployed your core infrastructure successfully
- Completed the AWS Bedrock setup (for image generation services)
Setting up OpenAI API Access
Create an OpenAI Account
- Visit OpenAI's website
- Click "Sign up" in the top right corner
- Create an account using your email, Google account, or Microsoft account
- Complete the verification process if required
If you already have an OpenAI account, you can simply log in and proceed to the next step.
Access the API Keys Section
- After logging in, click on your profile icon in the top right corner
- Select "API keys" from the dropdown menu
- You'll be taken to the API keys dashboard
If you can't see the API keys option, you may need to switch to the API platform by clicking "API" in the top navigation bar.
Generate a New API Key
- Click the "Create new secret key" button
- Give your key a descriptive name (e.g., "AI SaaS Platform")
- Click "Create secret key"
- Your new API key will be displayed - copy it immediately
OpenAI will only show your API key once. If you don't copy it now, you'll need to create a new one.
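Optionally, you can sanity-check the new key from your terminal before wiring it into the deployment; this call simply lists the models the key can access:

```bash
# Export the key first, or substitute it directly (keep it out of your shell history if possible).
curl https://api.openai.com/v1/models \
  -H "Authorization: Bearer $OPENAI_API_KEY"
```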
Add API Key to Environment Variables
- Open your .env file in the cdk directory
- Look for the OPENAI_API_KEY variable
- Set its value to your newly created API key, replacing the sk-your-api-key-here placeholder with your actual OpenAI API key (see the example below)
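For reference, the variable in your .env file takes the standard KEY=value form; the value below is only a placeholder:

```
OPENAI_API_KEY=sk-your-api-key-here
```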
Redeploy Your Infrastructure
Now that you've added your OpenAI API key, redeploy your infrastructure to apply the changes:
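If your project uses the standard AWS CDK CLI workflow, the redeploy typically looks like the following; adjust the command if your repository provides its own deploy script:

```bash
# Run from the cdk directory so the updated .env is picked up.
cd cdk
cdk deploy --all    # or: npx cdk deploy --all
```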
This deployment will update your Lambda functions with the OPENAI_API_KEY environment variable, allowing them to access OpenAI's models for various AI services.
Understanding OpenAI API Usage
The AI SaaS platform is configured to use GPT-4o mini by default for an optimal balance of performance and cost-efficiency. This model provides high-quality responses while keeping API costs reasonable.
Using the OpenAI API incurs costs based on your usage. Your AI SaaS platform manages these costs through the credits system, ensuring predictable expenses for both you and your users.
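As a rough sketch only (your platform's actual Lambda code will differ), a Node.js Lambda handler could read OPENAI_API_KEY from its environment and call the Chat Completions API as shown below; the handler shape and the gpt-4o-mini model id are assumptions for illustration:

```typescript
// Illustrative sketch, not the platform's actual Lambda code.
import OpenAI from "openai";

// The API key is injected as an environment variable by the CDK deployment.
const client = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

export const handler = async (event: { prompt: string }) => {
  // Model id is an assumption; your deployment's default may differ.
  const completion = await client.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [{ role: "user", content: event.prompt }],
  });
  return { answer: completion.choices[0].message.content };
};
```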
Verifying OpenAI Integration
After completing the deployment with your OpenAI API key, verify that the OpenAI-powered services are working correctly:
Access PDF Chat Service
- Navigate to your application URL
- Sign in to your account
- Go to the AI Services section
- Select the "Chat with PDF" service
Upload a PDF Document
- Click the "Upload" button in the PDF chat interface
- Select a PDF document from your computer
- Wait for the document to be processed
Processing time depends on the size and complexity of your PDF. Simple documents are typically processed within seconds.
Test PDF Chat Functionality
- Once the PDF is processed, ask a question about the content of the document
- For example: "What are the main points in this document?"
- Click "Send" or press Enter
- Verify that you receive a relevant response that references information from your PDF
A successful response indicates that your OpenAI API integration is working correctly. The service uses GPT-4o mini to analyze the document and generate responses based on its content.
Check Credit Usage
After testing the PDF chat service, verify that credits were deducted according to the service pricing configuration.
Troubleshooting
If you encounter issues with the OpenAI-powered services, check the following:
- API Key Format: Ensure your API key starts with sk- and is properly formatted
- API Key Validity: Verify that your API key is active in the OpenAI dashboard
- Usage Limits: Check if you've hit any usage limits in your OpenAI account
- Lambda Logs: Check CloudWatch logs for any errors in the relevant Lambda functions (see the example below)
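If you have the AWS CLI (v2) installed, you can tail a function's log group directly from your terminal; the log group name below is a placeholder for your actual function:

```bash
# Stream recent log events for the Lambda backing the PDF chat service.
aws logs tail /aws/lambda/your-pdf-chat-function --follow
```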
Next Steps
With both AWS Bedrock and OpenAI API configured, your AI SaaS platform now has full access to all AI services. You can proceed to Verifying Your Deployment to test all services comprehensively.