Configuring LLM Providers
|  | sfp-pro | sfp (community) |
| --- | --- | --- |
| Availability | ✅ | 🔶 |
| From | October 25 | December 25 |
| Features | PR Linter, Reports, Error Analysis | Reports Only |
This guide covers the setup and configuration of Large Language Model (LLM) providers for AI-powered features in sfp:
- AI-Powered PR Linter - sfp-pro only
- AI-Assisted Insight Reports - Available in both sfp-pro and community (alpha)
- AI-Assisted Error Analysis - Intelligent validation error analysis
These features require OpenCode CLI and an authenticated LLM provider.
OpenCode is currently supported only on macOS and Linux runtimes; it is not supported on Windows.
For sfp (community) users: These AI features are available in alpha. You can use npm install -g @flxbl-io/sfp@ai to access them, as shown below.
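For example, installing the alpha build and confirming the binary resolves (the --version flag is a common CLI convention and assumed here):

# Install the alpha channel of sfp (community)
npm install -g @flxbl-io/sfp@ai

# Confirm the CLI is on your PATH (flag assumed; check your sfp build)
sfp --version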
Prerequisites
OpenCode CLI Installation
OpenCode CLI is the underlying engine that manages AI interactions for sfp's AI-powered features. It handles provider authentication, model selection, and secure API communication.
Installation Methods
Global Installation (Recommended)
# Install OpenCode CLI globally via npm
npm install -g opencode-ai
# Verify installation
opencode --version
Alternative Installation Methods
# Using yarn
yarn global add opencode-ai
# Using pnpm
pnpm add -g opencode-ai
# Using Homebrew (macOS/Linux)
brew install opencode-ai
For more installation options and troubleshooting, see the OpenCode documentation.
Supported LLM Providers
sfp currently supports the following LLM providers through OpenCode:
| Provider | Status | Recommended | Best For |
| --- | --- | --- | --- |
| Anthropic (Claude) | ✅ Fully Supported | ⭐ Yes | Best overall performance, Flxbl framework understanding |
| OpenAI | ✅ Fully Supported | Yes | Wide model selection, good performance |
| Amazon Bedrock | ✅ Fully Supported | Yes | Enterprise environments with AWS infrastructure |
Provider Configuration
Anthropic (Claude) - Recommended
Anthropic's Claude models provide the best understanding of Salesforce and Flxbl framework patterns. The default model is claude-sonnet-4-20250514, which offers an optimal balance between performance and cost.
Setup Methods
Method 1: Interactive Authentication (Recommended)
# Authenticate with Anthropic
sfp ai auth --provider anthropic --auth
# This will prompt for your API key and store it securely
Method 2: Environment Variable
# Add to your shell profile (.bashrc, .zshrc, etc.)
export ANTHROPIC_API_KEY="sk-ant-xxxxxxxxxxxxx"
Method 3: Configuration File

Create or edit config/ai-assist.yaml:
enabled: true
provider: anthropic
# Model is optional - uses claude-sonnet-4-20250514 by default
Getting an Anthropic API Key
1. Visit console.anthropic.com
2. Sign up or log in to your account
3. Navigate to the API Keys section
4. Create a new API key for sfp usage
5. Copy the key (starts with sk-ant-)
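Once you have the key, you can sanity-check it outside of sfp with a minimal call to Anthropic's public Messages API (a sketch; the model id shown is this guide's default):

# Minimal key check against the Anthropic Messages API (sketch)
curl https://api.anthropic.com/v1/messages \
  -H "x-api-key: $ANTHROPIC_API_KEY" \
  -H "anthropic-version: 2023-06-01" \
  -H "content-type: application/json" \
  -d '{"model": "claude-sonnet-4-20250514", "max_tokens": 16, "messages": [{"role": "user", "content": "ping"}]}'
# A JSON message response (rather than a 401) confirms the key is valid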
OpenAI
OpenAI provides access to GPT models with good code analysis capabilities.
Setup Methods
Method 1: Interactive Authentication
sfp ai auth --provider openai --auth
Method 2: Environment Variable
export OPENAI_API_KEY="sk-xxxxxxxxxxxxx"
Method 3: Configuration File
# In config/ai-assist.yaml
enabled: true
provider: openai
# Model is optional - uses gpt-5 by default
Getting an OpenAI API Key
1. Visit platform.openai.com
2. Sign up or log in
3. Go to the API Keys section
4. Create a new secret key
5. Copy the key (starts with sk-)
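As with Anthropic, you can confirm the key authenticates before wiring it into sfp (a sketch against OpenAI's public /v1/models endpoint):

# List available models to confirm the key authenticates (sketch)
curl https://api.openai.com/v1/models \
  -H "Authorization: Bearer $OPENAI_API_KEY"
# A 200 response listing models confirms the key works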
Amazon Bedrock
Amazon Bedrock is ideal for enterprise environments already using AWS infrastructure. It provides access to Claude models through AWS.
Setup Methods
Method 1: Environment Variables
# Set both required environment variables
export AWS_BEARER_TOKEN_BEDROCK="your-bearer-token"
export AWS_REGION="us-east-1"
# Both variables must be set for authentication to work
Method 2: Interactive Authentication
# Authenticate with Amazon Bedrock
sfp ai auth --provider amazon-bedrock --auth
# This will prompt for both Bearer Token and Region
Method 3: Configuration File
# In config/ai-assist.yaml
enabled: true
provider: amazon-bedrock
model: anthropic.claude-sonnet-4-20250514-v1:0 # Default model
Important: AWS Bedrock requires both AWS_BEARER_TOKEN_BEDROCK and AWS_REGION environment variables to be set. Authentication will fail if either is missing.
Bedrock Model Access: Ensure your AWS account has access to the Claude models in Bedrock. You may need to request access through the AWS Console under Bedrock > Model access.
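To see which Claude models your account can actually invoke in a given region, the AWS CLI can help (this assumes AWS CLI v2 with Bedrock permissions; it is a sketch, not an sfp command):

# List Anthropic model ids visible to your account in the target region (sketch)
aws bedrock list-foundation-models \
  --region us-east-1 \
  --by-provider anthropic \
  --query 'modelSummaries[].modelId'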
Regional Considerations
Bedrock automatically handles model prefixes based on your AWS region:
- US Regions: Models may require the us. prefix
- EU Regions: Models may require the eu. prefix
- AP Regions: Models may require the apac. prefix

The OpenCode SDK handles this automatically based on your AWS_REGION.
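For illustration, the same default model would resolve to different region-prefixed ids (the exact ids below are assumptions that follow Bedrock's documented prefix pattern):

# AWS_REGION=us-east-1      -> us.anthropic.claude-sonnet-4-20250514-v1:0
# AWS_REGION=eu-central-1   -> eu.anthropic.claude-sonnet-4-20250514-v1:0
# AWS_REGION=ap-southeast-1 -> apac.anthropic.claude-sonnet-4-20250514-v1:0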
GitHub Copilot
GitHub Copilot can be used if you have an active subscription with model access enabled.
Setup
# Authenticate with GitHub Copilot
sfp ai auth --provider github-copilot --auth
# Ensure models are enabled in GitHub Settings:
# https://github.com/settings/copilot/features
Configuration File Reference
The AI features are configured through config/ai-assist.yaml in your project root:
# Enable/disable AI features
enabled: true
# Provider Configuration
provider: anthropic # anthropic, openai, amazon-bedrock, github-copilot
# Model Configuration (Optional - uses provider defaults if not specified)
# Default models:
# - anthropic: claude-sonnet-4-20250514
# - github-copilot: claude-sonnet-4
# - openai: gpt-5
# - amazon-bedrock: anthropic.claude-sonnet-4-20250514-v1:0
model: claude-sonnet-4-20250514 # Override default model
# Architectural Patterns to Check (for PR Linter)
patterns:
- singleton
- factory
- repository
- service-layer
# Architecture Principles
principles:
- separation-of-concerns
- single-responsibility
- dependency-inversion
# Focus Areas for Analysis
focusAreas:
- security
- performance
- maintainability
- testability
# Additional Context Files
contextFiles:
- ARCHITECTURE.md
- docs/patterns.md
- docs/coding-standards.md
Authentication Management
Checking Authentication Status
# Check all providers
sfp ai auth
# Check specific provider
sfp ai auth --provider anthropic
# List all supported providers
sfp ai auth --list
Testing Provider Inference
After configuring authentication, you can verify that providers are working correctly using the ai check command:
# Test all configured providers
sfp ai check
# Test specific provider with default model
sfp ai check --provider anthropic
# Test Amazon Bedrock (uses default: anthropic.claude-sonnet-4-20250514-v1:0)
sfp ai check --provider amazon-bedrock
# Test GitHub Copilot (uses default: claude-sonnet-4)
sfp ai check --provider github-copilot
This command performs a simple inference test to verify:
Authentication is configured correctly
The provider is accessible
Model inference is working
Response time and performance
Authentication Storage
Credentials are stored securely in ~/.sfp/ai-auth.json with appropriate file permissions. This file is created automatically when you authenticate.
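If you ever need to inspect or tighten those permissions yourself (sfp sets them automatically on creation; this is just a cautious sketch):

# Inspect the credential file's permissions
ls -la ~/.sfp/ai-auth.json
# Restrict it to your user only if it looks too permissive
chmod 600 ~/.sfp/ai-auth.json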
Rotating API Keys
# Re-authenticate to update stored credentials
sfp ai auth --provider anthropic --auth
# Or update environment variable
export ANTHROPIC_API_KEY="sk-ant-new-key-xxxxx"
Usage Priority
When multiple authentication methods are available, sfp uses the following priority:
1. Environment Variables - Highest priority, useful for CI/CD
2. Stored Credentials - From ~/.sfp/ai-auth.json
3. Configuration File - From config/ai-assist.yaml
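Because environment variables take precedence, CI pipelines can inject credentials without touching stored credentials or config files. A hypothetical GitHub Actions step (the secret name and workflow layout are assumptions):

# Hypothetical GitHub Actions step - secret name is an assumption
- name: Verify AI provider
  env:
    ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
  run: sfp ai check --provider anthropic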
Troubleshooting
OpenCode CLI Not Found
# Verify installation
which opencode
# If not found, reinstall
npm install -g opencode-ai
# Check the npm global bin path is on PATH
# (npm bin -g was removed in npm 9+; query the prefix instead)
npm config get prefix   # global binaries live in <prefix>/bin
Provider Not Available
# Check authentication
sfp ai auth --provider anthropic
# Verify environment variables
echo $ANTHROPIC_API_KEY
# For AWS Bedrock - check both required variables
echo $AWS_BEARER_TOKEN_BEDROCK
echo $AWS_REGION
# Check stored credentials exist
ls -la ~/.sfp/ai-auth.json
# Test provider inference
sfp ai check --provider <provider-name>
AWS Bedrock Specific Issues
Both Environment Variables Required
# This will NOT work (missing region)
export AWS_BEARER_TOKEN_BEDROCK="token"
# This will work (both variables set)
export AWS_BEARER_TOKEN_BEDROCK="token"
export AWS_REGION="us-east-1"
Authentication Failed
- Verify both AWS_BEARER_TOKEN_BEDROCK and AWS_REGION are set
- Check that your bearer token is valid and not expired
- Ensure your AWS account has access to Claude models in Bedrock
API Rate Limits
If you encounter rate limits:
Anthropic: Check your usage at console.anthropic.com
OpenAI: Monitor at platform.openai.com/usage
Bedrock: Check AWS CloudWatch metrics
Model Not Found
Ensure you're using the correct model identifier:
# Correct
model: claude-sonnet-4-20250514