Codegen offers flexibility in choosing the Large Language Model (LLM) that powers your agent, allowing you to select from various providers and specific models. You can also configure custom API keys and base URLs if you have specific arrangements or need to use self-hosted models.
LLM Configuration settings are applied globally for your entire organization. You can access and modify these settings by navigating to codegen.com/settings/model.
This central location ensures that all agents operating under your organization adhere to the selected LLM provider and model, unless specific per-repository or per-agent overrides are explicitly configured (if supported by your plan).
LLM Configuration UI at codegen.com/settings/model
As shown in the UI, you can generally configure the following:

- Provider: the LLM provider that serves your agents (e.g., Anthropic).
- Model: the specific model offered by that provider.
- API key (optional): your own provider API key, if your plan supports it.
- Base URL (optional): a custom endpoint, e.g., for self-hosted models or proxy/gateway services.
While Codegen provides access to a variety of models for experimentation and specific use cases, we strongly recommend Anthropic's Claude 3.7 Sonnet. Our internal testing and prompt engineering are heavily optimized for Claude 3.7, and it consistently delivers the best performance, reliability, and cost-effectiveness for the software engineering tasks Codegen agents undertake. Other models are made available primarily for users who are curious or have unique, pre-existing workflows.
For advanced users or those with specific enterprise agreements with LLM providers, Codegen may allow you to use your own API keys and, in some cases, custom base URLs (e.g., for Azure OpenAI deployments or other proxy/gateway services).
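To make the base-URL mechanism concrete, here is a minimal, hypothetical sketch of the general pattern such gateways rely on: an OpenAI-compatible endpoint whose host is swapped out via configuration. The environment variable names (`LLM_BASE_URL`, `LLM_API_KEY`) and the model name are illustrative only and do not reflect Codegen's internal configuration; the endpoint path and payload shape follow the common OpenAI chat-completions convention.

```python
import json
import os

# Illustrative only: the variable names below are assumptions, not
# Codegen settings. A proxy/gateway (or Azure OpenAI deployment) is
# selected simply by changing the base URL.
base_url = os.environ.get("LLM_BASE_URL", "https://api.openai.com/v1")
api_key = os.environ.get("LLM_API_KEY", "sk-placeholder")

# OpenAI-compatible chat-completions endpoint derived from the base URL.
endpoint = f"{base_url.rstrip('/')}/chat/completions"
headers = {
    "Authorization": f"Bearer {api_key}",
    "Content-Type": "application/json",
}
payload = {
    "model": "gpt-4o",  # whichever model your gateway actually serves
    "messages": [{"role": "user", "content": "Hello"}],
}

# No request is sent here; this only shows how the pieces fit together.
print(endpoint)
print(json.dumps(payload, indent=2))
```

With a self-hosted or proxied deployment, only `LLM_BASE_URL` changes; the request shape stays the same, which is why OpenAI-compatible gateways are a common integration point.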
Using the default Codegen-managed LLM configuration (especially with Claude 3.7 Sonnet) is recommended for most users to ensure optimal performance and to benefit from our continuous prompt improvements.
The availability of specific models, providers, and custom configuration options may vary based on your Codegen plan and the current platform capabilities.