TRAE Now Supports AI Gateway and One-Click Deployments
Source: Vercel News
ByteDance's coding agent TRAE now integrates both AI Gateway and direct Vercel deployments, bringing unified AI access and instant production shipping to over 1.6 million monthly active developers. Teams can now access hundreds of models through a single API key and deploy applications directly to Vercel from the TRAE interface.
AI Gateway provides unified access to models from Anthropic, OpenAI, Google, xAI, DeepSeek, Z.AI, MiniMax, Moonshot AI, and more without managing multiple provider accounts.
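Because every provider sits behind one gateway, a single API key is enough to reach any of these models. The sketch below shows what an OpenAI-compatible request to AI Gateway might look like; the endpoint URL and model slug are assumptions for illustration, so check the AI Gateway documentation for the exact values.

```typescript
// Minimal sketch of a chat request to AI Gateway's OpenAI-compatible endpoint.
// The base URL and model slugs are assumptions; consult the AI Gateway docs.
const GATEWAY_URL = "https://ai-gateway.vercel.sh/v1/chat/completions";

interface ChatRequest {
  url: string;
  headers: Record<string, string>;
  body: string;
}

// One gateway key covers every provider; no per-provider accounts needed.
function buildChatRequest(model: string, prompt: string, apiKey: string): ChatRequest {
  return {
    url: GATEWAY_URL,
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model, // e.g. "anthropic/claude-sonnet-4" or "openai/gpt-4o" (illustrative)
      messages: [{ role: "user", content: prompt }],
    }),
  };
}
```

The request can then be sent with `fetch(req.url, { method: "POST", headers: req.headers, body: req.body })`; only the `model` field changes between providers.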
The integration includes automatic failover that routes around provider outages, zero markup on AI tokens, and unified observability to monitor both deployments and AI usage. Meanwhile, the Vercel deployment integration handles authorization automatically and returns live URLs immediately after clicking Deploy.
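The gateway performs this failover server-side, but the idea can be sketched as trying a list of model slugs in order and falling through to the next when a provider is down. The helper below is an illustrative sketch, not the gateway's actual implementation:

```typescript
// Sketch of the failover idea: try models in order until one responds.
// AI Gateway does this automatically; this only illustrates the concept.
type Caller = (model: string) => Promise<string>;

async function withFailover(models: string[], call: Caller): Promise<string> {
  let lastError: unknown;
  for (const model of models) {
    try {
      return await call(model); // first healthy provider wins
    } catch (err) {
      lastError = err; // provider outage: fall through to the next model
    }
  }
  throw lastError; // every candidate failed
}
```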
SOLO Mode
Setting up Vercel deployment
In SOLO mode, click the + tab and choose Integrations to connect your Vercel account. When your project is ready, click Deploy in the chat panel to ship directly to production.
Once linked, any project can be deployed to Vercel immediately, and every deployment also appears in your Vercel dashboard.
Setting up AI Gateway
In Integrations, choose Vercel AI Gateway as your AI Service and add your API key from the Vercel AI Gateway dashboard. Select any model and start coding with automatic failover, low latency, and full observability.
IDE Mode
TRAE's IDE mode also supports AI Gateway as a model provider, giving access to the full range of available models alongside the same direct deployment capabilities.
Configuration
You can switch models with a single configuration change while maintaining unified billing through Vercel. This creates a complete development experience where teams write code with any AI model, then ship to production with one click from the same interface.
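In practice, switching models amounts to editing a single value, since the gateway key and billing stay the same regardless of provider. A small illustrative sketch (the model slugs are assumptions):

```typescript
// Sketch: one config value selects the model; the key and billing are shared.
const config = {
  model: "anthropic/claude-sonnet-4", // change this one line to switch providers
  apiKeyEnv: "AI_GATEWAY_API_KEY",    // same key, same unified bill, any model
};

// In gateway-style slugs the provider is the prefix before the slash,
// so no per-provider account or credential lookup is needed.
function providerOf(modelSlug: string): string {
  return modelSlug.split("/")[0];
}
```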
Get started with AI Gateway or explore the documentation to learn more.