Website | Documentation | YouTube | Discord
By Stacklok
CodeGate is a local gateway that makes AI agents and coding assistants safer. It ensures AI-generated recommendations adhere to best practices while safeguarding your code's integrity and protecting your privacy. With CodeGate, you can confidently leverage AI in your development workflow without sacrificing security or productivity.
AI coding assistants are powerful, but they can inadvertently introduce risks. CodeGate protects your development process by:
- 🔒 Preventing accidental exposure of secrets and sensitive data
- 🛡️ Ensuring AI suggestions follow secure coding practices
- ⚠️ Blocking recommendations of known malicious or deprecated libraries
- 🔍 Providing real-time security analysis of AI suggestions
CodeGate is distributed as a Docker container. You need a container runtime like Docker Desktop or Docker Engine. Podman and Podman Desktop are also supported. CodeGate works on Windows, macOS, and Linux operating systems with x86_64 and arm64 (ARM and Apple Silicon) CPU architectures.
These instructions assume the `docker` CLI is available. If you use Podman, replace `docker` with `podman` in all commands.
To start CodeGate, run this simple command:
```shell
docker run --name codegate -d -p 8989:8989 -p 9090:9090 -p 8990:8990 \
  --mount type=volume,src=codegate_volume,dst=/app/codegate_volume \
  --restart unless-stopped ghcr.io/stacklok/codegate:latest
```
That’s it! CodeGate is now running locally.
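If you use Podman instead of Docker, the quickstart is the same command with the substitution noted above. A sketch, assuming your Podman machine or service is already running:

```shell
# Identical to the Docker quickstart, with `podman` substituted for `docker`
podman run --name codegate -d -p 8989:8989 -p 9090:9090 -p 8990:8990 \
  --mount type=volume,src=codegate_volume,dst=/app/codegate_volume \
  --restart unless-stopped ghcr.io/stacklok/codegate:latest
```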
Now it's time to configure your preferred AI coding assistant to use CodeGate. See the supported AI coding assistants and providers below.
⚙️ For advanced configurations and parameter references, check out the CodeGate Install and Upgrade documentation.
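The quickstart command stores CodeGate's data in the `codegate_volume` named volume, so it survives container restarts and image upgrades. As a quick sanity check that the volume was created (substitute `podman` if that's your runtime):

```shell
# Show the volume's metadata, including where the runtime keeps it on disk
docker volume inspect codegate_volume
```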
CodeGate includes a web dashboard that provides:
- A view of security risks detected by CodeGate
- A history of interactions between your AI coding assistant and your LLM
Open http://localhost:9090 in your web browser to access the dashboard.
To learn more, visit the CodeGate Dashboard documentation.
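To confirm from the command line that the container is up and the dashboard is reachable, a sketch using standard Docker and curl commands:

```shell
# Check that the codegate container is running
docker ps --filter name=codegate

# The dashboard listens on port 9090; this prints the HTTP status line
curl -sI http://localhost:9090 | head -n 1
```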
CodeGate helps you protect sensitive information from being accidentally exposed to AI models and third-party AI provider systems by redacting detected secrets from your prompts using encryption. Learn more
LLMs’ knowledge cutoff date is often months or even years in the past. They might suggest outdated, vulnerable, or non-existent packages (hallucinations), exposing you and your users to security risks.
CodeGate scans direct, transitive, and development dependencies in your package definition files, installation scripts, and source code imports that you supply as context to an LLM. Learn more
CodeGate performs security-centric code reviews, identifying insecure patterns or potential vulnerabilities to help you adopt more secure coding practices. Learn more
- Local / self-managed:
- Ollama
- Hosted:
- OpenAI and compatible APIs
🔥 Getting started with CodeGate and aider - watch on YouTube
- Local / self-managed:
- Ollama
- LM Studio
- Hosted:
- Anthropic
- OpenAI and compatible APIs
- Local / self-managed:
- Ollama
- llama.cpp
- vLLM
- Hosted:
- Anthropic
- OpenAI and compatible APIs
- The Copilot plugin works with Visual Studio Code (VS Code) (JetBrains is coming soon!)
Unlike other tools, with CodeGate your code never leaves your machine. CodeGate is built with privacy at its core:
- 🏠 Everything stays local
- 🚫 No external data collection
- 🔐 No calling home or telemetry
- 💪 Complete control over your data
Are you a developer looking to contribute? Dive into our technical resources:
CodeGate is licensed under the terms specified in the LICENSE file.
Love CodeGate? Starring this repository and sharing it with others helps CodeGate grow 🌱
We welcome contributions! Whether you're submitting bug reports, feature requests, or code contributions, your input makes CodeGate better for everyone. We thank you ❤️!
Start by reading our Contributor guidelines.
Made with contrib.rocks.