mirror of
https://github.com/k8sgpt-ai/k8sgpt.git
synced 2026-05-03 18:02:43 +00:00
# Integrations
k8sgpt integrates with a variety of cloud native tools, platforms, and services.
## CNCF Project Integrations
| Project | Integration Type | Description |
|---|---|---|
| Prometheus | Exporter / Metrics | The k8sgpt operator can export analysis results to Prometheus for monitoring |
| Prometheus Operator | Operator Integration | Integration with Prometheus Operator for service discovery and alerting |
| Alertmanager | Alert Integration | Send k8sgpt analysis alerts to Alertmanager |
| OpenTelemetry | Observability | Export analysis metrics and traces via OpenTelemetry |
| Grafana | Dashboard | Visualize k8sgpt analysis results in Grafana dashboards |
| Kubernetes | Core Platform | Native Kubernetes resource analysis and diagnostics |
| Helm | Packaging | k8sgpt available as a Helm chart in the charts repository |
| Krew | Plugin Distribution | k8sgpt distributed as a Krew kubectl plugin via .krew.yaml |
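As a sketch of how the operator-side integrations above are wired together, a `K8sGPT` custom resource for the k8sgpt-operator might look like the following. Field names follow the operator's `core.k8sgpt.ai/v1alpha1` API; the namespace, model, secret names, and version below are illustrative placeholders, not recommendations:

```yaml
# Illustrative K8sGPT custom resource for the k8sgpt-operator.
# All concrete values (namespace, model, secret, version) are placeholders.
apiVersion: core.k8sgpt.ai/v1alpha1
kind: K8sGPT
metadata:
  name: k8sgpt-sample
  namespace: k8sgpt-operator-system
spec:
  ai:
    enabled: true
    backend: openai          # any backend name from the table below
    model: gpt-3.5-turbo     # placeholder model name
    secret:
      name: k8sgpt-sample-secret
      key: openai-api-key
  version: v0.3.8            # placeholder k8sgpt version
```

Once the operator reconciles this resource, analysis results become available for scraping by Prometheus and visualization in Grafana.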
## AI/LLM Provider Integrations
| Provider | Backend Name | Description |
|---|---|---|
| OpenAI | openai | Default provider; supports GPT-3.5, GPT-4, and other OpenAI models |
| Azure OpenAI | azureopenai | Azure-hosted OpenAI models |
| Cohere | cohere | Cohere's Command models |
| Amazon Bedrock | amazonbedrock | AWS Bedrock; supports Claude, Llama, Titan, and more |
| Amazon SageMaker | amazonsagemaker | AWS SageMaker JumpStart models |
| Google Gemini | google | Google's Gemini models |
| Google Vertex AI | googlevertexai | Google Cloud Vertex AI models |
| Ollama | ollama | Local LLM inference with Ollama |
| LocalAI | localai | Self-hosted OpenAI-compatible API |
| Hugging Face | huggingface | Hugging Face Inference API |
| IBM WatsonX | watsonxai | IBM WatsonX AI models |
| IBM WatsonxAI | ibmwatsonxai | IBM WatsonxAI-specific integration |
| Custom REST | customrest | Any REST API that follows the OpenAI chat-completion format |
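Backends from the table above are configured with the `k8sgpt auth` subcommands. A minimal sketch is shown below; the backend, model, and URL values are illustrative, and flag names may vary between versions, so consult `k8sgpt auth --help` for your installed release:

```shell
# Register the OpenAI backend (prompts for an API key where one is required).
k8sgpt auth add --backend openai --model gpt-3.5-turbo

# Point at a local Ollama server instead (the URL is a placeholder).
k8sgpt auth add --backend ollama --baseurl http://localhost:11434/v1

# Run an analysis with AI explanations using a specific backend.
k8sgpt analyze --explain --backend openai
```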
## Other Tool Integrations
| Tool | Integration Type | Description |
|---|---|---|
| Claude Desktop | MCP Server | k8sgpt MCP server integrates with Claude Desktop for AI-assisted cluster analysis |
| Docker | Container | Container image available on GitHub Container Registry |
| Minikube | Development | Works with Minikube clusters for development and testing |
| Kubeblocks | Database Analysis | Analyzer support for KubeBlocks-managed databases |
| OpenShift | Platform | Analysis support for OpenShift-specific resources (CatalogSource, ClusterCatalog, etc.) |
| FluxCD | GitOps | Compatible with GitOps workflows using FluxCD |
| ArgoCD | GitOps | Compatible with GitOps workflows using ArgoCD |
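For the Claude Desktop integration, MCP servers are registered in Claude Desktop's `claude_desktop_config.json`. The sketch below assumes `k8sgpt` is on the `PATH`; the exact subcommand and flags used to start the MCP server are an assumption here, so refer to MCP.md for the supported invocation:

```json
{
  "mcpServers": {
    "k8sgpt": {
      "command": "k8sgpt",
      "args": ["serve", "--mcp"]
    }
  }
}
```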
## Remote Caching
k8sgpt supports remote caching of analysis results:
| Provider | Type | Description |
|---|---|---|
| AWS S3 | Object Storage | Store analysis cache in AWS S3 buckets |
| Azure Blob | Object Storage | Store analysis cache in Azure Blob Storage |
| Google Cloud Storage | Object Storage | Store analysis cache in GCS buckets |
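Remote caching is managed through the `k8sgpt cache` subcommands. A sketch assuming an existing S3 bucket follows; the bucket name and region are placeholders, and `k8sgpt cache --help` lists the flags for the other providers:

```shell
# Store the analysis cache in an S3 bucket (placeholder bucket and region).
k8sgpt cache add s3 --bucket my-k8sgpt-cache --region eu-west-1

# Revert to the local file-based cache.
k8sgpt cache remove
```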
## How to Integrate
For custom analyzer integrations, see the Custom Analyzers documentation and the custom analyzer schema.
For MCP server integration, see MCP.md.