## serve

The `serve` command allows you to run k8sgpt in a gRPC server mode. This is typically enabled through `k8sgpt serve` and is how the in-cluster k8sgpt deployment functions when managed by the k8sgpt-operator.
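
For local development the server can also be started directly. A minimal sketch, assuming the default port of 8080 used in the examples below (check `k8sgpt serve --help` for the flags your version supports):

```sh
# start the gRPC server locally; the examples below assume it listens on localhost:8080
k8sgpt serve --port 8080
```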

The gRPC interface that is served is hosted on buf, and the repository for this is here.

## grpcurl

A fantastic tool for local debugging and development is `grpcurl`. It allows you to form curl-like requests over HTTP/2, e.g.

```sh
grpcurl -plaintext -d '{"namespace": "k8sgpt", "explain": "true"}' localhost:8080 schema.v1.ServerService/Analyze
```
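
If the server registers gRPC reflection (an assumption here), grpcurl can also discover the exposed services and methods without any schema files; `list` and `describe` are standard grpcurl subcommands:

```sh
# list all services exposed via reflection
grpcurl -plaintext localhost:8080 list

# show the methods and message types of the ServerService
grpcurl -plaintext localhost:8080 describe schema.v1.ServerService
```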

```sh
grpcurl -plaintext localhost:8080 schema.v1.ServerService/ListIntegrations
{
  "integrations": [
    "trivy"
  ]
}
```
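
Because the responses are plain JSON, they compose well with standard shell tooling; for example, the integration names above can be pulled out with jq (the jq step is illustrative, not part of k8sgpt):

```sh
# print each integration name on its own line
grpcurl -plaintext localhost:8080 schema.v1.ServerService/ListIntegrations | jq -r '.integrations[]'
```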

```sh
grpcurl -plaintext -d '{"integrations":{"trivy":{"enabled":"true","namespace":"default","skipInstall":"false"}}}' localhost:8080 schema.v1.ServerService/AddConfig
```
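
Deactivation goes through the same config API. A plausible sketch is to resend the AddConfig call with `enabled` flipped to `"false"`; whether this is the supported deactivation path (rather than a dedicated remove RPC) depends on the schema, so verify against the repository on buf:

```sh
# deactivate the trivy integration again (using AddConfig with enabled=false is an assumption)
grpcurl -plaintext -d '{"integrations":{"trivy":{"enabled":"false","namespace":"default","skipInstall":"false"}}}' localhost:8080 schema.v1.ServerService/AddConfig
```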