kubeshark/agent/pkg/utils/har.go
M. Mert Yıldıran d2fe3f6620
Migrate from SQLite to Basenine and introduce a new filtering syntax (#279)
* Fix the OOMKilled error by calling `debug.FreeOSMemory` periodically

* Remove `MAX_NUMBER_OF_GOROUTINES` environment variable

* Change the line

* Increase the default value of `TCP_STREAM_CHANNEL_TIMEOUT_MS` to `10000`

* Write the client and integrate to the new real-time database

* Refactor the WebSocket implementation for `/ws`

* Adapt the UI to the new filtering system

* Fix the rest of the issues in the UI

* Increase the buffer of the scanner

* Implement accessing single records

* Increase the buffer of another scanner

* Populate `Request` and `Response` fields of `MizuEntry`

* Add syntax highlighting for the query

* Add database to `Dockerfile`

* Fix some issues

* Update the `realtime_dbms` Git module commit hash

* Upgrade Gin version and print the query string

* Revert "Upgrade Gin version and print the query string"

This reverts commit aa09f904ee.

* Use the WebSocket itself to query instead of the query string

* Fix some errors related to conversion to HAR

* Fix the issues caused by the latest merge

* Fix the build error

* Fix PR validation GitHub workflow

* Replace the git submodule with latest Basenine version `0.1.0`

Remove `realtime_client.go` and use the official client library `github.com/up9inc/basenine/client/go` instead.

* Move Basenine host and port constants to `shared` module

* Reliably execute and wait for Basenine to become available

* Upgrade Basenine version

* Properly close WebSocket and data channel

* Fix the issues caused by the recent merge commit

* Clean up the TypeScript code

* Update `.gitignore`

* Limit the database size

* Add `Macros` method signature to `Dissector` interface and set the macros provided by the protocol extensions

* Run `go mod tidy` on `agent`

* Upgrade `github.com/up9inc/basenine/client/go` version

* Implement a mechanism to update the query using click events in the UI and use it for protocol macros

* Update the query on click to timestamps

* Fix some issues in the WebSocket and channel handling

* Update the query on clicks to status code

* Update the query on clicks to method, path and service

* Update the query on clicks to is outgoing, source and destination ports

* Add an API endpoint to validate the query against syntax errors

* Move the query background color state into `TrafficPage`

* Fix the logic in `setQuery`

* Display a toast message in case of a syntax error in the query

* Remove a call to `fmt.Printf`

* Upgrade Basenine version to `0.1.3`

* Fix an issue related to getting `MAX_ENTRIES_DB_BYTES` environment variable

* Include the `path` key in the request details for HTTP

* Rearrange the HTTP headers for the querying

* Do the same thing for `cookies` and `queryString`

* Update the query on click to table elements

Add the selectors for `TABLE` type representations in HTTP extension.

* Update the query on click to `bodySize` and `elapsedTime` in `EntryTitle`

* Add the selectors for `TABLE` type representations in AMQP extension

* Add the selectors for `TABLE` type representations in Kafka extension

* Add the selectors for `TABLE` type representations in Redis extension

* Define a struct in `tap/api.go` for the section representation data

* Add the selectors for `BODY` type representations

* Add `request.path` to the HTTP request details

* Change the summary string's field name from `path` to `summary`

* Introduce `queryable` CSS class for queryable UI elements and underline them on hover

* Instead of `N requests` at the bottom, make it `Displaying N results (queried X/Y)` and live update the values

Upgrade Basenine version to `0.2.0`.

* Verify the sha256sum of Basenine executable inside `Dockerfile`

* Pass the start time to web UI through WebSocket and always show the `EntriesList` footer

* Pipe the `stderr` of Basenine as well

* Fix the layout issues related to `CodeEditor` in the UI

* Use the correct `shasum` command in `Dockerfile`

* Upgrade Basenine version to `0.2.1`

* Limit the height of `CodeEditor` container

* Remove the `Paused` value from the `ConnectionStatus` enum in the UI

* Fix the issue caused by the recent merge

* Add the filtering guide (cheatsheet)

* Update open cheatsheet button's title

* Update cheatsheet content

* Remove the old SQLite code, adapt the `--analyze` related code to Basenine

* Change the method signature of `NewEntry`

* Change the method signature of `Represent`

* Introduce `HTTPPair` field in `MizuEntry` specific to HTTP

* Remove `Entry`, `EntryId` and `EstimatedSizeBytes` fields from `MizuEntry`

Also remove the `getEstimatedEntrySizeBytes` method.

* Remove `gorm.io/gorm` dependency

* Remove unused `sensitiveDataFiltering` folder

* Increase the left margin of open cheatsheet button

* Add `overflow: auto` to the cheatsheet `Modal`

* Fix `GetEntry` method

* Fix the macro for gRPC

* Fix an interface conversion in case of AMQP

* Fix two more interface conversion errors in AMQP

* Make the `syncEntriesImpl` method blocking

* Fix a grammar mistake in the cheatsheet

* Adapt to the changes in the recent merge commit

* Improve the cheatsheet text

* Always display the timestamp in `en-US`

* Upgrade Basenine version to `0.2.2`

* Fix the order of closing Basenine connections and channels

* Don't close the Basenine channels at all

* Upgrade Basenine version to `0.2.3`

* Set the initial filter to `rlimit(100)`

* Make Basenine persistent

* Upgrade Basenine version to `0.2.4`

* Update `debug.Dockerfile`

* Fix a failing test

* Upgrade Basenine version to `0.2.5`

* Revert "Do not show play icon when disconnected (#428)"

This reverts commit 8af2e562f8.

* Upgrade Basenine version to `0.2.6`

* Make all non-informative things informative

* Make `100` a constant

* Use `===` in JavaScript no matter what

* Remove a forgotten `console.log`

* Add a comment and update the `query` in `syncEntriesImpl`

* Don't call `panic` in `GetEntry`

* Replace `panic` calls in `startBasenineServer` with `logger.Log.Panicf`

* Remove unnecessary `\n` characters in the logs
2021-11-09 19:54:48 +03:00

package utils

import (
    "bytes"
    "errors"
    "fmt"
    "strconv"
    "strings"
    "time"

    "github.com/google/martian/har"
    "github.com/up9inc/mizu/shared/logger"
)

// Keep it because we might want cookies in the future
//func BuildCookies(rawCookies []interface{}) []har.Cookie {
//    cookies := make([]har.Cookie, 0, len(rawCookies))
//
//    for _, cookie := range rawCookies {
//        c := cookie.(map[string]interface{})
//        expiresStr := ""
//        if c["expires"] != nil {
//            expiresStr = c["expires"].(string)
//        }
//        expires, _ := time.Parse(time.RFC3339, expiresStr)
//        httpOnly := false
//        if c["httponly"] != nil {
//            httpOnly, _ = strconv.ParseBool(c["httponly"].(string))
//        }
//        secure := false
//        if c["secure"] != nil {
//            secure, _ = strconv.ParseBool(c["secure"].(string))
//        }
//        path := ""
//        if c["path"] != nil {
//            path = c["path"].(string)
//        }
//        domain := ""
//        if c["domain"] != nil {
//            domain = c["domain"].(string)
//        }
//
//        cookies = append(cookies, har.Cookie{
//            Name:        c["name"].(string),
//            Value:       c["value"].(string),
//            Path:        path,
//            Domain:      domain,
//            HTTPOnly:    httpOnly,
//            Secure:      secure,
//            Expires:     expires,
//            Expires8601: expiresStr,
//        })
//    }
//
//    return cookies
//}

// BuildHeaders converts the raw header slice into har.Header values and also returns
// the Host header together with the ":scheme", ":authority", ":path" and ":status"
// pseudo-header values, which are needed to rebuild HTTP/2 and gRPC requests.
func BuildHeaders(rawHeaders []interface{}) ([]har.Header, string, string, string, string, string) {
    var host, scheme, authority, path, status string
    headers := make([]har.Header, 0, len(rawHeaders))

    for _, header := range rawHeaders {
        h := header.(map[string]interface{})

        headers = append(headers, har.Header{
            Name:  h["name"].(string),
            Value: h["value"].(string),
        })

        if h["name"] == "Host" {
            host = h["value"].(string)
        }
        if h["name"] == ":authority" {
            authority = h["value"].(string)
        }
        if h["name"] == ":scheme" {
            scheme = h["value"].(string)
        }
        if h["name"] == ":path" {
            path = h["value"].(string)
        }
        if h["name"] == ":status" {
            status = h["value"].(string)
        }
    }

    return headers, host, scheme, authority, path, status
}
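
// exampleBuildHeadersUsage is an illustrative sketch, not part of the original file.
// It shows how BuildHeaders turns the raw "_headers" slice into har.Header values
// while also surfacing the HTTP/1.x Host header and the HTTP/2 pseudo-headers used
// elsewhere in this package. The input values are hypothetical.
func exampleBuildHeadersUsage() {
    rawHeaders := []interface{}{
        map[string]interface{}{"name": "Host", "value": "example.com"},
        map[string]interface{}{"name": ":path", "value": "/v1/items"},
    }
    headers, host, _, _, path, _ := BuildHeaders(rawHeaders)
    _ = headers // two har.Header entries
    _ = host    // "example.com"
    _ = path    // "/v1/items"
}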

// BuildPostParams converts the raw POST parameter slice into har.Param values,
// tolerating missing fields by falling back to empty strings.
func BuildPostParams(rawParams []interface{}) []har.Param {
    params := make([]har.Param, 0, len(rawParams))
    for _, param := range rawParams {
        p := param.(map[string]interface{})
        name := ""
        if p["name"] != nil {
            name = p["name"].(string)
        }
        value := ""
        if p["value"] != nil {
            value = p["value"].(string)
        }
        fileName := ""
        if p["fileName"] != nil {
            fileName = p["fileName"].(string)
        }
        contentType := ""
        if p["contentType"] != nil {
            contentType = p["contentType"].(string)
        }

        params = append(params, har.Param{
            Name:        name,
            Value:       value,
            Filename:    fileName,
            ContentType: contentType,
        })
    }
    return params
}

// NewRequest builds a har.Request from the dissected request map. For gRPC requests
// (mime type "application/grpc") the URL is reconstructed from the ":scheme",
// ":authority" and ":path" pseudo-headers instead of the Host header.
func NewRequest(request map[string]interface{}) (harRequest *har.Request, err error) {
    headers, host, scheme, authority, path, _ := BuildHeaders(request["_headers"].([]interface{}))
    cookies := make([]har.Cookie, 0) // BuildCookies(request["_cookies"].([]interface{}))

    postData, _ := request["postData"].(map[string]interface{})
    mimeType := postData["mimeType"]
    if mimeType == nil || len(mimeType.(string)) == 0 {
        mimeType = "text/html"
    }
    text := postData["text"]
    postDataText := ""
    if text != nil {
        postDataText = text.(string)
    }

    queryString := make([]har.QueryString, 0)
    for _, _qs := range request["_queryString"].([]interface{}) {
        qs := _qs.(map[string]interface{})
        queryString = append(queryString, har.QueryString{
            Name:  qs["name"].(string),
            Value: qs["value"].(string),
        })
    }

    url := fmt.Sprintf("http://%s%s", host, request["url"].(string))
    if strings.HasPrefix(mimeType.(string), "application/grpc") {
        url = fmt.Sprintf("%s://%s%s", scheme, authority, path)
    }

    harParams := make([]har.Param, 0)
    if postData["params"] != nil {
        harParams = BuildPostParams(postData["params"].([]interface{}))
    }

    harRequest = &har.Request{
        Method:      request["method"].(string),
        URL:         url,
        HTTPVersion: request["httpVersion"].(string),
        HeadersSize: -1,
        BodySize:    int64(bytes.NewBufferString(postDataText).Len()),
        QueryString: queryString,
        Headers:     headers,
        Cookies:     cookies,
        PostData: &har.PostData{
            MimeType: mimeType.(string),
            Params:   harParams,
            Text:     postDataText,
        },
    }

    return
}
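
// exampleGRPCRequestURL is an illustrative sketch, not part of the original file.
// It demonstrates the gRPC branch of NewRequest: when the post data mime type starts
// with "application/grpc", the URL is rebuilt from the ":scheme", ":authority" and
// ":path" pseudo-headers instead of the Host header. All values are hypothetical.
func exampleGRPCRequestURL() {
    request := map[string]interface{}{
        "method":      "POST",
        "url":         "/package.Service/Method",
        "httpVersion": "HTTP/2.0",
        "_headers": []interface{}{
            map[string]interface{}{"name": ":authority", "value": "grpc.example.com:443"},
            map[string]interface{}{"name": ":scheme", "value": "https"},
            map[string]interface{}{"name": ":path", "value": "/package.Service/Method"},
        },
        "_queryString": []interface{}{},
        "postData":     map[string]interface{}{"mimeType": "application/grpc"},
    }
    harRequest, _ := NewRequest(request)
    _ = harRequest // harRequest.URL == "https://grpc.example.com:443/package.Service/Method"
}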

// NewResponse builds a har.Response from the dissected response map. For gRPC
// responses the status code is taken from the ":status" pseudo-header.
func NewResponse(response map[string]interface{}) (harResponse *har.Response, err error) {
    headers, _, _, _, _, _status := BuildHeaders(response["_headers"].([]interface{}))
    cookies := make([]har.Cookie, 0) // BuildCookies(response["_cookies"].([]interface{}))

    content, _ := response["content"].(map[string]interface{})
    mimeType := content["mimeType"]
    if mimeType == nil || len(mimeType.(string)) == 0 {
        mimeType = "text/html"
    }

    // Use a checked assertion so a missing or non-string "encoding" value
    // falls back to an empty string instead of panicking.
    encoding, _ := content["encoding"].(string)

    text := content["text"]
    bodyText := ""
    if text != nil {
        bodyText = text.(string)
    }

    harContent := &har.Content{
        Encoding: encoding,
        MimeType: mimeType.(string),
        Text:     []byte(bodyText),
        Size:     int64(len(bodyText)),
    }

    status := int(response["status"].(float64))
    if strings.HasPrefix(mimeType.(string), "application/grpc") {
        if _status != "" {
            status, err = strconv.Atoi(_status)
        }
        if err != nil {
            logger.Log.Errorf("Failed converting status to int %s (%v,%+v)", err, err, err)
            return nil, errors.New("failed converting response status to int for HAR")
        }
    }

    harResponse = &har.Response{
        HTTPVersion: response["httpVersion"].(string),
        Status:      status,
        StatusText:  response["statusText"].(string),
        HeadersSize: -1,
        BodySize:    int64(bytes.NewBufferString(bodyText).Len()),
        Headers:     headers,
        Cookies:     cookies,
        Content:     harContent,
    }

    return
}

// NewEntry builds a har.Entry from the dissected request/response pair, the capture
// start time and the elapsed time used for the HAR timing fields.
func NewEntry(request map[string]interface{}, response map[string]interface{}, startTime time.Time, elapsedTime int64) (*har.Entry, error) {
    harRequest, err := NewRequest(request)
    if err != nil {
        logger.Log.Errorf("Failed converting request to HAR %s (%v,%+v)", err, err, err)
        return nil, errors.New("failed converting request to HAR")
    }

    harResponse, err := NewResponse(response)
    if err != nil {
        logger.Log.Errorf("Failed converting response to HAR %s (%v,%+v)", err, err, err)
        return nil, errors.New("failed converting response to HAR")
    }

    // Avoid zero or negative durations in the HAR entry.
    if elapsedTime < 1 {
        elapsedTime = 1
    }

    harEntry := har.Entry{
        StartedDateTime: startTime,
        Time:            elapsedTime,
        Request:         harRequest,
        Response:        harResponse,
        Cache:           &har.Cache{},
        Timings: &har.Timings{
            Send:    -1,
            Wait:    -1,
            Receive: elapsedTime,
        },
    }

    return &harEntry, nil
}
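
// exampleNewEntryUsage is an illustrative sketch, not part of the original file.
// It shows a hypothetical call to NewEntry with hand-built request/response maps
// shaped like the dissector output this package expects; the field names mirror
// the lookups above ("_headers", "_queryString", "url", "method", "httpVersion",
// "status", "statusText", "content").
func exampleNewEntryUsage() (*har.Entry, error) {
    request := map[string]interface{}{
        "method":       "GET",
        "url":          "/health",
        "httpVersion":  "HTTP/1.1",
        "_headers":     []interface{}{map[string]interface{}{"name": "Host", "value": "example.com"}},
        "_queryString": []interface{}{},
    }
    response := map[string]interface{}{
        "status":      float64(200),
        "statusText":  "OK",
        "httpVersion": "HTTP/1.1",
        "_headers":    []interface{}{},
        "content":     map[string]interface{}{"mimeType": "text/plain", "encoding": "", "text": "ok"},
    }
    // NewEntry clamps elapsed time values below 1 before filling the HAR timings.
    return NewEntry(request, response, time.Now(), 12)
}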