Migrate from SQLite to Basenine and introduce a new filtering syntax (#279)

* Fix the OOMKilled error by calling `debug.FreeOSMemory` periodically
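
A minimal sketch of the idea (the goroutine shape and the 10-second interval are assumptions, not the values from the commit):

```go
package main

import (
	"runtime/debug"
	"time"
)

// relieveMemoryPressure periodically returns freed heap pages to the OS so
// the agent's RSS stays under the container memory limit.
func relieveMemoryPressure(interval time.Duration) {
	go func() {
		for range time.Tick(interval) {
			debug.FreeOSMemory()
		}
	}()
}

func main() {
	relieveMemoryPressure(10 * time.Second) // illustrative interval
	select {}                               // keep the process alive for the sketch
}
```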

* Remove `MAX_NUMBER_OF_GOROUTINES` environment variable

* Change the line

* Increase the default value of `TCP_STREAM_CHANNEL_TIMEOUT_MS` to `10000`

* Write the client and integrate it with the new real-time database

* Refactor the WebSocket implementation for `/ws`

* Adapt the UI to the new filtering system

* Fix the rest of the issues in the UI

* Increase the buffer of the scanner
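
For reference, growing a `bufio.Scanner` buffer in Go looks like the sketch below (the 1 MB cap is illustrative, not the value from the commit):

```go
package main

import (
	"bufio"
	"fmt"
	"os"
)

func main() {
	scanner := bufio.NewScanner(os.Stdin)
	// The default 64 KB token limit is too small for large entries;
	// start with a bigger initial buffer and raise the maximum token size.
	scanner.Buffer(make([]byte, 0, 64*1024), 1024*1024)
	for scanner.Scan() {
		fmt.Println(len(scanner.Bytes()))
	}
}
```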

* Implement access to single records

* Increase the buffer of another scanner

* Populate `Request` and `Response` fields of `MizuEntry`

* Add syntax highlighting for the query

* Add database to `Dockerfile`

* Fix some issues

* Update the `realtime_dbms` Git module commit hash

* Upgrade Gin version and print the query string

* Revert "Upgrade Gin version and print the query string"

This reverts commit aa09f904ee.

* Use the WebSocket itself to query instead of the query string

* Fix some errors related to conversion to HAR

* Fix the issues caused by the latest merge

* Fix the build error

* Fix PR validation GitHub workflow

* Replace the git submodule with the latest Basenine version `0.1.0`

Remove `realtime_client.go` and use the official client library `github.com/up9inc/basenine/client/go` instead.

* Move Basenine host and port constants to `shared` module

* Reliably execute and wait for Basenine to become available
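
One way to implement such a wait is a bounded TCP dial loop; the helper below is a sketch under that assumption, not the code from the commit:

```go
package shared

import (
	"fmt"
	"net"
	"time"
)

// waitForBasenine polls the address until a TCP connection succeeds or the
// deadline passes. Hypothetical helper, shown for illustration only.
func waitForBasenine(host, port string, timeout time.Duration) error {
	deadline := time.Now().Add(timeout)
	for time.Now().Before(deadline) {
		if conn, err := net.DialTimeout("tcp", net.JoinHostPort(host, port), time.Second); err == nil {
			conn.Close()
			return nil
		}
		time.Sleep(100 * time.Millisecond)
	}
	return fmt.Errorf("basenine at %s:%s did not become available within %s", host, port, timeout)
}
```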

* Upgrade Basenine version

* Properly close WebSocket and data channel

* Fix the issues caused by the recent merge commit

* Clean up the TypeScript code

* Update `.gitignore`

* Limit the database size

* Add `Macros` method signature to `Dissector` interface and set the macros provided by the protocol extensions
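
In interface form, the addition looks roughly like this (other methods elided; the Kafka implementation appears in the diff at the bottom of this page):

```go
package api

// Dissector is implemented by every protocol extension. The new Macros
// method lets each extension register query macros, e.g. the Kafka
// extension maps `kafka` to `proto.abbr == "KAFKA"`.
type Dissector interface {
	// ...existing methods such as Dissect, Analyze, and Represent...
	Macros() map[string]string
}
```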

* Run `go mod tidy` on `agent`

* Upgrade `github.com/up9inc/basenine/client/go` version

* Implement a mechanism to update the query using click events in the UI and use it for protocol macros

* Update the query on clicking timestamps

* Fix some issues in the WebSocket and channel handling

* Update the query on clicking the status code

* Update the query on clicking the method, path, and service

* Update the query on clicking the is-outgoing indicator and the source and destination ports

* Add an API endpoint to validate the query against syntax errors
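
A sketch of what such an endpoint can look like with Gin (the route, handler shape, and `validateQuery` helper are assumptions, not the actual code):

```go
package main

import (
	"net/http"

	"github.com/gin-gonic/gin"
)

// validateQuery stands in for the real syntax check done by the filtering
// engine; hypothetical helper for illustration only.
func validateQuery(query string) error { return nil }

func main() {
	r := gin.Default()
	r.GET("/query/validate", func(c *gin.Context) {
		if err := validateQuery(c.Query("q")); err != nil {
			c.JSON(http.StatusBadRequest, gin.H{"valid": false, "message": err.Error()})
			return
		}
		c.JSON(http.StatusOK, gin.H{"valid": true})
	})
	r.Run(":8899") // illustrative port
}
```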

* Move the query background color state into `TrafficPage`

* Fix the logic in `setQuery`

* Display a toast message in case of a syntax error in the query

* Remove a call to `fmt.Printf`

* Upgrade Basenine version to `0.1.3`

* Fix an issue related to getting `MAX_ENTRIES_DB_BYTES` environment variable
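
A defensive way to read such a variable (sketch; the helper name and default are assumptions):

```go
package main

import (
	"fmt"
	"log"
	"os"
	"strconv"
)

// maxEntriesDBBytes parses MAX_ENTRIES_DB_BYTES, falling back to a default
// when the variable is unset or malformed instead of failing silently.
func maxEntriesDBBytes(defaultBytes int64) int64 {
	raw := os.Getenv("MAX_ENTRIES_DB_BYTES")
	if raw == "" {
		return defaultBytes
	}
	n, err := strconv.ParseInt(raw, 10, 64)
	if err != nil {
		log.Printf("invalid MAX_ENTRIES_DB_BYTES %q, using default", raw)
		return defaultBytes
	}
	return n
}

func main() {
	fmt.Println(maxEntriesDBBytes(200 * 1000 * 1000)) // illustrative default
}
```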

* Include the `path` key in the request details for HTTP

* Rearrange the HTTP headers for querying

* Do the same thing for `cookies` and `queryString`

* Update the query on clicking table elements

Add the selectors for `TABLE` type representations in HTTP extension.

* Update the query on clicking `bodySize` and `elapsedTime` in `EntryTitle`

* Add the selectors for `TABLE` type representations in AMQP extension

* Add the selectors for `TABLE` type representations in Kafka extension

* Add the selectors for `TABLE` type representations in Redis extension

* Define a struct in `tap/api.go` for the section representation data

* Add the selectors for `BODY` type representations

* Add `request.path` to the HTTP request details

* Change the summary string's field name from `path` to `summary`

* Introduce `queryable` CSS class for queryable UI elements and underline them on hover

* Instead of `N requests` at the bottom, make it `Displaying N results (queried X/Y)` and live-update the values

Upgrade Basenine version to `0.2.0`.

* Verify the sha256sum of the Basenine executable inside `Dockerfile`

* Pass the start time to web UI through WebSocket and always show the `EntriesList` footer

* Pipe the `stderr` of Basenine as well

* Fix the layout issues related to `CodeEditor` in the UI

* Use the correct `shasum` command in `Dockerfile`

* Upgrade Basenine version to `0.2.1`

* Limit the height of `CodeEditor` container

* Remove the `Paused` value from the `ConnectionStatus` enum in the UI

* Fix the issue caused by the recent merge

* Add the filtering guide (cheatsheet)

* Update the open-cheatsheet button's title

* Update cheatsheet content

* Remove the old SQLite code, adapt the `--analyze` related code to Basenine

* Change the method signature of `NewEntry`

* Change the method signature of `Represent`

* Introduce `HTTPPair` field in `MizuEntry` specific to HTTP

* Remove `Entry`, `EntryId` and `EstimatedSizeBytes` fields from `MizuEntry`

Also remove the `getEstimatedEntrySizeBytes` method.

* Remove `gorm.io/gorm` dependency

* Remove unused `sensitiveDataFiltering` folder

* Increase the left margin of the open-cheatsheet button

* Add `overflow: auto` to the cheatsheet `Modal`

* Fix `GetEntry` method

* Fix the macro for gRPC

* Fix an interface conversion in case of AMQP

* Fix two more interface conversion errors in AMQP

* Make the `syncEntriesImpl` method blocking

* Fix a grammar mistake in the cheatsheet

* Adapt to the changes in the recent merge commit

* Improve the cheatsheet text

* Always display the timestamp in `en-US`

* Upgrade Basenine version to `0.2.2`

* Fix the order of closing Basenine connections and channels

* Don't close the Basenine channels at all

* Upgrade Basenine version to `0.2.3`

* Set the initial filter to `rlimit(100)`

* Make Basenine persistent

* Upgrade Basenine version to `0.2.4`

* Update `debug.Dockerfile`

* Fix a failing test

* Upgrade Basenine version to `0.2.5`

* Revert "Do not show play icon when disconnected (#428)"

This reverts commit 8af2e562f8.

* Upgrade Basenine version to `0.2.6`

* Make all non-informative things informative

* Make `100` a constant

* Use strict equality (`===`) throughout the JavaScript code

* Remove a forgotten `console.log`

* Add a comment and update the `query` in `syncEntriesImpl`

* Don't call `panic` in `GetEntry`

* Replace `panic` calls in `startBasenineServer` with `logger.Log.Panicf`

* Remove unnecessary `\n` characters in the logs
Author: M. Mert Yıldıran
Date: 2021-11-09 19:54:48 +03:00
Committed by: GitHub
Parent: 31d95c6557
Commit: d2fe3f6620
62 changed files with 3077 additions and 2327 deletions

Excerpt from the diff (Kafka protocol extension):

@@ -15,6 +15,7 @@ var _protocol api.Protocol = api.Protocol{
     Name:            "kafka",
     LongName:        "Apache Kafka Protocol",
     Abbreviation:    "KAFKA",
+    Macro:           "kafka",
     Version:         "12",
     BackgroundColor: "#000000",
     ForegroundColor: "#ffffff",
@@ -61,7 +62,7 @@ func (d dissecting) Dissect(b *bufio.Reader, isClient bool, tcpID *api.TcpID, co
     }
 }
 
-func (d dissecting) Analyze(item *api.OutputChannelItem, entryId string, resolvedSource string, resolvedDestination string) *api.MizuEntry {
+func (d dissecting) Analyze(item *api.OutputChannelItem, resolvedSource string, resolvedDestination string) *api.MizuEntry {
     request := item.Pair.Request.Payload.(map[string]interface{})
     reqDetails := request["details"].(map[string]interface{})
     service := "kafka"
@@ -70,76 +71,76 @@ func (d dissecting) Analyze(item *api.OutputChannelItem, entryId string, resolve
     } else if resolvedSource != "" {
         service = resolvedSource
     }
-    apiKey := ApiKey(reqDetails["ApiKey"].(float64))
+    apiKey := ApiKey(reqDetails["apiKey"].(float64))
 
     summary := ""
     switch apiKey {
     case Metadata:
-        _topics := reqDetails["Payload"].(map[string]interface{})["Topics"]
+        _topics := reqDetails["payload"].(map[string]interface{})["topics"]
         if _topics == nil {
             break
         }
         topics := _topics.([]interface{})
         for _, topic := range topics {
-            summary += fmt.Sprintf("%s, ", topic.(map[string]interface{})["Name"].(string))
+            summary += fmt.Sprintf("%s, ", topic.(map[string]interface{})["name"].(string))
         }
         if len(summary) > 0 {
             summary = summary[:len(summary)-2]
         }
         break
     case ApiVersions:
-        summary = reqDetails["ClientID"].(string)
+        summary = reqDetails["clientID"].(string)
         break
     case Produce:
-        _topics := reqDetails["Payload"].(map[string]interface{})["TopicData"]
+        _topics := reqDetails["payload"].(map[string]interface{})["topicData"]
         if _topics == nil {
             break
         }
         topics := _topics.([]interface{})
         for _, topic := range topics {
-            summary += fmt.Sprintf("%s, ", topic.(map[string]interface{})["Topic"].(string))
+            summary += fmt.Sprintf("%s, ", topic.(map[string]interface{})["topic"].(string))
         }
         if len(summary) > 0 {
             summary = summary[:len(summary)-2]
         }
         break
     case Fetch:
-        _topics := reqDetails["Payload"].(map[string]interface{})["Topics"]
+        _topics := reqDetails["payload"].(map[string]interface{})["topics"]
         if _topics == nil {
             break
         }
         topics := _topics.([]interface{})
         for _, topic := range topics {
-            summary += fmt.Sprintf("%s, ", topic.(map[string]interface{})["Topic"].(string))
+            summary += fmt.Sprintf("%s, ", topic.(map[string]interface{})["topic"].(string))
         }
         if len(summary) > 0 {
             summary = summary[:len(summary)-2]
         }
         break
     case ListOffsets:
-        _topics := reqDetails["Payload"].(map[string]interface{})["Topics"]
+        _topics := reqDetails["payload"].(map[string]interface{})["topics"]
         if _topics == nil {
             break
         }
         topics := _topics.([]interface{})
         for _, topic := range topics {
-            summary += fmt.Sprintf("%s, ", topic.(map[string]interface{})["Name"].(string))
+            summary += fmt.Sprintf("%s, ", topic.(map[string]interface{})["name"].(string))
         }
         if len(summary) > 0 {
             summary = summary[:len(summary)-2]
         }
         break
     case CreateTopics:
-        topics := reqDetails["Payload"].(map[string]interface{})["Topics"].([]interface{})
+        topics := reqDetails["payload"].(map[string]interface{})["topics"].([]interface{})
         for _, topic := range topics {
-            summary += fmt.Sprintf("%s, ", topic.(map[string]interface{})["Name"].(string))
+            summary += fmt.Sprintf("%s, ", topic.(map[string]interface{})["name"].(string))
         }
         if len(summary) > 0 {
             summary = summary[:len(summary)-2]
         }
         break
     case DeleteTopics:
-        topicNames := reqDetails["TopicNames"].([]string)
+        topicNames := reqDetails["topicNames"].([]string)
         for _, name := range topicNames {
             summary += fmt.Sprintf("%s, ", name)
         }
@@ -148,44 +149,48 @@ func (d dissecting) Analyze(item *api.OutputChannelItem, entryId string, resolve
     request["url"] = summary
     elapsedTime := item.Pair.Response.CaptureTime.Sub(item.Pair.Request.CaptureTime).Round(time.Millisecond).Milliseconds()
-    entryBytes, _ := json.Marshal(item.Pair)
     return &api.MizuEntry{
-        ProtocolName:            _protocol.Name,
-        ProtocolLongName:        _protocol.LongName,
-        ProtocolAbbreviation:    _protocol.Abbreviation,
-        ProtocolVersion:         _protocol.Version,
-        ProtocolBackgroundColor: _protocol.BackgroundColor,
-        ProtocolForegroundColor: _protocol.ForegroundColor,
-        ProtocolFontSize:        _protocol.FontSize,
-        ProtocolReferenceLink:   _protocol.ReferenceLink,
-        EntryId:                 entryId,
-        Entry:                   string(entryBytes),
-        Url:                     fmt.Sprintf("%s%s", service, summary),
-        Method:                  apiNames[apiKey],
-        Status:                  0,
-        RequestSenderIp:         item.ConnectionInfo.ClientIP,
-        Service:                 service,
-        Timestamp:               item.Timestamp,
-        ElapsedTime:             elapsedTime,
-        Path:                    summary,
-        ResolvedSource:          resolvedSource,
-        ResolvedDestination:     resolvedDestination,
-        SourceIp:                item.ConnectionInfo.ClientIP,
-        DestinationIp:           item.ConnectionInfo.ServerIP,
-        SourcePort:              item.ConnectionInfo.ClientPort,
-        DestinationPort:         item.ConnectionInfo.ServerPort,
-        IsOutgoing:              item.ConnectionInfo.IsOutgoing,
+        Protocol: _protocol,
+        Source: &api.TCP{
+            Name: resolvedSource,
+            IP:   item.ConnectionInfo.ClientIP,
+            Port: item.ConnectionInfo.ClientPort,
+        },
+        Destination: &api.TCP{
+            Name: resolvedDestination,
+            IP:   item.ConnectionInfo.ServerIP,
+            Port: item.ConnectionInfo.ServerPort,
+        },
+        Outgoing:            item.ConnectionInfo.IsOutgoing,
+        Request:             reqDetails,
+        Response:            item.Pair.Response.Payload.(map[string]interface{})["details"].(map[string]interface{}),
+        Url:                 fmt.Sprintf("%s%s", service, summary),
+        Method:              apiNames[apiKey],
+        Status:              0,
+        RequestSenderIp:     item.ConnectionInfo.ClientIP,
+        Service:             service,
+        Timestamp:           item.Timestamp,
+        StartTime:           item.Pair.Request.CaptureTime,
+        ElapsedTime:         elapsedTime,
+        Summary:             summary,
+        ResolvedSource:      resolvedSource,
+        ResolvedDestination: resolvedDestination,
+        SourceIp:            item.ConnectionInfo.ClientIP,
+        DestinationIp:       item.ConnectionInfo.ServerIP,
+        SourcePort:          item.ConnectionInfo.ClientPort,
+        DestinationPort:     item.ConnectionInfo.ServerPort,
+        IsOutgoing:          item.ConnectionInfo.IsOutgoing,
     }
 }
 
 func (d dissecting) Summarize(entry *api.MizuEntry) *api.BaseEntryDetails {
     return &api.BaseEntryDetails{
-        Id:              entry.EntryId,
+        Id:              entry.Id,
         Protocol:        _protocol,
         Url:             entry.Url,
         RequestSenderIp: entry.RequestSenderIp,
         Service:         entry.Service,
-        Summary:         entry.Path,
+        Summary:         entry.Summary,
         StatusCode:      entry.Status,
         Method:          entry.Method,
         Timestamp:       entry.Timestamp,
@@ -202,49 +207,43 @@ func (d dissecting) Summarize(entry *api.MizuEntry) *api.BaseEntryDetails {
     }
 }
 
-func (d dissecting) Represent(entry *api.MizuEntry) (p api.Protocol, object []byte, bodySize int64, err error) {
-    p = _protocol
+func (d dissecting) Represent(protoIn api.Protocol, request map[string]interface{}, response map[string]interface{}) (protoOut api.Protocol, object []byte, bodySize int64, err error) {
+    protoOut = _protocol
     bodySize = 0
-    var root map[string]interface{}
-    json.Unmarshal([]byte(entry.Entry), &root)
     representation := make(map[string]interface{}, 0)
-    request := root["request"].(map[string]interface{})["payload"].(map[string]interface{})
-    response := root["response"].(map[string]interface{})["payload"].(map[string]interface{})
-    reqDetails := request["details"].(map[string]interface{})
-    resDetails := response["details"].(map[string]interface{})
-    apiKey := ApiKey(reqDetails["ApiKey"].(float64))
+    apiKey := ApiKey(request["apiKey"].(float64))
 
     var repRequest []interface{}
     var repResponse []interface{}
     switch apiKey {
     case Metadata:
-        repRequest = representMetadataRequest(reqDetails)
-        repResponse = representMetadataResponse(resDetails)
+        repRequest = representMetadataRequest(request)
+        repResponse = representMetadataResponse(response)
         break
     case ApiVersions:
-        repRequest = representApiVersionsRequest(reqDetails)
-        repResponse = representApiVersionsResponse(resDetails)
+        repRequest = representApiVersionsRequest(request)
+        repResponse = representApiVersionsResponse(response)
         break
     case Produce:
-        repRequest = representProduceRequest(reqDetails)
-        repResponse = representProduceResponse(resDetails)
+        repRequest = representProduceRequest(request)
+        repResponse = representProduceResponse(response)
         break
     case Fetch:
-        repRequest = representFetchRequest(reqDetails)
-        repResponse = representFetchResponse(resDetails)
+        repRequest = representFetchRequest(request)
+        repResponse = representFetchResponse(response)
         break
     case ListOffsets:
-        repRequest = representListOffsetsRequest(reqDetails)
-        repResponse = representListOffsetsResponse(resDetails)
+        repRequest = representListOffsetsRequest(request)
+        repResponse = representListOffsetsResponse(response)
         break
     case CreateTopics:
-        repRequest = representCreateTopicsRequest(reqDetails)
-        repResponse = representCreateTopicsResponse(resDetails)
+        repRequest = representCreateTopicsRequest(request)
+        repResponse = representCreateTopicsResponse(response)
         break
     case DeleteTopics:
-        repRequest = representDeleteTopicsRequest(reqDetails)
-        repResponse = representDeleteTopicsResponse(resDetails)
+        repRequest = representDeleteTopicsRequest(request)
+        repResponse = representDeleteTopicsResponse(response)
         break
     }
@@ -254,4 +253,10 @@ func (d dissecting) Represent(entry *api.MizuEntry) (p api.Protocol, object []by
     return
 }
 
+func (d dissecting) Macros() map[string]string {
+    return map[string]string{
+        `kafka`: fmt.Sprintf(`proto.abbr == "%s"`, _protocol.Abbreviation),
+    }
+}
+
 var Dissector dissecting