[analyzer] a minimal implementation of static graph analyzer (#2852)

* [hotfix] meta tensor default device.

* [siu] add experimental submodules to main branch.

* [siu]

* [analyzer] init.

* [analyzer] readme.

* [test] add test.

* Update symbolic_trace.py

* mark skip tests.

* try except.

* init

* fix

* skip

---------

Co-authored-by: Daniel Shao <superdainiu@MININT-PVARVID.fareast.corp.microsoft.com>
Co-authored-by: Daniel Shao <superdainiu@Daniels-Mac.local>
Author: Super Daniel
Date: 2023-03-10 13:21:05 +08:00 (committed by GitHub)
Parent: 5d5f475d75
Commit: fff98f06ed
32 changed files with 4471 additions and 1 deletion

@@ -288,13 +288,16 @@ class MetaInfoProp(torch.fx.Interpreter):
         def flops_repr(flop: int) -> str:
             return f"{flop:,} FLOPs"
 
+        accumulate_size = 0
         for node in self.module.graph.nodes:
             node: Node
+            accumulate_size += calculate_fwd_out(node) + calculate_fwd_tmp(node)
             node_summaries.append([
                 node.op,
                 str(node),
                 flops_repr(node.meta['fwd_flop']),
                 flops_repr(node.meta['bwd_flop']),
+                mem_repr(accumulate_size),
                 mem_repr(calculate_fwd_in(node)),
                 mem_repr(calculate_fwd_out(node)),
                 mem_repr(calculate_fwd_tmp(node)),
@@ -309,6 +312,7 @@ class MetaInfoProp(torch.fx.Interpreter):
             'Op',
             'Forward FLOPs',
             'Backward FLOPs',
+            'Accumulated Memory',
             'FWD_IN',
             'FWD_OUT',
             'FWD_TMP',
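
For context on what the hunks above change: the summary table gains a running total of forward memory, computed by starting a counter at zero and adding each node's forward-output and forward-temporary sizes before its row is appended. Below is a minimal, self-contained sketch of that accumulation pattern on a plain torch.fx graph; the fwd_out_bytes helper and the ShapeProp-based byte counting are hypothetical stand-ins for the calculate_fwd_out / calculate_fwd_tmp helpers referenced in the diff, not ColossalAI's actual API.

# A minimal sketch of the running-total idea behind the new 'Accumulated Memory'
# column, written against plain torch.fx rather than the PR's MetaInfoProp.
import torch
import torch.fx
from torch.fx.passes.shape_prop import ShapeProp


class TwoLinear(torch.nn.Module):

    def __init__(self):
        super().__init__()
        self.fc1 = torch.nn.Linear(8, 16)
        self.fc2 = torch.nn.Linear(16, 4)

    def forward(self, x):
        return self.fc2(torch.relu(self.fc1(x)))


def fwd_out_bytes(node: torch.fx.Node) -> int:
    # Hypothetical stand-in: bytes of the node's forward output, if recorded.
    return node.meta.get('fwd_out_bytes', 0)


gm = torch.fx.symbolic_trace(TwoLinear())

# Record output shapes/dtypes with ShapeProp, then turn them into byte counts.
ShapeProp(gm).propagate(torch.randn(2, 8))
for node in gm.graph.nodes:
    tm = node.meta.get('tensor_meta')
    if tm is not None and hasattr(tm, 'shape'):
        numel = 1
        for d in tm.shape:
            numel *= d
        node.meta['fwd_out_bytes'] = numel * torch.empty((), dtype=tm.dtype).element_size()

# The same accumulation pattern the hunk adds to MetaInfoProp.summary():
# start at zero and add each node's forward footprint before emitting its row.
accumulate_size = 0
for node in gm.graph.nodes:
    accumulate_size += fwd_out_bytes(node)
    print(f"{node.op:<15} {str(node):<12} accumulated: {accumulate_size:,} B")

Running the sketch prints one row per graph node with a cumulative byte count, mirroring what the new 'Accumulated Memory' column reports in the summary table.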