Compare commits

...

359 Commits

Author SHA1 Message Date
ibuler
ee01b4e0f7 fix: 修改缩进 2021-01-15 11:15:09 +08:00
ibuler
82077a3ae1 fix: bug 2021-01-14 10:35:16 +08:00
Bai
d2760b98f6 fix: 修复celery日志清除问题 2021-01-11 10:29:32 +08:00
Bai
5f47a072b6 perf: 升级依赖 python-ldap==3.3.1 2020-12-22 17:19:15 +08:00
xinwen
25201e295d fix: 将节点的资产添加到系统用户时固定组织 2020-12-22 15:14:26 +08:00
Bai
a802fb792f fix: 修复资产导入携带disk_info信息时失败的问题 2020-12-21 17:11:20 +08:00
Bai
1646f9b27b fix: 修复提交系统设置失败的问题 2020-12-21 14:29:48 +08:00
Jiangjie.Bai
3f4877f26b Merge pull request #5295 from jumpserver/dev
fix: 修复命令列表过滤字段文案`会话ID`
2020-12-17 18:26:54 +08:00
Bai
0e4d778335 fix: 修复命令列表过滤字段文案会话ID 2020-12-17 18:25:48 +08:00
Jiangjie.Bai
52d20080ff Merge pull request #5293 from jumpserver/dev
fix: 修改系统监控问题
2020-12-17 17:36:06 +08:00
Bai
ed8d72c06b fix: 修改系统监控问题 2020-12-17 17:34:37 +08:00
Jiangjie.Bai
5e9e3ec6f6 Merge pull request #5288 from jumpserver/dev
chore: Merge master from dev
2020-12-17 14:56:25 +08:00
xinwen
4f5f92deb8 fix: 批量删除管理用户报错信息太丑 2020-12-17 14:47:37 +08:00
xinwen
d2a15ee702 fix: 申请资产工单如果系统用户没填内容不推荐系统用户 2020-12-17 14:35:08 +08:00
Bai
a3a591da4b fix: 修复命令导出excel格式报错 2020-12-17 14:31:34 +08:00
Bai
3f2925116e fix: 修复metrics获取terminal过滤is_deleted字段 2020-12-17 11:30:29 +08:00
xinwen
c3e2e536e0 fix: 【用户管理】-创建用户组-可将系统审计员加入到用户组 #579 2020-12-17 10:33:45 +08:00
Jiangjie.Bai
8c133d5fdb Merge pull request #5278 from jumpserver/dev
chore: Merge master from dev
2020-12-16 18:50:49 +08:00
xinwen
89d8efe0f1 fix: perms.signals_handler.on_application_permission_applications_changed 修改名字 2020-12-16 18:49:20 +08:00
Bai
54303ea33f fix: 修复节点创建时更新孩子full_value日志输出问题 2020-12-16 18:37:54 +08:00
Bai
4dcd8dd8dd fix: 修复节点创建时更新孩子full_value日志输出问题 2020-12-16 18:37:54 +08:00
老广
4f04a7d258 Merge pull request #5280 from jumpserver/pr@dev@fix_auto_push
fix: 推送系统用户时 AdHocExecution id 重复
2020-12-16 17:36:45 +08:00
xinwen
bf308e24b6 fix: 推送系统用户时 AdHocExecution id 重复 2020-12-16 17:31:37 +08:00
Bai
b3642f3ff4 fix: 修复LDAP用户登录(未找到)时循环调用问题 2020-12-16 12:01:57 +08:00
xinwen
3aed4955c8 fix: 远程应用授权的一些问题 2020-12-16 12:00:53 +08:00
fit2bot
9a5f9a9c92 fix: 应用授权不会自动推送的bug (#5271)
Co-authored-by: xinwen <coderWen@126.com>
2020-12-16 10:23:37 +08:00
Orange
6d5bec1ef2 Merge pull request #5269 from jumpserver/dev
chore: Merge master from dev
2020-12-15 20:35:51 +08:00
Bai
e93fd1fd44 fix: 删除终端列表state的默认值0 2020-12-15 20:34:47 +08:00
xinwen
7bf37611bd fix: 系统审计员不应该能添加到组 2020-12-15 19:24:25 +08:00
xinwen
b8ec4bfaa5 fix: 日志审计-操作日志中搜索-按动作搜索:需修改文字 删除文件-->删除 #524 2020-12-15 18:38:23 +08:00
Bai
58b6293b76 fix: 组件监控添加offline数量 2020-12-15 18:37:16 +08:00
xinwen
8e12eebceb fix: 获得 oidc acs 等认证方式失败 2020-12-15 18:36:37 +08:00
Bai
72d6ea43fa fix: 修改命令列表过滤参数session 2020-12-15 18:34:05 +08:00
Bai
deedd49dc5 fix: 修复命令记录导出excel文件格式未定义的问题 2020-12-15 18:07:11 +08:00
fit2bot
a36e6fbf84 fix: 修改判断会话活跃逻辑;不必要判断协议 (#5262)
* fix: 修改判断会话活跃逻辑;不必要判断协议

* fix: 修改导入task问题

Co-authored-by: Bai <bugatti_it@163.com>
2020-12-15 18:06:35 +08:00
Bai
b57453cc3c fix: 修复命令列表过滤使用session_id字段 2020-12-15 18:05:42 +08:00
Jiangjie.Bai
62f2909d59 Merge pull request #5256 from jumpserver/dev
chore: Merge master from dev
2020-12-15 14:20:23 +08:00
xinwen
0d469ff95b fix(orgs): 用户离开组织后授权的资产没主动刷新 2020-12-15 14:00:51 +08:00
xinwen
ca883f1fb4 fix: 工单申请资产审批时系统用户没有推荐 2020-12-15 13:06:55 +08:00
fit2bot
6e0fbd78e7 fix: 修复prometheus_metricsAPI数据获取bug;修复组件注册type为空bug (#5253)
* fix: 修复prometheus_metricsAPI数据获取bug;修复组件注册type为空bug

* fix: 修改审计migrations/userloginlog/backend的verbose_name字段

Co-authored-by: Bai <bugatti_it@163.com>
2020-12-15 13:00:59 +08:00
ibuler
0813cff74f perf: 优化 docs swagger,不再要求debug 2020-12-14 11:39:02 +08:00
ibuler
ff428b84f9 fix(assets): 修复asset更新子节点时的日志错误 2020-12-14 11:33:34 +08:00
ibuler
d0c9aa2c55 fix(assets): 修复更新孩子节点时的log error 2020-12-14 11:33:34 +08:00
老广
1d5e603c0d Merge pull request #5231 from jumpserver/dev
chore(merge): 合并 dev 到 master
2020-12-11 19:29:45 +08:00
ibuler
ddbbc8df17 fix(docker): 修复Dockerfile中 echo引起的sh和bash换行兼容问题 2020-12-11 19:26:46 +08:00
Bai
90df404931 fix: 修复swagger问题 2020-12-11 19:02:33 +08:00
Bai
b9cbff1a5f del: 删除测试prometheus相关代码 2020-12-11 19:02:33 +08:00
ibuler
b9717eece3 fix: 修改访问swagger会产生的错误 2020-12-11 18:30:20 +08:00
Bai
f9cf2a243b fix: 修复settings中搜索LDAP用户重复问题 2020-12-11 18:26:10 +08:00
Bai
e056430fce fix: 修复只配置DC域时,LDAP用户认证失败的问题 2020-12-11 18:26:10 +08:00
Jiangjie.Bai
2b2821c0a1 Merge pull request #5223 from jumpserver/dev
chore(merge): 合并 dev 到 master
2020-12-11 16:53:36 +08:00
Bai
213221beae perf: 修改BasePermissionViewSet的custom_filter_fields 2020-12-11 16:19:18 +08:00
Bai
2db9c90a74 feat: 修改翻译:认证方式 2020-12-11 15:49:39 +08:00
Bai
8ced6f1168 fix: 用户ProfileAPI设置is_first_login不是可读写 2020-12-11 15:44:17 +08:00
Bai
6703ab9a77 perf: 添加BasePermissionsViewSet,支持搜索过滤 2020-12-11 15:44:17 +08:00
老广
2fc6e6cd54 Merge pull request #5213 from jumpserver/dev
chore(merge): 合并dev到master
2020-12-10 23:03:35 +08:00
Bai
2176fd8fac feat: 更新翻译 2020-12-10 21:30:56 +08:00
fit2bot
856e7c16e5 feat: 添加组件监控;TerminalModel添加type字段; (#5206)
* feat: 添加组件监控;TerminalModel添加type字段;

* feat: Terminal序列类添加type字段

* feat: Terminal序列类添加type字段为只读

* feat: 修改组件status文案

* feat: 取消上传组件状态序列类count字段

* reactor: 修改termina/models目录结构

* feat: 修改ComponentTypeChoices

* feat: 取消考虑CoreComponent类型

* feat: 修改Terminal status判断逻辑

* feat: 终端列表添加status过滤; 组件状态序列类添加default值

* feat: 添加PrometheusMetricsAPI

* feat: 修改PrometheusMetricsAPI

Co-authored-by: Bai <bugatti_it@163.com>
2020-12-10 20:50:22 +08:00
fit2bot
d4feaf1e08 fix: 修复由于更新django captch版本引起的css丢失问题 (#5204)
* fix: 修复由于更新django captch版本引起的css丢失问题

* perf: 优化验证码的高度

Co-authored-by: ibuler <ibuler@qq.com>
2020-12-10 20:48:10 +08:00
fit2bot
5aee2ce3db chore: 升级依赖库版本 (#5205)
* chore: 升级依赖库版本

* fix: 几个库回退几个版本

Co-authored-by: ibuler <ibuler@qq.com>
2020-12-10 20:46:45 +08:00
xinwen
4424c4bde2 perf(asset): 手动启动节点资产数量自检程序时区分组织 2020-12-10 18:17:01 +08:00
fit2bot
5863e3e008 perf(asset): 资产树,右击增加计算节点数量的菜单,可以让后台去计算 #527 (#5207)
Co-authored-by: xinwen <coderWen@126.com>
2020-12-10 17:12:39 +08:00
xinwen
79a371eb6c perf(auth): 密码过期后,走重置密码流程 #530 2020-12-10 16:06:26 +08:00
fit2bot
7c7de96158 feat(login): 登录日志要体现用哪个backend登录的 #4472 (#5199)
Co-authored-by: xinwen <coderWen@126.com>
2020-12-09 18:43:13 +08:00
ibuler
80b03e73f6 feat(celery): 添加celery的health check接口 2020-12-09 18:06:34 +08:00
fit2bot
32dbab2e34 perf: 数据库应用database字段添加allow_null=True (#5196)
* perf: 数据库应用database字段修改为required

* perf: 数据库应用database字段添加allow_null=True

Co-authored-by: Bai <bugatti_it@163.com>
2020-12-09 13:44:07 +08:00
ibuler
b189e363cc revert(system): 暂时去掉system组件 2020-12-09 11:24:44 +08:00
Bai
4c3a655239 perf: 用户序列类禁止修改source字段 2020-12-08 20:54:33 +08:00
Bai
5533114db5 feat: 用户授权应用树按组织节点进行区分 2020-12-08 20:37:07 +08:00
Jiangjie.Bai
4c469afa95 feat: 取消资产配置相关字段只读模式 (#5182)
* feat: 取消资产配置相关字段只读模式

* feat: 取消资产配置相关字段只读模式
2020-12-08 20:32:48 +08:00
fit2bot
2ccc5beeda perf(Dockerfile): 不再使用zh_CN.UTF-8, en_US.UTF-8 应该也是可以的 (#5190)
* perf(Dockerfile): 不再使用zh_CN.UTF-8, en_US.UTF-8 应该也是可以的

* fix: 还原回原来的LANG设置

* perf: 合并层数

* feat: 修改Dockerfile

Co-authored-by: ibuler <ibuler@qq.com>
Co-authored-by: Bai <bugatti_it@163.com>
2020-12-08 20:22:48 +08:00
xinwen
4b67d6925e feat(asset): api 添加推送系统用户到多个资产 2020-12-08 18:08:44 +08:00
fit2bot
dd979f582a stash (#5178)
* Dev (#4791)

* fix(xpack): 修复last login太长的问题 (#4786)

Co-authored-by: ibuler <ibuler@qq.com>

* perf: 更新密码中也发送邮件 (#4789)

Co-authored-by: ibuler <ibuler@qq.com>

* fix(terminal): 修复获取螺旋的异步api

* fix(terminal): 修复有的录像存储有问题的导致下载录像的bug

* fix(orgs): 修复组织添加用户bug

* perf(requirements): 修改jms-storage==0.0.34 (#4797)

Co-authored-by: Bai <bugatti_it@163.com>

Co-authored-by: fit2bot <68588906+fit2bot@users.noreply.github.com>
Co-authored-by: ibuler <ibuler@qq.com>
Co-authored-by: Bai <bugatti_it@163.com>

* stash

* feat(system): 添加系统app

* stash

* fix: 修复一些bug

Co-authored-by: xinwen <coderWen@126.com>
Co-authored-by: ibuler <ibuler@qq.com>
Co-authored-by: Bai <bugatti_it@163.com>
Co-authored-by: Jiangjie.Bai <32935519+BaiJiangJie@users.noreply.github.com>
2020-12-08 14:26:18 +08:00
Bai
042ea5e137 feat: 授权应用树API返回org_name字段 2020-12-08 11:01:09 +08:00
ibuler
2a6f68c7ba revert: 还原原来的jms,自动运行migraionts 2020-12-07 19:16:27 +08:00
fit2bot
43b5e97b95 feat(excel): 添加Excel导入/导出 (#5124)
* refactor(drf_renderer): 添加 ExcelRenderer 支持导出excel文件格式; 优化CSVRenderer, 抽象 BaseRenderer

* perf(renderer): 支持导出资源详情

* refactor(drf_parser): 添加 ExcelParser 支持导入excel文件格式; 优化CSVParser, 抽象 BaseParser

* refactor(drf_parser): 添加 ExcelParser 支持导入excel文件格式; 优化CSVParser, 抽象 BaseParser 2

* perf(renderer): 捕获renderer处理异常

* perf: 添加excel依赖包

* perf(drf): 优化导入导出错误日志

* perf: 添加依赖包 pyexcel-io==0.6.4

* perf: 添加依赖包pyexcel-xlsx==0.6.0

* feat: 修改drf/renderer&parser变量命名

* feat: 修改drf/renderer的bug

* feat: 修改drf/renderer&parser变量命名

Co-authored-by: Bai <bugatti_it@163.com>
2020-12-07 15:23:05 +08:00
ibuler
619b521ea1 fix: 修改语言i18n 2020-12-05 13:22:22 +08:00
fit2bot
3447eeda68 fix(applications): 修改attrs不能为null (#5172)
Co-authored-by: Bai <bugatti_it@163.com>
2020-12-04 13:10:59 +08:00
fit2bot
75ef413ea5 fix(applications): 修改attrs不能为null (#5171)
Co-authored-by: Bai <bugatti_it@163.com>
2020-12-04 10:24:10 +08:00
fit2bot
662c9092dc reactor(dockerfile): 使用debian构建docker (#5169)
Co-authored-by: ibuler <ibuler@qq.com>
2020-12-03 19:14:28 +08:00
ibuler
c8d54b28e2 perf: 优化变量命名 2020-12-03 14:23:16 +08:00
ibuler
96cd307d1f perf: 优化entrypoint.sh 2020-12-03 14:23:16 +08:00
ibuler
6385cb3f86 perf: 优化启动,不再自动运行migrations 2020-12-03 14:23:16 +08:00
Bai
36e9d8101a fix: 添加迁移文件: Node ordering 2020-12-03 11:16:27 +08:00
ibuler
3354ab8ce9 fix(req): fix wheel version 2020-12-03 10:47:21 +08:00
Bai
89ec6ba6ef fix: Node ordering [parent_key, value]; 修复默认组织Default节点显示问题(存在key为0的Default节点) 2020-12-03 10:45:25 +08:00
Bai
af40e46a75 fix: 优化迁移Default节点 2020-12-03 10:29:00 +08:00
Bai
86fcd3c251 fix: 添加迁移文件(如果需要,将Default节点的key从0修改为1) 2020-12-03 10:29:00 +08:00
fit2bot
c2d5928273 build(pip): 锁定pip版本 (#5152)
* build(pip): 锁定pip版本

* fix: 锁定pip版本

* fix(req): 锁定加密库版本

* fix(build): 引用pip缓存

Co-authored-by: ibuler <ibuler@qq.com>
2020-12-02 11:09:39 +08:00
xinwen
e656ba70ec fix(assets): 推送动态系统用户未指定 username 取全部 usernames 2020-12-01 20:08:54 +08:00
xinwen
bb807e6251 fix(perms): 新建授权时动态用户可能推送不成功 2020-12-01 20:08:54 +08:00
Bai
bbd6cae3d7 perf(org): 优化获取org_name字段 2020-11-30 15:21:56 +08:00
Bai
c3b09dd800 perf(perms): 优化用户授权树返回org_name字段;添加thread_local属性org_mapper减少查询次数 2020-11-30 15:21:56 +08:00
Bai
6d427b9834 fix: 禁止删除组织根节点 2020-11-30 14:19:20 +08:00
xinwen
610aaf5244 fix(assets): 动态系统用户和用户关系变化时没有推送到资产 2020-11-26 15:20:09 +08:00
xinwen
df2f1b3e6e perf(User): 用户列表在大规模数据情况下慢 2020-11-26 12:32:18 +08:00
fit2bot
f26b7a470a perf(celery-task): 优化检查节点资产数量的 Celery 任务 (#5052)
Co-authored-by: xinwen <coderWen@126.com>
2020-11-25 17:30:07 +08:00
xinwen
a4667f3312 fix(Node): Node 保存的时候,在信号里设置 parent_key 2020-11-25 16:35:12 +08:00
xinwen
91081d9423 refactor(perms): 在动态用户所绑定的授权规则中,如授权给用户组,当用户组增加成员后,动态系统用户下没有相应增加用户,因此也不会自动推送 (#5084) (#5086) 2020-11-24 19:31:45 +08:00
fit2bot
3041697edc fix(orgs): 兼容旧的组织用户关系接口 (#5088)
Co-authored-by: xinwen <coderWen@126.com>
2020-11-24 19:09:14 +08:00
xinwen
75d7530ea5 fix(perms): 在动态用户所绑定的授权规则中,如授权给用户组,当用户组增加成员后,动态系统用户下没有相应增加用户,因此也不会自动推送 2020-11-24 10:27:03 +08:00
ibuler
975cc41bce perf(build): 优化使用pip mirror 2020-11-22 18:16:45 +08:00
ibuler
439999381d perf(build): 优化构建时用的mirror 2020-11-22 17:51:24 +08:00
xinwen
39ab5978be perf(perms): 获取用户所有授权时转换成 list 2020-11-22 17:24:00 +08:00
ibuler
7be7c8cee1 fix(perms): 修复我的资产页面问题 2020-11-22 16:55:21 +08:00
xinwen
68b22cbdec fix(perms): 修复用户组授权树与资产问题 2020-11-22 15:03:03 +08:00
xinwen
a7c704bea3 perf(celery-task): 优化检查节点资产数量的 Celery 任务 2020-11-22 11:36:10 +08:00
xinwen
21993b0d89 perf(perms): 优化用户授权资产列表加载速度 2020-11-22 11:35:13 +08:00
xinwen
73ccf3be5f fix(perms): 当用户授权为空时,清空旧的授权树 2020-11-22 11:26:39 +08:00
ibuler
bf3056abc4 fix(django3): 修复django3兼容问题 2020-11-20 15:25:37 +08:00
xinwen
f2fd9f5990 perf(assets): 限制搜索授权资产返回的条数 2020-11-20 15:24:33 +08:00
fit2bot
6d39a51c36 [fix]: 兼容django 3 (#5038)
* chore(django): 修改版本依赖

* [fix]: 兼容django 3

* fix(merge): 去掉不用的JSONField

* fix(requirements): 修改加密库的版本

Co-authored-by: ibuler <ibuler@qq.com>
2020-11-19 15:50:31 +08:00
xinwen
7fa94008c9 fix(old-api): 调整旧的组织与用户关联接口 2020-11-19 15:21:16 +08:00
Jiangjie.Bai
9685a25dc6 Merge pull request #5034 from jumpserver/dev
fix(perms): nodes-with-assets 接口添加刷新重构树
2020-11-18 11:53:12 +08:00
xinwen
1af4fcd381 fix(perms): nodes-with-assets 接口添加刷新重构树 2020-11-18 11:47:50 +08:00
Jiangjie.Bai
177055acdc Merge pull request #5032 from jumpserver/dev
fix(assets): 修复获取org_root的问题
2020-11-18 11:29:53 +08:00
ibuler
6ec0b3ad54 fix(assets): 修复获取org_root的问题 2020-11-18 11:23:39 +08:00
Orange
49dd611292 Merge pull request #5029 from jumpserver/dev
Dev
2020-11-17 19:46:36 +08:00
xinwen
f557c1dace fix(assets): model 添加 asset 默认排序 2020-11-17 19:40:49 +08:00
xinwen
6e87b94789 fix(command): 系统设置-安全设置-告警接收邮件字段如果为空,则更新不了 2020-11-17 19:19:08 +08:00
xinwen
b0dba35e5a fix(public-api): 缺少SECURITY_PASSWORD_EXPIRATION_TIME字段 2020-11-17 19:11:39 +08:00
ibuler
d0b19f20c3 perf(tickets): 优化授权申请工单 2020-11-17 19:10:23 +08:00
xinwen
3e78d627f8 fix(perms): 作业中心-批量命令-选择系统用户之后,左侧资产列表未筛选,还是全部资产 2020-11-17 18:45:45 +08:00
Jiangjie.Bai
0763404235 Merge pull request #5021 from jumpserver/dev
Dev
2020-11-17 16:48:07 +08:00
ibuler
31cd441a34 fix(assets): 修复资产导入时填写节点引起的空节点名称的问题 2020-11-17 16:41:17 +08:00
ibuler
40c0aac5a9 fix(ops): 修复run as 可能引起的返回多个asset user 2020-11-17 16:39:13 +08:00
ibuler
83099dcd16 fix(ops): 修复task run as的问题 2020-11-17 16:39:13 +08:00
ibuler
0db72bd00f fix(i18n): 修复js18n的问题 2020-11-17 16:29:09 +08:00
Bai
732b8cc0b8 fix(ops): 修复AdHocExecution字段task_display长度问题 2020-11-17 16:27:41 +08:00
Bai
a9f90b4e31 perf(terminal): 修改数据库字段长度(command_stroage/replay_storage name 128) 2020-11-17 15:59:42 +08:00
Bai
cf1fbabca1 fix(command): 修复批量命令执行可能获取到host为None的问题 2020-11-17 15:25:11 +08:00
ibuler
cbcefe8bd3 fix(ops): 修复因为更改controlmaster引起的连接不服用维内托 2020-11-16 15:22:25 +08:00
ibuler
133a2e4714 fix(assets): 修复动态系统用户推送的bug 2020-11-16 15:07:01 +08:00
fit2bot
b4b9149d5d chore(docs): 修改readme (#5005)
* chore(docs): 修改readme

* chore(docs): 修改文档拼写

Co-authored-by: ibuler <ibuler@qq.com>
2020-11-16 14:52:07 +08:00
fit2bot
9af4d5f76f fix(crypto): 有时解密失败 (#5003)
* fix(nodes): 节点默认按 value 排序

* fix(crypto): 有时解密失败

Co-authored-by: xinwen <coderWen@126.com>
2020-11-16 14:47:27 +08:00
Orange
96d26cc96f Merge pull request #4991 from jumpserver/dev
fix(system_user): 修复更新系统用户时ad_domain字段为空提交失败的问题
2020-11-13 14:57:23 +08:00
Bai
18d8f59bb5 fix(system_user): 修复更新系统用户时ad_domain字段为空提交失败的问题 2020-11-13 14:56:15 +08:00
老广
841f707b6d Merge pull request #4982 from jumpserver/dev
Dev
2020-11-12 14:12:35 +08:00
xinwen
448c5db3bb fix(orgs): 添加旧的 member 相关 api 2020-11-12 11:20:42 +08:00
Bai
75b886675e perf(i18n): 更新翻译 2020-11-12 11:14:23 +08:00
Bai
24e22115de perf(i18n): 更新翻译 2020-11-12 11:14:23 +08:00
Bai
b18ead0ffa perf(i18n): 更新翻译 2020-11-12 11:01:48 +08:00
xinwen
6e8922da1c fix(trans): 完善翻译 2020-11-12 10:12:25 +08:00
Bai
dcb38ef534 perf(session): 修改变量名terminate 2020-11-11 19:10:53 +08:00
Bai
8a693d9fa7 feat(session): session列表返回can_termination字段值;设置所有db协议类型会话不可被终断和监控 2020-11-11 17:49:54 +08:00
xinwen
487932590b fix(terminal): 扩展 Terminal name 长度 2020-11-11 15:01:30 +08:00
peijianbo
79b5aa68c8 feat(terminal):危险命令告警功能 2020-11-10 18:31:16 +08:00
Bai
50a4735b07 pref(cloud): 添加 Azure 依赖包 2020-11-10 11:13:00 +08:00
Bai
1183109354 perf(github): 更新github issue模版 2020-11-10 10:30:15 +08:00
xinwen
202e619c4b feat(assets): 添加系统用户资产列表 api 2020-11-10 10:23:51 +08:00
xinwen
179cb7531c feat(assets): 管理用户的资产列表增加 options 方法 2020-11-10 10:23:51 +08:00
ibuler
987f840431 fix(i18n): 修复重置密码那的i18n问题 2020-11-10 10:17:55 +08:00
fit2bot
f04544e8df feat(MFA): 修改文案Google Authenticator为手机验证器 (#4964)
* feat(MFA): 修改文案Google Authenticator为手机验证器

* feat(MFA): 修改文案手机验证器 为 MFA验证器

* feat(MFA): 修改文案手机验证器 为 MFA验证器2

Co-authored-by: Bai <bugatti_it@163.com>
2020-11-09 19:08:24 +08:00
fit2bot
cd6dc6a722 fix(perms): 由于组织不对,导致生成或显示授权树错误 (#4957)
* perf(perms): 优化授权树生成速度

* fix(perms): 由于组织不对,导致生成或显示授权树错误

Co-authored-by: xinwen <coderWen@126.com>
2020-11-09 17:30:50 +08:00
Bai
150552d734 perf(requirements): 升级依赖版本jms-storage==0.0.35 2020-11-09 17:29:10 +08:00
Bai
388314ca5a fix(applications): 修复应用列表会返回所有组织下数据的问题 2020-11-09 16:46:38 +08:00
Bai
26d00329e7 fix(assets): 修复计算node full value时,获取node value 有可能为 __proxy__ 的问题 2020-11-06 10:20:48 +08:00
xinwen
e93be8f828 feat(crypto): 支持国密算法 2020-11-05 19:04:50 +08:00
Bai
2690092faf perf(ldap): LDAP用户搜索,本地忽略大小写,远端支持模糊 2020-11-05 14:57:08 +08:00
Bai
eabaae81ac perf(application): 优化RemoteApp应用Chrome序列类的字符串主键关联字段 2020-11-05 14:13:45 +08:00
Bai
6df331cbed perf(perms): 用户/用户组授权的所有应用API返回attrs属性 2020-11-05 11:25:32 +08:00
xinwen
0390e37fd5 feat(orgs): relation 增加搜索功能 2020-11-05 10:40:00 +08:00
xinwen
44d9aff573 fix(orgs): 改正单词拼写 2020-11-05 09:58:58 +08:00
xinwen
2b4f8bd11c feat(ops): Task 支持批量操作 2020-11-05 09:54:51 +08:00
xinwen
231332585d fix(perms): 重建授权树冲突时,响应里加 code 2020-11-03 17:53:49 +08:00
ibuler
531de188d6 perf(systemuser): 优化系统用户家目录权限更改 2020-11-03 17:51:02 +08:00
ibuler
0c1f717fb2 feat(assets): 推送系统用户增加comment 2020-11-03 17:51:02 +08:00
xinwen
9d9177ed05 refactor(terminal): 去掉 Session 中 Terminal 的外键关系 2020-11-03 15:00:44 +08:00
xinwen
ab77d5db4b fix(orgs): org-member-relation url 拼写错误 2020-11-03 14:49:50 +08:00
ibuler
eadecb83ed fix: 修改系统用户节点关系的抖索 2020-11-03 14:33:15 +08:00
ibuler
5d6088abd3 fix(assets): 修复nodes display pop 引起的bug 2020-11-03 11:22:36 +08:00
xinwen
38f7c123e5 fix(audits): sso 登录日志没有 type 2020-11-03 11:18:56 +08:00
xinwen
d7daf7071a fix(ticket): 工单申请资产的时候审批人不能搜索 2020-11-03 10:53:31 +08:00
Bai
795245d7f4 perf(perms): 授权给用户应用列表API添加dispaly字段 2020-11-02 10:26:39 +08:00
Bai
7ea2a0d6a5 perf(perms): 应用授权规则序列类添加applications类型校验 2020-10-30 17:25:34 +08:00
xinwen
c90b9d70dc perf(perms): 优化根据资产获取授权的系统用户 2020-10-30 15:59:19 +08:00
Bai
f6c24f809c perf(application): 修改DoaminAPI返回application数量;修改Application数据库datbase字段required=False 2020-10-30 15:58:46 +08:00
Bai
e369a8d51f perf(readme): 修改Readme 2020-10-30 15:44:42 +08:00
Bai
c74c9f51f0 perf(readme): 修改Readme 2020-10-30 15:44:05 +08:00
ibuler
57bf9ca8b1 perf(assets): 优化更新子节点名称的算法 2020-10-30 14:17:09 +08:00
Bai
ddc2d1106b perf(perms): 修改授权授权应用API,添加category/type过滤字段 2020-10-30 12:54:33 +08:00
ibuler
15992c636a perf: 优化node full value 2020-10-30 10:50:45 +08:00
Bai
36cd18ab9a perf(perms): 修改变量名 2020-10-30 10:47:32 +08:00
Bai
676ee93837 perf(perms): 优化方法名称;授权查询语句; 2020-10-30 10:47:32 +08:00
fit2bot
c02f8e499b feat: 添加资产导入时可以直接写节点 (#4868)
* feat: 优化资产导入, 可以添加节点全称,并自动创建

* feat: 添加资产导入时可以直接写节点

* fix: 修改错误

* fix: 添加node value校验,不能包含/

* chore: merge migrations

* perf: 去掉full value replace

Co-authored-by: ibuler <ibuler@qq.com>
2020-10-30 10:16:49 +08:00
ibuler
4ebb4d1b6d chore: resolve conflict 2020-10-29 19:19:20 +08:00
Bai
5e7650d719 perf(application): RemoteApp应用序列类返回asset_info字段 2020-10-29 05:48:01 -05:00
Bai
bf302f47e5 perf(application): 修改RemoteA序列类asset required=False 2020-10-29 05:48:01 -05:00
Bai
1ddc228449 perf(application): RemoteApp序列类字段asset,设置为CharPrimaryKeyRelatedField,修改其他字段的required=Flase 2020-10-29 05:48:01 -05:00
Bai
c9065fd96e perf(application): 优化一些小细节 2020-10-29 05:48:01 -05:00
Bai
4a09dc6e3e feat(assets): 修改GatewayModel ip字段类型为CharField 2020-10-29 05:48:01 -05:00
Bai
55bfb942e2 perf(applications): 修改应用序列类字段label 2020-10-29 05:48:01 -05:00
Bai
9aed51ffe9 perf(applications): 修改应用序列类字段长度限制 2020-10-29 05:48:01 -05:00
Bai
a98816462f perf(applications): 添加DB序列类字段翻译 2020-10-29 05:48:01 -05:00
Bai
abe32e6c79 perf(perms): 添加应用/应用授权API的type_display/category_display字段 2020-10-29 05:48:01 -05:00
Bai
77c8ca5863 perf(perms): 应用授权表添加字段,type和category 2020-10-29 05:48:01 -05:00
Bai
8fa15b3378 perf(assets/terminal): 资产系统用户和Session会话添加协议选项: mysql/oracle/postgresql 2020-10-29 05:48:01 -05:00
Bai
a3507975fb perf(application): 修改获取远程应用连接参数的API(2) 2020-10-29 05:48:01 -05:00
Bai
76ca6d587d perf(application): 修改获取远程应用连接参数的API 2020-10-29 05:48:01 -05:00
Bai
038582a8c1 feat(applications): 修改应用/应用授权的迁移文件,解决多种应用/应用授权name字段重复的问题 2020-10-29 05:48:01 -05:00
Bai
ca2fc3cb5e feat(applications): 修改ApplicationAPI方法获取序列类的逻辑 2020-10-29 05:48:01 -05:00
Bai
cc30b766f8 feat(applications): 修改ApplicationAPI方法method判断->action 2020-10-29 05:48:01 -05:00
Michael Bai
b7bd88b8a0 feat(applications): 修改ApplicationAPI方法options 2020-10-29 05:48:01 -05:00
Bai
5518e1e00f perf(application): 优化Application获取序列类attrs字段适配 2020-10-29 05:48:01 -05:00
Bai
0632e88f5d feat(application): 修改Application Model的domain字段选项2 2020-10-29 05:48:01 -05:00
Bai
9dc2255894 feat(application): 修改Application Model的domain字段选项 2020-10-29 05:48:01 -05:00
Bai
1baf35004d refactor(perms): 添加应用授权规则迁移文件;迁移旧的应用授权(db/remoteapp/k8sapp)到新的应用授权 2020-10-29 05:48:01 -05:00
Bai
5acff310f7 perf(application): 修改迁移文件,迁移应用包含id字段 2020-10-29 05:48:01 -05:00
Bai
fdded8b90f refactor(perms): 修改授权规则的目录结构(asset、application) 2020-10-29 05:48:01 -05:00
Bai
1d550cbe64 feat(perms): 添加ApplicationPermission API(包含用户/用户组/授权/校验等API) 2020-10-29 05:48:01 -05:00
Bai
4847b7a680 feat(perms): 添加ApplicationPermission Model 和 API(包含ViewSet和RelationViewSet) 2020-10-29 05:48:01 -05:00
Bai
1c551b4fe8 feat(application): 迁移old_application到new_application 2020-10-29 05:48:01 -05:00
Bai
6ffba739f2 perf(requirements): 添加依赖django-mysql==3.9.0 2020-10-29 05:48:01 -05:00
ibuler
0282346945 perf: 修改创建 2020-10-29 05:48:01 -05:00
ibuler
f6d9af8beb perf(application): 优化type优先级 2020-10-29 05:48:01 -05:00
ibuler
ba4e6e9a9f refacter: 重构application 2020-10-29 05:48:01 -05:00
ibuler
874a3eeebf perf(sessions): 优化命令 2020-10-29 18:27:36 +08:00
ibuler
dd793a4eca perf: 优化日志保存策略 2020-10-29 18:27:36 +08:00
xinwen
f7e6c14bc5 fix(assets): 向资产推送系统用户bug 2020-10-28 21:40:40 -05:00
xinwen
f6031d6f5d fix(logger): 把 drf 异常放到单独的日志文件中 2020-10-28 21:26:35 -05:00
xinwen
5e779e6542 fix(systemuser): 系统用户添加 ad_domain 字段 2020-10-28 21:23:29 -05:00
xinwen
7031b7f28b fix(auth): 修复用户登录失败次数出现0次 2020-10-28 06:06:57 -05:00
xinwen
e2f540a1f4 fix(assets): 网关的密码不能包含特殊字符 2020-10-28 06:05:53 -05:00
ibuler
108a1da212 chore: 修改github 语言识别 2020-10-26 20:51:09 -05:00
xinwen
b4a8cb768b fix(assets): 系统用户与用户组发生变化时报错 2020-10-26 05:31:30 -05:00
xinwen
6b2f606430 fix(tickets): 工单申请资产授权通过人显示不对 2020-10-26 02:03:12 -05:00
xinwen
70a8db895d fix(migrations): 生成一下遗漏的 migrations 2020-10-23 17:00:37 +08:00
xinwen
0043dc6110 fix(assets): 资产列表添加默认 date_created 排序 2020-10-23 17:00:37 +08:00
xinwen
87d2798612 fix(assets): 资产列表不能用到assets_amount字段 2020-10-23 16:54:19 +08:00
ibuler
e2d8eee629 fix(i18n): 修改收藏夹的翻译 2020-10-23 16:52:06 +08:00
xinwen
8404db8cef fix(assets): 修改 AssetViewSet.filter_fields 2020-10-23 02:25:04 -05:00
Bai
fd7f379b10 perf(requirements): 升级依赖django-timezone-field==4.0 2020-10-21 13:01:28 +08:00
ibuler
111c63ee6a perf: 修改beat版本 2020-10-20 15:02:11 +08:00
ibuler
4eb5d51840 fix: domain端口可能随便填写的问题 2020-10-20 15:01:06 +08:00
ibuler
7f53a80855 fix: 修改textfield 不再限制长度 2020-10-20 15:00:22 +08:00
ibuler
90afabdcb2 fix(perms): 修复用户的资产不区分组织的问题 2020-10-20 14:56:23 +08:00
ibuler
de405be753 fix(perms): 修复asset permission导入的bug 2020-10-20 14:46:31 +08:00
Bai
f84b845385 perf(config): 升级依赖redis==3.5.3; 添加CACHES配置: health_check_interval=30; 解决因网络不稳定导致的redis连接失败异常 2020-10-19 04:45:34 -05:00
xinwen
b1ac3fa94f fix(orgs): 更新用户时org_roles参数为None时不更新组织角色 2020-10-15 04:59:25 -05:00
Jiangjie.Bai
32fab08ed3 Merge pull request #4802 from jumpserver/dev
chore: merge dev to master
2020-10-15 14:08:36 +08:00
fit2bot
8943850ca9 perf(Dockerfile): 去掉多余的代码 (#4801)
* perf(Dockerfile): 优化构建docker,经常变动的包不使用镜像

Co-authored-by: ibuler <ibuler@qq.com>
2020-10-15 14:05:11 +08:00
ibuler
8fff57813a perf(Dockerfile): 优化构建docker,经常变动的包不使用镜像 2020-10-15 00:59:58 -05:00
xinwen
9128210e87 Merge pull request #4798 from jumpserver/dev
Dev to master
2020-10-15 12:22:00 +08:00
xinwen
e3dd03f4c7 Dev (#4791)
* fix(xpack): 修复last login太长的问题 (#4786)

Co-authored-by: ibuler <ibuler@qq.com>

* perf: 更新密码中也发送邮件 (#4789)

Co-authored-by: ibuler <ibuler@qq.com>

* fix(terminal): 修复获取螺旋的异步api

* fix(terminal): 修复有的录像存储有问题的导致下载录像的bug

* fix(orgs): 修复组织添加用户bug

* perf(requirements): 修改jms-storage==0.0.34 (#4797)

Co-authored-by: Bai <bugatti_it@163.com>

Co-authored-by: fit2bot <68588906+fit2bot@users.noreply.github.com>
Co-authored-by: ibuler <ibuler@qq.com>
Co-authored-by: Bai <bugatti_it@163.com>
2020-10-15 12:00:38 +08:00
fit2bot
aabb2aff1f perf(requirements): 修改jms-storage==0.0.34 (#4797)
Co-authored-by: Bai <bugatti_it@163.com>
2020-10-15 11:51:45 +08:00
xinwen
f8bbca38e3 fix(orgs): 修复组织添加用户bug 2020-10-14 22:50:55 -05:00
ibuler
12b180ddea fix(terminal): 修复有的录像存储有问题的导致下载录像的bug 2020-10-15 11:48:52 +08:00
ibuler
4917769964 fix(terminal): 修复获取螺旋的异步api 2020-10-15 11:48:52 +08:00
fit2bot
5868d56e18 perf: 更新密码中也发送邮件 (#4789)
Co-authored-by: ibuler <ibuler@qq.com>
2020-10-14 19:59:05 +08:00
fit2bot
bab76f3cda fix(xpack): 修复last login太长的问题 (#4786)
Co-authored-by: ibuler <ibuler@qq.com>
2020-10-14 19:58:44 +08:00
xinwen
475c0e4187 Merge pull request #4784 from jumpserver/dev
Dev
2020-10-14 15:59:21 +08:00
老广
3d070231f4 Merge pull request #4783 from jumpserver/pr@dev@chore_merge
chore: merge with master
2020-10-14 02:55:28 -05:00
ibuler
c5b0cafabd chore: merge with master 2020-10-14 15:53:06 +08:00
ibuler
a69bba8702 Merge branch 'dev' of github.com:jumpserver/jumpserver into dev 2020-10-14 15:48:32 +08:00
xinwen
cfd0098019 fix(perms): 修复一次性获取所有资产与节点sql泛滥问题 2020-10-14 02:43:22 -05:00
ibuler
52f1dcf662 fix(users): 修复邀请用户的bug 2020-10-14 15:31:24 +08:00
xinwen
373c6c77e0 fix(perms): 未激活资产不能使用 2020-10-13 19:34:19 +08:00
xinwen
f3d052554d fix(perms): 修复失效资产授权action 还在的问题 2020-10-13 19:34:19 +08:00
xinwen
a57ce482dd fix(assets): 资产树批量删除资产数量不对 2020-10-13 19:34:19 +08:00
xinwen
a449d97f67 fix(orgs): 组织添加成员bug 2020-10-13 19:34:19 +08:00
ibuler
84e4238848 fix(ops): 修复任务schedule属性的bug 2020-10-13 19:34:19 +08:00
clannon
82dd1c35ea Update config_example.yml
SECRET_KEY 和 BOOTSTRAP_TOKEN 后面默认留个空格,遇到好几个新手对yaml格式不注意了,虽然是个小问题……
2020-10-13 19:34:19 +08:00
ibuler
5ac974a44c fix(deps): 修复依赖版本 2020-10-13 19:34:19 +08:00
fit2bot
c4caeb92ee perf: 优化生成假数据 (#4759)
* perf: 优化生成假数据
2020-10-13 19:34:19 +08:00
ibuler
f4799d90c0 fix(assets): 修复点击节点更新硬件信息的bug 2020-10-13 19:34:19 +08:00
ibuler
f97685c788 perf(orgs): 优化组织用户添加 2020-10-13 19:34:19 +08:00
ibuler
0439376326 feat(users): 添加用户suggetion api 2020-10-13 19:34:19 +08:00
xinwen
dd2413edd8 fix(perms): 授权树与资产列表的一些 bug 2020-10-13 19:34:19 +08:00
xinwen
459c5c07c9 fix(perms): 未激活资产不能使用 2020-10-13 06:30:18 -05:00
xinwen
ef86a49c1e fix(perms): 修复失效资产授权action 还在的问题 2020-10-13 06:20:43 -05:00
xinwen
0ad389515b fix(assets): 资产树批量删除资产数量不对 2020-10-13 05:47:54 -05:00
xinwen
2432b9a553 fix(orgs): 组织添加成员bug 2020-10-13 03:38:25 -05:00
ibuler
12216a718a Merge branch 'dev' of github.com:jumpserver/jumpserver into dev 2020-10-13 10:52:18 +08:00
ibuler
2190db1bb5 fix(ops): 修复任务schedule属性的bug 2020-10-12 21:51:31 -05:00
ibuler
5d36537404 fix(ops): 修复任务schedule属性的bug 2020-10-12 19:08:13 +08:00
clannon
5a87634c26 Update config_example.yml
SECRET_KEY 和 BOOTSTRAP_TOKEN 后面默认留个空格,遇到好几个新手对yaml格式不注意了,虽然是个小问题……
2020-10-12 05:23:49 -05:00
老广
e5eb84999a Merge pull request #4768 from jumpserver/pr@dev@chore_merge_with_master
Merge branch 'master' into dev
2020-10-12 05:22:25 -05:00
ibuler
426a86b52d Merge branch 'master' into dev 2020-10-12 18:19:58 +08:00
ibuler
f84acfe282 fix(deps): 修复依赖版本 2020-10-12 05:13:52 -05:00
fit2bot
c73b49fe30 perf: 优化生成假数据 (#4759)
* perf: 优化生成假数据
2020-10-12 12:44:30 +08:00
ibuler
98238f71ae fix(assets): 修复点击节点更新硬件信息的bug 2020-10-11 22:31:08 -05:00
ibuler
66e45f1c80 perf(orgs): 优化组织用户添加 2020-10-12 11:25:43 +08:00
ibuler
93a400f6e6 feat(users): 添加用户suggetion api 2020-10-12 11:18:11 +08:00
xinwen
535d7d8373 fix(perms): 授权树与资产列表的一些 bug 2020-10-11 21:43:44 -05:00
Bai
db268280b4 perf(requirements): 修改依赖包Pillow==7.1.0 2020-10-08 22:06:43 -05:00
Bai
873789bdab perf(requirements): 修改依赖包Pillow==7.1.0 2020-10-08 22:04:23 -05:00
Jiangjie.Bai
6584890ab1 Merge pull request #4754 from jumpserver/dev
Dev
2020-10-09 10:37:47 +08:00
fit2bot
96d5c519ec perf(i18n): 添加翻译信息 (#4748)
* perf(i18n): 添加翻译信息

* perf(users): 重置密码成功邮件添加DEBUG信息

* perf(i18n): 修改翻译信息

* perf(i18n): 修改翻译信息

Co-authored-by: Bai <bugatti_it@163.com>
2020-09-30 16:11:03 +08:00
Bai
6e91217303 perf(authentication): 修改用户登录页面,使用其他方式认证时点击忘记密码提示联系管理员 2020-09-30 02:05:14 -05:00
xinwen
5dd1dfc59e fix(orgs): 用户修改组织角色报错 2020-09-30 11:40:52 +08:00
Bai
a53e930950 perf(tickets): 申请资产工单支持授权多个系统用户 2020-09-29 19:11:20 +08:00
xinwen
8f52f79d91 fix(migrations): 增加迁移脚本 2020-09-29 17:48:19 +08:00
xinwen
3af0e68c84 fix(perms): 用户授权树bug 2020-09-29 17:25:00 +08:00
Bai
3ccf32ed48 feat(authentication): 用户重置密码成功后,发送用户重置密码成功邮件 2020-09-29 16:28:14 +08:00
xinwen
d52ed2ffb9 fix(xpack): GatheredUser 点击资产树报错 2020-09-29 16:26:02 +08:00
ibuler
38588151d1 继续修改issue template 2020-09-29 14:28:27 +08:00
ibuler
2a95aca28f chore: 修改issue模板 2020-09-29 14:18:14 +08:00
Bai
1915224063 fix(terminal): 修复正在使用的命令/录像存储可以被删除的问题 2020-09-29 13:37:48 +08:00
fit2bot
da4f9efb42 fix(perms): 修改检查资产授权过期策略 (#4722)
* fix(perms): 修改检查资产授权过期策略

* perf: 优化一行代码

Co-authored-by: xinwen <coderWen@126.com>
Co-authored-by: ibuler <ibuler@qq.com>
2020-09-29 13:34:55 +08:00
fit2bot
579c2c1d7a feat(celery): 保证同时只有一个beat在运行 (#4723)
* feat(celery): 保证同时只有一个beat在运行

* fix: 修复代码拼写错误

* fix: 修复拼写

* fix: remove import

Co-authored-by: ibuler <ibuler@qq.com>
2020-09-29 13:33:53 +08:00
xinwen
2a86c3a376 fix(perms): 完善检查资产授权过期的celery task 2020-09-29 11:44:20 +08:00
Bai
5558e854de perf(permms): 应用授权树返回授权应用的总数量 2020-09-29 11:43:31 +08:00
xinwen
31b6c3b679 feat(celery): 设置 node_tree celery work 为 2 2020-09-28 18:49:40 +08:00
fit2bot
2209a2c8d2 feat(orgs): 修改OrgMemberRelationAPI,支持通过查询参数控制是否忽略已存在的数据 (#4720)
* feat(orgs): 修改OrgMemberRelationAPI,支持通过查询参数控制是否忽略已存在的数据

* feat(orgs): 修改构建数据库查询参数的问题

Co-authored-by: Bai <bugatti_it@163.com>
2020-09-28 18:49:18 +08:00
xinwen
f596b65ed7 feat(auth): sso 生成的地址重复访问的时候,重定向到用户指定的 next 地址 2020-09-28 16:20:32 +08:00
fit2bot
2c9c64a13f fix(jms): 启动脚本 task 添加 celery_node_tree, check_asset_perm_expired 两个 worker (#4716)
* fix(jms): 启动脚本 task 添加 celery_node_tree, check_asset_perm_expired 两个 worker

* fix: 修改脚本

Co-authored-by: xinwen <coderWen@126.com>
Co-authored-by: ibuler <ibuler@qq.com>
2020-09-28 16:19:49 +08:00
xinwen
1d49b3deca fix(perms): 用户搜索全部授权资产报错 2020-09-28 13:17:45 +08:00
xinwen
6701a1b604 fix(perms): 用户添加到用户组报错 2020-09-27 20:53:34 +08:00
ibuler
b8ff3b38bf fix: 修复middleware引起的bug 2020-09-27 17:58:41 +08:00
fit2bot
d3be16ffe8 fix (#4680)
* perf(perms): 资产授权列表关联数据改为 `prefetch_related`

* perf(perms): 优化一波

* dispatch_mapping_node_tasks.delay

* perf: 在做一些优化

* perf: 再优化一波

* perf(perms): 授权更改节点慢的问题

* fix: 修改一处bug

* perf(perms): ungrouped 资产数量计算方式

* fix: 修复dispatch data中的bug

* fix(assets): add_nodes_assets_to_system_users celery task

* fix: 修复ungrouped的bug

* feat(nodes): 添加 favorite 节点

* feat(node): 添加 favorite api

* fix: 修复clean keys的bug


Co-authored-by: xinwen <coderWen@126.com>
Co-authored-by: ibuler <ibuler@qq.com>
2020-09-27 16:02:44 +08:00
老广
e3648d11b1 feat: 录像存储server类型,可以设置如何存储了 (#4699)
feat: Server 类型的录像存储可以上传到 oss等上面
2020-09-27 14:34:47 +08:00
fit2bot
d4037998c8 perf: 优化middleware的使用 (#4707)
perf: 优化middleware的使用
2020-09-27 14:34:05 +08:00
fit2bot
82de636b5c perf(common): 检查referer (#4697)
Co-authored-by: ibuler <ibuler@qq.com>
2020-09-27 11:48:21 +08:00
Bai
91f1280f97 perf(settings): public setting API返回LOGIN_TITLE字段 2020-09-27 11:37:38 +08:00
fit2bot
7c82f5aa2b chore: 修改issue template (#4703)
* chore: 修改issue template

* perf: 又修改了些

Co-authored-by: ibuler <ibuler@qq.com>
2020-09-25 16:09:56 +08:00
老广
6a801eaf33 Update issue templates 2020-09-25 15:47:43 +08:00
xinwen
28da819735 perf(assets): 优化节点树
修改树策略,做读优化,写的速度降低
2020-09-21 10:23:09 +08:00
Jiangjie.Bai
cdf3cf3e8f Merge pull request #4667 from jumpserver/dev
fix(command): 修复命令导出选中项问题
2020-09-16 19:55:18 +08:00
Bai
118564577e fix(command): 修复命令导出选中项问题 2020-09-16 19:53:58 +08:00
Jiangjie.Bai
47f2df0a5b Merge pull request #4665 from jumpserver/dev
fix(command): 修复命令导出选中项问题
2020-09-16 19:38:16 +08:00
Bai
e4aafc236d fix(command): 修复命令导出选中项问题 2020-09-16 19:36:21 +08:00
Jiangjie.Bai
b1c530bba8 Merge pull request #4661 from jumpserver/dev
Dev
2020-09-16 19:03:38 +08:00
Bai
95aa9781c3 fix(command): 修复命令导出选中项会导出全部的问题 2020-09-16 19:02:49 +08:00
xinwen
9f6540afe3 fix(tickets): 调整登录确认工单 title 2020-09-16 18:03:15 +08:00
ibuler
832bb832ce fix(authentication): 修复cas退出的bug 2020-09-16 17:53:01 +08:00
ibuler
501329a8db fix: 再次修复 2020-09-16 17:51:45 +08:00
ibuler
8913aacd1e fix(authentication): 修复同时开启radius, openid引起的问题 2020-09-16 17:51:45 +08:00
xinwen
e461fbdf50 fix(tickets): 修复已处理工单的 待处理人 字段 2020-09-16 17:47:26 +08:00
peijianbo
3941539408 fix(authentication):修复开启二次认证时,地址跳转出错问题 2020-09-16 16:46:28 +08:00
xinwen
605db2d905 fix(auth): 调整登录复核工单 title 2020-09-16 15:31:20 +08:00
Jiangjie.Bai
1ef3f24465 Merge pull request #4648 from jumpserver/dev
chore: merge dev to master
2020-09-15 17:23:48 +08:00
peijianbo
4090a0b123 feat(uathentication):登录表单回车可直接提交表单 2020-09-15 17:10:52 +08:00
ibuler
a55e28fc87 perf: 优化ldap超时时间 2020-09-15 15:26:18 +08:00
ibuler
82cf53181f perf(settings): 修改默认超时时间为10s 2020-09-15 15:26:18 +08:00
ibuler
78232aa900 perf(terminal): 优化命令提交 2020-09-14 19:25:50 +08:00
ibuler
d2c93aff66 feat: 可以关闭工单菜单 2020-09-14 18:25:47 +08:00
peijianbo
516e2309c0 bug(authentication): 登录表单仅提交时加密(xpack) 2020-09-14 17:28:49 +08:00
peijianbo
4688e46f97 feat(authentication):将cas认证通过的登录日志记录到系统 2020-09-14 12:46:12 +08:00
peijianbo
1299f3da75 feat(authentication):登录表单仅提交时加密 2020-09-14 12:45:44 +08:00
Bai
fe502cbe41 fix(assets): 修复系统用户导入模版没有密码字段的问题 2020-09-14 12:43:39 +08:00
xinwen
09bfac34f1 fix(orgs): 修复 org-memeber-relation POST 报错 2020-09-14 11:00:10 +08:00
Jiangjie.Bai
12a86d7244 Merge pull request #4611 from jumpserver/dev
Dev
2020-09-08 20:08:53 +08:00
Jiangjie.Bai
269eea8802 Merge branch 'master' into dev 2020-09-08 19:32:38 +08:00
老广
72aa265dd7 doc: 修改readme,添加子项目连接 (#4602)
* Update README.md

* Update README.md

* Update README.md

* Update README.md

* Update README.md

* Update README.md
2020-09-08 14:31:54 +08:00
老广
e26716e1e1 Update README.md
docs: 添加  developer wanted
2020-09-08 14:30:31 +08:00
Bai
80b9db417c feat(ldap): 获取ldap用户列表,采用线程方式 2020-09-08 11:57:45 +08:00
Bai
d944b5f4ff feat(tickets): 工单添加comment字段 2020-09-07 20:00:09 +08:00
peijianbo
1b84afee0c feat(audits):修改日志默认保存时间(90->9999) 2020-09-07 18:35:19 +08:00
fit2bot
172b6edd28 feat(user):同一个账号仅允许在一台终端设备登录 (#4590)
* feat(user):同一个账号仅允许在一台终端设备登录

* feat(user):同一个账号仅允许在一台终端设备登录

* feat(user):同一个账号仅允许在一台终端设备登录

* feat(user):同一个账号仅允许在一台终端设备登录

* feat(user):同一个账号仅允许在一台终端设备登录

Co-authored-by: peijianbo <peijainbo3006@163.com>
2020-09-07 17:42:59 +08:00
Bai
e6f248bfa0 feat(i18n): 添加云同步实例任务hostname_strategy字段翻译信息 2020-09-07 17:39:35 +08:00
ibuler
1f037b1933 feat(i18n): 添加新翻译 2020-09-02 15:21:48 +08:00
Bai
ae9bbd2683 fix(common) 修复管理员未设置Email主题前缀导致发送邮件失败的问题 2020-09-02 10:23:44 +08:00
xinwen
a0085c4eab feat(README): 添加企业版试用链接 2020-09-01 16:50:43 +08:00
ibuler
ddb71c43c4 fix(users): 修复用户在不同组织引起的问题 2020-09-01 16:47:13 +08:00
herealways
8227f44058 feat: 添加AES GCM模式为默认的加密方式 2020-09-01 14:48:40 +08:00
ibuler
e81762d692 ci(Dockerfile): 修改依赖的setuptools版本,导致的ldap无法安装问题 2020-09-01 13:39:08 +08:00
老广
5b8fa1809c Update README.md
docs: 添加  developer wanted
2020-08-24 10:04:03 +08:00
BaiJiangJie
90ba6442dd Merge pull request #4523 from jumpserver/dev
fix(orgs): 完善组织与用户变化时的信号
2020-08-20 16:40:07 +08:00
xinwen
a28334b6d8 fix(orgs): 完善组织与用户变化时的信号 2020-08-20 16:34:10 +08:00
314 changed files with 10421 additions and 4014 deletions

View File

@@ -1,20 +0,0 @@
[简述你的问题]
##### 使用版本
[请提供你使用的JumpServer版本 如 2.0.1 注: 1.4及以下版本不再提供支持]
##### 使用浏览器版本
[请提供你使用的浏览器版本 如 Chrome 84.0.4147.105 ]
##### 问题复现步骤
1. [步骤1]
2. [步骤2]
##### 具体表现[截图可能会更好些,最好能截全]
##### 其他
[注:] 完成后请关闭 issue

.github/ISSUE_TEMPLATE/----.md vendored Normal file
View File

@@ -0,0 +1,10 @@
---
name: 需求建议
about: 提出针对本项目的想法和建议
title: "[Feature] "
labels: 类型:需求
assignees: ibuler
---
**请描述您的需求或者改进建议.**

.github/ISSUE_TEMPLATE/bug---.md vendored Normal file
View File

@@ -0,0 +1,22 @@
---
name: Bug 提交
about: 提交产品缺陷帮助我们更好的改进
title: "[Bug] "
labels: 类型:bug
assignees: wojiushixiaobai
---
**JumpServer 版本(v1.5.9以下不再支持)**
**浏览器版本**
**Bug 描述**
**Bug 重现步骤(有截图更好)**
1.
2.
3.

.github/ISSUE_TEMPLATE/question.md vendored Normal file
View File

@@ -0,0 +1,10 @@
---
name: 问题咨询
about: 提出针对本项目安装部署、使用及其他方面的相关问题
title: "[Question] "
labels: 类型:提问
assignees: wojiushixiaobai
---
**请描述您的问题.**

View File

@@ -1,5 +1,6 @@
FROM registry.fit2cloud.com/public/python:v3 as stage-build
MAINTAINER Jumpserver Team <ibuler@qq.com>
# Compile the code
FROM python:3.8.6-slim as stage-build
MAINTAINER JumpServer Team <ibuler@qq.com>
ARG VERSION
ENV VERSION=$VERSION
@@ -8,32 +9,38 @@ ADD . .
RUN cd utils && bash -ixeu build.sh
FROM registry.fit2cloud.com/public/python:v3
# Build the runtime environment
FROM python:3.8.6-slim
ARG PIP_MIRROR=https://pypi.douban.com/simple
ENV PIP_MIRROR=$PIP_MIRROR
ARG MYSQL_MIRROR=https://mirrors.tuna.tsinghua.edu.cn/mysql/yum/mysql57-community-el6/
ENV MYSQL_MIRROR=$MYSQL_MIRROR
ARG PIP_JMS_MIRROR=https://pypi.douban.com/simple
ENV PIP_JMS_MIRROR=$PIP_JMS_MIRROR
WORKDIR /opt/jumpserver
COPY ./requirements ./requirements
RUN useradd jumpserver
RUN yum -y install epel-release && \
echo -e "[mysql]\nname=mysql\nbaseurl=${MYSQL_MIRROR}\ngpgcheck=0\nenabled=1" > /etc/yum.repos.d/mysql.repo
RUN yum -y install $(cat requirements/rpm_requirements.txt)
RUN pip install --upgrade pip setuptools wheel -i ${PIP_MIRROR} && \
pip config set global.index-url ${PIP_MIRROR}
RUN pip install -r requirements/requirements.txt || pip install -r requirements/requirements.txt
COPY ./requirements/deb_buster_requirements.txt ./requirements/deb_buster_requirements.txt
RUN sed -i 's/deb.debian.org/mirrors.aliyun.com/g' /etc/apt/sources.list \
&& sed -i 's/security.debian.org/mirrors.aliyun.com/g' /etc/apt/sources.list \
&& apt update \
&& grep -v '^#' ./requirements/deb_buster_requirements.txt | xargs apt -y install \
&& localedef -c -f UTF-8 -i zh_CN zh_CN.UTF-8 \
&& cp /usr/share/zoneinfo/Asia/Shanghai /etc/localtime
COPY ./requirements/requirements.txt ./requirements/requirements.txt
RUN pip install --upgrade pip==20.2.4 setuptools==49.6.0 wheel==0.34.2 -i ${PIP_MIRROR} \
&& pip config set global.index-url ${PIP_MIRROR} \
&& pip install --no-cache-dir $(grep 'jms' requirements/requirements.txt) -i ${PIP_JMS_MIRROR} \
&& pip install --no-cache-dir -r requirements/requirements.txt
COPY --from=stage-build /opt/jumpserver/release/jumpserver /opt/jumpserver
RUN mkdir -p /root/.ssh/ && echo -e "Host *\n\tStrictHostKeyChecking no\n\tUserKnownHostsFile /dev/null" > /root/.ssh/config
RUN mkdir -p /root/.ssh/ \
&& echo "Host *\n\tStrictHostKeyChecking no\n\tUserKnownHostsFile /dev/null" > /root/.ssh/config
RUN echo > config.yml
VOLUME /opt/jumpserver/data
VOLUME /opt/jumpserver/logs
ENV LANG=zh_CN.UTF-8
ENV LC_ALL=zh_CN.UTF-8
EXPOSE 8070
EXPOSE 8080

View File

@@ -4,6 +4,10 @@
[![Django](https://img.shields.io/badge/django-2.2-brightgreen.svg?style=plastic)](https://www.djangoproject.com/)
[![Docker Pulls](https://img.shields.io/docker/pulls/jumpserver/jms_all.svg)](https://hub.docker.com/u/jumpserver)
|Developer Wanted|
|------------------|
|JumpServer 正在寻找开发者,一起为改变世界做些贡献吧,哪怕一点点,联系我 <ibuler@fit2cloud.com> |
JumpServer 是全球首款开源的堡垒机,使用 GNU GPL v2.0 开源协议,是符合 4A 规范的运维安全审计系统。
JumpServer 使用 Python / Django 为主进行开发,遵循 Web 2.0 规范,配备了业界领先的 Web Terminal 方案,交互界面美观、用户体验好。
@@ -21,7 +25,8 @@ JumpServer 采纳分布式架构,支持多机房跨区域部署,支持横向
- 无插件: 仅需浏览器,极致的 Web Terminal 使用体验;
- 多云支持: 一套系统,同时管理不同云上面的资产;
- 云端存储: 审计录像云端存储,永不丢失;
- 多租户: 一套系统,多个子公司和部门同时使用
- 多租户: 一套系统,多个子公司和部门同时使用
- 多应用支持: 数据库、Windows 远程应用、Kubernetes。
## 版本说明
@@ -194,13 +199,76 @@ v2.1.0 是 v2.0.0 之后的功能版本。
<td>文件传输</td>
<td>可对文件的上传、下载记录进行审计</td>
</tr>
<tr>
<td rowspan="20">数据库审计<br>Database</td>
<td rowspan="2">连接方式</td>
<td>命令方式</td>
</tr>
<tr>
<td>Web UI方式 (X-PACK)</td>
</tr>
<tr>
<td rowspan="4">支持的数据库</td>
<td>MySQL</td>
</tr>
<tr>
<td>Oracle (X-PACK)</td>
</tr>
<tr>
<td>MariaDB (X-PACK)</td>
</tr>
<tr>
<td>PostgreSQL (X-PACK)</td>
</tr>
<tr>
<td rowspan="6">功能亮点</td>
<td>语法高亮</td>
</tr>
<tr>
<td>SQL格式化</td>
</tr>
<tr>
<td>支持快捷键</td>
</tr>
<tr>
<td>支持选中执行</td>
</tr>
<tr>
<td>SQL历史查询</td>
</tr>
<tr>
<td>支持页面创建 DB, TABLE</td>
</tr>
<tr>
<td rowspan="2">会话审计</td>
<td>命令记录</td>
</tr>
<tr>
<td>录像回放</td>
</tr>
</table>
## 快速开始
- [极速安装](https://docs.jumpserver.org/zh/master/install/setup_by_fast/)
- [完整文档](https://docs.jumpserver.org)
- [演示视频](https://jumpserver.oss-cn-hangzhou.aliyuncs.com/jms-media/%E3%80%90%E6%BC%94%E7%A4%BA%E8%A7%86%E9%A2%91%E3%80%91Jumpserver%20%E5%A0%A1%E5%9E%92%E6%9C%BA%20V1.5.0%20%E6%BC%94%E7%A4%BA%E8%A7%86%E9%A2%91%20-%20final.mp4)
- [演示视频](https://www.bilibili.com/video/BV1ZV41127GB)
## 组件项目
- [Lina](https://github.com/jumpserver/lina) JumpServer Web UI 项目
- [Luna](https://github.com/jumpserver/luna) JumpServer Web Terminal 项目
- [Koko](https://github.com/jumpserver/koko) JumpServer 字符协议 Connector 项目,替代原来 Python 版本的 [Coco](https://github.com/jumpserver/coco)
- [Guacamole](https://github.com/jumpserver/docker-guacamole) JumpServer 图形协议 Connector 项目,依赖 [Apache Guacamole](https://guacamole.apache.org/)
## 致谢
- [Apache Guacamole](https://guacamole.apache.org/) Web页面连接 RDP, SSH, VNC 协议设备,JumpServer 图形化连接依赖
- [OmniDB](https://omnidb.org/) Web页面连接使用数据库,JumpServer Web数据库依赖
## JumpServer 企业版
- [申请企业版试用](https://jinshuju.net/f/kyOYpi)
> 注:企业版支持离线安装,申请通过后会提供高速下载链接。
## 案例研究

apps/.gitattributes vendored Normal file
View File

@@ -0,0 +1,2 @@
*.js linguist-language=python
*.html linguist-language=python

View File

@@ -1,3 +1,5 @@
from .application import *
from .mixin import *
from .remote_app import *
from .database_app import *
from .k8s_app import *

View File

@@ -0,0 +1,20 @@
# coding: utf-8
#
from orgs.mixins.api import OrgBulkModelViewSet
from .mixin import ApplicationAttrsSerializerViewMixin
from ..hands import IsOrgAdminOrAppUser
from .. import models, serializers
__all__ = [
'ApplicationViewSet',
]
class ApplicationViewSet(ApplicationAttrsSerializerViewMixin, OrgBulkModelViewSet):
model = models.Application
filter_fields = ('name', 'type', 'category')
search_fields = filter_fields
permission_classes = (IsOrgAdminOrAppUser,)
serializer_class = serializers.ApplicationSerializer

View File

@@ -0,0 +1,139 @@
import uuid
from common.exceptions import JMSException
from orgs.models import Organization
from .. import models
class ApplicationAttrsSerializerViewMixin:
def get_serializer_class(self):
serializer_class = super().get_serializer_class()
if getattr(self, 'swagger_fake_view', False):
return serializer_class
app_type = self.request.query_params.get('type')
app_category = self.request.query_params.get('category')
type_options = list(dict(models.Category.get_all_type_serializer_mapper()).keys())
category_options = list(dict(models.Category.get_category_serializer_mapper()).keys())
# ListAPIView has no `action` attribute
# Don't use the `method` attribute, because for an options (metadata) request the method is POST
action = getattr(self, 'action', 'list')
if app_type and app_type not in type_options:
raise JMSException(
'Invalid query parameter `type`, select from the following options: {}'
''.format(type_options)
)
if app_category and app_category not in category_options:
raise JMSException(
'Invalid query parameter `category`, select from the following options: {}'
''.format(category_options)
)
if action in [
'create', 'update', 'partial_update', 'bulk_update', 'partial_bulk_update'
] and not app_type:
# action: create / update
raise JMSException(
'The `{}` action must take the `type` query parameter'.format(action)
)
if app_type:
# action: create / update / list / retrieve / metadata
attrs_cls = models.Category.get_type_serializer_cls(app_type)
class_name = 'ApplicationDynamicSerializer{}'.format(app_type.title())
elif app_category:
# action: list / retrieve / metadata
attrs_cls = models.Category.get_category_serializer_cls(app_category)
class_name = 'ApplicationDynamicSerializer{}'.format(app_category.title())
else:
attrs_cls = models.Category.get_no_password_serializer_cls()
class_name = 'ApplicationDynamicSerializer'
cls = type(class_name, (serializer_class,), {'attrs': attrs_cls()})
return cls
class SerializeApplicationToTreeNodeMixin:
@staticmethod
def _serialize_db(db):
return {
'id': db.id,
'name': db.name,
'title': db.name,
'pId': '',
'open': False,
'iconSkin': 'database',
'meta': {'type': 'database_app'}
}
@staticmethod
def _serialize_remote_app(remote_app):
return {
'id': remote_app.id,
'name': remote_app.name,
'title': remote_app.name,
'pId': '',
'open': False,
'isParent': False,
'iconSkin': 'chrome',
'meta': {'type': 'remote_app'}
}
@staticmethod
def _serialize_cloud(cloud):
return {
'id': cloud.id,
'name': cloud.name,
'title': cloud.name,
'pId': '',
'open': False,
'isParent': False,
'iconSkin': 'k8s',
'meta': {'type': 'k8s_app'}
}
def _serialize_application(self, application):
method_name = f'_serialize_{application.category}'
data = getattr(self, method_name)(application)
data.update({
'pId': application.org.id,
'org_name': application.org_name
})
return data
def serialize_applications(self, applications):
data = [self._serialize_application(application) for application in applications]
return data
@staticmethod
def _serialize_organization(org):
return {
'id': org.id,
'name': org.name,
'title': org.name,
'pId': '',
'open': True,
'isParent': True,
'meta': {
'type': 'node'
}
}
def serialize_organizations(self, organizations):
data = [self._serialize_organization(org) for org in organizations]
return data
@staticmethod
def filter_organizations(applications):
organizations_id = set(applications.values_list('org_id', flat=True))
organizations = [Organization.get_instance(org_id) for org_id in organizations_id]
return organizations
def serialize_applications_with_org(self, applications):
organizations = self.filter_organizations(applications)
data_organizations = self.serialize_organizations(organizations)
data_applications = self.serialize_applications(applications)
data = data_organizations + data_applications
return data
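
For context, a minimal standalone sketch of the dynamic-serializer composition used in get_serializer_class() above. The class and field names below are illustrative (not part of this diff) and assume only that Django REST Framework is installed:

from rest_framework import serializers

class HostAttrsSerializer(serializers.Serializer):
    # Hypothetical attrs serializer, standing in for the per-type classes in the mappers
    host = serializers.CharField(max_length=128)
    port = serializers.IntegerField(default=22)

class BaseAppSerializer(serializers.Serializer):
    name = serializers.CharField(max_length=128)

# Compose a one-off serializer class whose `attrs` field matches the requested type,
# mirroring `type(class_name, (serializer_class,), {'attrs': attrs_cls()})` above.
DynamicSerializer = type(
    'ApplicationDynamicSerializerHost', (BaseAppSerializer,), {'attrs': HostAttrsSerializer()}
)

s = DynamicSerializer(data={'name': 'demo', 'attrs': {'host': '10.0.0.1', 'port': 22}})
assert s.is_valid(), s.errors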

View File

@@ -3,8 +3,9 @@
from orgs.mixins.api import OrgBulkModelViewSet
from orgs.mixins import generics
from common.exceptions import JMSException
from ..hands import IsOrgAdmin, IsAppUser
from ..models import RemoteApp
from .. import models
from ..serializers import RemoteAppSerializer, RemoteAppConnectionInfoSerializer
@@ -14,7 +15,7 @@ __all__ = [
class RemoteAppViewSet(OrgBulkModelViewSet):
model = RemoteApp
model = models.RemoteApp
filter_fields = ('name', 'type', 'comment')
search_fields = filter_fields
permission_classes = (IsOrgAdmin,)
@@ -22,6 +23,18 @@ class RemoteAppViewSet(OrgBulkModelViewSet):
class RemoteAppConnectionInfoApi(generics.RetrieveAPIView):
model = RemoteApp
model = models.Application
permission_classes = (IsAppUser, )
serializer_class = RemoteAppConnectionInfoSerializer
@staticmethod
def check_category_allowed(obj):
if not obj.category_is_remote_app:
raise JMSException(
'The request instance(`{}`) is not of category `remote_app`'.format(obj.category)
)
def get_object(self):
obj = super().get_object()
self.check_category_allowed(obj)
return obj

View File

@@ -0,0 +1,140 @@
# Generated by Django 2.2.13 on 2020-10-19 12:01
from django.db import migrations, models
import django.db.models.deletion
import django_mysql.models
import uuid
CATEGORY_DB_LIST = ['mysql', 'oracle', 'postgresql', 'mariadb']
CATEGORY_REMOTE_LIST = ['chrome', 'mysql_workbench', 'vmware_client', 'custom']
CATEGORY_CLOUD_LIST = ['k8s']
CATEGORY_DB = 'db'
CATEGORY_REMOTE = 'remote_app'
CATEGORY_CLOUD = 'cloud'
CATEGORY_LIST = [CATEGORY_DB, CATEGORY_REMOTE, CATEGORY_CLOUD]
def get_application_category(old_app):
_type = old_app.type
if _type in CATEGORY_DB_LIST:
category = CATEGORY_DB
elif _type in CATEGORY_REMOTE_LIST:
category = CATEGORY_REMOTE
elif _type in CATEGORY_CLOUD_LIST:
category = CATEGORY_CLOUD
else:
category = None
return category
def common_to_application_json(old_app):
category = get_application_category(old_app)
date_updated = old_app.date_updated if hasattr(old_app, 'date_updated') else old_app.date_created
return {
'id': old_app.id,
'name': old_app.name,
'type': old_app.type,
'category': category,
'comment': old_app.comment,
'created_by': old_app.created_by,
'date_created': old_app.date_created,
'date_updated': date_updated,
'org_id': old_app.org_id
}
def db_to_application_json(database):
app_json = common_to_application_json(database)
app_json.update({
'attrs': {
'host': database.host,
'port': database.port,
'database': database.database
}
})
return app_json
def remote_to_application_json(remote):
app_json = common_to_application_json(remote)
attrs = {
'asset': str(remote.asset.id),
'path': remote.path,
}
attrs.update(remote.params)
app_json.update({
'attrs': attrs
})
return app_json
def k8s_to_application_json(k8s):
app_json = common_to_application_json(k8s)
app_json.update({
'attrs': {
'cluster': k8s.cluster
}
})
return app_json
def migrate_and_integrate_applications(apps, schema_editor):
db_alias = schema_editor.connection.alias
database_app_model = apps.get_model("applications", "DatabaseApp")
remote_app_model = apps.get_model("applications", "RemoteApp")
k8s_app_model = apps.get_model("applications", "K8sApp")
database_apps = database_app_model.objects.using(db_alias).all()
remote_apps = remote_app_model.objects.using(db_alias).all()
k8s_apps = k8s_app_model.objects.using(db_alias).all()
database_applications = [db_to_application_json(db_app) for db_app in database_apps]
remote_applications = [remote_to_application_json(remote_app) for remote_app in remote_apps]
k8s_applications = [k8s_to_application_json(k8s_app) for k8s_app in k8s_apps]
applications_json = database_applications + remote_applications + k8s_applications
application_model = apps.get_model("applications", "Application")
applications = [
application_model(**application_json)
for application_json in applications_json
if application_json['category'] in CATEGORY_LIST
]
for application in applications:
if application_model.objects.using(db_alias).filter(name=application.name).exists():
application.name = '{}-{}'.format(application.name, application.type)
application.save()
class Migration(migrations.Migration):
dependencies = [
('assets', '0057_fill_node_value_assets_amount_and_parent_key'),
('applications', '0005_k8sapp'),
]
operations = [
migrations.CreateModel(
name='Application',
fields=[
('org_id', models.CharField(blank=True, db_index=True, default='', max_length=36, verbose_name='Organization')),
('id', models.UUIDField(default=uuid.uuid4, primary_key=True, serialize=False)),
('created_by', models.CharField(blank=True, max_length=32, null=True, verbose_name='Created by')),
('date_created', models.DateTimeField(auto_now_add=True, null=True, verbose_name='Date created')),
('date_updated', models.DateTimeField(auto_now=True, verbose_name='Date updated')),
('name', models.CharField(max_length=128, verbose_name='Name')),
('category', models.CharField(choices=[('db', 'Database'), ('remote_app', 'Remote app'), ('cloud', 'Cloud')], max_length=16, verbose_name='Category')),
('type', models.CharField(choices=[('mysql', 'MySQL'), ('oracle', 'Oracle'), ('postgresql', 'PostgreSQL'), ('mariadb', 'MariaDB'), ('chrome', 'Chrome'), ('mysql_workbench', 'MySQL Workbench'), ('vmware_client', 'vSphere Client'), ('custom', 'Custom'), ('k8s', 'Kubernetes')], max_length=16, verbose_name='Type')),
('attrs', django_mysql.models.JSONField(default=dict)),
('comment', models.TextField(blank=True, default='', max_length=128, verbose_name='Comment')),
('domain', models.ForeignKey(blank=True, null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='applications', to='assets.Domain', verbose_name='Domain')),
],
options={
'ordering': ('name',),
'unique_together': {('org_id', 'name')},
},
),
migrations.RunPython(migrate_and_integrate_applications),
]
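
A small worked example of the conversion above; OldDatabaseApp and its field values are made up for illustration, and the snippet would run alongside the helper functions defined in this migration:

class OldDatabaseApp:
    # Hypothetical stand-in for a DatabaseApp row being migrated (values are made up)
    id = 'e3c1a2b4-0000-0000-0000-000000000001'
    name = 'orders-db'
    type = 'mysql'
    comment = ''
    created_by = 'admin'
    date_created = None
    org_id = ''
    host = '10.0.0.5'
    port = 3306
    database = 'orders'

app_json = db_to_application_json(OldDatabaseApp())
# category resolves to 'db' because 'mysql' is in CATEGORY_DB_LIST, and the
# connection details are folded into the JSON 'attrs' column, roughly:
# {'name': 'orders-db', 'type': 'mysql', 'category': 'db',
#  'attrs': {'host': '10.0.0.5', 'port': 3306, 'database': 'orders'}, ...}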

View File

@@ -0,0 +1,18 @@
# Generated by Django 3.1 on 2020-11-19 03:10
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('applications', '0006_application'),
]
operations = [
migrations.AlterField(
model_name='application',
name='attrs',
field=models.JSONField(),
),
]

View File

@@ -1,3 +1,4 @@
from .application import *
from .remote_app import *
from .database_app import *
from .k8s_app import *

View File

@@ -0,0 +1,140 @@
from itertools import chain
from django.db import models
from django.utils.translation import ugettext_lazy as _
from orgs.mixins.models import OrgModelMixin
from common.mixins import CommonModelMixin
from common.db.models import ChoiceSet
class DBType(ChoiceSet):
mysql = 'mysql', 'MySQL'
oracle = 'oracle', 'Oracle'
pgsql = 'postgresql', 'PostgreSQL'
mariadb = 'mariadb', 'MariaDB'
@classmethod
def get_type_serializer_cls_mapper(cls):
from ..serializers import database_app
mapper = {
cls.mysql: database_app.MySQLAttrsSerializer,
cls.oracle: database_app.OracleAttrsSerializer,
cls.pgsql: database_app.PostgreAttrsSerializer,
cls.mariadb: database_app.MariaDBAttrsSerializer,
}
return mapper
class RemoteAppType(ChoiceSet):
chrome = 'chrome', 'Chrome'
mysql_workbench = 'mysql_workbench', 'MySQL Workbench'
vmware_client = 'vmware_client', 'vSphere Client'
custom = 'custom', _('Custom')
@classmethod
def get_type_serializer_cls_mapper(cls):
from ..serializers import remote_app
mapper = {
cls.chrome: remote_app.ChromeAttrsSerializer,
cls.mysql_workbench: remote_app.MySQLWorkbenchAttrsSerializer,
cls.vmware_client: remote_app.VMwareClientAttrsSerializer,
cls.custom: remote_app.CustomRemoteAppAttrsSeralizers,
}
return mapper
class CloudType(ChoiceSet):
k8s = 'k8s', 'Kubernetes'
@classmethod
def get_type_serializer_cls_mapper(cls):
from ..serializers import k8s_app
mapper = {
cls.k8s: k8s_app.K8sAttrsSerializer,
}
return mapper
class Category(ChoiceSet):
db = 'db', _('Database')
remote_app = 'remote_app', _('Remote app')
cloud = 'cloud', 'Cloud'
@classmethod
def get_category_type_mapper(cls):
return {
cls.db: DBType,
cls.remote_app: RemoteAppType,
cls.cloud: CloudType
}
@classmethod
def get_category_type_choices_mapper(cls):
return {
name: tp.choices
for name, tp in cls.get_category_type_mapper().items()
}
@classmethod
def get_type_choices(cls, category):
return cls.get_category_type_choices_mapper().get(category, [])
@classmethod
def get_all_type_choices(cls):
all_grouped_choices = tuple(cls.get_category_type_choices_mapper().values())
return tuple(chain(*all_grouped_choices))
@classmethod
def get_all_type_serializer_mapper(cls):
mapper = {}
for tp in cls.get_category_type_mapper().values():
mapper.update(tp.get_type_serializer_cls_mapper())
return mapper
@classmethod
def get_type_serializer_cls(cls, tp):
mapper = cls.get_all_type_serializer_mapper()
return mapper.get(tp, None)
@classmethod
def get_category_serializer_mapper(cls):
from ..serializers import remote_app, database_app, k8s_app
return {
cls.db: database_app.DBAttrsSerializer,
cls.remote_app: remote_app.RemoteAppAttrsSerializer,
cls.cloud: k8s_app.CloudAttrsSerializer,
}
@classmethod
def get_category_serializer_cls(cls, cg):
mapper = cls.get_category_serializer_mapper()
return mapper.get(cg, None)
@classmethod
def get_no_password_serializer_cls(cls):
from ..serializers import common
return common.NoPasswordSerializer
class Application(CommonModelMixin, OrgModelMixin):
name = models.CharField(max_length=128, verbose_name=_('Name'))
domain = models.ForeignKey('assets.Domain', null=True, blank=True, related_name='applications', verbose_name=_("Domain"), on_delete=models.SET_NULL)
category = models.CharField(max_length=16, choices=Category.choices, verbose_name=_('Category'))
type = models.CharField(max_length=16, choices=Category.get_all_type_choices(), verbose_name=_('Type'))
attrs = models.JSONField()
comment = models.TextField(
max_length=128, default='', blank=True, verbose_name=_('Comment')
)
class Meta:
unique_together = [('org_id', 'name')]
ordering = ('name',)
def __str__(self):
category_display = self.get_category_display()
type_display = self.get_type_display()
return f'{self.name}({type_display})[{category_display}]'
def category_is_remote_app(self):
return self.category == Category.remote_app
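
A rough, self-contained illustration of how the mapper methods above resolve a per-type attrs serializer; the stand-in classes and plain-string keys below are simplifications of the ChoiceSet members used in the real code:

# Simplified stand-ins for the serializer classes referenced by the mappers above
class MySQLAttrsSerializer: pass
class ChromeAttrsSerializer: pass

DB_MAPPER = {'mysql': MySQLAttrsSerializer}
REMOTE_APP_MAPPER = {'chrome': ChromeAttrsSerializer}

def get_type_serializer_cls(tp):
    # Merge every category's type->serializer mapper, then look up the requested type,
    # as Category.get_all_type_serializer_mapper()/get_type_serializer_cls() do above
    mapper = {}
    for category_mapper in (DB_MAPPER, REMOTE_APP_MAPPER):
        mapper.update(category_mapper)
    return mapper.get(tp, None)

assert get_type_serializer_cls('mysql') is MySQLAttrsSerializer
assert get_type_serializer_cls('unknown') is None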

View File

@@ -1,3 +1,5 @@
from .application import *
from .remote_app import *
from .database_app import *
from .k8s_app import *
from .common import *

View File

@@ -0,0 +1,42 @@
# coding: utf-8
#
from rest_framework import serializers
from django.utils.translation import ugettext_lazy as _
from orgs.mixins.serializers import BulkOrgResourceModelSerializer
from .. import models
__all__ = [
'ApplicationSerializer',
]
class ApplicationSerializer(BulkOrgResourceModelSerializer):
category_display = serializers.ReadOnlyField(source='get_category_display', label=_('Category'))
type_display = serializers.ReadOnlyField(source='get_type_display', label=_('Type'))
class Meta:
model = models.Application
fields = [
'id', 'name', 'category', 'category_display', 'type', 'type_display', 'attrs',
'domain', 'created_by', 'date_created', 'date_updated', 'comment'
]
read_only_fields = [
'created_by', 'date_created', 'date_updated', 'get_type_display',
]
def create(self, validated_data):
validated_data['attrs'] = validated_data.pop('attrs', {})
instance = super().create(validated_data)
return instance
def update(self, instance, validated_data):
new_attrs = validated_data.pop('attrs', {})
instance = super().update(instance, validated_data)
attrs = instance.attrs
attrs.update(new_attrs)
instance.attrs = attrs
instance.save()
return instance
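# Sketch of the attrs merge performed by update() above (request data assumed):
#   instance.attrs == {'host': 'db.example.org', 'port': 3306}
#   PATCH {'attrs': {'port': 3307}}
#   -> instance.attrs == {'host': 'db.example.org', 'port': 3307}  # existing keys kept, submitted keys override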

View File

@@ -0,0 +1,11 @@
from rest_framework import serializers
class NoPasswordSerializer(serializers.JSONField):
def to_representation(self, value):
new_value = {}
for k, v in value.items():
if 'password' not in k:
new_value[k] = v
return new_value
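# Sketch of what to_representation() strips (example payload assumed):
#   value = {'chrome_username': 'admin', 'chrome_password': 's3cret'}
#   NoPasswordSerializer().to_representation(value) -> {'chrome_username': 'admin'}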

View File

@@ -1,14 +1,35 @@
# coding: utf-8
#
from rest_framework import serializers
from django.utils.translation import ugettext_lazy as _
from orgs.mixins.serializers import BulkOrgResourceModelSerializer
from common.serializers import AdaptedBulkListSerializer
from .. import models
__all__ = [
'DatabaseAppSerializer',
]
class DBAttrsSerializer(serializers.Serializer):
host = serializers.CharField(max_length=128, label=_('Host'))
port = serializers.IntegerField(label=_('Port'))
# allow_null=True keeps compatibility with existing records whose database field is None
database = serializers.CharField(max_length=128, required=True, allow_null=True, label=_('Database'))
class MySQLAttrsSerializer(DBAttrsSerializer):
port = serializers.IntegerField(default=3306, label=_('Port'))
class PostgreAttrsSerializer(DBAttrsSerializer):
port = serializers.IntegerField(default=5432, label=_('Port'))
class OracleAttrsSerializer(DBAttrsSerializer):
port = serializers.IntegerField(default=1521, label=_('Port'))
class MariaDBAttrsSerializer(MySQLAttrsSerializer):
pass
class DatabaseAppSerializer(BulkOrgResourceModelSerializer):
@@ -24,3 +45,6 @@ class DatabaseAppSerializer(BulkOrgResourceModelSerializer):
'created_by', 'date_created', 'date_updated',
'get_type_display',
]
extra_kwargs = {
'get_type_display': {'label': _('Type for display')},
}

View File

@@ -1,15 +1,20 @@
from rest_framework import serializers
from django.utils.translation import ugettext_lazy as _
from orgs.mixins.serializers import BulkOrgResourceModelSerializer
from .. import models
__all__ = [
'K8sAppSerializer',
]
class CloudAttrsSerializer(serializers.Serializer):
cluster = serializers.CharField(max_length=1024, label=_('Cluster'))
class K8sAttrsSerializer(CloudAttrsSerializer):
pass
class K8sAppSerializer(BulkOrgResourceModelSerializer):
type_display = serializers.CharField(source='get_type_display', read_only=True)
type_display = serializers.CharField(source='get_type_display', read_only=True, label=_('Type for display'))
class Meta:
model = models.K8sApp

View File

@@ -2,21 +2,138 @@
#
import copy
from django.utils.translation import ugettext_lazy as _
from django.core.exceptions import ObjectDoesNotExist
from rest_framework import serializers
from common.serializers import AdaptedBulkListSerializer
from common.fields.serializer import CustomMetaDictField
from common.utils import get_logger
from orgs.mixins.serializers import BulkOrgResourceModelSerializer
from assets.models import Asset
from .. import const
from ..models import RemoteApp
from ..models import RemoteApp, Category, Application
logger = get_logger(__file__)
__all__ = [
'RemoteAppSerializer', 'RemoteAppConnectionInfoSerializer',
]
class CharPrimaryKeyRelatedField(serializers.PrimaryKeyRelatedField):
def to_internal_value(self, data):
instance = super().to_internal_value(data)
return str(instance.id)
def to_representation(self, value):
# value is instance.id
if self.pk_field is not None:
return self.pk_field.to_representation(value)
return value
class RemoteAppAttrsSerializer(serializers.Serializer):
asset_info = serializers.SerializerMethodField()
asset = CharPrimaryKeyRelatedField(queryset=Asset.objects, required=False, label=_("Asset"))
path = serializers.CharField(max_length=128, label=_('Application path'))
@staticmethod
def get_asset_info(obj):
asset_info = {}
asset_id = obj.get('asset')
if not asset_id:
return asset_info
try:
asset = Asset.objects.get(id=asset_id)
asset_info.update({
'id': str(asset.id),
'hostname': asset.hostname
})
except ObjectDoesNotExist as e:
logger.error(e)
return asset_info
class ChromeAttrsSerializer(RemoteAppAttrsSerializer):
REMOTE_APP_PATH = r'C:\Program Files (x86)\Google\Chrome\Application\chrome.exe'
path = serializers.CharField(max_length=128, label=_('Application path'), default=REMOTE_APP_PATH)
chrome_target = serializers.CharField(max_length=128, allow_blank=True, required=False, label=_('Target URL'))
chrome_username = serializers.CharField(max_length=128, allow_blank=True, required=False, label=_('Username'))
chrome_password = serializers.CharField(max_length=128, allow_blank=True, required=False, write_only=True, label=_('Password'))
class MySQLWorkbenchAttrsSerializer(RemoteAppAttrsSerializer):
REMOTE_APP_PATH = r'C:\Program Files\MySQL\MySQL Workbench 8.0 CE\MySQLWorkbench.exe'
path = serializers.CharField(max_length=128, label=_('Application path'), default=REMOTE_APP_PATH)
mysql_workbench_ip = serializers.CharField(max_length=128, allow_blank=True, required=False, label=_('IP'))
mysql_workbench_port = serializers.IntegerField(required=False, label=_('Port'))
mysql_workbench_name = serializers.CharField(max_length=128, allow_blank=True, required=False, label=_('Database'))
mysql_workbench_username = serializers.CharField(max_length=128, allow_blank=True, required=False, label=_('Username'))
mysql_workbench_password = serializers.CharField(max_length=128, allow_blank=True, required=False, write_only=True, label=_('Password'))
class VMwareClientAttrsSerializer(RemoteAppAttrsSerializer):
REMOTE_APP_PATH = r'C:\Program Files (x86)\VMware\Infrastructure\Virtual Infrastructure Client\Launcher\VpxClient.exe'
path = serializers.CharField(max_length=128, label=_('Application path'), default=REMOTE_APP_PATH)
vmware_target = serializers.CharField(max_length=128, allow_blank=True, required=False, label=_('Target URL'))
vmware_username = serializers.CharField(max_length=128, allow_blank=True, required=False, label=_('Username'))
vmware_password = serializers.CharField(max_length=128, allow_blank=True, required=False, write_only=True, label=_('Password'))
class CustomRemoteAppAttrsSeralizers(RemoteAppAttrsSerializer):
custom_cmdline = serializers.CharField(max_length=128, allow_blank=True, required=False, label=_('Operating parameter'))
custom_target = serializers.CharField(max_length=128, allow_blank=True, required=False, label=_('Target url'))
custom_username = serializers.CharField(max_length=128, allow_blank=True, required=False, label=_('Username'))
custom_password = serializers.CharField(max_length=128, allow_blank=True, required=False, write_only=True, label=_('Password'))
class RemoteAppConnectionInfoSerializer(serializers.ModelSerializer):
parameter_remote_app = serializers.SerializerMethodField()
asset = serializers.SerializerMethodField()
class Meta:
model = Application
fields = [
'id', 'name', 'asset', 'parameter_remote_app',
]
read_only_fields = ['parameter_remote_app']
@staticmethod
def get_parameters(obj):
"""
Return the `parameters` value of the RemoteApp connection info required by Guacamole
"""
serializer_cls = Category.get_type_serializer_cls(obj.type)
fields = serializer_cls().get_fields()
fields.pop('asset', None)
fields_name = list(fields.keys())
attrs = obj.attrs
_parameters = list()
_parameters.append(obj.type)
for field_name in list(fields_name):
value = attrs.get(field_name, None)
if not value:
continue
if field_name == 'path':
value = '\"%s\"' % value
_parameters.append(str(value))
_parameters = ' '.join(_parameters)
return _parameters
def get_parameter_remote_app(self, obj):
parameters = self.get_parameters(obj)
parameter = {
'program': const.REMOTE_APP_BOOT_PROGRAM_NAME,
'working_directory': '',
'parameters': parameters,
}
return parameter
@staticmethod
def get_asset(obj):
return obj.attrs.get('asset')
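# Rough shape of the values built above, using ChromeAttrsSerializer fields (attrs values and the 'chrome' type value are assumed):
#   obj.attrs == {'path': 'C:\\...\\chrome.exe', 'chrome_target': 'https://example.com', 'chrome_username': 'web', 'chrome_password': 's3cret'}
#   get_parameters(obj) -> 'chrome "C:\\...\\chrome.exe" https://example.com web s3cret'
#   get_parameter_remote_app(obj) -> {'program': const.REMOTE_APP_BOOT_PROGRAM_NAME, 'working_directory': '', 'parameters': (the string above)}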
# TODO: DELETE
class RemoteAppParamsDictField(CustomMetaDictField):
type_fields_map = const.REMOTE_APP_TYPE_FIELDS_MAP
default_type = const.REMOTE_APP_TYPE_CHROME
@@ -24,8 +141,9 @@ class RemoteAppParamsDictField(CustomMetaDictField):
convert_key_to_upper = False
# TODO: DELETE
class RemoteAppSerializer(BulkOrgResourceModelSerializer):
params = RemoteAppParamsDictField()
params = RemoteAppParamsDictField(label=_('Parameters'))
type_fields_map = const.REMOTE_APP_TYPE_FIELDS_MAP
class Meta:
@@ -39,6 +157,10 @@ class RemoteAppSerializer(BulkOrgResourceModelSerializer):
'created_by', 'date_created', 'asset_info',
'get_type_display'
]
extra_kwargs = {
'asset_info': {'label': _('Asset info')},
'get_type_display': {'label': _('Type for display')},
}
def process_params(self, instance, validated_data):
new_params = copy.deepcopy(validated_data.get('params', {}))
@@ -66,21 +188,3 @@ class RemoteAppSerializer(BulkOrgResourceModelSerializer):
return super().update(instance, validated_data)
class RemoteAppConnectionInfoSerializer(serializers.ModelSerializer):
parameter_remote_app = serializers.SerializerMethodField()
class Meta:
model = RemoteApp
fields = [
'id', 'name', 'asset', 'parameter_remote_app',
]
read_only_fields = ['parameter_remote_app']
@staticmethod
def get_parameter_remote_app(obj):
parameter = {
'program': const.REMOTE_APP_BOOT_PROGRAM_NAME,
'working_directory': '',
'parameters': obj.parameters,
}
return parameter

View File

@@ -10,6 +10,7 @@ from .. import api
app_name = 'applications'
router = BulkRouter()
router.register(r'applications', api.ApplicationViewSet, 'application')
router.register(r'remote-apps', api.RemoteAppViewSet, 'remote-app')
router.register(r'database-apps', api.DatabaseAppViewSet, 'database-app')
router.register(r'k8s-apps', api.K8sAppViewSet, 'k8s-app')

View File

@@ -1,3 +1,4 @@
from .mixin import *
from .admin_user import *
from .asset import *
from .label import *

View File

@@ -94,7 +94,6 @@ class AdminUserAssetsListView(generics.ListAPIView):
permission_classes = (IsOrgAdmin,)
serializer_class = serializers.AssetSimpleSerializer
filter_fields = ("hostname", "ip")
http_method_names = ['get']
search_fields = filter_fields
def get_object(self):

View File

@@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
#
from assets.api import FilterAssetByNodeMixin
from rest_framework.viewsets import ModelViewSet
from rest_framework.generics import RetrieveAPIView
from django.shortcuts import get_object_or_404
@@ -14,7 +14,7 @@ from .. import serializers
from ..tasks import (
update_asset_hardware_info_manual, test_asset_connectivity_manual
)
from ..filters import AssetByNodeFilterBackend, LabelFilterBackend, IpInFilterBackend
from ..filters import FilterAssetByNodeFilterBackend, LabelFilterBackend, IpInFilterBackend
logger = get_logger(__file__)
@@ -25,14 +25,14 @@ __all__ = [
]
class AssetViewSet(OrgBulkModelViewSet):
class AssetViewSet(FilterAssetByNodeMixin, OrgBulkModelViewSet):
"""
API endpoint that allows Asset to be viewed or edited.
"""
model = Asset
filter_fields = (
"hostname", "ip", "systemuser__id", "admin_user__id", "platform__base",
"is_active", 'ip'
"is_active"
)
search_fields = ("hostname", "ip")
ordering_fields = ("hostname", "ip", "port", "cpu_cores")
@@ -41,7 +41,7 @@ class AssetViewSet(OrgBulkModelViewSet):
'display': serializers.AssetDisplaySerializer,
}
permission_classes = (IsOrgAdminOrAppUser,)
extra_filter_backends = [AssetByNodeFilterBackend, LabelFilterBackend, IpInFilterBackend]
extra_filter_backends = [FilterAssetByNodeFilterBackend, LabelFilterBackend, IpInFilterBackend]
def set_assets_node(self, assets):
if not isinstance(assets, list):
@@ -115,7 +115,6 @@ class AssetTaskCreateApi(generics.CreateAPIView):
class AssetGatewayListApi(generics.ListAPIView):
permission_classes = (IsOrgAdminOrAppUser,)
serializer_class = serializers.GatewayWithAuthSerializer
model = Asset
def get_queryset(self):
asset_id = self.kwargs.get('pk')

View File

@@ -1,7 +1,9 @@
# ~*~ coding: utf-8 ~*~
from rest_framework.views import APIView, Response
from django.views.generic.detail import SingleObjectMixin
from django.utils.translation import ugettext as _
from rest_framework.views import APIView, Response
from rest_framework.serializers import ValidationError
from common.utils import get_logger
from common.permissions import IsOrgAdmin, IsOrgAdminOrAppUser
@@ -42,6 +44,10 @@ class GatewayTestConnectionApi(SingleObjectMixin, APIView):
def post(self, request, *args, **kwargs):
self.object = self.get_object(Gateway.objects.all())
local_port = self.request.data.get('port') or self.object.port
try:
local_port = int(local_port)
except ValueError:
raise ValidationError({'port': _('Number required')})
ok, e = self.object.test_connective(local_port=local_port)
if ok:
return Response("ok")

apps/assets/api/mixin.py
View File

@@ -0,0 +1,90 @@
from typing import List
from assets.models import Node, Asset
from assets.pagination import AssetLimitOffsetPagination
from common.utils import lazyproperty, dict_get_any, is_uuid, get_object_or_none
from assets.utils import get_node, is_query_node_all_assets
class SerializeToTreeNodeMixin:
permission_classes = ()
def serialize_nodes(self, nodes: List[Node], with_asset_amount=False):
if with_asset_amount:
def _name(node: Node):
return '{} ({})'.format(node.value, node.assets_amount)
else:
def _name(node: Node):
return node.value
data = [
{
'id': node.key,
'name': _name(node),
'title': _name(node),
'pId': node.parent_key,
'isParent': True,
'open': node.is_org_root(),
'meta': {
'node': {
"id": node.id,
"key": node.key,
"value": node.value,
},
'type': 'node'
}
}
for node in nodes
]
return data
def get_platform(self, asset: Asset):
default = 'file'
icon = {'windows', 'linux'}
platform = asset.platform_base.lower()
if platform in icon:
return platform
return default
def serialize_assets(self, assets, node_key=None):
if node_key is None:
get_pid = lambda asset: getattr(asset, 'parent_key', '')
else:
get_pid = lambda asset: node_key
data = [
{
'id': str(asset.id),
'name': asset.hostname,
'title': asset.ip,
'pId': get_pid(asset),
'isParent': False,
'open': False,
'iconSkin': self.get_platform(asset),
'chkDisabled': not asset.is_active,
'meta': {
'type': 'asset',
'asset': {
'id': asset.id,
'hostname': asset.hostname,
'ip': asset.ip,
'protocols': asset.protocols_as_list,
'platform': asset.platform_base,
'org_name': asset.org_name
},
}
}
for asset in assets
]
return data
class FilterAssetByNodeMixin:
pagination_class = AssetLimitOffsetPagination
@lazyproperty
def is_query_node_all_assets(self):
return is_query_node_all_assets(self.request)
@lazyproperty
def node(self):
return get_node(self.request)
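# Sketch of how the mixin is driven by query parameters (parameter names as used by the filter backends in this changeset):
#   GET .../assets/?node_id=NODE_UUID with all=1 -> view.node is set and is_query_node_all_assets is True (whole subtree)
#   GET .../assets/?node_id=NODE_UUID with all=0 -> only assets attached directly to that node are kept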

View File

@@ -1,16 +1,28 @@
# ~*~ coding: utf-8 ~*~
from functools import partial
from collections import namedtuple, defaultdict
from collections import namedtuple
from rest_framework import status
from rest_framework.serializers import ValidationError
from rest_framework.response import Response
from rest_framework.decorators import action
from django.utils.translation import ugettext_lazy as _
from django.shortcuts import get_object_or_404, Http404
from django.utils.decorators import method_decorator
from django.db.models.signals import m2m_changed
from common.const.http import POST
from common.exceptions import SomeoneIsDoingThis
from common.const.signals import PRE_REMOVE, POST_REMOVE
from assets.models import Asset
from common.utils import get_logger, get_object_or_none
from common.tree import TreeNodeSerializer
from common.const.distributed_lock_key import UPDATE_NODE_TREE_LOCK_KEY
from orgs.mixins.api import OrgModelViewSet
from orgs.mixins import generics
from orgs.lock import org_level_transaction_lock
from orgs.utils import current_org
from assets.tasks import check_node_assets_amount_task
from ..hands import IsOrgAdmin
from ..models import Node
from ..tasks import (
@@ -18,12 +30,13 @@ from ..tasks import (
test_node_assets_connectivity_manual,
)
from .. import serializers
from .mixin import SerializeToTreeNodeMixin
logger = get_logger(__file__)
__all__ = [
'NodeViewSet', 'NodeChildrenApi', 'NodeAssetsApi',
'NodeAddAssetsApi', 'NodeRemoveAssetsApi', 'NodeReplaceAssetsApi',
'NodeAddAssetsApi', 'NodeRemoveAssetsApi', 'MoveAssetsToNodeApi',
'NodeAddChildrenApi', 'NodeListAsTreeApi',
'NodeChildrenAsTreeApi',
'NodeTaskCreateApi',
@@ -37,6 +50,11 @@ class NodeViewSet(OrgModelViewSet):
permission_classes = (IsOrgAdmin,)
serializer_class = serializers.NodeSerializer
@action(methods=[POST], detail=False, url_name='launch-check-assets-amount-task')
def launch_check_assets_amount_task(self, request):
task = check_node_assets_amount_task.delay(current_org.id)
return Response(data={'task': task.id})
# Only creating child nodes directly under the org root is supported here; nodes under other nodes must be created via the children API
def perform_create(self, serializer):
child_key = Node.org_root().get_next_child_key()
@@ -52,6 +70,9 @@ class NodeViewSet(OrgModelViewSet):
def destroy(self, request, *args, **kwargs):
node = self.get_object()
if node.is_org_root():
error = _("You can't delete the root node ({})".format(node.value))
return Response(data={'error': error}, status=status.HTTP_403_FORBIDDEN)
if node.has_children_or_has_assets():
error = _("Deletion failed and the node contains children or assets")
return Response(data={'error': error}, status=status.HTTP_403_FORBIDDEN)
@@ -136,7 +157,7 @@ class NodeChildrenApi(generics.ListCreateAPIView):
return queryset
class NodeChildrenAsTreeApi(NodeChildrenApi):
class NodeChildrenAsTreeApi(SerializeToTreeNodeMixin, NodeChildrenApi):
"""
Return the node's children serialized as tree nodes,
[
@@ -150,31 +171,23 @@ class NodeChildrenAsTreeApi(NodeChildrenApi):
"""
model = Node
serializer_class = TreeNodeSerializer
http_method_names = ['get']
def get_queryset(self):
queryset = super().get_queryset()
queryset = [node.as_tree_node() for node in queryset]
queryset = self.add_assets_if_need(queryset)
queryset = sorted(queryset)
return queryset
def list(self, request, *args, **kwargs):
nodes = self.get_queryset().order_by('value')
nodes = self.serialize_nodes(nodes, with_asset_amount=True)
assets = self.get_assets()
data = [*nodes, *assets]
return Response(data=data)
def add_assets_if_need(self, queryset):
def get_assets(self):
include_assets = self.request.query_params.get('assets', '0') == '1'
if not include_assets:
return queryset
return []
assets = self.instance.get_assets().only(
"id", "hostname", "ip", "os",
"org_id", "protocols",
"org_id", "protocols", "is_active"
)
for asset in assets:
queryset.append(asset.as_tree_node(self.instance))
return queryset
def check_need_refresh_nodes(self):
if self.request.query_params.get('refresh', '0') == '1':
Node.refresh_nodes()
return self.serialize_assets(assets, self.instance.key)
class NodeAssetsApi(generics.ListAPIView):
@@ -200,14 +213,14 @@ class NodeAddChildrenApi(generics.UpdateAPIView):
def put(self, request, *args, **kwargs):
instance = self.get_object()
nodes_id = request.data.get("nodes")
children = [get_object_or_none(Node, id=pk) for pk in nodes_id]
children = Node.objects.filter(id__in=nodes_id)
for node in children:
if not node:
continue
node.parent = instance
return Response("OK")
@method_decorator(org_level_transaction_lock(UPDATE_NODE_TREE_LOCK_KEY), name='patch')
@method_decorator(org_level_transaction_lock(UPDATE_NODE_TREE_LOCK_KEY), name='put')
class NodeAddAssetsApi(generics.UpdateAPIView):
model = Node
serializer_class = serializers.NodeAssetsSerializer
@@ -220,6 +233,8 @@ class NodeAddAssetsApi(generics.UpdateAPIView):
instance.assets.add(*tuple(assets))
@method_decorator(org_level_transaction_lock(UPDATE_NODE_TREE_LOCK_KEY), name='patch')
@method_decorator(org_level_transaction_lock(UPDATE_NODE_TREE_LOCK_KEY), name='put')
class NodeRemoveAssetsApi(generics.UpdateAPIView):
model = Node
serializer_class = serializers.NodeAssetsSerializer
@@ -228,15 +243,17 @@ class NodeRemoveAssetsApi(generics.UpdateAPIView):
def perform_update(self, serializer):
assets = serializer.validated_data.get('assets')
instance = self.get_object()
if instance != Node.org_root():
instance.assets.remove(*tuple(assets))
else:
assets = [asset for asset in assets if asset.nodes.count() > 1]
instance.assets.remove(*tuple(assets))
node = self.get_object()
node.assets.remove(*assets)
# Attach orphan assets (left with no node) to the org root node
orphan_assets = Asset.objects.filter(id__in=[a.id for a in assets], nodes__isnull=True).distinct()
Node.org_root().assets.add(*orphan_assets)
class NodeReplaceAssetsApi(generics.UpdateAPIView):
@method_decorator(org_level_transaction_lock(UPDATE_NODE_TREE_LOCK_KEY), name='patch')
@method_decorator(org_level_transaction_lock(UPDATE_NODE_TREE_LOCK_KEY), name='put')
class MoveAssetsToNodeApi(generics.UpdateAPIView):
model = Node
serializer_class = serializers.NodeAssetsSerializer
permission_classes = (IsOrgAdmin,)
@@ -244,9 +261,39 @@ class NodeReplaceAssetsApi(generics.UpdateAPIView):
def perform_update(self, serializer):
assets = serializer.validated_data.get('assets')
instance = self.get_object()
for asset in assets:
asset.nodes.set([instance])
node = self.get_object()
self.remove_old_nodes(assets)
node.assets.add(*assets)
def remove_old_nodes(self, assets):
m2m_model = Asset.nodes.through
# Query the asset-node relation table for all relations of the assets being moved
relates = m2m_model.objects.filter(asset__in=assets).values_list('asset_id', 'node_id')
if relates:
# Group the relations by asset, used to send the `reverse=False` signals
asset_nodes_mapper = defaultdict(set)
for asset_id, node_id in relates:
asset_nodes_mapper[asset_id].add(node_id)
# Build a mapper from asset id -> Asset
asset_mapper = {asset.id: asset for asset in assets}
# Create the signal-sending functions for the relations to be removed
senders = []
for asset_id, node_id_set in asset_nodes_mapper.items():
senders.append(partial(m2m_changed.send, sender=m2m_model, instance=asset_mapper[asset_id],
reverse=False, model=Node, pk_set=node_id_set))
# Send the pre signals
[sender(action=PRE_REMOVE) for sender in senders]
num = len(relates)
asset_ids, node_ids = zip(*relates)
# Delete the old relations
rows, _i = m2m_model.objects.filter(asset_id__in=asset_ids, node_id__in=node_ids).delete()
if rows != num:
raise SomeoneIsDoingThis
# Send the post signals
[sender(action=POST_REMOVE) for sender in senders]
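# Sketch of the signal flow per moved asset (mirrors the senders built above):
#   m2m_changed.send(sender=Asset.nodes.through, instance=asset, reverse=False, model=Node, pk_set={old node ids}, action=PRE_REMOVE)
#   (old relation rows are then bulk-deleted; SomeoneIsDoingThis is raised if another request already removed some)
#   m2m_changed.send(..., action=POST_REMOVE)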
class NodeTaskCreateApi(generics.CreateAPIView):
@@ -267,7 +314,6 @@ class NodeTaskCreateApi(generics.CreateAPIView):
@staticmethod
def refresh_nodes_cache():
Node.refresh_nodes()
Task = namedtuple('Task', ['id'])
task = Task(id="0")
return task

View File

@@ -3,7 +3,8 @@ from django.shortcuts import get_object_or_404
from rest_framework.response import Response
from common.utils import get_logger
from common.permissions import IsOrgAdmin, IsOrgAdminOrAppUser, IsAppUser
from common.permissions import IsOrgAdmin, IsOrgAdminOrAppUser
from common.drf.filters import CustomFilter
from orgs.mixins.api import OrgBulkModelViewSet
from orgs.mixins import generics
from orgs.utils import tmp_to_org
@@ -12,14 +13,14 @@ from .. import serializers
from ..serializers import SystemUserWithAuthInfoSerializer
from ..tasks import (
push_system_user_to_assets_manual, test_system_user_connectivity_manual,
push_system_user_a_asset_manual,
push_system_user_to_assets
)
logger = get_logger(__file__)
__all__ = [
'SystemUserViewSet', 'SystemUserAuthInfoApi', 'SystemUserAssetAuthInfoApi',
'SystemUserCommandFilterRuleListApi', 'SystemUserTaskApi',
'SystemUserCommandFilterRuleListApi', 'SystemUserTaskApi', 'SystemUserAssetsListView',
]
@@ -82,18 +83,18 @@ class SystemUserTaskApi(generics.CreateAPIView):
permission_classes = (IsOrgAdmin,)
serializer_class = serializers.SystemUserTaskSerializer
def do_push(self, system_user, asset=None):
if asset is None:
def do_push(self, system_user, assets_id=None):
if assets_id is None:
task = push_system_user_to_assets_manual.delay(system_user)
else:
username = self.request.query_params.get('username')
task = push_system_user_a_asset_manual.delay(
system_user, asset, username=username
task = push_system_user_to_assets.delay(
system_user.id, assets_id, username=username
)
return task
@staticmethod
def do_test(system_user, asset=None):
def do_test(system_user):
task = test_system_user_connectivity_manual.delay(system_user)
return task
@@ -104,11 +105,16 @@ class SystemUserTaskApi(generics.CreateAPIView):
def perform_create(self, serializer):
action = serializer.validated_data["action"]
asset = serializer.validated_data.get('asset')
assets = serializer.validated_data.get('assets') or []
system_user = self.get_object()
if action == 'push':
task = self.do_push(system_user, asset)
assets = [asset] if asset else assets
assets_id = [asset.id for asset in assets]
assets_id = assets_id if assets_id else None
task = self.do_push(system_user, assets_id)
else:
task = self.do_test(system_user, asset)
task = self.do_test(system_user)
data = getattr(serializer, '_data', {})
data["task"] = task.id
setattr(serializer, '_data', data)
@@ -125,3 +131,18 @@ class SystemUserCommandFilterRuleListApi(generics.ListAPIView):
pk = self.kwargs.get('pk', None)
system_user = get_object_or_404(SystemUser, pk=pk)
return system_user.cmd_filter_rules
class SystemUserAssetsListView(generics.ListAPIView):
permission_classes = (IsOrgAdmin,)
serializer_class = serializers.AssetSimpleSerializer
filter_fields = ("hostname", "ip")
search_fields = filter_fields
def get_object(self):
pk = self.kwargs.get('pk')
return get_object_or_404(SystemUser, pk=pk)
def get_queryset(self):
system_user = self.get_object()
return system_user.get_all_assets()

View File

@@ -95,7 +95,7 @@ class SystemUserNodeRelationViewSet(BaseRelationViewSet):
'id', 'node', 'systemuser',
]
search_fields = [
"node__value", "systemuser__name", "systemuser_username"
"node__value", "systemuser__name", "systemuser__username"
]
def get_objects_attr(self):

View File

@@ -0,0 +1,6 @@
from rest_framework import status
from common.exceptions import JMSException
class NodeIsBeingUpdatedByOthers(JMSException):
status_code = status.HTTP_409_CONFLICT

View File

@@ -5,8 +5,8 @@ from rest_framework.compat import coreapi, coreschema
from rest_framework import filters
from django.db.models import Q
from common.utils import dict_get_any, is_uuid, get_object_or_none
from .models import Node, Label
from .models import Label
from assets.utils import is_query_node_all_assets, get_node
class AssetByNodeFilterBackend(filters.BaseFilterBackend):
@@ -21,47 +21,54 @@ class AssetByNodeFilterBackend(filters.BaseFilterBackend):
for field in self.fields
]
@staticmethod
def is_query_all(request):
query_all_arg = request.query_params.get('all')
show_current_asset_arg = request.query_params.get('show_current_asset')
def filter_node_related_all(self, queryset, node):
return queryset.filter(
Q(nodes__key__istartswith=f'{node.key}:') |
Q(nodes__key=node.key)
).distinct()
query_all = query_all_arg == '1'
if show_current_asset_arg is not None:
query_all = show_current_asset_arg != '1'
return query_all
@staticmethod
def get_query_node(request):
node_id = dict_get_any(request.query_params, ['node', 'node_id'])
if not node_id:
return None, False
if is_uuid(node_id):
node = get_object_or_none(Node, id=node_id)
else:
node = get_object_or_none(Node, key=node_id)
return node, True
@staticmethod
def perform_query(pattern, queryset):
return queryset.filter(nodes__key__regex=pattern).distinct()
def filter_node_related_direct(self, queryset, node):
return queryset.filter(nodes__key=node.key).distinct()
def filter_queryset(self, request, queryset, view):
node, has_query_arg = self.get_query_node(request)
if not has_query_arg:
return queryset
node = get_node(request)
if node is None:
return queryset
query_all = self.is_query_all(request)
query_all = is_query_node_all_assets(request)
if query_all:
pattern = node.get_all_children_pattern(with_self=True)
return self.filter_node_related_all(queryset, node)
else:
# pattern = node.get_children_key_pattern(with_self=True)
# Only show assets directly under the current node
pattern = r"^{}$".format(node.key)
return self.perform_query(pattern, queryset)
return self.filter_node_related_direct(queryset, node)
class FilterAssetByNodeFilterBackend(filters.BaseFilterBackend):
"""
Must be used together with `assets.api.mixin.FilterAssetByNodeMixin`
"""
fields = ['node', 'all']
def get_schema_fields(self, view):
return [
coreapi.Field(
name=field, location='query', required=False,
type='string', example='', description='', schema=None,
)
for field in self.fields
]
def filter_queryset(self, request, queryset, view):
node = view.node
if node is None:
return queryset
query_all = view.is_query_node_all_assets
if query_all:
return queryset.filter(
Q(nodes__key__istartswith=f'{node.key}:') |
Q(nodes__key=node.key)
).distinct()
else:
return queryset.filter(nodes__key=node.key).distinct()
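# Key matching used above (example keys assumed):
#   node.key == '1:2'
#   query_all     -> assets on nodes whose key == '1:2' or starts with '1:2:' (the whole subtree)
#   not query_all -> assets attached directly to node '1:2' only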
class LabelFilterBackend(filters.BaseFilterBackend):
@@ -113,9 +120,14 @@ class LabelFilterBackend(filters.BaseFilterBackend):
class AssetRelatedByNodeFilterBackend(AssetByNodeFilterBackend):
@staticmethod
def perform_query(pattern, queryset):
return queryset.filter(asset__nodes__key__regex=pattern).distinct()
def filter_node_related_all(self, queryset, node):
return queryset.filter(
Q(asset__nodes__key__istartswith=f'{node.key}:') |
Q(asset__nodes__key=node.key)
).distinct()
def filter_node_related_direct(self, queryset, node):
return queryset.filter(asset__nodes__key=node.key).distinct()
class IpInFilterBackend(filters.BaseFilterBackend):

View File

@@ -0,0 +1,23 @@
# Generated by Django 2.2.13 on 2020-09-04 09:51
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('assets', '0055_auto_20200811_1845'),
]
operations = [
migrations.AddField(
model_name='node',
name='assets_amount',
field=models.IntegerField(default=0),
),
migrations.AddField(
model_name='node',
name='parent_key',
field=models.CharField(db_index=True, default='', max_length=64, verbose_name='Parent key'),
),
]

View File

@@ -0,0 +1,38 @@
# Generated by Django 2.2.13 on 2020-08-21 08:20
from django.db import migrations
from django.db.models import Q
def fill_node_value(apps, schema_editor):
Node = apps.get_model('assets', 'Node')
Asset = apps.get_model('assets', 'Asset')
node_queryset = Node.objects.all()
node_amount = node_queryset.count()
width = len(str(node_amount))
print('\n')
for i, node in enumerate(node_queryset):
print(f'\t{i+1:0>{width}}/{node_amount} compute node[{node.key}]`s assets_amount ...')
assets_amount = Asset.objects.filter(
Q(nodes__key__istartswith=f'{node.key}:') | Q(nodes=node)
).distinct().count()
key = node.key
try:
parent_key = key[:key.rindex(':')]
except ValueError:
parent_key = ''
node.assets_amount = assets_amount
node.parent_key = parent_key
node.save()
print(' ' + '.'*65, end='')
class Migration(migrations.Migration):
dependencies = [
('assets', '0056_auto_20200904_1751'),
]
operations = [
migrations.RunPython(fill_node_value)
]

View File

@@ -0,0 +1,27 @@
# Generated by Django 2.2.13 on 2020-10-23 03:15
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('assets', '0057_fill_node_value_assets_amount_and_parent_key'),
]
operations = [
migrations.AlterModelOptions(
name='asset',
options={'ordering': ['-date_created'], 'verbose_name': 'Asset'},
),
migrations.AlterField(
model_name='asset',
name='comment',
field=models.TextField(blank=True, default='', verbose_name='Comment'),
),
migrations.AlterField(
model_name='commandfilterrule',
name='content',
field=models.TextField(help_text='One line one command', verbose_name='Content'),
),
]

View File

@@ -0,0 +1,28 @@
# Generated by Django 2.2.13 on 2020-10-27 11:05
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('assets', '0058_auto_20201023_1115'),
]
operations = [
migrations.AlterField(
model_name='systemuser',
name='protocol',
field=models.CharField(choices=[('ssh', 'ssh'), ('rdp', 'rdp'), ('telnet', 'telnet'), ('vnc', 'vnc'), ('mysql', 'mysql'), ('oracle', 'oracle'), ('mariadb', 'mariadb'), ('postgresql', 'postgresql'), ('k8s', 'k8s')], default='ssh', max_length=16, verbose_name='Protocol'),
),
migrations.AddField(
model_name='systemuser',
name='ad_domain',
field=models.CharField(default='', max_length=256),
),
migrations.AlterField(
model_name='gateway',
name='ip',
field=models.CharField(db_index=True, max_length=128, verbose_name='IP'),
),
]

View File

@@ -0,0 +1,58 @@
# Generated by Django 2.2.13 on 2020-10-26 11:31
from django.db import migrations, models
def get_node_ancestor_keys(key, with_self=False):
parent_keys = []
key_list = key.split(":")
if not with_self:
key_list.pop()
for i in range(len(key_list)):
parent_keys.append(":".join(key_list))
key_list.pop()
return parent_keys
def migrate_nodes_value_with_slash(apps, schema_editor):
model = apps.get_model("assets", "Node")
db_alias = schema_editor.connection.alias
nodes = model.objects.using(db_alias).filter(value__contains='/')
print('')
print("- Start migrate node value if has /")
for i, node in enumerate(list(nodes)):
new_value = node.value.replace('/', '|')
print("{} start migrate node value: {} => {}".format(i, node.value, new_value))
node.value = new_value
node.save()
def migrate_nodes_full_value(apps, schema_editor):
model = apps.get_model("assets", "Node")
db_alias = schema_editor.connection.alias
nodes = model.objects.using(db_alias).all()
print("- Start migrate node full value")
for i, node in enumerate(list(nodes)):
print("{} start migrate {} node full value".format(i, node.value))
ancestor_keys = get_node_ancestor_keys(node.key, True)
values = model.objects.filter(key__in=ancestor_keys).values_list('key', 'value')
values = [v for k, v in sorted(values, key=lambda x: len(x[0]))]
node.full_value = '/' + '/'.join(values)
node.save()
class Migration(migrations.Migration):
dependencies = [
('assets', '0059_auto_20201027_1905'),
]
operations = [
migrations.AddField(
model_name='node',
name='full_value',
field=models.CharField(default='', max_length=4096, verbose_name='Full value'),
),
migrations.RunPython(migrate_nodes_value_with_slash),
migrations.RunPython(migrate_nodes_full_value)
]

View File

@@ -0,0 +1,17 @@
# Generated by Django 2.2.13 on 2020-11-16 09:57
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('assets', '0060_node_full_value'),
]
operations = [
migrations.AlterModelOptions(
name='node',
options={'ordering': ['value'], 'verbose_name': 'Node'},
),
]

View File

@@ -0,0 +1,17 @@
# Generated by Django 2.2.13 on 2020-11-17 11:38
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('assets', '0061_auto_20201116_1757'),
]
operations = [
migrations.AlterModelOptions(
name='asset',
options={'ordering': ['hostname', 'ip'], 'verbose_name': 'Asset'},
),
]

View File

@@ -0,0 +1,72 @@
# Generated by Jiangjie.Bai on 2020-12-01 10:47
from django.db import migrations
from django.db.models import Q
default_node_value = 'Default' # Always
old_default_node_key = '0' # Version <= 1.4.3
new_default_node_key = '1' # Version >= 1.4.4
def compute_parent_key(key):
try:
return key[:key.rindex(':')]
except ValueError:
return ''
def migrate_default_node_key(apps, schema_editor):
""" Change the key of the existing Default node from 0 to 1 """
# In version 1.4.3 the Default node's key was 0
print('')
Node = apps.get_model('assets', 'Node')
Asset = apps.get_model('assets', 'Asset')
# The node whose key is 0
old_default_node = Node.objects.filter(key=old_default_node_key, value=default_node_value).first()
if not old_default_node:
print(f'Check old default node `key={old_default_node_key} value={default_node_value}` does not exist')
return
print(f'Check old default node `key={old_default_node_key} value={default_node_value}` exists')
# The node whose key is 1
new_default_node = Node.objects.filter(key=new_default_node_key, value=default_node_value).first()
if new_default_node:
print(f'Check new default node `key={new_default_node_key} value={default_node_value}` exists')
all_assets = Asset.objects.filter(
Q(nodes__key__startswith=f'{new_default_node_key}:') | Q(nodes__key=new_default_node_key)
).distinct()
if all_assets:
print(f'Check new default node has assets (count: {len(all_assets)})')
return
all_children = Node.objects.filter(key__startswith=f'{new_default_node_key}:')
if all_children:
print(f'Check new default node has children nodes (count: {len(all_children)})')
return
print(f'Check new default node has no assets or children nodes, delete it.')
new_default_node.delete()
# Perform the key modification
print(f'Modify old default node `key` from `{old_default_node_key}` to `{new_default_node_key}`')
nodes = Node.objects.filter(
Q(key__istartswith=f'{old_default_node_key}:') | Q(key=old_default_node_key)
)
for node in nodes:
old_key = node.key
key_list = old_key.split(':', maxsplit=1)
key_list[0] = new_default_node_key
new_key = ':'.join(key_list)
node.key = new_key
node.parent_key = compute_parent_key(node.key)
# Bulk update the modified keys and parent keys
print(f'Bulk update nodes `key` and `parent_key`, (count: {len(nodes)})')
Node.objects.bulk_update(nodes, ['key', 'parent_key'])
class Migration(migrations.Migration):
dependencies = [
('assets', '0062_auto_20201117_1938'),
]
operations = [
migrations.RunPython(migrate_default_node_key)
]

View File

@@ -0,0 +1,17 @@
# Generated by Django 3.1 on 2020-12-03 03:00
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('assets', '0063_migrate_default_node_key'),
]
operations = [
migrations.AlterModelOptions(
name='node',
options={'ordering': ['parent_key', 'value'], 'verbose_name': 'Node'},
),
]

View File

@@ -47,6 +47,10 @@ class AssetManager(OrgManager):
)
class AssetOrgManager(OrgManager):
pass
class AssetQuerySet(models.QuerySet):
def active(self):
return self.filter(is_active=True)
@@ -223,9 +227,10 @@ class Asset(ProtocolsMixin, NodesRelationMixin, OrgModelMixin):
labels = models.ManyToManyField('assets.Label', blank=True, related_name='assets', verbose_name=_("Labels"))
created_by = models.CharField(max_length=128, null=True, blank=True, verbose_name=_('Created by'))
date_created = models.DateTimeField(auto_now_add=True, null=True, blank=True, verbose_name=_('Date created'))
comment = models.TextField(max_length=128, default='', blank=True, verbose_name=_('Comment'))
comment = models.TextField(default='', blank=True, verbose_name=_('Comment'))
objects = AssetManager.from_queryset(AssetQuerySet)()
org_objects = AssetOrgManager.from_queryset(AssetQuerySet)()
_connectivity = None
def __str__(self):
@@ -308,6 +313,12 @@ class Asset(ProtocolsMixin, NodesRelationMixin, OrgModelMixin):
}
return info
def nodes_display(self):
names = []
for n in self.nodes.all():
names.append(n.full_value)
return names
def as_node(self):
from .node import Node
fake_node = Node()
@@ -350,36 +361,4 @@ class Asset(ProtocolsMixin, NodesRelationMixin, OrgModelMixin):
class Meta:
unique_together = [('org_id', 'hostname')]
verbose_name = _("Asset")
@classmethod
def generate_fake(cls, count=100):
from .user import AdminUser, SystemUser
from random import seed, choice
from django.db import IntegrityError
from .node import Node
from orgs.utils import get_current_org
from orgs.models import Organization
org = get_current_org()
if not org or not org.is_real():
Organization.default().change_to()
nodes = list(Node.objects.all())
seed()
for i in range(count):
ip = [str(i) for i in random.sample(range(255), 4)]
asset = cls(ip='.'.join(ip),
hostname='.'.join(ip),
admin_user=choice(AdminUser.objects.all()),
created_by='Fake')
try:
asset.save()
asset.protocols = 'ssh/22'
if nodes and len(nodes) > 3:
_nodes = random.sample(nodes, 3)
else:
_nodes = [Node.default_node()]
asset.nodes.set(_nodes)
logger.debug('Generate fake asset : %s' % asset.ip)
except IntegrityError:
print('Error continue')
continue
ordering = ["hostname", "ip"]

View File

@@ -158,9 +158,11 @@ class AuthMixin:
if update_fields:
self.save(update_fields=update_fields)
def has_special_auth(self, asset=None):
def has_special_auth(self, asset=None, username=None):
from .authbook import AuthBook
queryset = AuthBook.objects.filter(username=self.username)
if username is None:
username = self.username
queryset = AuthBook.objects.filter(username=username)
if asset:
queryset = queryset.filter(asset=asset)
return queryset.exists()

View File

@@ -38,27 +38,3 @@ class Cluster(models.Model):
class Meta:
ordering = ['name']
verbose_name = _("Cluster")
@classmethod
def generate_fake(cls, count=5):
from random import seed, choice
import forgery_py
from django.db import IntegrityError
seed()
for i in range(count):
cluster = cls(name=forgery_py.name.full_name(),
bandwidth='200M',
contact=forgery_py.name.full_name(),
phone=forgery_py.address.phone(),
address=forgery_py.address.city() + forgery_py.address.street_address(),
# operator=choice(['北京联通', '北京电信', 'BGP全网通']),
operator=choice([_('Beijing unicom'), _('Beijing telecom'), _('BGP full netcom')]),
comment=forgery_py.lorem_ipsum.sentence(),
created_by='Fake')
try:
cluster.save()
logger.debug('Generate fake asset group: %s' % cluster.name)
except IntegrityError:
print('Error continue')
continue

View File

@@ -52,7 +52,7 @@ class CommandFilterRule(OrgModelMixin):
type = models.CharField(max_length=16, default=TYPE_COMMAND, choices=TYPE_CHOICES, verbose_name=_("Type"))
priority = models.IntegerField(default=50, verbose_name=_("Priority"), help_text=_("1-100, the higher will be match first"),
validators=[MinValueValidator(1), MaxValueValidator(100)])
content = models.TextField(max_length=1024, verbose_name=_("Content"), help_text=_("One line one command"))
content = models.TextField(verbose_name=_("Content"), help_text=_("One line one command"))
action = models.IntegerField(default=ACTION_DENY, choices=ACTION_CHOICES, verbose_name=_("Action"))
comment = models.CharField(max_length=64, blank=True, default='', verbose_name=_("Comment"))
date_created = models.DateTimeField(auto_now_add=True)

View File

@@ -9,6 +9,7 @@ import paramiko
from django.db import models
from django.utils.translation import ugettext_lazy as _
from common.utils.strings import no_special_chars
from orgs.mixins.models import OrgModelMixin
from .base import BaseUser
@@ -47,7 +48,7 @@ class Gateway(BaseUser):
(PROTOCOL_SSH, 'ssh'),
(PROTOCOL_RDP, 'rdp'),
)
ip = models.GenericIPAddressField(max_length=32, verbose_name=_('IP'), db_index=True)
ip = models.CharField(max_length=128, verbose_name=_('IP'), db_index=True)
port = models.IntegerField(default=22, verbose_name=_('Port'))
protocol = models.CharField(choices=PROTOCOL_CHOICES, max_length=16, default=PROTOCOL_SSH, verbose_name=_("Protocol"))
domain = models.ForeignKey(Domain, on_delete=models.CASCADE, verbose_name=_("Domain"))
@@ -64,8 +65,8 @@ class Gateway(BaseUser):
def test_connective(self, local_port=None):
if local_port is None:
local_port = self.port
if self.password and not re.match(r'\w+$', self.password):
return False, _("Password should not contain special characters")
if self.password and not no_special_chars(self.password):
return False, _("Password should not contain special characters")
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())

View File

@@ -18,3 +18,15 @@ class FavoriteAsset(CommonModelMixin):
@classmethod
def get_user_favorite_assets_id(cls, user):
return cls.objects.filter(user=user).values_list('asset', flat=True)
@classmethod
def get_user_favorite_assets(cls, user, asset_perms_id=None):
from assets.models import Asset
from perms.utils.asset.user_permission import get_user_granted_all_assets
asset_ids = get_user_granted_all_assets(
user,
via_mapping_node=False,
asset_perms_id=asset_perms_id
).values_list('id', flat=True)
query_name = cls.asset.field.related_query_name()
return Asset.org_objects.filter(**{f'{query_name}__user_id': user.id}, id__in=asset_ids).distinct()

View File

@@ -33,21 +33,3 @@ class AssetGroup(models.Model):
def initial(cls):
asset_group = cls(name=_('Default'), comment=_('Default asset group'))
asset_group.save()
@classmethod
def generate_fake(cls, count=100):
from random import seed
import forgery_py
from django.db import IntegrityError
seed()
for i in range(count):
group = cls(name=forgery_py.name.full_name(),
comment=forgery_py.lorem_ipsum.sentence(),
created_by='Fake')
try:
group.save()
logger.debug('Generate fake asset group: %s' % group.name)
except IntegrityError:
print('Error continue')
continue

View File

@@ -2,132 +2,35 @@
#
import uuid
import re
import time
from django.db import models, transaction
from django.db.models import Q
from django.db.utils import IntegrityError
from django.utils.translation import ugettext_lazy as _
from django.utils.translation import ugettext
from django.core.cache import cache
from django.db.transaction import atomic
from common.utils import get_logger, lazyproperty
from common.utils import get_logger
from common.utils.common import lazyproperty
from orgs.mixins.models import OrgModelMixin, OrgManager
from orgs.utils import get_current_org, tmp_to_org, current_org
from orgs.utils import get_current_org, tmp_to_org
from orgs.models import Organization
__all__ = ['Node']
__all__ = ['Node', 'FamilyMixin', 'compute_parent_key']
logger = get_logger(__name__)
def compute_parent_key(key):
try:
return key[:key.rindex(':')]
except ValueError:
return ''
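# Examples: compute_parent_key('1:2:3') -> '1:2', compute_parent_key('1') -> ''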
class NodeQuerySet(models.QuerySet):
def delete(self):
raise PermissionError("Bulk delete node deny")
class TreeCache:
updated_time_cache_key = 'NODE_TREE_UPDATED_AT_{}'
cache_time = 3600
assets_updated_time_cache_key = 'NODE_TREE_ASSETS_UPDATED_AT_{}'
def __init__(self, tree, org_id):
now = time.time()
self.created_time = now
self.assets_created_time = now
self.tree = tree
self.org_id = org_id
def _has_changed(self, tp="tree"):
if tp == "assets":
key = self.assets_updated_time_cache_key.format(self.org_id)
else:
key = self.updated_time_cache_key.format(self.org_id)
updated_time = cache.get(key, 0)
if updated_time > self.created_time:
return True
else:
return False
@classmethod
def set_changed(cls, tp="tree", t=None, org_id=None):
if org_id is None:
org_id = current_org.id
if tp == "assets":
key = cls.assets_updated_time_cache_key.format(org_id)
else:
key = cls.updated_time_cache_key.format(org_id)
ttl = cls.cache_time
if not t:
t = time.time()
cache.set(key, t, ttl)
def tree_has_changed(self):
return self._has_changed("tree")
def set_tree_changed(self, t=None):
logger.debug("Set tree tree changed")
self.__class__.set_changed(t=t, tp="tree")
def assets_has_changed(self):
return self._has_changed("assets")
def set_tree_assets_changed(self, t=None):
logger.debug("Set tree assets changed")
self.__class__.set_changed(t=t, tp="assets")
def get(self):
if self.tree_has_changed():
self.renew()
return self.tree
if self.assets_has_changed():
self.tree.init_assets()
return self.tree
def renew(self):
new_obj = self.__class__.new(self.org_id)
self.tree = new_obj.tree
self.created_time = new_obj.created_time
self.assets_created_time = new_obj.assets_created_time
@classmethod
def new(cls, org_id=None):
from ..utils import TreeService
logger.debug("Create node tree")
if not org_id:
org_id = current_org.id
with tmp_to_org(org_id):
tree = TreeService.new()
obj = cls(tree, org_id)
obj.tree = tree
return obj
class TreeMixin:
_org_tree_map = {}
@classmethod
def tree(cls):
org_id = current_org.org_id()
t = cls.get_local_tree_cache(org_id)
if t is None:
t = TreeCache.new()
cls._org_tree_map[org_id] = t
return t.get()
@classmethod
def get_local_tree_cache(cls, org_id=None):
t = cls._org_tree_map.get(org_id)
return t
@classmethod
def refresh_tree(cls, t=None):
TreeCache.set_changed(tp="tree", t=t, org_id=current_org.id)
@classmethod
def refresh_node_assets(cls, t=None):
TreeCache.set_changed(tp="assets", t=t, org_id=current_org.id)
raise NotImplementedError
class FamilyMixin:
@@ -138,16 +41,16 @@ class FamilyMixin:
@staticmethod
def clean_children_keys(nodes_keys):
nodes_keys = sorted(list(nodes_keys), key=lambda x: (len(x), x))
sort_key = lambda k: [int(i) for i in k.split(':')]
nodes_keys = sorted(list(nodes_keys), key=sort_key)
nodes_keys_clean = []
for key in nodes_keys[::-1]:
found = False
for k in nodes_keys:
if key.startswith(k + ':'):
found = True
break
if not found:
nodes_keys_clean.append(key)
base_key = ''
for key in nodes_keys:
if key.startswith(base_key + ':'):
continue
nodes_keys_clean.append(key)
base_key = key
return nodes_keys_clean
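# Example of the de-duplication above (keys assumed):
#   clean_children_keys(['1:2:3', '1:2', '1:5', '1:2:4']) -> ['1:2', '1:5']
#   keys are sorted by their numeric segments, then any key under an already kept key is dropped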
@classmethod
@@ -175,13 +78,16 @@ class FamilyMixin:
return re.match(children_pattern, self.key)
def get_children(self, with_self=False):
pattern = self.get_children_key_pattern(with_self=with_self)
return Node.objects.filter(key__regex=pattern)
q = Q(parent_key=self.key)
if with_self:
q |= Q(key=self.key)
return Node.objects.filter(q)
def get_all_children(self, with_self=False):
pattern = self.get_all_children_pattern(with_self=with_self)
children = Node.objects.filter(key__regex=pattern)
return children
q = Q(key__istartswith=f'{self.key}:')
if with_self:
q |= Q(key=self.key)
return Node.objects.filter(q)
@property
def children(self):
@@ -191,9 +97,11 @@ class FamilyMixin:
def all_children(self):
return self.get_all_children(with_self=False)
def create_child(self, value, _id=None):
with transaction.atomic():
def create_child(self, value=None, _id=None):
with atomic(savepoint=False):
child_key = self.get_next_child_key()
if value is None:
value = child_key
child = self.__class__.objects.create(
id=_id, key=child_key, value=value
)
@@ -255,10 +163,13 @@ class FamilyMixin:
ancestor_keys = self.get_ancestor_keys(with_self=with_self)
return self.__class__.objects.filter(key__in=ancestor_keys)
@property
def parent_key(self):
parent_key = ":".join(self.key.split(":")[:-1])
return parent_key
# @property
# def parent_key(self):
# parent_key = ":".join(self.key.split(":")[:-1])
# return parent_key
def compute_parent_key(self):
return compute_parent_key(self.key)
def is_parent(self, other):
return other.is_children(self)
@@ -294,45 +205,63 @@ class FamilyMixin:
sibling = sibling.exclude(key=self.key)
return sibling
@classmethod
def create_node_by_full_value(cls, full_value):
if not full_value:
return []
nodes_family = full_value.split('/')
nodes_family = [v for v in nodes_family if v]
org_root = cls.org_root()
if nodes_family[0] == org_root.value:
nodes_family = nodes_family[1:]
return cls.create_nodes_recurse(nodes_family, org_root)
@classmethod
def create_nodes_recurse(cls, values, parent=None):
values = [v for v in values if v]
if not values:
return None
if parent is None:
parent = cls.org_root()
value = values[0]
child, created = parent.get_or_create_child(value=value)
if len(values) == 1:
return child
return cls.create_nodes_recurse(values[1:], child)
def get_family(self):
ancestors = self.get_ancestors()
children = self.get_all_children()
return [*tuple(ancestors), self, *tuple(children)]
class FullValueMixin:
key = ''
@lazyproperty
def full_value(self):
if self.is_org_root():
return self.value
value = self.tree().get_node_full_tag(self.key)
return value
class NodeAssetsMixin:
key = ''
id = None
@lazyproperty
def assets_amount(self):
amount = self.tree().assets_amount(self.key)
return amount
def get_all_assets(self):
from .asset import Asset
if self.is_org_root():
return Asset.objects.filter(org_id=self.org_id)
pattern = '^{0}$|^{0}:'.format(self.key)
return Asset.objects.filter(nodes__key__regex=pattern).distinct()
q = Q(nodes__key__startswith=f'{self.key}:') | Q(nodes__key=self.key)
return Asset.objects.filter(q).distinct()
@classmethod
def get_node_all_assets_by_key_v2(cls, key):
# The original implementation was:
# Asset.objects.filter(Q(nodes__key__startswith=f'{node.key}:') | Q(nodes__id=node.id))
# but the startswith lookup causes the Asset index to be skipped when the tables are joined
from .asset import Asset
node_ids = cls.objects.filter(
Q(key__startswith=f'{key}:') |
Q(key=key)
).values_list('id', flat=True).distinct()
assets = Asset.objects.filter(
nodes__id__in=list(node_ids)
).distinct()
return assets
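# Roughly equivalent two-step query (SQL sketch; table names follow Django defaults and are illustrative):
#   SELECT id FROM assets_node WHERE key LIKE '1:2:%' OR key = '1:2';
#   SELECT DISTINCT a.* FROM assets_asset a JOIN assets_asset_nodes an ON an.asset_id = a.id WHERE an.node_id IN (the ids from the first query);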
def get_assets(self):
from .asset import Asset
if self.is_org_root():
assets = Asset.objects.filter(Q(nodes=self) | Q(nodes__isnull=True))
else:
assets = Asset.objects.filter(nodes=self)
assets = Asset.objects.filter(nodes=self)
return assets.distinct()
def get_valid_assets(self):
@@ -341,51 +270,54 @@ class NodeAssetsMixin:
def get_all_valid_assets(self):
return self.get_all_assets().valid()
@classmethod
def _get_nodes_all_assets(cls, nodes_keys):
"""
When there are many nodes, this regex-based approach performs terribly
:param nodes_keys:
:return:
"""
from .asset import Asset
nodes_keys = cls.clean_children_keys(nodes_keys)
nodes_children_pattern = set()
for key in nodes_keys:
children_pattern = cls.get_node_all_children_key_pattern(key)
nodes_children_pattern.add(children_pattern)
pattern = '|'.join(nodes_children_pattern)
return Asset.objects.filter(nodes__key__regex=pattern).distinct()
@classmethod
def get_nodes_all_assets_ids(cls, nodes_keys):
nodes_keys = cls.clean_children_keys(nodes_keys)
assets_ids = set()
for key in nodes_keys:
node_assets_ids = cls.tree().all_assets(key)
assets_ids.update(set(node_assets_ids))
assets_ids = cls.get_nodes_all_assets(nodes_keys).values_list('id', flat=True)
return assets_ids
@classmethod
def get_nodes_all_assets(cls, nodes_keys, extra_assets_ids=None):
from .asset import Asset
nodes_keys = cls.clean_children_keys(nodes_keys)
assets_ids = cls.get_nodes_all_assets_ids(nodes_keys)
q = Q()
node_ids = ()
for key in nodes_keys:
q |= Q(key__startswith=f'{key}:')
q |= Q(key=key)
if q:
node_ids = Node.objects.filter(q).distinct().values_list('id', flat=True)
q = Q(nodes__id__in=list(node_ids))
if extra_assets_ids:
assets_ids.update(set(extra_assets_ids))
return Asset.objects.filter(id__in=assets_ids)
q |= Q(id__in=extra_assets_ids)
if q:
return Asset.org_objects.filter(q).distinct()
else:
return Asset.objects.none()
class SomeNodesMixin:
key = ''
default_key = '1'
default_value = 'Default'
ungrouped_key = '-10'
ungrouped_value = _('ungrouped')
empty_key = '-11'
empty_value = _("empty")
favorite_key = '-12'
favorite_value = _("favorite")
@classmethod
def default_node(cls):
with tmp_to_org(Organization.default()):
defaults = {'value': cls.default_value}
try:
obj, created = cls.objects.get_or_create(
defaults=defaults, key=cls.default_key,
)
except IntegrityError as e:
logger.error("Create default node failed: {}".format(e))
cls.modify_other_org_root_node_key()
obj, created = cls.objects.get_or_create(
defaults=defaults, key=cls.default_key,
)
return obj
def is_default_node(self):
return self.key == self.default_key
@@ -420,51 +352,18 @@ class SomeNodesMixin:
@classmethod
def org_root(cls):
root = cls.objects.filter(key__regex=r'^[0-9]+$')
root = cls.objects.filter(parent_key='')\
.filter(key__regex=r'^[0-9]+$')\
.exclude(key__startswith='-')\
.order_by('key')
if root:
return root[0]
else:
return cls.create_org_root_node()
@classmethod
def ungrouped_node(cls):
with tmp_to_org(Organization.system()):
defaults = {'value': cls.ungrouped_value}
obj, created = cls.objects.get_or_create(
defaults=defaults, key=cls.ungrouped_key
)
return obj
@classmethod
def default_node(cls):
with tmp_to_org(Organization.default()):
defaults = {'value': cls.default_value}
try:
obj, created = cls.objects.get_or_create(
defaults=defaults, key=cls.default_key,
)
except IntegrityError as e:
logger.error("Create default node failed: {}".format(e))
cls.modify_other_org_root_node_key()
obj, created = cls.objects.get_or_create(
defaults=defaults, key=cls.default_key,
)
return obj
@classmethod
def favorite_node(cls):
with tmp_to_org(Organization.system()):
defaults = {'value': cls.favorite_value}
obj, created = cls.objects.get_or_create(
defaults=defaults, key=cls.favorite_key
)
return obj
@classmethod
def initial_some_nodes(cls):
cls.default_node()
cls.ungrouped_node()
cls.favorite_node()
@classmethod
def modify_other_org_root_node_key(cls):
@@ -496,12 +395,16 @@ class SomeNodesMixin:
logger.info('Modify key ( {} > {} )'.format(old_key, new_key))
class Node(OrgModelMixin, SomeNodesMixin, TreeMixin, FamilyMixin, FullValueMixin, NodeAssetsMixin):
class Node(OrgModelMixin, SomeNodesMixin, FamilyMixin, NodeAssetsMixin):
id = models.UUIDField(default=uuid.uuid4, primary_key=True)
key = models.CharField(unique=True, max_length=64, verbose_name=_("Key")) # '1:1:1:1'
value = models.CharField(max_length=128, verbose_name=_("Value"))
full_value = models.CharField(max_length=4096, verbose_name=_('Full value'), default='')
child_mark = models.IntegerField(default=0)
date_create = models.DateTimeField(auto_now_add=True)
parent_key = models.CharField(max_length=64, verbose_name=_("Parent key"),
db_index=True, default='')
assets_amount = models.IntegerField(default=0)
objects = OrgManager.from_queryset(NodeQuerySet)()
is_node = True
@@ -509,10 +412,10 @@ class Node(OrgModelMixin, SomeNodesMixin, TreeMixin, FamilyMixin, FullValueMixin
class Meta:
verbose_name = _("Node")
ordering = ['key']
ordering = ['parent_key', 'value']
def __str__(self):
return self.value
return self.full_value
# def __eq__(self, other):
# if not other:
@@ -536,18 +439,19 @@ class Node(OrgModelMixin, SomeNodesMixin, TreeMixin, FamilyMixin, FullValueMixin
def name(self):
return self.value
def computed_full_value(self):
# Do not call this property in list contexts; it queries the ancestors each time
values = self.__class__.objects.filter(
key__in=self.get_ancestor_keys()
).values_list('key', 'value')
values = [v for k, v in sorted(values, key=lambda x: len(x[0]))]
values.append(str(self.value))
return '/' + '/'.join(values)
@property
def level(self):
return len(self.key.split(':'))
@classmethod
def refresh_nodes(cls):
cls.refresh_tree()
@classmethod
def refresh_assets(cls):
cls.refresh_node_assets()
def as_tree_node(self):
from common.tree import TreeNode
name = '{} ({})'.format(self.value, self.assets_amount)
@@ -582,19 +486,26 @@ class Node(OrgModelMixin, SomeNodesMixin, TreeMixin, FamilyMixin, FullValueMixin
return
return super().delete(using=using, keep_parents=keep_parents)
@classmethod
def generate_fake(cls, count=100):
import random
org = get_current_org()
if not org or not org.is_real():
Organization.default().change_to()
nodes = list(cls.objects.all())
if count > 100:
length = 100
else:
length = count
def update_child_full_value(self):
nodes = self.get_all_children(with_self=True)
sort_key_func = lambda n: [int(i) for i in n.key.split(':')]
nodes_sorted = sorted(list(nodes), key=sort_key_func)
nodes_mapper = {n.key: n for n in nodes_sorted}
if not self.is_org_root():
# If this is the org root, parent_key is '' and the parent would be itself, so skip that case
# When updating the node itself, its own parent_key cannot be resolved from the mapper
nodes_mapper.update({self.parent_key: self.parent})
for node in nodes_sorted:
parent = nodes_mapper.get(node.parent_key)
if not parent:
if node.parent_key:
logger.error(f'Node parent not found in mapper: {node.parent_key} {node.value}')
continue
node.full_value = parent.full_value + '/' + node.value
self.__class__.objects.bulk_update(nodes, ['full_value'])
for i in range(length):
node = random.choice(nodes)
child = node.create_child('Node {}'.format(i))
print("{}. {}".format(i, child))
def save(self, *args, **kwargs):
self.full_value = self.computed_full_value()
instance = super().save(*args, **kwargs)
self.update_child_full_value()
return instance
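
# Illustrative sketch (not part of this diff; example values assumed): how key,
# parent_key and full_value relate under the scheme above, for nodes under org root '1'.
#
#   key       parent_key   value      full_value
#   '1'       ''           'Default'  '/Default'
#   '1:2'     '1'          'dev'      '/Default/dev'
#   '1:2:5'   '1:2'        'web'      '/Default/dev/web'
#
# computed_full_value() joins the ancestor values ordered by key length, and
# update_child_full_value() rebuilds the same strings for a whole subtree after a
# rename, parents first, via the parent_key -> node mapper.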


@@ -65,26 +65,6 @@ class AdminUser(BaseUser):
unique_together = [('name', 'org_id')]
verbose_name = _("Admin user")
@classmethod
def generate_fake(cls, count=10):
from random import seed
import forgery_py
from django.db import IntegrityError
seed()
for i in range(count):
obj = cls(name=forgery_py.name.full_name(),
username=forgery_py.internet.user_name(),
password=forgery_py.lorem_ipsum.word(),
comment=forgery_py.lorem_ipsum.sentence(),
created_by='Fake')
try:
obj.save()
logger.debug('Generate fake asset group: %s' % obj.name)
except IntegrityError:
print('Error continue')
continue
class SystemUser(BaseUser):
PROTOCOL_SSH = 'ssh'
@@ -92,6 +72,9 @@ class SystemUser(BaseUser):
PROTOCOL_TELNET = 'telnet'
PROTOCOL_VNC = 'vnc'
PROTOCOL_MYSQL = 'mysql'
PROTOCOL_ORACLE = 'oracle'
PROTOCOL_MARIADB = 'mariadb'
PROTOCOL_POSTGRESQL = 'postgresql'
PROTOCOL_K8S = 'k8s'
PROTOCOL_CHOICES = (
(PROTOCOL_SSH, 'ssh'),
@@ -99,6 +82,9 @@ class SystemUser(BaseUser):
(PROTOCOL_TELNET, 'telnet'),
(PROTOCOL_VNC, 'vnc'),
(PROTOCOL_MYSQL, 'mysql'),
(PROTOCOL_ORACLE, 'oracle'),
(PROTOCOL_MARIADB, 'mariadb'),
(PROTOCOL_POSTGRESQL, 'postgresql'),
(PROTOCOL_K8S, 'k8s'),
)
@@ -124,6 +110,7 @@ class SystemUser(BaseUser):
token = models.TextField(default='', verbose_name=_('Token'))
home = models.CharField(max_length=4096, default='', verbose_name=_('Home'), blank=True)
system_groups = models.CharField(default='', max_length=4096, verbose_name=_('System groups'), blank=True)
ad_domain = models.CharField(default='', max_length=256)
_prefer = 'system_user'
def __str__(self):
@@ -146,6 +133,24 @@ class SystemUser(BaseUser):
def login_mode_display(self):
return self.get_login_mode_display()
@property
def db_application_protocols(self):
return [
self.PROTOCOL_MYSQL, self.PROTOCOL_ORACLE, self.PROTOCOL_MARIADB,
self.PROTOCOL_POSTGRESQL
]
@property
def cloud_application_protocols(self):
return [self.PROTOCOL_K8S]
@property
def application_category_protocols(self):
protocols = []
protocols.extend(self.db_application_protocols)
protocols.extend(self.cloud_application_protocols)
return protocols
def is_need_push(self):
if self.auto_push and self.protocol in [self.PROTOCOL_SSH, self.PROTOCOL_RDP]:
return True
@@ -158,11 +163,16 @@ class SystemUser(BaseUser):
@property
def is_need_test_asset_connective(self):
return self.protocol not in [self.PROTOCOL_MYSQL]
return self.protocol not in self.application_category_protocols
def has_special_auth(self, asset=None, username=None):
if username is None and self.username_same_with_user:
raise TypeError('System user is dynamic, username should be pass')
return super().has_special_auth(asset=asset, username=username)
@property
def can_perm_to_asset(self):
return self.protocol not in [self.PROTOCOL_MYSQL]
return self.protocol not in self.application_category_protocols
def _merge_auth(self, other):
super()._merge_auth(other)
@@ -199,23 +209,3 @@ class SystemUser(BaseUser):
ordering = ['name']
unique_together = [('name', 'org_id')]
verbose_name = _("System user")
@classmethod
def generate_fake(cls, count=10):
from random import seed
import forgery_py
from django.db import IntegrityError
seed()
for i in range(count):
obj = cls(name=forgery_py.name.full_name(),
username=forgery_py.internet.user_name(),
password=forgery_py.lorem_ipsum.word(),
comment=forgery_py.lorem_ipsum.sentence(),
created_by='Fake')
try:
obj.save()
logger.debug('Generate fake asset group: %s' % obj.name)
except IntegrityError:
print('Error continue')
continue

apps/assets/pagination.py (new file, 39 lines)

@@ -0,0 +1,39 @@
from rest_framework.pagination import LimitOffsetPagination
from rest_framework.request import Request
from assets.models import Node
class AssetLimitOffsetPagination(LimitOffsetPagination):
"""
Must be used together with `assets.api.mixin.FilterAssetByNodeMixin`
"""
def get_count(self, queryset):
"""
1. If querying all assets under a node, use Node.assets_amount as the count
2. If other filter conditions are present, fall back to super()
3. If querying only the assets directly under the node, fall back to super()
"""
exclude_query_params = {
self.limit_query_param,
self.offset_query_param,
'node', 'all', 'show_current_asset',
'node_id', 'display', 'draw', 'fields_size',
}
for k, v in self._request.query_params.items():
if k not in exclude_query_params and v is not None:
return super().get_count(queryset)
is_query_all = self._view.is_query_node_all_assets
if is_query_all:
node = self._view.node
if not node:
node = Node.org_root()
return node.assets_amount
return super().get_count(queryset)
def paginate_queryset(self, queryset, request: Request, view=None):
self._request = request
self._view = view
return super().paginate_queryset(queryset, request, view=None)
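
# Illustrative request (URL and parameter values assumed, not taken from this diff):
# with a view that mixes in assets.api.mixin.FilterAssetByNodeMixin and uses this class,
#   GET /api/v1/assets/assets/?node_id=<id>&all=true&limit=20&offset=0
# carries no extra filters, so get_count() returns node.assets_amount directly instead
# of issuing COUNT(*) over the subtree queryset. Any additional query parameter
# (e.g. hostname=web01) makes it fall back to super().get_count().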


@@ -1,13 +1,12 @@
# -*- coding: utf-8 -*-
#
from rest_framework import serializers
from django.db.models import Prefetch, F, Count
from django.db.models import F
from django.utils.translation import ugettext_lazy as _
from orgs.mixins.serializers import BulkOrgResourceModelSerializer
from common.serializers import AdaptedBulkListSerializer
from ..models import Asset, Node, Label, Platform
from ..models import Asset, Node, Platform
from .base import ConnectivitySerializer
__all__ = [
@@ -67,15 +66,15 @@ class AssetSerializer(BulkOrgResourceModelSerializer):
slug_field='name', queryset=Platform.objects.all(), label=_("Platform")
)
protocols = ProtocolsField(label=_('Protocols'), required=False)
domain_display = serializers.ReadOnlyField(source='domain.name')
admin_user_display = serializers.ReadOnlyField(source='admin_user.name')
domain_display = serializers.ReadOnlyField(source='domain.name', label=_('Domain name'))
admin_user_display = serializers.ReadOnlyField(source='admin_user.name', label=_('Admin user name'))
nodes_display = serializers.ListField(child=serializers.CharField(), label=_('Nodes name'), required=False)
"""
Asset data structure
"""
class Meta:
model = Asset
list_serializer_class = AdaptedBulkListSerializer
fields_mini = ['id', 'hostname', 'ip']
fields_small = fields_mini + [
'protocol', 'port', 'protocols', 'is_active', 'public_ip',
@@ -91,7 +90,7 @@ class AssetSerializer(BulkOrgResourceModelSerializer):
'platform': ['name']
}
fields_m2m = [
'nodes', 'labels',
'nodes', 'nodes_display', 'labels',
]
annotates_fields = {
# 'admin_user_display': 'admin_user__name'
@@ -99,9 +98,6 @@ class AssetSerializer(BulkOrgResourceModelSerializer):
fields_as = list(annotates_fields.keys())
fields = fields_small + fields_fk + fields_m2m + fields_as
read_only_fields = [
'vendor', 'model', 'sn', 'cpu_model', 'cpu_count',
'cpu_cores', 'cpu_vcpus', 'memory', 'disk_total', 'disk_info',
'os', 'os_version', 'os_arch', 'hostname_raw',
'created_by', 'date_created',
] + fields_as
@@ -116,6 +112,7 @@ class AssetSerializer(BulkOrgResourceModelSerializer):
def setup_eager_loading(cls, queryset):
""" Perform necessary eager loading of data. """
queryset = queryset.select_related('admin_user', 'domain', 'platform')
queryset = queryset.prefetch_related('nodes', 'labels')
return queryset
def compatible_with_old_protocol(self, validated_data):
@@ -133,14 +130,32 @@ class AssetSerializer(BulkOrgResourceModelSerializer):
if protocols_data:
validated_data["protocols"] = ' '.join(protocols_data)
def perform_nodes_display_create(self, instance, nodes_display):
if not nodes_display:
return
nodes_to_set = []
for full_value in nodes_display:
node = Node.objects.filter(full_value=full_value).first()
if node:
nodes_to_set.append(node)
else:
node = Node.create_node_by_full_value(full_value)
nodes_to_set.append(node)
instance.nodes.set(nodes_to_set)
def create(self, validated_data):
self.compatible_with_old_protocol(validated_data)
nodes_display = validated_data.pop('nodes_display', '')
instance = super().create(validated_data)
self.perform_nodes_display_create(instance, nodes_display)
return instance
def update(self, instance, validated_data):
nodes_display = validated_data.pop('nodes_display', '')
self.compatible_with_old_protocol(validated_data)
return super().update(instance, validated_data)
instance = super().update(instance, validated_data)
self.perform_nodes_display_create(instance, nodes_display)
return instance
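
# Illustrative payload (field values hypothetical): nodes_display lets a client
# reference nodes by full_value path; missing paths are created on the fly via
# Node.create_node_by_full_value().
#
#   {"hostname": "web01", "ip": "10.0.0.7",
#    "nodes_display": ["/Default/dev/web", "/Default/prod"]}
#
# perform_nodes_display_create() resolves each path to a Node and then calls
# instance.nodes.set(nodes_to_set), replacing the asset's previous node relations.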
class AssetDisplaySerializer(AssetSerializer):
@@ -153,7 +168,7 @@ class AssetDisplaySerializer(AssetSerializer):
@classmethod
def setup_eager_loading(cls, queryset):
""" Perform necessary eager loading of data. """
queryset = super().setup_eager_loading(queryset)
queryset = queryset\
.annotate(admin_user_username=F('admin_user__username'))
return queryset


@@ -1,17 +1,19 @@
# -*- coding: utf-8 -*-
#
from rest_framework import serializers
from django.utils.translation import ugettext_lazy as _
from common.serializers import AdaptedBulkListSerializer
from orgs.mixins.serializers import BulkOrgResourceModelSerializer
from common.validators import NoSpecialChars
from ..models import Domain, Gateway
from .base import AuthSerializerMixin
class DomainSerializer(BulkOrgResourceModelSerializer):
asset_count = serializers.SerializerMethodField()
gateway_count = serializers.SerializerMethodField()
asset_count = serializers.SerializerMethodField(label=_('Assets count'))
application_count = serializers.SerializerMethodField(label=_('Applications count'))
gateway_count = serializers.SerializerMethodField(label=_('Gateways count'))
class Meta:
model = Domain
@@ -20,12 +22,12 @@ class DomainSerializer(BulkOrgResourceModelSerializer):
'comment', 'date_created'
]
fields_m2m = [
'asset_count', 'assets', 'gateway_count',
'asset_count', 'assets', 'application_count', 'gateway_count',
]
fields = fields_small + fields_m2m
read_only_fields = ('asset_count', 'gateway_count', 'date_created')
extra_kwargs = {
'assets': {'required': False}
'assets': {'required': False, 'label': _('Assets')},
}
list_serializer_class = AdaptedBulkListSerializer
@@ -33,6 +35,10 @@ class DomainSerializer(BulkOrgResourceModelSerializer):
def get_asset_count(obj):
return obj.assets.count()
@staticmethod
def get_application_count(obj):
return obj.applications.count()
@staticmethod
def get_gateway_count(obj):
return obj.gateway_set.all().count()
@@ -47,6 +53,9 @@ class GatewaySerializer(AuthSerializerMixin, BulkOrgResourceModelSerializer):
'private_key', 'public_key', 'domain', 'is_active', 'date_created',
'date_updated', 'created_by', 'comment',
]
extra_kwargs = {
'password': {'validators': [NoSpecialChars()]}
}
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)


@@ -25,6 +25,9 @@ class NodeSerializer(BulkOrgResourceModelSerializer):
read_only_fields = ['key', 'org_id']
def validate_value(self, data):
if '/' in data:
error = _("Can't contains: " + "/")
raise serializers.ValidationError(error)
if self.instance:
instance = self.instance
siblings = instance.get_siblings()


@@ -6,7 +6,6 @@ from common.serializers import AdaptedBulkListSerializer
from common.mixins.serializers import BulkSerializerMixin
from common.utils import ssh_pubkey_gen
from orgs.mixins.serializers import BulkOrgResourceModelSerializer
from assets.models import Node
from ..models import SystemUser, Asset
from .base import AuthSerializerMixin
@@ -35,17 +34,18 @@ class SystemUserSerializer(AuthSerializerMixin, BulkOrgResourceModelSerializer):
'auto_push', 'cmd_filters', 'sudo', 'shell', 'comment',
'auto_generate_key', 'sftp_root', 'token',
'assets_amount', 'date_created', 'created_by',
'home', 'system_groups'
'home', 'system_groups', 'ad_domain'
]
extra_kwargs = {
'password': {"write_only": True},
'public_key': {"write_only": True},
'private_key': {"write_only": True},
'token': {"write_only": True},
'nodes_amount': {'label': _('Node')},
'assets_amount': {'label': _('Asset')},
'nodes_amount': {'label': _('Nodes amount')},
'assets_amount': {'label': _('Assets amount')},
'login_mode_display': {'label': _('Login mode display')},
'created_by': {'read_only': True},
'ad_domain': {'required': False, 'allow_blank': True, 'label': _('Ad domain')},
}
def validate_auto_push(self, value):
@@ -149,13 +149,24 @@ class SystemUserListSerializer(SystemUserSerializer):
class Meta(SystemUserSerializer.Meta):
fields = [
'id', 'name', 'username', 'protocol',
'password', 'public_key', 'private_key',
'login_mode', 'login_mode_display',
'priority', "username_same_with_user",
'auto_push', 'sudo', 'shell', 'comment',
"assets_amount", 'home', 'system_groups',
'auto_generate_key',
'auto_generate_key', 'ad_domain',
'sftp_root',
]
extra_kwargs = {
'password': {"write_only": True},
'public_key': {"write_only": True},
'private_key': {"write_only": True},
'nodes_amount': {'label': _('Nodes amount')},
'assets_amount': {'label': _('Assets amount')},
'login_mode_display': {'label': _('Login mode display')},
'created_by': {'read_only': True},
'ad_domain': {'label': _('Ad domain')},
}
@classmethod
def setup_eager_loading(cls, queryset):
@@ -172,7 +183,8 @@ class SystemUserWithAuthInfoSerializer(SystemUserSerializer):
'login_mode', 'login_mode_display',
'priority', 'username_same_with_user',
'auto_push', 'sudo', 'shell', 'comment',
'auto_generate_key', 'sftp_root', 'token'
'auto_generate_key', 'sftp_root', 'token',
'ad_domain',
]
extra_kwargs = {
'nodes_amount': {'label': _('Node')},
@@ -222,15 +234,8 @@ class SystemUserNodeRelationSerializer(RelationMixin, serializers.ModelSerialize
'id', 'node', "node_display",
]
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self.tree = Node.tree()
def get_node_display(self, obj):
if hasattr(obj, 'node_key'):
return self.tree.get_node_full_tag(obj.node_key)
else:
return obj.node.full_value
return obj.node.full_value
class SystemUserUserRelationSerializer(RelationMixin, serializers.ModelSerializer):
@@ -252,4 +257,8 @@ class SystemUserTaskSerializer(serializers.Serializer):
asset = serializers.PrimaryKeyRelatedField(
queryset=Asset.objects, allow_null=True, required=False, write_only=True
)
assets = serializers.PrimaryKeyRelatedField(
queryset=Asset.objects, allow_null=True, required=False, write_only=True,
many=True
)
task = serializers.CharField(read_only=True)


@@ -1,17 +1,20 @@
# -*- coding: utf-8 -*-
#
from collections import defaultdict
from operator import add, sub
from assets.utils import is_asset_exists_in_node
from django.db.models.signals import (
post_save, m2m_changed, post_delete
post_save, m2m_changed, pre_delete, post_delete, pre_save
)
from django.db.models.aggregates import Count
from django.db.models import Q, F
from django.dispatch import receiver
from common.exceptions import M2MReverseNotAllowed
from common.const.signals import PRE_ADD, POST_ADD, POST_REMOVE, PRE_CLEAR, PRE_REMOVE
from common.utils import get_logger
from common.decorator import on_transaction_commit
from orgs.utils import tmp_to_root_org
from .models import Asset, SystemUser, Node, AuthBook
from .utils import TreeService
from .models import Asset, SystemUser, Node, compute_parent_key
from users.models import User
from .tasks import (
update_assets_hardware_info_util,
test_asset_connectivity_util,
@@ -34,6 +37,11 @@ def test_asset_conn_on_created(asset):
test_asset_connectivity_util.delay([asset])
@receiver(pre_save, sender=Node)
def on_node_pre_save(sender, instance: Node, **kwargs):
instance.parent_key = instance.compute_parent_key()
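
# Sketch of the key scheme this relies on (key values illustrative): a node key is a
# colon-separated path and the parent key is everything before the last segment.
#
#   compute_parent_key('1:2:3')  ->  '1:2'
#   compute_parent_key('1')      ->  ''     # org root nodes keep an empty parent_key
#
# Computing parent_key in pre_save keeps it consistent with key on every save path.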
@receiver(post_save, sender=Asset)
@on_transaction_commit
def on_asset_created_or_update(sender, instance=None, created=False, **kwargs):
@@ -54,17 +62,9 @@ def on_asset_created_or_update(sender, instance=None, created=False, **kwargs):
instance.nodes.add(Node.org_root())
@receiver(post_delete, sender=Asset)
def on_asset_delete(sender, instance=None, **kwargs):
"""
When an asset is deleted, refresh the nodes; the nodes keep the node-asset relations
"""
logger.debug("Asset delete signal recv: {}".format(instance))
Node.refresh_assets()
@receiver(post_save, sender=SystemUser, dispatch_uid="jms")
def on_system_user_update(sender, instance=None, created=True, **kwargs):
@on_transaction_commit
def on_system_user_update(instance: SystemUser, created, **kwargs):
"""
When a system user is updated, its key, username, etc. may have changed, so it should be
pushed to its assets automatically. Strictly it should only be pushed when the username,
password, key, sudo, etc. actually change; this takes the lazy route.
@@ -74,53 +74,57 @@ def on_system_user_update(sender, instance=None, created=True, **kwargs):
if instance and not created:
logger.info("System user update signal recv: {}".format(instance))
assets = instance.assets.all().valid()
push_system_user_to_assets.delay(instance, assets)
push_system_user_to_assets.delay(instance.id, [_asset.id for _asset in assets])
@receiver(m2m_changed, sender=SystemUser.assets.through)
def on_system_user_assets_change(sender, instance=None, action='', model=None, pk_set=None, **kwargs):
@on_transaction_commit
def on_system_user_assets_change(instance, action, model, pk_set, **kwargs):
"""
When the relations between a system user and assets change, re-push the system user to the newly added assets
"""
if action != "post_add":
if action != POST_ADD:
return
logger.debug("System user assets change signal recv: {}".format(instance))
queryset = model.objects.filter(pk__in=pk_set)
if model == Asset:
system_users = [instance]
assets = queryset
system_users_id = [instance.id]
assets_id = pk_set
else:
system_users = queryset
assets = [instance]
for system_user in system_users:
push_system_user_to_assets.delay(system_user, assets)
system_users_id = pk_set
assets_id = [instance.id]
for system_user_id in system_users_id:
push_system_user_to_assets.delay(system_user_id, assets_id)
@receiver(m2m_changed, sender=SystemUser.users.through)
def on_system_user_users_change(sender, instance=None, action='', model=None, pk_set=None, **kwargs):
@on_transaction_commit
def on_system_user_users_change(sender, instance: SystemUser, action, model, pk_set, reverse, **kwargs):
"""
When the relations between a system user and users change, re-push the system user to its assets
"""
if action != "post_add":
if action != POST_ADD:
return
if reverse:
raise M2MReverseNotAllowed
if not instance.username_same_with_user:
return
logger.debug("System user users change signal recv: {}".format(instance))
queryset = model.objects.filter(pk__in=pk_set)
if model == SystemUser:
system_users = queryset
else:
system_users = [instance]
for s in system_users:
push_system_user_to_assets_manual.delay(s)
usernames = model.objects.filter(pk__in=pk_set).values_list('username', flat=True)
for username in usernames:
push_system_user_to_assets_manual.delay(instance, username)
@receiver(m2m_changed, sender=SystemUser.nodes.through)
@on_transaction_commit
def on_system_user_nodes_change(sender, instance=None, action=None, model=None, pk_set=None, **kwargs):
"""
When the relations between a system user and nodes change, relate the assets under those nodes to the system user
"""
if action != "post_add":
if action != POST_ADD:
return
logger.info("System user nodes update signal recv: {}".format(instance))
@@ -135,104 +139,217 @@ def on_system_user_nodes_change(sender, instance=None, action=None, model=None,
@receiver(m2m_changed, sender=SystemUser.groups.through)
def on_system_user_groups_change(sender, instance=None, action=None, model=None,
pk_set=None, reverse=False, **kwargs):
def on_system_user_groups_change(instance, action, pk_set, reverse, **kwargs):
"""
When the relations between a system user and user groups change, relate the users in those groups to the system user
"""
if action != "post_add" or reverse:
if action != POST_ADD:
return
if reverse:
raise M2MReverseNotAllowed
logger.info("System user groups update signal recv: {}".format(instance))
groups = model.objects.filter(pk__in=pk_set).annotate(users_count=Count("users"))
users = groups.filter(users_count__gt=0).values_list('users', flat=True)
instance.users.add(*tuple(users))
users = User.objects.filter(groups__id__in=pk_set).distinct()
instance.users.add(*users)
@receiver(m2m_changed, sender=Asset.nodes.through)
def on_asset_nodes_change(sender, instance=None, action='', **kwargs):
def on_asset_nodes_add(instance, action, reverse, pk_set, **kwargs):
"""
When an asset's nodes change, refresh the nodes
"""
if action.startswith('post'):
logger.debug("Asset nodes change signal recv: {}".format(instance))
Node.refresh_assets()
with tmp_to_root_org():
Node.refresh_assets()
This operation hits the database 4 times in total
@receiver(m2m_changed, sender=Asset.nodes.through)
def on_asset_nodes_add(sender, instance=None, action='', model=None, pk_set=None, **kwargs):
"""
When an asset's nodes change, or a node's asset relations change,
add the assets newly placed under the node to the system users related to that node
"""
if action != "post_add":
if action != POST_ADD:
return
logger.debug("Assets node add signal recv: {}".format(action))
if model == Node:
nodes = model.objects.filter(pk__in=pk_set).values_list('key', flat=True)
assets = [instance.id]
else:
if reverse:
nodes = [instance.key]
assets = model.objects.filter(pk__in=pk_set).values_list('id', flat=True)
asset_ids = pk_set
else:
nodes = Node.objects.filter(pk__in=pk_set).values_list('key', flat=True)
asset_ids = [instance.id]
# When a node's assets change, relate the assets to the system users bound to the node and its ancestors; only additions are handled here
nodes_ancestors_keys = set()
node_tree = TreeService.new()
for node in nodes:
ancestors_keys = node_tree.ancestors_ids(nid=node)
nodes_ancestors_keys.update(ancestors_keys)
system_users = SystemUser.objects.filter(nodes__key__in=nodes_ancestors_keys)
nodes_ancestors_keys.update(Node.get_node_ancestor_keys(node, with_self=True))
system_users_assets = defaultdict(set)
for system_user in system_users:
assets_has_set = system_user.assets.all().filter(id__in=assets).values_list('id', flat=True)
assets_remain = set(assets) - set(assets_has_set)
system_users_assets[system_user].update(assets_remain)
for system_user, _assets in system_users_assets.items():
system_user.assets.add(*tuple(_assets))
# Query the system users related to any ancestor node; all of them must be related to the assets
system_user_ids = SystemUser.objects.filter(
nodes__key__in=nodes_ancestors_keys
).distinct().values_list('id', flat=True)
# Query all relations that already exist
m2m_model = SystemUser.assets.through
exist = set(m2m_model.objects.filter(
systemuser_id__in=system_user_ids, asset_id__in=asset_ids
).values_list('systemuser_id', 'asset_id'))
# TODO: optimize
to_create = []
for system_user_id in system_user_ids:
asset_ids_to_push = []
for asset_id in asset_ids:
if (system_user_id, asset_id) in exist:
continue
asset_ids_to_push.append(asset_id)
to_create.append(m2m_model(
systemuser_id=system_user_id,
asset_id=asset_id
))
push_system_user_to_assets.delay(system_user_id, asset_ids_to_push)
m2m_model.objects.bulk_create(to_create)
def _update_node_assets_amount(node: Node, asset_pk_set: set, operator=add):
"""
When the relations between one node and multiple assets change, update the counters
:param node: the node instance
:param asset_pk_set: set of asset ids; not modified inside this function
:param operator: the operation to apply (add or sub)

* -> Node
# -> Asset

        * [3]
       / \
      *   * [2]
     /     \
    *       * [1]
   /       / \
  * [a]   #   # [b]
"""
# Get the keys of node [1]'s ancestors including itself, i.e. the keys of nodes [1, 2, 3]
ancestor_keys = node.get_ancestor_keys(with_self=True)
ancestors = Node.objects.filter(key__in=ancestor_keys).order_by('-key')
to_update = []
for ancestor in ancestors:
# Iterate the ancestors' keys in the order [1] -> [2] -> [3]
# Check whether this node or its descendants already contain the assets being processed;
# drop those from the working set: they are duplicates and do not change the node's
# asset count whether adding or removing
asset_pk_set -= set(Asset.objects.filter(
id__in=asset_pk_set
).filter(
Q(nodes__key__istartswith=f'{ancestor.key}:') |
Q(nodes__key=ancestor.key)
).distinct().values_list('id', flat=True))
if not asset_pk_set:
# The working set is empty, so the remaining assets were all duplicates and this
# node's asset count does not change; and since this node already contains them,
# its ancestors certainly do too, so they need no handling either
break
ancestor.assets_amount = operator(F('assets_amount'), len(asset_pk_set))
to_update.append(ancestor)
Node.objects.bulk_update(to_update, fields=('assets_amount', 'parent_key'))
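
# Worked example for the tree in the docstring above (asset ids hypothetical): suppose
# assets {a1, a2} are being linked to node [1] with operator=add, and a2 is already
# linked to some other node under [2]:
#   - ancestor [1] (deepest first): neither asset is already in [1]'s subtree,
#     so assets_amount = F('assets_amount') + 2
#   - ancestor [2]: a2 is already counted under [2], so it leaves the working set
#     and [2] only gets + 1
#   - ancestor [3]: same working set {a1}, so + 1; if the set ever becomes empty the
#     loop breaks, because the remaining ancestors cannot be affected
# The accumulated F() expressions are then written back in one bulk_update().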
def _remove_ancestor_keys(ancestor_key, tree_set):
# `ancestor_key` must not be empty here, to avoid an infinite loop caused by bad data
# Membership in the set tells whether the key has already been handled
while ancestor_key and ancestor_key in tree_set:
tree_set.remove(ancestor_key)
ancestor_key = compute_parent_key(ancestor_key)
def _update_nodes_asset_amount(node_keys, asset_pk, operator):
"""
When the relations between one asset and multiple nodes change, update the counters
:param node_keys: set of node keys
:param asset_pk: the asset id
:param operator: the operation to apply (add or sub)
"""
# The ancestors of all related nodes form a partial tree
ancestor_keys = set()
for key in node_keys:
ancestor_keys.update(Node.get_node_ancestor_keys(key))
# A related node may itself be an ancestor of another related node; if so, drop it from the related set
node_keys -= ancestor_keys
to_update_keys = []
for key in node_keys:
# Walk the related nodes, handling each one and its ancestors
# Check whether this node still contains the asset being processed
exists = is_asset_exists_in_node(asset_pk, key)
parent_key = compute_parent_key(key)
if exists:
# If the asset is still under this node, neither it nor its ancestors need handling
_remove_ancestor_keys(parent_key, ancestor_keys)
continue
else:
# Not present, so this node's count must be updated
to_update_keys.append(key)
# `parent_key` must not be empty here, to avoid an infinite loop caused by bad data
# Membership in the set tells whether the key has already been handled
while parent_key and parent_key in ancestor_keys:
exists = is_asset_exists_in_node(asset_pk, parent_key)
if exists:
_remove_ancestor_keys(parent_key, ancestor_keys)
break
else:
to_update_keys.append(parent_key)
ancestor_keys.remove(parent_key)
parent_key = compute_parent_key(parent_key)
Node.objects.filter(key__in=to_update_keys).update(
assets_amount=operator(F('assets_amount'), 1)
)
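
# Worked example (keys hypothetical): one asset is removed from nodes '1:2' and '1:2:5'
# in a single post_remove, operator=sub.
#   - '1:2' is an ancestor of '1:2:5', so it is moved out of node_keys and handled only
#     while walking up from '1:2:5'
#   - for '1:2:5': the asset is gone from that subtree, so the key is queued for -1,
#     then its parents are walked upward until a node that still contains the asset
#     is found (or the partial tree is exhausted)
# The queued keys are decremented in one UPDATE via F('assets_amount') - 1.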
@receiver(m2m_changed, sender=Asset.nodes.through)
def on_asset_nodes_remove(sender, instance=None, action='', model=None,
pk_set=None, **kwargs):
def update_nodes_assets_amount(action, instance, reverse, pk_set, **kwargs):
# `pre_clear` is not allowed, because that signal carries no `pk_set`
# See https://docs.djangoproject.com/en/3.1/ref/signals/#m2m-changed
refused = (PRE_CLEAR,)
if action in refused:
raise ValueError
"""
Watch assets being removed from nodes, or nodes removing assets, to avoid orphaned assets
"""
if action not in ["post_remove", "pre_clear", "post_clear"]:
mapper = {
PRE_ADD: add,
POST_REMOVE: sub
}
if action not in mapper:
return
if action == "pre_clear":
if model == Node:
instance._nodes = list(instance.nodes.all())
else:
instance._assets = list(instance.assets.all())
return
logger.debug("Assets node remove signal recv: {}".format(action))
if action == "post_remove":
queryset = model.objects.filter(pk__in=pk_set)
operator = mapper[action]
if reverse:
node: Node = instance
asset_pk_set = set(pk_set)
_update_node_assets_amount(node, asset_pk_set, operator)
else:
if model == Node:
queryset = instance._nodes
else:
queryset = instance._assets
if model == Node:
assets = [instance]
else:
assets = queryset
if isinstance(assets, list):
assets_not_has_node = []
for asset in assets:
if asset.nodes.all().count() == 0:
assets_not_has_node.append(asset.id)
else:
assets_not_has_node = assets.annotate(nodes_count=Count('nodes'))\
.filter(nodes_count=0).values_list('id', flat=True)
Node.org_root().assets.add(*tuple(assets_not_has_node))
asset_pk = instance.id
# Nodes directly related to the asset
node_keys = set(Node.objects.filter(id__in=pk_set).values_list('key', flat=True))
_update_nodes_asset_amount(node_keys, asset_pk, operator)
@receiver([post_save, post_delete], sender=Node)
def on_node_update_or_created(sender, **kwargs):
# Refresh the nodes
Node.refresh_nodes()
with tmp_to_root_org():
Node.refresh_nodes()
RELATED_NODE_IDS = '_related_node_ids'
@receiver(pre_delete, sender=Asset)
def on_asset_delete(instance: Asset, using, **kwargs):
node_ids = set(Node.objects.filter(
assets=instance
).distinct().values_list('id', flat=True))
setattr(instance, RELATED_NODE_IDS, node_ids)
m2m_changed.send(
sender=Asset.nodes.through, instance=instance, reverse=False,
model=Node, pk_set=node_ids, using=using, action=PRE_REMOVE
)
@receiver(post_delete, sender=Asset)
def on_asset_post_delete(instance: Asset, using, **kwargs):
node_ids = getattr(instance, RELATED_NODE_IDS, None)
if node_ids:
m2m_changed.send(
sender=Asset.nodes.through, instance=instance, reverse=False,
model=Node, pk_set=node_ids, using=using, action=POST_REMOVE
)


@@ -9,3 +9,4 @@ from .gather_asset_users import *
from .gather_asset_hardware_info import *
from .push_system_user import *
from .system_user_connectivity import *
from .nodes_amount import *


@@ -3,10 +3,13 @@
from celery import shared_task
from orgs.utils import tmp_to_root_org
__all__ = ['add_nodes_assets_to_system_users']
@shared_task
@tmp_to_root_org()
def add_nodes_assets_to_system_users(nodes_keys, system_users):
from ..models import Node
assets = Node.get_nodes_all_assets(nodes_keys).values_list('id', flat=True)


@@ -129,5 +129,5 @@ def update_assets_hardware_info_period():
def update_node_assets_hardware_info_manual(node):
task_name = _("Update node asset hardware information: {}").format(node.name)
assets = node.get_all_assets()
result = update_assets_hardware_info_util.delay(assets, task_name=task_name)
result = update_assets_hardware_info_util(assets, task_name=task_name)
return result


@@ -92,7 +92,7 @@ def add_asset_users(assets, results):
for username, data in users.items():
defaults = {'asset': asset, 'username': username, 'present': True}
if data.get("ip"):
defaults["ip_last_login"] = data["ip"]
defaults["ip_last_login"] = data["ip"][:32]
if data.get("date"):
defaults["date_last_login"] = data["date"]
GatheredUser.objects.update_or_create(


@@ -0,0 +1,27 @@
from celery import shared_task
from django.utils.translation import gettext_lazy as _
from orgs.models import Organization
from orgs.utils import tmp_to_org
from ops.celery.decorator import register_as_period_task
from assets.utils import check_node_assets_amount
from common.utils.lock import AcquireFailed
from common.utils import get_logger
logger = get_logger(__file__)
@shared_task(queue='celery_heavy_tasks')
def check_node_assets_amount_task(org_id=Organization.ROOT_ID):
try:
with tmp_to_org(Organization.get_instance(org_id)):
check_node_assets_amount()
except AcquireFailed:
logger.error(_('The task of self-checking is already running and cannot be started repeatedly'))
@register_as_period_task(crontab='0 2 * * *')
@shared_task(queue='celery_heavy_tasks')
def check_node_assets_amount_period_task():
check_node_assets_amount_task()
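
# Illustrative usage (org id value hypothetical): the check can also be kicked off for a
# single organization, instead of waiting for the 02:00 daily run registered above.
#
#   check_node_assets_amount_task.delay('00000000-0000-0000-0000-000000000002')
#
# The DistributedLock around check_node_assets_amount() makes a concurrent run fail fast
# with AcquireFailed instead of double-counting.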


@@ -2,11 +2,13 @@
from itertools import groupby
from celery import shared_task
from common.db.utils import get_object_if_need, get_objects
from django.utils.translation import ugettext as _
from django.db.models import Empty
from common.utils import encrypt_password, get_logger
from orgs.utils import org_aware_func
from assets.models import SystemUser, Asset, AuthBook
from orgs.utils import org_aware_func, tmp_to_root_org
from . import const
from .utils import clean_ansible_task_hosts, group_asset_by_platform
@@ -34,6 +36,7 @@ def get_push_unixlike_system_user_tasks(system_user, username=None):
username = system_user.username
password = system_user.password
public_key = system_user.public_key
comment = system_user.name
groups = _split_by_comma(system_user.system_groups)
@@ -45,7 +48,8 @@ def get_push_unixlike_system_user_tasks(system_user, username=None):
'shell': system_user.shell or Empty,
'state': 'present',
'home': system_user.home or Empty,
'groups': groups or Empty
'groups': groups or Empty,
'comment': comment
}
tasks = [
@@ -62,24 +66,27 @@ def get_push_unixlike_system_user_tasks(system_user, username=None):
'module': 'group',
'args': 'name={} state=present'.format(username),
}
},
{
'name': 'Check home dir exists',
'action': {
'module': 'stat',
'args': 'path=/home/{}'.format(username)
},
'register': 'home_existed'
},
{
'name': "Set home dir permission",
'action': {
'module': 'file',
'args': "path=/home/{0} owner={0} group={0} mode=700".format(username)
},
'when': 'home_existed.stat.exists == true'
}
]
if not system_user.home:
tasks.extend([
{
'name': 'Check home dir exists',
'action': {
'module': 'stat',
'args': 'path=/home/{}'.format(username)
},
'register': 'home_existed'
},
{
'name': "Set home dir permission",
'action': {
'module': 'file',
'args': "path=/home/{0} owner={0} group={0} mode=700".format(username)
},
'when': 'home_existed.stat.exists == true'
}
])
if password:
tasks.append({
'name': 'Set {} password'.format(username),
@@ -132,6 +139,7 @@ def get_push_windows_system_user_tasks(system_user, username=None):
tasks = []
if not password:
logger.error("Error: no password found")
return tasks
task = {
'name': 'Add user {}'.format(username),
@@ -182,15 +190,12 @@ def get_push_system_user_tasks(system_user, platform="unixlike", username=None):
@org_aware_func("system_user")
def push_system_user_util(system_user, assets, task_name, username=None):
from ops.utils import update_or_create_ansible_task
hosts = clean_ansible_task_hosts(assets, system_user=system_user)
if not hosts:
assets = clean_ansible_task_hosts(assets, system_user=system_user)
if not assets:
return {}
platform_hosts_map = {}
hosts_sorted = sorted(hosts, key=group_asset_by_platform)
platform_hosts = groupby(hosts_sorted, key=group_asset_by_platform)
for i in platform_hosts:
platform_hosts_map[i[0]] = list(i[1])
assets_sorted = sorted(assets, key=group_asset_by_platform)
platform_hosts = groupby(assets_sorted, key=group_asset_by_platform)
def run_task(_tasks, _hosts):
if not _tasks:
@@ -201,33 +206,71 @@ def push_system_user_util(system_user, assets, task_name, username=None):
)
task.run()
for platform, _hosts in platform_hosts_map.items():
if not _hosts:
if system_user.username_same_with_user:
if username is None:
# Dynamic system user, but no username was specified
usernames = list(system_user.users.all().values_list('username', flat=True).distinct())
else:
usernames = [username]
else:
# Specifying username for a non-dynamic system user has no effect
assert username is None, 'Only Dynamic user can assign `username`'
usernames = [system_user.username]
for platform, _assets in platform_hosts:
_assets = list(_assets)
if not _assets:
continue
print(_("Start push system user for platform: [{}]").format(platform))
print(_("Hosts count: {}").format(len(_hosts)))
print(_("Hosts count: {}").format(len(_assets)))
if not system_user.has_special_auth():
logger.debug("System user not has special auth")
tasks = get_push_system_user_tasks(system_user, platform, username=username)
run_task(tasks, _hosts)
continue
id_asset_map = {_asset.id: _asset for _asset in _assets}
assets_id = id_asset_map.keys()
no_special_auth = []
special_auth_set = set()
for _host in _hosts:
system_user.load_asset_special_auth(_host)
tasks = get_push_system_user_tasks(system_user, platform, username=username)
run_task(tasks, [_host])
auth_books = AuthBook.objects.filter(username__in=usernames, asset_id__in=assets_id)
for auth_book in auth_books:
special_auth_set.add((auth_book.username, auth_book.asset_id))
for _username in usernames:
no_special_assets = []
for asset_id in assets_id:
if (_username, asset_id) not in special_auth_set:
no_special_assets.append(id_asset_map[asset_id])
if no_special_assets:
no_special_auth.append((_username, no_special_assets))
for _username, no_special_assets in no_special_auth:
tasks = get_push_system_user_tasks(system_user, platform, username=_username)
run_task(tasks, no_special_assets)
for auth_book in auth_books:
system_user._merge_auth(auth_book)
tasks = get_push_system_user_tasks(system_user, platform, username=auth_book.username)
asset = id_asset_map[auth_book.asset_id]
run_task(tasks, [asset])
@shared_task(queue="ansible")
@tmp_to_root_org()
def push_system_user_to_assets_manual(system_user, username=None):
"""
Push the system user to all assets related to it
"""
system_user = get_object_if_need(SystemUser, system_user)
assets = system_user.get_related_assets()
task_name = _("Push system users to assets: {}").format(system_user.name)
return push_system_user_util(system_user, assets, task_name=task_name, username=username)
@shared_task(queue="ansible")
@tmp_to_root_org()
def push_system_user_a_asset_manual(system_user, asset, username=None):
"""
Push the system user to a single asset
"""
if username is None:
username = system_user.username
task_name = _("Push system users to asset: {}({}) => {}").format(
@@ -237,12 +280,17 @@ def push_system_user_a_asset_manual(system_user, asset, username=None):
@shared_task(queue="ansible")
def push_system_user_to_assets(system_user, assets, username=None):
@tmp_to_root_org()
def push_system_user_to_assets(system_user_id, assets_id, username=None):
"""
Push the system user to the specified assets
"""
system_user = SystemUser.objects.get(id=system_user_id)
assets = get_objects(Asset, assets_id)
task_name = _("Push system users to assets: {}").format(system_user.name)
return push_system_user_util(system_user, assets, task_name, username=username)
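
# Illustrative call (ids hypothetical): callers now pass primary keys rather than model
# instances, so the celery payload stays JSON-serializable; the signal handlers above
# do exactly this:
#
#   push_system_user_to_assets.delay(system_user.id, [asset.id for asset in assets])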
# @shared_task
# @register_as_period_task(interval=3600)
# @after_app_ready_start


@@ -2,6 +2,7 @@
from django.urls import path, re_path
from rest_framework_nested import routers
from rest_framework_bulk.routes import BulkRouter
from django.db.transaction import non_atomic_requests
from common import api as capi
@@ -44,6 +45,7 @@ urlpatterns = [
path('admin-users/<uuid:pk>/assets/', api.AdminUserAssetsListView.as_view(), name='admin-user-assets'),
path('system-users/<uuid:pk>/auth-info/', api.SystemUserAuthInfoApi.as_view(), name='system-user-auth-info'),
path('system-users/<uuid:pk>/assets/', api.SystemUserAssetsListView.as_view(), name='system-user-assets'),
path('system-users/<uuid:pk>/assets/<uuid:aid>/auth-info/', api.SystemUserAssetAuthInfoApi.as_view(), name='system-user-asset-auth-info'),
path('system-users/<uuid:pk>/tasks/', api.SystemUserTaskApi.as_view(), name='system-user-task-create'),
path('system-users/<uuid:pk>/cmd-filter-rules/', api.SystemUserCommandFilterRuleListApi.as_view(), name='system-user-cmd-filter-rule-list'),
@@ -54,9 +56,9 @@ urlpatterns = [
path('nodes/children/', api.NodeChildrenApi.as_view(), name='node-children-2'),
path('nodes/<uuid:pk>/children/add/', api.NodeAddChildrenApi.as_view(), name='node-add-children'),
path('nodes/<uuid:pk>/assets/', api.NodeAssetsApi.as_view(), name='node-assets'),
path('nodes/<uuid:pk>/assets/add/', api.NodeAddAssetsApi.as_view(), name='node-add-assets'),
path('nodes/<uuid:pk>/assets/replace/', api.NodeReplaceAssetsApi.as_view(), name='node-replace-assets'),
path('nodes/<uuid:pk>/assets/remove/', api.NodeRemoveAssetsApi.as_view(), name='node-remove-assets'),
path('nodes/<uuid:pk>/assets/add/', non_atomic_requests(api.NodeAddAssetsApi.as_view()), name='node-add-assets'),
path('nodes/<uuid:pk>/assets/replace/', non_atomic_requests(api.MoveAssetsToNodeApi.as_view()), name='node-replace-assets'),
path('nodes/<uuid:pk>/assets/remove/', non_atomic_requests(api.NodeRemoveAssetsApi.as_view()), name='node-remove-assets'),
path('nodes/<uuid:pk>/tasks/', api.NodeTaskCreateApi.as_view(), name='node-task-create'),
path('gateways/<uuid:pk>/test-connective/', api.GatewayTestConnectionApi.as_view(), name='test-gateway-connective'),


@@ -1,195 +1,59 @@
# ~*~ coding: utf-8 ~*~
#
from treelib import Tree
from treelib.exceptions import NodeIDAbsentError
from collections import defaultdict
from copy import deepcopy
import time
from common.utils import get_logger, timeit, lazyproperty
from django.db.models import Q
from common.utils import get_logger, dict_get_any, is_uuid, get_object_or_none
from common.utils.lock import DistributedLock
from common.http import is_true
from .models import Asset, Node
logger = get_logger(__file__)
class TreeService(Tree):
tag_sep = ' / '
@DistributedLock(name="assets.node.check_node_assets_amount", blocking=False)
def check_node_assets_amount():
for node in Node.objects.all():
logger.info(f'Check node assets amount: {node}')
assets_amount = Asset.objects.filter(
Q(nodes__key__istartswith=f'{node.key}:') | Q(nodes=node)
).distinct().count()
@staticmethod
@timeit
def get_nodes_assets_map():
nodes_assets_map = defaultdict(set)
asset_node_list = Node.assets.through.objects.values_list(
'asset', 'node__key'
)
for asset_id, key in asset_node_list:
nodes_assets_map[key].add(asset_id)
return nodes_assets_map
if node.assets_amount != assets_amount:
logger.warn(f'Node wrong assets amount <Node:{node.key}> '
f'{node.assets_amount}, should be {assets_amount}')
node.assets_amount = assets_amount
node.save()
# Avoid putting too much pressure on the database during the self-check
time.sleep(0.1)
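
# Why the filter needs the explicit ':' suffix (keys illustrative): for node.key = '1',
# Q(nodes__key__istartswith='1:') matches descendant keys such as '1:2' and '1:10:3'
# but not a sibling root like '11', while Q(nodes=node) adds the assets linked to '1'
# itself; distinct() then counts every asset in the subtree exactly once.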
@classmethod
@timeit
def new(cls):
from .models import Node
all_nodes = list(Node.objects.all().values("key", "value"))
all_nodes.sort(key=lambda x: len(x["key"].split(":")))
tree = cls()
tree.create_node(tag='', identifier='', data={})
for node in all_nodes:
key = node["key"]
value = node["value"]
parent_key = ":".join(key.split(":")[:-1])
tree.safe_create_node(
tag=value, identifier=key,
parent=parent_key,
)
tree.init_assets()
return tree
def init_assets(self):
node_assets_map = self.get_nodes_assets_map()
for node in self.all_nodes_itr():
key = node.identifier
assets = node_assets_map.get(key, set())
data = {"assets": assets, "all_assets": None}
node.data = data
def is_asset_exists_in_node(asset_pk, node_key):
return Asset.objects.filter(
id=asset_pk
).filter(
Q(nodes__key__istartswith=f'{node_key}:') | Q(nodes__key=node_key)
).exists()
def safe_create_node(self, **kwargs):
parent = kwargs.get("parent")
if not self.contains(parent):
kwargs['parent'] = self.root
self.create_node(**kwargs)
def all_children_ids(self, nid, with_self=True):
children_ids = self.expand_tree(nid)
if not with_self:
next(children_ids)
return list(children_ids)
def is_query_node_all_assets(request):
request = request
query_all_arg = request.query_params.get('all', 'true')
show_current_asset_arg = request.query_params.get('show_current_asset')
if show_current_asset_arg is not None:
return not is_true(show_current_asset_arg)
return is_true(query_all_arg)
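
# Behaviour sketch (parameter values illustrative):
#   ?all=true (or no parameters)  ->  True
#   ?all=false                    ->  False
#   ?show_current_asset=1         ->  False  (legacy flag wins: current node only)
#   ?show_current_asset=0         ->  True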
def all_children(self, nid, with_self=True, deep=False):
children_ids = self.all_children_ids(nid, with_self=with_self)
return [self.get_node(i, deep=deep) for i in children_ids]
def ancestors_ids(self, nid, with_self=True):
ancestor_ids = list(self.rsearch(nid))
ancestor_ids.pop()
if not with_self:
ancestor_ids.pop(0)
return ancestor_ids
def get_node(request):
node_id = dict_get_any(request.query_params, ['node', 'node_id'])
if not node_id:
return None
def ancestors(self, nid, with_self=False, deep=False):
ancestor_ids = self.ancestors_ids(nid, with_self=with_self)
ancestors = [self.get_node(i, deep=deep) for i in ancestor_ids]
return ancestors
def get_node_full_tag(self, nid):
ancestors = self.ancestors(nid, with_self=True)
ancestors.reverse()
return self.tag_sep.join([n.tag for n in ancestors])
def get_family(self, nid, deep=False):
ancestors = self.ancestors(nid, with_self=False, deep=deep)
children = self.all_children(nid, with_self=False)
return ancestors + [self[nid]] + children
@staticmethod
def is_parent(child, parent):
parent_id = child.bpointer
return parent_id == parent.identifier
def root_node(self):
return self.get_node(self.root)
def get_node(self, nid, deep=False):
node = super().get_node(nid)
if deep:
node = self.copy_node(node)
node.data = {}
return node
def parent(self, nid, deep=False):
parent = super().parent(nid)
if deep:
parent = self.copy_node(parent)
return parent
@lazyproperty
def invalid_assets(self):
assets = Asset.objects.filter(is_active=False).values_list('id', flat=True)
return assets
def set_assets(self, nid, assets):
node = self.get_node(nid)
if node.data is None:
node.data = {}
node.data["assets"] = assets
def assets(self, nid):
node = self.get_node(nid)
return node.data.get("assets", set())
def valid_assets(self, nid):
return set(self.assets(nid)) - set(self.invalid_assets)
def all_assets(self, nid):
node = self.get_node(nid)
if node.data is None:
node.data = {}
all_assets = node.data.get("all_assets")
if all_assets is not None:
return all_assets
all_assets = set(self.assets(nid))
try:
children = self.children(nid)
except NodeIDAbsentError:
children = []
for child in children:
all_assets.update(self.all_assets(child.identifier))
node.data["all_assets"] = all_assets
return all_assets
def all_valid_assets(self, nid):
return set(self.all_assets(nid)) - set(self.invalid_assets)
def assets_amount(self, nid):
return len(self.all_assets(nid))
def valid_assets_amount(self, nid):
return len(self.all_valid_assets(nid))
@staticmethod
def copy_node(node):
new_node = deepcopy(node)
new_node.fpointer = None
return new_node
def safe_add_ancestors(self, node, ancestors):
# If there are no ancestor nodes, add this node with the root node as its parent
if len(ancestors) == 0:
parent = self.root_node()
else:
parent = ancestors[0]
# If the current node is already in the tree, move it under its parent
# This happens because the node was previously placed under a second-level node
if not self.contains(parent.identifier):
# logger.debug('Add parent: {}'.format(parent.identifier))
self.safe_add_ancestors(parent, ancestors[1:])
if self.contains(node.identifier):
# msg = 'Move node to parent: {} => {}'.format(
# node.identifier, parent.identifier
# )
# logger.debug(msg)
self.move_node(node.identifier, parent.identifier)
else:
# logger.debug('Add node: {}'.format(node.identifier))
self.add_node(node, parent)
#
# def __getstate__(self):
# self.mutex = None
# self.all_nodes_assets_map = {}
# self.nodes_assets_map = {}
# return self.__dict__
# def __setstate__(self, state):
# self.__dict__ = state
if is_uuid(node_id):
node = get_object_or_none(Node, id=node_id)
else:
node = get_object_or_none(Node, key=node_id)
return node
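
# Lookup sketch (values illustrative): the node can be referenced either by UUID or by
# key, and both ?node= and ?node_id= are accepted.
#   ?node=9c61...<uuid>   ->  looked up by id via get_object_or_none
#   ?node_id=1:2:5        ->  not a UUID, so looked up by key instead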


@@ -0,0 +1,18 @@
# Generated by Django 3.1 on 2020-12-09 03:03
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('audits', '0010_auto_20200811_1122'),
]
operations = [
migrations.AddField(
model_name='userloginlog',
name='backend',
field=models.CharField(default='', max_length=32, verbose_name='Authentication backend'),
),
]


@@ -105,6 +105,7 @@ class UserLoginLog(models.Model):
reason = models.CharField(default='', max_length=128, blank=True, verbose_name=_('Reason'))
status = models.BooleanField(max_length=2, default=True, choices=STATUS_CHOICE, verbose_name=_('Status'))
datetime = models.DateTimeField(default=timezone.now, verbose_name=_('Date login'))
backend = models.CharField(max_length=32, default='', verbose_name=_('Authentication backend'))
@classmethod
def get_login_logs(cls, date_from=None, date_to=None, user=None, keyword=None):


@@ -12,7 +12,7 @@ from . import models
class FTPLogSerializer(serializers.ModelSerializer):
operate_display = serializers.ReadOnlyField(source='get_operate_display')
operate_display = serializers.ReadOnlyField(source='get_operate_display', label=_('Operate for display'))
class Meta:
model = models.FTPLog
@@ -23,16 +23,20 @@ class FTPLogSerializer(serializers.ModelSerializer):
class UserLoginLogSerializer(serializers.ModelSerializer):
type_display = serializers.ReadOnlyField(source='get_type_display')
status_display = serializers.ReadOnlyField(source='get_status_display')
mfa_display = serializers.ReadOnlyField(source='get_mfa_display')
type_display = serializers.ReadOnlyField(source='get_type_display', label=_('Type for display'))
status_display = serializers.ReadOnlyField(source='get_status_display', label=_('Status for display'))
mfa_display = serializers.ReadOnlyField(source='get_mfa_display', label=_('MFA for display'))
class Meta:
model = models.UserLoginLog
fields = (
'id', 'username', 'type', 'type_display', 'ip', 'city', 'user_agent',
'mfa', 'reason', 'status', 'status_display', 'datetime', 'mfa_display'
'mfa', 'reason', 'status', 'status_display', 'datetime', 'mfa_display',
'backend'
)
extra_kwargs = {
"user_agent": {'label': _('User agent')}
}
class OperateLogSerializer(serializers.ModelSerializer):
@@ -75,6 +79,8 @@ class CommandExecutionSerializer(serializers.ModelSerializer):
'hosts': {'label': _('Hosts')}, # 外键,会生成 sql。不在 model 上修改
'run_as': {'label': _('Run as')},
'user': {'label': _('User')},
'run_as_display': {'label': _('Run as for display')},
'user_display': {'label': _('User for display')},
}
@classmethod


@@ -5,6 +5,8 @@ from django.db.models.signals import post_save, post_delete
from django.dispatch import receiver
from django.db import transaction
from django.utils import timezone
from django.contrib.auth import BACKEND_SESSION_KEY
from django.utils.translation import ugettext_lazy as _
from rest_framework.renderers import JSONRenderer
from rest_framework.request import Request
@@ -32,6 +34,19 @@ MODELS_NEED_RECORD = (
)
LOGIN_BACKEND = {
'PublicKeyAuthBackend': _('SSH Key'),
'RadiusBackend': User.Source.radius.label,
'RadiusRealmBackend': User.Source.radius.label,
'LDAPAuthorizationBackend': User.Source.ldap.label,
'ModelBackend': _('Password'),
'SSOAuthentication': _('SSO'),
'CASBackend': User.Source.cas.label,
'OIDCAuthCodeBackend': User.Source.openid.label,
'OIDCAuthPasswordBackend': User.Source.openid.label,
}
def create_operate_log(action, sender, resource):
user = current_request.user if current_request else None
if not user or not user.is_authenticated:
@@ -109,6 +124,16 @@ def on_audits_log_create(sender, instance=None, **kwargs):
sys_logger.info(msg)
def get_login_backend(request):
backend = request.session.get(BACKEND_SESSION_KEY, '')
backend = backend.rsplit('.', maxsplit=1)[-1]
if backend in LOGIN_BACKEND:
return LOGIN_BACKEND[backend]
else:
logger.warn(f'LOGIN_BACKEND_NOT_FOUND: {backend}')
return ''
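
# Example (session content illustrative): after an SSO login the session stores the
# dotted backend path, and only the trailing class name is mapped to a display label.
#
#   backend = request.session[BACKEND_SESSION_KEY]
#   # 'authentication.backends.api.SSOAuthentication'
#   backend = backend.rsplit('.', maxsplit=1)[-1]    # 'SSOAuthentication'
#   LOGIN_BACKEND[backend]                           # _('SSO')
#
# Unknown backends are logged with a warning and recorded as an empty string.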
def generate_data(username, request):
user_agent = request.META.get('HTTP_USER_AGENT', '')
login_ip = get_request_ip(request) or '0.0.0.0'
@@ -122,7 +147,8 @@ def generate_data(username, request):
'ip': login_ip,
'type': login_type,
'user_agent': user_agent,
'datetime': timezone.now()
'datetime': timezone.now(),
'backend': get_login_backend(request)
}
return data


@@ -2,32 +2,44 @@
#
import datetime
from django.utils import timezone
from django.conf import settings
from celery import shared_task
from ops.celery.decorator import register_as_period_task
from ops.celery.decorator import (
register_as_period_task, after_app_shutdown_clean_periodic
)
from .models import UserLoginLog, OperateLog, FTPLog
from common.utils import get_log_keep_day
@register_as_period_task(interval=3600*24)
@shared_task
@after_app_shutdown_clean_periodic
def clean_login_log_period():
now = timezone.now()
try:
days = int(settings.LOGIN_LOG_KEEP_DAYS)
except ValueError:
days = 90
days = get_log_keep_day('LOGIN_LOG_KEEP_DAYS')
expired_day = now - datetime.timedelta(days=days)
UserLoginLog.objects.filter(datetime__lt=expired_day).delete()
@register_as_period_task(interval=3600*24)
@shared_task
@after_app_shutdown_clean_periodic
def clean_operation_log_period():
now = timezone.now()
try:
days = int(settings.LOGIN_LOG_KEEP_DAYS)
except ValueError:
days = 90
days = get_log_keep_day('OPERATE_LOG_KEEP_DAYS')
expired_day = now - datetime.timedelta(days=days)
OperateLog.objects.filter(datetime__lt=expired_day).delete()
@shared_task
def clean_ftp_log_period():
now = timezone.now()
days = get_log_keep_day('FTP_LOG_KEEP_DAYS')
expired_day = now - datetime.timedelta(days=days)
FTPLog.objects.filter(datetime__lt=expired_day).delete()
@register_as_period_task(interval=3600*24)
@shared_task
def clean_audits_log_period():
clean_login_log_period()
clean_operation_log_period()
clean_ftp_log_period()


@@ -54,12 +54,3 @@ class UserConnectionTokenApi(RootOrgViewMixin, APIView):
return Response(value)
else:
return Response({'user': value['user']})
def get_permissions(self):
if self.request.query_params.get('user-only', None):
self.permission_classes = (AllowAny,)
return super().get_permissions()


@@ -60,6 +60,7 @@ class SSOViewSet(AuthMixin, JmsGenericViewSet):
This endpoint violates the `Restful` convention:
`GET` should be a safe method, but this endpoint is not
"""
request.META['HTTP_X_JMS_LOGIN_TYPE'] = 'W'
authkey = request.query_params.get(AUTH_KEY)
next_url = request.query_params.get(NEXT_URL)
if not next_url or not next_url.startswith('/'):
@@ -73,12 +74,12 @@ class SSOViewSet(AuthMixin, JmsGenericViewSet):
token.save()
except (ValueError, SSOToken.DoesNotExist):
self.send_auth_signal(success=False, reason='authkey_invalid')
return HttpResponseRedirect(reverse('authentication:login'))
return HttpResponseRedirect(next_url)
# Check whether the token has expired
if (utcnow().timestamp() - token.date_created.timestamp()) > settings.AUTH_SSO_AUTHKEY_TTL:
self.send_auth_signal(success=False, reason='authkey_timeout')
return HttpResponseRedirect(reverse('authentication:login'))
return HttpResponseRedirect(next_url)
user = token.user
login(self.request, user, 'authentication.backends.api.SSOAuthentication')


@@ -6,7 +6,7 @@ import time
from django.core.cache import cache
from django.utils.translation import ugettext as _
from django.utils.six import text_type
from six import text_type
from django.contrib.auth import get_user_model
from django.contrib.auth.backends import ModelBackend
from rest_framework import HTTP_HEADER_ENCODING
@@ -202,4 +202,6 @@ class SSOAuthentication(ModelBackend):
"""
Does nothing at all 😺
"""
pass
def authenticate(self, request, sso_token=None, **kwargs):
pass


@@ -0,0 +1,10 @@
from django_cas_ng.middleware import CASMiddleware as _CASMiddleware
from django.core.exceptions import MiddlewareNotUsed
from django.conf import settings
class CASMiddleware(_CASMiddleware):
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
if not settings.AUTH_CAS:
raise MiddlewareNotUsed


@@ -82,6 +82,12 @@ class LDAPAuthorizationBackend(LDAPBackend):
class LDAPUser(_LDAPUser):
def _search_for_user_dn_from_ldap_util(self):
from settings.utils import LDAPServerUtil
util = LDAPServerUtil()
user_dn = util.search_for_user_dn(self._username)
return user_dn
def _search_for_user_dn(self):
"""
This method was overridden because the AUTH_LDAP_USER_SEARCH
@@ -107,7 +113,14 @@ class LDAPUser(_LDAPUser):
if results is not None and len(results) == 1:
(user_dn, self._user_attrs) = next(iter(results))
else:
user_dn = None
# Work around login failures when only the DC domain is configured (the library cannot search the whole tree)
user_dn = self._search_for_user_dn_from_ldap_util()
if user_dn is None:
self._user_dn = None
self._user_attrs = None
else:
self._user_dn = user_dn
self._user_attrs = self._load_user_attrs()
return user_dn


@@ -0,0 +1,10 @@
from jms_oidc_rp.middleware import OIDCRefreshIDTokenMiddleware as _OIDCRefreshIDTokenMiddleware
from django.core.exceptions import MiddlewareNotUsed
from django.conf import settings
class OIDCRefreshIDTokenMiddleware(_OIDCRefreshIDTokenMiddleware):
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
if not settings.AUTH_OPENID:
raise MiddlewareNotUsed


@@ -2,7 +2,7 @@
#
import traceback
from django.contrib.auth import get_user_model
from django.contrib.auth import get_user_model, authenticate
from radiusauth.backends import RADIUSBackend, RADIUSRealmBackend
from django.conf import settings
@@ -23,7 +23,7 @@ class CreateUserMixin:
email_suffix = settings.EMAIL_SUFFIX
email = '{}@{}'.format(username, email_suffix)
user = User(username=username, name=username, email=email)
user.source = user.SOURCE_RADIUS
user.source = user.Source.radius.value
user.save()
return user
@@ -38,16 +38,12 @@ class CreateUserMixin:
return [], False, False
return None
def authenticate(self, *args, **kwargs):
# A public_key kwarg is passed in when verifying the user, but the parent class's
# authenticate() does not accept public_key, so it has to be popped off
# TODO: the backends' authenticate methods need refactoring; Django inspects each
# authenticate signature before calling it
kwargs.pop('public_key', None)
return super().authenticate(*args, **kwargs)
class RadiusBackend(CreateUserMixin, RADIUSBackend):
pass
def authenticate(self, request, username='', password='', **kwargs):
return super().authenticate(request, username=username, password=password)
class RadiusRealmBackend(CreateUserMixin, RADIUSRealmBackend):
pass
def authenticate(self, request, username='', password='', realm=None, **kwargs):
return super().authenticate(request, username=username, password=password, realm=realm)


@@ -218,5 +218,14 @@ class PasswdTooSimple(JMSException):
default_detail = _('Your password is too simple, please change it for security')
def __init__(self, url, *args, **kwargs):
super(PasswdTooSimple, self).__init__(*args, **kwargs)
super().__init__(*args, **kwargs)
self.url = url
class PasswordRequireResetError(JMSException):
default_code = 'passwd_has_expired'
default_detail = _('Your password has expired, please reset before logging in')
def __init__(self, url, *args, **kwargs):
super().__init__(*args, **kwargs)
self.url = url


@@ -4,7 +4,7 @@
from django import forms
from django.conf import settings
from django.utils.translation import gettext_lazy as _
from captcha.fields import CaptchaField
from captcha.fields import CaptchaField, CaptchaTextInput
class UserLoginForm(forms.Form):
@@ -26,8 +26,12 @@ class UserCheckOtpCodeForm(forms.Form):
otp_code = forms.CharField(label=_('MFA code'), max_length=6)
class CustomCaptchaTextInput(CaptchaTextInput):
template_name = 'authentication/_captcha_field.html'
class CaptchaMixin(forms.Form):
captcha = CaptchaField()
captcha = CaptchaField(widget=CustomCaptchaTextInput)
class ChallengeMixin(forms.Form):

View File

@@ -53,7 +53,7 @@ class AuthMixin:
ip = ip or get_request_ip(self.request)
return ip
def check_is_block(self):
def check_is_block(self, raise_exception=True):
if hasattr(self.request, 'data'):
username = self.request.data.get("username")
else:
@@ -61,7 +61,11 @@ class AuthMixin:
ip = self.get_request_ip()
if is_block_login(username, ip):
logger.warn('Ip was blocked' + ': ' + username + ':' + ip)
raise errors.BlockLoginError(username=username, ip=ip)
exception = errors.BlockLoginError(username=username, ip=ip)
if raise_exception:
raise errors.BlockLoginError(username=username, ip=ip)
else:
return exception
def decrypt_passwd(self, raw_passwd):
# Get the decryption key and decrypt the password
@@ -106,9 +110,8 @@ class AuthMixin:
raise CredentialError(error=errors.reason_user_inactive)
elif not user.is_active:
raise CredentialError(error=errors.reason_user_inactive)
elif user.password_has_expired:
raise CredentialError(error=errors.reason_password_expired)
self._check_password_require_reset_or_not(user)
self._check_passwd_is_too_simple(user, password)
clean_failed_count(username, ip)
@@ -119,20 +122,34 @@ class AuthMixin:
return user
@classmethod
def _check_passwd_is_too_simple(cls, user, password):
def generate_reset_password_url_with_flash_msg(cls, user: User, flash_view_name):
reset_passwd_url = reverse('authentication:reset-password')
query_str = urlencode({
'token': user.generate_reset_token()
})
reset_passwd_url = f'{reset_passwd_url}?{query_str}'
flash_page_url = reverse(flash_view_name)
query_str = urlencode({
'redirect_url': reset_passwd_url
})
return f'{flash_page_url}?{query_str}'
@classmethod
def _check_passwd_is_too_simple(cls, user: User, password):
if user.is_superuser and password == 'admin':
reset_passwd_url = reverse('authentication:reset-password')
query_str = urlencode({
'token': user.generate_reset_token()
})
reset_passwd_url = f'{reset_passwd_url}?{query_str}'
url = cls.generate_reset_password_url_with_flash_msg(
user, 'authentication:passwd-too-simple-flash-msg'
)
raise errors.PasswdTooSimple(url)
flash_page_url = reverse('authentication:passwd-too-simple-flash-msg')
query_str = urlencode({
'redirect_url': reset_passwd_url
})
raise errors.PasswdTooSimple(f'{flash_page_url}?{query_str}')
@classmethod
def _check_password_require_reset_or_not(cls, user: User):
if user.password_has_expired:
url = cls.generate_reset_password_url_with_flash_msg(
user, 'authentication:passwd-has-expired-flash-msg'
)
raise errors.PasswordRequireResetError(url)
def check_user_auth_if_need(self, decrypt_passwd=False):
request = self.request

View File
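generate_reset_password_url_with_flash_msg builds a two-level redirect: a flash-message page URL whose redirect_url query parameter carries the reset-password URL, which in turn carries the one-time token. A standalone sketch of that construction using only urllib (the paths are illustrative stand-ins for Django's reverse()):

from urllib.parse import urlencode


def build_flash_redirect_url(reset_token,
                             reset_path='/core/auth/password/reset/',
                             flash_path='/core/auth/password/too-simple-flash-msg/'):
    # Inner URL: the real reset page, carrying the one-time token.
    reset_url = f'{reset_path}?{urlencode({"token": reset_token})}'
    # Outer URL: the flash-message page, which then redirects to the inner URL.
    return f'{flash_path}?{urlencode({"redirect_url": reset_url})}'


print(build_flash_redirect_url('abc123'))
# /core/auth/password/too-simple-flash-msg/?redirect_url=%2Fcore%2Fauth%2Fpassword%2Freset%2F%3Ftoken%3Dabc123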

@@ -1,11 +1,9 @@
import uuid
from functools import partial
from django.utils import timezone
from django.utils.translation import ugettext_lazy as _, ugettext as __
from rest_framework.authtoken.models import Token
from django.conf import settings
from django.utils.crypto import get_random_string
from common.db import models
from common.mixins.models import CommonModelMixin
@@ -53,7 +51,7 @@ class LoginConfirmSetting(CommonModelMixin):
def create_confirm_ticket(self, request=None):
from tickets.models import Ticket
title = _('Login confirm') + '{}'.format(self.user)
title = _('Login confirm') + ' {}'.format(self.user)
if request:
remote_addr = get_request_ip(request)
city = get_ip_city(remote_addr)

View File

@@ -1,15 +1,41 @@
from importlib import import_module
from django.contrib.auth import BACKEND_SESSION_KEY
from django.conf import settings
from django.contrib.auth import user_logged_in
from django.core.cache import cache
from django.dispatch import receiver
from django_cas_ng.signals import cas_user_authenticated
from jms_oidc_rp.signals import openid_user_login_failed, openid_user_login_success
from .signals import post_auth_success, post_auth_failed
@receiver(user_logged_in)
def on_user_auth_login_success(sender, user, request, **kwargs):
if settings.USER_LOGIN_SINGLE_MACHINE_ENABLED:
user_id = 'single_machine_login_' + str(user.id)
session_key = cache.get(user_id)
if session_key and session_key != request.session.session_key:
session = import_module(settings.SESSION_ENGINE).SessionStore(session_key)
session.delete()
cache.set(user_id, request.session.session_key, None)
@receiver(openid_user_login_success)
def on_oidc_user_login_success(sender, request, user, **kwargs):
request.session[BACKEND_SESSION_KEY] = 'OIDCAuthCodeBackend'
post_auth_success.send(sender, user=user, request=request)
@receiver(openid_user_login_failed)
def on_oidc_user_login_failed(sender, username, request, reason, **kwargs):
request.session[BACKEND_SESSION_KEY] = 'OIDCAuthCodeBackend'
post_auth_failed.send(sender, username=username, request=request, reason=reason)
@receiver(cas_user_authenticated)
def on_cas_user_login_success(sender, request, user, **kwargs):
request.session[BACKEND_SESSION_KEY] = 'CASBackend'
post_auth_success.send(sender, user=user, request=request)

View File
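The new on_user_auth_login_success receiver enforces single-machine login: the latest session key per user is kept in the cache, and the older session is deleted from the session store when a newer login arrives. A self-contained sketch of the eviction logic, with plain dicts standing in for the cache and SessionStore:

class SingleSessionGuard:
    """Keep only the most recent session per user (illustrative stand-in
    for the cache + SessionStore logic in the signal handler above)."""

    def __init__(self):
        self.latest_session = {}   # user_id -> session_key (the "cache")
        self.session_store = {}    # session_key -> session data

    def login(self, user_id, session_key):
        previous = self.latest_session.get(user_id)
        if previous and previous != session_key:
            # Evict the older session so the user stays logged in on one machine only.
            self.session_store.pop(previous, None)
        self.latest_session[user_id] = session_key
        self.session_store[session_key] = {'user_id': user_id}


guard = SingleSessionGuard()
guard.login('u1', 'sess-a')
guard.login('u1', 'sess-b')   # evicts sess-a
assert 'sess-a' not in guard.session_store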

@@ -0,0 +1,29 @@
{% load i18n %}
{% spaceless %}
<img src="{{ image }}" alt="captcha" class="captcha" />
<div class="row" style="padding-bottom: 10px">
<div class="col-sm-6">
<div class="input-group-prepend">
{% if audio %}
<a title="{% trans "Play CAPTCHA as audio file" %}" href="{{ audio }}">
{% endif %}
</div>
{% include "django/forms/widgets/multiwidget.html" %}
</div>
</div>
<script>
var placeholder = '{% trans "Captcha" %}'
function refresh_captcha() {
$.getJSON("{% url "captcha-refresh" %}",
function (result) {
$('.captcha').attr('src', result['image_url']);
$('#id_captcha_0').val(result['key'])
})
}
$(document).ready(function () {
$('.captcha').click(refresh_captcha)
$('#id_captcha_1').addClass('form-control').attr('placeholder', placeholder)
})
</script>
{% endspaceless %}

View File

@@ -26,7 +26,8 @@
{% endif %}
</div>
<div class="form-group">
<input type="password" class="form-control" id="password" name="{{ form.password.html_name }}" placeholder="{% trans 'Password' %}" required="">
<input type="password" class="form-control" id="password" placeholder="{% trans 'Password' %}" required="">
<input id="password-hidden" type="text" style="display:none" name="{{ form.password.html_name }}">
{% if form.errors.password %}
<div class="help-block field-error">
<p class="red-fonts">{{ form.errors.password.as_text }}</p>
@@ -56,7 +57,7 @@
<div class="text-muted text-center">
<div>
<a href="{% url 'authentication:forgot-password' %}">
<a id="forgot_password" href="#">
<small>{% trans 'Forgot password' %}?</small>
</a>
</div>
@@ -86,8 +87,20 @@
var rsaPublicKey = "{{ rsa_public_key }}"
var password = $('#password').val(); // plaintext password
var passwordEncrypted = encryptLoginPassword(password, rsaPublicKey)
$('#password').val(passwordEncrypted); // write the encrypted value back to the password input
$('#password-hidden').val(passwordEncrypted); // write the encrypted value back to the password input
$('#form').submit(); // submit via POST
}
var authDB = '{{ AUTH_DB }}';
var forgotPasswordUrl = "{% url 'authentication:forgot-password' %}";
$(document).ready(function () {
}).on('click', '#forgot_password', function () {
if (authDB === 'True'){
window.open(forgotPasswordUrl, "_blank")
}
else{
alert("{% trans 'You are using another authentication server, please contact your administrator' %}")
}
})
</script>
{% endblock %}

View File

@@ -20,7 +20,7 @@
<div class="form-group">
<input type="text" class="form-control" name="otp_code" placeholder="" required="" autofocus="autofocus">
<span class="help-block">
{% trans 'Open Google Authenticator and enter the 6-bit dynamic code' %}
{% trans 'Open MFA Authenticator and enter the 6-bit dynamic code' %}
</span>
</div>
<button type="submit" class="btn btn-primary block full-width m-b">{% trans 'Next' %}</button>

View File

@@ -106,7 +106,8 @@
{% endif %}
</div>
<div class="form-group">
<input type="password" class="form-control" id="password" name="{{ form.password.html_name }}" placeholder="{% trans 'Password' %}" required="">
<input type="password" class="form-control" id="password" placeholder="{% trans 'Password' %}" required="">
<input id="password-hidden" type="text" style="display:none" name="{{ form.password.html_name }}">
{% if form.errors.password %}
<div class="help-block field-error">
<p class="red-fonts">{{ form.errors.password.as_text }}</p>
@@ -130,7 +131,7 @@
<button type="submit" class="btn btn-transparent" onclick="doLogin();return false;">{% trans 'Login' %}</button>
</div>
<div style="text-align: center">
<a href="{% url 'authentication:forgot-password' %}">
<a id="forgot_password" href="#">
<small>{% trans 'Forgot password' %}?</small>
</a>
</div>
@@ -143,6 +144,7 @@
</div>
</div>
</div>
</div>
</body>
<script type="text/javascript" src="/static/js/plugins/jsencrypt/jsencrypt.min.js"></script>
@@ -157,9 +159,21 @@
var rsaPublicKey = "{{ rsa_public_key }}"
var password = $('#password').val(); // plaintext password
var passwordEncrypted = encryptLoginPassword(password, rsaPublicKey)
$('#password').val(passwordEncrypted); // write the encrypted value back to the password input
$('#password-hidden').val(passwordEncrypted); // write the encrypted value back to the password input
$('#contact-form').submit(); // submit via POST
}
var authDB = '{{ AUTH_DB }}';
var forgotPasswordUrl = "{% url 'authentication:forgot-password' %}";
$(document).ready(function () {
}).on('click', '#forgot_password', function () {
if (authDB === 'True'){
window.open(forgotPasswordUrl, "_blank")
}
else{
alert("{% trans 'You are using another authentication server, please contact your administrator' %}")
}
})
</script>
</html>

View File

@@ -22,6 +22,7 @@ urlpatterns = [
name='forgot-password-sendmail-success'),
path('password/reset/', users_view.UserResetPasswordView.as_view(), name='reset-password'),
path('password/too-simple-flash-msg/', views.FlashPasswdTooSimpleMsgView.as_view(), name='passwd-too-simple-flash-msg'),
path('password/has-expired-msg/', views.FlashPasswdHasExpiredMsgView.as_view(), name='passwd-has-expired-flash-msg'),
path('password/reset/success/', users_view.UserResetPasswordSuccessView.as_view(), name='reset-password-success'),
path('password/verify/', users_view.UserVerifyPasswordView.as_view(), name='user-verify-password'),

View File

@@ -17,6 +17,7 @@ from django.views.generic.base import TemplateView, RedirectView
from django.views.generic.edit import FormView
from django.conf import settings
from django.urls import reverse_lazy
from django.contrib.auth import BACKEND_SESSION_KEY
from common.const.front_urls import TICKET_DETAIL
from common.utils import get_request_ip, get_object_or_none
@@ -31,7 +32,7 @@ from ..forms import get_user_login_form_cls
__all__ = [
'UserLoginView', 'UserLogoutView',
'UserLoginGuardView', 'UserLoginWaitConfirmView',
'FlashPasswdTooSimpleMsgView',
'FlashPasswdTooSimpleMsgView', 'FlashPasswdHasExpiredMsgView'
]
@@ -86,6 +87,7 @@ class UserLoginView(mixins.AuthMixin, FormView):
try:
self.check_user_auth(decrypt_passwd=True)
except errors.AuthFailedError as e:
e = self.check_is_block(raise_exception=False) or e
form.add_error(None, e.msg)
ip = self.get_request_ip()
cache.set(self.key_prefix_captcha.format(ip), 1, 3600)
@@ -94,7 +96,7 @@ class UserLoginView(mixins.AuthMixin, FormView):
new_form._errors = form.errors
context = self.get_context_data(form=new_form)
return self.render_to_response(context)
except errors.PasswdTooSimple as e:
except (errors.PasswdTooSimple, errors.PasswordRequireResetError) as e:
return redirect(e.url)
self.clear_rsa_key()
return self.redirect_to_guard_view()
@@ -130,7 +132,8 @@ class UserLoginView(mixins.AuthMixin, FormView):
context = {
'demo_mode': os.environ.get("DEMO_MODE"),
'AUTH_OPENID': settings.AUTH_OPENID,
'rsa_public_key': rsa_public_key
'rsa_public_key': rsa_public_key,
'AUTH_DB': settings.AUTH_DB
}
kwargs.update(context)
return super().get_context_data(**kwargs)
@@ -205,12 +208,12 @@ class UserLoginWaitConfirmView(TemplateView):
class UserLogoutView(TemplateView):
template_name = 'flash_message_standalone.html'
@staticmethod
def get_backend_logout_url():
if settings.AUTH_OPENID:
def get_backend_logout_url(self):
backend = self.request.session.get(BACKEND_SESSION_KEY, '')
if 'OIDC' in backend:
return settings.AUTH_OPENID_AUTH_LOGOUT_URL_NAME
# if settings.AUTH_CAS:
# return settings.CAS_LOGOUT_URL_NAME
elif 'CAS' in backend:
return settings.CAS_LOGOUT_URL_NAME
return None
def get(self, request, *args, **kwargs):
@@ -247,3 +250,18 @@ class FlashPasswdTooSimpleMsgView(TemplateView):
'auto_redirect': True,
}
return self.render_to_response(context)
@method_decorator(never_cache, name='dispatch')
class FlashPasswdHasExpiredMsgView(TemplateView):
template_name = 'flash_message_standalone.html'
def get(self, request, *args, **kwargs):
context = {
'title': _('Please change your password'),
'messages': _('Your password has expired, please reset before logging in'),
'interval': 5,
'redirect_url': request.GET.get('redirect_url'),
'auto_redirect': True,
}
return self.render_to_response(context)

View File

@@ -0,0 +1,2 @@
UPDATE_NODE_TREE_LOCK_KEY = 'org_level_transaction_lock_{org_id}_assets_update_node_tree'
UPDATE_MAPPING_NODE_TASK_LOCK_KEY = 'org_level_transaction_lock_{user_id}_update_mapping_node_task'

View File
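These constants are cache-key templates for org- and user-level locks. This diff does not show how they are consumed, so the following is only a hedged illustration of one way such a key could back a best-effort lock (cache.add is atomic on backends such as Redis and Memcached):

from django.core.cache import cache

UPDATE_NODE_TREE_LOCK_KEY = 'org_level_transaction_lock_{org_id}_assets_update_node_tree'


def try_update_node_tree(org_id, do_update, timeout=60):
    """Run do_update() only if no one else holds the org-level lock."""
    key = UPDATE_NODE_TREE_LOCK_KEY.format(org_id=org_id)
    # cache.add() sets the key only if it does not exist yet, which makes it
    # usable as a simple distributed lock.
    if not cache.add(key, 1, timeout):
        return False  # someone else is updating this org's node tree
    try:
        do_update()
        return True
    finally:
        cache.delete(key)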

@@ -0,0 +1,14 @@
"""
`m2m_changed`
```
def m2m_signals_handler(action, instance, reverse, model, pk_set, using):
pass
```
"""
PRE_ADD = 'pre_add'
POST_ADD = 'post_add'
PRE_REMOVE = 'pre_remove'
POST_REMOVE = 'post_remove'
PRE_CLEAR = 'pre_clear'
POST_CLEAR = 'post_clear'

apps/common/db/utils.py
View File
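The docstring above records the m2m_changed handler signature the project standardizes on, together with the action constants. A hedged sketch of a receiver written against that signature (User.groups is just a convenient built-in M2M to hang the example on):

from django.contrib.auth.models import User
from django.db.models.signals import m2m_changed
from django.dispatch import receiver

POST_ADD = 'post_add'
POST_REMOVE = 'post_remove'


@receiver(m2m_changed, sender=User.groups.through)
def on_user_groups_changed(action, instance, reverse, model, pk_set, **kwargs):
    # instance: the object the change was made on; reverse: which side the
    # change came from; pk_set: primary keys added or removed on the other side.
    if action == POST_ADD:
        print(f'{model.__name__} {pk_set} added to {instance}')
    elif action == POST_REMOVE:
        print(f'{model.__name__} {pk_set} removed from {instance}')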

@@ -0,0 +1,40 @@
from common.utils import get_logger
logger = get_logger(__file__)
def get_object_if_need(model, pk):
if not isinstance(pk, model):
try:
return model.objects.get(id=pk)
except model.DoesNotExist as e:
logger.error(f'DoesNotExist: <{model.__name__}:{pk}> not exist')
raise e
return pk
def get_objects_if_need(model, pks):
if not pks:
return pks
if not isinstance(pks[0], model):
objs = list(model.objects.filter(id__in=pks))
if len(objs) != len(pks):
pks = set(pks)
exists_pks = {o.id for o in objs}
not_found_pks = ','.join(pks - exists_pks)
logger.error(f'DoesNotExist: <{model.__name__}: {not_found_pks}>')
return objs
return pks
def get_objects(model, pks):
if not pks:
return pks
objs = list(model.objects.filter(id__in=pks))
if len(objs) != len(pks):
pks = set(pks)
exists_pks = {o.id for o in objs}
not_found_pks = ','.join(pks - exists_pks)
logger.error(f'DoesNotExist: <{model.__name__}: {not_found_pks}>')
return objs

View File
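get_object_if_need and get_objects_if_need let callers pass either model instances or primary keys, which is convenient for Celery tasks that can only receive serializable ids. A runnable stand-in that shows the idiom without a database (FakeModel is not a real project model):

class FakeModel:
    """In-memory stand-in for a Django model, used only to demonstrate
    the accept-object-or-pk idiom from apps/common/db/utils.py."""
    _objects = {}

    def __init__(self, pk):
        self.id = pk
        self._objects[pk] = self

    @classmethod
    def get(cls, pk):
        return cls._objects[pk]


def get_object_if_need(model, pk_or_obj):
    # Return the object untouched if the caller already passed an instance,
    # otherwise resolve it by primary key.
    if isinstance(pk_or_obj, model):
        return pk_or_obj
    return model.get(pk_or_obj)


a = FakeModel(1)
assert get_object_if_need(FakeModel, a) is a   # instance passed through
assert get_object_if_need(FakeModel, 1) is a   # pk resolved to instance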

@@ -1,12 +1,16 @@
from django.core.exceptions import PermissionDenied, ObjectDoesNotExist as DJObjectDoesNotExist
from django.http import Http404
from django.utils.translation import gettext
from django.db.models.deletion import ProtectedError
from rest_framework import exceptions
from rest_framework.views import set_rollback
from rest_framework.response import Response
from common.exceptions import JMSObjectDoesNotExist
from common.exceptions import JMSObjectDoesNotExist, ReferencedByOthers
from logging import getLogger
logger = getLogger('drf_exception')
unexpected_exception_logger = getLogger('unexpected_exception')
def extract_object_name(exc, index=0):
@@ -20,12 +24,16 @@ def extract_object_name(exc, index=0):
def common_exception_handler(exc, context):
logger.exception('')
if isinstance(exc, Http404):
exc = JMSObjectDoesNotExist(object_name=extract_object_name(exc, 1))
elif isinstance(exc, PermissionDenied):
exc = exceptions.PermissionDenied()
elif isinstance(exc, DJObjectDoesNotExist):
exc = JMSObjectDoesNotExist(object_name=extract_object_name(exc, 0))
elif isinstance(exc, ProtectedError):
exc = ReferencedByOthers()
if isinstance(exc, exceptions.APIException):
headers = {}
@@ -34,12 +42,14 @@ def common_exception_handler(exc, context):
if getattr(exc, 'wait', None):
headers['Retry-After'] = '%d' % exc.wait
if isinstance(exc.detail, (list, dict)):
data = exc.detail
if isinstance(exc.detail, str) and isinstance(exc.get_codes(), str):
data = {'detail': exc.detail, 'code': exc.get_codes()}
else:
data = {'detail': exc.detail}
data = exc.detail
set_rollback()
return Response(data, status=exc.status_code, headers=headers)
else:
unexpected_exception_logger.exception('')
return None

View File
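The handler now also catches django.db.models.deletion.ProtectedError, raised when deleting a row that other rows reference through on_delete=PROTECT, and maps it to the ReferencedByOthers API exception instead of letting it surface as a 500. A tiny sketch of that mapping branch in isolation:

from django.db.models.deletion import ProtectedError


def classify(exc):
    # Mirrors the new branch in common_exception_handler(): a ProtectedError
    # becomes a "referenced by others" API error rather than a server error.
    if isinstance(exc, ProtectedError):
        return 'referenced_by_others'
    return 'unhandled'


print(classify(ProtectedError('still referenced', protected_objects=set())))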

@@ -7,6 +7,7 @@ from collections import OrderedDict
from django.core.exceptions import PermissionDenied
from django.http import Http404
from django.utils.encoding import force_text
from rest_framework.fields import empty
from rest_framework.metadata import SimpleMetadata
from rest_framework import exceptions, serializers
@@ -58,6 +59,10 @@ class SimpleMetadataWithFilters(SimpleMetadata):
field_info['type'] = self.label_lookup[field]
field_info['required'] = getattr(field, 'required', False)
default = getattr(field, 'default', False)
if default and isinstance(default, (str, int)):
field_info['default'] = default
for attr in self.attrs:
value = getattr(field, attr, None)
if value is not None and value != '':

View File

@@ -1 +1,2 @@
from .csv import *
from .csv import *
from .excel import *

View File

@@ -0,0 +1,136 @@
import abc
import json
import codecs
from rest_framework import serializers
from django.utils.translation import ugettext_lazy as _
from rest_framework.parsers import BaseParser
from rest_framework import status
from rest_framework.exceptions import ParseError, APIException
from common.utils import get_logger
logger = get_logger(__file__)
class FileContentOverflowedError(APIException):
status_code = status.HTTP_400_BAD_REQUEST
default_code = 'file_content_overflowed'
default_detail = _('The file content overflowed (The maximum length `{}` bytes)')
class BaseFileParser(BaseParser):
FILE_CONTENT_MAX_LENGTH = 1024 * 1024 * 10
serializer_cls = None
def check_content_length(self, meta):
content_length = int(meta.get('CONTENT_LENGTH', meta.get('HTTP_CONTENT_LENGTH', 0)))
if content_length > self.FILE_CONTENT_MAX_LENGTH:
msg = FileContentOverflowedError.default_detail.format(self.FILE_CONTENT_MAX_LENGTH)
logger.error(msg)
raise FileContentOverflowedError(msg)
@staticmethod
def get_stream_data(stream):
stream_data = stream.read()
stream_data = stream_data.strip(codecs.BOM_UTF8)
return stream_data
@abc.abstractmethod
def generate_rows(self, stream_data):
raise NotImplementedError
def get_column_titles(self, rows):
return next(rows)
def convert_to_field_names(self, column_titles):
fields_map = {}
fields = self.serializer_cls().fields
fields_map.update({v.label: k for k, v in fields.items()})
fields_map.update({k: k for k, _ in fields.items()})
field_names = [
fields_map.get(column_title.strip('*'), '')
for column_title in column_titles
]
return field_names
@staticmethod
def _replace_chinese_quote(s):
trans_table = str.maketrans({
'': '"',
'': '"',
'': '"',
'': '"',
'\'': '"'
})
return s.translate(trans_table)
@classmethod
def process_row(cls, row):
"""
Row-level processing before building the JSON data
"""
new_row = []
for col in row:
# Convert full-width (Chinese) quotes
col = cls._replace_chinese_quote(col)
# Convert list/dict literals
if isinstance(col, str) and (
(col.startswith('[') and col.endswith(']'))
or
(col.startswith("{") and col.endswith("}"))
):
col = json.loads(col)
new_row.append(col)
return new_row
def process_row_data(self, row_data):
"""
Row-data processing after building the JSON data
"""
new_row_data = {}
serializer_fields = self.serializer_cls().fields
for k, v in row_data.items():
if isinstance(v, list) or isinstance(v, dict) or isinstance(v, str) and k.strip() and v.strip():
# Handle fields such as disk_info arriving as the string '{}'
if not isinstance(v, str) and isinstance(serializer_fields[k], serializers.CharField):
v = str(v)
new_row_data[k] = v
return new_row_data
def generate_data(self, fields_name, rows):
data = []
for row in rows:
# Skip empty rows
if not any(row):
continue
row = self.process_row(row)
row_data = dict(zip(fields_name, row))
row_data = self.process_row_data(row_data)
data.append(row_data)
return data
def parse(self, stream, media_type=None, parser_context=None):
parser_context = parser_context or {}
try:
view = parser_context['view']
meta = view.request.META
self.serializer_cls = view.get_serializer_class()
except Exception as e:
logger.debug(e, exc_info=True)
raise ParseError('The resource does not support imports!')
self.check_content_length(meta)
try:
stream_data = self.get_stream_data(stream)
rows = self.generate_rows(stream_data)
column_titles = self.get_column_titles(rows)
field_names = self.convert_to_field_names(column_titles)
data = self.generate_data(field_names, rows)
return data
except Exception as e:
logger.error(e, exc_info=True)
raise ParseError('Parse error! ({})'.format(self.media_type))

View File
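The new BaseFileParser centralizes the size check, column-title-to-field-name mapping and row normalization; concrete parsers only implement generate_rows(). A hedged sketch of what another subclass could look like (a tab-separated parser, purely illustrative; the project itself ships CSV and Excel subclasses):

from .base import BaseFileParser   # same relative layout as csv.py / excel.py


class TSVFileParser(BaseFileParser):
    """Illustrative subclass: parse tab-separated uploads.

    Only generate_rows() has to be provided; BaseFileParser.parse() handles
    size limits, header-to-field mapping and row normalization.
    """
    media_type = 'text/tab-separated-values'

    def generate_rows(self, stream_data):
        text = stream_data.decode('utf-8', errors='replace')
        for line in text.splitlines():
            if not line.strip():
                continue              # skip empty rows, like the CSV parser does
            yield line.split('\t')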

@@ -1,32 +1,13 @@
# ~*~ coding: utf-8 ~*~
#
import json
import chardet
import codecs
import unicodecsv
from django.utils.translation import ugettext as _
from rest_framework.parsers import BaseParser
from rest_framework.exceptions import ParseError, APIException
from rest_framework import status
from common.utils import get_logger
logger = get_logger(__file__)
from .base import BaseFileParser
class CsvDataTooBig(APIException):
status_code = status.HTTP_400_BAD_REQUEST
default_code = 'csv_data_too_big'
default_detail = _('The max size of CSV is %d bytes')
class JMSCSVParser(BaseParser):
"""
Parses CSV file to serializer data
"""
CSV_UPLOAD_MAX_SIZE = 1024 * 1024 * 10
class CSVFileParser(BaseFileParser):
media_type = 'text/csv'
@@ -38,99 +19,10 @@ class JMSCSVParser(BaseParser):
for line in stream.splitlines():
yield line
@staticmethod
def _gen_rows(csv_data, charset='utf-8', **kwargs):
csv_reader = unicodecsv.reader(csv_data, encoding=charset, **kwargs)
def generate_rows(self, stream_data):
detect_result = chardet.detect(stream_data)
encoding = detect_result.get("encoding", "utf-8")
lines = self._universal_newlines(stream_data)
csv_reader = unicodecsv.reader(lines, encoding=encoding)
for row in csv_reader:
if not any(row):  # empty row
continue
yield row
@staticmethod
def _get_fields_map(serializer_cls):
fields_map = {}
fields = serializer_cls().fields
fields_map.update({v.label: k for k, v in fields.items()})
fields_map.update({k: k for k, _ in fields.items()})
return fields_map
@staticmethod
def _replace_chinese_quot(str_):
trans_table = str.maketrans({
'': '"',
'': '"',
'': '"',
'': '"',
'\'': '"'
})
return str_.translate(trans_table)
@classmethod
def _process_row(cls, row):
"""
Row-level processing before building the JSON data
"""
_row = []
for col in row:
# Convert list literals
if isinstance(col, str) and col.startswith('[') and col.endswith(']'):
col = cls._replace_chinese_quot(col)
col = json.loads(col)
# Convert dict literals
if isinstance(col, str) and col.startswith("{") and col.endswith("}"):
col = cls._replace_chinese_quot(col)
col = json.loads(col)
_row.append(col)
return _row
@staticmethod
def _process_row_data(row_data):
"""
Row-data processing after building the JSON data
"""
_row_data = {}
for k, v in row_data.items():
if isinstance(v, list) or isinstance(v, dict)\
or isinstance(v, str) and k.strip() and v.strip():
_row_data[k] = v
return _row_data
def parse(self, stream, media_type=None, parser_context=None):
parser_context = parser_context or {}
try:
view = parser_context['view']
meta = view.request.META
serializer_cls = view.get_serializer_class()
except Exception as e:
logger.debug(e, exc_info=True)
raise ParseError('The resource does not support imports!')
content_length = int(meta.get('CONTENT_LENGTH', meta.get('HTTP_CONTENT_LENGTH', 0)))
if content_length > self.CSV_UPLOAD_MAX_SIZE:
msg = CsvDataTooBig.default_detail % self.CSV_UPLOAD_MAX_SIZE
logger.error(msg)
raise CsvDataTooBig(msg)
try:
stream_data = stream.read()
stream_data = stream_data.strip(codecs.BOM_UTF8)
detect_result = chardet.detect(stream_data)
encoding = detect_result.get("encoding", "utf-8")
binary = self._universal_newlines(stream_data)
rows = self._gen_rows(binary, charset=encoding)
header = next(rows)
fields_map = self._get_fields_map(serializer_cls)
header = [fields_map.get(name.strip('*'), '') for name in header]
data = []
for row in rows:
row = self._process_row(row)
row_data = dict(zip(header, row))
row_data = self._process_row_data(row_data)
data.append(row_data)
return data
except Exception as e:
logger.error(e, exc_info=True)
raise ParseError('CSV parse error!')

View File

@@ -0,0 +1,14 @@
import pyexcel
from .base import BaseFileParser
class ExcelFileParser(BaseFileParser):
media_type = 'text/xlsx'
def generate_rows(self, stream_data):
workbook = pyexcel.get_book(file_type='xlsx', file_content=stream_data)
# Get the first worksheet by default
sheet = workbook.sheet_by_index(0)
rows = sheet.rows()
return rows

View File
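DRF picks a parser from the request's Content-Type, and BaseFileParser.parse() relies on the view exposing get_serializer_class(). A hedged wiring example (the view, serializer and relative import paths are illustrative, not project code):

from rest_framework import serializers
from rest_framework.generics import CreateAPIView

from .csv import CSVFileParser      # assumed layout, as in the __init__.py above
from .excel import ExcelFileParser


class DemoRowSerializer(serializers.Serializer):
    # Column titles in the uploaded file are matched against field labels/names.
    name = serializers.CharField(label='Name')
    ip = serializers.IPAddressField(label='IP')


class DemoImportView(CreateAPIView):
    # POSTing a text/csv or text/xlsx body yields a list of normalized row dicts.
    serializer_class = DemoRowSerializer
    parser_classes = [CSVFileParser, ExcelFileParser]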

@@ -1,6 +1,7 @@
from rest_framework import renderers
from .csv import *
from .excel import *
class PassthroughRenderer(renderers.BaseRenderer):

View File

@@ -0,0 +1,134 @@
import abc
from datetime import datetime
from rest_framework.renderers import BaseRenderer
from rest_framework.utils import encoders, json
from common.utils import get_logger
logger = get_logger(__file__)
class BaseFileRenderer(BaseRenderer):
# Render template flag; one of 'import', 'update', 'export'
template = 'export'
serializer = None
@staticmethod
def _check_validation_data(data):
detail_key = "detail"
if detail_key in data:
return False
return True
@staticmethod
def _json_format_response(response_data):
return json.dumps(response_data)
def set_response_disposition(self, response):
serializer = self.serializer
if response and hasattr(serializer, 'Meta') and hasattr(serializer.Meta, "model"):
filename_prefix = serializer.Meta.model.__name__.lower()
else:
filename_prefix = 'download'
now = datetime.now().strftime("%Y-%m-%d_%H-%M-%S")
filename = "{}_{}.{}".format(filename_prefix, now, self.format)
disposition = 'attachment; filename="{}"'.format(filename)
response['Content-Disposition'] = disposition
def get_rendered_fields(self):
fields = self.serializer.fields
if self.template == 'import':
return [v for k, v in fields.items() if not v.read_only and k != "org_id" and k != 'id']
elif self.template == 'update':
return [v for k, v in fields.items() if not v.read_only and k != "org_id"]
else:
return [v for k, v in fields.items() if not v.write_only and k != "org_id"]
@staticmethod
def get_column_titles(render_fields):
return [
'*{}'.format(field.label) if field.required else str(field.label)
for field in render_fields
]
def process_data(self, data):
results = data['results'] if 'results' in data else data
if isinstance(results, dict):
results = [results]
if self.template == 'import':
results = [results[0]] if results else results
else:
# Limit the number of exported rows
results = results[:10000]
# Converts fields such as UUIDs to strings
results = json.loads(json.dumps(results, cls=encoders.JSONEncoder))
return results
@staticmethod
def generate_rows(data, render_fields):
for item in data:
row = []
for field in render_fields:
value = item.get(field.field_name)
value = str(value) if value else ''
row.append(value)
yield row
@abc.abstractmethod
def initial_writer(self):
raise NotImplementedError
def write_column_titles(self, column_titles):
self.write_row(column_titles)
def write_rows(self, rows):
for row in rows:
self.write_row(row)
@abc.abstractmethod
def write_row(self, row):
raise NotImplementedError
@abc.abstractmethod
def get_rendered_value(self):
raise NotImplementedError
def render(self, data, accepted_media_type=None, renderer_context=None):
if data is None:
return bytes()
if not self._check_validation_data(data):
return self._json_format_response(data)
try:
renderer_context = renderer_context or {}
request = renderer_context['request']
response = renderer_context['response']
view = renderer_context['view']
self.template = request.query_params.get('template', 'export')
self.serializer = view.get_serializer()
self.set_response_disposition(response)
except Exception as e:
logger.debug(e, exc_info=True)
value = 'The resource not support export!'.encode('utf-8')
return value
try:
rendered_fields = self.get_rendered_fields()
column_titles = self.get_column_titles(rendered_fields)
data = self.process_data(data)
rows = self.generate_rows(data, rendered_fields)
self.initial_writer()
self.write_column_titles(column_titles)
self.write_rows(rows)
value = self.get_rendered_value()
except Exception as e:
logger.debug(e, exc_info=True)
value = 'Render error! ({})'.format(self.media_type).encode('utf-8')
return value
return value

View File
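On the rendering side, BaseFileRenderer keys its output on the template query parameter ('import', 'update' or 'export') and names the download after the serializer's model. A hedged client-side illustration of requesting an import template as CSV (the URL and token are made up):

import requests

# format=csv selects the CSV renderer; template=import asks for a one-row
# template containing only writable, non-org_id columns.
resp = requests.get(
    'https://demo.example.com/api/v1/assets/assets/',
    params={'format': 'csv', 'template': 'import'},
    headers={'Authorization': 'Bearer <token>'},  # placeholder token
)
with open('import_template.csv', 'wb') as f:
    f.write(resp.content)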

@@ -1,83 +1,30 @@
# ~*~ coding: utf-8 ~*~
#
import unicodecsv
import codecs
from datetime import datetime
import unicodecsv
from six import BytesIO
from rest_framework.renderers import BaseRenderer
from rest_framework.utils import encoders, json
from common.utils import get_logger
logger = get_logger(__file__)
from .base import BaseFileRenderer
class JMSCSVRender(BaseRenderer):
class CSVFileRenderer(BaseFileRenderer):
media_type = 'text/csv'
format = 'csv'
@staticmethod
def _get_show_fields(fields, template):
if template == 'import':
return [v for k, v in fields.items() if not v.read_only and k != "org_id" and k != 'id']
elif template == 'update':
return [v for k, v in fields.items() if not v.read_only and k != "org_id"]
else:
return [v for k, v in fields.items() if not v.write_only and k != "org_id"]
writer = None
buffer = None
@staticmethod
def _gen_table(data, fields):
data = data[:10000]
yield ['*{}'.format(f.label) if f.required else f.label for f in fields]
def initial_writer(self):
csv_buffer = BytesIO()
csv_buffer.write(codecs.BOM_UTF8)
csv_writer = unicodecsv.writer(csv_buffer, encoding='utf-8')
self.buffer = csv_buffer
self.writer = csv_writer
for item in data:
row = [item.get(f.field_name) for f in fields]
yield row
def set_response_disposition(self, serializer, context):
response = context.get('response')
if response and hasattr(serializer, 'Meta') and \
hasattr(serializer.Meta, "model"):
model_name = serializer.Meta.model.__name__.lower()
now = datetime.now().strftime("%Y-%m-%d_%H-%M-%S")
filename = "{}_{}.csv".format(model_name, now)
disposition = 'attachment; filename="{}"'.format(filename)
response['Content-Disposition'] = disposition
def render(self, data, media_type=None, renderer_context=None):
renderer_context = renderer_context or {}
request = renderer_context['request']
template = request.query_params.get('template', 'export')
view = renderer_context['view']
if isinstance(data, dict):
data = data.get("results", [])
if template == 'import':
data = [data[0]] if data else data
data = json.loads(json.dumps(data, cls=encoders.JSONEncoder))
try:
serializer = view.get_serializer()
self.set_response_disposition(serializer, renderer_context)
except Exception as e:
logger.debug(e, exc_info=True)
value = 'The resource not support export!'.encode('utf-8')
else:
fields = serializer.fields
show_fields = self._get_show_fields(fields, template)
table = self._gen_table(data, show_fields)
csv_buffer = BytesIO()
csv_buffer.write(codecs.BOM_UTF8)
csv_writer = unicodecsv.writer(csv_buffer, encoding='utf-8')
for row in table:
csv_writer.writerow(row)
value = csv_buffer.getvalue()
def write_row(self, row):
self.writer.writerow(row)
def get_rendered_value(self):
value = self.buffer.getvalue()
return value

Some files were not shown because too many files have changed in this diff.