Compare commits

...

385 Commits

Author SHA1 Message Date
halo
fef3211252 fix: 修复节点创建不能指定id问题,兼容api 2022-03-02 16:33:54 +08:00
halo
1627fae941 perf: 优化代码 2022-02-21 16:56:40 +08:00
halo
f95efc2274 fix: 修复keycloak配置转换为openid不生效问题,持久化到数据库 2022-02-21 16:56:40 +08:00
Jiangjie.Bai
bca4b5191e fix: 修复获取命令过滤规则时按照有组织的资源进行组织过滤 2022-02-19 11:39:34 +08:00
Jiangjie.Bai
997850fa61 fix: 修复登录系统用户temp_auth保存时构建的key和获取时不一致的问题 2022-02-18 20:07:02 +08:00
feng626
8d9221ea3d fix: 删除组织关联删除工单 工单流 2022-02-18 17:24:02 +08:00
Jiangjie.Bai
18c388f3a5 Merge pull request #7629 from jumpserver/dev
v2.19.0-rc3
2022-02-17 11:04:33 +08:00
feng626
b088362ae3 perf: 账号备份性能优化 2022-02-17 11:02:49 +08:00
xinwen
c7f8ebb613 fix: 站内信 ServerPerformanceCheck redis 订阅报错 2022-02-17 10:53:36 +08:00
feng626
de2f9ae687 Merge pull request #7626 from jumpserver/pr@dev@fix_i18n
fix: 翻译
2022-02-16 19:19:55 +08:00
xinwen
a806a2d3e6 fix: 翻译 2022-02-16 19:16:45 +08:00
Jiangjie.Bai
7be76feeb0 Merge pull request #7622 from jumpserver/dev
v2.19.0-rc3
2022-02-16 16:42:19 +08:00
ibuler
f548abcb87 perf: 用户临时密码支持加密传输 2022-02-16 16:31:57 +08:00
xinwen
35c6b581e2 feat: 远程应用支持磁盘挂载 2022-02-16 16:29:06 +08:00
feng626
40b119786b Merge pull request #7621 from jumpserver/pr@dev@ticket_flow
fix: 工单流需要超级管理员配置
2022-02-16 14:47:56 +08:00
feng626
ff9347e344 fix: 工单流需要超级管理员配置 2022-02-16 14:37:26 +08:00
feng626
da0bbcee57 fix: 修复资产账号过滤重复问题 2022-02-16 11:35:53 +08:00
ibuler
824d10ce93 pref: connection token 添加 secret 2022-02-16 11:33:25 +08:00
feng626
1d6dcf9fb5 fix: 修复命令报警翻译 2022-02-15 18:33:52 +08:00
Jiangjie.Bai
3325867f8f fix: 修改LDAP SYNC相关参数设置 2022-02-15 17:57:56 +08:00
xinwen
2f43aeee5d fix: Authbook 修改秘钥不生效 2022-02-15 11:14:51 +08:00
Jiangjie.Bai
764d70ea4f fix: 修改用户组名称翻译 2022-02-14 19:27:41 +08:00
Jiangjie.Bai
ff6dbe67a6 Merge pull request #7610 from jumpserver/dev
v2.19.0-rc2
2022-02-14 18:31:52 +08:00
feng626
7f36958683 perf: 工单流只需组织管理员权限 2022-02-14 18:15:11 +08:00
xinwen
fa5921cd86 fix: 去掉 can_replay 兼容代码 2022-02-14 18:14:37 +08:00
feng626
540679df11 fix: 修复operatelog add remove 翻译问题 2022-02-14 14:04:59 +08:00
fit2bot
597b022905 fix: 修复历史会话文件不能被清理的bug (#7604)
* fix: 修复历史会话文件不能被清理的bug

* perf: 删除调试日志

Co-authored-by: halo <wuyihuangw@gmail.com>
2022-02-14 10:58:30 +08:00
jiangweidong
ab34b9906e perf: 切割一次就可以了 2022-02-10 18:34:09 +08:00
jiangweidong
755fa8efa8 perf: 兼容不同版本间JumpServer获取的SAML2协议用户属性 2022-02-10 18:34:09 +08:00
feng626
889713f00e fix: 修复工单流程设置显示其它组织问题 2022-02-10 14:07:32 +08:00
Jiangjie.Bai
c10436de47 Merge pull request #7589 from jumpserver/dev
v2.19.0-rc1
2022-02-10 11:24:28 +08:00
xinwen
333bb64b8b fix: Authbook 取认证信息bug 2022-02-10 11:20:40 +08:00
Michael Bai
67a49dc5e9 feat: syslog 记录的数据来源于序列类 2022-02-08 19:22:08 +08:00
Michael Bai
8085db7acc feat: 增加系统设置(安全)控制第三方认证用户是否进行MFA认证 2022-02-08 17:48:40 +08:00
Michael Bai
6adeafd1d2 fix: 修复创建命令过滤规则失败的问题 2022-02-08 14:21:43 +08:00
xinwen
1faba95a48 fix: 修复 xrdp 连接资产时会生成用户登录日志 2022-02-08 12:38:27 +08:00
吴小白
ab07091eb8 perf: 优化 quick_start.sh 2022-02-08 12:36:43 +08:00
xinwen
f994f5d776 fix: redis 订阅 bug 2022-02-08 12:35:16 +08:00
ibuler
55fae1667d fix: 修复 redis 连接过多的问题 2022-02-08 12:35:16 +08:00
Michael Bai
b3397c6aeb feat: 命令过滤规则增加忽略大小写选项 2022-02-08 12:31:15 +08:00
老广
8ab4f6f004 docs: 添加 pr 提示说明 2022-02-07 15:39:50 +08:00
老广
5a37540eda docs: 修改贡献提示 2022-02-07 15:39:50 +08:00
老广
1c1d507e34 Create CONTRIBUTING.md 2022-02-07 15:39:50 +08:00
老广
9c40314edc Create CODE_OF_CONDUCT.md 2022-02-07 15:39:22 +08:00
feng626
acdde5a236 fix: 密钥密码翻译 2022-01-27 18:19:44 +08:00
Jiangjie.Bai
37a3566b0e Merge pull request #7540 from jumpserver/dev
v2.18
2022-01-20 13:47:13 +08:00
feng626
a81f6bf97b fix: 应用账号attrs字段问题 2022-01-20 13:20:45 +08:00
Michael Bai
b07f532a71 fix: 修复导出应用账号会包含attrs字段的问题 2022-01-20 12:00:18 +08:00
feng626
46270fd91c fix: authbook load asset queryset bug 2022-01-20 11:03:18 +08:00
Jiangjie.Bai
2b364c1476 Merge pull request #7534 from jumpserver/dev
v2.18.0-rc4
2022-01-19 19:36:59 +08:00
feng626
9e47bf6ec5 fix: 批量命令bug 2022-01-19 19:34:32 +08:00
ibuler
89c7e514eb perf: 修复登录和推送系统用户是密码选择 2022-01-19 19:08:38 +08:00
feng626
83c5344307 fix: 获取命令规则过滤不准 2022-01-19 19:07:43 +08:00
feng626
2beec1a03b Merge pull request #7530 from jumpserver/pr@dev@ssh_passphrase_bug
fix: 修改 修改ssh密钥时报500问题
2022-01-19 11:11:49 +08:00
feng626
d53925d1fb Merge pull request #7529 from jumpserver/pr@dev@ticket_bug
fix: 修复工单审批后审批人与记录人不一致问题
2022-01-19 11:11:34 +08:00
feng626
69d4a0a250 fix: 修改 修改ssh密钥时报500问题 2022-01-19 10:39:17 +08:00
feng626
e33a67daf6 fix: 修复工单审批后审批人与记录人不一致问题 2022-01-19 10:33:41 +08:00
Jiangjie.Bai
2036037675 Merge pull request #7527 from jumpserver/dev
v2.18.0-rc3
2022-01-18 19:35:37 +08:00
fit2bot
0b16d70c0c fix: 修改翻译 (#7526)
* fix: 修改翻译

* fix: 修改保存数据库翻译

* fix: 修复翻译为none问题

Co-authored-by: feng626 <1304903146@qq.com>
Co-authored-by: Jiangjie.Bai <32935519+BaiJiangJie@users.noreply.github.com>
2022-01-18 19:31:01 +08:00
Michael Bai
8d6498c8f0 fix: 升级依赖 jms-storage==0.0.41 2022-01-18 17:01:02 +08:00
Michael Bai
59d0f53725 fix: 修改审计日志操作日志动作翻译问题 2022-01-18 15:21:35 +08:00
Michael Bai
518caa4f7a fix: 修改应用账号APP名称的VerboseName 2022-01-18 15:03:23 +08:00
Michael Bai
b8aaa7d249 fix: 修改命令过滤器翻译 2022-01-18 14:27:22 +08:00
Michael Bai
c9f63a3f4a fix: 修改审计日志中的i18n翻译问题 2022-01-18 13:21:31 +08:00
xinwen
ec68fc9562 fix: 绑定三方登录消息翻译 2022-01-18 10:39:35 +08:00
xinwen
561ce53413 fix: 空的 es 存储,指定 timestamp 排序报错 2022-01-18 10:14:51 +08:00
Jiangjie.Bai
6bd597eadd Merge pull request #7511 from jumpserver/dev
v2.18.0-rc2
2022-01-17 19:21:39 +08:00
feng626
005032ea00 fix: 修复账号备份存在外键search 500 bug 2022-01-17 19:21:13 +08:00
Michael Bai
de74477fd6 fix: 修改翻译文件 2022-01-17 19:11:01 +08:00
fit2bot
c43ad981bd perf: 优化写法 (#7498)
* fix: 修复登录页输入 mfa 时不支持 某 mfa 的错误提示

fix tapd 1145454465001008371

* perf: 优化 send code api,避免暴力常识

* perf: 优化写法

* Update mfa.py

Co-authored-by: ibuler <ibuler@qq.com>
Co-authored-by: Jiangjie.Bai <32935519+BaiJiangJie@users.noreply.github.com>
2022-01-17 19:05:01 +08:00
xinwen
088676d998 fix: 计算工单序号的时候切换到root组织 2022-01-17 16:22:04 +08:00
jiangweidong
106bb9b63d perf: SAML2属性映射去掉域名前缀 2022-01-17 16:06:37 +08:00
ibuler
451e61bb61 perf: 优化 celery 任务中的数据库连接问题
perf: add comment
2022-01-17 16:04:50 +08:00
halo
e1de210585 feat: 创建用户邮件内容支持html标签 2022-01-17 16:03:04 +08:00
xinwen
cce34f4939 fix: 用户登录失败时,登陆日志中的认证方式不准确 2022-01-17 16:02:28 +08:00
fit2bot
def9bedd30 perf: 账号备份优化 (#7503)
* perf: 账号备份优化

* feat: 优化账号备份获取有序备份字段列表

Co-authored-by: feng626 <1304903146@qq.com>
Co-authored-by: Michael Bai <baijiangjie@gmail.com>
2022-01-17 15:58:39 +08:00
feng626
d2b3edffca Merge pull request #7505 from jumpserver/pr@dev@ticket_bug
fix: 修复工单审批后state不准问题
2022-01-17 15:51:49 +08:00
Michael Bai
aea2db84e8 fix: 添加deb依赖: libaio-dev freetds-dev 2022-01-17 13:57:08 +08:00
feng626
6a4ffcc9ad fix: 修复工单审批后state不准问题 2022-01-16 17:49:17 +08:00
fit2bot
4343b6487d fix: 修复登录页输入 mfa 时不支持 某 mfa 的错误提示 (#7495)
* fix: 修复登录页输入 mfa 时不支持 某 mfa 的错误提示

fix tapd 1145454465001008371

* perf: 优化 send code api,避免暴力常识

Co-authored-by: ibuler <ibuler@qq.com>
2022-01-13 19:01:12 +08:00
Michael Bai
ba695c4600 fix: 抽象permissions-serializer的actions字段值设置 2022-01-13 18:58:10 +08:00
Michael Bai
21a34ddc03 feat: 返回License信息 2022-01-13 18:55:44 +08:00
ibuler
145c7952c9 perf: 优化 ops 任务,支持 i18n
- 内置任务名称支持了 i18n,数据库存英文,返回是翻译一下
- 任务执行中,添加 language 上下文
2022-01-13 15:03:48 +08:00
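A minimal sketch of the pattern this commit message describes (store the English task name in the database, translate it only when it is returned, and activate a language context while the task runs). Illustrative only, assuming Django's i18n utilities; the task name and function names are hypothetical.

# Illustrative sketch, not JumpServer's actual code.
from django.utils import translation
from django.utils.translation import gettext, gettext_noop

# The canonical English name is what gets persisted to the database.
TASK_NAME = gettext_noop("Test assets connectivity")  # hypothetical task name

def display_name(name):
    # Translate only when the name is returned to the caller.
    return gettext(name)

def run_task(language):
    # Activate a language context for the duration of the task execution.
    with translation.override(language):
        ...  # task body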
xinwen
43c4c78378 fix: 工单序号添加排序 2022-01-13 14:37:02 +08:00
Michael Bai
620834c817 fix: 修改工单迁移文件warining问题
Any fields that are stored with VARCHAR column types may have their max_length restricted to 255 characters if you are using unique=True for the field.
See: https://docs.djangoproject.com/en/3.1/ref/databases/#mysql-character-fields
2022-01-13 14:36:30 +08:00
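The warning quoted above is Django's MySQL constraint: a CharField stored as VARCHAR with unique=True may need its max_length capped at 255 characters so the unique index fits. A minimal sketch of a field that satisfies it (model and field names are hypothetical):

# Illustrative only: keep max_length <= 255 for unique CharFields on MySQL.
from django.db import models

class Ticket(models.Model):  # hypothetical model
    serial_num = models.CharField(max_length=128, unique=True, null=True)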
Jiangjie.Bai
fbd0b44d4f Merge pull request #7490 from jumpserver/dev
v2.18.0-rc1
2022-01-12 20:58:04 +08:00
xinwen
4b44a54d97 fix: 腾讯验证码测试不通过 2022-01-12 20:57:22 +08:00
Eric
6716acb4ff perf: 更新安全英文版本 2022-01-12 20:56:58 +08:00
Jiangjie.Bai
35722a8466 Merge pull request #7487 from jumpserver/dev
v2.18.0-rc1
2022-01-12 20:56:33 +08:00
feng626
e5659a1d07 fix: 修复翻译及迁移文件 2022-01-12 20:55:58 +08:00
xinwen
414137cd73 fix: 三方认证绑定发通知文案 2022-01-12 20:34:19 +08:00
xinwen
1e3a15a3d0 feat: 三方认证绑定发通知 2022-01-12 20:23:25 +08:00
ibuler
dde7559f47 feat: node api 支持使用 full_value 递归创建 2022-01-12 20:23:16 +08:00
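The commit message names the technique (recursive creation from a full_value path); a rough sketch of that idea, under the assumption that nodes form a parent/child tree keyed by path segments (field and function names are hypothetical, not the project's API):

# Illustrative sketch: create missing ancestors one path segment at a time.
def create_node_by_full_value(Node, full_value):
    parent = None
    for name in [part for part in full_value.split('/') if part]:
        parent, _ = Node.objects.get_or_create(value=name, parent=parent)
    return parent  # the deepest node, e.g. for "/Default/web/nginx"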
feng626
b0932e5137 feat: 逃生通道 2022-01-12 20:22:36 +08:00
xinwen
5cdc4c3c28 feat: 工单添加序号 2022-01-12 10:38:07 +08:00
ibuler
25a2290804 perf: 优化迁移数据库错误提示 2022-01-10 17:32:01 +08:00
Michael Bai
4405abbedf feat: 2022 2022-01-04 12:54:12 +08:00
halo
3f6c9d519e fix: 修复通知用户过期时间bug 2021-12-31 17:26:23 +08:00
xinwen
bd84edea62 feat: 资产登录工单页面增加监控与中断 2021-12-31 17:23:39 +08:00
feng626
100bfe0304 feat: 新增获取k8s pod namespace container接口 2021-12-31 15:54:59 +08:00
feng626
11fd9a6567 feat: 配置私钥密码 2021-12-31 14:47:01 +08:00
Michael Bai
90299a508c fix: 授权actions serializer 设置default值 2021-12-30 17:25:00 +08:00
Michael Bai
de9516dee5 feat: 应用授权增加Action动作控制 2021-12-30 14:47:58 +08:00
jiangweidong
db01a6d48d fix: 因为admin_user表中有历史数据,导致组织无法删除 2021-12-30 10:48:07 +08:00
halo
c2d271f00b feat: rdp协议新增username表达式,匹配更多特殊字符 2021-12-30 10:01:26 +08:00
halo
90de229631 feat: rdp协议username支持输入中文和$ 2021-12-30 10:01:26 +08:00
xinwen
35a10fdd62 feat: 命令记录接口增加 remote_addr 字段 2021-12-28 18:26:21 +08:00
xinwen
89cad224c5 fix: SSO 限制用户删除 2021-12-28 18:24:32 +08:00
jiangweidong
9b2e1e08d8 feat: 将redis_acl协议合并到redis协议中 2021-12-28 18:15:00 +08:00
jiangweidong
aaaa87dd60 feat: 增加数据库redis的纳管功能 2021-12-28 18:15:00 +08:00
吴小白
24cad76232 perf: 更新基础镜像 2021-12-28 18:12:10 +08:00
吴小白
040f1fcbb8 perf: 添加 quick_start.sh 2021-12-27 12:32:58 +08:00
Eric
d76ef788d2 perf: 修改 Jmservisor 的下载地址名称 2021-12-24 11:35:47 +08:00
feng626
38c870a0b5 perf: 工单流对应受理人懒加载 2021-12-21 18:20:02 +08:00
Michael Bai
ac9e7e9b40 fix: 下载页面添加Jmservisor拉起脚本 2021-12-17 15:54:32 +08:00
Michael Bai
5aed04d58b fix: 修改SAML2.0 Logo 2021-12-17 15:25:15 +08:00
jiangweidong
c05951b6c4 feat: 添加数据库SQLServer改密功能的python依赖库 2021-12-17 11:20:09 +08:00
ibuler
107215aadc workflow: 添加同步 gitee 的 workflow 2021-12-17 11:19:47 +08:00
Jiangjie.Bai
d27947919b Merge pull request #7404 from jumpserver/dev
v2.17.0 rc4
2021-12-15 22:03:19 +08:00
ibuler
b92530f0b9 perf: 优化通知,对支持 markdown 的发 markdown 2021-12-15 20:58:33 +08:00
ibuler
b113eeb1d6 perf: 修复基本设置中的 base_site 可能带 / 的问题 2021-12-15 20:55:06 +08:00
ibuler
eb3165f8e7 perf: 优化 saml 2021-12-15 20:54:34 +08:00
xinwen
93d77b2312 fix: 执行 ansible 任务日志记录错误 2021-12-15 18:02:14 +08:00
Michael Bai
ae1d419f70 fix: 修复用户组命令过滤规则获取不到的问题 2021-12-15 16:34:33 +08:00
feng626
36054cb64c Merge pull request #7397 from jumpserver/pr@dev@translate
feat: translate
2021-12-15 16:21:52 +08:00
feng626
022e93d7a2 feat: translate 2021-12-15 16:15:21 +08:00
Michael Bai
ac14790960 fix: 修改命令过滤规则动作允许值1->9 2021-12-15 14:57:44 +08:00
feng626
facc231e3f Merge pull request #7394 from jumpserver/pr@dev@ticket_filter_bug
perf: ticket filter
2021-12-15 14:40:28 +08:00
feng626
e4178849ba perf: ticket filter 2021-12-15 14:23:07 +08:00
Jiangjie.Bai
151d897746 Merge pull request #7391 from jumpserver/dev
v2.17.0 rc3
2021-12-14 21:58:27 +08:00
xinwen
27588f48d8 fix: authbook 创建的时候没有自动推送 2021-12-14 20:07:16 +08:00
Michael Bai
303102d40b fix: 删除一些启动时大量输出的debug日志 2021-12-14 20:06:49 +08:00
xinwen
ff9632d6da fix: LDAP 测试连接报错 2021-12-14 20:04:41 +08:00
Michael Bai
4cbf6dd5e6 fix: 修复获取系统用户认证信息API 2021-12-14 19:12:22 +08:00
xinwen
4f889cfe36 fix: 带 ad_domain 的系统用户不应该支持推送 2021-12-14 18:05:34 +08:00
ibuler
0c2e5f3f2a perf: 优化 saml2 i18n 2021-12-14 18:01:08 +08:00
ibuler
19ecc7fef6 fix: 修复 otp 返回时报错 2021-12-14 17:50:33 +08:00
ibuler
0020343ae0 perf: 优化运行的命名 2021-12-14 17:47:10 +08:00
ibuler
fa5c433c7c perf: 优化启动脚本,避免启动超时 2021-12-14 17:47:10 +08:00
jiangweidong
5139f9c4b9 feat: saml2认证支持https协议 2021-12-14 17:12:43 +08:00
Jiangjie.Bai
d6aad41d05 Merge pull request #7373 from jumpserver/dev
v2.17.0 rc2
2021-12-13 19:47:33 +08:00
xinwen
b3acc0d451 fix: 去掉 get_docker_mem_usage_if_limit 日志 2021-12-13 18:09:55 +08:00
ibuler
330917df4c fix: 修复 saml2 登陆的问题 2021-12-13 17:52:18 +08:00
ibuler
21d4ffa33b perf: 优化 cas 回调地址翻译 2021-12-13 16:33:00 +08:00
xinwen
1bc14c53d4 fix: 企业微信发送报错 2021-12-13 16:32:09 +08:00
ibuler
5024d0d739 perf: 优化 saml2 log 2021-12-13 16:30:38 +08:00
jiangweidong
c5013dcbd6 feat: saml2协议单点登录支持在页面上配置saml2协议的高级配置 (#7362) 2021-12-13 10:54:14 +08:00
halo
84924bc3f6 fix: 修复会话详情页命令列表排序失效bug 2021-12-13 10:52:37 +08:00
Michael Bai
2de769ecad chore: 修改依赖版本: jms-storage==0.0.40 2021-12-13 10:47:09 +08:00
feng626
ab68aed98f Merge pull request #7363 from jumpserver/pr@dev@ticket_bug
perf: 工单根据选择组织自动选择流
2021-12-13 10:25:48 +08:00
feng626
9b9e32d2f4 perf: 工单根据选择组织自动选择流 2021-12-11 12:50:30 +08:00
ibuler
b6249e9a63 fix: 修复登录时跳转问题 2021-12-10 17:49:29 +08:00
Jiangjie.Bai
5f7fa7e02f Merge pull request #7355 from jumpserver/dev
v2.17.0 rc1
2021-12-09 20:57:02 +08:00
Michael Bai
74fe9dddb7 fix: 修改i18n 2021-12-09 20:35:52 +08:00
ibuler
ade3f3ae6c fix: 修复需要设置 mfa 的 bug 2021-12-09 20:24:02 +08:00
Michael Bai
1c3b574c46 fix: 修复系统用户命令过滤规则必填的问题 2021-12-09 20:23:26 +08:00
Michael Bai
92aeae4a2a fix: 修改i18n
fix: 修改i18n
2021-12-09 20:23:07 +08:00
xinwen
ea6c28fa0d fix: 修改 cgroup memory log 2021-12-09 20:22:34 +08:00
fit2bot
f44acc923b perf: 修改 otp 的显示,避免误会 (#7345)
perf: 还是用 OTP 吧

Co-authored-by: ibuler <ibuler@qq.com>
2021-12-09 19:33:11 +08:00
ibuler
f27df2f523 fix: 修复 openstack 同步依赖 2021-12-09 19:28:22 +08:00
Michael Bai
bbf2aff96c feat: 命令过滤器支持多维度绑定
feat: 修改命令过滤规则校验

feat: 修改系统用户命令过滤器迁移文件
2021-12-09 19:06:28 +08:00
ibuler
3ea4a9fbe6 chore: 修改 html 中 license 文案 2021-12-09 17:33:56 +08:00
fit2bot
3962af7c4f feat: 支持saml2协议的单点登录,合并代码 (#7347)
* fix: 支持saml2协议的单点登录

* feat: 支持saml2协议的单点登录,合并代码

Co-authored-by: jiangweidong <weidong.jiang@fit2cloud.com>
2021-12-09 15:47:21 +08:00
ibuler
38c6d11af1 perf: 修改关闭 issue 的 comment 2021-12-09 15:31:27 +08:00
fit2bot
5f45acf68f workflow: 修改 workflow (#7341)
* workflow: 修改 workflow

* fix: 修复判断的错误

* perf: 添加移除的内容

Co-authored-by: ibuler <ibuler@qq.com>
2021-12-09 12:07:55 +08:00
fit2bot
c7f2afc3c1 fix: redis lock bug (#7338)
Co-authored-by: feng626 <1304903146@qq.com>
2021-12-08 17:55:32 +08:00
xinwen
16fae00e0e feat: xrdp 远程应用 2021-12-08 15:55:49 +08:00
Eric_Lee
0afeed0ff1 feat: 录像增加 cast 格式 (#7259)
* feat: 录像增加 cast 格式

* perf: 优化一下写法

* perf: 修改一点点

Co-authored-by: ibuler <ibuler@qq.com>
2021-12-08 15:35:22 +08:00
Michael Bai
d72aa34513 fix: 修改命令过滤规则生成正则表达式的逻辑,提高命令过滤准确性 2021-12-08 10:24:33 +08:00
feng626
d56de16563 fix: 依赖bug 2021-12-07 19:01:14 +08:00
ibuler
8536e6fca7 perf: 添加 issues workflow 2021-12-07 18:29:53 +08:00
feng626
adb9f01231 feat: 密码计划邮件提醒 2021-12-07 17:00:33 +08:00
ibuler
e0d4ad8570 workflow: 优化workflow 2021-12-07 10:40:13 +08:00
ibuler
3c8726f1b0 fix: 去掉多余的 collect static
perf: 使用bash 才能保证 subprocess 运行正常
2021-12-06 19:03:23 +08:00
ibuler
6b4bc1bdb5 chore: 升级 License 到 GPLv3
chore: 升级 License 到 GPLv3
2021-12-06 18:15:17 +08:00
ibuler
c92d36391f chore: change license 2021-12-06 17:45:54 +08:00
xinwen
24bbe7c31f fix: 计算docker limit 内存使用率时区分 cgroup v1 v2 2021-12-06 11:22:45 +08:00
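The distinction this commit refers to is where the container memory limit is exposed: cgroup v1 and v2 publish it under different paths. A minimal sketch of reading the limit (illustrative, not the project's implementation):

# Illustrative sketch: pick the cgroup path by which file exists.
import os

V2_LIMIT = "/sys/fs/cgroup/memory.max"                    # cgroup v2 (unified hierarchy)
V1_LIMIT = "/sys/fs/cgroup/memory/memory.limit_in_bytes"  # cgroup v1

def container_memory_limit():
    path = V2_LIMIT if os.path.exists(V2_LIMIT) else V1_LIMIT
    try:
        value = open(path).read().strip()
    except OSError:
        return None
    if value == "max":  # cgroup v2 reports "max" when no limit is set
        return None
    return int(value)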
fit2bot
77f2bd9262 perf: 优化workflow名称 (#7323)
* workflow: 添加定时检查issue的workflow

* perf: 优化workflow名称

Co-authored-by: ibuler <ibuler@qq.com>
2021-12-06 10:09:58 +08:00
ibuler
495a6fbc95 workflow: 添加定时检查issue的workflow 2021-12-03 19:25:56 +08:00
xinwen
36c6299a03 fix: redis 锁在redis集群环境报错 2021-12-03 19:23:44 +08:00
ibuler
8a31fbe35e chore: 添加判断 pull request 2021-12-03 15:22:13 +08:00
ibuler
dbe628a8db chore: 添加 issue 备注添加labels 2021-12-03 15:22:13 +08:00
ibuler
e45b893662 chore: 添加 workflow 检查issue 2021-12-02 17:50:31 +08:00
feng626
ac4b6e5b96 feat: 工单创建资产添加节点 2021-11-30 16:38:30 +08:00
ibuler
c85249be36 perf: 优化订阅处理,形成框架 2021-11-29 15:01:37 +08:00
ibuler
de006bc664 perf: 资产添加标签显示,更加易读
pref: 添加翻译
2021-11-29 14:57:50 +08:00
xinwen
a313e011ef perf: 内存使用率兼顾 docker cgroup limit 2021-11-29 14:53:04 +08:00
Michael Bai
3076b7a5e1 fix: 修复migrations序号重复问题 2021-11-29 11:24:12 +08:00
老广
c47a588e7c Merge pull request #7289 from jumpserver/pr@dev@merge_with_master
Merge branch 'dev' of github.com:jumpserver/jumpserver into dev
2021-11-26 11:20:39 +08:00
ibuler
7f61c49db2 Merge branch 'dev' of github.com:jumpserver/jumpserver into dev 2021-11-25 20:00:34 +08:00
ibuler
c8a76c9afb fix: 修复 close connection 的问题 2021-11-25 18:29:01 +08:00
ibuler
3e85a9a079 Merge branch 'dev' of github.com:jumpserver/jumpserver into dev 2021-11-25 18:05:18 +08:00
ibuler
62cbca7eb2 fix: 修复重置 mfa 的提示问题 2021-11-25 17:50:47 +08:00
ibuler
434793bc41 Merge branch 'dev' of github.com:jumpserver/jumpserver into dev 2021-11-25 17:44:51 +08:00
xinwen
85a017b2c4 fix: 按资产ip搜索数据不全 2021-11-25 17:29:47 +08:00
ibuler
84bce19a92 perf: 去掉登录页面更好 2021-11-25 16:02:40 +08:00
ibuler
3874b8ed67 Merge branch 'dev' of github.com:jumpserver/jumpserver into dev
repr@dev_v2.16_v2.15@2ee26971c9b152@fix_xxxx
merges an updated upstream into a topic branch.
2021-11-25 15:58:33 +08:00
ibuler
7f0ad7e27f fix: 修复 oidc cas 登录时跳转问题
perf: 优化一波,容易debug

perf: 还原回来的世界
2021-11-25 15:01:42 +08:00
ibuler
2ee26971c9 Merge branch 'dev' of github.com:jumpserver/jumpserver into dev 2021-11-24 20:41:41 +08:00
ibuler
5ace5a752e fix: 修复 cas/oidc 登录 MFA 产生的bug
perf: 优化更严谨
2021-11-24 19:05:15 +08:00
Michael Bai
e838d974c3 fix: 修复根据节点/资产查询授权时报错的问题 2021-11-24 19:04:42 +08:00
ibuler
9dd93ea3b8 Merge branch 'dev' of github.com:jumpserver/jumpserver into dev 2021-11-24 16:15:57 +08:00
jiangweidong
88ea7ae149 feat: 云管同步中心支持OpenStack资产同步,所需python库依赖 2021-11-24 14:28:11 +08:00
ibuler
1132c6a4e4 fix: 修复 mfa radius 登录的bug 2021-11-24 12:51:12 +08:00
ibuler
0762f87ba5 Merge branch 'master' into dev 2021-11-24 11:25:51 +08:00
chenqiao
cb98b4f80a feat:新增对sqlserver数据库托管 2021-11-24 11:21:04 +08:00
feng626
599d58e9f2 fix: 修复重置mfa 500 bug 2021-11-23 16:11:54 +08:00
ibuler
5c6f1730d5 perf: 系统用户密码不再trim空格 2021-11-23 10:41:58 +08:00
feng626
67d084fc43 perf: ip list 优化 2021-11-19 16:36:58 +08:00
feng626
24d0d52a0a perf: 优化ip黑白名单 2021-11-19 11:48:45 +08:00
xinwen
d94c515cfc fix: 验证码逻辑错误 2021-11-19 10:46:59 +08:00
Michael Bai
cae956f9a5 fix(i18n): 修改翻译文件 2021-11-18 19:07:14 +08:00
Jiangjie.Bai
2c70e117c6 Merge pull request #7229 from jumpserver/dev
v2.16
2021-11-18 18:51:57 +08:00
ibuler
70e8a03888 fix: 修复 mfa 登录时的 focus 错误 2021-11-18 18:49:22 +08:00
fit2bot
d2df8acd84 fix: 修复数据库连接没有关闭问题 (#7227)
* fix: 修复数据库连接没有关闭的bug

perf: websocket 断开也添加关闭数据库连接

* fix: 修复数据库连接没有关闭问题

Co-authored-by: ibuler <ibuler@qq.com>
2021-11-18 18:47:01 +08:00
Michael Bai
6e5dcc738e fix: 修改ldap分页日志为debug模式 2021-11-18 18:07:51 +08:00
Jiangjie.Bai
d1ccb4af15 Merge pull request #7224 from jumpserver/dev
v2.16.0 rc4
2021-11-18 17:05:35 +08:00
fit2bot
086ecfc046 perf: 优化全局ip限制逻辑 (#7220)
* perf: 优化全局ip限制逻辑

* perf: 优化全局ip限制逻辑 2

* perf: 优化全局ip限制逻辑 3

Co-authored-by: feng626 <1304903146@qq.com>
Co-authored-by: Michael Bai <baijiangjie@gmail.com>
2021-11-18 16:55:23 +08:00
Jiangjie.Bai
8f3765a98d Merge pull request #7217 from jumpserver/dev
v2.16.0 rc3
2021-11-18 13:02:51 +08:00
Michael Bai
2d6610b133 fix(settings): 修复LDAP系统设置中使用search_ou查询前执行strip操作 2021-11-18 13:01:20 +08:00
Jiangjie.Bai
5a82174c54 Merge pull request #7214 from jumpserver/dev
v2.16.0 rc2
2021-11-17 19:42:25 +08:00
Michael Bai
dac45f234a fix: 终端列表指标数值保留一位小数 2021-11-17 19:36:52 +08:00
Michael Bai
096b2cf8b8 fix: 修改翻译 2021-11-17 19:36:10 +08:00
fit2bot
0ed0f69be0 fix: 登录日志字段翻译 (#7211)
Co-authored-by: xinwen <coderWen@126.com>
Co-authored-by: Jiangjie.Bai <32935519+BaiJiangJie@users.noreply.github.com>
2021-11-17 19:28:58 +08:00
fit2bot
8af88cd2c6 fix: 修复otp verify msg引起的500 (#7210)
Co-authored-by: ibuler <ibuler@qq.com>
2021-11-17 16:43:04 +08:00
feng626
2a9f0f8dcf perf: help信息迁移到public中 2021-11-17 15:44:50 +08:00
ibuler
cc2d47e6dc perf: 修复首页登录mfa错误提示 2021-11-16 11:33:40 +08:00
xinwen
22c9dfc0f2 fix: 系统用户 actions 2021-11-16 11:33:00 +08:00
fit2bot
ed01f2f1fb fix: 修复xrdp连接时报错 (#7202)
* fix: 修复xrdp连接时报错

perf: 添加注释

* perf: 去掉import

Co-authored-by: ibuler <ibuler@qq.com>
2021-11-16 11:32:25 +08:00
ibuler
c5ff0d972b perf: 去掉mfa input 的required,以为用户可能没有设置 2021-11-15 17:01:12 +08:00
ibuler
7d9da9ff66 perf: 修改名称 2021-11-15 17:01:12 +08:00
fit2bot
d6395b64fa merge: with dev (#7195)
* fix: 修复登录mfa时,选择了禁用的

* fix: mfa focus

Co-authored-by: ibuler <ibuler@qq.com>
2021-11-15 16:51:05 +08:00
xinwen
9bfbdea508 fix: 终端删除后,录像存储不能删除问题 2021-11-15 16:35:02 +08:00
ibuler
cb1c906db4 fix: 修复登录时没有绑定mfa,没有跳转的问题
fix: 首页登录如果没有则后续登录
2021-11-15 16:20:58 +08:00
ibuler
12a0096963 fix: 修复重置mfa的bug 2021-11-15 16:19:30 +08:00
ibuler
a315e8888b perf: 优化一下 win2016 不再内置 2021-11-15 16:18:31 +08:00
ibuler
454d3cba96 fix: 修复登录mfa时,选择了禁用的 2021-11-15 16:17:02 +08:00
feng626
4a786baf4e perf: 工单提示信息优化 2021-11-12 18:00:32 +08:00
xinwen
187977c04a fix: 修改 CPU info 字段 2021-11-12 17:09:42 +08:00
xinwen
4260fe1424 fix: 资产 cpu_info 翻译不对 2021-11-12 16:56:19 +08:00
Michael Bai
7286b1b09e fix: 移动【会话设置保留时间】至定期清理设置页面 2021-11-12 16:54:17 +08:00
xinwen
9a7919f3ac fix: 更新公钥消息不对 2021-11-12 16:44:23 +08:00
xinwen
43cbf4f6a9 fix: 创建存储时名称重复报错 2021-11-12 16:31:14 +08:00
Eric
e9dc1ad86a fix: 修复 ssh 工具连接无法二级登录的问题 2021-11-12 16:14:14 +08:00
feng626
90477146ed feat: 添加全局ip黑名单 2021-11-12 14:56:25 +08:00
ibuler
353b66bf8f fix: 修复创建token的问题 2021-11-11 19:10:28 +08:00
Jiangjie.Bai
3051996e35 Merge pull request #7169 from jumpserver/dev
v2.16.0 rc1
2021-11-11 14:00:33 +08:00
ibuler
8761ed741c pref: 资产的硬件信息可以更改 2021-11-11 13:31:18 +08:00
ibuler
0facd8a25e perf: 去掉官网的链接 2021-11-11 13:03:35 +08:00
xinwen
b28ce9de7a feat: xrdp session bpp 读取环境变量 2021-11-11 10:42:17 +08:00
jiangweidong
f2b72aae37 feat: 支持管理员配置导航栏上帮助中的url 2021-11-10 17:06:16 +08:00
ibuler
b001443f34 pref: 修改为内置 2021-11-10 16:59:08 +08:00
fit2bot
17303c0550 pref: 优化MFA (#7153)
* perf: 优化mfa 和登录

* perf: stash

* stash

* pref: 基本完成

* perf: remove init function

* perf: 优化命名

* perf: 优化backends

* perf: 基本完成优化

* perf: 修复首页登录时没有 toastr 的问题

Co-authored-by: ibuler <ibuler@qq.com>
Co-authored-by: Jiangjie.Bai <32935519+BaiJiangJie@users.noreply.github.com>
2021-11-10 11:30:48 +08:00
xinwen
bac974b4f2 feat: 异地登录提醒可配置是否启用 2021-11-09 14:35:54 +08:00
Michael Bai
f9e970f4ed fix: 修改文案 切换用户 -> 用户切换 2021-11-05 16:41:45 +08:00
Jiangjie.Bai
07c60ca75d feat: 支持二级登录资产 (#7143)
* feat: 支持su切换系统用户

* feat: 支持su切换系统用户

* feat: 支持su切换系统用户
2021-11-05 16:11:29 +08:00
ibuler
bbe2678df3 perf: 优化用户创建邮件信息,支持部分标签 2021-11-04 13:45:15 +08:00
ibuler
d57f52ee24 feat: 添加 window RDP TLS 平台 2021-11-04 11:05:50 +08:00
feng626
55775f0deb perf: 异地登陆去掉本地判断 2021-11-02 21:11:17 +08:00
xinwen
fde118021e fix: Radius 配置 2021-11-02 20:13:14 +08:00
ibuler
d484885762 perf: 修复ansible stdout 中非utf8的问题 2021-11-02 20:12:48 +08:00
feng626
8071f45f92 fix: 修复xpack引入包问题 2021-10-29 16:23:49 +08:00
Jiangjie.Bai
15cdd44c6c Merge pull request #7107 from jumpserver/dev
v2.15.0
2021-10-28 17:45:42 +08:00
feng626
d44f90ea3d fix: acl xpack 2021-10-28 17:38:55 +08:00
Jiangjie.Bai
42e9fbf37a Merge pull request #7103 from jumpserver/dev
v2.15.0
2021-10-28 15:31:44 +08:00
ibuler
d44656aa10 fix: 修复修改用户信息后,绑定丢失
perf: 还原提示文案
2021-10-28 15:29:55 +08:00
feng626
09d4228182 Merge pull request #7102 from jumpserver/pr@dev@ticket_tips
perf: 工单错误提示优化
2021-10-28 15:25:42 +08:00
feng626
3ae8da231a perf: 工单错误提示优化 2021-10-28 15:16:52 +08:00
feng626
2e4b6d150a fix: 登录复合浏览器不兼容 2021-10-28 15:02:54 +08:00
Michael Bai
8542d53aff fix: 修复ldap用户导入时字段进行strip 2021-10-28 15:02:33 +08:00
Michael Bai
141dafc8bf fix: 创建授权/添加资产到节点;不再自动更新资产的管理用户 2021-10-28 13:39:27 +08:00
Michael Bai
28a6024b49 fix: 修改工单列表默认按照创建日志倒叙排序,修改翻译 2021-10-28 13:38:10 +08:00
ibuler
6cf2bc4baf pref: 优化工单验证 2021-10-28 13:37:24 +08:00
xinwen
631f802961 fix: 钉钉报错的时取得字段不对 2021-10-28 13:31:27 +08:00
feng626
0eedda748a perf: acl username unified 2021-10-28 13:31:05 +08:00
Jiangjie.Bai
7183f0d274 Merge pull request #7089 from jumpserver/dev
v2.15.0-rc3
2021-10-27 19:24:42 +08:00
feng626
c93ab15351 fix: acl migrate bug 2021-10-27 11:31:16 +08:00
Michael Bai
3fffd667dc fix: 修改系统用户、工单列表排序字段 2021-10-27 11:16:01 +08:00
fit2bot
11fd2afa3a perf: 优化下载页面 (#7082)
Co-authored-by: ibuler <ibuler@qq.com>
Co-authored-by: Jiangjie.Bai <32935519+BaiJiangJie@users.noreply.github.com>
2021-10-26 17:27:01 +08:00
Michael Bai
1eb59b11da fix: 添加Model的verbose_name属性 2021-10-26 17:22:05 +08:00
feng626
a1e2d4ca57 Merge pull request #7081 from jumpserver/pr@dev@command_record_filter
fix: 命令记录-导出选择项命令:却导出了所有命令
2021-10-26 16:54:27 +08:00
feng626
8a413563be fix: 命令记录-导出选择项命令:却导出了所有命令 2021-10-26 16:50:14 +08:00
ibuler
203a01240b perf: 优化dockerfile 2021-10-26 15:48:17 +08:00
Michael Bai
37fef6153a fix: 修改依赖包版本: jumpserver-django-oidc-rp==0.3.7.8 2021-10-26 15:16:24 +08:00
feng626
ba6c49e62b Merge pull request #7077 from jumpserver/pr@dev@app_account_translate
fix: 【【账号管理】应用账号导出字段存在英文】
2021-10-26 13:42:31 +08:00
feng626
da79f8beab fix: 【【账号管理】应用账号导出字段存在英文】 2021-10-26 13:34:59 +08:00
ibuler
3f72c02049 perf: 添加命令org 2021-10-26 11:49:56 +08:00
feng626
380226a7d2 fix: 用户登陆mfa code为空限制 2021-10-26 11:01:12 +08:00
fit2bot
f88e5de3c1 perf: 优化用户创建邮件 (#7072)
* perf: 优化通知中的连接点击

* perf: 优化用户创建邮件

* perf: 优化时间日期

Co-authored-by: ibuler <ibuler@qq.com>
2021-10-26 10:52:23 +08:00
Michael Bai
7c149fe91b fix: 修复系统用户applications_amount字段只读 2021-10-25 16:25:12 +08:00
Jiangjie.Bai
f673fed706 Merge pull request #7068 from jumpserver/dev
v2.15.0-rc2
2021-10-25 15:07:25 +08:00
Michael Bai
84326cc999 fix: 系统用户列表添加 应用数量 字段 2021-10-25 15:01:29 +08:00
xinwen
7d56678a8e fix: 修复登录次数出现 0 2021-10-25 14:30:48 +08:00
feng626
25d1b71448 fix: 修复分时登陆bug 2021-10-25 14:26:47 +08:00
feng626
d3920a0cc9 Merge pull request #7065 from jumpserver/pr@dev@ticket_bug
fix: 修复工单发邮件及授权资产备注信息修改
2021-10-25 11:32:01 +08:00
feng626
52889cb67a fix: 修复工单发邮件及授权资产备注信息修改 2021-10-25 11:30:37 +08:00
feng626
0b67c7a953 fix: 修复登陆复合acl时 未添加规则bug 2021-10-25 10:49:00 +08:00
ibuler
1f50a2fe33 perf: 优化签名 2021-10-25 10:48:27 +08:00
ibuler
cd5094f10d perf: 优化通知中的连接点击 2021-10-25 10:48:03 +08:00
fit2bot
c244cf5f43 pref: 修改使用的消息内容 (#7061)
* perf:  再次优化通知

* pref: 修改使用的消息内容

* perf: 修复url地址

Co-authored-by: ibuler <ibuler@qq.com>
2021-10-22 20:06:16 +08:00
feng626
a0db2f6ef8 Merge pull request #7059 from jumpserver/pr@dev@ticket_send_message_multiple_times
fix: 修复工单结束重复发送受理人消息bug
2021-10-22 16:40:51 +08:00
feng626
ea485c3070 fix: 修复工单结束重复发送受理人消息bug 2021-10-22 16:38:14 +08:00
feng626
705f352cb9 Merge pull request #7057 from jumpserver/pr@dev@acl_bug
fix: 修复acl 选择时间段为空bug
2021-10-22 16:12:40 +08:00
feng626
dc13134b7b fix: 修复acl 选择时间段为空bug 2021-10-22 16:09:22 +08:00
Michael Bai
06de6c3575 fix: 修复资产系统用户auth-info获取流程及创建时手动登录方式的校验 2021-10-22 15:19:58 +08:00
ibuler
c341d01e5a perf: 修改工单信息 2021-10-21 20:04:27 +08:00
feng626
d1a3d31d3f feat: 客户端协议内容添加rdp文件名 2021-10-21 19:16:04 +08:00
ibuler
10f4ff4eec fix: 修复工单通知内容bug 2021-10-21 18:36:29 +08:00
ibuler
25ea3ba01d perf: 修改sdk位置 2021-10-21 17:39:28 +08:00
ibuler
072865f3e5 perf: 优化修改 sdk 位置 2021-10-21 17:39:28 +08:00
ibuler
d5c9ec1c3d perf: 优化位置 2021-10-21 17:39:28 +08:00
ibuler
487c945d1d perf: 修改代码位置,用户sugestion增加到6个 2021-10-21 17:39:28 +08:00
feng626
a78b2f4b62 fix: 修复工单发消息报错 2021-10-21 17:31:59 +08:00
ibuler
a1f1dce56b perf: 修改格式错误 2021-10-21 16:22:08 +08:00
xinwen
a1221a39fd fix: 会话管理翻译 2021-10-21 16:21:49 +08:00
xinwen
8c118b6f47 fix: 组织删除报错未翻译 2021-10-21 16:21:11 +08:00
feng626
68fd8012d8 Merge pull request #7042 from jumpserver/pr@dev@acl_perf
perf: acl filter
2021-10-21 15:11:50 +08:00
feng626
526928518d perf: acl filter 2021-10-21 15:00:29 +08:00
feng626
c5f6c564a7 fix: 系统启动bug 2021-10-21 13:05:07 +08:00
Jiangjie.Bai
076adec218 Merge pull request #7033 from jumpserver/dev
v2.15.0 rc1
2021-10-20 20:24:29 +08:00
fit2bot
00d434ceea perf: 优化消息通知 (#7024)
* perf: 优化系统用户列表

* stash

* perf: 优化消息通知

* perf: 修改钉钉

* perf: 修改优化消息通知

* perf: 修改requirements

* perf: 优化datetime

Co-authored-by: ibuler <ibuler@qq.com>
2021-10-20 19:45:37 +08:00
fit2bot
9acfd461b4 feat: user login acl (#6963)
* feat: user login acl

* 添加分时登陆

* acl 部分还原

* 简化acl判断逻辑

Co-authored-by: feng626 <1304903146@qq.com>
Co-authored-by: feng626 <57284900+feng626@users.noreply.github.com>
2021-10-20 17:56:59 +08:00
feng626
9424929dde fix: sms bug 2021-10-19 20:23:06 +08:00
Michael Bai
b95d3ac9be fix: 修改终端status翻译 2021-10-19 16:16:48 +08:00
Michael Bai
af2a9bb1e6 fix: 修复用户登录失败未记录日志的问题 2021-10-19 15:28:39 +08:00
fit2bot
63638ed1ce feat: 首页的 chanlege 和 MFA 统一 (#6989)
* feat: 首页的 chanlege 和 MFA 统一

* 登陆样式调整

* mfa bug

* q

* m

* mfa封装组件 前端可修改配置

* perf: 添加翻译

* login css bug

* perf: 修改一些风格

* perf: 修改命名

* perf: 修改 mfa code 不是必填

* mfa 前端统一组件

* stash

* perf: 统一验证码

Co-authored-by: feng626 <1304903146@qq.com>
Co-authored-by: ibuler <ibuler@qq.com>
2021-10-18 18:41:41 +08:00
fit2bot
fa68389028 perf: 去掉单独的flash msg (#7013)
* perf: 去掉单独的flash msg

perf: 修改使用库

* fix: guangbug

* pref: 修改 context

Co-authored-by: ibuler <ibuler@qq.com>
Co-authored-by: xinwen <coderWen@126.com>
2021-10-18 11:25:39 +08:00
halo
63b338085a feat: 修改username校验,telnet协议支持输入中文 2021-10-15 15:36:29 +08:00
fit2bot
5d8818e69e feat: 添加资产授权用户API、资产授权用户的授权规则API (#7008)
* feat: 添加资产授权用户API、资产授权用户的授权规则API

* feat: 添加资产授权用户组API、资产授权用户组的授权规则API

* feat: 抽象API类

Co-authored-by: Michael Bai <baijiangjie@gmail.com>
2021-10-15 14:02:32 +08:00
xinwen
77521119b9 fix: 修改 oauth 认证提示信息 2021-10-15 11:32:20 +08:00
ibuler
baee71e4b8 perf: 添加requirements 2021-10-15 11:31:06 +08:00
ibuler
6459c20516 perf: Django语言文件推送到lfs
perf: 修改mac依赖,添加安装 git lfs

perf: 修改添加 mac 开发环境依赖
2021-10-14 16:30:18 +08:00
ibuler
0207fe60c5 perf: 修改ip城市获取算法
perf: 优化使用lfs
2021-10-14 10:35:18 +08:00
Eric
b8c43f5944 fix: 修复获取系统用户密码问题 2021-10-13 14:34:57 +08:00
xinwen
63ca2ab182 fix: 修复 traceback 问题 2021-10-13 11:42:54 +08:00
ibuler
f9ea119928 perf: 不再需要 tls 类型,any 方式都可以 2021-10-12 17:45:38 +08:00
xinwen.xu
d4bdc74bd8 feat: 授权过期通知 2021-10-11 15:41:13 +08:00
ibuler
42e7ca6a18 perf: 资产编号允许api更改 2021-10-11 15:26:31 +08:00
xinwen
6e0341b7b1 feat: xrdp挂载受授权的上传下载控制 2021-10-11 15:24:34 +08:00
jiangweidong
2f25e2b24c fix: 系统用户认证方式从托管密码更新到手动输入模式后,不填写用户名报错 2021-10-11 10:02:48 +08:00
halo
87dcd2dcb7 feat: 增加关闭第三方登录跳转配置参数 2021-09-29 16:04:59 +08:00
ibuler
e96aac058f fix: 修复跳转msg页面,取消button太小的问题 2021-09-29 15:30:17 +08:00
Jiangjie.Bai
d76bad125b Merge pull request #6973 from jumpserver/pr@dev@feat_announcement
feat: 支持公告
2021-09-29 14:58:15 +08:00
ibuler
913b1a1426 perf: 修改翻译 2021-09-29 14:55:14 +08:00
ibuler
0213154a19 merge: with dev 2021-09-29 14:53:42 +08:00
ibuler
8f63b488b2 perf: 修改翻译 2021-09-29 14:43:00 +08:00
ibuler
f8921004c2 perf: 修改依赖版本 2021-09-29 14:42:11 +08:00
feng626
5588eab57e feat: 用户异地登陆 2021-09-29 14:33:52 +08:00
ibuler
e0a8f91741 perf: 修改翻译 2021-09-28 18:57:21 +08:00
ibuler
467ebfa650 perf: 添加公告id 2021-09-28 18:17:10 +08:00
ibuler
0533b77b1b perf: 支持公告 2021-09-28 17:01:47 +08:00
ibuler
a558ee2ac0 perf: 修改dockerfile 2021-09-28 10:10:20 +08:00
ibuler
46a39701d4 fix: 修复docker无法构建 2021-09-27 19:27:33 +08:00
ibuler
8c3ab31e4e fix: 修复由于修改filter引起的问题 2021-09-27 18:50:29 +08:00
Jiangjie.Bai
476e6cdc2f Merge pull request #6932 from jumpserver/pr@dev@perf_remove_djangopo
perf: 去掉django.po
2021-09-27 14:12:26 +08:00
ibuler
0b593f4555 perf: 优化drf filter, 修改terminal filter
perf: 去掉debug

perf: 去掉debug
2021-09-27 14:09:22 +08:00
ibuler
456116938d perf: 优化数据迁移,性能提升50倍 2021-09-27 14:05:21 +08:00
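The commit does not say how the 50x speedup was achieved; a common way to make a slow Django data migration much faster is to replace per-row save() calls with batched writes, sketched below (app, model and field names are hypothetical):

# Illustrative only, not necessarily what this commit changed.
from django.db import migrations

def forwards(apps, schema_editor):
    Asset = apps.get_model('assets', 'Asset')  # hypothetical app/model
    assets = list(Asset.objects.all())
    for asset in assets:
        asset.org_id = asset.org_id or ''      # hypothetical per-row fix-up
    # One batched UPDATE per 1000 rows instead of one query per row.
    Asset.objects.bulk_update(assets, ['org_id'], batch_size=1000)

class Migration(migrations.Migration):
    dependencies = []
    operations = [migrations.RunPython(forwards, migrations.RunPython.noop)]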
ibuler
76b24f62d4 perf: merge with dev 2021-09-27 14:03:19 +08:00
Michael Bai
e1eef0a3f3 fix: 修复authbook systemuser字段问题 2021-09-24 20:24:25 +08:00
Michael Bai
2c74727b65 fix: 修复邮件测试序列类及API 2021-09-24 14:13:55 +08:00
ibuler
08cd91c426 perf: 修改应用授权树,没有的时候不显示 2021-09-24 14:12:31 +08:00
wojiushixiaobai
90d269d2a2 fix: 添加包 2021-09-24 12:56:35 +08:00
Michael Bai
a9ddbcc0cd fix: 修复系统用户和资产/节点关联时资产特权用户更新的问题 2021-09-24 12:48:35 +08:00
Michael Bai
aec31128cf perf: 修改翻译 2021-09-23 11:14:53 +08:00
ibuler
b415ee051d perf: 去掉django.po 2021-09-23 10:26:39 +08:00
ibuler
082a5ae84c perf: 修改不再添加编译后的翻译文件 2021-09-22 18:53:04 +08:00
老广
6b7554d69a Merge pull request #6929 from jumpserver/pr@dev@ticket_comment
perf: 工单优化
2021-09-22 15:51:10 +08:00
feng626
5923562440 perf: 工单优化 2021-09-22 15:17:57 +08:00
老广
d144e7e572 Merge pull request #6922 from jumpserver/pr@dev@fix_tokensmsoptions
fix: 修复KoKo登录时,未开启SMS服务出现了SMS选项的问题
2021-09-22 11:24:03 +08:00
ibuler
cafbd08986 perf: 修改啊 2021-09-22 11:18:38 +08:00
ibuler
b436fc9b44 perf: 优化登录 2021-09-22 11:18:38 +08:00
ibuler
26fc56b4be perf: 优化登录html 2021-09-22 11:18:38 +08:00
Michael Bai
3b1d199669 fix: 修复KoKo登录时,未开启SMS服务出现了SMS选项的问题 2021-09-22 03:17:14 +00:00
ibuler
e7edbc9d84 perf: 修改啊 2021-09-18 15:46:41 +08:00
ibuler
41f81bc0bf merge: with dev 2021-09-18 15:29:29 +08:00
ibuler
75d2c81d33 perf: 优化登录 2021-09-18 15:24:04 +08:00
ibuler
da2dea5003 perf: 登录错误提示颜色 2021-09-18 15:07:05 +08:00
老广
fc60156c23 Merge pull request #6913 from jumpserver/pr@dev@perf_dockerfile
perf: 优化依赖包
2021-09-17 18:03:31 +08:00
wojiushixiaobai
76fb547551 perf: 优化依赖包 2021-09-17 16:14:15 +08:00
ibuler
e686f51703 perf: 优化登录html 2021-09-17 16:00:23 +08:00
feng626
7578bda588 Merge pull request #6911 from jumpserver/pr@dev@ticket_perf
perf: 工单优化
2021-09-17 15:50:48 +08:00
feng626
f2d743ec2b perf: 工单优化 2021-09-17 15:46:33 +08:00
ibuler
7099aef360 perf: 登录错误提示颜色 2021-09-17 14:35:05 +08:00
wojiushixiaobai
246f5d8a11 fix: 添加依赖 2021-09-16 23:24:02 +08:00
wojiushixiaobai
6c98bd3b48 fix: 添加 pg 依赖包 2021-09-16 23:24:02 +08:00
415 changed files with 11573 additions and 5830 deletions


@@ -6,4 +6,5 @@ tmp/*
django.db
celerybeat.pid
### Vagrant ###
.vagrant/
.vagrant/
apps/xpack/.git

.gitattributes (new file, 2 additions)

@@ -0,0 +1,2 @@
*.mmdb filter=lfs diff=lfs merge=lfs -text
*.mo filter=lfs diff=lfs merge=lfs -text

.github/PULL_REQUEST_TEMPLATE.md (new file, 9 additions)

@@ -0,0 +1,9 @@
#### What this PR does / why we need it?
#### Summary of your change
#### Please indicate you've done the following:
- [ ] Made sure tests are passing and test coverage is added if needed.
- [ ] Made sure commit message follow the rule of [Conventional Commits specification](https://www.conventionalcommits.org/).
- [ ] Considered the docs impact and opened a new docs issue or PR with docs changes if needed.
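For reference, the Conventional Commits style required by the checklist above is the same type-prefixed format the commits in this compare view already use; illustrative examples:

feat(tickets): add a serial number to tickets
fix: close the database connection when a websocket disconnects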


@@ -0,0 +1,18 @@
name: Issue Close Require
on:
schedule:
- cron: "0 0 * * *"
jobs:
issue-close-require:
runs-on: ubuntu-latest
steps:
- name: need reproduce
uses: actions-cool/issues-helper@v2
with:
actions: 'close-issues'
labels: '状态:待反馈'
inactive-day: 30
body: |
您超过 30 天未反馈信息,我们将关闭该 issue如有需求您可以重新打开或者提交新的 issue。

.github/workflows/issue-close.yml (new file, 16 additions)

@@ -0,0 +1,16 @@
name: Issue Close Check
on:
issues:
types: [closed]
jobs:
issue-close-remove-labels:
runs-on: ubuntu-latest
steps:
- name: Remove labels
uses: actions-cool/issues-helper@v2
if: ${{ !github.event.issue.pull_request }}
with:
actions: 'remove-labels'
labels: '状态:待处理,状态:待反馈'

.github/workflows/issue-comment.yml (new file, 38 additions)

@@ -0,0 +1,38 @@
on:
issue_comment:
types: [created]
name: Add issues workflow labels
jobs:
add-label-if-is-author:
runs-on: ubuntu-latest
if: (github.event.issue.user.id == github.event.comment.user.id) && !github.event.issue.pull_request && (github.event.issue.state == 'open')
steps:
- name: Add require handle label
uses: actions-cool/issues-helper@v2
with:
actions: 'add-labels'
labels: '状态:待处理'
- name: Remove require reply label
uses: actions-cool/issues-helper@v2
with:
actions: 'remove-labels'
labels: '状态:待反馈'
add-label-if-not-author:
runs-on: ubuntu-latest
if: (github.event.issue.user.id != github.event.comment.user.id) && !github.event.issue.pull_request && (github.event.issue.state == 'open')
steps:
- name: Add require replay label
uses: actions-cool/issues-helper@v2
with:
actions: 'add-labels'
labels: '状态:待反馈'
- name: Remove require handle label
uses: actions-cool/issues-helper@v2
with:
actions: 'remove-labels'
labels: '状态:待处理'

.github/workflows/issue-open.yml (new file, 16 additions)

@@ -0,0 +1,16 @@
name: Issue Open Check
on:
issues:
types: [opened]
jobs:
issue-open-add-labels:
runs-on: ubuntu-latest
steps:
- name: Add labels
uses: actions-cool/issues-helper@v2
if: ${{ !github.event.issue.pull_request }}
with:
actions: 'add-labels'
labels: '状态:待处理'


@@ -0,0 +1,17 @@
on:
schedule:
- cron: "0 1 * * *"
name: Check recent handle issues
jobs:
check-recent-issues-not-handle:
runs-on: ubuntu-latest
steps:
- name: Check recent issues and send msg
uses: jumpserver/action-issues-alert@master
with:
hook: ${{ secrets.WECHAT_GROUP_WEB_HOOK }}
type: recent
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}


@@ -0,0 +1,17 @@
on:
schedule:
- cron: "0 9 * * 1-5"
name: Check untimely handle issues
jobs:
check-untimely-handle-issues:
runs-on: ubuntu-latest
steps:
- name: Check untimely issues and send msg
uses: jumpserver/action-issues-alert@master
with:
hook: ${{ secrets.WECHAT_GROUP_WEB_HOOK }}
type: untimely
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}


@@ -20,6 +20,8 @@ jobs:
run: |
TAG=$(basename ${GITHUB_REF})
VERSION=${TAG/v/}
wget https://raw.githubusercontent.com/jumpserver/installer/master/quick_start.sh
sed -i "s@Version=.*@Version=v${VERSION}@g" quick_start.sh
echo "::set-output name=TAG::$TAG"
echo "::set-output name=VERSION::$VERSION"
- name: Create Release
@@ -31,6 +33,16 @@ jobs:
config-name: release-config.yml
version: ${{ steps.get_version.outputs.TAG }}
tag: ${{ steps.get_version.outputs.TAG }}
- name: Upload Quick Start Script
id: upload-release-quick-start-shell
uses: actions/upload-release-asset@v1
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ steps.create_release.outputs.upload_url }} # This pulls from the CREATE RELEASE step above, referencing it's ID to get its outputs object, which include a `upload_url`. See this blog post for more info: https://jasonet.co/posts/new-features-of-github-actions/#passing-data-to-future-steps
asset_path: ./quick_start.sh
asset_name: quick_start.sh
asset_content_type: application/text
build-and-release:
needs: create-realese
@@ -43,4 +55,4 @@ jobs:
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
with:
upload_url: ${{ needs.create-realese.outputs.upload_url }}
upload_url: ${{ needs.create-realese.outputs.upload_url }}

.github/workflows/sync-gitee.yml (new file, 23 additions)

@@ -0,0 +1,23 @@
name: 🔀 Sync mirror to Gitee
on:
push:
branches:
- master
- dev
create:
jobs:
mirror:
runs-on: ubuntu-latest
if: github.repository == 'jumpserver/jumpserver'
steps:
- name: mirror
continue-on-error: true
if: github.event_name == 'push' || (github.event_name == 'create' && github.event.ref_type == 'tag')
uses: wearerequired/git-mirror-action@v1
env:
SSH_PRIVATE_KEY: ${{ secrets.GITEE_SSH_PRIVATE_KEY }}
with:
source-repo: 'git@github.com:jumpserver/jumpserver.git'
destination-repo: 'git@gitee.com:jumpserver/jumpserver.git'

CODE_OF_CONDUCT.md (new file, 128 additions)

@@ -0,0 +1,128 @@
# Contributor Covenant Code of Conduct
## Our Pledge
We as members, contributors, and leaders pledge to make participation in our
community a harassment-free experience for everyone, regardless of age, body
size, visible or invisible disability, ethnicity, sex characteristics, gender
identity and expression, level of experience, education, socio-economic status,
nationality, personal appearance, race, religion, or sexual identity
and orientation.
We pledge to act and interact in ways that contribute to an open, welcoming,
diverse, inclusive, and healthy community.
## Our Standards
Examples of behavior that contributes to a positive environment for our
community include:
* Demonstrating empathy and kindness toward other people
* Being respectful of differing opinions, viewpoints, and experiences
* Giving and gracefully accepting constructive feedback
* Accepting responsibility and apologizing to those affected by our mistakes,
and learning from the experience
* Focusing on what is best not just for us as individuals, but for the
overall community
Examples of unacceptable behavior include:
* The use of sexualized language or imagery, and sexual attention or
advances of any kind
* Trolling, insulting or derogatory comments, and personal or political attacks
* Public or private harassment
* Publishing others' private information, such as a physical or email
address, without their explicit permission
* Other conduct which could reasonably be considered inappropriate in a
professional setting
## Enforcement Responsibilities
Community leaders are responsible for clarifying and enforcing our standards of
acceptable behavior and will take appropriate and fair corrective action in
response to any behavior that they deem inappropriate, threatening, offensive,
or harmful.
Community leaders have the right and responsibility to remove, edit, or reject
comments, commits, code, wiki edits, issues, and other contributions that are
not aligned to this Code of Conduct, and will communicate reasons for moderation
decisions when appropriate.
## Scope
This Code of Conduct applies within all community spaces, and also applies when
an individual is officially representing the community in public spaces.
Examples of representing our community include using an official e-mail address,
posting via an official social media account, or acting as an appointed
representative at an online or offline event.
## Enforcement
Instances of abusive, harassing, or otherwise unacceptable behavior may be
reported to the community leaders responsible for enforcement at
.
All complaints will be reviewed and investigated promptly and fairly.
All community leaders are obligated to respect the privacy and security of the
reporter of any incident.
## Enforcement Guidelines
Community leaders will follow these Community Impact Guidelines in determining
the consequences for any action they deem in violation of this Code of Conduct:
### 1. Correction
**Community Impact**: Use of inappropriate language or other behavior deemed
unprofessional or unwelcome in the community.
**Consequence**: A private, written warning from community leaders, providing
clarity around the nature of the violation and an explanation of why the
behavior was inappropriate. A public apology may be requested.
### 2. Warning
**Community Impact**: A violation through a single incident or series
of actions.
**Consequence**: A warning with consequences for continued behavior. No
interaction with the people involved, including unsolicited interaction with
those enforcing the Code of Conduct, for a specified period of time. This
includes avoiding interactions in community spaces as well as external channels
like social media. Violating these terms may lead to a temporary or
permanent ban.
### 3. Temporary Ban
**Community Impact**: A serious violation of community standards, including
sustained inappropriate behavior.
**Consequence**: A temporary ban from any sort of interaction or public
communication with the community for a specified period of time. No public or
private interaction with the people involved, including unsolicited interaction
with those enforcing the Code of Conduct, is allowed during this period.
Violating these terms may lead to a permanent ban.
### 4. Permanent Ban
**Community Impact**: Demonstrating a pattern of violation of community
standards, including sustained inappropriate behavior, harassment of an
individual, or aggression toward or disparagement of classes of individuals.
**Consequence**: A permanent ban from any sort of public interaction within
the community.
## Attribution
This Code of Conduct is adapted from the [Contributor Covenant][homepage],
version 2.0, available at
https://www.contributor-covenant.org/version/2/0/code_of_conduct.html.
Community Impact Guidelines were inspired by [Mozilla's code of conduct
enforcement ladder](https://github.com/mozilla/diversity).
[homepage]: https://www.contributor-covenant.org
For answers to common questions about this code of conduct, see the FAQ at
https://www.contributor-covenant.org/faq. Translations are available at
https://www.contributor-covenant.org/translations.

CONTRIBUTING.md (new file, 25 additions)

@@ -0,0 +1,25 @@
# Contributing
## Create pull request
PRs are always welcome, even if they only contain small fixes like typos or a few lines of code. If the change requires significant effort, please document it as an issue and get a discussion going before starting to work on it.
Please break your PR down into small changes. A PR consisting of many features and code changes can be hard to review, so it is recommended to submit PRs in an incremental fashion.
This [development guideline](https://docs.jumpserver.org/zh/master/dev/rest_api/) contains information about the repository structure, how to set up the development environment, how to run it, and more.
Note: If you split your pull request into small changes, please make sure that every change merged to master leaves it working. Otherwise, it cannot be merged until the feature is complete.
## Report issues
Reporting an issue is a great way to contribute. Well-written and complete bug reports are always welcome! Please open an issue and follow the template to fill in the required information.
Before opening any issue, please search the existing issues to avoid submitting a duplicate.
If you find a match, you can "subscribe" to it to get notified of updates. If you have additional helpful information about the issue, please leave a comment.
When reporting issues, always include:
* Which version you are using.
* Steps to reproduce the issue.
* Screenshots or log files, if needed.
Because the issues are open to the public, when submitting files, be sure to remove any sensitive information, e.g. user name, password, IP address, and company name. You can replace those parts with "REDACTED" or other strings like "****".


@@ -1,5 +1,5 @@
# 编译代码
FROM python:3.8.6-slim as stage-build
FROM python:3.8-slim as stage-build
MAINTAINER JumpServer Team <ibuler@qq.com>
ARG VERSION
ENV VERSION=$VERSION
@@ -8,9 +8,8 @@ WORKDIR /opt/jumpserver
ADD . .
RUN cd utils && bash -ixeu build.sh
# 构建运行时环境
FROM python:3.8.6-slim
FROM python:3.8-slim
ARG PIP_MIRROR=https://pypi.douban.com/simple
ENV PIP_MIRROR=$PIP_MIRROR
ARG PIP_JMS_MIRROR=https://pypi.douban.com/simple
@@ -18,12 +17,12 @@ ENV PIP_JMS_MIRROR=$PIP_JMS_MIRROR
WORKDIR /opt/jumpserver
COPY ./requirements/deb_buster_requirements.txt ./requirements/deb_buster_requirements.txt
COPY ./requirements/deb_requirements.txt ./requirements/deb_requirements.txt
RUN sed -i 's/deb.debian.org/mirrors.aliyun.com/g' /etc/apt/sources.list \
&& sed -i 's/security.debian.org/mirrors.aliyun.com/g' /etc/apt/sources.list \
&& apt update \
&& apt -y install telnet iproute2 redis-tools \
&& grep -v '^#' ./requirements/deb_buster_requirements.txt | xargs apt -y install \
&& apt -y install telnet iproute2 redis-tools default-mysql-client vim wget curl locales procps \
&& apt -y install $(cat requirements/deb_requirements.txt) \
&& rm -rf /var/lib/apt/lists/* \
&& localedef -c -f UTF-8 -i zh_CN zh_CN.UTF-8 \
&& cp /usr/share/zoneinfo/Asia/Shanghai /etc/localtime \
@@ -32,21 +31,25 @@ RUN sed -i 's/deb.debian.org/mirrors.aliyun.com/g' /etc/apt/sources.list \
COPY ./requirements/requirements.txt ./requirements/requirements.txt
RUN pip install --upgrade pip==20.2.4 setuptools==49.6.0 wheel==0.34.2 -i ${PIP_MIRROR} \
&& pip config set global.index-url ${PIP_MIRROR} \
&& pip install --no-cache-dir $(grep -E 'jms|jumpserver' requirements/requirements.txt) -i ${PIP_JMS_MIRROR} \
&& pip install --no-cache-dir -r requirements/requirements.txt
&& pip install --no-cache-dir -r requirements/requirements.txt -i ${PIP_MIRROR} \
&& rm -rf ~/.cache/pip
COPY --from=stage-build /opt/jumpserver/release/jumpserver /opt/jumpserver
RUN mkdir -p /root/.ssh/ \
&& echo "Host *\n\tStrictHostKeyChecking no\n\tUserKnownHostsFile /dev/null" > /root/.ssh/config
&& echo "Host *\n\tStrictHostKeyChecking no\n\tUserKnownHostsFile /dev/null" > /root/.ssh/config \
&& mv /bin/sh /bin/sh.bak \
&& ln -s /bin/bash /bin/sh
RUN mkdir -p /opt/jumpserver/oracle/
ADD https://download.jumpserver.org/public/instantclient-basiclite-linux.x64-21.1.0.0.0.tar /opt/jumpserver/oracle/
RUN tar xvf /opt/jumpserver/oracle/instantclient-basiclite-linux.x64-21.1.0.0.0.tar -C /opt/jumpserver/oracle/
RUN sh -c "echo /opt/jumpserver/oracle/instantclient_21_1 > /etc/ld.so.conf.d/oracle-instantclient.conf"
RUN ldconfig
RUN mkdir -p /opt/jumpserver/oracle/ \
&& wget https://download.jumpserver.org/public/instantclient-basiclite-linux.x64-21.1.0.0.0.tar > /dev/null \
&& tar xf instantclient-basiclite-linux.x64-21.1.0.0.0.tar -C /opt/jumpserver/oracle/ \
&& echo "/opt/jumpserver/oracle/instantclient_21_1" > /etc/ld.so.conf.d/oracle-instantclient.conf \
&& ldconfig \
&& rm -f instantclient-basiclite-linux.x64-21.1.0.0.0.tar
RUN echo > config.yml
VOLUME /opt/jumpserver/data
VOLUME /opt/jumpserver/logs

LICENSE (837 changed lines)

@@ -1,281 +1,622 @@
GNU GENERAL PUBLIC LICENSE
Version 2, June 1991
Version 3, 29 June 2007
Copyright (C) 1989, 1991 Free Software Foundation, Inc., <http://fsf.org/>
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA
Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.
Preamble
The licenses for most software are designed to take away your
freedom to share and change it. By contrast, the GNU General Public
License is intended to guarantee your freedom to share and change free
software--to make sure the software is free for all its users. This
General Public License applies to most of the Free Software
Foundation's software and to any other program whose authors commit to
using it. (Some other Free Software Foundation software is covered by
the GNU Lesser General Public License instead.) You can apply it to
The GNU General Public License is a free, copyleft license for
software and other kinds of works.
The licenses for most software and other practical works are designed
to take away your freedom to share and change the works. By contrast,
the GNU General Public License is intended to guarantee your freedom to
share and change all versions of a program--to make sure it remains free
software for all its users. We, the Free Software Foundation, use the
GNU General Public License for most of our software; it applies also to
any other work released this way by its authors. You can apply it to
your programs, too.
When we speak of free software, we are referring to freedom, not
price. Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
this service if you wish), that you receive source code or can get it
if you want it, that you can change the software or use pieces of it
in new free programs; and that you know you can do these things.
them if you wish), that you receive source code or can get it if you
want it, that you can change the software or use pieces of it in new
free programs, and that you know you can do these things.
To protect your rights, we need to make restrictions that forbid
anyone to deny you these rights or to ask you to surrender the rights.
These restrictions translate to certain responsibilities for you if you
distribute copies of the software, or if you modify it.
To protect your rights, we need to prevent others from denying you
these rights or asking you to surrender the rights. Therefore, you have
certain responsibilities if you distribute copies of the software, or if
you modify it: responsibilities to respect the freedom of others.
For example, if you distribute copies of such a program, whether
gratis or for a fee, you must give the recipients all the rights that
you have. You must make sure that they, too, receive or can get the
source code. And you must show them these terms so they know their
rights.
gratis or for a fee, you must pass on to the recipients the same
freedoms that you received. You must make sure that they, too, receive
or can get the source code. And you must show them these terms so they
know their rights.
We protect your rights with two steps: (1) copyright the software, and
(2) offer you this license which gives you legal permission to copy,
distribute and/or modify the software.
Developers that use the GNU GPL protect your rights with two steps:
(1) assert copyright on the software, and (2) offer you this License
giving you legal permission to copy, distribute and/or modify it.
Also, for each author's protection and ours, we want to make certain
that everyone understands that there is no warranty for this free
software. If the software is modified by someone else and passed on, we
want its recipients to know that what they have is not the original, so
that any problems introduced by others will not reflect on the original
authors' reputations.
For the developers' and authors' protection, the GPL clearly explains
that there is no warranty for this free software. For both users' and
authors' sake, the GPL requires that modified versions be marked as
changed, so that their problems will not be attributed erroneously to
authors of previous versions.
Finally, any free program is threatened constantly by software
patents. We wish to avoid the danger that redistributors of a free
program will individually obtain patent licenses, in effect making the
program proprietary. To prevent this, we have made it clear that any
patent must be licensed for everyone's free use or not licensed at all.
Some devices are designed to deny users access to install or run
modified versions of the software inside them, although the manufacturer
can do so. This is fundamentally incompatible with the aim of
protecting users' freedom to change the software. The systematic
pattern of such abuse occurs in the area of products for individuals to
use, which is precisely where it is most unacceptable. Therefore, we
have designed this version of the GPL to prohibit the practice for those
products. If such problems arise substantially in other domains, we
stand ready to extend this provision to those domains in future versions
of the GPL, as needed to protect the freedom of users.
Finally, every program is threatened constantly by software patents.
States should not allow patents to restrict development and use of
software on general-purpose computers, but in those that do, we wish to
avoid the special danger that patents applied to a free program could
make it effectively proprietary. To prevent this, the GPL assures that
patents cannot be used to render the program non-free.
The precise terms and conditions for copying, distribution and
modification follow.
GNU GENERAL PUBLIC LICENSE
TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION
TERMS AND CONDITIONS
0. This License applies to any program or other work which contains
a notice placed by the copyright holder saying it may be distributed
under the terms of this General Public License. The "Program", below,
refers to any such program or work, and a "work based on the Program"
means either the Program or any derivative work under copyright law:
that is to say, a work containing the Program or a portion of it,
either verbatim or with modifications and/or translated into another
language. (Hereinafter, translation is included without limitation in
the term "modification".) Each licensee is addressed as "you".
0. Definitions.
Activities other than copying, distribution and modification are not
covered by this License; they are outside its scope. The act of
running the Program is not restricted, and the output from the Program
is covered only if its contents constitute a work based on the
Program (independent of having been made by running the Program).
Whether that is true depends on what the Program does.
"This License" refers to version 3 of the GNU General Public License.
1. You may copy and distribute verbatim copies of the Program's
source code as you receive it, in any medium, provided that you
conspicuously and appropriately publish on each copy an appropriate
copyright notice and disclaimer of warranty; keep intact all the
notices that refer to this License and to the absence of any warranty;
and give any other recipients of the Program a copy of this License
along with the Program.
"Copyright" also means copyright-like laws that apply to other kinds of
works, such as semiconductor masks.
You may charge a fee for the physical act of transferring a copy, and
you may at your option offer warranty protection in exchange for a fee.
"The Program" refers to any copyrightable work licensed under this
License. Each licensee is addressed as "you". "Licensees" and
"recipients" may be individuals or organizations.
2. You may modify your copy or copies of the Program or any portion
of it, thus forming a work based on the Program, and copy and
distribute such modifications or work under the terms of Section 1
above, provided that you also meet all of these conditions:
To "modify" a work means to copy from or adapt all or part of the work
in a fashion requiring copyright permission, other than the making of an
exact copy. The resulting work is called a "modified version" of the
earlier work or a work "based on" the earlier work.
a) You must cause the modified files to carry prominent notices
stating that you changed the files and the date of any change.
A "covered work" means either the unmodified Program or a work based
on the Program.
b) You must cause any work that you distribute or publish, that in
whole or in part contains or is derived from the Program or any
part thereof, to be licensed as a whole at no charge to all third
parties under the terms of this License.
To "propagate" a work means to do anything with it that, without
permission, would make you directly or secondarily liable for
infringement under applicable copyright law, except executing it on a
computer or modifying a private copy. Propagation includes copying,
distribution (with or without modification), making available to the
public, and in some countries other activities as well.
c) If the modified program normally reads commands interactively
when run, you must cause it, when started running for such
interactive use in the most ordinary way, to print or display an
announcement including an appropriate copyright notice and a
notice that there is no warranty (or else, saying that you provide
a warranty) and that users may redistribute the program under
these conditions, and telling the user how to view a copy of this
License. (Exception: if the Program itself is interactive but
does not normally print such an announcement, your work based on
the Program is not required to print an announcement.)
To "convey" a work means any kind of propagation that enables other
parties to make or receive copies. Mere interaction with a user through
a computer network, with no transfer of a copy, is not conveying.
These requirements apply to the modified work as a whole. If
identifiable sections of that work are not derived from the Program,
and can be reasonably considered independent and separate works in
themselves, then this License, and its terms, do not apply to those
sections when you distribute them as separate works. But when you
distribute the same sections as part of a whole which is a work based
on the Program, the distribution of the whole must be on the terms of
this License, whose permissions for other licensees extend to the
entire whole, and thus to each and every part regardless of who wrote it.
An interactive user interface displays "Appropriate Legal Notices"
to the extent that it includes a convenient and prominently visible
feature that (1) displays an appropriate copyright notice, and (2)
tells the user that there is no warranty for the work (except to the
extent that warranties are provided), that licensees may convey the
work under this License, and how to view a copy of this License. If
the interface presents a list of user commands or options, such as a
menu, a prominent item in the list meets this criterion.
Thus, it is not the intent of this section to claim rights or contest
your rights to work written entirely by you; rather, the intent is to
exercise the right to control the distribution of derivative or
collective works based on the Program.
1. Source Code.
In addition, mere aggregation of another work not based on the Program
with the Program (or with a work based on the Program) on a volume of
a storage or distribution medium does not bring the other work under
the scope of this License.
The "source code" for a work means the preferred form of the work
for making modifications to it. "Object code" means any non-source
form of a work.
3. You may copy and distribute the Program (or a work based on it,
under Section 2) in object code or executable form under the terms of
Sections 1 and 2 above provided that you also do one of the following:
A "Standard Interface" means an interface that either is an official
standard defined by a recognized standards body, or, in the case of
interfaces specified for a particular programming language, one that
is widely used among developers working in that language.
a) Accompany it with the complete corresponding machine-readable
source code, which must be distributed under the terms of Sections
1 and 2 above on a medium customarily used for software interchange; or,
The "System Libraries" of an executable work include anything, other
than the work as a whole, that (a) is included in the normal form of
packaging a Major Component, but which is not part of that Major
Component, and (b) serves only to enable use of the work with that
Major Component, or to implement a Standard Interface for which an
implementation is available to the public in source code form. A
"Major Component", in this context, means a major essential component
(kernel, window system, and so on) of the specific operating system
(if any) on which the executable work runs, or a compiler used to
produce the work, or an object code interpreter used to run it.
b) Accompany it with a written offer, valid for at least three
years, to give any third party, for a charge no more than your
cost of physically performing source distribution, a complete
machine-readable copy of the corresponding source code, to be
distributed under the terms of Sections 1 and 2 above on a medium
customarily used for software interchange; or,
The "Corresponding Source" for a work in object code form means all
the source code needed to generate, install, and (for an executable
work) run the object code and to modify the work, including scripts to
control those activities. However, it does not include the work's
System Libraries, or general-purpose tools or generally available free
programs which are used unmodified in performing those activities but
which are not part of the work. For example, Corresponding Source
includes interface definition files associated with source files for
the work, and the source code for shared libraries and dynamically
linked subprograms that the work is specifically designed to require,
such as by intimate data communication or control flow between those
subprograms and other parts of the work.
c) Accompany it with the information you received as to the offer
to distribute corresponding source code. (This alternative is
allowed only for noncommercial distribution and only if you
received the program in object code or executable form with such
an offer, in accord with Subsection b above.)
The Corresponding Source need not include anything that users
can regenerate automatically from other parts of the Corresponding
Source.
The source code for a work means the preferred form of the work for
making modifications to it. For an executable work, complete source
code means all the source code for all modules it contains, plus any
associated interface definition files, plus the scripts used to
control compilation and installation of the executable. However, as a
special exception, the source code distributed need not include
anything that is normally distributed (in either source or binary
form) with the major components (compiler, kernel, and so on) of the
operating system on which the executable runs, unless that component
itself accompanies the executable.
The Corresponding Source for a work in source code form is that
same work.
If distribution of executable or object code is made by offering
access to copy from a designated place, then offering equivalent
access to copy the source code from the same place counts as
distribution of the source code, even though third parties are not
compelled to copy the source along with the object code.
2. Basic Permissions.
4. You may not copy, modify, sublicense, or distribute the Program
except as expressly provided under this License. Any attempt
otherwise to copy, modify, sublicense or distribute the Program is
void, and will automatically terminate your rights under this License.
However, parties who have received copies, or rights, from you under
this License will not have their licenses terminated so long as such
parties remain in full compliance.
All rights granted under this License are granted for the term of
copyright on the Program, and are irrevocable provided the stated
conditions are met. This License explicitly affirms your unlimited
permission to run the unmodified Program. The output from running a
covered work is covered by this License only if the output, given its
content, constitutes a covered work. This License acknowledges your
rights of fair use or other equivalent, as provided by copyright law.
5. You are not required to accept this License, since you have not
signed it. However, nothing else grants you permission to modify or
distribute the Program or its derivative works. These actions are
prohibited by law if you do not accept this License. Therefore, by
modifying or distributing the Program (or any work based on the
Program), you indicate your acceptance of this License to do so, and
all its terms and conditions for copying, distributing or modifying
the Program or works based on it.
You may make, run and propagate covered works that you do not
convey, without conditions so long as your license otherwise remains
in force. You may convey covered works to others for the sole purpose
of having them make modifications exclusively for you, or provide you
with facilities for running those works, provided that you comply with
the terms of this License in conveying all material for which you do
not control copyright. Those thus making or running the covered works
for you must do so exclusively on your behalf, under your direction
and control, on terms that prohibit them from making any copies of
your copyrighted material outside their relationship with you.
6. Each time you redistribute the Program (or any work based on the
Program), the recipient automatically receives a license from the
original licensor to copy, distribute or modify the Program subject to
these terms and conditions. You may not impose any further
restrictions on the recipients' exercise of the rights granted herein.
You are not responsible for enforcing compliance by third parties to
Conveying under any other circumstances is permitted solely under
the conditions stated below. Sublicensing is not allowed; section 10
makes it unnecessary.
3. Protecting Users' Legal Rights From Anti-Circumvention Law.
No covered work shall be deemed part of an effective technological
measure under any applicable law fulfilling obligations under article
11 of the WIPO copyright treaty adopted on 20 December 1996, or
similar laws prohibiting or restricting circumvention of such
measures.
When you convey a covered work, you waive any legal power to forbid
circumvention of technological measures to the extent such circumvention
is effected by exercising rights under this License with respect to
the covered work, and you disclaim any intention to limit operation or
modification of the work as a means of enforcing, against the work's
users, your or third parties' legal rights to forbid circumvention of
technological measures.
4. Conveying Verbatim Copies.
You may convey verbatim copies of the Program's source code as you
receive it, in any medium, provided that you conspicuously and
appropriately publish on each copy an appropriate copyright notice;
keep intact all notices stating that this License and any
non-permissive terms added in accord with section 7 apply to the code;
keep intact all notices of the absence of any warranty; and give all
recipients a copy of this License along with the Program.
You may charge any price or no price for each copy that you convey,
and you may offer support or warranty protection for a fee.
5. Conveying Modified Source Versions.
You may convey a work based on the Program, or the modifications to
produce it from the Program, in the form of source code under the
terms of section 4, provided that you also meet all of these conditions:
a) The work must carry prominent notices stating that you modified
it, and giving a relevant date.
b) The work must carry prominent notices stating that it is
released under this License and any conditions added under section
7. This requirement modifies the requirement in section 4 to
"keep intact all notices".
c) You must license the entire work, as a whole, under this
License to anyone who comes into possession of a copy. This
License will therefore apply, along with any applicable section 7
additional terms, to the whole of the work, and all its parts,
regardless of how they are packaged. This License gives no
permission to license the work in any other way, but it does not
invalidate such permission if you have separately received it.
d) If the work has interactive user interfaces, each must display
Appropriate Legal Notices; however, if the Program has interactive
interfaces that do not display Appropriate Legal Notices, your
work need not make them do so.
A compilation of a covered work with other separate and independent
works, which are not by their nature extensions of the covered work,
and which are not combined with it such as to form a larger program,
in or on a volume of a storage or distribution medium, is called an
"aggregate" if the compilation and its resulting copyright are not
used to limit the access or legal rights of the compilation's users
beyond what the individual works permit. Inclusion of a covered work
in an aggregate does not cause this License to apply to the other
parts of the aggregate.
6. Conveying Non-Source Forms.
You may convey a covered work in object code form under the terms
of sections 4 and 5, provided that you also convey the
machine-readable Corresponding Source under the terms of this License,
in one of these ways:
a) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by the
Corresponding Source fixed on a durable physical medium
customarily used for software interchange.
b) Convey the object code in, or embodied in, a physical product
(including a physical distribution medium), accompanied by a
written offer, valid for at least three years and valid for as
long as you offer spare parts or customer support for that product
model, to give anyone who possesses the object code either (1) a
copy of the Corresponding Source for all the software in the
product that is covered by this License, on a durable physical
medium customarily used for software interchange, for a price no
more than your reasonable cost of physically performing this
conveying of source, or (2) access to copy the
Corresponding Source from a network server at no charge.
c) Convey individual copies of the object code with a copy of the
written offer to provide the Corresponding Source. This
alternative is allowed only occasionally and noncommercially, and
only if you received the object code with such an offer, in accord
with subsection 6b.
d) Convey the object code by offering access from a designated
place (gratis or for a charge), and offer equivalent access to the
Corresponding Source in the same way through the same place at no
further charge. You need not require recipients to copy the
Corresponding Source along with the object code. If the place to
copy the object code is a network server, the Corresponding Source
may be on a different server (operated by you or a third party)
that supports equivalent copying facilities, provided you maintain
clear directions next to the object code saying where to find the
Corresponding Source. Regardless of what server hosts the
Corresponding Source, you remain obligated to ensure that it is
available for as long as needed to satisfy these requirements.
e) Convey the object code using peer-to-peer transmission, provided
you inform other peers where the object code and Corresponding
Source of the work are being offered to the general public at no
charge under subsection 6d.
A separable portion of the object code, whose source code is excluded
from the Corresponding Source as a System Library, need not be
included in conveying the object code work.
A "User Product" is either (1) a "consumer product", which means any
tangible personal property which is normally used for personal, family,
or household purposes, or (2) anything designed or sold for incorporation
into a dwelling. In determining whether a product is a consumer product,
doubtful cases shall be resolved in favor of coverage. For a particular
product received by a particular user, "normally used" refers to a
typical or common use of that class of product, regardless of the status
of the particular user or of the way in which the particular user
actually uses, or expects or is expected to use, the product. A product
is a consumer product regardless of whether the product has substantial
commercial, industrial or non-consumer uses, unless such uses represent
the only significant mode of use of the product.
"Installation Information" for a User Product means any methods,
procedures, authorization keys, or other information required to install
and execute modified versions of a covered work in that User Product from
a modified version of its Corresponding Source. The information must
suffice to ensure that the continued functioning of the modified object
code is in no case prevented or interfered with solely because
modification has been made.
If you convey an object code work under this section in, or with, or
specifically for use in, a User Product, and the conveying occurs as
part of a transaction in which the right of possession and use of the
User Product is transferred to the recipient in perpetuity or for a
fixed term (regardless of how the transaction is characterized), the
Corresponding Source conveyed under this section must be accompanied
by the Installation Information. But this requirement does not apply
if neither you nor any third party retains the ability to install
modified object code on the User Product (for example, the work has
been installed in ROM).
The requirement to provide Installation Information does not include a
requirement to continue to provide support service, warranty, or updates
for a work that has been modified or installed by the recipient, or for
the User Product in which it has been modified or installed. Access to a
network may be denied when the modification itself materially and
adversely affects the operation of the network or violates the rules and
protocols for communication across the network.
Corresponding Source conveyed, and Installation Information provided,
in accord with this section must be in a format that is publicly
documented (and with an implementation available to the public in
source code form), and must require no special password or key for
unpacking, reading or copying.
7. Additional Terms.
"Additional permissions" are terms that supplement the terms of this
License by making exceptions from one or more of its conditions.
Additional permissions that are applicable to the entire Program shall
be treated as though they were included in this License, to the extent
that they are valid under applicable law. If additional permissions
apply only to part of the Program, that part may be used separately
under those permissions, but the entire Program remains governed by
this License without regard to the additional permissions.
When you convey a copy of a covered work, you may at your option
remove any additional permissions from that copy, or from any part of
it. (Additional permissions may be written to require their own
removal in certain cases when you modify the work.) You may place
additional permissions on material, added by you to a covered work,
for which you have or can give appropriate copyright permission.
Notwithstanding any other provision of this License, for material you
add to a covered work, you may (if authorized by the copyright holders of
that material) supplement the terms of this License with terms:
a) Disclaiming warranty or limiting liability differently from the
terms of sections 15 and 16 of this License; or
b) Requiring preservation of specified reasonable legal notices or
author attributions in that material or in the Appropriate Legal
Notices displayed by works containing it; or
c) Prohibiting misrepresentation of the origin of that material, or
requiring that modified versions of such material be marked in
reasonable ways as different from the original version; or
d) Limiting the use for publicity purposes of names of licensors or
authors of the material; or
e) Declining to grant rights under trademark law for use of some
trade names, trademarks, or service marks; or
f) Requiring indemnification of licensors and authors of that
material by anyone who conveys the material (or modified versions of
it) with contractual assumptions of liability to the recipient, for
any liability that these contractual assumptions directly impose on
those licensors and authors.
All other non-permissive additional terms are considered "further
restrictions" within the meaning of section 10. If the Program as you
received it, or any part of it, contains a notice stating that it is
governed by this License along with a term that is a further
restriction, you may remove that term. If a license document contains
a further restriction but permits relicensing or conveying under this
License, you may add to a covered work material governed by the terms
of that license document, provided that the further restriction does
not survive such relicensing or conveying.
If you add terms to a covered work in accord with this section, you
must place, in the relevant source files, a statement of the
additional terms that apply to those files, or a notice indicating
where to find the applicable terms.
Additional terms, permissive or non-permissive, may be stated in the
form of a separately written license, or stated as exceptions;
the above requirements apply either way.
8. Termination.
You may not propagate or modify a covered work except as expressly
provided under this License. Any attempt otherwise to propagate or
modify it is void, and will automatically terminate your rights under
this License (including any patent licenses granted under the third
paragraph of section 11).
However, if you cease all violation of this License, then your
license from a particular copyright holder is reinstated (a)
provisionally, unless and until the copyright holder explicitly and
finally terminates your license, and (b) permanently, if the copyright
holder fails to notify you of the violation by some reasonable means
prior to 60 days after the cessation.
Moreover, your license from a particular copyright holder is
reinstated permanently if the copyright holder notifies you of the
violation by some reasonable means, this is the first time you have
received notice of violation of this License (for any work) from that
copyright holder, and you cure the violation prior to 30 days after
your receipt of the notice.
Termination of your rights under this section does not terminate the
licenses of parties who have received copies or rights from you under
this License. If your rights have been terminated and not permanently
reinstated, you do not qualify to receive new licenses for the same
material under section 10.
9. Acceptance Not Required for Having Copies.
You are not required to accept this License in order to receive or
run a copy of the Program. Ancillary propagation of a covered work
occurring solely as a consequence of using peer-to-peer transmission
to receive a copy likewise does not require acceptance. However,
nothing other than this License grants you permission to propagate or
modify any covered work. These actions infringe copyright if you do
not accept this License. Therefore, by modifying or propagating a
covered work, you indicate your acceptance of this License to do so.
10. Automatic Licensing of Downstream Recipients.
Each time you convey a covered work, the recipient automatically
receives a license from the original licensors, to run, modify and
propagate that work, subject to this License. You are not responsible
for enforcing compliance by third parties with this License.
An "entity transaction" is a transaction transferring control of an
organization, or substantially all assets of one, or subdividing an
organization, or merging organizations. If propagation of a covered
work results from an entity transaction, each party to that
transaction who receives a copy of the work also receives whatever
licenses to the work the party's predecessor in interest had or could
give under the previous paragraph, plus a right to possession of the
Corresponding Source of the work from the predecessor in interest, if
the predecessor has it or can get it with reasonable efforts.
You may not impose any further restrictions on the exercise of the
rights granted or affirmed under this License. For example, you may
not impose a license fee, royalty, or other charge for exercise of
rights granted under this License, and you may not initiate litigation
(including a cross-claim or counterclaim in a lawsuit) alleging that
any patent claim is infringed by making, using, selling, offering for
sale, or importing the Program or any portion of it.
11. Patents.
A "contributor" is a copyright holder who authorizes use under this
License of the Program or a work on which the Program is based. The
work thus licensed is called the contributor's "contributor version".
A contributor's "essential patent claims" are all patent claims
owned or controlled by the contributor, whether already acquired or
hereafter acquired, that would be infringed by some manner, permitted
by this License, of making, using, or selling its contributor version,
but do not include claims that would be infringed only as a
consequence of further modification of the contributor version. For
purposes of this definition, "control" includes the right to grant
patent sublicenses in a manner consistent with the requirements of
this License.
7. If, as a consequence of a court judgment or allegation of patent
infringement or for any other reason (not limited to patent issues),
conditions are imposed on you (whether by court order, agreement or
Each contributor grants you a non-exclusive, worldwide, royalty-free
patent license under the contributor's essential patent claims, to
make, use, sell, offer for sale, import and otherwise run, modify and
propagate the contents of its contributor version.
In the following three paragraphs, a "patent license" is any express
agreement or commitment, however denominated, not to enforce a patent
(such as an express permission to practice a patent or covenant not to
sue for patent infringement). To "grant" such a patent license to a
party means to make such an agreement or commitment not to enforce a
patent against the party.
If you convey a covered work, knowingly relying on a patent license,
and the Corresponding Source of the work is not available for anyone
to copy, free of charge and under the terms of this License, through a
publicly available network server or other readily accessible means,
then you must either (1) cause the Corresponding Source to be so
available, or (2) arrange to deprive yourself of the benefit of the
patent license for this particular work, or (3) arrange, in a manner
consistent with the requirements of this License, to extend the patent
license to downstream recipients. "Knowingly relying" means you have
actual knowledge that, but for the patent license, your conveying the
covered work in a country, or your recipient's use of the covered work
in a country, would infringe one or more identifiable patents in that
country that you have reason to believe are valid.
If, pursuant to or in connection with a single transaction or
arrangement, you convey, or propagate by procuring conveyance of, a
covered work, and grant a patent license to some of the parties
receiving the covered work authorizing them to use, propagate, modify
or convey a specific copy of the covered work, then the patent license
you grant is automatically extended to all recipients of the covered
work and works based on it.
A patent license is "discriminatory" if it does not include within
the scope of its coverage, prohibits the exercise of, or is
conditioned on the non-exercise of one or more of the rights that are
specifically granted under this License. You may not convey a covered
work if you are a party to an arrangement with a third party that is
in the business of distributing software, under which you make payment
to the third party based on the extent of your activity of conveying
the work, and under which the third party grants, to any of the
parties who would receive the covered work from you, a discriminatory
patent license (a) in connection with copies of the covered work
conveyed by you (or copies made from those copies), or (b) primarily
for and in connection with specific products or compilations that
contain the covered work, unless you entered into that arrangement,
or that patent license was granted, prior to 28 March 2007.
Nothing in this License shall be construed as excluding or limiting
any implied license or other defenses to infringement that may
otherwise be available to you under applicable patent law.
12. No Surrender of Others' Freedom.
If conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License. If you cannot
distribute so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you
may not distribute the Program at all. For example, if a patent
license would not permit royalty-free redistribution of the Program by
all those who receive copies directly or indirectly through you, then
the only way you could satisfy both it and this License would be to
refrain entirely from distribution of the Program.
excuse you from the conditions of this License. If you cannot convey a
covered work so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you may
not convey it at all. For example, if you agree to terms that obligate you
to collect a royalty for further conveying from those to whom you convey
the Program, the only way you could satisfy both those terms and this
License would be to refrain entirely from conveying the Program.
If any portion of this section is held invalid or unenforceable under
any particular circumstance, the balance of the section is intended to
apply and the section as a whole is intended to apply in other
circumstances.
13. Use with the GNU Affero General Public License.
It is not the purpose of this section to induce you to infringe any
patents or other property right claims or to contest validity of any
such claims; this section has the sole purpose of protecting the
integrity of the free software distribution system, which is
implemented by public license practices. Many people have made
generous contributions to the wide range of software distributed
through that system in reliance on consistent application of that
system; it is up to the author/donor to decide if he or she is willing
to distribute software through any other system and a licensee cannot
impose that choice.
Notwithstanding any other provision of this License, you have
permission to link or combine any covered work with a work licensed
under version 3 of the GNU Affero General Public License into a single
combined work, and to convey the resulting work. The terms of this
License will continue to apply to the part which is the covered work,
but the special requirements of the GNU Affero General Public License,
section 13, concerning interaction through a network will apply to the
combination as such.
This section is intended to make thoroughly clear what is believed to
be a consequence of the rest of this License.
14. Revised Versions of this License.
8. If the distribution and/or use of the Program is restricted in
certain countries either by patents or by copyrighted interfaces, the
original copyright holder who places the Program under this License
may add an explicit geographical distribution limitation excluding
those countries, so that distribution is permitted only in or among
countries not thus excluded. In such case, this License incorporates
the limitation as if written in the body of this License.
9. The Free Software Foundation may publish revised and/or new versions
of the General Public License from time to time. Such new versions will
The Free Software Foundation may publish revised and/or new versions of
the GNU General Public License from time to time. Such new versions will
be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.
Each version is given a distinguishing version number. If the Program
specifies a version number of this License which applies to it and "any
later version", you have the option of following the terms and conditions
either of that version or of any later version published by the Free
Software Foundation. If the Program does not specify a version number of
this License, you may choose any version ever published by the Free Software
Foundation.
Each version is given a distinguishing version number. If the
Program specifies that a certain numbered version of the GNU General
Public License "or any later version" applies to it, you have the
option of following the terms and conditions either of that numbered
version or of any later version published by the Free Software
Foundation. If the Program does not specify a version number of the
GNU General Public License, you may choose any version ever published
by the Free Software Foundation.
10. If you wish to incorporate parts of the Program into other free
programs whose distribution conditions are different, write to the author
to ask for permission. For software which is copyrighted by the Free
Software Foundation, write to the Free Software Foundation; we sometimes
make exceptions for this. Our decision will be guided by the two goals
of preserving the free status of all derivatives of our free software and
of promoting the sharing and reuse of software generally.
If the Program specifies that a proxy can decide which future
versions of the GNU General Public License can be used, that proxy's
public statement of acceptance of a version permanently authorizes you
to choose that version for the Program.
NO WARRANTY
Later license versions may give you additional or different
permissions. However, no additional obligations are imposed on any
author or copyright holder as a result of your choosing to follow a
later version.
11. BECAUSE THE PROGRAM IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY
FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN
OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES
PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED
OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS
TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE
PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING,
REPAIR OR CORRECTION.
15. Disclaimer of Warranty.
12. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR
REDISTRIBUTE THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES,
INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING
OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED
TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY
YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER
PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE
POSSIBILITY OF SUCH DAMAGES.
THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
16. Limitation of Liability.
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
SUCH DAMAGES.
17. Interpretation of Sections 15 and 16.
If the disclaimer of warranty and limitation of liability provided
above cannot be given local legal effect according to their terms,
reviewing courts shall apply local law that most closely approximates
an absolute waiver of all civil liability in connection with the
Program, unless a warranty or assumption of liability accompanies a
copy of the Program in return for a fee.
END OF TERMS AND CONDITIONS
@@ -287,15 +628,15 @@ free software which everyone can redistribute and change under these terms.
To do so, attach the following notices to the program. It is safest
to attach them to the start of each source file to most effectively
convey the exclusion of warranty; and each file should have at least
state the exclusion of warranty; and each file should have at least
the "copyright" line and a pointer to where the full notice is found.
{description}
Copyright (C) {year} {fullname}
<one line to give the program's name and a brief idea of what it does.>
Copyright (C) <year> <name of author>
This program is free software; you can redistribute it and/or modify
This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation; either version 2 of the License, or
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
This program is distributed in the hope that it will be useful,
@@ -303,37 +644,31 @@ the "copyright" line and a pointer to where the full notice is found.
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License along
with this program; if not, write to the Free Software Foundation, Inc.,
51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA.
You should have received a copy of the GNU General Public License
along with this program. If not, see <https://www.gnu.org/licenses/>.
Also add information on how to contact you by electronic and paper mail.
If the program is interactive, make it output a short notice like this
when it starts in an interactive mode:
If the program does terminal interaction, make it output a short
notice like this when it starts in an interactive mode:
Gnomovision version 69, Copyright (C) year name of author
Gnomovision comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
<program> Copyright (C) <year> <name of author>
This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
This is free software, and you are welcome to redistribute it
under certain conditions; type `show c' for details.
The hypothetical commands `show w' and `show c' should show the appropriate
parts of the General Public License. Of course, the commands you use may
be called something other than `show w' and `show c'; they could even be
mouse-clicks or menu items--whatever suits your program.
parts of the General Public License. Of course, your program's commands
might be different; for a GUI interface, you would use an "about box".
You should also get your employer (if you work as a programmer) or your
school, if any, to sign a "copyright disclaimer" for the program, if
necessary. Here is a sample; alter the names:
You should also get your employer (if you work as a programmer) or school,
if any, to sign a "copyright disclaimer" for the program, if necessary.
For more information on this, and how to apply and follow the GNU GPL, see
<https://www.gnu.org/licenses/>.
Yoyodyne, Inc., hereby disclaims all copyright interest in the program
`Gnomovision' (which makes passes at compilers) written by James Hacker.
{signature of Ty Coon}, 1 April 1989
Ty Coon, President of Vice
This General Public License does not permit incorporating your program into
proprietary programs. If your program is a subroutine library, you may
consider it more useful to permit linking proprietary applications with the
library. If this is what you want to do, use the GNU Lesser General
Public License instead of this License.
The GNU General Public License does not permit incorporating your program
into proprietary programs. If your program is a subroutine library, you
may consider it more useful to permit linking proprietary applications with
the library. If this is what you want to do, use the GNU Lesser General
Public License instead of this License. But first, please read
<https://www.gnu.org/licenses/why-not-lgpl.html>.
View File
@@ -2,7 +2,7 @@
<h3 align="center">多云环境下更好用的堡垒机</h3>
<p align="center">
<a href="https://www.gnu.org/licenses/old-licenses/gpl-2.0"><img src="https://shields.io/github/license/jumpserver/jumpserver" alt="License: GPL v2"></a>
<a href="https://www.gnu.org/licenses/gpl-3.0.html"><img src="https://img.shields.io/github/license/jumpserver/jumpserver" alt="License: GPLv3"></a>
<a href="https://shields.io/github/downloads/jumpserver/jumpserver/total"><img src="https://shields.io/github/downloads/jumpserver/jumpserver/total" alt=" release"></a>
<a href="https://hub.docker.com/u/jumpserver"><img src="https://img.shields.io/docker/pulls/jumpserver/jms_all.svg" alt="Codacy"></a>
<a href="https://github.com/jumpserver/jumpserver"><img src="https://img.shields.io/github/stars/jumpserver/jumpserver?color=%231890FF&style=flat-square" alt="Stars"></a>
@@ -13,7 +13,7 @@
JumpServer is the world's first open-source bastion host. It is licensed under GNU GPL v2.0 and is a 4A-compliant operations and maintenance security audit system.
JumpServer is the world's first open-source bastion host. It is licensed under GPLv3 and is a 4A-compliant operations and maintenance security audit system.
JumpServer is developed in Python, follows Web 2.0 specifications, and ships with an industry-leading Web Terminal solution that offers a polished interface and a good user experience.
@@ -124,11 +124,11 @@ JumpServer是一款安全产品请参考 [基本安全建议](https://docs.ju
### License & Copyright
Copyright (c) 2014-2021 飞致云 FIT2CLOUD, All rights reserved.
Copyright (c) 2014-2022 飞致云 FIT2CLOUD, All rights reserved.
Licensed under The GNU General Public License version 2 (GPLv2) (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at
Licensed under The GNU General Public License version 3 (GPLv3) (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at
https://www.gnu.org/licenses/gpl-2.0.html
https://www.gnu.org/licenses/gpl-3.0.html
Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
View File
@@ -2,13 +2,13 @@
<h3 align="center">Open Source Bastion Host</h3>
<p align="center">
<a href="https://www.gnu.org/licenses/old-licenses/gpl-2.0"><img src="https://shields.io/github/license/jumpserver/jumpserver" alt="License: GPL v2"></a>
<a href="https://www.gnu.org/licenses/gpl-3.0.html"><img src="https://img.shields.io/github/license/jumpserver/jumpserver" alt="License: GPLv3"></a>
<a href="https://shields.io/github/downloads/jumpserver/jumpserver/total"><img src="https://shields.io/github/downloads/jumpserver/jumpserver/total" alt=" release"></a>
<a href="https://hub.docker.com/u/jumpserver"><img src="https://img.shields.io/docker/pulls/jumpserver/jms_all.svg" alt="Codacy"></a>
<a href="https://github.com/jumpserver/jumpserver"><img src="https://img.shields.io/github/stars/jumpserver/jumpserver?color=%231890FF&style=flat-square" alt="Stars"></a>
</p>
JumpServer is the world's first open-source Bastion Host and is licensed under the GNU GPL v2.0. It is a 4A-compliant professional operation and maintenance security audit system.
JumpServer is the world's first open-source Bastion Host and is licensed under the GPLv3. It is a 4A-compliant professional operation and maintenance security audit system.
JumpServer uses Python / Django for development, follows Web 2.0 specifications, and is equipped with an industry-leading Web Terminal solution that provides a beautiful user interface and great user experience
@@ -85,10 +85,11 @@ If you find a security problem, please contact us directly
- 400-052-0755
### License & Copyright
Copyright (c) 2014-2021 Beijing Duizhan Tech, Inc., All rights reserved.
Copyright (c) 2014-2022 FIT2CLOUD Tech, Inc., All rights reserved.
Licensed under The GNU General Public License version 2 (GPLv2) (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at
Licensed under The GNU General Public License version 3 (GPLv3) (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at
https://www.gnu.org/licenses/gpl-2.0.html
https://www.gnu.org/licenses/gpl-3.0.html
Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.
View File
@@ -7,3 +7,14 @@ JumpServer 是一款正在成长的安全产品, 请参考 [基本安全建议
- ibuler@fit2cloud.com
- support@fit2cloud.com
- 400-052-0755
# Security Policy
JumpServer is a security product. Its installation and development should follow our security tips.
## Reporting a Vulnerability
All security bugs should be reported to the contacts below:
- ibuler@fit2cloud.com
- support@fit2cloud.com
- 400-052-0755
View File
@@ -2,15 +2,16 @@ from common.permissions import IsOrgAdmin, HasQueryParamsUserAndIsCurrentOrgMemb
from common.drf.api import JMSBulkModelViewSet
from ..models import LoginACL
from .. import serializers
from ..filters import LoginAclFilter
__all__ = ['LoginACLViewSet', ]
class LoginACLViewSet(JMSBulkModelViewSet):
queryset = LoginACL.objects.all()
filterset_fields = ('name', 'user', )
search_fields = filterset_fields
permission_classes = (IsOrgAdmin, )
filterset_class = LoginAclFilter
search_fields = ('name',)
permission_classes = (IsOrgAdmin,)
serializer_class = serializers.LoginACLSerializer
def get_permissions(self):
View File
@@ -1,4 +1,3 @@
from orgs.mixins.api import OrgBulkModelViewSet
from common.permissions import IsOrgAdmin
from .. import models, serializers
View File
@@ -1,10 +1,9 @@
from django.shortcuts import get_object_or_404
from rest_framework.response import Response
from rest_framework.generics import CreateAPIView, RetrieveDestroyAPIView
from rest_framework.generics import CreateAPIView
from common.permissions import IsAppUser
from common.utils import reverse, lazyproperty
from orgs.utils import tmp_to_org, tmp_to_root_org
from orgs.utils import tmp_to_org
from tickets.api import GenericTicketStatusRetrieveCloseAPI
from ..models import LoginAssetACL
from .. import serializers
@@ -61,7 +60,8 @@ class LoginAssetCheckAPI(CreateAPIView):
'check_confirm_status': {'method': 'GET', 'url': confirm_status_url},
'close_confirm': {'method': 'DELETE', 'url': confirm_status_url},
'ticket_detail_url': ticket_detail_url,
'reviewers': [str(ticket_assignee.assignee) for ticket_assignee in ticket_assignees]
'reviewers': [str(ticket_assignee.assignee) for ticket_assignee in ticket_assignees],
'ticket_id': str(ticket.id)
}
return data
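For orientation, the payload assembled above has roughly the following shape; every URL and ID in this sketch is a placeholder, only the keys come from the code:
data = {
    'check_confirm_status': {'method': 'GET', 'url': '<confirm-status-url>'},
    'close_confirm': {'method': 'DELETE', 'url': '<confirm-status-url>'},
    'ticket_detail_url': '<ticket-detail-url>',
    'reviewers': ['admin(Administrator)'],
    'ticket_id': '<ticket-uuid>',
}
print(sorted(data))  # keys delivered to the connecting component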
apps/acls/filters.py Normal file
View File
@@ -0,0 +1,15 @@
from django_filters import rest_framework as filters
from common.drf.filters import BaseFilterSet
from acls.models import LoginACL
class LoginAclFilter(BaseFilterSet):
user = filters.UUIDFilter(field_name='user_id')
user_display = filters.CharFilter(field_name='user__name')
class Meta:
model = LoginACL
fields = (
'name', 'user', 'user_display', 'action'
)
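For reference, a minimal sketch of how this filter set is typically exercised from an API client, assuming a reachable JumpServer instance; the host, endpoint path, token, and UUID below are placeholder assumptions, not values taken from this change:
import requests

BASE = 'https://demo.example.com/api/v1/acls/login-acls/'   # placeholder host and path
HEADERS = {'Authorization': 'Bearer <token>'}                # placeholder token

# 'user' maps to user_id via UUIDFilter
r1 = requests.get(BASE, params={'user': '00000000-0000-0000-0000-000000000000'}, headers=HEADERS)
# 'user_display' maps to user__name via CharFilter
r2 = requests.get(BASE, params={'user_display': 'admin'}, headers=HEADERS)
# plain Meta fields such as 'action' still filter directly
r3 = requests.get(BASE, params={'action': 'confirm'}, headers=HEADERS)
print(r1.status_code, r2.status_code, r3.status_code)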
View File
@@ -0,0 +1,98 @@
# Generated by Django 3.1.12 on 2021-09-26 02:47
import django
from django.conf import settings
from django.db import migrations, models, transaction
from acls.models import LoginACL
LOGIN_CONFIRM_ZH = '登录复核'
LOGIN_CONFIRM_EN = 'Login confirm'
DEFAULT_TIME_PERIODS = [{'id': i, 'value': '00:00~00:00'} for i in range(7)]
def has_zh(name: str) -> bool:
for i in name:
if u'\u4e00' <= i <= u'\u9fff':
return True
return False
def migrate_login_confirm(apps, schema_editor):
login_acl_model = apps.get_model("acls", "LoginACL")
login_confirm_model = apps.get_model("authentication", "LoginConfirmSetting")
with transaction.atomic():
for instance in login_confirm_model.objects.filter(is_active=True):
user = instance.user
reviewers = instance.reviewers.all()
login_confirm = LOGIN_CONFIRM_ZH if has_zh(user.name) else LOGIN_CONFIRM_EN
date_created = instance.date_created.strftime('%Y-%m-%d %H:%M:%S')
if reviewers.count() == 0:
continue
data = {
'user': user,
'name': f'{user.name}-{login_confirm} ({date_created})',
'created_by': instance.created_by,
'action': LoginACL.ActionChoices.confirm,
'rules': {'ip_group': ['*'], 'time_period': DEFAULT_TIME_PERIODS}
}
instance = login_acl_model.objects.create(**data)
instance.reviewers.set(reviewers)
def migrate_ip_group(apps, schema_editor):
login_acl_model = apps.get_model("acls", "LoginACL")
updates = list()
with transaction.atomic():
for instance in login_acl_model.objects.exclude(action=LoginACL.ActionChoices.confirm):
instance.rules = {'ip_group': instance.ip_group, 'time_period': DEFAULT_TIME_PERIODS}
updates.append(instance)
login_acl_model.objects.bulk_update(updates, ['rules', ])
class Migration(migrations.Migration):
dependencies = [
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
('acls', '0001_initial'),
('authentication', '0004_ssotoken'),
]
operations = [
migrations.AlterField(
model_name='loginacl',
name='action',
field=models.CharField(choices=[('reject', 'Reject'), ('allow', 'Allow'), ('confirm', 'Login confirm')],
default='reject', max_length=64, verbose_name='Action'),
),
migrations.AddField(
model_name='loginacl',
name='reviewers',
field=models.ManyToManyField(blank=True, related_name='login_confirm_acls',
to=settings.AUTH_USER_MODEL, verbose_name='Reviewers'),
),
migrations.AlterField(
model_name='loginacl',
name='user',
field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE,
related_name='login_acls', to=settings.AUTH_USER_MODEL, verbose_name='User'),
),
migrations.AddField(
model_name='loginacl',
name='rules',
field=models.JSONField(default=dict, verbose_name='Rule'),
),
migrations.RunPython(migrate_login_confirm),
migrations.RunPython(migrate_ip_group),
migrations.RemoveField(
model_name='loginacl',
name='ip_group',
),
migrations.AlterModelOptions(
name='loginacl',
options={'ordering': ('priority', '-date_updated', 'name'), 'verbose_name': 'Login acl'},
),
migrations.AlterModelOptions(
name='loginassetacl',
options={'ordering': ('priority', '-date_updated', 'name'), 'verbose_name': 'Login asset acl'},
),
]
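To make the data this migration writes easier to picture, here is a self-contained sketch of the name format and rules value produced by migrate_login_confirm, reusing the has_zh and DEFAULT_TIME_PERIODS definitions above; the user names and timestamp are illustrative placeholders:
DEFAULT_TIME_PERIODS = [{'id': i, 'value': '00:00~00:00'} for i in range(7)]
LOGIN_CONFIRM_ZH, LOGIN_CONFIRM_EN = '登录复核', 'Login confirm'

def has_zh(name: str) -> bool:
    # same check as above: any CJK unified ideograph in the name
    return any('\u4e00' <= ch <= '\u9fff' for ch in name)

rules = {'ip_group': ['*'], 'time_period': DEFAULT_TIME_PERIODS}
date_created = '2021-09-26 02:47:00'
for user_name in ('张三', 'tom'):
    label = LOGIN_CONFIRM_ZH if has_zh(user_name) else LOGIN_CONFIRM_EN
    print(f'{user_name}-{label} ({date_created})')
# 张三-登录复核 (2021-09-26 02:47:00)
# tom-Login confirm (2021-09-26 02:47:00)
print(rules['time_period'][0])  # {'id': 0, 'value': '00:00~00:00'}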
View File
@@ -1,8 +1,10 @@
from django.db import models
from django.utils.translation import ugettext_lazy as _
from .base import BaseACL, BaseACLQuerySet
from common.utils import get_request_ip, get_ip_city
from common.utils.ip import contains_ip
from common.utils.time_period import contains_time_period
from common.utils.timezone import local_now_display
class ACLManager(models.Manager):
@@ -15,23 +17,29 @@ class LoginACL(BaseACL):
class ActionChoices(models.TextChoices):
reject = 'reject', _('Reject')
allow = 'allow', _('Allow')
confirm = 'confirm', _('Login confirm')
# Conditions
ip_group = models.JSONField(default=list, verbose_name=_('Login IP'))
# User
user = models.ForeignKey(
'users.User', on_delete=models.CASCADE, verbose_name=_('User'),
related_name='login_acls'
)
# Rules
rules = models.JSONField(default=dict, verbose_name=_('Rule'))
# Action
action = models.CharField(
max_length=64, choices=ActionChoices.choices, default=ActionChoices.reject,
verbose_name=_('Action')
max_length=64, verbose_name=_('Action'),
choices=ActionChoices.choices, default=ActionChoices.reject
)
# Related
user = models.ForeignKey(
'users.User', on_delete=models.CASCADE, related_name='login_acls', verbose_name=_('User')
reviewers = models.ManyToManyField(
'users.User', verbose_name=_("Reviewers"),
related_name="login_confirm_acls", blank=True
)
objects = ACLManager.from_queryset(BaseACLQuerySet)()
class Meta:
ordering = ('priority', '-date_updated', 'name')
verbose_name = _('Login acl')
def __str__(self):
return self.name
@@ -44,14 +52,77 @@ class LoginACL(BaseACL):
def action_allow(self):
return self.action == self.ActionChoices.allow
@classmethod
def filter_acl(cls, user):
return user.login_acls.all().valid().distinct()
@staticmethod
def allow_user_confirm_if_need(user, ip):
acl = LoginACL.filter_acl(user).filter(
action=LoginACL.ActionChoices.confirm
).first()
acl = acl if acl and acl.reviewers.exists() else None
if not acl:
return False, acl
ip_group = acl.rules.get('ip_group')
time_periods = acl.rules.get('time_period')
is_contain_ip = contains_ip(ip, ip_group)
is_contain_time_period = contains_time_period(time_periods)
return is_contain_ip and is_contain_time_period, acl
@staticmethod
def allow_user_to_login(user, ip):
acl = user.login_acls.valid().first()
acl = LoginACL.filter_acl(user).exclude(
action=LoginACL.ActionChoices.confirm
).first()
if not acl:
return True
is_contained = contains_ip(ip, acl.ip_group)
if acl.action_allow and is_contained:
return True
if acl.action_reject and not is_contained:
return True
return False
return True, ''
ip_group = acl.rules.get('ip_group')
time_periods = acl.rules.get('time_period')
is_contain_ip = contains_ip(ip, ip_group)
is_contain_time_period = contains_time_period(time_periods)
reject_type = ''
if is_contain_ip and is_contain_time_period:
# Rule conditions are met
allow = acl.action_allow
if not allow:
reject_type = 'ip' if is_contain_ip else 'time'
else:
# Rule conditions are not met
# If the ACL itself allows, reject; if it rejects, allow
allow = not acl.action_allow
if not allow:
reject_type = 'ip' if not is_contain_ip else 'time'
return allow, reject_type
@staticmethod
def construct_confirm_ticket_meta(request=None):
login_ip = get_request_ip(request) if request else ''
login_ip = login_ip or '0.0.0.0'
login_city = get_ip_city(login_ip)
login_datetime = local_now_display()
ticket_meta = {
'apply_login_ip': login_ip,
'apply_login_city': login_city,
'apply_login_datetime': login_datetime,
}
return ticket_meta
def create_confirm_ticket(self, request=None):
from tickets import const
from tickets.models import Ticket
from orgs.models import Organization
ticket_title = _('Login confirm') + ' {}'.format(self.user)
ticket_meta = self.construct_confirm_ticket_meta(request)
data = {
'title': ticket_title,
'type': const.TicketType.login_confirm.value,
'meta': ticket_meta,
'org_id': Organization.ROOT_ID,
}
ticket = Ticket.objects.create(**data)
ticket.create_process_map_and_node(self.reviewers.all())
ticket.open(self.user)
return ticket
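The allow/deny outcome of allow_user_to_login above reduces to a small decision table; the following self-contained sketch mirrors that logic with the IP and time-period checks stubbed out as plain booleans (decide is a name introduced here for illustration):
def decide(action_allow: bool, in_ip_group: bool, in_time_period: bool):
    reject_type = ''
    if in_ip_group and in_time_period:
        # rule matched: follow the ACL's own action
        allow = action_allow
        if not allow:
            reject_type = 'ip' if in_ip_group else 'time'
    else:
        # rule not matched: invert the ACL's own action
        allow = not action_allow
        if not allow:
            reject_type = 'ip' if not in_ip_group else 'time'
    return allow, reject_type

print(decide(action_allow=True, in_ip_group=True, in_time_period=True))    # (True, '')
print(decide(action_allow=False, in_ip_group=True, in_time_period=True))   # (False, 'ip')
print(decide(action_allow=True, in_ip_group=False, in_time_period=True))   # (False, 'ip')
print(decide(action_allow=False, in_ip_group=True, in_time_period=False))  # (True, '')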
View File
@@ -37,6 +37,7 @@ class LoginAssetACL(BaseACL, OrgModelMixin):
class Meta:
unique_together = ('name', 'org_id')
ordering = ('priority', '-date_updated', 'name')
verbose_name = _('Login asset acl')
def __str__(self):
return self.name
View File
@@ -1,59 +1,56 @@
from django.utils.translation import ugettext as _
from rest_framework import serializers
from common.drf.serializers import BulkModelSerializer
from orgs.utils import current_org
from common.drf.serializers import MethodSerializer
from jumpserver.utils import has_valid_xpack_license
from ..models import LoginACL
from common.utils.ip import is_ip_address, is_ip_network, is_ip_segment
from .rules import RuleSerializer
__all__ = ['LoginACLSerializer', ]
def ip_group_child_validator(ip_group_child):
is_valid = ip_group_child == '*' \
or is_ip_address(ip_group_child) \
or is_ip_network(ip_group_child) \
or is_ip_segment(ip_group_child)
if not is_valid:
error = _('IP address invalid: `{}`').format(ip_group_child)
raise serializers.ValidationError(error)
common_help_text = _('Format for comma-delimited string, with * indicating a match all. ')
class LoginACLSerializer(BulkModelSerializer):
ip_group_help_text = _(
'Format for comma-delimited string, with * indicating a match all. '
'Such as: '
'192.168.10.1, 192.168.1.0/24, 10.1.1.1-10.1.1.20, 2001:db8:2de::e13, 2001:db8:1a:1110::/64 '
)
ip_group = serializers.ListField(
default=['*'], label=_('IP'), help_text=ip_group_help_text,
child=serializers.CharField(max_length=1024, validators=[ip_group_child_validator])
)
user_display = serializers.ReadOnlyField(source='user.name', label=_('User'))
user_display = serializers.ReadOnlyField(source='user.username', label=_('Username'))
reviewers_display = serializers.SerializerMethodField(label=_('Reviewers'))
action_display = serializers.ReadOnlyField(source='get_action_display', label=_('Action'))
reviewers_amount = serializers.IntegerField(read_only=True, source='reviewers.count')
rules = MethodSerializer()
class Meta:
model = LoginACL
fields_mini = ['id', 'name']
fields_small = fields_mini + [
'priority', 'ip_group', 'action', 'action_display',
'is_active',
'date_created', 'date_updated',
'comment', 'created_by',
'priority', 'rules', 'action', 'action_display',
'is_active', 'user', 'user_display',
'date_created', 'date_updated', 'reviewers_amount',
'comment', 'created_by'
]
fields_fk = ['user', 'user_display',]
fields = fields_small + fields_fk
fields_fk = ['user', 'user_display']
fields_m2m = ['reviewers', 'reviewers_display']
fields = fields_small + fields_fk + fields_m2m
extra_kwargs = {
'priority': {'default': 50},
'is_active': {'default': True},
"reviewers": {'allow_null': False, 'required': True},
}
@staticmethod
def validate_user(user):
if user not in current_org.get_members():
error = _('The user `{}` is not in the current organization: `{}`').format(
user, current_org
)
raise serializers.ValidationError(error)
return user
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self.set_action_choices()
def set_action_choices(self):
action = self.fields.get('action')
if not action:
return
choices = action._choices
if not has_valid_xpack_license():
choices.pop(LoginACL.ActionChoices.confirm, None)
action._choices = choices
def get_rules_serializer(self):
return RuleSerializer()
def get_reviewers_display(self, obj):
return ','.join([str(user) for user in obj.reviewers.all()])


@@ -0,0 +1 @@
from .rules import *


@@ -0,0 +1,35 @@
# coding: utf-8
#
from rest_framework import serializers
from django.utils.translation import ugettext_lazy as _
from common.utils import get_logger
from common.utils.ip import is_ip_address, is_ip_network, is_ip_segment
logger = get_logger(__file__)
__all__ = ['RuleSerializer', 'ip_group_child_validator', 'ip_group_help_text']
def ip_group_child_validator(ip_group_child):
is_valid = ip_group_child == '*' \
or is_ip_address(ip_group_child) \
or is_ip_network(ip_group_child) \
or is_ip_segment(ip_group_child)
if not is_valid:
error = _('IP address invalid: `{}`').format(ip_group_child)
raise serializers.ValidationError(error)
ip_group_help_text = _(
'Format for comma-delimited string, with * indicating a match all. '
'Such as: '
'192.168.10.1, 192.168.1.0/24, 10.1.1.1-10.1.1.20, 2001:db8:2de::e13, 2001:db8:1a:1110::/64 '
)
class RuleSerializer(serializers.Serializer):
ip_group = serializers.ListField(
default=['*'], label=_('IP'), help_text=ip_group_help_text,
child=serializers.CharField(max_length=1024, validators=[ip_group_child_validator]))
time_period = serializers.ListField(default=[], label=_('Time Period'))
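
ip_group_child_validator accepts '*', a single IP, a CIDR network, or an address range; the real checks come from common.utils.ip. A rough standard-library stand-in, only to illustrate which formats pass (not the project's actual helpers):

import ipaddress

def is_valid_ip_group_child(value: str) -> bool:
    # Simplified stand-in for is_ip_address / is_ip_network / is_ip_segment.
    if value == '*':
        return True
    try:
        ipaddress.ip_address(value)                # single IPv4/IPv6 address
        return True
    except ValueError:
        pass
    try:
        ipaddress.ip_network(value, strict=False)  # CIDR such as 192.168.1.0/24
        return True
    except ValueError:
        pass
    start, sep, end = value.partition('-')         # range such as 10.1.1.1-10.1.1.20
    if sep:
        try:
            ipaddress.ip_address(start.strip())
            ipaddress.ip_address(end.strip())
            return True
        except ValueError:
            return False
    return False

for child in ['*', '192.168.10.1', '192.168.1.0/24', '10.1.1.1-10.1.1.20', '2001:db8:2de::e13', 'bad-value']:
    print(child, is_valid_ip_group_child(child))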


@@ -44,11 +44,7 @@ class ApplicationAccountViewSet(JMSBulkModelViewSet):
permission_classes = (IsOrgAdmin,)
def get_queryset(self):
queryset = Account.objects.all() \
.annotate(type=F('app__type')) \
.annotate(app_display=F('app__name')) \
.annotate(systemuser_display=F('systemuser__name')) \
.annotate(category=F('app__category'))
queryset = Account.get_queryset()
return queryset


@@ -1,12 +1,12 @@
# coding: utf-8
#
from django.shortcuts import get_object_or_404
from orgs.mixins.api import OrgBulkModelViewSet
from rest_framework.decorators import action
from rest_framework.response import Response
from common.tree import TreeNodeSerializer
from common.mixins.views import SuggestionMixin
from common.mixins.api import SuggestionMixin
from ..hands import IsOrgAdminOrAppUser
from .. import serializers
from ..models import Application


@@ -1,13 +1,21 @@
from urllib.parse import urlencode, parse_qsl
from django.utils.translation import ugettext as _
from rest_framework.generics import get_object_or_404
from common.tree import TreeNode
from orgs.models import Organization
from assets.models import SystemUser
from applications.utils import KubernetesClient, KubernetesTree
from perms.utils.application.permission import get_application_system_user_ids
from ..models import Application
__all__ = ['SerializeApplicationToTreeNodeMixin']
class SerializeApplicationToTreeNodeMixin:
@staticmethod
def filter_organizations(applications):
organization_ids = set(applications.values_list('org_id', flat=True))
@@ -31,23 +39,47 @@ class SerializeApplicationToTreeNodeMixin:
})
return node
def serialize_applications_with_org(self, applications):
root_node = self.create_root_node()
tree_nodes = [root_node]
organizations = self.filter_organizations(applications)
def serialize_applications_with_org(self, applications, tree_id, parent_info, user):
tree_nodes = []
if not applications:
return tree_nodes
for i, org in enumerate(organizations):
# Organization node
org_node = org.as_tree_node(pid=root_node.id)
tree_nodes.append(org_node)
org_applications = applications.filter(org_id=org.id)
count = org_applications.count()
org_node.name += '({})'.format(count)
if not tree_id:
root_node = self.create_root_node()
tree_nodes.append(root_node)
organizations = self.filter_organizations(applications)
for i, org in enumerate(organizations):
tree_id = urlencode({'org_id': str(org.id)})
apps = applications.filter(org_id=org.id)
# Organization node
org_node = org.as_tree_node(oid=tree_id, pid=root_node.id)
org_node.name += '({})'.format(apps.count())
tree_nodes.append(org_node)
category_type_nodes = Application.create_category_type_tree_nodes(
apps, tree_id, show_empty=False
)
tree_nodes += category_type_nodes
# Individual application nodes
apps_nodes = Application.create_tree_nodes(
queryset=org_applications, root_node=org_node,
show_empty=False
)
tree_nodes += apps_nodes
for app in apps:
app_node = app.as_tree_node(tree_id, is_luna=True)
tree_nodes.append(app_node)
return tree_nodes
parent_info = dict(parse_qsl(parent_info))
pod_name = parent_info.get('pod')
app_id = parent_info.get('app_id')
namespace = parent_info.get('namespace')
system_user_id = parent_info.get('system_user_id')
if app_id and not any([pod_name, namespace, system_user_id]):
app = get_object_or_404(Application, id=app_id)
system_user_ids = get_application_system_user_ids(user, app)
system_users = SystemUser.objects.filter(id__in=system_user_ids).order_by('priority')
for system_user in system_users:
system_user_node = KubernetesTree(tree_id).as_system_user_tree_node(
system_user, parent_info
)
tree_nodes.append(system_user_node)
return tree_nodes
tree_nodes = KubernetesTree(tree_id).async_tree_node(parent_info)
return tree_nodes


@@ -17,9 +17,11 @@ class AppCategory(TextChoices):
class AppType(TextChoices):
# db category
mysql = 'mysql', 'MySQL'
redis = 'redis', 'Redis'
oracle = 'oracle', 'Oracle'
pgsql = 'postgresql', 'PostgreSQL'
mariadb = 'mariadb', 'MariaDB'
sqlserver = 'sqlserver', 'SQLServer'
# remote-app category
chrome = 'chrome', 'Chrome'
@@ -33,7 +35,9 @@ class AppType(TextChoices):
@classmethod
def category_types_mapper(cls):
return {
AppCategory.db: [cls.mysql, cls.oracle, cls.pgsql, cls.mariadb],
AppCategory.db: [
cls.mysql, cls.oracle, cls.redis, cls.pgsql, cls.mariadb, cls.sqlserver
],
AppCategory.remote_app: [cls.chrome, cls.mysql_workbench, cls.vmware_client, cls.custom],
AppCategory.cloud: [cls.k8s]
}
@@ -61,7 +65,3 @@ class AppType(TextChoices):
@classmethod
def cloud_types(cls):
return [tp.value for tp in cls.category_types_mapper()[AppCategory.cloud]]


@@ -0,0 +1,23 @@
# Generated by Django 3.1.13 on 2021-10-14 14:09
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('applications', '0011_auto_20210826_1759'),
]
operations = [
migrations.AlterField(
model_name='account',
name='username',
field=models.CharField(blank=True, db_index=True, max_length=128, verbose_name='Username'),
),
migrations.AlterField(
model_name='historicalaccount',
name='username',
field=models.CharField(blank=True, db_index=True, max_length=128, verbose_name='Username'),
),
]


@@ -0,0 +1,17 @@
# Generated by Django 3.1.13 on 2021-10-26 09:11
from django.db import migrations
class Migration(migrations.Migration):
dependencies = [
('applications', '0012_auto_20211014_2209'),
]
operations = [
migrations.AlterModelOptions(
name='application',
options={'ordering': ('name',), 'verbose_name': 'Application'},
),
]


@@ -0,0 +1,18 @@
# Generated by Django 3.1.12 on 2021-11-05 08:05
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('applications', '0013_auto_20211026_1711'),
]
operations = [
migrations.AlterField(
model_name='application',
name='type',
field=models.CharField(choices=[('mysql', 'MySQL'), ('oracle', 'Oracle'), ('postgresql', 'PostgreSQL'), ('mariadb', 'MariaDB'), ('sqlserver', 'SQLServer'), ('chrome', 'Chrome'), ('mysql_workbench', 'MySQL Workbench'), ('vmware_client', 'vSphere Client'), ('custom', 'Custom'), ('k8s', 'Kubernetes')], max_length=16, verbose_name='Type'),
),
]


@@ -0,0 +1,18 @@
# Generated by Django 3.1.13 on 2022-01-12 12:35
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('applications', '0014_auto_20211105_1605'),
]
operations = [
migrations.AlterField(
model_name='application',
name='type',
field=models.CharField(choices=[('mysql', 'MySQL'), ('redis', 'Redis'), ('oracle', 'Oracle'), ('postgresql', 'PostgreSQL'), ('mariadb', 'MariaDB'), ('sqlserver', 'SQLServer'), ('chrome', 'Chrome'), ('mysql_workbench', 'MySQL Workbench'), ('vmware_client', 'vSphere Client'), ('custom', 'Custom'), ('k8s', 'Kubernetes')], max_length=16, verbose_name='Type'),
),
]


@@ -0,0 +1,24 @@
# Generated by Django 3.1.13 on 2022-01-18 06:55
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('applications', '0015_auto_20220112_2035'),
]
operations = [
migrations.AlterField(
model_name='account',
name='app',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.CASCADE, to='applications.application', verbose_name='Application'),
),
migrations.AlterField(
model_name='historicalaccount',
name='app',
field=models.ForeignKey(blank=True, db_constraint=False, null=True, on_delete=django.db.models.deletion.DO_NOTHING, related_name='+', to='applications.application', verbose_name='Application'),
),
]


@@ -1,5 +1,6 @@
from django.db import models
from simple_history.models import HistoricalRecords
from django.db.models import F
from django.utils.translation import ugettext_lazy as _
from common.utils import lazyproperty
@@ -7,8 +8,12 @@ from assets.models.base import BaseUser
class Account(BaseUser):
app = models.ForeignKey('applications.Application', on_delete=models.CASCADE, null=True, verbose_name=_('Database'))
systemuser = models.ForeignKey('assets.SystemUser', on_delete=models.CASCADE, null=True, verbose_name=_("System user"))
app = models.ForeignKey(
'applications.Application', on_delete=models.CASCADE, null=True, verbose_name=_('Application')
)
systemuser = models.ForeignKey(
'assets.SystemUser', on_delete=models.CASCADE, null=True, verbose_name=_("System user")
)
version = models.IntegerField(default=1, verbose_name=_('Version'))
history = HistoricalRecords()
@@ -60,6 +65,10 @@ class Account(BaseUser):
def type(self):
return self.app.type
@lazyproperty
def attrs(self):
return self.app.attrs
@lazyproperty
def app_display(self):
return self.systemuser.name
@@ -84,5 +93,14 @@ class Account(BaseUser):
app = '*'
return '{}@{}'.format(username, app)
@classmethod
def get_queryset(cls):
queryset = cls.objects.all() \
.annotate(type=F('app__type')) \
.annotate(app_display=F('app__name')) \
.annotate(systemuser_display=F('systemuser__name')) \
.annotate(category=F('app__category'))
return queryset
def __str__(self):
return self.smart_name
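
Account.get_queryset now centralizes the F() annotations, so callers can read type, category, app_display and systemuser_display straight off each row. A hedged sketch of a caller; it assumes a configured Django project with this applications app installed and importable as shown:

# Sketch only: requires Django settings and the applications app from this repo.
from applications.models import Account

accounts = Account.get_queryset().filter(app__isnull=False)
for account in accounts[:5]:
    # type / category / app_display come from the annotations,
    # not from columns on the Account table itself.
    print(account.username, account.app_display, account.type, account.category)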


@@ -1,4 +1,5 @@
from collections import defaultdict
from urllib.parse import urlencode, parse_qsl
from django.db import models
from django.utils.translation import ugettext_lazy as _
@@ -7,6 +8,8 @@ from orgs.mixins.models import OrgModelMixin
from common.mixins import CommonModelMixin
from common.tree import TreeNode
from assets.models import Asset, SystemUser
from ..utils import KubernetesTree
from .. import const
@@ -16,6 +19,13 @@ class ApplicationTreeNodeMixin:
type: str
category: str
@staticmethod
def create_tree_id(pid, type, v):
i = dict(parse_qsl(pid))
i[type] = v
tree_id = urlencode(i)
return tree_id
@classmethod
def create_choice_node(cls, c, id_, pid, tp, opened=False, counts=None,
show_empty=True, show_count=True):
@@ -65,13 +75,13 @@ class ApplicationTreeNodeMixin:
return node
@classmethod
def create_category_tree_nodes(cls, root_node, counts=None, show_empty=True, show_count=True):
def create_category_tree_nodes(cls, pid, counts=None, show_empty=True, show_count=True):
nodes = []
categories = const.AppType.category_types_mapper().keys()
for category in categories:
i = root_node.id + '_' + category.value
i = cls.create_tree_id(pid, 'category', category.value)
node = cls.create_choice_node(
category, i, pid=root_node.id, tp='category',
category, i, pid=pid, tp='category',
counts=counts, opened=False, show_empty=show_empty,
show_count=show_count
)
@@ -81,17 +91,20 @@ class ApplicationTreeNodeMixin:
return nodes
@classmethod
def create_types_tree_nodes(cls, root_node, counts, show_empty=True, show_count=True):
def create_types_tree_nodes(cls, pid, counts, show_empty=True, show_count=True):
nodes = []
temp_pid = pid
type_category_mapper = const.AppType.type_category_mapper()
for tp in const.AppType.type_category_mapper().keys():
types = const.AppType.type_category_mapper().keys()
for tp in types:
category = type_category_mapper.get(tp)
pid = root_node.id + '_' + category.value
i = root_node.id + '_' + tp.value
pid = cls.create_tree_id(pid, 'category', category.value)
i = cls.create_tree_id(pid, 'type', tp.value)
node = cls.create_choice_node(
tp, i, pid, tp='type', counts=counts, opened=False,
show_empty=show_empty, show_count=show_count
)
pid = temp_pid
if not node:
continue
nodes.append(node)
@@ -109,40 +122,63 @@ class ApplicationTreeNodeMixin:
return counts
@classmethod
def create_tree_nodes(cls, queryset, root_node=None, show_empty=True, show_count=True):
def create_category_type_tree_nodes(cls, queryset, pid, show_empty=True, show_count=True):
counts = cls.get_tree_node_counts(queryset)
tree_nodes = []
# Category nodes
tree_nodes += cls.create_category_tree_nodes(
pid, counts, show_empty=show_empty,
show_count=show_count
)
# Type nodes
tree_nodes += cls.create_types_tree_nodes(
pid, counts, show_empty=show_empty,
show_count=show_count
)
return tree_nodes
@classmethod
def create_tree_nodes(cls, queryset, root_node=None, show_empty=True, show_count=True):
tree_nodes = []
# The root node may be the organization name
if root_node is None:
root_node = cls.create_root_tree_node(queryset, show_count=show_count)
tree_nodes.append(root_node)
# Category nodes
tree_nodes += cls.create_category_tree_nodes(
root_node, counts, show_empty=show_empty,
show_count=show_count
)
# Type nodes
tree_nodes += cls.create_types_tree_nodes(
root_node, counts, show_empty=show_empty,
show_count=show_count
tree_nodes += cls.create_category_type_tree_nodes(
queryset, root_node.id, show_empty=show_empty, show_count=show_count
)
# Application nodes
for app in queryset:
pid = root_node.id + '_' + app.type
tree_nodes.append(app.as_tree_node(pid))
node = app.as_tree_node(root_node.id)
tree_nodes.append(node)
return tree_nodes
def as_tree_node(self, pid):
def create_app_tree_pid(self, root_id):
pid = self.create_tree_id(root_id, 'category', self.category)
pid = self.create_tree_id(pid, 'type', self.type)
return pid
def as_tree_node(self, pid, is_luna=False):
if is_luna and self.type == const.AppType.k8s:
node = KubernetesTree(pid).as_tree_node(self)
else:
node = self._as_tree_node(pid)
return node
def _as_tree_node(self, pid):
icon_skin_category_mapper = {
'remote_app': 'chrome',
'db': 'database',
'cloud': 'cloud'
}
icon_skin = icon_skin_category_mapper.get(self.category, 'file')
pid = self.create_app_tree_pid(pid)
node = TreeNode(**{
'id': str(self.id),
'name': self.name,
@@ -180,6 +216,7 @@ class Application(CommonModelMixin, OrgModelMixin, ApplicationTreeNodeMixin):
)
class Meta:
verbose_name = _('Application')
unique_together = [('org_id', 'name')]
ordering = ('name',)
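
create_tree_id threads the parent id through parse_qsl / urlencode, so every node id is a query string that accumulates org_id, category and type keys (create_app_tree_pid simply chains two such calls). A standalone illustration with a made-up org id:

from urllib.parse import urlencode, parse_qsl

def create_tree_id(pid: str, key: str, value: str) -> str:
    # Same idea as ApplicationTreeNodeMixin.create_tree_id: ids are query strings.
    info = dict(parse_qsl(pid))
    info[key] = value
    return urlencode(info)

root_id = urlencode({'org_id': '11111111-2222-3333-4444-555555555555'})
category_id = create_tree_id(root_id, 'category', 'db')
type_id = create_tree_id(category_id, 'type', 'mysql')
print(type_id)  # org_id=11111111-2222-3333-4444-555555555555&category=db&type=mysql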


@@ -1,6 +1,5 @@
# coding: utf-8
#
from rest_framework import serializers
from django.utils.translation import ugettext_lazy as _
@@ -9,7 +8,8 @@ from assets.serializers.base import AuthSerializerMixin
from common.drf.serializers import MethodSerializer
from .attrs import (
category_serializer_classes_mapping,
type_serializer_classes_mapping
type_serializer_classes_mapping,
type_secret_serializer_classes_mapping
)
from .. import models
from .. import const
@@ -23,17 +23,28 @@ __all__ = [
class AppSerializerMixin(serializers.Serializer):
attrs = MethodSerializer()
@property
def app(self):
if isinstance(self.instance, models.Application):
instance = self.instance
else:
instance = None
return instance
def get_attrs_serializer(self):
default_serializer = serializers.Serializer(read_only=True)
if isinstance(self.instance, models.Application):
_type = self.instance.type
_category = self.instance.category
instance = self.app
if instance:
_type = instance.type
_category = instance.category
else:
_type = self.context['request'].query_params.get('type')
_category = self.context['request'].query_params.get('category')
if _type:
serializer_class = type_serializer_classes_mapping.get(_type)
if isinstance(self, AppAccountSecretSerializer):
serializer_class = type_secret_serializer_classes_mapping.get(_type)
else:
serializer_class = type_serializer_classes_mapping.get(_type)
elif _category:
serializer_class = category_serializer_classes_mapping.get(_category)
else:
@@ -84,11 +95,13 @@ class MiniAppSerializer(serializers.ModelSerializer):
fields = AppSerializer.Meta.fields_mini
class AppAccountSerializer(AuthSerializerMixin, BulkOrgResourceModelSerializer):
class AppAccountSerializer(AppSerializerMixin, AuthSerializerMixin, BulkOrgResourceModelSerializer):
category = serializers.ChoiceField(label=_('Category'), choices=const.AppCategory.choices, read_only=True)
category_display = serializers.SerializerMethodField(label=_('Category display'))
type = serializers.ChoiceField(label=_('Type'), choices=const.AppType.choices, read_only=True)
type_display = serializers.SerializerMethodField(label=_('Type display'))
date_created = serializers.DateTimeField(label=_('Date created'), format="%Y/%m/%d %H:%M:%S", read_only=True)
date_updated = serializers.DateTimeField(label=_('Date updated'), format="%Y/%m/%d %H:%M:%S", read_only=True)
category_mapper = dict(const.AppCategory.choices)
type_mapper = dict(const.AppType.choices)
@@ -96,21 +109,31 @@ class AppAccountSerializer(AuthSerializerMixin, BulkOrgResourceModelSerializer):
class Meta:
model = models.Account
fields_mini = ['id', 'username', 'version']
fields_write_only = ['password', 'private_key']
fields_write_only = ['password', 'private_key', 'public_key', 'passphrase']
fields_other = ['date_created', 'date_updated']
fields_fk = ['systemuser', 'systemuser_display', 'app', 'app_display']
fields = fields_mini + fields_fk + fields_write_only + [
'type', 'type_display', 'category', 'category_display',
fields = fields_mini + fields_fk + fields_write_only + fields_other + [
'type', 'type_display', 'category', 'category_display', 'attrs'
]
extra_kwargs = {
'username': {'default': '', 'required': False},
'password': {'write_only': True},
'app_display': {'label': _('Application display')}
'app_display': {'label': _('Application display')},
'systemuser_display': {'label': _('System User')}
}
use_model_bulk_create = True
model_bulk_create_kwargs = {
'ignore_conflicts': True
}
@property
def app(self):
if isinstance(self.instance, models.Account):
instance = self.instance.app
else:
instance = None
return instance
def get_category_display(self, obj):
return self.category_mapper.get(obj.category)
@@ -130,8 +153,15 @@ class AppAccountSerializer(AuthSerializerMixin, BulkOrgResourceModelSerializer):
class AppAccountSecretSerializer(AppAccountSerializer):
class Meta(AppAccountSerializer.Meta):
fields_backup = [
'id', 'app_display', 'attrs', 'username', 'password', 'private_key',
'public_key', 'date_created', 'date_updated', 'version'
]
extra_kwargs = {
'password': {'write_only': False},
'private_key': {'write_only': False},
'public_key': {'write_only': False},
'app_display': {'label': _('Application display')},
'systemuser_display': {'label': _('System User')}
}


@@ -1,7 +1,6 @@
from rest_framework import serializers
from django.utils.translation import ugettext_lazy as _
__all__ = ['CloudSerializer']


@@ -10,7 +10,6 @@ from assets.models import Asset
logger = get_logger(__file__)
__all__ = ['RemoteAppSerializer']


@@ -1,8 +1,10 @@
from .mysql import *
from .redis import *
from .mariadb import *
from .oracle import *
from .pgsql import *
from .sqlserver import *
from .chrome import *
from .mysql_workbench import *


@@ -3,8 +3,7 @@ from rest_framework import serializers
from ..application_category import RemoteAppSerializer
__all__ = ['ChromeSerializer']
__all__ = ['ChromeSerializer', 'ChromeSecretSerializer']
class ChromeSerializer(RemoteAppSerializer):
@@ -17,10 +16,16 @@ class ChromeSerializer(RemoteAppSerializer):
max_length=128, allow_blank=True, required=False, label=_('Target URL'), allow_null=True,
)
chrome_username = serializers.CharField(
max_length=128, allow_blank=True, required=False, label=_('Username'), allow_null=True,
max_length=128, allow_blank=True, required=False, label=_('Chrome username'), allow_null=True,
)
chrome_password = serializers.CharField(
max_length=128, allow_blank=True, required=False, write_only=True, label=_('Password'),
max_length=128, allow_blank=True, required=False, write_only=True, label=_('Chrome password'),
allow_null=True
)
class ChromeSecretSerializer(ChromeSerializer):
chrome_password = serializers.CharField(
max_length=128, allow_blank=True, required=False, read_only=True, label=_('Chrome password'),
allow_null=True
)


@@ -1,11 +1,9 @@
from django.utils.translation import ugettext_lazy as _
from rest_framework import serializers
from ..application_category import RemoteAppSerializer
__all__ = ['CustomSerializer']
__all__ = ['CustomSerializer', 'CustomSecretSerializer']
class CustomSerializer(RemoteAppSerializer):
@@ -18,10 +16,17 @@ class CustomSerializer(RemoteAppSerializer):
allow_null=True,
)
custom_username = serializers.CharField(
max_length=128, allow_blank=True, required=False, label=_('Username'),
max_length=128, allow_blank=True, required=False, label=_('Custom Username'),
allow_null=True,
)
custom_password = serializers.CharField(
max_length=128, allow_blank=True, required=False, write_only=True, label=_('Password'),
max_length=128, allow_blank=True, required=False, write_only=True, label=_('Custom password'),
allow_null=True,
)
class CustomSecretSerializer(RemoteAppSerializer):
custom_password = serializers.CharField(
max_length=128, allow_blank=True, required=False, read_only=True, label=_('Custom password'),
allow_null=True,
)


@@ -1,6 +1,5 @@
from ..application_category import CloudSerializer
__all__ = ['K8SSerializer']


@@ -1,6 +1,5 @@
from .mysql import MySQLSerializer
__all__ = ['MariaDBSerializer']


@@ -3,13 +3,9 @@ from django.utils.translation import ugettext_lazy as _
from ..application_category import DBSerializer
__all__ = ['MySQLSerializer']
class MySQLSerializer(DBSerializer):
port = serializers.IntegerField(default=3306, label=_('Port'), allow_null=True)


@@ -3,8 +3,7 @@ from rest_framework import serializers
from ..application_category import RemoteAppSerializer
__all__ = ['MySQLWorkbenchSerializer']
__all__ = ['MySQLWorkbenchSerializer', 'MySQLWorkbenchSecretSerializer']
class MySQLWorkbenchSerializer(RemoteAppSerializer):
@@ -27,10 +26,17 @@ class MySQLWorkbenchSerializer(RemoteAppSerializer):
allow_null=True,
)
mysql_workbench_username = serializers.CharField(
max_length=128, allow_blank=True, required=False, label=_('Username'),
max_length=128, allow_blank=True, required=False, label=_('Mysql workbench username'),
allow_null=True,
)
mysql_workbench_password = serializers.CharField(
max_length=128, allow_blank=True, required=False, write_only=True, label=_('Password'),
max_length=128, allow_blank=True, required=False, write_only=True, label=_('Mysql workbench password'),
allow_null=True,
)
class MySQLWorkbenchSecretSerializer(RemoteAppSerializer):
mysql_workbench_password = serializers.CharField(
max_length=128, allow_blank=True, required=False, read_only=True, label=_('Mysql workbench password'),
allow_null=True,
)


@@ -3,10 +3,8 @@ from django.utils.translation import ugettext_lazy as _
from ..application_category import DBSerializer
__all__ = ['OracleSerializer']
class OracleSerializer(DBSerializer):
port = serializers.IntegerField(default=1521, label=_('Port'), allow_null=True)


@@ -3,10 +3,8 @@ from django.utils.translation import ugettext_lazy as _
from ..application_category import DBSerializer
__all__ = ['PostgreSerializer']
class PostgreSerializer(DBSerializer):
port = serializers.IntegerField(default=5432, label=_('Port'), allow_null=True)


@@ -0,0 +1,11 @@
from rest_framework import serializers
from django.utils.translation import ugettext_lazy as _
from ..application_category import DBSerializer
__all__ = ['RedisSerializer']
class RedisSerializer(DBSerializer):
port = serializers.IntegerField(default=6379, label=_('Port'), allow_null=True)


@@ -0,0 +1,11 @@
from rest_framework import serializers
from django.utils.translation import ugettext_lazy as _
from ..application_category import DBSerializer
__all__ = ['SQLServerSerializer']
class SQLServerSerializer(DBSerializer):
port = serializers.IntegerField(default=1433, label=_('Port'), allow_null=True)


@@ -3,8 +3,7 @@ from rest_framework import serializers
from ..application_category import RemoteAppSerializer
__all__ = ['VMwareClientSerializer']
__all__ = ['VMwareClientSerializer', 'VMwareClientSecretSerializer']
class VMwareClientSerializer(RemoteAppSerializer):
@@ -23,10 +22,17 @@ class VMwareClientSerializer(RemoteAppSerializer):
allow_null=True
)
vmware_username = serializers.CharField(
max_length=128, allow_blank=True, required=False, label=_('Username'),
max_length=128, allow_blank=True, required=False, label=_('Vmware username'),
allow_null=True
)
vmware_password = serializers.CharField(
max_length=128, allow_blank=True, required=False, write_only=True, label=_('Password'),
max_length=128, allow_blank=True, required=False, write_only=True, label=_('Vmware password'),
allow_null=True
)
class VMwareClientSecretSerializer(RemoteAppSerializer):
vmware_password = serializers.CharField(
max_length=128, allow_blank=True, required=False, read_only=True, label=_('Vmware password'),
allow_null=True
)


@@ -1,15 +1,15 @@
from rest_framework import serializers
import copy
from applications import const
from . import application_category, application_type
__all__ = [
'category_serializer_classes_mapping',
'type_serializer_classes_mapping',
'get_serializer_class_by_application_type',
'type_secret_serializer_classes_mapping'
]
# define `attrs` field `category serializers mapping`
# ---------------------------------------------------
@@ -25,18 +25,37 @@ category_serializer_classes_mapping = {
type_serializer_classes_mapping = {
# db
const.AppType.mysql.value: application_type.MySQLSerializer,
const.AppType.redis.value: application_type.RedisSerializer,
const.AppType.mariadb.value: application_type.MariaDBSerializer,
const.AppType.oracle.value: application_type.OracleSerializer,
const.AppType.pgsql.value: application_type.PostgreSerializer,
# remote-app
const.AppType.chrome.value: application_type.ChromeSerializer,
const.AppType.mysql_workbench.value: application_type.MySQLWorkbenchSerializer,
const.AppType.vmware_client.value: application_type.VMwareClientSerializer,
const.AppType.custom.value: application_type.CustomSerializer,
const.AppType.sqlserver.value: application_type.SQLServerSerializer,
# cloud
const.AppType.k8s.value: application_type.K8SSerializer
}
remote_app_serializer_classes_mapping = {
# remote-app
const.AppType.chrome.value: application_type.ChromeSerializer,
const.AppType.mysql_workbench.value: application_type.MySQLWorkbenchSerializer,
const.AppType.vmware_client.value: application_type.VMwareClientSerializer,
const.AppType.custom.value: application_type.CustomSerializer
}
type_serializer_classes_mapping.update(remote_app_serializer_classes_mapping)
remote_app_secret_serializer_classes_mapping = {
# remote-app
const.AppType.chrome.value: application_type.ChromeSecretSerializer,
const.AppType.mysql_workbench.value: application_type.MySQLWorkbenchSecretSerializer,
const.AppType.vmware_client.value: application_type.VMwareClientSecretSerializer,
const.AppType.custom.value: application_type.CustomSecretSerializer
}
type_secret_serializer_classes_mapping = copy.deepcopy(type_serializer_classes_mapping)
type_secret_serializer_classes_mapping.update(remote_app_secret_serializer_classes_mapping)
def get_serializer_class_by_application_type(_application_type):
return type_serializer_classes_mapping.get(_application_type)


@@ -0,0 +1,4 @@
# -*- coding: utf-8 -*-
#
from .kubernetes_util import *


@@ -0,0 +1,186 @@
# -*- coding: utf-8 -*-
from urllib3.exceptions import MaxRetryError
from urllib.parse import urlencode
from kubernetes.client import api_client
from kubernetes.client.api import core_v1_api
from kubernetes import client
from kubernetes.client.exceptions import ApiException
from rest_framework.generics import get_object_or_404
from common.utils import get_logger
from common.tree import TreeNode
from assets.models import SystemUser
from .. import const
logger = get_logger(__file__)
class KubernetesClient:
def __init__(self, url, token):
self.url = url
self.token = token
def get_api(self):
configuration = client.Configuration()
configuration.host = self.url
configuration.verify_ssl = False
configuration.api_key = {"authorization": "Bearer " + self.token}
c = api_client.ApiClient(configuration=configuration)
api = core_v1_api.CoreV1Api(c)
return api
def get_namespace_list(self):
api = self.get_api()
namespace_list = []
for ns in api.list_namespace().items:
namespace_list.append(ns.metadata.name)
return namespace_list
def get_services(self):
api = self.get_api()
ret = api.list_service_for_all_namespaces(watch=False)
for i in ret.items:
print("%s \t%s \t%s \t%s \t%s \n" % (
i.kind, i.metadata.namespace, i.metadata.name, i.spec.cluster_ip, i.spec.ports))
def get_pod_info(self, namespace, pod):
api = self.get_api()
resp = api.read_namespaced_pod(namespace=namespace, name=pod)
return resp
def get_pod_logs(self, namespace, pod):
api = self.get_api()
log_content = api.read_namespaced_pod_log(pod, namespace, pretty=True, tail_lines=200)
return log_content
def get_pods(self):
api = self.get_api()
try:
ret = api.list_pod_for_all_namespaces(watch=False, _request_timeout=(3, 3))
except MaxRetryError:
logger.warning('Kubernetes connection timed out')
return
except ApiException as e:
if e.status == 401:
logger.warning('Kubernetes User not authenticated')
else:
logger.warning(e)
return
data = {}
for i in ret.items:
namespace = i.metadata.namespace
pod_info = {
'pod_name': i.metadata.name,
'containers': [j.name for j in i.spec.containers]
}
if namespace in data:
data[namespace].append(pod_info)
else:
data[namespace] = [pod_info, ]
return data
@staticmethod
def get_kubernetes_data(app_id, system_user_id):
from ..models import Application
app = get_object_or_404(Application, id=app_id)
system_user = get_object_or_404(SystemUser, id=system_user_id)
k8s = KubernetesClient(app.attrs['cluster'], system_user.token)
return k8s.get_pods()
class KubernetesTree:
def __init__(self, tree_id):
self.tree_id = tree_id
def as_tree_node(self, app):
pid = app.create_app_tree_pid(self.tree_id)
app_id = str(app.id)
parent_info = {'app_id': app_id}
node = self.create_tree_node(
app_id, pid, app.name, 'k8s', parent_info
)
return node
def as_system_user_tree_node(self, system_user, parent_info):
from ..models import ApplicationTreeNodeMixin
system_user_id = str(system_user.id)
username = system_user.username
username = username if username else '*'
name = f'{system_user.name}({username})'
pid = urlencode({'app_id': self.tree_id})
i = ApplicationTreeNodeMixin.create_tree_id(pid, 'system_user_id', system_user_id)
parent_info.update({'system_user_id': system_user_id})
node = self.create_tree_node(
i, pid, name, 'system_user', parent_info, icon='user-tie'
)
return node
def as_namespace_pod_tree_node(self, name, meta, type, counts=0, is_container=False):
from ..models import ApplicationTreeNodeMixin
i = ApplicationTreeNodeMixin.create_tree_id(self.tree_id, type, name)
meta.update({type: name})
name = name if is_container else f'{name}({counts})'
node = self.create_tree_node(
i, self.tree_id, name, type, meta, icon='cloud', is_container=is_container
)
return node
@staticmethod
def create_tree_node(id_, pid, name, identity, parent_info, icon='', is_container=False):
node = TreeNode(**{
'id': id_,
'name': name,
'title': name,
'pId': pid,
'isParent': not is_container,
'open': False,
'iconSkin': icon,
'parentInfo': urlencode(parent_info),
'meta': {
'type': 'application',
'data': {
'category': const.AppCategory.cloud,
'type': const.AppType.k8s,
'identity': identity
}
}
})
return node
def async_tree_node(self, parent_info):
pod_name = parent_info.get('pod')
app_id = parent_info.get('app_id')
namespace = parent_info.get('namespace')
system_user_id = parent_info.get('system_user_id')
tree_nodes = []
data = KubernetesClient.get_kubernetes_data(app_id, system_user_id)
if not data:
return tree_nodes
if pod_name:
for container in next(
filter(
lambda x: x['pod_name'] == pod_name, data[namespace]
)
)['containers']:
container_node = self.as_namespace_pod_tree_node(
container, parent_info, 'container', is_container=True
)
tree_nodes.append(container_node)
elif namespace:
for pod in data[namespace]:
pod_nodes = self.as_namespace_pod_tree_node(
pod['pod_name'], parent_info, 'pod', len(pod['containers'])
)
tree_nodes.append(pod_nodes)
elif system_user_id:
for namespace, pods in data.items():
namespace_node = self.as_namespace_pod_tree_node(
namespace, parent_info, 'namespace', len(pods)
)
tree_nodes.append(namespace_node)
return tree_nodes
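
KubernetesClient.get_pods returns a namespace -> pod-info mapping (or nothing on timeout / 401), which KubernetesTree.async_tree_node then expands into namespace, pod and container nodes. A hedged usage sketch; the cluster URL and token are placeholders and the kubernetes package must be installed:

from applications.utils import KubernetesClient

client = KubernetesClient('https://k8s.example.com:6443', '<service-account-token>')
data = client.get_pods() or {}  # None on timeout / auth failure is normalized to an empty dict
for namespace, pods in data.items():
    print(namespace, f'{len(pods)} pods')
    for pod in pods:
        print(' ', pod['pod_name'], pod['containers'])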


@@ -10,3 +10,4 @@ from .domain import *
from .cmd_filter import *
from .gathered_user import *
from .favorite_asset import *
from .backup import *


@@ -26,6 +26,7 @@ class AccountFilterSet(BaseFilterSet):
qs = super().qs
qs = self.filter_username(qs)
qs = self.filter_node(qs)
qs = qs.distinct()
return qs
def filter_username(self, qs):
@@ -64,9 +65,7 @@ class AccountViewSet(OrgBulkModelViewSet):
permission_classes = (IsOrgAdmin,)
def get_queryset(self):
queryset = super().get_queryset() \
.annotate(ip=F('asset__ip')) \
.annotate(hostname=F('asset__hostname'))
queryset = AuthBook.get_queryset()
return queryset
@action(methods=['post'], detail=True, url_path='verify')


@@ -21,6 +21,8 @@ class AdminUserViewSet(OrgBulkModelViewSet):
search_fields = filterset_fields
serializer_class = serializers.AdminUserSerializer
permission_classes = (IsOrgAdmin,)
ordering_fields = ('name',)
ordering = ('name', )
def get_queryset(self):
queryset = super().get_queryset().filter(type=SystemUser.Type.admin)


@@ -2,12 +2,19 @@
#
from assets.api import FilterAssetByNodeMixin
from rest_framework.viewsets import ModelViewSet
from rest_framework.generics import RetrieveAPIView
from rest_framework.generics import RetrieveAPIView, ListAPIView
from django.shortcuts import get_object_or_404
from django.db.models import Q
from common.utils import get_logger, get_object_or_none
from common.permissions import IsOrgAdmin, IsOrgAdminOrAppUser, IsSuperUser
from common.mixins.views import SuggestionMixin
from common.mixins.api import SuggestionMixin
from users.models import User, UserGroup
from users.serializers import UserSerializer, UserGroupSerializer
from users.filters import UserFilter
from perms.models import AssetPermission
from perms.serializers import AssetPermissionSerializer
from perms.filters import AssetPermissionFilter
from orgs.mixins.api import OrgBulkModelViewSet
from orgs.mixins import generics
from ..models import Asset, Node, Platform
@@ -23,6 +30,8 @@ __all__ = [
'AssetViewSet', 'AssetPlatformRetrieveApi',
'AssetGatewayListApi', 'AssetPlatformViewSet',
'AssetTaskCreateApi', 'AssetsTaskCreateApi',
'AssetPermUserListApi', 'AssetPermUserPermissionsListApi',
'AssetPermUserGroupListApi', 'AssetPermUserGroupPermissionsListApi',
]
@@ -41,6 +50,7 @@ class AssetViewSet(SuggestionMixin, FilterAssetByNodeMixin, OrgBulkModelViewSet)
}
search_fields = ("hostname", "ip")
ordering_fields = ("hostname", "ip", "port", "cpu_cores")
ordering = ('hostname', )
serializer_classes = {
'default': serializers.AssetSerializer,
'suggestion': serializers.MiniAssetSerializer
@@ -137,7 +147,7 @@ class AssetTaskCreateApi(AssetsTaskMixin, generics.CreateAPIView):
asset = data['asset']
system_users = data.get('system_users')
if not system_users:
system_users = asset.get_all_systemusers()
system_users = asset.get_all_system_users()
if action == 'push_system_user':
task = push_system_users_a_asset.delay(system_users, asset=asset)
elif action == 'test_system_user':
@@ -170,3 +180,102 @@ class AssetGatewayListApi(generics.ListAPIView):
return []
queryset = asset.domain.gateways.filter(protocol='ssh')
return queryset
class BaseAssetPermUserOrUserGroupListApi(ListAPIView):
permission_classes = (IsOrgAdmin,)
def get_object(self):
asset_id = self.kwargs.get('pk')
asset = get_object_or_404(Asset, pk=asset_id)
return asset
def get_asset_related_perms(self):
asset = self.get_object()
nodes = asset.get_all_nodes(flat=True)
perms = AssetPermission.objects.filter(Q(assets=asset) | Q(nodes__in=nodes))
return perms
class AssetPermUserListApi(BaseAssetPermUserOrUserGroupListApi):
filterset_class = UserFilter
search_fields = ('username', 'email', 'name', 'id', 'source', 'role')
serializer_class = UserSerializer
def get_queryset(self):
perms = self.get_asset_related_perms()
users = User.objects.filter(
Q(assetpermissions__in=perms) | Q(groups__assetpermissions__in=perms)
).distinct()
return users
class AssetPermUserGroupListApi(BaseAssetPermUserOrUserGroupListApi):
serializer_class = UserGroupSerializer
def get_queryset(self):
perms = self.get_asset_related_perms()
user_groups = UserGroup.objects.filter(assetpermissions__in=perms).distinct()
return user_groups
class BaseAssetPermUserOrUserGroupPermissionsListApiMixin(generics.ListAPIView):
permission_classes = (IsOrgAdmin,)
model = AssetPermission
serializer_class = AssetPermissionSerializer
filterset_class = AssetPermissionFilter
search_fields = ('name',)
def get_object(self):
asset_id = self.kwargs.get('pk')
asset = get_object_or_404(Asset, pk=asset_id)
return asset
def filter_asset_related(self, queryset):
asset = self.get_object()
nodes = asset.get_all_nodes(flat=True)
perms = queryset.filter(Q(assets=asset) | Q(nodes__in=nodes))
return perms
def filter_queryset(self, queryset):
queryset = super().filter_queryset(queryset)
queryset = self.filter_asset_related(queryset)
return queryset
class AssetPermUserPermissionsListApi(BaseAssetPermUserOrUserGroupPermissionsListApiMixin):
def filter_queryset(self, queryset):
queryset = super().filter_queryset(queryset)
queryset = self.filter_user_related(queryset)
queryset = queryset.distinct()
return queryset
def filter_user_related(self, queryset):
user = self.get_perm_user()
user_groups = user.groups.all()
perms = queryset.filter(Q(users=user) | Q(user_groups__in=user_groups))
return perms
def get_perm_user(self):
user_id = self.kwargs.get('perm_user_id')
user = get_object_or_404(User, pk=user_id)
return user
class AssetPermUserGroupPermissionsListApi(BaseAssetPermUserOrUserGroupPermissionsListApiMixin):
def filter_queryset(self, queryset):
queryset = super().filter_queryset(queryset)
queryset = self.filter_user_group_related(queryset)
queryset = queryset.distinct()
return queryset
def filter_user_group_related(self, queryset):
user_group = self.get_perm_user_group()
perms = queryset.filter(user_groups=user_group)
return perms
def get_perm_user_group(self):
user_group_id = self.kwargs.get('perm_user_group_id')
user_group = get_object_or_404(UserGroup, pk=user_group_id)
return user_group

apps/assets/api/backup.py

@@ -0,0 +1,55 @@
# -*- coding: utf-8 -*-
#
from rest_framework import status, mixins, viewsets
from rest_framework.response import Response
from common.permissions import IsOrgAdmin
from orgs.mixins.api import OrgBulkModelViewSet
from .. import serializers
from ..tasks import execute_account_backup_plan
from ..models import (
AccountBackupPlan, AccountBackupPlanExecution
)
__all__ = [
'AccountBackupPlanViewSet', 'AccountBackupPlanExecutionViewSet'
]
class AccountBackupPlanViewSet(OrgBulkModelViewSet):
model = AccountBackupPlan
filter_fields = ('name',)
search_fields = filter_fields
ordering_fields = ('name',)
ordering = ('name',)
serializer_class = serializers.AccountBackupPlanSerializer
permission_classes = (IsOrgAdmin,)
class AccountBackupPlanExecutionViewSet(
mixins.CreateModelMixin, mixins.ListModelMixin,
mixins.RetrieveModelMixin, viewsets.GenericViewSet
):
serializer_class = serializers.AccountBackupPlanExecutionSerializer
search_fields = ('trigger',)
filterset_fields = ('trigger', 'plan_id')
permission_classes = (IsOrgAdmin,)
def get_queryset(self):
queryset = AccountBackupPlanExecution.objects.all()
return queryset
def create(self, request, *args, **kwargs):
serializer = self.get_serializer(data=request.data)
serializer.is_valid(raise_exception=True)
pid = serializer.data.get('plan')
task = execute_account_backup_plan.delay(
pid=pid, trigger=AccountBackupPlanExecution.Trigger.manual
)
return Response({'task': task.id}, status=status.HTTP_201_CREATED)
def filter_queryset(self, queryset):
queryset = super().filter_queryset(queryset)
queryset = queryset.order_by('-date_start')
return queryset


@@ -29,7 +29,7 @@ class CommandFilterViewSet(OrgBulkModelViewSet):
class CommandFilterRuleViewSet(OrgBulkModelViewSet):
model = CommandFilterRule
filterset_fields = ("content",)
filterset_fields = ('content',)
search_fields = filterset_fields
permission_classes = (IsOrgAdmin,)
serializer_class = serializers.CommandFilterRuleSerializer


@@ -22,6 +22,8 @@ class DomainViewSet(OrgBulkModelViewSet):
search_fields = filterset_fields
permission_classes = (IsOrgAdminOrAppUser,)
serializer_class = serializers.DomainSerializer
ordering_fields = ('name',)
ordering = ('name', )
def get_serializer_class(self):
if self.request.query_params.get('gateway'):


@@ -13,10 +13,11 @@ from django.db.models.signals import m2m_changed
from common.const.http import POST
from common.exceptions import SomeoneIsDoingThis
from common.const.signals import PRE_REMOVE, POST_REMOVE
from common.mixins.api import SuggestionMixin
from assets.models import Asset
from common.utils import get_logger, get_object_or_none
from common.tree import TreeNodeSerializer
from orgs.mixins.api import OrgModelViewSet
from orgs.mixins.api import OrgBulkModelViewSet
from orgs.mixins import generics
from orgs.utils import current_org
from ..hands import IsOrgAdmin
@@ -41,19 +42,13 @@ __all__ = [
]
class NodeViewSet(OrgModelViewSet):
class NodeViewSet(SuggestionMixin, OrgBulkModelViewSet):
model = Node
filterset_fields = ('value', 'key', 'id')
search_fields = ('value', )
permission_classes = (IsOrgAdmin,)
serializer_class = serializers.NodeSerializer
# Only creating child nodes directly under the root node is supported; nodes below them must be created via the children API
def perform_create(self, serializer):
child_key = Node.org_root().get_next_child_key()
serializer.validated_data["key"] = child_key
serializer.save()
@action(methods=[POST], detail=False, url_path='check_assets_amount_task')
def check_assets_amount_task(self, request):
task = check_node_assets_amount_task.delay(current_org.id)


@@ -1,14 +1,17 @@
# ~*~ coding: utf-8 ~*~
from django.shortcuts import get_object_or_404
from django.middleware import csrf
from rest_framework.response import Response
from common.utils import get_logger
from common.utils import get_logger, get_object_or_none
from common.utils.crypto import get_aes_crypto
from common.permissions import IsOrgAdmin, IsOrgAdminOrAppUser, IsValidUser
from orgs.mixins.api import OrgBulkModelViewSet
from orgs.mixins import generics
from common.mixins.views import SuggestionMixin
from common.mixins.api import SuggestionMixin
from orgs.utils import tmp_to_root_org
from ..models import SystemUser, Asset
from rest_framework.decorators import action
from ..models import SystemUser, CommandFilterRule
from .. import serializers
from ..serializers import SystemUserWithAuthInfoSerializer, SystemUserTempAuthSerializer
from ..tasks import (
@@ -16,7 +19,6 @@ from ..tasks import (
push_system_user_to_assets
)
logger = get_logger(__file__)
__all__ = [
'SystemUserViewSet', 'SystemUserAuthInfoApi', 'SystemUserAssetAuthInfoApi',
@@ -42,8 +44,36 @@ class SystemUserViewSet(SuggestionMixin, OrgBulkModelViewSet):
'default': serializers.SystemUserSerializer,
'suggestion': serializers.MiniSystemUserSerializer
}
ordering_fields = ('name', 'protocol', 'login_mode')
ordering = ('name', )
permission_classes = (IsOrgAdminOrAppUser,)
@action(methods=['get'], detail=False, url_path='su-from')
def su_from(self, request, *args, **kwargs):
""" API 获取可选的 su_from 系统用户"""
queryset = self.filter_queryset(self.get_queryset())
queryset = queryset.filter(
protocol=SystemUser.Protocol.ssh, login_mode=SystemUser.LOGIN_AUTO
)
return self.get_paginate_response_if_need(queryset)
@action(methods=['get'], detail=True, url_path='su-to')
def su_to(self, request, *args, **kwargs):
""" 获取系统用户的所有 su_to 系统用户 """
pk = kwargs.get('pk')
system_user = get_object_or_404(SystemUser, pk=pk)
queryset = system_user.su_to.all()
queryset = self.filter_queryset(queryset)
return self.get_paginate_response_if_need(queryset)
def get_paginate_response_if_need(self, queryset):
page = self.paginate_queryset(queryset)
if page is not None:
serializer = self.get_serializer(page, many=True)
return self.get_paginated_response(serializer.data)
serializer = self.get_serializer(queryset, many=True)
return Response(serializer.data)
class SystemUserAuthInfoApi(generics.RetrieveUpdateDestroyAPIView):
"""
@@ -64,17 +94,27 @@ class SystemUserTempAuthInfoApi(generics.CreateAPIView):
permission_classes = (IsValidUser,)
serializer_class = SystemUserTempAuthSerializer
def decrypt_data_if_need(self, data):
csrf_token = self.request.META.get('CSRF_COOKIE')
aes = get_aes_crypto(csrf_token, 'ECB')
password = data.get('password', '')
try:
data['password'] = aes.decrypt(password)
except:
pass
return data
def create(self, request, *args, **kwargs):
serializer = super().get_serializer(data=request.data)
serializer.is_valid(raise_exception=True)
pk = kwargs.get('pk')
user = self.request.user
data = serializer.validated_data
data = self.decrypt_data_if_need(serializer.validated_data)
instance_id = data.get('instance_id')
with tmp_to_root_org():
instance = get_object_or_404(SystemUser, pk=pk)
instance.set_temp_auth(instance_id, user, data)
instance.set_temp_auth(instance_id, self.request.user.id, data)
return Response(serializer.data, status=201)
@@ -91,7 +131,7 @@ class SystemUserAssetAuthInfoApi(generics.RetrieveAPIView):
asset_id = self.kwargs.get('asset_id')
user_id = self.request.query_params.get("user_id")
username = self.request.query_params.get("username")
instance.load_asset_more_auth(asset_id=asset_id, user_id=user_id, username=username)
instance.load_asset_more_auth(asset_id, username, user_id)
return instance
@@ -107,8 +147,8 @@ class SystemUserAppAuthInfoApi(generics.RetrieveAPIView):
instance = super().get_object()
app_id = self.kwargs.get('app_id')
user_id = self.request.query_params.get("user_id")
if user_id:
instance.load_app_more_auth(app_id, user_id)
username = self.request.query_params.get("username")
instance.load_app_more_auth(app_id, username, user_id)
return instance
@@ -165,9 +205,22 @@ class SystemUserCommandFilterRuleListApi(generics.ListAPIView):
return CommandFilterRuleSerializer
def get_queryset(self):
pk = self.kwargs.get('pk', None)
system_user = get_object_or_404(SystemUser, pk=pk)
return system_user.cmd_filter_rules
user_id = self.request.query_params.get('user_id')
user_group_id = self.request.query_params.get('user_group_id')
system_user_id = self.kwargs.get('pk', None)
system_user = get_object_or_none(SystemUser, pk=system_user_id)
if not system_user:
system_user_id = self.request.query_params.get('system_user_id')
asset_id = self.request.query_params.get('asset_id')
application_id = self.request.query_params.get('application_id')
rules = CommandFilterRule.get_queryset(
user_id=user_id,
user_group_id=user_group_id,
system_user_id=system_user_id,
asset_id=asset_id,
application_id=application_id
)
return rules
class SystemUserAssetsListView(generics.ListAPIView):
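
The su_from / su_to actions add two read-only list endpoints to the system-user viewset. A hedged client-side sketch; the API root, route prefix and token auth scheme are assumptions, only the 'su-from' and 'su-to' suffixes come from the code above:

import requests

base = 'https://jumpserver.example.com/api/v1/assets/system-users'  # assumed prefix
headers = {'Authorization': 'Token <api-token>'}                     # assumed auth scheme

# System users selectable as su_from (ssh protocol, automatic login mode).
print(requests.get(f'{base}/su-from/', headers=headers).json())

# All su_to system users of one system user.
system_user_id = '<system-user-uuid>'
print(requests.get(f'{base}/{system_user_id}/su-to/', headers=headers).json())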


@@ -15,7 +15,7 @@ def migrate_system_assets_to_authbook(apps, schema_editor):
system_users = system_user_model.objects.all()
for s in system_users:
while True:
systemuser_asset_relations = system_user_asset_model.objects.filter(systemuser=s)[:20]
systemuser_asset_relations = system_user_asset_model.objects.filter(systemuser=s)[:1000]
if not systemuser_asset_relations:
break
authbooks = []


@@ -0,0 +1,24 @@
# Generated by Django 3.1.12 on 2021-10-12 08:42
from django.db import migrations
def migrate_platform_win2016(apps, schema_editor):
platform_model = apps.get_model("assets", "Platform")
win2016 = platform_model.objects.filter(name='Windows2016').first()
if not win2016:
print("Error: Not found Windows2016 platform")
return
win2016.meta = {"security": "any"}
win2016.save()
class Migration(migrations.Migration):
dependencies = [
('assets', '0076_delete_assetuser'),
]
operations = [
migrations.RunPython(migrate_platform_win2016)
]


@@ -0,0 +1,38 @@
# Generated by Django 3.1.13 on 2021-10-14 14:09
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('assets', '0077_auto_20211012_1642'),
]
operations = [
migrations.AlterField(
model_name='adminuser',
name='username',
field=models.CharField(blank=True, db_index=True, max_length=128, verbose_name='Username'),
),
migrations.AlterField(
model_name='authbook',
name='username',
field=models.CharField(blank=True, db_index=True, max_length=128, verbose_name='Username'),
),
migrations.AlterField(
model_name='gateway',
name='username',
field=models.CharField(blank=True, db_index=True, max_length=128, verbose_name='Username'),
),
migrations.AlterField(
model_name='historicalauthbook',
name='username',
field=models.CharField(blank=True, db_index=True, max_length=128, verbose_name='Username'),
),
migrations.AlterField(
model_name='systemuser',
name='username',
field=models.CharField(blank=True, db_index=True, max_length=128, verbose_name='Username'),
),
]


@@ -0,0 +1,33 @@
# Generated by Django 3.1.12 on 2021-11-02 11:22
from django.db import migrations
def create_internal_platform(apps, schema_editor):
model = apps.get_model("assets", "Platform")
db_alias = schema_editor.connection.alias
type_platforms = (
('Windows-RDP', 'Windows', {'security': 'rdp'}),
('Windows-TLS', 'Windows', {'security': 'tls'}),
)
for name, base, meta in type_platforms:
defaults = {'name': name, 'base': base, 'meta': meta, 'internal': True}
model.objects.using(db_alias).update_or_create(
name=name, defaults=defaults
)
win2016 = model.objects.filter(name='Windows2016').first()
if win2016:
win2016.internal = False
win2016.save(update_fields=['internal'])
class Migration(migrations.Migration):
dependencies = [
('assets', '0078_auto_20211014_2209'),
]
operations = [
migrations.RunPython(create_internal_platform)
]


@@ -0,0 +1,24 @@
# Generated by Django 3.1.13 on 2021-11-04 05:47
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('assets', '0079_auto_20211102_1922'),
]
operations = [
migrations.AddField(
model_name='systemuser',
name='su_enabled',
field=models.BooleanField(default=False, verbose_name='User switch'),
),
migrations.AddField(
model_name='systemuser',
name='su_from',
field=models.ForeignKey(null=True, on_delete=django.db.models.deletion.SET_NULL, related_name='su_to', to='assets.systemuser', verbose_name='Switch from'),
),
]


@@ -0,0 +1,18 @@
# Generated by Django 3.1.12 on 2021-11-05 08:05
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('assets', '0080_auto_20211104_1347'),
]
operations = [
migrations.AlterField(
model_name='systemuser',
name='protocol',
field=models.CharField(choices=[('ssh', 'SSH'), ('rdp', 'RDP'), ('telnet', 'Telnet'), ('vnc', 'VNC'), ('mysql', 'MySQL'), ('oracle', 'Oracle'), ('mariadb', 'MariaDB'), ('postgresql', 'PostgreSQL'), ('sqlserver', 'SQLServer'), ('k8s', 'K8S')], default='ssh', max_length=16, verbose_name='Protocol'),
),
]


@@ -0,0 +1,74 @@
# Generated by Django 3.1.13 on 2021-12-09 06:40
from django.conf import settings
from django.db import migrations, models
def migrate_system_users_cmd_filters(apps, schema_editor):
system_user_model = apps.get_model("assets", "SystemUser")
cmd_filter_model = apps.get_model("assets", "CommandFilter")
su_through = system_user_model.cmd_filters.through
cf_through = cmd_filter_model.system_users.through
su_relation_objects = su_through.objects.all()
cf_relation_objects = [
cf_through(**{
'id': su_relation.id,
'systemuser_id': su_relation.systemuser_id,
'commandfilter_id': su_relation.commandfilter_id
})
for su_relation in su_relation_objects
]
cf_through.objects.bulk_create(cf_relation_objects)
class Migration(migrations.Migration):
dependencies = [
('applications', '0014_auto_20211105_1605'),
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
('assets', '0081_auto_20211105_1605'),
]
operations = [
migrations.AddField(
model_name='commandfilter',
name='applications',
field=models.ManyToManyField(blank=True, related_name='cmd_filters', to='applications.Application', verbose_name='Application'),
),
migrations.AddField(
model_name='commandfilter',
name='assets',
field=models.ManyToManyField(blank=True, related_name='cmd_filters', to='assets.Asset', verbose_name='Asset'),
),
migrations.AddField(
model_name='commandfilter',
name='system_users',
field=models.ManyToManyField(blank=True, related_name='cmd_filters_pre', to='assets.SystemUser', verbose_name='System user'),
),
migrations.AddField(
model_name='commandfilter',
name='users',
field=models.ManyToManyField(blank=True, related_name='cmd_filters', to=settings.AUTH_USER_MODEL, verbose_name='User'),
),
migrations.AddField(
model_name='commandfilter',
name='user_groups',
field=models.ManyToManyField(blank=True, related_name='cmd_filters', to='users.UserGroup', verbose_name='User group'),
),
migrations.AlterField(
model_name='systemuser',
name='cmd_filters',
field=models.ManyToManyField(blank=True, related_name='system_users_bak', to='assets.CommandFilter', verbose_name='Command filter'),
),
migrations.RunPython(migrate_system_users_cmd_filters),
migrations.RemoveField(
model_name='systemuser',
name='cmd_filters',
),
migrations.AlterField(
model_name='commandfilter',
name='system_users',
field=models.ManyToManyField(blank=True, related_name='cmd_filters', to='assets.SystemUser', verbose_name='System user'),
),
]


@@ -0,0 +1,27 @@
# Generated by Django 3.1.13 on 2021-12-15 06:36
from django.db import migrations, models
OLD_ACTION_ALLOW = 1
NEW_ACTION_ALLOW = 9
def migrate_action(apps, schema_editor):
model = apps.get_model("assets", "CommandFilterRule")
model.objects.filter(action=OLD_ACTION_ALLOW).update(action=NEW_ACTION_ALLOW)
class Migration(migrations.Migration):
dependencies = [
('assets', '0082_auto_20211209_1440'),
]
operations = [
migrations.RunPython(migrate_action),
migrations.AlterField(
model_name='commandfilterrule',
name='action',
field=models.IntegerField(choices=[(0, 'Deny'), (9, 'Allow'), (2, 'Reconfirm')], default=0, verbose_name='Action'),
),
]

View File

@@ -0,0 +1,62 @@
# Generated by Django 3.1.13 on 2022-01-12 11:59
import common.db.encoder
from django.conf import settings
from django.db import migrations, models
import django.db.models.deletion
import uuid
class Migration(migrations.Migration):
dependencies = [
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
('assets', '0083_auto_20211215_1436'),
]
operations = [
migrations.CreateModel(
name='AccountBackupPlan',
fields=[
('org_id', models.CharField(blank=True, db_index=True, default='', max_length=36, verbose_name='Organization')),
('name', models.CharField(max_length=128, verbose_name='Name')),
('is_periodic', models.BooleanField(default=False)),
('interval', models.IntegerField(blank=True, default=24, null=True, verbose_name='Cycle perform')),
('crontab', models.CharField(blank=True, max_length=128, null=True, verbose_name='Regularly perform')),
('created_by', models.CharField(blank=True, max_length=32, null=True, verbose_name='Created by')),
('date_created', models.DateTimeField(auto_now_add=True, null=True, verbose_name='Date created')),
('date_updated', models.DateTimeField(auto_now=True, verbose_name='Date updated')),
('id', models.UUIDField(default=uuid.uuid4, primary_key=True, serialize=False)),
('types', models.IntegerField(choices=[(255, 'All'), (1, 'Asset'), (2, 'Application')], default=255, verbose_name='Type')),
('comment', models.TextField(blank=True, verbose_name='Comment')),
('recipients', models.ManyToManyField(blank=True, related_name='recipient_escape_route_plans', to=settings.AUTH_USER_MODEL, verbose_name='Recipient')),
],
options={
'verbose_name': 'Account backup plan',
'ordering': ['name'],
'unique_together': {('name', 'org_id')},
},
),
migrations.AlterField(
model_name='systemuser',
name='protocol',
field=models.CharField(choices=[('ssh', 'SSH'), ('rdp', 'RDP'), ('telnet', 'Telnet'), ('vnc', 'VNC'), ('mysql', 'MySQL'), ('redis', 'Redis'), ('oracle', 'Oracle'), ('mariadb', 'MariaDB'), ('postgresql', 'PostgreSQL'), ('sqlserver', 'SQLServer'), ('k8s', 'K8S')], default='ssh', max_length=16, verbose_name='Protocol'),
),
migrations.CreateModel(
name='AccountBackupPlanExecution',
fields=[
('org_id', models.CharField(blank=True, db_index=True, default='', max_length=36, verbose_name='Organization')),
('id', models.UUIDField(default=uuid.uuid4, primary_key=True, serialize=False)),
('date_start', models.DateTimeField(auto_now_add=True, verbose_name='Date start')),
('timedelta', models.FloatField(default=0.0, null=True, verbose_name='Time')),
('plan_snapshot', models.JSONField(blank=True, default=dict, encoder=common.db.encoder.ModelJSONFieldEncoder, null=True, verbose_name='Account backup snapshot')),
('trigger', models.CharField(choices=[('manual', 'Manual trigger'), ('timing', 'Timing trigger')], default='manual', max_length=128, verbose_name='Trigger mode')),
('reason', models.CharField(blank=True, max_length=1024, null=True, verbose_name='Reason')),
('is_success', models.BooleanField(default=False, verbose_name='Is success')),
('plan', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, related_name='execution', to='assets.accountbackupplan', verbose_name='Account backup plan')),
],
options={
'verbose_name': 'Account backup execution',
},
),
]

View File

@@ -0,0 +1,18 @@
# Generated by Django 3.1.13 on 2022-02-08 02:57
from django.db import migrations, models
class Migration(migrations.Migration):
dependencies = [
('assets', '0084_auto_20220112_1959'),
]
operations = [
migrations.AddField(
model_name='commandfilterrule',
name='ignore_case',
field=models.BooleanField(default=True, verbose_name='Ignore case'),
),
]

View File

@@ -12,3 +12,4 @@ from .utils import *
from .authbook import *
from .gathered_user import *
from .favorite_asset import *
from .backup import *

View File

@@ -164,38 +164,7 @@ class Platform(models.Model):
# ordering = ('name',)
class Asset(AbsConnectivity, ProtocolsMixin, NodesRelationMixin, OrgModelMixin):
# Important
PLATFORM_CHOICES = (
('Linux', 'Linux'),
('Unix', 'Unix'),
('MacOS', 'MacOS'),
('BSD', 'BSD'),
('Windows', 'Windows'),
('Windows2016', 'Windows(2016)'),
('Other', 'Other'),
)
id = models.UUIDField(default=uuid.uuid4, primary_key=True)
ip = models.CharField(max_length=128, verbose_name=_('IP'), db_index=True)
hostname = models.CharField(max_length=128, verbose_name=_('Hostname'))
protocol = models.CharField(max_length=128, default=ProtocolsMixin.Protocol.ssh,
choices=ProtocolsMixin.Protocol.choices,
verbose_name=_('Protocol'))
port = models.IntegerField(default=22, verbose_name=_('Port'))
protocols = models.CharField(max_length=128, default='ssh/22', blank=True, verbose_name=_("Protocols"))
platform = models.ForeignKey(Platform, default=Platform.default, on_delete=models.PROTECT, verbose_name=_("Platform"), related_name='assets')
domain = models.ForeignKey("assets.Domain", null=True, blank=True, related_name='assets', verbose_name=_("Domain"), on_delete=models.SET_NULL)
nodes = models.ManyToManyField('assets.Node', default=default_node, related_name='assets', verbose_name=_("Nodes"))
is_active = models.BooleanField(default=True, verbose_name=_('Is active'))
# Auth
admin_user = models.ForeignKey('assets.SystemUser', on_delete=models.SET_NULL, null=True, verbose_name=_("Admin user"), related_name='admin_assets')
# Some information
public_ip = models.CharField(max_length=128, blank=True, null=True, verbose_name=_('Public IP'))
number = models.CharField(max_length=32, null=True, blank=True, verbose_name=_('Asset number'))
class AbsHardwareInfo(models.Model):
# Collect
vendor = models.CharField(max_length=64, null=True, blank=True, verbose_name=_('Vendor'))
model = models.CharField(max_length=54, null=True, blank=True, verbose_name=_('Model'))
@@ -214,6 +183,49 @@ class Asset(AbsConnectivity, ProtocolsMixin, NodesRelationMixin, OrgModelMixin):
os_arch = models.CharField(max_length=16, blank=True, null=True, verbose_name=_('OS arch'))
hostname_raw = models.CharField(max_length=128, blank=True, null=True, verbose_name=_('Hostname raw'))
class Meta:
abstract = True
@property
def cpu_info(self):
info = ""
if self.cpu_model:
info += self.cpu_model
if self.cpu_count and self.cpu_cores:
info += "{}*{}".format(self.cpu_count, self.cpu_cores)
return info
@property
def hardware_info(self):
if self.cpu_count:
return '{} Core {} {}'.format(
self.cpu_vcpus or self.cpu_count * self.cpu_cores,
self.memory, self.disk_total
)
else:
return ''
class Asset(AbsConnectivity, AbsHardwareInfo, ProtocolsMixin, NodesRelationMixin, OrgModelMixin):
id = models.UUIDField(default=uuid.uuid4, primary_key=True)
ip = models.CharField(max_length=128, verbose_name=_('IP'), db_index=True)
hostname = models.CharField(max_length=128, verbose_name=_('Hostname'))
protocol = models.CharField(max_length=128, default=ProtocolsMixin.Protocol.ssh,
choices=ProtocolsMixin.Protocol.choices, verbose_name=_('Protocol'))
port = models.IntegerField(default=22, verbose_name=_('Port'))
protocols = models.CharField(max_length=128, default='ssh/22', blank=True, verbose_name=_("Protocols"))
platform = models.ForeignKey(Platform, default=Platform.default, on_delete=models.PROTECT, verbose_name=_("Platform"), related_name='assets')
domain = models.ForeignKey("assets.Domain", null=True, blank=True, related_name='assets', verbose_name=_("Domain"), on_delete=models.SET_NULL)
nodes = models.ManyToManyField('assets.Node', default=default_node, related_name='assets', verbose_name=_("Nodes"))
is_active = models.BooleanField(default=True, verbose_name=_('Is active'))
# Auth
admin_user = models.ForeignKey('assets.SystemUser', on_delete=models.SET_NULL, null=True, verbose_name=_("Admin user"), related_name='admin_assets')
# Some information
public_ip = models.CharField(max_length=128, blank=True, null=True, verbose_name=_('Public IP'))
number = models.CharField(max_length=32, null=True, blank=True, verbose_name=_('Asset number'))
labels = models.ManyToManyField('assets.Label', blank=True, related_name='assets', verbose_name=_("Labels"))
created_by = models.CharField(max_length=128, null=True, blank=True, verbose_name=_('Created by'))
date_created = models.DateTimeField(auto_now_add=True, null=True, blank=True, verbose_name=_('Date created'))
@@ -269,25 +281,6 @@ class Asset(AbsConnectivity, ProtocolsMixin, NodesRelationMixin, OrgModelMixin):
def is_support_ansible(self):
return self.has_protocol('ssh') and self.platform_base not in ("Other",)
@property
def cpu_info(self):
info = ""
if self.cpu_model:
info += self.cpu_model
if self.cpu_count and self.cpu_cores:
info += "{}*{}".format(self.cpu_count, self.cpu_cores)
return info
@property
def hardware_info(self):
if self.cpu_count:
return '{} Core {} {}'.format(
self.cpu_vcpus or self.cpu_count * self.cpu_cores,
self.memory, self.disk_total
)
else:
return ''
def get_auth_info(self):
if not self.admin_user:
return {}
@@ -306,6 +299,12 @@ class Asset(AbsConnectivity, ProtocolsMixin, NodesRelationMixin, OrgModelMixin):
names.append(n.full_value)
return names
def labels_display(self):
names = []
for n in self.labels.all():
names.append(n.name + ':' + n.value)
return names
def as_node(self):
from .node import Node
fake_node = Node()
@@ -345,7 +344,7 @@ class Asset(AbsConnectivity, ProtocolsMixin, NodesRelationMixin, OrgModelMixin):
tree_node = TreeNode(**data)
return tree_node
def get_all_systemusers(self):
def get_all_system_users(self):
from .user import SystemUser
system_user_ids = SystemUser.assets.through.objects.filter(asset=self)\
.values_list('systemuser_id', flat=True)

View File

@@ -2,6 +2,7 @@
#
from django.db import models
from django.db.models import F
from django.utils.translation import ugettext_lazy as _
from simple_history.models import HistoricalRecords
@@ -94,25 +95,36 @@ class AuthBook(BaseUser, AbsConnectivity):
i.private_key = self.private_key
i.public_key = self.public_key
i.comment = 'Update triggered by account {}'.format(self.id)
i.save(update_fields=['password', 'private_key', 'public_key'])
# Do not trigger the post_save signal
self.__class__.objects.bulk_update(matched, fields=['password', 'private_key', 'public_key'])
def remove_asset_admin_user_if_need(self):
if not self.asset or not self.asset.admin_user:
if not self.asset or not self.systemuser:
return
if not self.systemuser.is_admin_user:
if not self.systemuser.is_admin_user or self.asset.admin_user != self.systemuser:
return
logger.debug('Remove asset admin user: {} {}'.format(self.asset, self.systemuser))
self.asset.admin_user = None
self.asset.save()
logger.debug('Remove asset admin user: {} {}'.format(self.asset, self.systemuser))
def update_asset_admin_user_if_need(self):
if not self.systemuser or not self.systemuser.is_admin_user:
if not self.asset or not self.systemuser:
return
if not self.asset or self.asset.admin_user == self.systemuser:
if not self.systemuser.is_admin_user or self.asset.admin_user == self.systemuser:
return
logger.debug('Update asset admin user: {} {}'.format(self.asset, self.systemuser))
self.asset.admin_user = self.systemuser
self.asset.save()
logger.debug('Update asset admin user: {} {}'.format(self.asset, self.systemuser))
@classmethod
def get_queryset(cls):
queryset = cls.objects.all() \
.annotate(ip=F('asset__ip')) \
.annotate(hostname=F('asset__hostname')) \
.annotate(platform=F('asset__platform__name')) \
.annotate(protocols=F('asset__protocols'))
return queryset
def __str__(self):
return self.smart_name

View File

@@ -0,0 +1,145 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
#
import uuid
from celery import current_task
from django.db import models
from django.utils.translation import ugettext_lazy as _
from orgs.mixins.models import OrgModelMixin
from ops.mixin import PeriodTaskModelMixin
from common.utils import get_logger
from common.db.encoder import ModelJSONFieldEncoder
from common.db.models import BitOperationChoice
from common.mixins.models import CommonModelMixin
__all__ = ['AccountBackupPlan', 'AccountBackupPlanExecution', 'Type']
logger = get_logger(__file__)
class Type(BitOperationChoice):
NONE = 0
ALL = 0xff
Asset = 0b1
App = 0b1 << 1
DB_CHOICES = (
(ALL, _('All')),
(Asset, _('Asset')),
(App, _('Application'))
)
NAME_MAP = {
ALL: "all",
Asset: "asset",
App: "application"
}
NAME_MAP_REVERSE = {v: k for k, v in NAME_MAP.items()}
CHOICES = []
for i, j in DB_CHOICES:
CHOICES.append((NAME_MAP[i], j))
class AccountBackupPlan(CommonModelMixin, PeriodTaskModelMixin, OrgModelMixin):
id = models.UUIDField(default=uuid.uuid4, primary_key=True)
types = models.IntegerField(choices=Type.DB_CHOICES, default=Type.ALL, verbose_name=_('Type'))
recipients = models.ManyToManyField(
'users.User', related_name='recipient_escape_route_plans', blank=True,
verbose_name=_("Recipient")
)
comment = models.TextField(blank=True, verbose_name=_('Comment'))
def __str__(self):
return f'{self.name}({self.org_id})'
class Meta:
ordering = ['name']
unique_together = [('name', 'org_id')]
verbose_name = _('Account backup plan')
def get_register_task(self):
from ..tasks import execute_account_backup_plan
name = "account_backup_plan_period_{}".format(str(self.id)[:8])
task = execute_account_backup_plan.name
args = (str(self.id), AccountBackupPlanExecution.Trigger.timing)
kwargs = {}
return name, task, args, kwargs
def to_attr_json(self):
return {
'name': self.name,
'is_periodic': self.is_periodic,
'interval': self.interval,
'crontab': self.crontab,
'org_id': self.org_id,
'created_by': self.created_by,
'types': Type.value_to_choices(self.types),
'recipients': {
str(recipient.id): (str(recipient), bool(recipient.secret_key))
for recipient in self.recipients.all()
}
}
def execute(self, trigger):
try:
hid = current_task.request.id
except AttributeError:
hid = str(uuid.uuid4())
execution = AccountBackupPlanExecution.objects.create(
id=hid, plan=self, plan_snapshot=self.to_attr_json(), trigger=trigger
)
return execution.start()
class AccountBackupPlanExecution(OrgModelMixin):
class Trigger(models.TextChoices):
manual = 'manual', _('Manual trigger')
timing = 'timing', _('Timing trigger')
id = models.UUIDField(default=uuid.uuid4, primary_key=True)
date_start = models.DateTimeField(
auto_now_add=True, verbose_name=_('Date start')
)
timedelta = models.FloatField(
default=0.0, verbose_name=_('Time'), null=True
)
plan_snapshot = models.JSONField(
encoder=ModelJSONFieldEncoder, default=dict,
blank=True, null=True, verbose_name=_('Account backup snapshot')
)
trigger = models.CharField(
max_length=128, default=Trigger.manual, choices=Trigger.choices,
verbose_name=_('Trigger mode')
)
reason = models.CharField(
max_length=1024, blank=True, null=True, verbose_name=_('Reason')
)
is_success = models.BooleanField(default=False, verbose_name=_('Is success'))
plan = models.ForeignKey(
'AccountBackupPlan', related_name='execution', on_delete=models.CASCADE,
verbose_name=_('Account backup plan')
)
class Meta:
verbose_name = _('Account backup execution')
@property
def types(self):
types = self.plan_snapshot.get('types')
return types
@property
def recipients(self):
recipients = self.plan_snapshot.get('recipients')
if not recipients:
return []
return recipients.values()
def start(self):
from ..task_handlers import ExecutionManager
manager = ExecutionManager(execution=self)
return manager.run()
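
For reference, the Type bitmask above packs the selected backup categories into one integer (Asset = bit 0, App = bit 1, ALL = 0xff). Below is a minimal sketch of the decoding; value_to_names only mirrors what BitOperationChoice.value_to_choices is assumed to do, based on its use in to_attr_json, and is not part of the project.

# Hypothetical illustration of the Type bitmask; not part of the diff above.
ASSET = 0b1        # 1
APP = 0b1 << 1     # 2
ALL = 0xff         # 255, every bit set

def value_to_names(value):
    # Decode the stored integer into the backup type names it covers.
    if value & ALL == ALL:
        return ['all']
    names = []
    if value & ASSET:
        names.append('asset')
    if value & APP:
        names.append('application')
    return names

print(value_to_names(255))  # ['all']
print(value_to_names(3))    # ['asset', 'application']
print(value_to_names(2))    # ['application']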

View File

@@ -173,7 +173,7 @@ class AuthMixin:
class BaseUser(OrgModelMixin, AuthMixin):
id = models.UUIDField(default=uuid.uuid4, primary_key=True)
name = models.CharField(max_length=128, verbose_name=_('Name'))
username = models.CharField(max_length=128, blank=True, verbose_name=_('Username'), validators=[alphanumeric], db_index=True)
username = models.CharField(max_length=128, blank=True, verbose_name=_('Username'), db_index=True)
password = fields.EncryptCharField(max_length=256, blank=True, null=True, verbose_name=_('Password'))
private_key = fields.EncryptTextField(blank=True, null=True, verbose_name=_('SSH private key'))
public_key = fields.EncryptTextField(blank=True, null=True, verbose_name=_('SSH public key'))
@@ -185,10 +185,18 @@ class BaseUser(OrgModelMixin, AuthMixin):
ASSETS_AMOUNT_CACHE_KEY = "ASSET_USER_{}_ASSETS_AMOUNT"
ASSET_USER_CACHE_TIME = 600
APPS_AMOUNT_CACHE_KEY = "APP_USER_{}_APPS_AMOUNT"
APP_USER_CACHE_TIME = 600
def get_related_assets(self):
assets = self.assets.filter(org_id=self.org_id)
return assets
def get_related_apps(self):
from applications.models import Account
apps = Account.objects.filter(systemuser=self)
return apps
def get_username(self):
return self.username
@@ -201,6 +209,15 @@ class BaseUser(OrgModelMixin, AuthMixin):
cache.set(cache_key, cached, self.ASSET_USER_CACHE_TIME)
return cached
@property
def apps_amount(self):
cache_key = self.APPS_AMOUNT_CACHE_KEY.format(self.id)
cached = cache.get(cache_key)
if not cached:
cached = self.get_related_apps().count()
cache.set(cache_key, cached, self.APP_USER_CACHE_TIME)
return cached
def expire_assets_amount(self):
cache_key = self.ASSETS_AMOUNT_CACHE_KEY.format(self.id)
cache.delete(cache_key)

View File

@@ -4,12 +4,18 @@ import uuid
import re
from django.db import models
from django.db.models import Q
from django.core.validators import MinValueValidator, MaxValueValidator
from django.utils.translation import ugettext_lazy as _
from common.utils import lazyproperty
from users.models import User, UserGroup
from applications.models import Application
from ..models import SystemUser, Asset
from common.utils import lazyproperty, get_logger, get_object_or_none
from orgs.mixins.models import OrgModelMixin
logger = get_logger(__file__)
__all__ = [
'CommandFilter', 'CommandFilterRule'
@@ -19,11 +25,32 @@ __all__ = [
class CommandFilter(OrgModelMixin):
id = models.UUIDField(default=uuid.uuid4, primary_key=True)
name = models.CharField(max_length=64, verbose_name=_("Name"))
users = models.ManyToManyField(
'users.User', related_name='cmd_filters', blank=True,
verbose_name=_("User")
)
user_groups = models.ManyToManyField(
'users.UserGroup', related_name='cmd_filters', blank=True,
verbose_name=_("User group"),
)
assets = models.ManyToManyField(
'assets.Asset', related_name='cmd_filters', blank=True,
verbose_name=_("Asset")
)
system_users = models.ManyToManyField(
'assets.SystemUser', related_name='cmd_filters', blank=True,
verbose_name=_("System user"))
applications = models.ManyToManyField(
'applications.Application', related_name='cmd_filters', blank=True,
verbose_name=_("Application")
)
is_active = models.BooleanField(default=True, verbose_name=_('Is active'))
comment = models.TextField(blank=True, default='', verbose_name=_("Comment"))
date_created = models.DateTimeField(auto_now_add=True)
date_updated = models.DateTimeField(auto_now=True)
created_by = models.CharField(max_length=128, blank=True, default='', verbose_name=_('Created by'))
created_by = models.CharField(
max_length=128, blank=True, default='', verbose_name=_('Created by')
)
def __str__(self):
return self.name
@@ -45,15 +72,20 @@ class CommandFilterRule(OrgModelMixin):
class ActionChoices(models.IntegerChoices):
deny = 0, _('Deny')
allow = 1, _('Allow')
allow = 9, _('Allow')
confirm = 2, _('Reconfirm')
id = models.UUIDField(default=uuid.uuid4, primary_key=True)
filter = models.ForeignKey('CommandFilter', on_delete=models.CASCADE, verbose_name=_("Filter"), related_name='rules')
filter = models.ForeignKey(
'CommandFilter', on_delete=models.CASCADE, verbose_name=_("Filter"), related_name='rules'
)
type = models.CharField(max_length=16, default=TYPE_COMMAND, choices=TYPE_CHOICES, verbose_name=_("Type"))
priority = models.IntegerField(default=50, verbose_name=_("Priority"), help_text=_("1-100, the lower the value will be match first"),
validators=[MinValueValidator(1), MaxValueValidator(100)])
priority = models.IntegerField(
default=50, verbose_name=_("Priority"), help_text=_("1-100, the lower the value will be match first"),
validators=[MinValueValidator(1), MaxValueValidator(100)]
)
content = models.TextField(verbose_name=_("Content"), help_text=_("One line one command"))
ignore_case = models.BooleanField(default=True, verbose_name=_('Ignore case'))
action = models.IntegerField(default=ActionChoices.deny, choices=ActionChoices.choices, verbose_name=_("Action"))
# Action: extra fields
# - confirm: command reviewers
@@ -71,28 +103,55 @@ class CommandFilterRule(OrgModelMixin):
verbose_name = _("Command filter rule")
@lazyproperty
def _pattern(self):
def pattern(self):
if self.type == 'command':
regex = []
content = self.content.replace('\r\n', '\n')
for cmd in content.split('\n'):
cmd = re.escape(cmd)
cmd = cmd.replace('\\ ', '\s+')
if cmd[-1].isalpha():
regex.append(r'\b{0}\b'.format(cmd))
else:
regex.append(r'\b{0}'.format(cmd))
s = r'{}'.format('|'.join(regex))
s = self.construct_command_regex(content=self.content)
else:
s = r'{0}'.format(self.content)
return s
@classmethod
def construct_command_regex(cls, content):
regex = []
content = content.replace('\r\n', '\n')
for _cmd in content.split('\n'):
cmd = re.sub(r'\s+', ' ', _cmd)
cmd = re.escape(cmd)
cmd = cmd.replace('\\ ', '\s+')
# With a space in the command, word-boundary anchoring cannot be used
if ' ' in _cmd:
regex.append(cmd)
continue
# If the last character is alphabetic, anchor with word boundaries
if cmd[-1].isalpha():
regex.append(r'\b{0}\b'.format(cmd))
else:
regex.append(r'\b{0}'.format(cmd))
s = r'{}'.format('|'.join(regex))
return s
@staticmethod
def compile_regex(regex, ignore_case):
try:
_pattern = re.compile(s)
except:
_pattern = ''
return _pattern
if ignore_case:
pattern = re.compile(regex, re.IGNORECASE)
else:
pattern = re.compile(regex)
except Exception as e:
error = _('The generated regular expression is incorrect: {}').format(str(e))
logger.error(error)
return False, error, None
return True, '', pattern
def match(self, data):
found = self._pattern.search(data)
succeed, error, pattern = self.compile_regex(self.pattern, self.ignore_case)
if not succeed:
return self.ACTION_UNKNOWN, ''
found = pattern.search(data)
if not found:
return self.ACTION_UNKNOWN, ''
@@ -125,3 +184,41 @@ class CommandFilterRule(OrgModelMixin):
ticket.create_process_map_and_node(self.reviewers.all())
ticket.open(applicant=session.user_obj)
return ticket
@classmethod
def get_queryset(cls, user_id=None, user_group_id=None, system_user_id=None,
asset_id=None, application_id=None, org_id=None):
user_groups = []
user = get_object_or_none(User, pk=user_id)
if user:
user_groups.extend(list(user.groups.all()))
user_group = get_object_or_none(UserGroup, pk=user_group_id)
if user_group:
org_id = user_group.org_id
user_groups.append(user_group)
system_user = get_object_or_none(SystemUser, pk=system_user_id)
asset = get_object_or_none(Asset, pk=asset_id)
application = get_object_or_none(Application, pk=application_id)
q = Q()
if user:
q |= Q(users=user)
if user_groups:
q |= Q(user_groups__in=set(user_groups))
if system_user:
org_id = system_user.org_id
q |= Q(system_users=system_user)
if asset:
org_id = asset.org_id
q |= Q(assets=asset)
if application:
org_id = application.org_id
q |= Q(applications=application)
if q:
cmd_filters = CommandFilter.objects.filter(q).filter(is_active=True)
if org_id:
cmd_filters = cmd_filters.filter(org_id=org_id)
rule_ids = cmd_filters.values_list('rules', flat=True)
rules = cls.objects.filter(id__in=rule_ids)
else:
rules = cls.objects.none()
return rules
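
For reference, a minimal sketch of the matching behaviour that construct_command_regex and compile_regex implement above: whitespace inside a rule is normalized and matched flexibly, single-word rules are anchored on word boundaries, multi-word rules are not, and ignore_case switches on re.IGNORECASE. The build_pattern name is illustrative only, not part of the project API.

import re

def build_pattern(content, ignore_case=True):
    regex = []
    for raw in content.replace('\r\n', '\n').split('\n'):
        if not raw.strip():
            continue  # skip blank lines (guard added for this sketch)
        cmd = re.sub(r'\s+', ' ', raw)
        cmd = re.escape(cmd)
        cmd = cmd.replace('\\ ', r'\s+')         # match any run of whitespace between words
        if ' ' in raw:
            regex.append(cmd)                    # multi-word rule: no boundary anchors
        elif cmd[-1].isalpha():
            regex.append(r'\b{}\b'.format(cmd))  # single word: anchor both sides
        else:
            regex.append(r'\b{}'.format(cmd))
    flags = re.IGNORECASE if ignore_case else 0
    return re.compile('|'.join(regex), flags)

pattern = build_pattern('rm -rf\nreboot')
print(bool(pattern.search('sudo RM   -RF /data')))  # True: case and extra spaces are ignored
print(bool(pattern.search('rebooting now')))        # False: word boundary blocks a partial match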

View File

@@ -5,6 +5,7 @@
import logging
from django.db import models
from django.db.models import Q
from django.utils.translation import ugettext_lazy as _
from django.core.validators import MinValueValidator, MaxValueValidator
from django.core.cache import cache
@@ -29,9 +30,11 @@ class ProtocolMixin:
telnet = 'telnet', 'Telnet'
vnc = 'vnc', 'VNC'
mysql = 'mysql', 'MySQL'
redis = 'redis', 'Redis'
oracle = 'oracle', 'Oracle'
mariadb = 'mariadb', 'MariaDB'
postgresql = 'postgresql', 'PostgreSQL'
sqlserver = 'sqlserver', 'SQLServer'
k8s = 'k8s', 'K8S'
SUPPORT_PUSH_PROTOCOLS = [Protocol.ssh, Protocol.rdp]
@@ -43,7 +46,8 @@ class ProtocolMixin:
Protocol.rdp
]
APPLICATION_CATEGORY_DB_PROTOCOLS = [
Protocol.mysql, Protocol.oracle, Protocol.mariadb, Protocol.postgresql
Protocol.mysql, Protocol.redis, Protocol.oracle,
Protocol.mariadb, Protocol.postgresql, Protocol.sqlserver
]
APPLICATION_CATEGORY_CLOUD_PROTOCOLS = [
Protocol.k8s
@@ -103,16 +107,23 @@ class AuthMixin:
password = cache.get(key)
return password
def load_tmp_auth_if_has(self, asset_or_app_id, user):
if not asset_or_app_id or not user:
return
def _clean_auth_info_if_manual_login_mode(self):
if self.login_mode == self.LOGIN_MANUAL:
self.password = ''
self.private_key = ''
self.public_key = ''
def _load_tmp_auth_if_has(self, asset_or_app_id, user_id):
if self.login_mode != self.LOGIN_MANUAL:
return
auth = self.get_temp_auth(asset_or_app_id, user)
if not asset_or_app_id or not user_id:
return
auth = self.get_temp_auth(asset_or_app_id, user_id)
if not auth:
return
username = auth.get('username')
password = auth.get('password')
@@ -121,55 +132,15 @@ class AuthMixin:
if password:
self.password = password
def load_app_more_auth(self, app_id=None, user_id=None):
from users.models import User
def load_app_more_auth(self, app_id=None, username=None, user_id=None):
self._clean_auth_info_if_manual_login_mode()
# Load temporary authentication info
if self.login_mode == self.LOGIN_MANUAL:
self.password = ''
self.private_key = ''
if not user_id:
self._load_tmp_auth_if_has(app_id, user_id)
return
user = get_object_or_none(User, pk=user_id)
if not user:
return
self.load_tmp_auth_if_has(app_id, user)
def load_asset_special_auth(self, asset, username=''):
"""
"""
authbooks = list(AuthBook.objects.filter(asset=asset, systemuser=self))
if len(authbooks) == 0:
return None
elif len(authbooks) == 1:
authbook = authbooks[0]
else:
authbooks.sort(key=lambda x: 1 if x.username == username else 0, reverse=True)
authbook = authbooks[0]
authbook.load_auth()
self.password = authbook.password
self.private_key = authbook.private_key
self.public_key = authbook.public_key
def load_asset_more_auth(self, asset_id=None, username=None, user_id=None):
# Update the username
from users.models import User
if self.login_mode == self.LOGIN_MANUAL:
self.password = ''
self.private_key = ''
asset = None
if asset_id:
asset = get_object_or_none(Asset, pk=asset_id)
# No asset, no need to continue
if not asset:
logger.debug('Asset not found, pass')
return
user = None
if user_id:
user = get_object_or_none(User, pk=user_id)
_username = self.username
user = get_object_or_none(User, pk=user_id) if user_id else None
if self.username_same_with_user:
if user and not username:
_username = user.username
@@ -177,9 +148,65 @@ class AuthMixin:
_username = username
self.username = _username
def load_asset_special_auth(self, asset, username=''):
"""
Data states of an AuthBook record:
| asset | systemuser | username |
1 | * | * | x |
2 | * | x | * |
AuthBook currently has only these two states; systemuser and username never coexist.
A normal asset/system-user relation produces state 1; a password change produces state 2,
which keeps only the username and no systemuser.
Note: when several system users bound to the same asset share one username, changing the
password affects all of them. This is an asymmetric behaviour (same-name password override):
once a password change has happened, every system user with that username uses the latest
change; if no change ever happened, each keeps its own password.
"""
if username == '':
username = self.username
authbook = AuthBook.objects.filter(
asset=asset, username=username, systemuser__isnull=True
).order_by('-date_created').first()
if not authbook:
authbook = AuthBook.objects.filter(
asset=asset, systemuser=self
).order_by('-date_created').first()
if not authbook:
return None
authbook.load_auth()
self.password = authbook.password
self.private_key = authbook.private_key
self.public_key = authbook.public_key
def load_asset_more_auth(self, asset_id=None, username=None, user_id=None):
from users.models import User
self._clean_auth_info_if_manual_login_mode()
# Load temporary authentication info
if self.login_mode == self.LOGIN_MANUAL:
self._load_tmp_auth_if_has(asset_id, user_id)
return
# Update the username
user = get_object_or_none(User, pk=user_id) if user_id else None
if self.username_same_with_user:
if user and not username:
_username = user.username
else:
_username = username
self.username = _username
# Load the asset-specific credential configuration
self.load_asset_special_auth(asset, _username)
self.load_tmp_auth_if_has(asset_id, user)
asset = get_object_or_none(Asset, pk=asset_id) if asset_id else None
if not asset:
logger.debug('Asset not found, pass')
return
self.load_asset_special_auth(asset, self.username)
class SystemUser(ProtocolMixin, AuthMixin, BaseUser):
@@ -210,12 +237,14 @@ class SystemUser(ProtocolMixin, AuthMixin, BaseUser):
sudo = models.TextField(default='/bin/whoami', verbose_name=_('Sudo'))
shell = models.CharField(max_length=64, default='/bin/bash', verbose_name=_('Shell'))
login_mode = models.CharField(choices=LOGIN_MODE_CHOICES, default=LOGIN_AUTO, max_length=10, verbose_name=_('Login mode'))
cmd_filters = models.ManyToManyField('CommandFilter', related_name='system_users', verbose_name=_("Command filter"), blank=True)
sftp_root = models.CharField(default='tmp', max_length=128, verbose_name=_("SFTP Root"))
token = models.TextField(default='', verbose_name=_('Token'))
home = models.CharField(max_length=4096, default='', verbose_name=_('Home'), blank=True)
system_groups = models.CharField(default='', max_length=4096, verbose_name=_('System groups'), blank=True)
ad_domain = models.CharField(default='', max_length=256)
# Linux su command (switch user)
su_enabled = models.BooleanField(default=False, verbose_name=_('User switch'))
su_from = models.ForeignKey('self', on_delete=models.SET_NULL, related_name='su_to', null=True, verbose_name=_("Switch from"))
def __str__(self):
username = self.username
@@ -275,6 +304,21 @@ class SystemUser(ProtocolMixin, AuthMixin, BaseUser):
assets = Asset.objects.filter(id__in=asset_ids)
return assets
def add_related_assets(self, assets_or_ids):
self.assets.add(*tuple(assets_or_ids))
self.add_related_assets_to_su_from_if_need(assets_or_ids)
def add_related_assets_to_su_from_if_need(self, assets_or_ids):
if self.protocol not in [self.Protocol.ssh.value]:
return
if not self.su_enabled:
return
if not self.su_from:
return
if self.su_from.protocol != self.protocol:
return
self.su_from.assets.add(*tuple(assets_or_ids))
class Meta:
ordering = ['name']
unique_together = [('name', 'org_id')]
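
A small self-contained illustration of the lookup order that load_asset_special_auth documents above; resolve_special_auth is a hypothetical helper, not project code, and the state numbers refer to the docstring's table.

from assets.models import AuthBook

def resolve_special_auth(asset, system_user, username):
    # State 2 first: a username-only record produced by a password change wins when present.
    record = AuthBook.objects.filter(
        asset=asset, username=username, systemuser__isnull=True
    ).order_by('-date_created').first()
    if record is None:
        # State 1 fallback: the record bound to the system user itself.
        record = AuthBook.objects.filter(
            asset=asset, systemuser=system_user
        ).order_by('-date_created').first()
    return record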

View File

@@ -0,0 +1,25 @@
from django.utils.translation import ugettext_lazy as _
from users.models import User
from common.tasks import send_mail_attachment_async
class AccountBackupExecutionTaskMsg(object):
subject = _('Notification of account backup task results')
def __init__(self, name: str, user: User):
self.name = name
self.user = user
@property
def message(self):
name = self.name
if self.user.secret_key:
return _('{} - The account backup task has been completed. See the attachment for details').format(name)
return _("{} - The account backup task has been completed: the encryption password has not been set - "
"please go to personal information -> file encryption password to set the encryption password").format(name)
def publish(self, attachment_list=None):
send_mail_attachment_async.delay(
self.subject, self.message, [self.user.email], attachment_list
)

View File

@@ -11,3 +11,4 @@ from .cmd_filter import *
from .gathered_user import *
from .favorite_asset import *
from .account import *
from .backup import *

View File

@@ -6,16 +6,25 @@ from orgs.mixins.serializers import BulkOrgResourceModelSerializer
from .base import AuthSerializerMixin
from .utils import validate_password_contains_left_double_curly_bracket
from common.utils.encode import ssh_pubkey_gen
class AccountSerializer(AuthSerializerMixin, BulkOrgResourceModelSerializer):
ip = serializers.ReadOnlyField(label=_("IP"))
hostname = serializers.ReadOnlyField(label=_("Hostname"))
platform = serializers.ReadOnlyField(label=_("Platform"))
protocols = serializers.SerializerMethodField(label=_("Protocols"))
date_created = serializers.DateTimeField(
label=_('Date created'), format="%Y/%m/%d %H:%M:%S", read_only=True
)
date_updated = serializers.DateTimeField(
label=_('Date updated'), format="%Y/%m/%d %H:%M:%S", read_only=True
)
class Meta:
model = AuthBook
fields_mini = ['id', 'username', 'ip', 'hostname', 'version']
fields_write_only = ['password', 'private_key', "public_key"]
fields_mini = ['id', 'username', 'ip', 'hostname', 'platform', 'protocols', 'version']
fields_write_only = ['password', 'private_key', "public_key", 'passphrase']
fields_other = ['date_created', 'date_updated', 'connectivity', 'date_verified', 'comment']
fields_small = fields_mini + fields_write_only + fields_other
fields_fk = ['asset', 'systemuser', 'systemuser_display']
@@ -32,6 +41,24 @@ class AccountSerializer(AuthSerializerMixin, BulkOrgResourceModelSerializer):
}
ref_name = 'AssetAccountSerializer'
def _validate_gen_key(self, attrs):
private_key = attrs.get('private_key')
if not private_key:
return attrs
password = attrs.get('passphrase')
username = attrs.get('username')
public_key = ssh_pubkey_gen(private_key, password=password, username=username)
attrs['public_key'] = public_key
return attrs
def validate(self, attrs):
attrs = self._validate_gen_key(attrs)
return attrs
def get_protocols(self, v):
return v.protocols.replace(' ', ', ')
@classmethod
def setup_eager_loading(cls, queryset):
""" Perform necessary eager loading of data. """
@@ -45,6 +72,10 @@ class AccountSerializer(AuthSerializerMixin, BulkOrgResourceModelSerializer):
class AccountSecretSerializer(AccountSerializer):
class Meta(AccountSerializer.Meta):
fields_backup = [
'hostname', 'ip', 'platform', 'protocols', 'username', 'password',
'private_key', 'public_key', 'date_created', 'date_updated', 'version'
]
extra_kwargs = {
'password': {'write_only': False},
'private_key': {'write_only': False},

View File

@@ -10,7 +10,7 @@ from ..models import Asset, Node, Platform, SystemUser
__all__ = [
'AssetSerializer', 'AssetSimpleSerializer', 'MiniAssetSerializer',
'ProtocolsField', 'PlatformSerializer',
'AssetTaskSerializer', 'AssetsTaskSerializer', 'ProtocolsField'
'AssetTaskSerializer', 'AssetsTaskSerializer', 'ProtocolsField',
]
@@ -64,7 +64,12 @@ class AssetSerializer(BulkOrgResourceModelSerializer):
)
protocols = ProtocolsField(label=_('Protocols'), required=False, default=['ssh/22'])
domain_display = serializers.ReadOnlyField(source='domain.name', label=_('Domain name'))
nodes_display = serializers.ListField(child=serializers.CharField(), label=_('Nodes name'), required=False)
nodes_display = serializers.ListField(
child=serializers.CharField(), label=_('Nodes name'), required=False
)
labels_display = serializers.ListField(
child=serializers.CharField(), label=_('Labels name'), required=False, read_only=True
)
"""
Data structure of an asset
@@ -74,34 +79,33 @@ class AssetSerializer(BulkOrgResourceModelSerializer):
model = Asset
fields_mini = ['id', 'hostname', 'ip', 'platform', 'protocols']
fields_small = fields_mini + [
'protocol', 'port', 'protocols', 'is_active', 'public_ip',
'comment',
'protocol', 'port', 'protocols', 'is_active',
'public_ip', 'number', 'comment',
]
hardware_fields = [
'number', 'vendor', 'model', 'sn', 'cpu_model', 'cpu_count',
fields_hardware = [
'vendor', 'model', 'sn', 'cpu_model', 'cpu_count',
'cpu_cores', 'cpu_vcpus', 'memory', 'disk_total', 'disk_info',
'os', 'os_version', 'os_arch', 'hostname_raw', 'hardware_info',
'connectivity', 'date_verified'
'os', 'os_version', 'os_arch', 'hostname_raw',
'cpu_info', 'hardware_info',
]
fields_fk = [
'domain', 'domain_display', 'platform', 'admin_user', 'admin_user_display'
]
fields_m2m = [
'nodes', 'nodes_display', 'labels',
'nodes', 'nodes_display', 'labels', 'labels_display',
]
read_only_fields = [
'connectivity', 'date_verified', 'cpu_info', 'hardware_info',
'created_by', 'date_created',
]
fields = fields_small + hardware_fields + fields_fk + fields_m2m + read_only_fields
extra_kwargs = {k: {'read_only': True} for k in hardware_fields}
extra_kwargs.update({
fields = fields_small + fields_hardware + fields_fk + fields_m2m + read_only_fields
extra_kwargs = {
'protocol': {'write_only': True},
'port': {'write_only': True},
'hardware_info': {'label': _('Hardware info'), 'read_only': True},
'org_name': {'label': _('Org name'), 'read_only': True},
'admin_user_display': {'label': _('Admin user display'), 'read_only': True},
})
'cpu_info': {'label': _('CPU info')},
}
def get_fields(self):
fields = super().get_fields()

View File

@@ -0,0 +1,52 @@
# -*- coding: utf-8 -*-
#
from django.utils.translation import ugettext as _
from rest_framework import serializers
from orgs.mixins.serializers import BulkOrgResourceModelSerializer
from ops.mixin import PeriodTaskSerializerMixin
from common.utils import get_logger
from .base import TypesField
from ..models import AccountBackupPlan, AccountBackupPlanExecution
logger = get_logger(__file__)
__all__ = ['AccountBackupPlanSerializer', 'AccountBackupPlanExecutionSerializer']
class AccountBackupPlanSerializer(PeriodTaskSerializerMixin, BulkOrgResourceModelSerializer):
types = TypesField(required=False, allow_null=True, label=_("Actions"))
class Meta:
model = AccountBackupPlan
fields = [
'id', 'name', 'is_periodic', 'interval', 'crontab', 'date_created',
'date_updated', 'created_by', 'periodic_display', 'comment',
'recipients', 'types'
]
extra_kwargs = {
'name': {'required': True},
'periodic_display': {'label': _('Periodic perform')},
'recipients': {'label': _('Recipient'), 'help_text': _(
'Currently only mail sending is supported'
)}
}
class AccountBackupPlanExecutionSerializer(serializers.ModelSerializer):
trigger_display = serializers.ReadOnlyField(
source='get_trigger_display', label=_('Trigger mode')
)
class Meta:
model = AccountBackupPlanExecution
fields = [
'id', 'date_start', 'timedelta', 'plan_snapshot', 'trigger', 'reason',
'is_success', 'plan', 'org_id', 'recipients', 'trigger_display'
]
read_only_fields = (
'id', 'date_start', 'timedelta', 'plan_snapshot', 'trigger', 'reason',
'is_success', 'org_id', 'recipients'
)

View File

@@ -1,10 +1,12 @@
# -*- coding: utf-8 -*-
#
from io import StringIO
from django.utils.translation import ugettext as _
from django.utils.translation import ugettext_lazy as _
from rest_framework import serializers
from common.utils import ssh_pubkey_gen, validate_ssh_private_key
from common.utils import ssh_pubkey_gen, ssh_private_key_gen, validate_ssh_private_key
from assets.models import Type
class AuthSerializer(serializers.ModelSerializer):
@@ -28,17 +30,28 @@ class AuthSerializer(serializers.ModelSerializer):
return self.instance
class AuthSerializerMixin:
class AuthSerializerMixin(serializers.ModelSerializer):
passphrase = serializers.CharField(
allow_blank=True, allow_null=True, required=False, max_length=512,
write_only=True, label=_('Key password')
)
def validate_password(self, password):
return password
def validate_private_key(self, private_key):
if not private_key:
return
password = self.initial_data.get("password")
valid = validate_ssh_private_key(private_key, password)
passphrase = self.initial_data.get('passphrase')
passphrase = passphrase if passphrase else None
valid = validate_ssh_private_key(private_key, password=passphrase)
if not valid:
raise serializers.ValidationError(_("private key invalid"))
raise serializers.ValidationError(_("private key invalid or passphrase error"))
private_key = ssh_private_key_gen(private_key, password=passphrase)
string_io = StringIO()
private_key.write_private_key(string_io)
private_key = string_io.getvalue()
return private_key
def validate_public_key(self, public_key):
@@ -50,6 +63,7 @@ class AuthSerializerMixin:
value = validated_data.get(field)
if not value:
validated_data.pop(field, None)
validated_data.pop('passphrase', None)
def create(self, validated_data):
self.clean_auth_fields(validated_data)
@@ -58,3 +72,24 @@ class AuthSerializerMixin:
def update(self, instance, validated_data):
self.clean_auth_fields(validated_data)
return super().update(instance, validated_data)
class TypesField(serializers.MultipleChoiceField):
def __init__(self, *args, **kwargs):
kwargs['choices'] = Type.CHOICES
super().__init__(*args, **kwargs)
def to_representation(self, value):
return Type.value_to_choices(value)
def to_internal_value(self, data):
if data is None:
return data
return Type.choices_to_value(data)
class ActionsDisplayField(TypesField):
def to_representation(self, value):
values = super().to_representation(value)
choices = dict(Type.CHOICES)
return [choices.get(i) for i in values]
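
The validate_private_key change above checks the uploaded key against the optional passphrase and re-writes it through ssh_private_key_gen so a normalized key is stored. A standalone sketch of that idea using paramiko directly; the project's validate_ssh_private_key / ssh_private_key_gen helpers are assumed to wrap something comparable, and this sketch handles RSA keys only.

from io import StringIO
import paramiko

def normalize_private_key(private_key_text, passphrase=None):
    # Parse (and decrypt, if a passphrase is given) the key, then serialize it again
    # so the stored copy is a plain, consistently formatted private key.
    key = paramiko.RSAKey.from_private_key(StringIO(private_key_text), password=passphrase)
    out = StringIO()
    key.write_private_key(out)
    return out.getvalue()

Real keys may also be Ed25519 or ECDSA; the project helpers presumably try several key classes, which this sketch omits.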

View File

@@ -3,6 +3,7 @@
import re
from rest_framework import serializers
from django.utils.translation import ugettext_lazy as _
from ..models import CommandFilter, CommandFilterRule
from orgs.mixins.serializers import BulkOrgResourceModelSerializer
from orgs.utils import tmp_to_root_org
@@ -22,16 +23,14 @@ class CommandFilterSerializer(BulkOrgResourceModelSerializer):
'comment', 'created_by',
]
fields_fk = ['rules']
fields_m2m = ['system_users']
fields_m2m = ['users', 'user_groups', 'system_users', 'assets', 'applications']
fields = fields_small + fields_fk + fields_m2m
extra_kwargs = {
'rules': {'read_only': True},
'system_users': {'required': False},
'rules': {'read_only': True}
}
class CommandFilterRuleSerializer(BulkOrgResourceModelSerializer):
invalid_pattern = re.compile(r'[\.\*\+\[\\\?\{\}\^\$\|\(\)\#\<\>]')
type_display = serializers.ReadOnlyField(source='get_type_display')
action_display = serializers.ReadOnlyField(source='get_action_display')
@@ -39,13 +38,13 @@ class CommandFilterRuleSerializer(BulkOrgResourceModelSerializer):
model = CommandFilterRule
fields_mini = ['id']
fields_small = fields_mini + [
'type', 'type_display', 'content', 'priority',
'type', 'type_display', 'content', 'ignore_case', 'pattern', 'priority',
'action', 'action_display', 'reviewers',
'date_created', 'date_updated',
'comment', 'created_by',
]
fields_fk = ['filter']
fields = '__all__'
fields = fields_small + fields_fk
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
@@ -61,15 +60,17 @@ class CommandFilterRuleSerializer(BulkOrgResourceModelSerializer):
choices.pop(CommandFilterRule.ActionChoices.confirm, None)
action._choices = choices
# def validate_content(self, content):
# tp = self.initial_data.get("type")
# if tp == CommandFilterRule.TYPE_REGEX:
# return content
# if self.invalid_pattern.search(content):
# invalid_char = self.invalid_pattern.pattern.replace('\\', '')
# msg = _("Content should not be contain: {}").format(invalid_char)
# raise serializers.ValidationError(msg)
# return content
def validate_content(self, content):
tp = self.initial_data.get("type")
if tp == CommandFilterRule.TYPE_COMMAND:
regex = CommandFilterRule.construct_command_regex(content)
else:
regex = content
ignore_case = self.initial_data.get('ignore_case')
succeed, error, pattern = CommandFilterRule.compile_regex(regex, ignore_case)
if not succeed:
raise serializers.ValidationError(error)
return content
class CommandConfirmSerializer(serializers.Serializer):

View File

@@ -3,6 +3,7 @@
from rest_framework import serializers
from django.utils.translation import ugettext_lazy as _
from common.validators import alphanumeric
from orgs.mixins.serializers import BulkOrgResourceModelSerializer
from ..models import Domain, Gateway
from .base import AuthSerializerMixin
@@ -48,7 +49,7 @@ class GatewaySerializer(AuthSerializerMixin, BulkOrgResourceModelSerializer):
model = Gateway
fields_mini = ['id', 'name']
fields_write_only = [
'password', 'private_key', 'public_key',
'password', 'private_key', 'public_key', 'passphrase'
]
fields_small = fields_mini + fields_write_only + [
'username', 'ip', 'port', 'protocol',
@@ -59,6 +60,7 @@ class GatewaySerializer(AuthSerializerMixin, BulkOrgResourceModelSerializer):
fields_fk = ['domain']
fields = fields_small + fields_fk
extra_kwargs = {
'username': {"validators": [alphanumeric]},
'password': {'write_only': True},
'private_key': {"write_only": True},
'public_key': {"write_only": True},

View File

@@ -5,7 +5,6 @@ from django.utils.translation import ugettext as _
from orgs.mixins.serializers import BulkOrgResourceModelSerializer
from ..models import Asset, Node
__all__ = [
'NodeSerializer', "NodeAddChildrenSerializer",
"NodeAssetsSerializer", "NodeTaskSerializer",
@@ -17,6 +16,9 @@ class NodeSerializer(BulkOrgResourceModelSerializer):
value = serializers.CharField(
required=False, allow_blank=True, allow_null=True, label=_("value")
)
full_value = serializers.CharField(
required=False, allow_blank=True, allow_null=True, label=_("Full value")
)
class Meta:
model = Node
@@ -40,6 +42,19 @@ class NodeSerializer(BulkOrgResourceModelSerializer):
)
return data
def create(self, validated_data):
full_value = validated_data.get('full_value')
# Create the whole multi-level path directly from full_value
if full_value:
node = Node.create_node_by_full_value(full_value)
# Otherwise create a child of the org root node from value
else:
key = Node.org_root().get_next_child_key()
validated_data['key'] = key
node = Node.objects.create(**validated_data)
return node
class NodeAssetsSerializer(BulkOrgResourceModelSerializer):
assets = serializers.PrimaryKeyRelatedField(

View File

@@ -4,6 +4,7 @@ from django.db.models import Count
from common.mixins.serializers import BulkSerializerMixin
from common.utils import ssh_pubkey_gen
from common.validators import alphanumeric_re, alphanumeric_cn_re, alphanumeric_win_re
from orgs.mixins.serializers import BulkOrgResourceModelSerializer
from ..models import SystemUser, Asset
from .utils import validate_password_contains_left_double_curly_bracket
@@ -25,26 +26,32 @@ class SystemUserSerializer(AuthSerializerMixin, BulkOrgResourceModelSerializer):
auto_generate_key = serializers.BooleanField(initial=True, required=False, write_only=True)
type_display = serializers.ReadOnlyField(source='get_type_display', label=_('Type display'))
ssh_key_fingerprint = serializers.ReadOnlyField(label=_('SSH key fingerprint'))
applications_amount = serializers.IntegerField(
source='apps_amount', read_only=True, label=_('Apps amount')
)
class Meta:
model = SystemUser
fields_mini = ['id', 'name', 'username']
fields_write_only = ['password', 'public_key', 'private_key']
fields_write_only = ['password', 'public_key', 'private_key', 'passphrase']
fields_small = fields_mini + fields_write_only + [
'token', 'ssh_key_fingerprint',
'type', 'type_display', 'protocol', 'is_asset_protocol',
'login_mode', 'login_mode_display', 'priority',
'sudo', 'shell', 'sftp_root', 'home', 'system_groups', 'ad_domain',
'username_same_with_user', 'auto_push', 'auto_generate_key',
'su_enabled', 'su_from',
'date_created', 'date_updated', 'comment', 'created_by',
]
fields_m2m = ['cmd_filters', 'assets_amount', 'nodes']
fields_m2m = ['cmd_filters', 'assets_amount', 'applications_amount', 'nodes']
fields = fields_small + fields_m2m
extra_kwargs = {
'password': {
"write_only": True,
'trim_whitespace': False,
"validators": [validate_password_contains_left_double_curly_bracket]
},
'cmd_filters': {"required": False, 'label': _('Command filter')},
'public_key': {"write_only": True},
'private_key': {"write_only": True},
'token': {"write_only": True},
@@ -53,7 +60,8 @@ class SystemUserSerializer(AuthSerializerMixin, BulkOrgResourceModelSerializer):
'login_mode_display': {'label': _('Login mode display')},
'created_by': {'read_only': True},
'ad_domain': {'required': False, 'allow_blank': True, 'label': _('Ad domain')},
'is_asset_protocol': {'label': _('Is asset protocol')}
'is_asset_protocol': {'label': _('Is asset protocol')},
'su_from': {'help_text': _('Only ssh and automatic login system users are supported')}
}
def validate_auto_push(self, value):
@@ -97,16 +105,25 @@ class SystemUserSerializer(AuthSerializerMixin, BulkOrgResourceModelSerializer):
raise serializers.ValidationError(error)
def validate_username(self, username):
if username:
return username
login_mode = self.get_initial_value("login_mode")
protocol = self.get_initial_value("protocol")
username_same_with_user = self.get_initial_value("username_same_with_user")
if username:
if protocol == SystemUser.Protocol.telnet:
regx = alphanumeric_cn_re
elif protocol == SystemUser.Protocol.rdp:
regx = alphanumeric_win_re
else:
regx = alphanumeric_re
if not regx.match(username):
raise serializers.ValidationError(_('Special char not allowed'))
return username
username_same_with_user = self.get_initial_value("username_same_with_user")
if username_same_with_user:
return ''
if login_mode == SystemUser.LOGIN_AUTO and protocol != SystemUser.Protocol.vnc:
login_mode = self.get_initial_value("login_mode")
if login_mode == SystemUser.LOGIN_AUTO and protocol != SystemUser.Protocol.vnc \
and protocol != SystemUser.Protocol.redis:
msg = _('* Automatic login mode must fill in the username.')
raise serializers.ValidationError(msg)
return username
@@ -117,7 +134,8 @@ class SystemUserSerializer(AuthSerializerMixin, BulkOrgResourceModelSerializer):
return ''
return home
def validate_sftp_root(self, value):
@staticmethod
def validate_sftp_root(value):
if value in ['home', 'tmp']:
return value
if not value.startswith('/'):
@@ -125,7 +143,41 @@ class SystemUserSerializer(AuthSerializerMixin, BulkOrgResourceModelSerializer):
raise serializers.ValidationError(error)
return value
def validate_admin_user(self, attrs):
def validate_password(self, password):
super().validate_password(password)
auto_gen_key = self.get_initial_value('auto_generate_key', False)
private_key = self.get_initial_value('private_key')
login_mode = self.get_initial_value('login_mode')
if not self.instance and not auto_gen_key and not password and \
not private_key and login_mode == SystemUser.LOGIN_AUTO:
raise serializers.ValidationError(_("Password or private key required"))
return password
def validate_su_from(self, su_from: SystemUser):
# self: su enabled
su_enabled = self.get_initial_value('su_enabled', default=False)
if not su_enabled:
return
if not su_from:
error = _('This field is required.')
raise serializers.ValidationError(error)
# self: protocol ssh
protocol = self.get_initial_value('protocol', default=SystemUser.Protocol.ssh.value)
if protocol not in [SystemUser.Protocol.ssh.value]:
error = _('Only ssh protocol system users are allowed')
raise serializers.ValidationError(error)
# su_from: protocol same
if su_from.protocol != protocol:
error = _('The protocol must be consistent with the current user: {}').format(protocol)
raise serializers.ValidationError(error)
# su_from: login model auto
if su_from.login_mode != su_from.LOGIN_AUTO:
error = _('Only system users with automatic login are allowed')
raise serializers.ValidationError(error)
return su_from
def _validate_admin_user(self, attrs):
if self.instance:
tp = self.instance.type
else:
@@ -138,21 +190,10 @@ class SystemUserSerializer(AuthSerializerMixin, BulkOrgResourceModelSerializer):
attrs['auto_push'] = False
return attrs
def validate_password(self, password):
super().validate_password(password)
auto_gen_key = self.get_initial_value("auto_generate_key", False)
private_key = self.get_initial_value("private_key")
login_mode = self.get_initial_value("login_mode")
if not self.instance and not auto_gen_key and not password and \
not private_key and login_mode == SystemUser.LOGIN_AUTO:
raise serializers.ValidationError(_("Password or private key required"))
return password
def validate_gen_key(self, attrs):
username = attrs.get("username", "manual")
auto_gen_key = attrs.pop("auto_generate_key", False)
protocol = attrs.get("protocol")
def _validate_gen_key(self, attrs):
username = attrs.get('username', 'manual')
auto_gen_key = attrs.pop('auto_generate_key', False)
protocol = attrs.get('protocol')
if protocol not in SystemUser.SUPPORT_PUSH_PROTOCOLS:
return attrs
@@ -160,28 +201,44 @@ class SystemUserSerializer(AuthSerializerMixin, BulkOrgResourceModelSerializer):
# Auto-generate credentials
if auto_gen_key and not self.instance:
password = SystemUser.gen_password()
attrs["password"] = password
attrs['password'] = password
if protocol == SystemUser.Protocol.ssh:
private_key, public_key = SystemUser.gen_key(username)
attrs["private_key"] = private_key
attrs["public_key"] = public_key
attrs['private_key'] = private_key
attrs['public_key'] = public_key
# If a private key is set but no public key, generate the public key from it
elif attrs.get("private_key", None):
private_key = attrs["private_key"]
password = attrs.get("password")
elif attrs.get('private_key'):
private_key = attrs['private_key']
password = attrs.get('password')
public_key = ssh_pubkey_gen(private_key, password=password, username=username)
attrs["public_key"] = public_key
attrs['public_key'] = public_key
return attrs
def _validate_login_mode(self, attrs):
if 'login_mode' in attrs:
login_mode = attrs['login_mode']
else:
login_mode = self.instance.login_mode if self.instance else SystemUser.LOGIN_AUTO
if login_mode == SystemUser.LOGIN_MANUAL:
attrs['password'] = ''
attrs['private_key'] = ''
attrs['public_key'] = ''
return attrs
def validate(self, attrs):
attrs = self.validate_admin_user(attrs)
attrs = self.validate_gen_key(attrs)
attrs = self._validate_admin_user(attrs)
attrs = self._validate_gen_key(attrs)
attrs = self._validate_login_mode(attrs)
return attrs
@classmethod
def setup_eager_loading(cls, queryset):
""" Perform necessary eager loading of data. """
queryset = queryset.annotate(assets_amount=Count("assets"))
queryset = queryset \
.annotate(assets_amount=Count("assets")) \
.prefetch_related('nodes', 'cmd_filters')
return queryset
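
For illustration, a system user payload that would satisfy the validate_su_from checks above (su enabled, ssh protocol on both sides, and an su_from user in automatic login mode); the field values and UUID are placeholders, not taken from the diff.

payload = {
    "name": "app-user",
    "username": "app",
    "protocol": "ssh",
    "login_mode": "auto",
    "su_enabled": True,
    # must reference an existing ssh system user whose login_mode is auto
    "su_from": "2f6a4f3e-1111-2222-3333-444444444444",
}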

View File

@@ -34,9 +34,13 @@ def on_authbook_post_delete(sender, instance, **kwargs):
@receiver(post_save, sender=AuthBook)
def on_authbook_post_create(sender, instance, **kwargs):
def on_authbook_post_create(sender, instance, created, **kwargs):
instance.sync_to_system_user_account()
instance.update_asset_admin_user_if_need()
if created:
pass
# # 不再自动更新资产管理用户,只允许用户手动指定。
# 只在创建时进行更新资产的管理用户
# instance.update_asset_admin_user_if_need()
@receiver(pre_save, sender=AuthBook)

View File

@@ -40,10 +40,6 @@ def expire_node_assets_mapping_for_memory(org_id):
root_org_id = Organization.ROOT_ID
# Clear in the current process (cache data)
logger.debug(
"Expire node assets id mapping from cache of org={}, pid={}"
"".format(org_id, os.getpid())
)
Node.expire_node_all_asset_ids_mapping_from_cache(org_id)
Node.expire_node_all_asset_ids_mapping_from_cache(root_org_id)
@@ -77,25 +73,14 @@ def on_node_asset_change(sender, instance, **kwargs):
def subscribe_node_assets_mapping_expire(sender, **kwargs):
logger.debug("Start subscribe for expire node assets id mapping from memory")
def keep_subscribe():
while True:
try:
subscribe = node_assets_mapping_for_memory_pub_sub.subscribe()
for message in subscribe.listen():
if message["type"] != "message":
continue
org_id = message['data'].decode()
root_org_id = Organization.ROOT_ID
Node.expire_node_all_asset_ids_mapping_from_memory(org_id)
Node.expire_node_all_asset_ids_mapping_from_memory(root_org_id)
logger.debug(
"Expire node assets id mapping from memory of org={}, pid={}"
"".format(str(org_id), os.getpid())
)
except Exception as e:
logger.exception(f'subscribe_node_assets_mapping_expire: {e}')
Node.expire_all_orgs_node_all_asset_ids_mapping_from_memory()
def handle_node_relation_change(org_id):
root_org_id = Organization.ROOT_ID
Node.expire_node_all_asset_ids_mapping_from_memory(org_id)
Node.expire_node_all_asset_ids_mapping_from_memory(root_org_id)
t = threading.Thread(target=keep_subscribe)
def keep_subscribe_node_assets_relation():
node_assets_mapping_for_memory_pub_sub.subscribe(handle_node_relation_change)
t = threading.Thread(target=keep_subscribe_node_assets_relation)
t.daemon = True
t.start()

View File

@@ -140,3 +140,5 @@ def on_system_user_update(instance: SystemUser, created, **kwargs):
logger.info("System user update signal recv: {}".format(instance))
assets = instance.assets.all().valid()
push_system_user_to_assets.delay(instance.id, [_asset.id for _asset in assets])
# add assets to su_from
instance.add_related_assets_to_su_from_if_need(assets)

View File

@@ -0,0 +1 @@
from .endpoint import *

View File

@@ -0,0 +1,228 @@
import os
import time
import pandas as pd
from collections import defaultdict, OrderedDict
from django.conf import settings
from django.utils.translation import ugettext_lazy as _
from rest_framework import serializers
from assets.models import AuthBook
from assets.serializers import AccountSecretSerializer
from assets.notifications import AccountBackupExecutionTaskMsg
from applications.models import Account
from applications.const import AppType
from applications.serializers import AppAccountSecretSerializer
from users.models import User
from common.utils import get_logger
from common.utils.timezone import local_now_display
from common.utils.file import encrypt_and_compress_zip_file
logger = get_logger(__file__)
PATH = os.path.join(os.path.dirname(settings.BASE_DIR), 'tmp')
class BaseAccountHandler:
@classmethod
def unpack_data(cls, serializer_data, data=None):
if data is None:
data = {}
for k, v in serializer_data.items():
if isinstance(v, OrderedDict):
cls.unpack_data(v, data)
else:
data[k] = v
return data
@classmethod
def get_header_fields(cls, serializer: serializers.Serializer):
try:
backup_fields = getattr(serializer, 'Meta').fields_backup
except AttributeError:
backup_fields = serializer.fields.keys()
header_fields = {}
for field in backup_fields:
v = serializer.fields[field]
if isinstance(v, serializers.Serializer):
_fields = cls.get_header_fields(v)
header_fields.update(_fields)
else:
header_fields[field] = v.label
return header_fields
@classmethod
def create_row(cls, account, serializer_cls, header_fields=None):
serializer = serializer_cls(account)
if not header_fields:
header_fields = cls.get_header_fields(serializer)
data = cls.unpack_data(serializer.data)
row_dict = {}
for field, header_name in header_fields.items():
row_dict[header_name] = data[field]
return row_dict
class AssetAccountHandler(BaseAccountHandler):
@staticmethod
def get_filename(plan_name):
filename = os.path.join(
PATH, f'{plan_name}-{_("Asset")}-{local_now_display()}-{time.time()}.xlsx'
)
return filename
@classmethod
def create_df(cls):
df_dict = defaultdict(list)
sheet_name = AuthBook._meta.verbose_name
accounts = AuthBook.get_queryset().select_related('systemuser')
if not accounts.first():
return df_dict
header_fields = cls.get_header_fields(AccountSecretSerializer(accounts.first()))
for account in accounts:
account.load_auth()
row = cls.create_row(account, AccountSecretSerializer, header_fields)
df_dict[sheet_name].append(row)
for k, v in df_dict.items():
df_dict[k] = pd.DataFrame(v)
logger.info('\n\033[33m- Collected {} asset accounts in total\033[0m'.format(accounts.count()))
return df_dict
class AppAccountHandler(BaseAccountHandler):
@staticmethod
def get_filename(plan_name):
filename = os.path.join(
PATH, f'{plan_name}-{_("Application")}-{local_now_display()}-{time.time()}.xlsx'
)
return filename
@classmethod
def create_df(cls):
df_dict = defaultdict(list)
accounts = Account.get_queryset().select_related('systemuser')
for account in accounts:
account.load_auth()
app_type = account.type
sheet_name = AppType.get_label(app_type)
row = cls.create_row(account, AppAccountSecretSerializer)
df_dict[sheet_name].append(row)
for k, v in df_dict.items():
df_dict[k] = pd.DataFrame(v)
logger.info('\n\033[33m- Collected {} application accounts in total\033[0m'.format(accounts.count()))
return df_dict
handler_map = {
'asset': AssetAccountHandler,
'application': AppAccountHandler
}
class AccountBackupHandler:
def __init__(self, execution):
self.execution = execution
self.plan_name = self.execution.plan.name
self.is_frozen = False  # Flag marking the task status as frozen
def create_excel(self):
logger.info(
'\n'
'\033[32m>>> Generating backup files for asset and application accounts\033[0m'
''
)
# Print task start date
time_start = time.time()
files = []
for account_type in self.execution.types:
handler = handler_map.get(account_type)
if not handler:
continue
df_dict = handler.create_df()
if not df_dict:
continue
filename = handler.get_filename(self.plan_name)
with pd.ExcelWriter(filename) as w:
for sheet, df in df_dict.items():
sheet = sheet.replace(' ', '-')
df.to_excel(w, sheet_name=sheet, index=False)
files.append(filename)
timedelta = round((time.time() - time_start), 2)
logger.info('Step finished: took {}s'.format(timedelta))
return files
def send_backup_mail(self, files):
recipients = self.execution.plan_snapshot.get('recipients')
if not recipients:
return
if not files:
return
recipients = User.objects.filter(id__in=list(recipients))
logger.info(
'\n'
'\033[32m>>> Sending the backup email\033[0m'
''
)
plan_name = self.plan_name
for user in recipients:
if not user.secret_key:
attachment_list = []
else:
password = user.secret_key.encode('utf8')
attachment = os.path.join(PATH, f'{plan_name}-{local_now_display()}-{time.time()}.zip')
encrypt_and_compress_zip_file(attachment, password, files)
attachment_list = [attachment, ]
AccountBackupExecutionTaskMsg(plan_name, user).publish(attachment_list)
logger.info('Email sent to {} ({})'.format(user, user.email))
for file in files:
os.remove(file)
def step_perform_task_update(self, is_success, reason):
self.execution.reason = reason[:1024]
self.execution.is_success = is_success
self.execution.save()
logger.info('Task status updated')
def step_finished(self, is_success):
if is_success:
logger.info('Task executed successfully')
else:
logger.error('Task execution failed')
def _run(self):
is_success = False
error = '-'
try:
files = self.create_excel()
self.send_backup_mail(files)
except Exception as e:
self.is_frozen = True
logger.error('Task execution was interrupted by an exception')
logger.info('Traceback of the exception is printed below:')
logger.error(e, exc_info=True)
error = str(e)
else:
is_success = True
finally:
reason = error
self.step_perform_task_update(is_success, reason)
self.step_finished(is_success)
def run(self):
logger.info('Task started: {}'.format(local_now_display()))
time_start = time.time()
try:
self._run()
except Exception as e:
logger.error('An exception occurred while running the task')
logger.error('Traceback of the exception is shown below:')
logger.error(e, exc_info=True)
finally:
logger.info('\nTask finished: {}'.format(local_now_display()))
timedelta = round((time.time() - time_start), 2)
logger.info('Took: {}s'.format(timedelta))
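
create_excel above collects one list of rows per sheet name and writes them all into a single workbook. A standalone sketch of that multi-sheet export with pandas — the column names and output path are made up for illustration, and writing .xlsx needs an engine such as openpyxl installed:

import pandas as pd

rows_by_sheet = {
    'AuthBook': [{'Hostname': 'web-1', 'Username': 'root', 'Password': '***'}],
    'MySQL': [{'Name': 'orders-db', 'Username': 'app', 'Password': '***'}],
}

with pd.ExcelWriter('/tmp/demo-backup.xlsx') as writer:
    for sheet, rows in rows_by_sheet.items():
        df = pd.DataFrame(rows)
        # mirror the handler: sanitize the sheet name and drop the index column
        df.to_excel(writer, sheet_name=sheet.replace(' ', '-'), index=False)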

View File

@@ -0,0 +1,48 @@
# -*- coding: utf-8 -*-
#
import time
from django.utils import timezone
from common.utils import get_logger
from common.utils.timezone import local_now_display
from .handlers import AccountBackupHandler
logger = get_logger(__name__)
class AccountBackupExecutionManager:
def __init__(self, execution):
self.execution = execution
self.date_start = timezone.now()
self.time_start = time.time()
self.date_end = None
self.time_end = None
self.timedelta = 0
def do_run(self):
execution = self.execution
logger.info('\n\033[33m# The account backup plan is running\033[0m')
handler = AccountBackupHandler(execution)
handler.run()
def pre_run(self):
self.execution.date_start = self.date_start
self.execution.save()
def post_run(self):
self.time_end = time.time()
self.date_end = timezone.now()
logger.info('\n\n' + '-' * 80)
logger.info('Plan execution finished at {}\n'.format(local_now_display()))
self.timedelta = self.time_end - self.time_start
logger.info('Took: {}s'.format(self.timedelta))
self.execution.timedelta = self.timedelta
self.execution.save()
def run(self):
self.pre_run()
self.do_run()
self.post_run()
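
AccountBackupExecutionManager is a small pre_run/do_run/post_run template: stamp the start time, delegate the real work to AccountBackupHandler, then persist how long the run took. A self-contained sketch of that shape — the try/finally is a hardening not present in the original, added here so post_run always records the timing:

import time
from datetime import datetime


class BaseExecutionManager:
    """Minimal sketch of the pre_run/do_run/post_run timing template."""

    def __init__(self):
        self.time_start = None
        self.timedelta = 0

    def pre_run(self):
        self.time_start = time.time()
        print('started at', datetime.now())

    def do_run(self):
        time.sleep(0.1)  # the real manager delegates to AccountBackupHandler here

    def post_run(self):
        self.timedelta = round(time.time() - self.time_start, 2)
        print('finished, took {}s'.format(self.timedelta))

    def run(self):
        self.pre_run()
        try:
            self.do_run()
        finally:
            self.post_run()


BaseExecutionManager().run()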

Some files were not shown because too many files have changed in this diff.