Compare commits


197 Commits

Author SHA1 Message Date
Jiangjie Bai
1d0db2ba8b Merge pull request #16316 from jumpserver/dev
v4.10.13-lts
2025-11-20 20:20:02 +08:00
Bai
c97124c279 perf: client pkg rename 2025-11-20 17:59:22 +08:00
Bai
32a766ed34 perf: client pkg rename 2025-11-20 17:59:22 +08:00
Bai
58fd15d743 perf: client pkg rename 2025-11-20 17:59:22 +08:00
feng
f50250dedb perf: Client version 2025-11-20 16:37:23 +08:00
wangruidong
9e150b7fbe fix: One login lock, resulting in two logs 2025-11-20 15:01:06 +08:00
wangruidong
16c79f59a7 fix: Handle case where all time_periods have empty values as a selection of all 2025-11-20 11:31:09 +08:00
wangruidong
be0f04862a fix: Correctly pass runas value in ACL check for job execution 2025-11-19 19:08:29 +08:00
feng
1a3fb2f0db perf: Account bulk error prompt 2025-11-19 17:42:39 +08:00
Eric
4cd70efe66 perf: fix mp4 type replay 2025-11-19 17:10:26 +08:00
wangruidong
28700c01c8 perf: The login log records the locked login log 2025-11-19 17:08:55 +08:00
wangruidong
4524822245 fix: Resolve the "This version of MySQL doesn't yet support 'LIMIT & IN/ALL/ANY/SOME subquery'" error 2025-11-19 09:52:05 +08:00
Eric
9d04fda018 perf: add match perm to user for suggestions api 2025-11-19 09:48:31 +08:00
老广
01c277cd1e Add Client to JumpServer components list 2025-11-19 09:19:52 +08:00
wangruidong
c4b3531d72 fix: correct handling of changed field values in operate log 2025-11-18 10:32:49 +08:00
feng
8870d1ef9e perf: Translate 2025-11-17 18:25:40 +08:00
wangruidong
6c5086a083 perf: implement login asset ACL checks in Job and JobExecution viewsets 2025-11-17 10:53:22 +08:00
wrd
e9f762a982 Revert "perf: Reduce the number of pub sub processing threads (#16072)"
This reverts commit 70068c9253.
2025-11-17 10:52:16 +08:00
wangruidong
d4d4cadbcd fix: OAuth2 Only allow existing users to log in operate log error 2025-11-13 18:42:28 +08:00
fit2bot
5e56590405 perf: change base img (#16279)
* perf: change base img

* perf: add gcc

* perf: change base image

* perf: Update Dockerfile with new base image tag

---------

Co-authored-by: ibuler <ibuler@qq.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2025-11-13 17:32:51 +08:00
wangruidong
ad8c0f6664 fix: SAML2 Only allow existing users to log in operate log error 2025-11-13 16:36:58 +08:00
wangruidong
47dd6babfc perf: add id verbose_name 2025-11-13 15:17:14 +08:00
ibuler
691d1c4dba perf: remove client key 2025-11-13 14:36:40 +08:00
ibuler
ac485804d5 perf: postgresql support ssl 2025-11-13 14:36:40 +08:00
ibuler
51e5fdb301 perf: change i18n 2025-11-13 10:05:37 +08:00
feng
69c4d613f7 perf: Add client support version 2025-11-11 16:37:12 +08:00
github-actions[bot]
1ad825bf0d perf: Update Dockerfile with new base image tag 2025-11-11 15:11:51 +08:00
ibuler
a286cb9343 deps: upgrade playwright 2025-11-11 15:11:51 +08:00
ibuler
1eb489bb2d perf: upgrade pg client 2025-11-11 14:24:53 +08:00
fit2bot
4334ae9e5e perf: update apt source config (#16265)
* perf: upgrade os to trixie

* perf: update apt source config

* perf: Update Dockerfile with new base image tag

---------

Co-authored-by: ibuler <ibuler@qq.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2025-11-11 14:17:35 +08:00
fit2bot
f2e346a0c3 perf: upgrade os to trixie (#16263)
* perf: upgrade os to trixie

* perf: Update Dockerfile with new base image tag

---------

Co-authored-by: ibuler <ibuler@qq.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
2025-11-11 11:52:17 +08:00
wangruidong
dc20b06431 fix: i18n error 2025-11-10 18:14:18 +08:00
fit2bot
387a9248fc perf: Add a key to cover all protocols and ports (#16227)
Co-authored-by: wangruidong <940853815@qq.com>
2025-11-10 18:04:00 +08:00
wangruidong
705fd6385f fix: i18n error 2025-11-10 18:03:51 +08:00
fit2bot
0ccf36621f perf: Translate select files (#16212)
Co-authored-by: wangruidong <940853815@qq.com>
2025-11-06 18:26:54 +08:00
fit2bot
a9ae12fc2c perf: Implement data masking rules ACL check before job execution (#16216)
* perf: Implement data masking rules ACL check before job execution

* perf: Add login asset ACL check during job creation

* perf: Remove unused code.

---------

Co-authored-by: wangruidong <940853815@qq.com>
2025-11-06 18:25:34 +08:00
老广
7b1a25adde Add issue spam configuration file 2025-11-06 18:13:42 +08:00
feng
a1b5eb1cd8 perf: Translate 2025-11-06 15:50:15 +08:00
wangruidong
24ac642c5e fix: Escape percentage signs in gateway password for sshpass command 2025-11-06 14:10:24 +08:00
wangruidong
e4f5e21219 perf: Support batch import of leaked passwords 2025-11-06 14:09:09 +08:00
feng
a2aae9db47 perf: Translate 2025-11-05 19:07:48 +08:00
feng
206c43cf75 fix: Fixed the issue of inaccurate calculation of the number of dashboard commands. 2025-11-04 18:14:02 +08:00
feng
019a657ec3 perf: Ssotoken login create operator choose org_id 2025-11-03 17:36:04 +08:00
feng
fad60ee40f perf: Translate 2025-11-03 10:51:22 +08:00
feng
1728412793 perf: Bulk account support node 2025-10-31 17:19:48 +08:00
feng
3e93034fbc perf: Update remote_client 2025-10-30 10:12:40 +08:00
feng
f4b3a7d73a perf: Sync feishu info 2025-10-29 14:53:45 +08:00
wrd
3781c40179 Revert "perf: update fields serialization and bump django and djangorestframe…"
This reverts commit dd0cacb4bc.
2025-10-29 11:19:50 +08:00
ibuler
fab6219cea perf: branches auto cleanup 2025-10-29 10:10:21 +08:00
fit2bot
dd0cacb4bc perf: update fields serialization and bump django and djangorestframework versions (#16209)
Co-authored-by: wangruidong <940853815@qq.com>
2025-10-28 16:42:06 +08:00
ibuler
b8639601a1 perf: branches auto cleanup 2025-10-27 15:33:06 +08:00
老广
ab9882c9c1 perf: check api summary 2025-10-27 15:28:21 +08:00
ibuler
77a7b74b15 perf: print summary in the end 2025-10-27 15:26:04 +08:00
dependabot[bot]
4bc05865f1 chore(deps): bump python-ldap from 3.4.3 to 3.4.5
Bumps [python-ldap](https://github.com/python-ldap/python-ldap) from 3.4.3 to 3.4.5.
- [Release notes](https://github.com/python-ldap/python-ldap/releases)
- [Changelog](https://github.com/python-ldap/python-ldap/blob/python-ldap-3.4.5/CHANGES)
- [Commits](https://github.com/python-ldap/python-ldap/compare/python-ldap-3.4.3...python-ldap-3.4.5)

---
updated-dependencies:
- dependency-name: python-ldap
  dependency-version: 3.4.5
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-10-27 15:24:06 +08:00
fit2bot
bec9e4f3a7 perf: update deps kombu (#16133)
* perf: update deps kombu

* perf: Update Dockerfile with new base image tag

---------

Co-authored-by: Ewall555 <a03216@foxmail.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
Co-authored-by: wrd <940853815@qq.com>
2025-10-27 15:18:16 +08:00
fit2bot
359adf3dbb perf: add check api for common user 2025-10-27 14:54:02 +08:00
feng
ac54bb672c fix: Bulk account invalid secret_reset 2025-10-24 18:18:16 +08:00
ibuler
9e3ba00bc4 perf: search support keyword q=str to search 2025-10-24 10:22:49 +08:00
wangruidong
2ec9a43317 fix: Any change to the LDAP server URI should require re-authentication and explicit re-entry of
the bind password, not reuse stored credentials
2025-10-23 15:29:47 +08:00
wangruidong
06be56ef06 fix: Enhance state check to include query parameter for session validation 2025-10-23 14:41:50 +08:00
ibuler
b2a618b206 perf: user suggestion limit and serializer 2025-10-23 14:40:37 +08:00
wangruidong
1039c2e320 perf: ws/ldap perms check 2025-10-23 14:26:24 +08:00
fit2bot
8d7267400d fix: OpenID Only allow existing users to log in operate log error (#16013)
Co-authored-by: wangruidong <940853815@qq.com>
2025-10-22 14:53:12 +08:00
ibuler
d67e473884 perf: add auto cleanup branches 2025-10-22 11:46:09 +08:00
fit2bot
70068c9253 perf: Reduce the number of pub sub processing threads (#16072)
* perf: Reduce the number of pub sub processing threads

* perf: Using thread pool to process messages

---------

Co-authored-by: wangruidong <940853815@qq.com>
2025-10-21 17:41:14 +08:00
wangruidong
d68babb2e1 fix: Using winrm protocol to transfer files did not create a directory problem 2025-10-21 17:31:41 +08:00
wangruidong
afb6f466d5 perf: AppletHost translate 2025-10-21 17:31:03 +08:00
ibuler
453ad331ee perf: token retrieve 2025-10-21 10:48:08 +08:00
老广
e617245b26 merge: to master 2025-10-16 17:30:34 +08:00
feng
d309d11a8f perf: Command count 2025-10-16 17:11:42 +08:00
feng
4771693a56 fix: dashboard command count 2025-10-16 16:25:01 +08:00
Chenyang Shen
cefc820ac1 Merge pull request #16163 from jumpserver/pr@dev@asset_acl_filter
perf: Asset acl filter action
2025-10-16 15:25:38 +08:00
feng
d007afdb43 perf: Asset acl filter action 2025-10-16 15:21:32 +08:00
feng
e8921a43be perf: Translate 2025-10-16 14:32:59 +08:00
wangruidong
a9b44103d4 fix: Handle email sending failure with appropriate error response 2025-10-16 11:28:41 +08:00
jiangweidong
4abf2bded6 perf: oracle cdb mode, common users need to start username with C## 2025-10-16 09:57:54 +08:00
feng
54693089a0 perf: replace command objects 2025-10-15 19:32:14 +08:00
Aaron3S
0b859dd502 feat: update i18n 2025-10-15 19:17:44 +08:00
feng
3fb27f969a perf: datamaskingrule perm 2025-10-15 17:33:27 +08:00
Aaron3S
45627a1d92 feat: update data masking rule filter 2025-10-15 16:51:58 +08:00
feng
245e2dab66 perf: Filter effective 2025-10-15 16:51:32 +08:00
Aaron3S
8f0a41b1a8 fix: fix data masking org problem 2025-10-15 15:51:14 +08:00
feng
1a9e56c520 perf: Translate 2025-10-15 15:24:19 +08:00
feng
67c2f471b4 perf: oracle sqlserver db2 dameng clickhouse redis db_name allow_blank 2025-10-15 11:30:00 +08:00
github-actions[bot]
b04f96f5f2 perf: Update Dockerfile with new base image tag 2025-10-14 18:09:25 +08:00
Eric
30f03b7d89 perf: change python base
perf: update deps
2025-10-14 18:09:25 +08:00
wangruidong
28a97d0b5a fix: Incorrect language display in some email content 2025-10-14 18:08:21 +08:00
Eric
3410686690 perf: fix python base ci 2025-10-14 17:47:31 +08:00
Eric
6860e2327f perf: add python base ci build 2025-10-14 17:41:05 +08:00
feng
20253e760c perf: translate 2025-10-14 17:13:42 +08:00
Aaron3S
a63cfde8d2 feat: add translate 2025-10-14 16:03:38 +08:00
feng
92e250e03b perf: user_can_authenticate add logger 2025-10-14 15:48:47 +08:00
wangruidong
098f0950cb fix: Incorrect language display in email content 2025-10-14 15:33:04 +08:00
feng
39b0830a6b perf: web script default [] 2025-10-14 13:59:11 +08:00
wangruidong
2e847bc2bc fix: Error in updating message subscription 500 2025-10-14 10:14:50 +08:00
wangruidong
f82f31876a fix: Command execution failed when MySQL had a gateway configured. 2025-10-14 10:14:23 +08:00
github-actions[bot]
cde182c015 perf: Update Dockerfile with new base image tag 2025-10-10 17:06:14 +08:00
Eric
b990cdf561 perf: update deps 2025-10-10 17:06:14 +08:00
feng
c9a062823d perf: Translate 2025-10-10 17:02:30 +08:00
feng
643ba4fc15 fix: Asset web script dont create 2025-10-10 11:43:11 +08:00
feng
d16a55bbe2 perf: Ticket details cannot view assets from other organizations. 2025-10-09 18:41:25 +08:00
fit2bot
ae31554729 perf: AppletHostOnly label match (#16109)
Co-authored-by: wangruidong <940853815@qq.com>
2025-10-09 18:13:37 +08:00
github-actions[bot]
53b47980a2 perf: Update Dockerfile with new base image tag 2025-10-09 16:55:50 +08:00
Eric
d31b5ee570 perf: update Dockerfile-base 2025-10-09 16:55:50 +08:00
feng
65aea1ea36 perf: Push account and change secret support gid 2025-10-09 16:39:32 +08:00
feng
5abb5c5d5a perf: Themes deep blue 2025-10-09 15:36:14 +08:00
feng
93e41a5004 perf: Luna themes default 2025-10-09 15:02:37 +08:00
feng
95f51bbe48 perf: Preference add themes 2025-10-09 14:47:11 +08:00
feng
0184d292ec perf: MFA code 2025-10-09 14:29:08 +08:00
fit2bot
23a6d320c7 feat: update i18n (#16101)
* feat: data masking

* feat: update i18n

---------

Co-authored-by: Aaron3S <chenyang@fit2cloud.com>
Co-authored-by: 老广 <ibuler@qq.com>
2025-10-09 10:03:11 +08:00
Aaron3S
b16304c48a feat: data masking 2025-10-09 09:59:23 +08:00
Gerry.tan
7cd1e4d3a0 perf: Dynamically configure the validity period of the email verification code 2025-09-28 11:26:32 +08:00
Eric
64a9987c3f perf: update rdp params 2025-09-28 11:20:52 +08:00
feng
18bfe312fa perf: open web ui 2025-09-25 15:49:10 +08:00
Bryan
9280884c1c Merge pull request #16056 from jumpserver/dev
v4.10.8-lts
2025-09-18 16:52:13 +08:00
wangruidong
c593f91d77 fix: Account backup: when sending to the mailbox failed, the task status still showed success. 2025-09-18 15:44:35 +08:00
feng
46da05652a fix: Fixed the issue where the final connection verification failed when the domain name contains a '.' 2025-09-18 14:08:00 +08:00
feng
9249aba1a9 perf: Video player version 2025-09-18 11:03:58 +08:00
fit2bot
eca637c120 perf: Translate msg template (#16050)
* fix: Correct translation for device and user limits in django.po

* perf: Translate msg template

---------

Co-authored-by: wangruidong <940853815@qq.com>
2025-09-17 19:04:06 +08:00
feng
ddacd5fce1 fix: Ticket direct approval 2025-09-17 18:58:16 +08:00
wangruidong
3ca5c04099 fix: Add ignore_https_errors option to browser context 2025-09-17 16:30:54 +08:00
wangruidong
6603a073ec fix: Case 2025-09-17 15:32:23 +08:00
wangruidong
d745f7495a fix: Conflict 2025-09-17 15:32:23 +08:00
wangruidong
76f1667c89 perf: Restore msg template default value config 2025-09-17 15:32:23 +08:00
wangruidong
1ab1954299 fix: reset password msg error 2025-09-17 15:32:23 +08:00
wangruidong
c8335999a4 perf: Translate msg template 2025-09-17 15:32:23 +08:00
feng
5b4a67362d perf: Translate 2025-09-17 15:10:54 +08:00
fit2bot
e025073da2 fix: The number of exported data is incorrect (#16043)
Co-authored-by: wangruidong <940853815@qq.com>
2025-09-16 18:52:24 +08:00
feng
2155bc6862 perf: Migrate 2025-09-16 16:46:30 +08:00
wangruidong
953b515817 perf: Add is_alive filter to TerminalFilterSet 2025-09-16 16:30:57 +08:00
ibuler
7f7a354b2d fix: get obj error on queryset limit 2025-09-16 16:28:54 +08:00
Eric
2b2f7ea3f0 perf: add rdp true color 24 bit 2025-09-16 16:28:14 +08:00
feng
529123e1b5 perf: Translate 2025-09-16 16:15:09 +08:00
ibuler
e156ab6ad8 fix: force page limit 2025-09-16 13:48:06 +08:00
wangruidong
3c1fd134ae fix: There is something wrong with the format of the site message 2025-09-16 13:33:43 +08:00
Bai
b15f663c87 fix: AK/SK remained valid after the user expired. 2025-09-16 13:32:25 +08:00
wangruidong
93906dff0a fix: Export report pdf failed 2025-09-16 11:36:42 +08:00
Bai
307befdacd fix: login acl action reject > reviewers 500 2025-09-16 11:17:42 +08:00
feng626
dbfc4d3981 Revert "perf: User acl 500"
This reverts commit 849edd33c1.
2025-09-16 11:15:51 +08:00
feng
849edd33c1 perf: User acl 500 2025-09-16 10:50:41 +08:00
feng
37cceec8fe perf: get protocols error 500 2025-09-16 10:40:42 +08:00
feng
d2494c25cc perf: Translate 2025-09-15 19:19:01 +08:00
feng
023952582e fix: Push account failed 2025-09-15 15:32:27 +08:00
halo
863fe95100 perf: client version 2025-09-12 18:53:16 +08:00
wangruidong
4b0bdb18c9 perf: Template msg example error 2025-09-12 18:47:47 +08:00
Eric
10da053a95 perf: change applet-hosts view default limit 2025-09-12 18:43:38 +08:00
mikebofs
c40bc46520 fix: asset permission exclude accounts with -action 2025-09-12 11:16:27 +08:00
feng
a732cc614e perf: Asset user login notify 2025-09-11 14:16:00 +08:00
ibuler
bb29d519c6 perf: exclude accounts date expired 2025-09-11 11:42:44 +08:00
ibuler
b56c3a76a7 fix: user option error 2025-09-11 11:21:59 +08:00
fit2bot
ab908d24a7 perf: add i18n (#16001)
* perf: change some api view default limit

* perf: add i18n

---------

Co-authored-by: mikebofs <mikebofs@gmail.com>
2025-09-10 18:18:18 +08:00
fit2bot
79cabe1b3c feat: setting email template content (#15974)
* feat: setting email template content

* perf: template list

* perf: custom template render to string

* perf: content serialize valid

* perf: Custom msg template base class

* perf: Template content reset

* perf: Update templates config

* perf: Remove useless code

---------

Co-authored-by: wangruidong <940853815@qq.com>
2025-09-10 16:49:52 +08:00
feng
231b7287c1 perf: Notify info css optimization 2025-09-10 14:04:19 +08:00
feng
be7a4c0d6e perf: Create account unique message 2025-09-09 17:39:18 +08:00
feng
009da19050 perf: Change secret windows password cannot contain > ^ 2025-09-09 16:41:45 +08:00
feng
dfda6b1e08 perf: Change secret del over report 2025-09-09 15:48:03 +08:00
fit2bot
59b40578d8 fix: adhoc SQL Server 2008 (#15984)
* fix: Resolve the issue of errors occurring during automated execution with SQL Server 2008

* fix: adhoc SQL Server 2008

* perf: add todo information

---------

Co-authored-by: halo <wuyihuangw@gmail.com>
2025-09-09 14:26:42 +08:00
Eric
e5db28c014 perf: user add has_public_keys 2025-09-09 14:23:39 +08:00
Eric
6d1f26b0f8 perf: add redis cluster mode setting 2025-09-09 13:51:53 +08:00
Ewall555
2333dbbe33 fix: avoid AttributeError when default_limit is missing 2025-09-09 13:32:52 +08:00
Bryan
f31994fdcd Merge pull request #15899 from jumpserver/dev 2025-08-21 19:03:18 +08:00
Bryan
71766418bb Merge pull request #15742 from jumpserver/dev
merge: v4.10.4-lts
2025-07-17 15:12:58 +08:00
Bryan
a9399dd709 Merge pull request #15608 from jumpserver/dev
v4.10.2
2025-06-19 20:14:21 +08:00
Bryan
d0cb9e5432 Merge pull request #15412 from jumpserver/dev
v4.10.0
2025-05-15 17:11:43 +08:00
老广
558188da90 merge: dev to master
Ready to release
2025-04-17 20:24:45 +08:00
Bryan
ad5460dab8 Merge pull request #15086 from jumpserver/dev
v4.8.0
2025-03-20 18:44:44 +08:00
Bryan
4d37dca0de Merge pull request #14901 from jumpserver/dev
v4.7.0
2025-02-20 10:21:16 +08:00
Bryan
2ca4002624 Merge pull request #14813 from jumpserver/dev
v4.6.0
2025-01-15 14:38:17 +08:00
Bryan
053d640e4c Merge pull request #14699 from jumpserver/dev
v4.5.0
2024-12-19 16:04:45 +08:00
Bryan
f3acc28ded Merge pull request #14697 from jumpserver/dev
v4.5.0
2024-12-19 15:57:11 +08:00
Bryan
25987545db Merge pull request #14511 from jumpserver/dev
v4.4.0
2024-11-21 19:00:35 +08:00
Bryan
6720ecc6e0 Merge pull request #14319 from jumpserver/dev
v4.3.0
2024-10-17 14:55:38 +08:00
老广
0b3a7bb020 Merge pull request #14203 from jumpserver/dev
merge: from dev to master
2024-09-19 19:37:19 +08:00
Bryan
56373e362b Merge pull request #13988 from jumpserver/dev
v4.1.0
2024-08-16 18:40:35 +08:00
Bryan
02fc045370 Merge pull request #13600 from jumpserver/dev
v4.0.0
2024-07-03 19:04:35 +08:00
Bryan
e4ac73896f Merge pull request #13452 from jumpserver/dev
v3.10.11-lts
2024-06-19 16:01:26 +08:00
Bryan
1518f792d6 Merge pull request #13236 from jumpserver/dev
v3.10.10-lts
2024-05-16 16:04:07 +08:00
Bai
67277dd622 fix: Fixed the issue where dashboard session ranking counts were all 1 2024-04-22 19:42:33 +08:00
Bryan
82e7f020ea Merge pull request #13094 from jumpserver/dev
v3.10.9 (dev to master)
2024-04-22 19:39:53 +08:00
Bryan
f20b9e01ab Merge pull request #13062 from jumpserver/dev
v3.10.8 dev to master
2024-04-18 18:01:20 +08:00
Bryan
8cf8a3701b Merge pull request #13059 from jumpserver/dev
v3.10.8
2024-04-18 17:16:37 +08:00
Bryan
7ba24293d1 Merge pull request #12736 from jumpserver/pr@dev@master_fix
fix: Resolve conflicts
2024-02-29 16:38:43 +08:00
Bai
f10114c9ed fix: Resolve conflicts 2024-02-29 16:37:10 +08:00
Bryan
cf31cbfb07 Merge pull request #12729 from jumpserver/dev
v3.10.4
2024-02-29 16:19:59 +08:00
wangruidong
0edad24d5d fix: Asset expiration notification failed to send 2024-02-04 11:41:48 +08:00
ibuler
1f1c1a9157 fix: Fixed the issue where the scheduled task checking whether users are active could not run 2024-01-23 09:28:38 +00:00
feng
6c9d271ae1 fix: Celery beat failed to start when the Redis password contained special characters 2024-01-22 06:18:34 +00:00
Bai
6ff852e225 perf: Fixed the issue where Count did not deduplicate 2024-01-22 06:16:25 +00:00
Bryan
baa75dc735 Merge pull request #12566 from jumpserver/master
v3.10.2
2024-01-17 07:34:28 -04:00
Bryan
8a9f0436b8 Merge pull request #12565 from jumpserver/dev
v3.10.2
2024-01-17 07:23:30 -04:00
Bryan
a9620a3cbe Merge pull request #12461 from jumpserver/master
v3.10.1
2023-12-29 11:33:05 +05:00
Bryan
769e7dc8a0 Merge pull request #12460 from jumpserver/dev
v3.10.1
2023-12-29 11:20:36 +05:00
Bryan
2a70449411 Merge pull request #12458 from jumpserver/dev
v3.10.1
2023-12-29 11:01:13 +05:00
Bryan
8df720f19e Merge pull request #12401 from jumpserver/dev
v3.10
2023-12-21 15:14:19 +05:00
老广
dabbb45f6e Merge pull request #12144 from jumpserver/dev
v3.9.0
2023-11-16 18:23:05 +08:00
Bryan
ce24c1c3fd Merge pull request #11914 from jumpserver/dev
v3.8.0
2023-10-19 03:37:39 -05:00
Bryan
3c54c82ce9 Merge pull request #11636 from jumpserver/dev
v3.7.0
2023-09-21 17:02:48 +08:00
206 changed files with 25483 additions and 6742 deletions

.github/.github/issue-spam-config.json vendored Normal file
View File

@@ -0,0 +1,26 @@
{
"dry_run": false,
"min_account_age_days": 3,
"max_urls_for_spam": 1,
"min_body_len_for_links": 40,
"spam_words": [
"call now",
"zadzwoń",
"zadzwoń teraz",
"kontakt",
"telefon",
"telefone",
"contato",
"suporte",
"infolinii",
"click here",
"buy now",
"subscribe",
"visit"
],
"bracket_max": 6,
"special_char_density_threshold": 0.12,
"phone_regex": "\\+?\\d[\\d\\-\\s\\(\\)\\.]{6,}\\d",
"labels_for_spam": ["spam"],
"labels_for_review": ["needs-triage"]
}
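
The JSON above only defines thresholds; the workflow that consumes it is not part of this compare view. Below is a minimal sketch of how such a config might be scored against an issue body. The score_issue helper, its scoring rules, and the local file path are illustrative assumptions, not the actual spam-check implementation.

import json
import re

# Hypothetical scorer for the spam config above; the real consumer of
# the JSON file is not shown in this compare view.
def score_issue(body: str, config_path: str = "issue-spam-config.json") -> str:
    with open(config_path) as f:
        cfg = json.load(f)

    text = body.lower()
    url_count = len(re.findall(r"https?://", text))
    has_phone = re.search(cfg["phone_regex"], body) is not None
    spam_hits = sum(word in text for word in cfg["spam_words"])
    brackets = sum(body.count(c) for c in "[](){}<>")
    specials = sum((not c.isalnum()) and (not c.isspace()) for c in body)
    density = specials / max(len(body), 1)

    looks_spammy = (
        url_count > cfg["max_urls_for_spam"]
        or has_phone
        or spam_hits >= 2
        or brackets > cfg["bracket_max"]
        or density > cfg["special_char_density_threshold"]
    )
    if looks_spammy:
        return cfg["labels_for_spam"][0]      # "spam"
    if url_count and len(body) < cfg["min_body_len_for_links"]:
        return cfg["labels_for_review"][0]    # "needs-triage"
    return ""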

View File

@@ -1,74 +1,72 @@
name: Build and Push Base Image
on:
pull_request:
branches:
- 'dev'
- 'v*'
paths:
- poetry.lock
- pyproject.toml
- Dockerfile-base
- package.json
- go.mod
- yarn.lock
- pom.xml
- install_deps.sh
- utils/clean_site_packages.sh
types:
- opened
- synchronize
- reopened
pull_request:
branches:
- 'dev'
- 'v*'
paths:
- poetry.lock
- pyproject.toml
- Dockerfile-base
- package.json
- go.mod
- yarn.lock
- pom.xml
- install_deps.sh
- utils/clean_site_packages.sh
types:
- opened
- synchronize
- reopened
jobs:
build-and-push:
runs-on: ubuntu-22.04
steps:
- name: Checkout repository
uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.ref }}
build-and-push:
runs-on: ubuntu-22.04
steps:
- name: Checkout repository
uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.ref }}
- name: Set up QEMU
uses: docker/setup-qemu-action@v3
with:
image: tonistiigi/binfmt:qemu-v7.0.0-28
- name: Set up QEMU
uses: docker/setup-qemu-action@v3
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Login to DockerHub
uses: docker/login-action@v2
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Login to DockerHub
uses: docker/login-action@v2
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Extract date
id: vars
run: echo "IMAGE_TAG=$(date +'%Y%m%d_%H%M%S')" >> $GITHUB_ENV
- name: Extract date
id: vars
run: echo "IMAGE_TAG=$(date +'%Y%m%d_%H%M%S')" >> $GITHUB_ENV
- name: Extract repository name
id: repo
run: echo "REPO=$(basename ${{ github.repository }})" >> $GITHUB_ENV
- name: Extract repository name
id: repo
run: echo "REPO=$(basename ${{ github.repository }})" >> $GITHUB_ENV
- name: Build and push multi-arch image
uses: docker/build-push-action@v6
with:
platforms: linux/amd64,linux/arm64
push: true
file: Dockerfile-base
tags: jumpserver/core-base:${{ env.IMAGE_TAG }}
- name: Build and push multi-arch image
uses: docker/build-push-action@v6
with:
platforms: linux/amd64,linux/arm64
push: true
file: Dockerfile-base
tags: jumpserver/core-base:${{ env.IMAGE_TAG }}
- name: Update Dockerfile
run: |
sed -i 's|-base:.* AS stage-build|-base:${{ env.IMAGE_TAG }} AS stage-build|' Dockerfile
- name: Update Dockerfile
run: |
sed -i 's|-base:.* AS stage-build|-base:${{ env.IMAGE_TAG }} AS stage-build|' Dockerfile
- name: Commit changes
run: |
git config --global user.name 'github-actions[bot]'
git config --global user.email 'github-actions[bot]@users.noreply.github.com'
git add Dockerfile
git commit -m "perf: Update Dockerfile with new base image tag"
git push origin ${{ github.event.pull_request.head.ref }}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- name: Commit changes
run: |
git config --global user.name 'github-actions[bot]'
git config --global user.email 'github-actions[bot]@users.noreply.github.com'
git add Dockerfile
git commit -m "perf: Update Dockerfile with new base image tag"
git push origin ${{ github.event.pull_request.head.ref }}
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

View File

@@ -0,0 +1,46 @@
name: Build and Push Python Base Image
on:
workflow_dispatch:
inputs:
tag:
description: 'Tag to build'
required: true
default: '3.11-slim-bullseye-v1'
type: string
jobs:
build-and-push:
runs-on: ubuntu-22.04
steps:
- name: Checkout repository
uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.ref }}
- name: Set up QEMU
uses: docker/setup-qemu-action@v3
with:
image: tonistiigi/binfmt:qemu-v7.0.0-28
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Login to DockerHub
uses: docker/login-action@v2
with:
username: ${{ secrets.DOCKERHUB_USERNAME }}
password: ${{ secrets.DOCKERHUB_TOKEN }}
- name: Extract repository name
id: repo
run: echo "REPO=$(basename ${{ github.repository }})" >> $GITHUB_ENV
- name: Build and push multi-arch image
uses: docker/build-push-action@v6
with:
platforms: linux/amd64,linux/arm64
push: true
file: Dockerfile-python
tags: jumpserver/core-base:python-${{ inputs.tag }}

.github/workflows/cleanup-branches.yml vendored Normal file
View File

@@ -0,0 +1,123 @@
name: Cleanup PR Branches
on:
schedule:
# Run daily at 2:00 AM
- cron: '0 2 * * *'
workflow_dispatch:
# Allow manual triggering
inputs:
dry_run:
description: 'Dry run mode (default: true)'
required: false
default: 'true'
type: boolean
jobs:
cleanup-branches:
runs-on: ubuntu-latest
steps:
- name: Checkout repository
uses: actions/checkout@v4
with:
fetch-depth: 0 # fetch all branches and commit history
- name: Setup Git
run: |
git config --global user.name "GitHub Actions"
git config --global user.email "actions@github.com"
- name: Get dry run setting
id: dry-run
run: |
if [ "${{ github.event_name }}" = "workflow_dispatch" ]; then
echo "dry_run=${{ github.event.inputs.dry_run }}" >> $GITHUB_OUTPUT
else
echo "dry_run=false" >> $GITHUB_OUTPUT
fi
- name: Cleanup branches
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
DRY_RUN: ${{ steps.dry-run.outputs.dry_run }}
run: |
echo "Starting branch cleanup..."
echo "Dry run mode: $DRY_RUN"
# Get all local branches
git fetch --all --prune
# Get branches starting with pr or repr
branches=$(git branch -r | grep -E 'origin/(pr|repr)' | sed 's/origin\///' | grep -v 'HEAD')
echo "Found branches matching pattern:"
echo "$branches"
deleted_count=0
skipped_count=0
for branch in $branches; do
echo ""
echo "Processing branch: $branch"
# Check whether the branch has open (unmerged) PRs
pr_info=$(gh pr list --head "$branch" --state open --json number,title,state 2>/dev/null)
if [ $? -eq 0 ] && [ "$pr_info" != "[]" ]; then
echo " ⚠️ Branch has open PR(s), skipping deletion"
echo " PR info: $pr_info"
skipped_count=$((skipped_count + 1))
continue
fi
# Check whether the branch has merged PRs (optional: branches with merged PRs can also be deleted)
merged_pr_info=$(gh pr list --head "$branch" --state merged --json number,title,state 2>/dev/null)
if [ $? -eq 0 ] && [ "$merged_pr_info" != "[]" ]; then
echo " ✅ Branch has merged PR(s), safe to delete"
echo " Merged PR info: $merged_pr_info"
else
echo " No PRs found for this branch"
fi
# Perform the deletion
if [ "$DRY_RUN" = "true" ]; then
echo " 🔍 [DRY RUN] Would delete branch: $branch"
deleted_count=$((deleted_count + 1))
else
echo " 🗑️ Deleting branch: $branch"
# Delete the remote branch
if git push origin --delete "$branch" 2>/dev/null; then
echo " ✅ Successfully deleted remote branch: $branch"
deleted_count=$((deleted_count + 1))
else
echo " ❌ Failed to delete remote branch: $branch"
fi
fi
done
echo ""
echo "=== Cleanup Summary ==="
echo "Branches processed: $(echo "$branches" | wc -l)"
echo "Branches deleted: $deleted_count"
echo "Branches skipped: $skipped_count"
if [ "$DRY_RUN" = "true" ]; then
echo ""
echo "🔍 This was a DRY RUN - no branches were actually deleted"
echo "To perform actual deletion, run this workflow manually with dry_run=false"
fi
- name: Create summary
if: always()
run: |
echo "## Branch Cleanup Summary" >> $GITHUB_STEP_SUMMARY
echo "" >> $GITHUB_STEP_SUMMARY
echo "**Workflow:** ${{ github.workflow }}" >> $GITHUB_STEP_SUMMARY
echo "**Run ID:** ${{ github.run_id }}" >> $GITHUB_STEP_SUMMARY
echo "**Dry Run:** ${{ steps.dry-run.outputs.dry_run }}" >> $GITHUB_STEP_SUMMARY
echo "**Triggered by:** ${{ github.event_name }}" >> $GITHUB_STEP_SUMMARY
echo "" >> $GITHUB_STEP_SUMMARY
echo "Check the logs above for detailed information about processed branches." >> $GITHUB_STEP_SUMMARY

View File

@@ -1,11 +1,9 @@
name: 🔀 Sync mirror to Gitee
on:
push:
branches:
- master
- dev
create:
schedule:
# Run daily at 3:00 AM
- cron: '0 3 * * *'
jobs:
mirror:
@@ -14,7 +12,6 @@ jobs:
steps:
- name: mirror
continue-on-error: true
if: github.event_name == 'push' || (github.event_name == 'create' && github.event.ref_type == 'tag')
uses: wearerequired/git-mirror-action@v1
env:
SSH_PRIVATE_KEY: ${{ secrets.GITEE_SSH_PRIVATE_KEY }}

View File

@@ -1,4 +1,4 @@
FROM jumpserver/core-base:20250827_025554 AS stage-build
FROM jumpserver/core-base:20251113_092612 AS stage-build
ARG VERSION
@@ -19,7 +19,7 @@ RUN set -ex \
&& python manage.py compilemessages
FROM python:3.11-slim-bullseye
FROM python:3.11-slim-trixie
ENV LANG=en_US.UTF-8 \
PATH=/opt/py3/bin:$PATH
@@ -39,7 +39,7 @@ ARG TOOLS=" \
ARG APT_MIRROR=http://deb.debian.org
RUN set -ex \
&& sed -i "s@http://.*.debian.org@${APT_MIRROR}@g" /etc/apt/sources.list \
&& sed -i "s@http://.*.debian.org@${APT_MIRROR}@g" /etc/apt/sources.list.d/debian.sources \
&& ln -sf /usr/share/zoneinfo/Asia/Shanghai /etc/localtime \
&& apt-get update > /dev/null \
&& apt-get -y install --no-install-recommends ${DEPENDENCIES} \

View File

@@ -1,6 +1,5 @@
FROM python:3.11-slim-bullseye
FROM python:3.11.14-slim-trixie
ARG TARGETARCH
COPY --from=ghcr.io/astral-sh/uv:0.6.14 /uv /uvx /usr/local/bin/
# Install APT dependencies
ARG DEPENDENCIES=" \
ca-certificates \
@@ -22,13 +21,13 @@ RUN --mount=type=cache,target=/var/cache/apt,sharing=locked,id=core \
set -ex \
&& rm -f /etc/apt/apt.conf.d/docker-clean \
&& echo 'Binary::apt::APT::Keep-Downloaded-Packages "true";' > /etc/apt/apt.conf.d/keep-cache \
&& sed -i "s@http://.*.debian.org@${APT_MIRROR}@g" /etc/apt/sources.list \
&& sed -i "s@http://.*.debian.org@${APT_MIRROR}@g" /etc/apt/sources.list.d/debian.sources \
&& apt-get update > /dev/null \
&& apt-get -y install --no-install-recommends ${DEPENDENCIES} \
&& echo "no" | dpkg-reconfigure dash
# Install bin tools
ARG CHECK_VERSION=v1.0.4
ARG CHECK_VERSION=v1.0.5
RUN set -ex \
&& wget https://github.com/jumpserver-dev/healthcheck/releases/download/${CHECK_VERSION}/check-${CHECK_VERSION}-linux-${TARGETARCH}.tar.gz \
&& tar -xf check-${CHECK_VERSION}-linux-${TARGETARCH}.tar.gz \
@@ -41,12 +40,10 @@ RUN set -ex \
WORKDIR /opt/jumpserver
ARG PIP_MIRROR=https://pypi.org/simple
ENV POETRY_PYPI_MIRROR_URL=${PIP_MIRROR}
ENV ANSIBLE_COLLECTIONS_PATHS=/opt/py3/lib/python3.11/site-packages/ansible_collections
ENV LANG=en_US.UTF-8 \
PATH=/opt/py3/bin:$PATH
ENV UV_LINK_MODE=copy
ENV SETUPTOOLS_SCM_PRETEND_VERSION=3.4.5
RUN --mount=type=cache,target=/root/.cache \
--mount=type=bind,source=pyproject.toml,target=pyproject.toml \
@@ -54,6 +51,7 @@ RUN --mount=type=cache,target=/root/.cache \
--mount=type=bind,source=requirements/collections.yml,target=collections.yml \
--mount=type=bind,source=requirements/static_files.sh,target=utils/static_files.sh \
set -ex \
&& pip install uv -i${PIP_MIRROR} \
&& uv venv \
&& uv pip install -i${PIP_MIRROR} -r pyproject.toml \
&& ln -sf $(pwd)/.venv /opt/py3 \

View File

@@ -13,7 +13,7 @@ ARG TOOLS=" \
nmap \
telnet \
vim \
postgresql-client-13 \
postgresql-client \
wget \
poppler-utils"

View File

@@ -77,7 +77,8 @@ JumpServer consists of multiple key components, which collectively form the func
| [Luna](https://github.com/jumpserver/luna) | <a href="https://github.com/jumpserver/luna/releases"><img alt="Luna release" src="https://img.shields.io/github/release/jumpserver/luna.svg" /></a> | JumpServer Web Terminal |
| [KoKo](https://github.com/jumpserver/koko) | <a href="https://github.com/jumpserver/koko/releases"><img alt="Koko release" src="https://img.shields.io/github/release/jumpserver/koko.svg" /></a> | JumpServer Character Protocol Connector |
| [Lion](https://github.com/jumpserver/lion) | <a href="https://github.com/jumpserver/lion/releases"><img alt="Lion release" src="https://img.shields.io/github/release/jumpserver/lion.svg" /></a> | JumpServer Graphical Protocol Connector |
| [Chen](https://github.com/jumpserver/chen) | <a href="https://github.com/jumpserver/chen/releases"><img alt="Chen release" src="https://img.shields.io/github/release/jumpserver/chen.svg" /> | JumpServer Web DB |
| [Chen](https://github.com/jumpserver/chen) | <a href="https://github.com/jumpserver/chen/releases"><img alt="Chen release" src="https://img.shields.io/github/release/jumpserver/chen.svg" /> | JumpServer Web DB
| [Client](https://github.com/jumpserver/clients) | <a href="https://github.com/jumpserver/clients/releases"><img alt="Clients release" src="https://img.shields.io/github/release/jumpserver/clients.svg" /> | JumpServer Client |
| [Tinker](https://github.com/jumpserver/tinker) | <img alt="Tinker" src="https://img.shields.io/badge/release-private-red" /> | JumpServer Remote Application Connector (Windows) |
| [Panda](https://github.com/jumpserver/Panda) | <img alt="Panda" src="https://img.shields.io/badge/release-private-red" /> | JumpServer EE Remote Application Connector (Linux) |
| [Razor](https://github.com/jumpserver/razor) | <img alt="Chen" src="https://img.shields.io/badge/release-private-red" /> | JumpServer EE RDP Proxy Connector |

View File

@@ -1,16 +1,18 @@
from django.db import transaction
from django.shortcuts import get_object_or_404
from django.utils.translation import gettext_lazy as _
from rest_framework import serializers as drf_serializers
from rest_framework.decorators import action
from rest_framework.generics import ListAPIView, CreateAPIView
from rest_framework.response import Response
from rest_framework.status import HTTP_200_OK
from rest_framework.status import HTTP_200_OK, HTTP_400_BAD_REQUEST
from accounts import serializers
from accounts.const import ChangeSecretRecordStatusChoice
from accounts.filters import AccountFilterSet, NodeFilterBackend
from accounts.mixins import AccountRecordViewLogMixin
from accounts.models import Account, ChangeSecretRecord
from assets.const.gpt import create_or_update_chatx_resources
from assets.models import Asset, Node
from authentication.permissions import UserConfirmation, ConfirmType
from common.api.mixin import ExtraFilterFieldsMixin
@@ -18,6 +20,7 @@ from common.drf.filters import AttrRulesFilterBackend
from common.permissions import IsValidUser
from common.utils import lazyproperty, get_logger
from orgs.mixins.api import OrgBulkModelViewSet
from orgs.utils import tmp_to_root_org
from rbac.permissions import RBACPermission
logger = get_logger(__file__)
@@ -43,6 +46,7 @@ class AccountViewSet(OrgBulkModelViewSet):
'clear_secret': 'accounts.change_account',
'move_to_assets': 'accounts.delete_account',
'copy_to_assets': 'accounts.add_account',
'chat': 'accounts.view_account',
}
export_as_zip = True
@@ -152,6 +156,13 @@ class AccountViewSet(OrgBulkModelViewSet):
def copy_to_assets(self, request, *args, **kwargs):
return self._copy_or_move_to_assets(request, move=False)
@action(methods=['get'], detail=False, url_path='chat')
def chat(self, request, *args, **kwargs):
with tmp_to_root_org():
__, account = create_or_update_chatx_resources()
serializer = self.get_serializer(account)
return Response(serializer.data)
class AccountSecretsViewSet(AccountRecordViewLogMixin, AccountViewSet):
"""
@@ -174,12 +185,66 @@ class AssetAccountBulkCreateApi(CreateAPIView):
'POST': 'accounts.add_account',
}
@staticmethod
def get_all_assets(base_payload: dict):
nodes = base_payload.pop('nodes', [])
asset_ids = base_payload.pop('assets', [])
nodes = Node.objects.filter(id__in=nodes).only('id', 'key')
node_asset_ids = Node.get_nodes_all_assets(*nodes).values_list('id', flat=True)
asset_ids = set(asset_ids + list(node_asset_ids))
return Asset.objects.filter(id__in=asset_ids)
def create(self, request, *args, **kwargs):
serializer = self.get_serializer(data=request.data)
serializer.is_valid(raise_exception=True)
data = serializer.create(serializer.validated_data)
serializer = serializers.AssetAccountBulkSerializerResultSerializer(data, many=True)
return Response(data=serializer.data, status=HTTP_200_OK)
if hasattr(request.data, "copy"):
base_payload = request.data.copy()
else:
base_payload = dict(request.data)
templates = base_payload.pop("template", None)
assets = self.get_all_assets(base_payload)
if not assets.exists():
error = _("No valid assets found for account creation.")
return Response(
data={
"detail": error,
"code": "no_valid_assets"
},
status=HTTP_400_BAD_REQUEST
)
result = []
errors = []
def handle_one(_payload):
try:
ser = self.get_serializer(data=_payload)
ser.is_valid(raise_exception=True)
data = ser.bulk_create(ser.validated_data, assets)
if isinstance(data, (list, tuple)):
result.extend(data)
else:
result.append(data)
except drf_serializers.ValidationError as e:
errors.extend(list(e.detail))
except Exception as e:
errors.extend([str(e)])
if not templates:
handle_one(base_payload)
else:
if not isinstance(templates, (list, tuple)):
templates = [templates]
for tpl in templates:
payload = dict(base_payload)
payload["template"] = tpl
handle_one(payload)
if errors:
raise drf_serializers.ValidationError(errors)
out_ser = serializers.AssetAccountBulkSerializerResultSerializer(result, many=True)
return Response(data=out_ser.data, status=HTTP_200_OK)
class AccountHistoriesSecretAPI(ExtraFilterFieldsMixin, AccountRecordViewLogMixin, ListAPIView):
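
For reference, the bulk-create flow above now expands nodes into assets via get_all_assets and runs one serializer pass per entry in template, collecting results and errors before responding. The call below is a hypothetical client request: the URL, token, and IDs are placeholders; only the field names assets, nodes, and template come from this diff (secret_reset appears in a commit message above).

import requests

# Hypothetical request against the bulk account-create view shown above.
# URL, token and IDs are placeholders, not real endpoints or identifiers.
payload = {
    "assets": ["<asset-id-1>"],
    "nodes": ["<node-id-1>"],              # expanded via Node.get_nodes_all_assets
    "template": ["<template-id-a>", "<template-id-b>"],  # one serializer pass each
    "secret_reset": True,
}
resp = requests.post(
    "https://<jumpserver-host>/<bulk-accounts-endpoint>/",  # placeholder path
    json=payload,
    headers={"Authorization": "Token <api-token>"},
    timeout=30,
)
print(resp.status_code, resp.json())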

View File

@@ -25,7 +25,8 @@ class IntegrationApplicationViewSet(OrgBulkModelViewSet):
}
rbac_perms = {
'get_once_secret': 'accounts.change_integrationapplication',
'get_account_secret': 'accounts.view_integrationapplication'
'get_account_secret': 'accounts.view_integrationapplication',
'get_sdks_info': 'accounts.view_integrationapplication'
}
def read_file(self, path):
@@ -36,7 +37,6 @@ class IntegrationApplicationViewSet(OrgBulkModelViewSet):
@action(
['GET'], detail=False, url_path='sdks',
permission_classes=[IsValidUser]
)
def get_sdks_info(self, request, *args, **kwargs):
code_suffix_mapper = {

View File

@@ -235,8 +235,8 @@ class AccountBackupHandler:
except Exception as e:
error = str(e)
print(f'\033[31m>>> {error}\033[0m')
self.execution.status = Status.error
self.execution.summary['error'] = error
self.manager.status = Status.error
self.manager.summary['error'] = error
def backup_by_obj_storage(self):
object_id = self.execution.snapshot.get('id')

View File

@@ -105,10 +105,6 @@ class BaseChangeSecretPushManager(AccountBasePlaybookManager):
h['account']['mode'] = 'sysdba' if account.privileged else None
return h
def add_extra_params(self, host, **kwargs):
host['ssh_params'] = {}
return host
def host_callback(self, host, asset=None, account=None, automation=None, path_dir=None, **kwargs):
host = super().host_callback(
host, asset=asset, account=account, automation=automation,
@@ -117,7 +113,18 @@ class BaseChangeSecretPushManager(AccountBasePlaybookManager):
if host.get('error'):
return host
host = self.add_extra_params(host, automation=automation)
inventory_hosts = []
if asset.type == HostTypes.WINDOWS:
if self.secret_type == SecretType.SSH_KEY:
host['error'] = _("Windows does not support SSH key authentication")
return host
new_secret = self.get_secret(account)
if '>' in new_secret or '^' in new_secret:
host['error'] = _("Windows password cannot contain special characters like > ^")
return host
host['ssh_params'] = {}
accounts = self.get_accounts(account)
existing_ids = set(map(str, accounts.values_list('id', flat=True)))
missing_ids = set(map(str, self.account_ids)) - existing_ids
@@ -133,11 +140,6 @@ class BaseChangeSecretPushManager(AccountBasePlaybookManager):
if asset.type == HostTypes.WINDOWS:
accounts = accounts.filter(secret_type=SecretType.PASSWORD)
inventory_hosts = []
if asset.type == HostTypes.WINDOWS and self.secret_type == SecretType.SSH_KEY:
print(f'Windows {asset} does not support ssh key push')
return inventory_hosts
for account in accounts:
h = deepcopy(host)
h['name'] += '(' + account.username + ')' # To distinguish different accounts

View File

@@ -5,12 +5,14 @@
tasks:
- name: Test SQLServer connection
community.general.mssql_script:
mssql_script:
login_user: "{{ jms_account.username }}"
login_password: "{{ jms_account.secret }}"
login_host: "{{ jms_asset.address }}"
login_port: "{{ jms_asset.port }}"
name: '{{ jms_asset.spec_info.db_name }}'
encryption: "{{ jms_asset.encryption | default(None) }}"
tds_version: "{{ jms_asset.tds_version | default(None) }}"
script: |
SELECT @@version
register: db_info
@@ -23,45 +25,53 @@
var: info
- name: Check whether SQLServer User exist
community.general.mssql_script:
mssql_script:
login_user: "{{ jms_account.username }}"
login_password: "{{ jms_account.secret }}"
login_host: "{{ jms_asset.address }}"
login_port: "{{ jms_asset.port }}"
name: '{{ jms_asset.spec_info.db_name }}'
encryption: "{{ jms_asset.encryption | default(None) }}"
tds_version: "{{ jms_asset.tds_version | default(None) }}"
script: "SELECT 1 from sys.sql_logins WHERE name='{{ account.username }}';"
when: db_info is succeeded
register: user_exist
- name: Change SQLServer password
community.general.mssql_script:
mssql_script:
login_user: "{{ jms_account.username }}"
login_password: "{{ jms_account.secret }}"
login_host: "{{ jms_asset.address }}"
login_port: "{{ jms_asset.port }}"
name: '{{ jms_asset.spec_info.db_name }}'
encryption: "{{ jms_asset.encryption | default(None) }}"
tds_version: "{{ jms_asset.tds_version | default(None) }}"
script: "ALTER LOGIN {{ account.username }} WITH PASSWORD = '{{ account.secret }}', DEFAULT_DATABASE = {{ jms_asset.spec_info.db_name }}; select @@version"
ignore_errors: true
when: user_exist.query_results[0] | length != 0
- name: Add SQLServer user
community.general.mssql_script:
mssql_script:
login_user: "{{ jms_account.username }}"
login_password: "{{ jms_account.secret }}"
login_host: "{{ jms_asset.address }}"
login_port: "{{ jms_asset.port }}"
name: '{{ jms_asset.spec_info.db_name }}'
encryption: "{{ jms_asset.encryption | default(None) }}"
tds_version: "{{ jms_asset.tds_version | default(None) }}"
script: "CREATE LOGIN {{ account.username }} WITH PASSWORD = '{{ account.secret }}', DEFAULT_DATABASE = {{ jms_asset.spec_info.db_name }}; CREATE USER {{ account.username }} FOR LOGIN {{ account.username }}; select @@version"
ignore_errors: true
when: user_exist.query_results[0] | length == 0
- name: Verify password
community.general.mssql_script:
mssql_script:
login_user: "{{ account.username }}"
login_password: "{{ account.secret }}"
login_host: "{{ jms_asset.address }}"
login_port: "{{ jms_asset.port }}"
name: '{{ jms_asset.spec_info.db_name }}'
encryption: "{{ jms_asset.encryption | default(None) }}"
tds_version: "{{ jms_asset.tds_version | default(None) }}"
script: |
SELECT @@version
when: check_conn_after_change

View File

@@ -18,6 +18,7 @@
uid: "{{ params.uid | int if params.uid | length > 0 else omit }}"
shell: "{{ params.shell if params.shell | length > 0 else omit }}"
home: "{{ params.home if params.home | length > 0 else '/home/' + account.username }}"
group: "{{ params.group if params.group | length > 0 else omit }}"
groups: "{{ params.groups if params.groups | length > 0 else omit }}"
append: "{{ true if params.groups | length > 0 else false }}"
expires: -1

View File

@@ -28,6 +28,12 @@ params:
default: ''
help_text: "{{ 'Params home help text' | trans }}"
- name: group
type: str
label: "{{ 'Params group label' | trans }}"
default: ''
help_text: "{{ 'Params group help text' | trans }}"
- name: groups
type: str
label: "{{ 'Params groups label' | trans }}"
@@ -61,6 +67,11 @@ i18n:
ja: 'デフォルトのホームディレクトリ /home/{アカウントユーザ名}'
en: 'Default home directory /home/{account username}'
Params group help text:
zh: '请输入用户组(名字或数字),只能输入一个(需填写已存在的用户组)'
ja: 'ユーザー グループ (名前または番号) を入力してください。入力できるのは 1 つだけです (既存のユーザー グループを入力する必要があります)'
en: 'Please enter a user group (name or number), only one can be entered (must fill in an existing user group)'
Params groups help text:
zh: '请输入用户组,多个用户组使用逗号分隔(需填写已存在的用户组)'
ja: 'グループを入力してください。複数のグループはコンマで区切ってください(既存のグループを入力してください)'
@@ -86,6 +97,11 @@ i18n:
ja: 'グループ'
en: 'Groups'
Params group label:
zh: '主组'
ja: '主组'
en: 'Main group'
Params uid label:
zh: '用户ID'
ja: 'ユーザーID'

View File

@@ -18,6 +18,7 @@
uid: "{{ params.uid | int if params.uid | length > 0 else omit }}"
shell: "{{ params.shell if params.shell | length > 0 else omit }}"
home: "{{ params.home if params.home | length > 0 else '/home/' + account.username }}"
group: "{{ params.group if params.group | length > 0 else omit }}"
groups: "{{ params.groups if params.groups | length > 0 else omit }}"
append: "{{ true if params.groups | length > 0 else false }}"
expires: -1

View File

@@ -30,6 +30,12 @@ params:
default: ''
help_text: "{{ 'Params home help text' | trans }}"
- name: group
type: str
label: "{{ 'Params group label' | trans }}"
default: ''
help_text: "{{ 'Params group help text' | trans }}"
- name: groups
type: str
label: "{{ 'Params groups label' | trans }}"
@@ -63,6 +69,11 @@ i18n:
ja: 'デフォルトのホームディレクトリ /home/{アカウントユーザ名}'
en: 'Default home directory /home/{account username}'
Params group help text:
zh: '请输入用户组(名字或数字),只能输入一个(需填写已存在的用户组)'
ja: 'ユーザー グループ (名前または番号) を入力してください。入力できるのは 1 つだけです (既存のユーザー グループを入力する必要があります)'
en: 'Please enter a user group (name or number), only one can be entered (must fill in an existing user group)'
Params groups help text:
zh: '请输入用户组,多个用户组使用逗号分隔(需填写已存在的用户组)'
ja: 'グループを入力してください。複数のグループはコンマで区切ってください(既存のグループを入力してください)'
@@ -88,6 +99,11 @@ i18n:
ja: 'グループ'
en: 'Groups'
Params group label:
zh: '主组'
ja: '主组'
en: 'Main group'
Params uid label:
zh: '用户ID'
ja: 'ユーザーID'

View File

@@ -5,14 +5,11 @@ from django.conf import settings
from django.utils.translation import gettext_lazy as _
from xlsxwriter import Workbook
from assets.automations.methods import platform_automation_methods as asset_methods
from assets.const import AutomationTypes as AssetAutomationTypes
from accounts.automations.methods import platform_automation_methods as account_methods
from accounts.const import (
AutomationTypes, SecretStrategy, ChangeSecretRecordStatusChoice
)
from accounts.models import ChangeSecretRecord
from accounts.notifications import ChangeSecretExecutionTaskMsg, ChangeSecretReportMsg
from accounts.notifications import ChangeSecretExecutionTaskMsg
from accounts.serializers import ChangeSecretRecordBackUpSerializer
from common.utils import get_logger
from common.utils.file import encrypt_and_compress_zip_file
@@ -25,22 +22,6 @@ logger = get_logger(__name__)
class ChangeSecretManager(BaseChangeSecretPushManager):
ansible_account_prefer = ''
def get_method_id_meta_mapper(self):
return {
method["id"]: method for method in self.platform_automation_methods
}
@property
def platform_automation_methods(self):
return asset_methods + account_methods
def add_extra_params(self, host, **kwargs):
host = super().add_extra_params(host, **kwargs)
automation = kwargs.get('automation')
for extra_type in [AssetAutomationTypes.ping, AutomationTypes.verify_account]:
host[f"{extra_type}_params"] = self.get_params(automation, extra_type)
return host
@classmethod
def method_type(cls):
return AutomationTypes.change_secret
@@ -113,10 +94,6 @@ class ChangeSecretManager(BaseChangeSecretPushManager):
if not recipients:
return
context = self.get_report_context()
for user in recipients:
ChangeSecretReportMsg(user, context).publish()
if not records:
return

View File

@@ -1,36 +0,0 @@
- hosts: website
gather_facts: no
vars:
ansible_python_interpreter: "{{ local_python_interpreter }}"
tasks:
- name: Test privileged account
website_ping:
login_host: "{{ jms_asset.address }}"
login_user: "{{ jms_account.username }}"
login_password: "{{ jms_account.secret }}"
steps: "{{ ping_params.steps }}"
load_state: "{{ ping_params.load_state }}"
- name: "Change {{ account.username }} password"
website_user:
login_host: "{{ jms_asset.address }}"
login_user: "{{ jms_account.username }}"
login_password: "{{ jms_account.secret }}"
steps: "{{ params.steps }}"
load_state: "{{ params.load_state }}"
name: "{{ account.username }}"
password: "{{ account.secret }}"
ignore_errors: true
register: change_secret_result
- name: "Verify {{ account.username }} password"
website_ping:
login_host: "{{ jms_asset.address }}"
login_user: "{{ account.username }}"
login_password: "{{ account.secret }}"
steps: "{{ verify_account_params.steps }}"
load_state: "{{ verify_account_params.load_state }}"
when:
- check_conn_after_change or change_secret_result.failed | default(false)
delegate_to: localhost

View File

@@ -1,51 +0,0 @@
id: change_account_website
name: "{{ 'Website account change secret' | trans }}"
category: web
type:
- website
method: change_secret
priority: 50
params:
- name: load_state
type: choice
label: "{{ 'Load state' | trans }}"
choices:
- [ networkidle, "{{ 'Network idle' | trans }}" ]
- [ domcontentloaded, "{{ 'Dom content loaded' | trans }}" ]
- [ load, "{{ 'Load completed' | trans }}" ]
default: 'load'
- name: steps
type: list
default: [ ]
label: "{{ 'Steps' | trans }}"
help_text: "{{ 'Params step help text' | trans }}"
i18n:
Website account change secret:
zh: 使用 Playwright 模拟浏览器变更账号密码
ja: Playwright を使用してブラウザをシミュレートし、アカウントのパスワードを変更します
en: Use Playwright to simulate a browser for account password change.
Load state:
zh: 加载状态检测
en: Load state detection
ja: ロード状態の検出
Steps:
zh: 步骤
en: Steps
ja: 手順
Network idle:
zh: 网络空闲
en: Network idle
ja: ネットワークが空いた状態
Dom content loaded:
zh: 文档内容加载完成
en: Dom content loaded
ja: ドキュメントの内容がロードされた状態
Load completed:
zh: 全部加载完成
en: All load completed
ja: すべてのロードが完了した状態
Params step help text:
zh: 根据配置决定任务执行步骤
ja: 設定に基づいてタスクの実行ステップを決定する
en: Determine task execution steps based on configuration

View File

@@ -5,12 +5,14 @@
tasks:
- name: Test SQLServer connection
community.general.mssql_script:
mssql_script:
login_user: "{{ jms_account.username }}"
login_password: "{{ jms_account.secret }}"
login_host: "{{ jms_asset.address }}"
login_port: "{{ jms_asset.port }}"
name: '{{ jms_asset.spec_info.db_name }}'
encryption: "{{ jms_asset.encryption | default(None) }}"
tds_version: "{{ jms_asset.tds_version | default(None) }}"
script: |
SELECT
l.name,

View File

@@ -5,12 +5,14 @@
tasks:
- name: Test SQLServer connection
community.general.mssql_script:
mssql_script:
login_user: "{{ jms_account.username }}"
login_password: "{{ jms_account.secret }}"
login_host: "{{ jms_asset.address }}"
login_port: "{{ jms_asset.port }}"
name: '{{ jms_asset.spec_info.db_name }}'
encryption: "{{ jms_asset.encryption | default(None) }}"
tds_version: "{{ jms_asset.tds_version | default(None) }}"
script: |
SELECT @@version
register: db_info
@@ -23,47 +25,55 @@
var: info
- name: Check whether SQLServer User exist
community.general.mssql_script:
mssql_script:
login_user: "{{ jms_account.username }}"
login_password: "{{ jms_account.secret }}"
login_host: "{{ jms_asset.address }}"
login_port: "{{ jms_asset.port }}"
name: '{{ jms_asset.spec_info.db_name }}'
encryption: "{{ jms_asset.encryption | default(None) }}"
tds_version: "{{ jms_asset.tds_version | default(None) }}"
script: "SELECT 1 from sys.sql_logins WHERE name='{{ account.username }}';"
when: db_info is succeeded
register: user_exist
- name: Change SQLServer password
community.general.mssql_script:
mssql_script:
login_user: "{{ jms_account.username }}"
login_password: "{{ jms_account.secret }}"
login_host: "{{ jms_asset.address }}"
login_port: "{{ jms_asset.port }}"
name: '{{ jms_asset.spec_info.db_name }}'
encryption: "{{ jms_asset.encryption | default(None) }}"
tds_version: "{{ jms_asset.tds_version | default(None) }}"
script: "ALTER LOGIN {{ account.username }} WITH PASSWORD = '{{ account.secret }}', DEFAULT_DATABASE = {{ jms_asset.spec_info.db_name }}; select @@version"
ignore_errors: true
when: user_exist.query_results[0] | length != 0
register: change_info
- name: Add SQLServer user
community.general.mssql_script:
mssql_script:
login_user: "{{ jms_account.username }}"
login_password: "{{ jms_account.secret }}"
login_host: "{{ jms_asset.address }}"
login_port: "{{ jms_asset.port }}"
name: '{{ jms_asset.spec_info.db_name }}'
encryption: "{{ jms_asset.encryption | default(None) }}"
tds_version: "{{ jms_asset.tds_version | default(None) }}"
script: "CREATE LOGIN [{{ account.username }}] WITH PASSWORD = '{{ account.secret }}'; CREATE USER [{{ account.username }}] FOR LOGIN [{{ account.username }}]; select @@version"
ignore_errors: true
when: user_exist.query_results[0] | length == 0
register: change_info
- name: Verify password
community.general.mssql_script:
mssql_script:
login_user: "{{ account.username }}"
login_password: "{{ account.secret }}"
login_host: "{{ jms_asset.address }}"
login_port: "{{ jms_asset.port }}"
name: '{{ jms_asset.spec_info.db_name }}'
encryption: "{{ jms_asset.encryption | default(None) }}"
tds_version: "{{ jms_asset.tds_version | default(None) }}"
script: |
SELECT @@version
when: check_conn_after_change

View File

@@ -18,6 +18,7 @@
uid: "{{ params.uid | int if params.uid | length > 0 else omit }}"
shell: "{{ params.shell if params.shell | length > 0 else omit }}"
home: "{{ params.home if params.home | length > 0 else '/home/' + account.username }}"
group: "{{ params.group if params.group | length > 0 else omit }}"
groups: "{{ params.groups if params.groups | length > 0 else omit }}"
append: "{{ true if params.groups | length > 0 else false }}"
expires: -1

View File

@@ -28,6 +28,12 @@ params:
default: ''
help_text: "{{ 'Params home help text' | trans }}"
- name: group
type: str
label: "{{ 'Params group label' | trans }}"
default: ''
help_text: "{{ 'Params group help text' | trans }}"
- name: groups
type: str
label: "{{ 'Params groups label' | trans }}"
@@ -61,6 +67,11 @@ i18n:
ja: 'デフォルトのホームディレクトリ /home/{アカウントユーザ名}'
en: 'Default home directory /home/{account username}'
Params group help text:
zh: '请输入用户组(名字或数字),只能输入一个(需填写已存在的用户组)'
ja: 'ユーザー グループ (名前または番号) を入力してください。入力できるのは 1 つだけです (既存のユーザー グループを入力する必要があります)'
en: 'Please enter a user group (name or number), only one can be entered (must fill in an existing user group)'
Params groups help text:
zh: '请输入用户组,多个用户组使用逗号分隔(需填写已存在的用户组)'
ja: 'グループを入力してください。複数のグループはコンマで区切ってください(既存のグループを入力してください)'
@@ -86,6 +97,11 @@ i18n:
ja: 'グループ'
en: 'Groups'
Params group label:
zh: '主组'
ja: '主组'
en: 'Main group'
Params uid label:
zh: '用户ID'
ja: 'ユーザーID'

View File

@@ -18,6 +18,7 @@
uid: "{{ params.uid | int if params.uid | length > 0 else omit }}"
shell: "{{ params.shell if params.shell | length > 0 else omit }}"
home: "{{ params.home if params.home | length > 0 else '/home/' + account.username }}"
group: "{{ params.group if params.group | length > 0 else omit }}"
groups: "{{ params.groups if params.groups | length > 0 else omit }}"
append: "{{ true if params.groups | length > 0 else false }}"
expires: -1

View File

@@ -30,6 +30,12 @@ params:
default: ''
help_text: "{{ 'Params home help text' | trans }}"
- name: group
type: str
label: "{{ 'Params group label' | trans }}"
default: ''
help_text: "{{ 'Params group help text' | trans }}"
- name: groups
type: str
label: "{{ 'Params groups label' | trans }}"
@@ -63,6 +69,11 @@ i18n:
ja: 'デフォルトのホームディレクトリ /home/{アカウントユーザ名}'
en: 'Default home directory /home/{account username}'
Params group help text:
zh: '请输入用户组(名字或数字),只能输入一个(需填写已存在的用户组)'
ja: 'ユーザー グループ (名前または番号) を入力してください。入力できるのは 1 つだけです (既存のユーザー グループを入力する必要があります)'
en: 'Please enter a user group (name or number), only one can be entered (must fill in an existing user group)'
Params groups help text:
zh: '请输入用户组,多个用户组使用逗号分隔(需填写已存在的用户组)'
ja: 'グループを入力してください。複数のグループはコンマで区切ってください(既存のグループを入力してください)'
@@ -84,9 +95,14 @@ i18n:
en: 'Home'
Params groups label:
zh: '用户组'
ja: 'グループ'
en: 'Groups'
zh: '附加组'
ja: '追加グループ'
en: 'Additional Group'
Params group label:
zh: '主组'
ja: '主组'
en: 'Main group'
Params uid label:
zh: '用户ID'

View File

@@ -5,11 +5,13 @@
tasks:
- name: "Remove account"
community.general.mssql_script:
mssql_script:
login_user: "{{ jms_account.username }}"
login_password: "{{ jms_account.secret }}"
login_host: "{{ jms_asset.address }}"
login_port: "{{ jms_asset.port }}"
name: "{{ jms_asset.spec_info.db_name }}"
encryption: "{{ jms_asset.encryption | default(None) }}"
tds_version: "{{ jms_asset.tds_version | default(None) }}"
script: "DROP LOGIN {{ account.username }}; select @@version"

View File

@@ -5,11 +5,13 @@
tasks:
- name: Verify account
community.general.mssql_script:
mssql_script:
login_user: "{{ account.username }}"
login_password: "{{ account.secret }}"
login_host: "{{ jms_asset.address }}"
login_port: "{{ jms_asset.port }}"
name: '{{ jms_asset.spec_info.db_name }}'
encryption: "{{ jms_asset.encryption | default(None) }}"
tds_version: "{{ jms_asset.tds_version | default(None) }}"
script: |
SELECT @@version

View File

@@ -1,13 +0,0 @@
- hosts: website
gather_facts: no
vars:
ansible_python_interpreter: "{{ local_python_interpreter }}"
tasks:
- name: Verify account
website_ping:
login_host: "{{ jms_asset.address }}"
login_user: "{{ account.username }}"
login_password: "{{ account.secret }}"
steps: "{{ params.steps }}"
load_state: "{{ params.load_state }}"

View File

@@ -1,50 +0,0 @@
id: verify_account_website
name: "{{ 'Website account verify' | trans }}"
category: web
type:
- website
method: verify_account
priority: 50
params:
- name: load_state
type: choice
label: "{{ 'Load state' | trans }}"
choices:
- [ networkidle, "{{ 'Network idle' | trans }}" ]
- [ domcontentloaded, "{{ 'Dom content loaded' | trans }}" ]
- [ load, "{{ 'Load completed' | trans }}" ]
default: 'load'
- name: steps
type: list
label: "{{ 'Steps' | trans }}"
help_text: "{{ 'Params step help text' | trans }}"
default: []
i18n:
Website account verify:
zh: 使用 Playwright 模拟浏览器验证账号
ja: Playwright を使用してブラウザをシミュレートし、アカウントの検証を行います
en: Use Playwright to simulate a browser for account verification.
Load state:
zh: 加载状态检测
en: Load state detection
ja: ロード状態の検出
Steps:
zh: 步骤
en: Steps
ja: 手順
Network idle:
zh: 网络空闲
en: Network idle
ja: ネットワークが空いた状態
Dom content loaded:
zh: 文档内容加载完成
en: Dom content loaded
ja: ドキュメントの内容がロードされた状態
Load completed:
zh: 全部加载完成
en: All load completed
ja: すべてのロードが完了した状態
Params step help text:
zh: 配置步骤,根据配置决定任务执行步骤
ja: パラメータを設定し、設定に基づいてタスクの実行手順を決定します
en: Configure steps, and determine the task execution steps based on the configuration.

View File

@@ -14,7 +14,7 @@ from accounts.models import Account, AccountTemplate, GatheredAccount
from accounts.tasks import push_accounts_to_assets_task
from assets.const import Category, AllTypes
from assets.models import Asset
from common.serializers import SecretReadableMixin
from common.serializers import SecretReadableMixin, CommonBulkModelSerializer
from common.serializers.fields import ObjectRelatedField, LabeledChoiceField
from common.utils import get_logger
from .base import BaseAccountSerializer, AuthValidateMixin
@@ -253,6 +253,8 @@ class AccountSerializer(AccountCreateUpdateSerializerMixin, BaseAccountSerialize
'source_id': {'required': False, 'allow_null': True},
}
fields_unimport_template = ['params']
# Check uniqueness manually (DRF's automatic unique-together validators are disabled)
validators = []
@classmethod
def setup_eager_loading(cls, queryset):
@@ -263,6 +265,21 @@ class AccountSerializer(AccountCreateUpdateSerializerMixin, BaseAccountSerialize
)
return queryset
def validate(self, attrs):
instance = getattr(self, "instance", None)
if instance:
return super().validate(attrs)
field_errors = {}
for _fields in Account._meta.unique_together:
lookup = {field: attrs.get(field) for field in _fields}
if Account.objects.filter(**lookup).exists():
verbose_names = ', '.join([str(Account._meta.get_field(f).verbose_name) for f in _fields])
msg_template = _('Account already exists. Field(s): {fields} must be unique.')
field_errors[_fields[0]] = msg_template.format(fields=verbose_names)
raise serializers.ValidationError(field_errors)
return attrs
class AccountDetailSerializer(AccountSerializer):
has_secret = serializers.BooleanField(label=_("Has secret"), read_only=True)
@@ -275,26 +292,26 @@ class AccountDetailSerializer(AccountSerializer):
class AssetAccountBulkSerializerResultSerializer(serializers.Serializer):
asset = serializers.CharField(read_only=True, label=_('Asset'))
account = serializers.CharField(read_only=True, label=_('Account'))
state = serializers.CharField(read_only=True, label=_('State'))
error = serializers.CharField(read_only=True, label=_('Error'))
changed = serializers.BooleanField(read_only=True, label=_('Changed'))
class AssetAccountBulkSerializer(
AccountCreateUpdateSerializerMixin, AuthValidateMixin, serializers.ModelSerializer
AccountCreateUpdateSerializerMixin, AuthValidateMixin, CommonBulkModelSerializer
):
su_from_username = serializers.CharField(
max_length=128, required=False, write_only=True, allow_null=True, label=_("Su from"),
allow_blank=True,
)
assets = serializers.PrimaryKeyRelatedField(queryset=Asset.objects, many=True, label=_('Assets'))
class Meta:
model = Account
fields = [
'name', 'username', 'secret', 'secret_type', 'passphrase',
'privileged', 'is_active', 'comment', 'template',
'on_invalid', 'push_now', 'params', 'assets',
'name', 'username', 'secret', 'secret_type', 'secret_reset',
'passphrase', 'privileged', 'is_active', 'comment', 'template',
'on_invalid', 'push_now', 'params',
'su_from_username', 'source', 'source_id',
]
extra_kwargs = {
@@ -376,8 +393,7 @@ class AssetAccountBulkSerializer(
handler = self._handle_err_create
return handler
def perform_bulk_create(self, vd):
assets = vd.pop('assets')
def perform_bulk_create(self, vd, assets):
on_invalid = vd.pop('on_invalid', 'skip')
secret_type = vd.get('secret_type', 'password')
@@ -385,8 +401,7 @@ class AssetAccountBulkSerializer(
vd['name'] = vd.get('username')
create_handler = self.get_create_handler(on_invalid)
asset_ids = [asset.id for asset in assets]
secret_type_supports = Asset.get_secret_type_assets(asset_ids, secret_type)
secret_type_supports = Asset.get_secret_type_assets(assets, secret_type)
_results = {}
for asset in assets:
@@ -394,6 +409,7 @@ class AssetAccountBulkSerializer(
_results[asset] = {
'error': _('Asset does not support this secret type: %s') % secret_type,
'state': 'error',
'account': vd['name'],
}
continue
@@ -403,13 +419,13 @@ class AssetAccountBulkSerializer(
self.clean_auth_fields(vd)
instance, changed, state = self.perform_create(vd, create_handler)
_results[asset] = {
'changed': changed, 'instance': instance.id, 'state': state
'changed': changed, 'instance': instance.id, 'state': state, 'account': vd['name']
}
except serializers.ValidationError as e:
_results[asset] = {'error': e.detail[0], 'state': 'error'}
_results[asset] = {'error': e.detail[0], 'state': 'error', 'account': vd['name']}
except Exception as e:
logger.exception(e)
_results[asset] = {'error': str(e), 'state': 'error'}
_results[asset] = {'error': str(e), 'state': 'error', 'account': vd['name']}
results = [{'asset': asset, **result} for asset, result in _results.items()]
state_score = {'created': 3, 'updated': 2, 'skipped': 1, 'error': 0}
@@ -426,7 +442,8 @@ class AssetAccountBulkSerializer(
errors.append({
'error': _('Account has exist'),
'state': 'error',
'asset': str(result['asset'])
'asset': str(result['asset']),
'account': result.get('account'),
})
if errors:
raise serializers.ValidationError(errors)
@@ -445,10 +462,16 @@ class AssetAccountBulkSerializer(
account_ids = [str(_id) for _id in accounts.values_list('id', flat=True)]
push_accounts_to_assets_task.delay(account_ids, params)
def create(self, validated_data):
def bulk_create(self, validated_data, assets):
if not assets:
raise serializers.ValidationError(
{'assets': _('At least one asset or node must be specified')},
{'nodes': _('At least one asset or node must be specified')}
)
params = validated_data.pop('params', None)
push_now = validated_data.pop('push_now', False)
results = self.perform_bulk_create(validated_data)
results = self.perform_bulk_create(validated_data, assets)
self.push_accounts_if_need(results, push_now, params)
for res in results:
res['asset'] = str(res['asset'])

View File
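
A note on the AccountSerializer hunk above: it clears DRF's automatic validators and re-checks Account._meta.unique_together by hand so that create requests get a per-field message naming the conflicting columns. A minimal, self-contained sketch of the same idea against a generic Django model (the function name and error wording here are illustrative, not taken from the diff):

    from django.utils.translation import gettext as _
    from rest_framework import serializers

    def check_unique_together(model, attrs):
        # Re-implement the unique_together check so the error can name the fields involved.
        errors = {}
        for fields in model._meta.unique_together:
            lookup = {f: attrs.get(f) for f in fields}
            if model.objects.filter(**lookup).exists():
                names = ', '.join(str(model._meta.get_field(f).verbose_name) for f in fields)
                errors[fields[0]] = _('Already exists. Field(s): {fields} must be unique.').format(fields=names)
        if errors:
            raise serializers.ValidationError(errors)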

@@ -1,36 +0,0 @@
{% load i18n %}
<h3>{% trans 'Task name' %}: {{ name }}</h3>
<h3>{% trans 'Task execution id' %}: {{ execution_id }}</h3>
<p>{% trans 'Respectful' %} {{ recipient }}</p>
<p>{% trans 'Hello! The following is the failure of changing the password of your assets or pushing the account. Please check and handle it in time.' %}</p>
<table style="width: 100%; border-collapse: collapse; max-width: 100%; text-align: left; margin-top: 20px;">
<caption></caption>
<thead>
<tr style="background-color: #f2f2f2;">
<th style="border: 1px solid #ddd; padding: 10px;">{% trans 'Asset' %}</th>
<th style="border: 1px solid #ddd; padding: 10px;">{% trans 'Account' %}</th>
<th style="border: 1px solid #ddd; padding: 10px;">{% trans 'Error' %}</th>
</tr>
</thead>
<tbody>
{% for asset_name, account_username, error in asset_account_errors %}
<tr>
<td style="border: 1px solid #ddd; padding: 10px;">{{ asset_name }}</td>
<td style="border: 1px solid #ddd; padding: 10px;">{{ account_username }}</td>
<td style="border: 1px solid #ddd; padding: 10px;">
<div style="
max-width: 90%;
white-space: nowrap;
overflow: hidden;
text-overflow: ellipsis;
display: block;"
title="{{ error }}"
>
{{ error }}
</div>
</td>
</tr>
{% endfor %}
</tbody>
</table>

View File

@@ -3,3 +3,4 @@ from .connect_method import *
from .login_acl import *
from .login_asset_acl import *
from .login_asset_check import *
from .data_masking import *

View File

@@ -0,0 +1,20 @@
from orgs.mixins.api import OrgBulkModelViewSet
from .common import ACLUserFilterMixin
from ..models import DataMaskingRule
from .. import serializers
__all__ = ['DataMaskingRuleViewSet']
class DataMaskingRuleFilter(ACLUserFilterMixin):
class Meta:
model = DataMaskingRule
fields = ('name',)
class DataMaskingRuleViewSet(OrgBulkModelViewSet):
model = DataMaskingRule
filterset_class = DataMaskingRuleFilter
search_fields = ('name',)
serializer_class = serializers.DataMaskingRuleSerializer

View File

@@ -8,7 +8,7 @@ __all__ = ['LoginAssetACLViewSet']
class LoginAssetACLFilter(ACLUserAssetFilterMixin):
class Meta:
model = models.LoginAssetACL
fields = ['name', ]
fields = ['name', 'action']
class LoginAssetACLViewSet(OrgBulkModelViewSet):

View File

@@ -0,0 +1,45 @@
# Generated by Django 4.1.13 on 2025-10-07 16:16
import common.db.fields
from django.conf import settings
import django.core.validators
from django.db import migrations, models
import uuid
class Migration(migrations.Migration):
dependencies = [
migrations.swappable_dependency(settings.AUTH_USER_MODEL),
('acls', '0002_auto_20210926_1047'),
]
operations = [
migrations.CreateModel(
name='DataMaskingRule',
fields=[
('created_by', models.CharField(blank=True, max_length=128, null=True, verbose_name='Created by')),
('updated_by', models.CharField(blank=True, max_length=128, null=True, verbose_name='Updated by')),
('date_created', models.DateTimeField(auto_now_add=True, null=True, verbose_name='Date created')),
('date_updated', models.DateTimeField(auto_now=True, verbose_name='Date updated')),
('comment', models.TextField(blank=True, default='', verbose_name='Comment')),
('id', models.UUIDField(default=uuid.uuid4, primary_key=True, serialize=False)),
('org_id', models.CharField(blank=True, db_index=True, default='', max_length=36, verbose_name='Organization')),
('priority', models.IntegerField(default=50, help_text='1-100, the lower the value will be match first', validators=[django.core.validators.MinValueValidator(1), django.core.validators.MaxValueValidator(100)], verbose_name='Priority')),
('action', models.CharField(default='reject', max_length=64, verbose_name='Action')),
('is_active', models.BooleanField(default=True, verbose_name='Active')),
('users', common.db.fields.JSONManyToManyField(default=dict, to='users.User', verbose_name='Users')),
('assets', common.db.fields.JSONManyToManyField(default=dict, to='assets.Asset', verbose_name='Assets')),
('accounts', models.JSONField(default=list, verbose_name='Accounts')),
('name', models.CharField(max_length=128, verbose_name='Name')),
('fields_pattern', models.CharField(default='password', max_length=128, verbose_name='Fields pattern')),
('masking_method', models.CharField(choices=[('fixed_char', 'Fixed Character Replacement'), ('hide_middle', 'Hide Middle Characters'), ('keep_prefix', 'Keep Prefix Only'), ('keep_suffix', 'Keep Suffix Only')], default='fixed_char', max_length=32, verbose_name='Masking Method')),
('mask_pattern', models.CharField(blank=True, default='######', max_length=128, null=True, verbose_name='Mask Pattern')),
('reviewers', models.ManyToManyField(blank=True, to=settings.AUTH_USER_MODEL, verbose_name='Reviewers')),
],
options={
'verbose_name': 'Data Masking Rule',
'unique_together': {('org_id', 'name')},
},
),
]

View File

@@ -2,3 +2,4 @@ from .command_acl import *
from .connect_method import *
from .login_acl import *
from .login_asset_acl import *
from .data_masking import *

View File

@@ -0,0 +1,42 @@
from django.db import models
from acls.models import UserAssetAccountBaseACL
from common.utils import get_logger
from django.utils.translation import gettext_lazy as _
logger = get_logger(__file__)
__all__ = ['MaskingMethod', 'DataMaskingRule']
class MaskingMethod(models.TextChoices):
fixed_char = "fixed_char", _("Fixed Character Replacement") # 固定字符替换
hide_middle = "hide_middle", _("Hide Middle Characters") # 隐藏中间几位
keep_prefix = "keep_prefix", _("Keep Prefix Only") # 只保留前缀
keep_suffix = "keep_suffix", _("Keep Suffix Only") # 只保留后缀
class DataMaskingRule(UserAssetAccountBaseACL):
name = models.CharField(max_length=128, verbose_name=_("Name"))
fields_pattern = models.CharField(max_length=128, default='password', verbose_name=_("Fields pattern"))
masking_method = models.CharField(
max_length=32,
choices=MaskingMethod.choices,
default=MaskingMethod.fixed_char,
verbose_name=_("Masking Method"),
)
mask_pattern = models.CharField(
max_length=128,
verbose_name=_("Mask Pattern"),
default="######",
blank=True,
null=True,
)
def __str__(self):
return self.name
class Meta:
unique_together = [('org_id', 'name')]
verbose_name = _("Data Masking Rule")

View File
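
The new DataMaskingRule model above only stores the rule definition (fields_pattern, masking_method, mask_pattern); how a consumer applies it to query results is not part of this diff. The following is a hedged sketch of one plausible reading of the four MaskingMethod choices; the apply_masking helper is hypothetical and not an API introduced by these commits:

    def apply_masking(value: str, method: str, mask_pattern: str = '######') -> str:
        # Hypothetical helper illustrating the MaskingMethod choices.
        if method == 'fixed_char':    # replace the whole value with the mask
            return mask_pattern
        if method == 'hide_middle':   # keep the first and last character only
            return value[0] + mask_pattern + value[-1] if len(value) > 2 else mask_pattern
        if method == 'keep_prefix':   # keep a short prefix only
            return value[:3] + mask_pattern
        if method == 'keep_suffix':   # keep a short suffix only
            return mask_pattern + value[-3:]
        return value

    assert apply_masking('secret-password', 'hide_middle') == 's######d'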

@@ -1,30 +1,52 @@
from django.template.loader import render_to_string
from django.utils import timezone
from django.utils.translation import gettext_lazy as _
from accounts.models import Account
from acls.models import LoginACL, LoginAssetACL
from assets.models import Asset
from audits.models import UserLoginLog
from common.views.template import custom_render_to_string
from notifications.notifications import UserMessage
from users.models import User
class UserLoginReminderMsg(UserMessage):
subject = _('User login reminder')
template_name = 'acls/user_login_reminder.html'
contexts = [
{"name": "city", "label": _('Login city'), "default": "Shanghai"},
{"name": "username", "label": _('User'), "default": "john"},
{"name": "ip", "label": "IP", "default": "192.168.1.1"},
{"name": "recipient_name", "label": _("Recipient name"), "default": "John"},
{"name": "recipient_username", "label": _("Recipient username"), "default": "john"},
{"name": "user_agent", "label": _('User agent'), "default": "Mozilla/5.0"},
{"name": "acl_name", "label": _('ACL name'), "default": "login acl"},
{"name": "login_from", "label": _('Login from'), "default": "web"},
{"name": "time", "label": _('Login time'), "default": "2025-01-01 12:00:00"},
]
def __init__(self, user, user_log: UserLoginLog):
def __init__(self, user, user_log: UserLoginLog, acl: LoginACL):
self.user_log = user_log
self.acl_name = str(acl)
self.login_from = user_log.get_type_display()
now = timezone.localtime(user_log.datetime)
self.time = now.strftime('%Y-%m-%d %H:%M:%S')
super().__init__(user)
def get_html_msg(self) -> dict:
user_log = self.user_log
context = {
'ip': user_log.ip,
'time': self.time,
'city': user_log.city,
'acl_name': self.acl_name,
'login_from': self.login_from,
'username': user_log.username,
'recipient': self.user,
'recipient_name': self.user.name,
'recipient_username': self.user.username,
'user_agent': user_log.user_agent,
}
message = render_to_string('acls/user_login_reminder.html', context)
message = custom_render_to_string(self.template_name, context)
return {
'subject': str(self.subject),
@@ -40,24 +62,55 @@ class UserLoginReminderMsg(UserMessage):
class AssetLoginReminderMsg(UserMessage):
subject = _('User login alert for asset')
template_name = 'acls/asset_login_reminder.html'
contexts = [
{"name": "city", "label": _('Login city'), "default": "Shanghai"},
{"name": "username", "label": _('User'), "default": "john"},
{"name": "name", "label": _('Name'), "default": "John"},
{"name": "asset", "label": _('Asset'), "default": "dev server"},
{"name": "recipient_name", "label": _('Recipient name'), "default": "John"},
{"name": "recipient_username", "label": _('Recipient username'), "default": "john"},
{"name": "account", "label": _('Account Input username'), "default": "root"},
{"name": "account_name", "label": _('Account name'), "default": "root"},
{"name": "acl_name", "label": _('ACL name'), "default": "login acl"},
{"name": "ip", "label": "IP", "default": "192.168.1.1"},
{"name": "login_from", "label": _('Login from'), "default": "web"},
{"name": "time", "label": _('Login time'), "default": "2025-01-01 12:00:00"}
]
def __init__(self, user, asset: Asset, login_user: User, account: Account, input_username):
def __init__(
self, user, asset: Asset, login_user: User,
account: Account, acl: LoginAssetACL,
ip, input_username, login_from
):
self.ip = ip
self.asset = asset
self.login_user = login_user
self.account = account
self.acl_name = str(acl)
self.login_from = login_from
self.login_user = login_user
self.input_username = input_username
now = timezone.localtime(timezone.now())
self.time = now.strftime('%Y-%m-%d %H:%M:%S')
super().__init__(user)
def get_html_msg(self) -> dict:
context = {
'recipient': self.user,
'ip': self.ip,
'time': self.time,
'login_from': self.login_from,
'recipient_name': self.user.name,
'recipient_username': self.user.username,
'username': self.login_user.username,
'name': self.login_user.name,
'asset': str(self.asset),
'account': self.input_username,
'account_name': self.account.name,
'acl_name': self.acl_name,
}
message = render_to_string('acls/asset_login_reminder.html', context)
message = custom_render_to_string(self.template_name, context)
return {
'subject': str(self.subject),

View File
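
Both reminder messages above now declare a contexts list (variable name, label, default) and render through custom_render_to_string instead of render_to_string. One plausible use of such a declaration is building a preview context from the declared defaults; preview_context below is purely illustrative and not part of the diff:

    def preview_context(contexts):
        # Collect {name: default} so the notification template can be rendered with sample data.
        return {item['name']: item.get('default', '') for item in contexts}

    # e.g. preview_context(UserLoginReminderMsg.contexts)
    # -> {'city': 'Shanghai', 'username': 'john', 'ip': '192.168.1.1', ...}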

@@ -3,3 +3,4 @@ from .connect_method import *
from .login_acl import *
from .login_asset_acl import *
from .login_asset_check import *
from .data_masking import *

View File

@@ -90,7 +90,7 @@ class BaseACLSerializer(ActionAclSerializer, serializers.Serializer):
fields_small = fields_mini + [
"is_active", "priority", "action",
"date_created", "date_updated",
"comment", "created_by", "org_id",
"comment", "created_by"
]
fields_m2m = ["reviewers", ]
fields = fields_small + fields_m2m
@@ -100,6 +100,20 @@ class BaseACLSerializer(ActionAclSerializer, serializers.Serializer):
'reviewers': {'label': _('Recipients')},
}
class BaseUserACLSerializer(BaseACLSerializer):
users = JSONManyToManyField(label=_('User'))
class Meta(BaseACLSerializer.Meta):
fields = BaseACLSerializer.Meta.fields + ['users']
class BaseUserAssetAccountACLSerializer(BaseUserACLSerializer):
assets = JSONManyToManyField(label=_('Asset'))
accounts = serializers.ListField(label=_('Account'))
class Meta(BaseUserACLSerializer.Meta):
fields = BaseUserACLSerializer.Meta.fields + ['assets', 'accounts', 'org_id']
def validate_reviewers(self, reviewers):
action = self.initial_data.get('action')
if not action and self.instance:
@@ -118,19 +132,4 @@ class BaseACLSerializer(ActionAclSerializer, serializers.Serializer):
"None of the reviewers belong to Organization `{}`".format(org.name)
)
raise serializers.ValidationError(error)
return valid_reviewers
class BaseUserACLSerializer(BaseACLSerializer):
users = JSONManyToManyField(label=_('User'))
class Meta(BaseACLSerializer.Meta):
fields = BaseACLSerializer.Meta.fields + ['users']
class BaseUserAssetAccountACLSerializer(BaseUserACLSerializer):
assets = JSONManyToManyField(label=_('Asset'))
accounts = serializers.ListField(label=_('Account'))
class Meta(BaseUserACLSerializer.Meta):
fields = BaseUserACLSerializer.Meta.fields + ['assets', 'accounts']
return valid_reviewers

View File

@@ -0,0 +1,19 @@
from django.utils.translation import gettext_lazy as _
from acls.models import MaskingMethod, DataMaskingRule
from common.serializers.fields import LabeledChoiceField
from common.serializers.mixin import CommonBulkModelSerializer
from orgs.mixins.serializers import BulkOrgResourceModelSerializer
from .base import BaseUserAssetAccountACLSerializer as BaseSerializer
__all__ = ['DataMaskingRuleSerializer']
class DataMaskingRuleSerializer(BaseSerializer, BulkOrgResourceModelSerializer):
masking_method = LabeledChoiceField(
choices=MaskingMethod.choices, default=MaskingMethod.fixed_char, label=_('Masking Method')
)
class Meta(BaseSerializer.Meta):
model = DataMaskingRule
fields = BaseSerializer.Meta.fields + ['fields_pattern', 'masking_method', 'mask_pattern']

View File

@@ -17,7 +17,7 @@ class LoginACLSerializer(BaseUserACLSerializer, CommonBulkModelSerializer):
class Meta(BaseUserACLSerializer.Meta):
model = LoginACL
fields = list((set(BaseUserACLSerializer.Meta.fields) | {'rules'}) - {'org_id'})
fields = list((set(BaseUserACLSerializer.Meta.fields) | {'rules'}))
action_choices_exclude = [
ActionChoices.warning,
ActionChoices.notify_and_warn,

View File

@@ -1,13 +1,17 @@
{% load i18n %}
<h3>{% trans 'Dear' %}: {{ recipient.name }}[{{ recipient.username }}]</h3>
<h3>{% trans 'Dear' %}: {{ recipient_name }}[{{ recipient_username }}]</h3>
<hr>
<p>{% trans 'We would like to inform you that a user has recently logged into the following asset:' %}<p>
<p><strong>{% trans 'Asset details' %}:</strong></p>
<ul>
<li><strong>{% trans 'User' %}:</strong> [{{ name }}({{ username }})]</li>
<li><strong>IP:</strong> [{{ ip }}]</li>
<li><strong>{% trans 'Assets' %}:</strong> [{{ asset }}]</li>
<li><strong>{% trans 'Account' %}:</strong> [{{ account_name }}({{ account }})]</li>
<li><strong>{% trans 'Login asset acl' %}:</strong> [{{ acl_name }}]</li>
<li><strong>{% trans 'Login from' %}:</strong> [{{ login_from }}]</li>
<li><strong>{% trans 'Time' %}:</strong> [{{ time }}]</li>
</ul>
<hr>

View File

@@ -1,6 +1,6 @@
{% load i18n %}
<h3>{% trans 'Dear' %}: {{ recipient.name }}[{{ recipient.username }}]</h3>
<h3>{% trans 'Dear' %}: {{ recipient_name }}[{{ recipient_username }}]</h3>
<hr>
<p>{% trans 'We would like to inform you that a user has recently logged:' %}<p>
<p><strong>{% trans 'User details' %}:</strong></p>
@@ -8,7 +8,10 @@
<li><strong>{% trans 'User' %}:</strong> [{{ username }}]</li>
<li><strong>IP:</strong> [{{ ip }}]</li>
<li><strong>{% trans 'Login city' %}:</strong> [{{ city }}]</li>
<li><strong>{% trans 'Login from' %}:</strong> [{{ login_from }}]</li>
<li><strong>{% trans 'User agent' %}:</strong> [{{ user_agent }}]</li>
<li><strong>{% trans 'Login acl' %}:</strong> [{{ acl_name }}]</li>
<li><strong>{% trans 'Time' %}:</strong> [{{ time }}]</li>
</ul>
<hr>

View File

@@ -11,6 +11,7 @@ router.register(r'login-asset-acls', api.LoginAssetACLViewSet, 'login-asset-acl'
router.register(r'command-filter-acls', api.CommandFilterACLViewSet, 'command-filter-acl')
router.register(r'command-groups', api.CommandGroupViewSet, 'command-group')
router.register(r'connect-method-acls', api.ConnectMethodACLViewSet, 'connect-method-acl')
router.register(r'data-masking-rules', api.DataMaskingRuleViewSet, 'data-masking-rule')
urlpatterns = [
path('login-asset/check/', api.LoginAssetCheckAPI.as_view(), name='login-asset-check'),

View File

@@ -1,8 +1,7 @@
# -*- coding: utf-8 -*-
#
from collections import defaultdict
from django.conf import settings
from django.db import transaction
from django.shortcuts import get_object_or_404
from django.utils.translation import gettext as _
from django_filters import rest_framework as drf_filters
@@ -113,7 +112,7 @@ class BaseAssetViewSet(OrgBulkModelViewSet):
("accounts", AccountSerializer),
)
rbac_perms = (
("match", "assets.match_asset"),
("match", "assets.view_asset"),
("platform", "assets.view_platform"),
("gateways", "assets.view_gateway"),
("accounts", "assets.view_account"),
@@ -181,33 +180,18 @@ class AssetViewSet(SuggestionMixin, BaseAssetViewSet):
def sync_platform_protocols(self, request, *args, **kwargs):
platform_id = request.data.get('platform_id')
platform = get_object_or_404(Platform, pk=platform_id)
assets = platform.assets.all()
asset_ids = list(platform.assets.values_list('id', flat=True))
platform_protocols = list(platform.protocols.values('name', 'port'))
platform_protocols = {
p['name']: p['port']
for p in platform.protocols.values('name', 'port')
}
asset_protocols_map = defaultdict(set)
protocols = assets.prefetch_related('protocols').values_list(
'id', 'protocols__name'
)
for asset_id, protocol in protocols:
asset_id = str(asset_id)
asset_protocols_map[asset_id].add(protocol)
objs = []
for asset_id, protocols in asset_protocols_map.items():
protocol_names = set(platform_protocols) - protocols
if not protocol_names:
continue
for name in protocol_names:
objs.append(
Protocol(
name=name,
port=platform_protocols[name],
asset_id=asset_id,
)
)
Protocol.objects.bulk_create(objs)
with transaction.atomic():
if asset_ids:
Protocol.objects.filter(asset_id__in=asset_ids).delete()
if asset_ids and platform_protocols:
objs = []
for aid in asset_ids:
for p in platform_protocols:
objs.append(Protocol(name=p['name'], port=p['port'], asset_id=aid))
Protocol.objects.bulk_create(objs)
return Response(status=status.HTTP_200_OK)
def filter_bulk_update_data(self):

View File

@@ -16,7 +16,6 @@ class CategoryViewSet(ListModelMixin, JMSGenericViewSet):
'types': TypeSerializer,
}
permission_classes = (IsValidUser,)
default_limit = None
def get_queryset(self):
return AllTypes.categories()

View File

@@ -14,7 +14,7 @@ class FavoriteAssetViewSet(BulkModelViewSet):
serializer_class = FavoriteAssetSerializer
permission_classes = (IsValidUser,)
filterset_fields = ['asset']
default_limit = None
page_no_limit = True
def dispatch(self, request, *args, **kwargs):
with tmp_to_root_org():

View File

@@ -43,7 +43,7 @@ class NodeViewSet(SuggestionMixin, OrgBulkModelViewSet):
search_fields = ('full_value',)
serializer_class = serializers.NodeSerializer
rbac_perms = {
'match': 'assets.match_node',
'match': 'assets.view_node',
'check_assets_amount_task': 'assets.change_node'
}

View File

@@ -43,7 +43,7 @@ class AssetPlatformViewSet(JMSModelViewSet):
'ops_methods': 'assets.view_platform',
'filter_nodes_assets': 'assets.view_platform',
}
default_limit = None
page_no_limit = True
def get_queryset(self):
# Pagination is not applied here, so prefetch related objects at this point
@@ -112,8 +112,10 @@ class PlatformProtocolViewSet(JMSModelViewSet):
class PlatformAutomationMethodsApi(generics.ListAPIView):
permission_classes = (IsValidUser,)
queryset = PlatformAutomation.objects.none()
rbac_perms = {
'list': 'assets.view_platform'
}
@staticmethod
def automation_methods():

View File

@@ -1,8 +1,8 @@
from rest_framework.generics import ListAPIView
from assets import serializers
from assets.const import Protocol
from common.permissions import IsValidUser
from assets.models import Protocol
__all__ = ['ProtocolListApi']

View File

@@ -201,17 +201,14 @@ class PlaybookPrepareMixin:
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
# example: {'gather_fact_windows': {'id': 'gather_fact_windows', 'name': '', 'method': 'gather_fact', ...} }
self.method_id_meta_mapper = self.get_method_id_meta_mapper()
# Group by execution method: change-secret, push and similar operations on different assets may use different methods
# Then group by execution method and again by bulk_size to generate separate playbooks
self.playbooks = []
def get_method_id_meta_mapper(self):
return {
self.method_id_meta_mapper = {
method["id"]: method
for method in self.platform_automation_methods
if method["method"] == self.__class__.method_type()
}
# Group by execution method: change-secret, push and similar operations on different assets may use different methods
# Then group by execution method and again by bulk_size to generate separate playbooks
self.playbooks = []
@classmethod
def method_type(cls):

View File

@@ -6,11 +6,13 @@
tasks:
- name: Test SQLServer connection
community.general.mssql_script:
mssql_script:
login_user: "{{ jms_account.username }}"
login_password: "{{ jms_account.secret }}"
login_host: "{{ jms_asset.address }}"
login_port: "{{ jms_asset.port }}"
name: '{{ jms_asset.spec_info.db_name }}'
encryption: "{{ jms_asset.encryption | default(None) }}"
tds_version: "{{ jms_asset.tds_version | default(None) }}"
script: |
SELECT @@version

View File

@@ -1,13 +0,0 @@
- hosts: website
gather_facts: no
vars:
ansible_python_interpreter: "{{ local_python_interpreter }}"
tasks:
- name: Test Website connection
website_ping:
login_host: "{{ jms_asset.address }}"
login_user: "{{ jms_account.username }}"
login_password: "{{ jms_account.secret }}"
steps: "{{ params.steps }}"
load_state: "{{ params.load_state }}"

View File

@@ -1,50 +0,0 @@
id: website_ping
name: "{{ 'Website ping' | trans }}"
method: ping
category:
- web
type:
- website
params:
- name: load_state
type: choice
label: "{{ 'Load state' | trans }}"
choices:
- [ networkidle, "{{ 'Network idle' | trans }}" ]
- [ domcontentloaded, "{{ 'Dom content loaded' | trans }}" ]
- [ load, "{{ 'Load completed' | trans }}" ]
default: 'load'
- name: steps
type: list
default: []
label: "{{ 'Steps' | trans }}"
help_text: "{{ 'Params step help text' | trans }}"
i18n:
Website ping:
zh: 使用 Playwright 模拟浏览器测试可连接性
en: Use Playwright to simulate a browser for connectivity testing
ja: Playwright を使用してブラウザをシミュレートし、接続性テストを実行する
Load state:
zh: 加载状态检测
en: Load state detection
ja: ロード状態の検出
Steps:
zh: 步骤
en: Steps
ja: 手順
Network idle:
zh: 网络空闲
en: Network idle
ja: ネットワークが空いた状態
Dom content loaded:
zh: 文档内容加载完成
en: Dom content loaded
ja: ドキュメントの内容がロードされた状態
Load completed:
zh: 全部加载完成
en: All load completed
ja: すべてのロードが完了した状態
Params step help text:
zh: 配置步骤,根据配置决定任务执行步骤
ja: パラメータを設定し、設定に基づいてタスクの実行手順を決定します
en: Configure steps, and determine the task execution steps based on the configuration.

View File

@@ -1,5 +1,6 @@
from django.utils.translation import gettext_lazy as _
from orgs.models import Organization
from .base import BaseType
@@ -52,3 +53,41 @@ class GPTTypes(BaseType):
return [
cls.CHATGPT,
]
CHATX_NAME = 'ChatX'
def create_or_update_chatx_resources(chatx_name=CHATX_NAME, org_id=Organization.SYSTEM_ID):
from django.apps import apps
platform_model = apps.get_model('assets', 'Platform')
asset_model = apps.get_model('assets', 'Asset')
account_model = apps.get_model('accounts', 'Account')
platform, __ = platform_model.objects.get_or_create(
name=chatx_name,
defaults={
'internal': True,
'type': chatx_name,
'category': 'ai',
}
)
asset, __ = asset_model.objects.get_or_create(
address=chatx_name,
defaults={
'name': chatx_name,
'platform': platform,
'org_id': org_id
}
)
account, __ = account_model.objects.get_or_create(
username=chatx_name,
defaults={
'name': chatx_name,
'asset': asset,
'org_id': org_id
}
)
return asset, account

View File

@@ -250,6 +250,12 @@ class Protocol(ChoicesMixin, models.TextChoices):
'default': False,
'label': _('Auth username')
},
'enable_cluster_mode': {
'type': 'bool',
'default': False,
'label': _('Enable cluster mode'),
'help_text': _('Enable if this Redis instance is part of a cluster')
},
}
},
}

View File

@@ -20,17 +20,13 @@ class WebTypes(BaseType):
def _get_automation_constrains(cls) -> dict:
constrains = {
'*': {
'ansible_enabled': True,
'ansible_config': {
'ansible_connection': 'local',
},
'ping_enabled': True,
'ansible_enabled': False,
'ping_enabled': False,
'gather_facts_enabled': False,
'verify_account_enabled': True,
'change_secret_enabled': True,
'verify_account_enabled': False,
'change_secret_enabled': False,
'push_account_enabled': False,
'gather_accounts_enabled': False,
'remove_account_enabled': False,
}
}
return constrains

View File

@@ -408,8 +408,7 @@ class Asset(NodesRelationMixin, LabeledMixin, AbsConnectivity, JSONFilterMixin,
return tree_node
@staticmethod
def get_secret_type_assets(asset_ids, secret_type):
assets = Asset.objects.filter(id__in=asset_ids)
def get_secret_type_assets(assets, secret_type):
asset_protocol = assets.prefetch_related('protocols').values_list('id', 'protocols__name')
protocol_secret_types_map = const.Protocol.protocol_secret_types()
asset_secret_types_mapp = defaultdict(set)

View File

@@ -28,7 +28,8 @@ class MyAsset(JMSBaseModel):
@staticmethod
def set_asset_custom_value(assets, user):
my_assets = MyAsset.objects.filter(asset__in=assets, user=user).all()
asset_ids = [asset.id for asset in assets]
my_assets = MyAsset.objects.filter(asset_id__in=asset_ids, user=user).all()
customs = {my_asset.asset.id: my_asset.custom_to_dict() for my_asset in my_assets}
for asset in assets:
custom = customs.get(asset.id)

View File

@@ -59,7 +59,10 @@ class DatabaseSerializer(AssetSerializer):
if not platform:
return
if platform.type in ['mysql', 'mariadb']:
if platform.type in [
'mysql', 'mariadb', 'oracle', 'sqlserver',
'db2', 'dameng', 'clickhouse', 'redis'
]:
db_field.required = False
db_field.allow_blank = True
db_field.allow_null = True

View File

@@ -26,4 +26,13 @@ class WebSerializer(AssetSerializer):
'submit_selector': {
'default': 'id=login_button',
},
'script': {
'default': [],
}
}
def to_internal_value(self, data):
data = data.copy()
if data.get('script') in ("", None):
data.pop('script', None)
return super().to_internal_value(data)

View File

@@ -84,6 +84,7 @@ class PlatformAutomationSerializer(serializers.ModelSerializer):
class PlatformProtocolSerializer(serializers.ModelSerializer):
setting = MethodSerializer(required=False, label=_("Setting"))
port_from_addr = serializers.BooleanField(label=_("Port from addr"), read_only=True)
port = serializers.IntegerField(label=_("Port"), required=False, min_value=0, max_value=65535)
class Meta:
model = PlatformProtocol

View File

@@ -23,6 +23,8 @@ logger = get_logger(__name__)
class OperatorLogHandler(metaclass=Singleton):
CACHE_KEY = 'OPERATOR_LOG_CACHE_KEY'
SYSTEM_OBJECTS = frozenset({"Role"})
PREFER_CURRENT_ELSE_USER = frozenset({"SSOToken"})
def __init__(self):
self.log_client = self.get_storage_client()
@@ -142,13 +144,21 @@ class OperatorLogHandler(metaclass=Singleton):
after = self.__data_processing(after)
return before, after
@staticmethod
def get_org_id(object_name):
system_obj = ('Role',)
org_id = get_current_org_id()
if object_name in system_obj:
org_id = Organization.SYSTEM_ID
return org_id
def get_org_id(self, user, object_name):
if object_name in self.SYSTEM_OBJECTS:
return Organization.SYSTEM_ID
current = get_current_org_id()
current_id = str(current) if current else None
if object_name in self.PREFER_CURRENT_ELSE_USER:
if current_id and current_id != Organization.DEFAULT_ID:
return current_id
org = user.orgs.distinct().first()
return str(org.id) if org else Organization.DEFAULT_ID
return current_id or Organization.DEFAULT_ID
def create_or_update_operate_log(
self, action, resource_type, resource=None, resource_display=None,
@@ -168,7 +178,7 @@ class OperatorLogHandler(metaclass=Singleton):
# Nothing changed before or after, so there is no need to record a log unless a forced save is requested
return
org_id = self.get_org_id(object_name)
org_id = self.get_org_id(user, object_name)
data = {
'id': log_id, "user": str(user), 'action': action,
'resource_type': str(resource_type), 'org_id': org_id,

View File
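
get_org_id above now prefers the current organization, but for objects listed in PREFER_CURRENT_ELSE_USER (currently SSOToken) it falls back to the user's first organization when the current one is the default. A compact restatement of that precedence as a hedged sketch (resolve_org_id and the placeholder id strings are illustrative):

    SYSTEM_ID, DEFAULT_ID = 'system', 'default'   # placeholders for Organization ids

    def resolve_org_id(object_name, current_id, user_org_id,
                       system_objects=('Role',), prefer_user=('SSOToken',)):
        # Hedged restatement of OperatorLogHandler.get_org_id's precedence.
        if object_name in system_objects:
            return SYSTEM_ID
        if object_name in prefer_user:
            if current_id and current_id != DEFAULT_ID:
                return current_id
            return user_org_id or DEFAULT_ID
        return current_id or DEFAULT_ID

    assert resolve_org_id('Role', 'org-1', 'org-2') == SYSTEM_ID
    assert resolve_org_id('SSOToken', DEFAULT_ID, 'org-2') == 'org-2'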

@@ -116,7 +116,7 @@ def send_login_info_to_reviewers(instance: UserLoginLog | str, auth_acl_id):
reviewers = acl.reviewers.all()
for reviewer in reviewers:
UserLoginReminderMsg(reviewer, instance).publish_async()
UserLoginReminderMsg(reviewer, instance, acl).publish_async()
@receiver(post_auth_success)

View File

@@ -47,20 +47,21 @@ def on_m2m_changed(sender, action, instance, reverse, model, pk_set, **kwargs):
objs = model.objects.filter(pk__in=pk_set)
objs_display = [str(o) for o in objs]
action = M2M_ACTION[action]
changed_field = current_instance.get(field_name, [])
changed_field = current_instance.get(field_name, {})
changed_value = changed_field.get('value', [])
after, before, before_value = None, None, None
if action == ActionChoices.create:
before_value = list(set(changed_field) - set(objs_display))
before_value = list(set(changed_value) - set(objs_display))
elif action == ActionChoices.delete:
before_value = list(
set(changed_field).symmetric_difference(set(objs_display))
)
before_value = list(set(changed_value).symmetric_difference(set(objs_display)))
if changed_field:
after = {field_name: changed_field}
if before_value:
before = {field_name: before_value}
before_change_field = changed_field.copy()
before_change_field['value'] = before_value
before = {field_name: before_change_field}
if sorted(str(before)) == sorted(str(after)):
return

View File

@@ -69,6 +69,8 @@ class RDPFileClientProtocolURLMixin:
'autoreconnection enabled:i': '1',
'bookmarktype:i': '3',
'use redirection server name:i': '0',
'bitmapcachepersistenable:i': '0',
'bitmapcachesize:i': '1500',
}
# copy from
@@ -76,7 +78,6 @@ class RDPFileClientProtocolURLMixin:
rdp_low_speed_broadband_option = {
"connection type:i": 2,
"disable wallpaper:i": 1,
"bitmapcachepersistenable:i": 1,
"disable full window drag:i": 1,
"disable menu anims:i": 1,
"allow font smoothing:i": 0,
@@ -87,7 +88,6 @@ class RDPFileClientProtocolURLMixin:
rdp_high_speed_broadband_option = {
"connection type:i": 4,
"disable wallpaper:i": 0,
"bitmapcachepersistenable:i": 1,
"disable full window drag:i": 1,
"disable menu anims:i": 0,
"allow font smoothing:i": 0,
@@ -362,6 +362,7 @@ class ConnectionTokenViewSet(AuthFaceMixin, ExtraActionApiMixin, RootOrgViewMixi
self.validate_serializer(serializer)
return super().perform_create(serializer)
def _insert_connect_options(self, data, user):
connect_options = data.pop('connect_options', {})
default_name_opts = {
@@ -375,7 +376,7 @@ class ConnectionTokenViewSet(AuthFaceMixin, ExtraActionApiMixin, RootOrgViewMixi
for name in default_name_opts.keys():
value = preferences.get(name, default_name_opts[name])
connect_options[name] = value
connect_options['lang'] = getattr(user, 'lang', settings.LANGUAGE_CODE)
connect_options['lang'] = getattr(user, 'lang') or settings.LANGUAGE_CODE
data['connect_options'] = connect_options
@staticmethod
@@ -431,7 +432,7 @@ class ConnectionTokenViewSet(AuthFaceMixin, ExtraActionApiMixin, RootOrgViewMixi
if account.username != AliasAccount.INPUT:
data['input_username'] = ''
ticket = self._validate_acl(user, asset, account, connect_method)
ticket = self._validate_acl(user, asset, account, connect_method, protocol)
if ticket:
data['from_ticket'] = ticket
@@ -470,7 +471,7 @@ class ConnectionTokenViewSet(AuthFaceMixin, ExtraActionApiMixin, RootOrgViewMixi
after=after, object_name=object_name
)
def _validate_acl(self, user, asset, account, connect_method):
def _validate_acl(self, user, asset, account, connect_method, protocol):
from acls.models import LoginAssetACL
kwargs = {'user': user, 'asset': asset, 'account': account}
if account.username == AliasAccount.INPUT:
@@ -523,9 +524,15 @@ class ConnectionTokenViewSet(AuthFaceMixin, ExtraActionApiMixin, RootOrgViewMixi
return
self._record_operate_log(acl, asset)
os = get_request_os(self.request) if self.request else 'windows'
method = ConnectMethodUtil.get_connect_method(
connect_method, protocol=protocol, os=os
)
login_from = method['label'] if method else connect_method
for reviewer in reviewers:
AssetLoginReminderMsg(
reviewer, asset, user, account, self.input_username
reviewer, asset, user, account, acl,
ip, self.input_username, login_from
).publish_async()
def create_face_verify(self, response):
@@ -558,7 +565,9 @@ class SuperConnectionTokenViewSet(ConnectionTokenViewSet):
rbac_perms = {
'create': 'authentication.add_superconnectiontoken',
'renewal': 'authentication.add_superconnectiontoken',
'list': 'authentication.view_superconnectiontoken',
'check': 'authentication.view_superconnectiontoken',
'retrieve': 'authentication.view_superconnectiontoken',
'get_secret_detail': 'authentication.view_superconnectiontokensecret',
'get_applet_info': 'authentication.view_superconnectiontoken',
'release_applet_account': 'authentication.view_superconnectiontoken',
@@ -566,7 +575,12 @@ class SuperConnectionTokenViewSet(ConnectionTokenViewSet):
}
def get_queryset(self):
return ConnectionToken.objects.all()
return ConnectionToken.objects.none()
def get_object(self):
pk = self.kwargs.get(self.lookup_field)
token = get_object_or_404(ConnectionToken, pk=pk)
return token
def get_user(self, serializer):
return serializer.validated_data.get('user')

View File

@@ -67,8 +67,9 @@ class UserResetPasswordSendCodeApi(CreateAPIView):
code = random_string(settings.SMS_CODE_LENGTH, lower=False, upper=False)
subject = '%s: %s' % (get_login_title(), _('Forgot password'))
tip = _('The validity period of the verification code is {} minute').format(settings.VERIFY_CODE_TTL // 60)
context = {
'user': user, 'title': subject, 'code': code,
'user': user, 'title': subject, 'code': code, 'tip': tip,
}
message = render_to_string('authentication/_msg_reset_password_code.html', context)
content = {'subject': subject, 'message': message}

View File

@@ -25,7 +25,10 @@ class JMSBaseAuthBackend:
"""
# After third-party authentication completes, the subsequent get_user lookup should also check whether the user is still valid
is_valid = getattr(user, 'is_valid', None)
return is_valid or is_valid is None
if not is_valid:
logger.info("User %s is not valid", getattr(user, "username", "<unknown>"))
return False
return True
# allow user to authenticate
def username_allow_authenticate(self, username):

View File

@@ -136,7 +136,7 @@ class SignatureAuthentication(signature.SignatureAuthentication):
# example implementation:
try:
key = AccessKey.objects.get(id=key_id)
if not key.is_active:
if not key.is_valid:
return None, None
user, secret = key.user, str(key.secret)
after_authenticate_update_date(user, key)

View File

@@ -3,6 +3,7 @@ from django.contrib import auth
from django.http import HttpResponseRedirect
from django.urls import reverse
from django.utils.http import urlencode
from django.utils.translation import gettext_lazy as _
from django.views import View
from authentication.backends.base import BaseAuthCallbackClientView
@@ -61,6 +62,10 @@ class OAuth2AuthCallbackView(View, FlashMessageMixin):
return HttpResponseRedirect(
settings.AUTH_OAUTH2_AUTHENTICATION_REDIRECT_URI
)
else:
if getattr(request, 'error_message', ''):
response = self.get_failed_response('/', title=_('OAuth2 Error'), msg=request.error_message)
return response
logger.debug(log_prompt.format('Redirect'))
redirect_url = settings.AUTH_OAUTH2_PROVIDER_END_SESSION_ENDPOINT or '/'

View File

@@ -134,6 +134,7 @@ class OIDCAuthCallbackView(View, FlashMessageMixin):
log_prompt = "Process GET requests [OIDCAuthCallbackView]: {}"
logger.debug(log_prompt.format('Start'))
callback_params = request.GET
error_title = _("OpenID Error")
# Retrieve the state value that was previously generated. No state means that we cannot
# authenticate the user (so a failure should be returned).
@@ -172,10 +173,9 @@ class OIDCAuthCallbackView(View, FlashMessageMixin):
try:
user = auth.authenticate(nonce=nonce, request=request, code_verifier=code_verifier)
except IntegrityError as e:
title = _("OpenID Error")
msg = _('Please check if a user with the same username or email already exists')
logger.error(e, exc_info=True)
response = self.get_failed_response('/', title, msg)
response = self.get_failed_response('/', error_title, msg)
return response
if user:
logger.debug(log_prompt.format('Login: {}'.format(user)))
@@ -194,7 +194,6 @@ class OIDCAuthCallbackView(View, FlashMessageMixin):
return HttpResponseRedirect(
next_url or settings.AUTH_OPENID_AUTHENTICATION_REDIRECT_URI
)
if 'error' in callback_params:
logger.debug(
log_prompt.format('Error in callback params: {}'.format(callback_params['error']))
@@ -205,9 +204,12 @@ class OIDCAuthCallbackView(View, FlashMessageMixin):
# OpenID Connect Provider authenticate endpoint.
logger.debug(log_prompt.format('Logout'))
auth.logout(request)
redirect_url = settings.AUTH_OPENID_AUTHENTICATION_FAILURE_REDIRECT_URI
if not user and getattr(request, 'error_message', ''):
response = self.get_failed_response(redirect_url, title=error_title, msg=request.error_message)
return response
logger.debug(log_prompt.format('Redirect'))
return HttpResponseRedirect(settings.AUTH_OPENID_AUTHENTICATION_FAILURE_REDIRECT_URI)
return HttpResponseRedirect(redirect_url)
class OIDCAuthCallbackClientView(BaseAuthCallbackClientView):

View File

@@ -252,6 +252,7 @@ class Saml2AuthCallbackView(View, PrepareRequestMixin, FlashMessageMixin):
def post(self, request):
log_prompt = "Process SAML2 POST requests: {}"
post_data = request.POST
error_title = _("SAML2 Error")
try:
saml_instance = self.init_saml_auth(request)
@@ -279,15 +280,18 @@ class Saml2AuthCallbackView(View, PrepareRequestMixin, FlashMessageMixin):
try:
user = auth.authenticate(request=request, saml_user_data=saml_user_data)
except IntegrityError as e:
title = _("SAML2 Error")
msg = _('Please check if a user with the same username or email already exists')
logger.error(e, exc_info=True)
response = self.get_failed_response('/', title, msg)
response = self.get_failed_response('/', error_title, msg)
return response
if user and user.is_valid:
logger.debug(log_prompt.format('Login: {}'.format(user)))
auth.login(self.request, user)
if not user and getattr(request, 'error_message', ''):
response = self.get_failed_response('/', title=error_title, msg=request.error_message)
return response
logger.debug(log_prompt.format('Redirect'))
redir = post_data.get('RelayState')
if not redir or len(redir) == 0:

View File

@@ -114,12 +114,12 @@ class BlockMFAError(AuthFailedNeedLogMixin, AuthFailedError):
super().__init__(username=username, request=request, ip=ip)
class BlockLoginError(AuthFailedNeedBlockMixin, AuthFailedError):
class BlockLoginError(AuthFailedNeedLogMixin, AuthFailedNeedBlockMixin, AuthFailedError):
error = 'block_login'
def __init__(self, username, ip):
def __init__(self, username, ip, request):
self.msg = const.block_user_login_msg.format(settings.SECURITY_LOGIN_LIMIT_TIME)
super().__init__(username=username, ip=ip)
super().__init__(username=username, ip=ip, request=request)
class SessionEmptyError(AuthFailedError):

View File

@@ -38,7 +38,7 @@ class BaseMFA(abc.ABC):
if not ok:
return False, msg
cache.set(cache_key, code, 60)
cache.set(cache_key, code, settings.VERIFY_CODE_TTL)
return True, msg
def is_authenticated(self):

View File

@@ -39,13 +39,14 @@ class MFAEmail(BaseMFA):
def send_challenge(self):
code = random_string(settings.SMS_CODE_LENGTH, lower=False, upper=False)
subject = '%s: %s' % (get_login_title(), _('MFA code'))
tip = _('The validity period of the verification code is {} minute').format(settings.VERIFY_CODE_TTL // 60)
context = {
'user': self.user, 'title': subject, 'code': code,
'user': self.user, 'title': subject, 'code': code, 'tip': tip,
}
message = render_to_string('authentication/_msg_mfa_email_code.html', context)
content = {'subject': subject, 'message': message}
sender_util = SendAndVerifyCodeUtil(
self.user.email, code=code, backend=self.name, timeout=60, **content
self.user.email, code=code, backend=self.name, **content
)
sender_util.gen_and_send_async()

View File

@@ -1,6 +1,7 @@
# -*- coding: utf-8 -*-
#
import inspect
import threading
import time
import uuid
from functools import partial
@@ -12,6 +13,7 @@ from django.contrib.auth import (
BACKEND_SESSION_KEY, load_backend,
PermissionDenied, user_login_failed, _clean_credentials,
)
from django.contrib.auth import get_user_model
from django.core.cache import cache
from django.core.exceptions import ImproperlyConfigured
from django.shortcuts import reverse, redirect, get_object_or_404
@@ -46,6 +48,10 @@ def _get_backends(return_tuples=False):
return backends
class OnlyAllowExistUserAuthError(Exception):
pass
auth._get_backends = _get_backends
@@ -54,6 +60,24 @@ def authenticate(request=None, **credentials):
If the given credentials are valid, return a User object.
The reason authenticate is hacked here
"""
UserModel = get_user_model()
original_get_or_create = UserModel.objects.get_or_create
thread_local = threading.local()
thread_local.thread_id = threading.get_ident()
def custom_get_or_create(self, *args, **kwargs):
logger.debug(f"get_or_create: thread_id={threading.get_ident()}, username={username}")
if threading.get_ident() != thread_local.thread_id or not settings.ONLY_ALLOW_EXIST_USER_AUTH:
return original_get_or_create(*args, **kwargs)
create_username = kwargs.get('username')
try:
UserModel.objects.get(username=create_username)
except UserModel.DoesNotExist:
raise OnlyAllowExistUserAuthError
return original_get_or_create(*args, **kwargs)
username = credentials.get('username')
temp_user = None
@@ -71,10 +95,19 @@ def authenticate(request=None, **credentials):
# This backend doesn't accept these credentials as arguments. Try the next one.
continue
try:
UserModel.objects.get_or_create = custom_get_or_create.__get__(UserModel.objects)
user = backend.authenticate(request, **credentials)
except PermissionDenied:
# This backend says to stop in our tracks - this user should not be allowed in at all.
break
except OnlyAllowExistUserAuthError:
request.error_message = _(
'''The administrator has enabled "Only allow existing users to log in",
and the current user is not in the user list. Please contact the administrator.'''
)
continue
finally:
UserModel.objects.get_or_create = original_get_or_create
if user is None:
continue
@@ -176,9 +209,9 @@ class AuthPreCheckMixin:
if not is_block:
return
logger.warning('Ip was blocked' + ': ' + username + ':' + ip)
exception = errors.BlockLoginError(username=username, ip=ip)
exception = errors.BlockLoginError(username=username, ip=ip, request=self.request)
if raise_exception:
raise errors.BlockLoginError(username=username, ip=ip)
raise exception
else:
return exception

View File
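
The authenticate() hunk above temporarily replaces UserModel.objects.get_or_create inside a try/finally so that, with ONLY_ALLOW_EXIST_USER_AUTH enabled, SSO backends cannot silently auto-create unknown users. Reduced to its core and stripped of the Django specifics, the pattern looks like the hedged sketch below (Registry and every name in it are illustrative only; the real hunk also compares thread ids):

    class Registry:
        def __init__(self):
            self.users = {'alice'}

        def get_or_create(self, username):
            created = username not in self.users
            self.users.add(username)
            return username, created

    def refuse_unknown(registry, original):
        # Wrap get_or_create so it refuses to create users that do not already exist.
        def wrapper(username):
            if username not in registry.users:
                raise LookupError('auto-creating users is disabled')
            return original(username)
        return wrapper

    registry = Registry()
    original = registry.get_or_create
    registry.get_or_create = refuse_unknown(registry, original)
    try:
        registry.get_or_create('alice')   # existing user: allowed
        registry.get_or_create('bob')     # unknown user: raises LookupError
    except LookupError:
        pass
    finally:
        registry.get_or_create = original  # always restore, mirroring the finally: in the hunk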

@@ -25,6 +25,10 @@ class AccessKey(models.Model):
date_last_used = models.DateTimeField(null=True, blank=True, verbose_name=_('Date last used'))
date_created = models.DateTimeField(auto_now_add=True)
@property
def is_valid(self):
return self.is_active and self.user.is_valid
def get_id(self):
return str(self.id)

View File

@@ -338,6 +338,18 @@ class ConnectionToken(JMSOrgBaseModel):
acls = CommandFilterACL.filter_queryset(**kwargs).valid()
return acls
@lazyproperty
def data_masking_rules(self):
from acls.models import DataMaskingRule
kwargs = {
'user': self.user,
'asset': self.asset,
'account': self.account_object,
}
with tmp_to_org(self.asset.org_id):
rules = DataMaskingRule.filter_queryset(**kwargs).valid()
return rules
class SuperConnectionToken(ConnectionToken):
_type = ConnectionTokenType.SUPER

View File

@@ -1,14 +1,24 @@
from django.template.loader import render_to_string
from django.utils.translation import gettext as _
from django.utils.translation import gettext_lazy as _
from common.utils import get_logger
from common.utils.timezone import local_now_display
from common.views.template import custom_render_to_string
from notifications.notifications import UserMessage
logger = get_logger(__file__)
class DifferentCityLoginMessage(UserMessage):
subject = _('Different city login reminder')
template_name = 'authentication/_msg_different_city.html'
contexts = [
{"name": "city", "label": _('Login city'), "default": "Shanghai"},
{"name": "username", "label": _('User'), "default": "john"},
{"name": "name", "label": _('Name'), "default": "John"},
{"name": "ip", "label": "IP", "default": "192.168.1.1"},
{"name": "time", "label": _('Login Date'), "default": "2025-01-01 12:00:00"},
]
def __init__(self, user, ip, city):
self.ip = ip
self.city = city
@@ -16,18 +26,16 @@ class DifferentCityLoginMessage(UserMessage):
def get_html_msg(self) -> dict:
now = local_now_display()
subject = _('Different city login reminder')
context = dict(
subject=subject,
name=self.user.name,
username=self.user.username,
ip=self.ip,
time=now,
city=self.city,
)
message = render_to_string('authentication/_msg_different_city.html', context)
message = custom_render_to_string(self.template_name, context)
return {
'subject': subject,
'subject': str(self.subject),
'message': message
}
@@ -41,6 +49,16 @@ class DifferentCityLoginMessage(UserMessage):
class OAuthBindMessage(UserMessage):
subject = _('OAuth binding reminder')
template_name = 'authentication/_msg_oauth_bind.html'
contexts = [
{"name": "username", "label": _('User'), "default": "john"},
{"name": "name", "label": _('Name'), "default": "John"},
{"name": "ip", "label": "IP", "default": "192.168.1.1"},
{"name": "oauth_name", "label": _('OAuth name'), "default": "WeCom"},
{"name": "oauth_id", "label": _('OAuth ID'), "default": "000001"},
]
def __init__(self, user, ip, oauth_name, oauth_id):
super().__init__(user)
self.ip = ip
@@ -51,7 +69,6 @@ class OAuthBindMessage(UserMessage):
now = local_now_display()
subject = self.oauth_name + ' ' + _('binding reminder')
context = dict(
subject=subject,
name=self.user.name,
username=self.user.username,
ip=self.ip,
@@ -59,9 +76,9 @@ class OAuthBindMessage(UserMessage):
oauth_name=self.oauth_name,
oauth_id=self.oauth_id
)
message = render_to_string('authentication/_msg_oauth_bind.html', context)
message = custom_render_to_string(self.template_name, context)
return {
'subject': subject,
'subject': str(subject),
'message': message
}

View File

@@ -3,7 +3,7 @@ from rest_framework import serializers
from accounts.const import SecretType
from accounts.models import Account
from acls.models import CommandGroup, CommandFilterACL
from acls.models import CommandGroup, CommandFilterACL, DataMaskingRule
from assets.models import Asset, Platform, Gateway, Zone
from assets.serializers.asset import AssetProtocolsSerializer
from assets.serializers.platform import PlatformSerializer
@@ -83,6 +83,14 @@ class _ConnectionTokenGatewaySerializer(serializers.ModelSerializer):
]
class _ConnectionTokenDataMaskingRuleSerializer(serializers.ModelSerializer):
class Meta:
model = DataMaskingRule
fields = ['id', 'name', 'fields_pattern',
'masking_method', 'mask_pattern',
'is_active', 'priority']
class _ConnectionTokenCommandFilterACLSerializer(serializers.ModelSerializer):
command_groups = ObjectRelatedField(
many=True, required=False, queryset=CommandGroup.objects,
@@ -105,7 +113,7 @@ class _ConnectionTokenPlatformSerializer(PlatformSerializer):
class Meta(PlatformSerializer.Meta):
model = Platform
fields = [field for field in PlatformSerializer.Meta.fields
if field not in PlatformSerializer.Meta.fields_m2m]
if field not in PlatformSerializer.Meta.fields_m2m]
def get_field_names(self, declared_fields, info):
names = super().get_field_names(declared_fields, info)
@@ -139,6 +147,7 @@ class ConnectionTokenSecretSerializer(OrgResourceModelSerializerMixin):
platform = _ConnectionTokenPlatformSerializer(read_only=True)
zone = ObjectRelatedField(queryset=Zone.objects, required=False, label=_('Domain'))
command_filter_acls = _ConnectionTokenCommandFilterACLSerializer(read_only=True, many=True)
data_masking_rules = _ConnectionTokenDataMaskingRuleSerializer(read_only=True, many=True)
expire_now = serializers.BooleanField(label=_('Expired now'), write_only=True, default=True)
connect_method = _ConnectTokenConnectMethodSerializer(read_only=True, source='connect_method_object')
connect_options = serializers.JSONField(read_only=True)
@@ -149,7 +158,7 @@ class ConnectionTokenSecretSerializer(OrgResourceModelSerializerMixin):
model = ConnectionToken
fields = [
'id', 'value', 'user', 'asset', 'account',
'platform', 'command_filter_acls', 'protocol',
'platform', 'command_filter_acls', 'data_masking_rules', 'protocol',
'zone', 'gateway', 'actions', 'expire_at',
'from_ticket', 'expire_now', 'connect_method',
'connect_options', 'face_monitor_token'

View File

@@ -12,7 +12,7 @@
<td style="height: 50px;">{% trans 'MFA code' %}: <span style="font-weight: bold;">{{ code }}</span></td>
</tr>
<tr style="border: 1px solid #eee">
<td style="height: 30px;">{% trans 'The validity period of the verification code is one minute' %}</td>
<td style="height: 30px;">{{ tip }}</td>
</tr>
</table>
</div>

View File

@@ -11,8 +11,6 @@
<b>{% trans 'Time' %}:</b> {{ time }}<br>
<b>{% trans 'IP' %}:</b> {{ ip }}
</p>
-
<p>
{% trans 'If the operation is not your own, unbind and change the password.' %}
</p>

View File

@@ -6,12 +6,12 @@
{% trans 'Please click the link below to reset your password, if not your request, concern your account security' %}
<br>
<br>
<a href="{{ rest_password_url }}?token={{ rest_password_token}}" class='showLink' target="_blank">
<a href="{{ rest_password_url }}?token={{ rest_password_token }}" class='showLink' target="_blank">
{% trans 'Click here reset password' %}
</a>
</p>
<br>
<p>
{% trans 'This link is valid for 1 hour. After it expires' %}
<a href="{{ forget_password_url }}?email={{ user.email }}">{% trans 'request new one' %}</a>
<a href="{{ forget_password_url }}?email={{ email }}">{% trans 'request new one' %}</a>
</p>

View File

@@ -15,7 +15,7 @@
<td style="height: 30px;"> {% trans 'Copy the verification code to the Reset Password page to reset the password.' %} </td>
</tr>
<tr style="border: 1px solid #eee">
<td style="height: 30px;">{% trans 'The validity period of the verification code is one minute' %}</td>
<td style="height: 30px;">{{ tip }}</td>
</tr>
</table>
</div>

View File

@@ -15,7 +15,7 @@ from common.utils import get_logger
from common.utils.common import get_request_ip
from common.utils.django import reverse, get_object_or_none
from users.models import User
from users.signal_handlers import check_only_allow_exist_user_auth, bind_user_to_org_role
from users.signal_handlers import bind_user_to_org_role, check_only_allow_exist_user_auth
from .mixins import FlashMessageMixin
logger = get_logger(__file__)
@@ -55,7 +55,6 @@ class BaseLoginCallbackView(AuthMixin, FlashMessageMixin, IMClientMixin, View):
)
if not check_only_allow_exist_user_auth(create):
user.delete()
return user, (self.msg_client_err, self.request.error_message)
setattr(user, f'{self.user_type}_id', user_id)

View File

@@ -1,9 +1,11 @@
# -*- coding: utf-8 -*-
#
from django.conf import settings
from typing import Callable
from django.utils.translation import gettext as _
from rest_framework.decorators import action
from rest_framework.throttling import UserRateThrottle
from rest_framework.request import Request
from rest_framework.response import Response
@@ -14,8 +16,12 @@ from orgs.utils import current_org
__all__ = ['SuggestionMixin', 'RenderToJsonMixin']
class CustomUserRateThrottle(UserRateThrottle):
rate = '60/m'
class SuggestionMixin:
suggestion_limit = 10
suggestion_limit = settings.SUGGESTION_LIMIT
filter_queryset: Callable
get_queryset: Callable
@@ -35,6 +41,7 @@ class SuggestionMixin:
queryset = queryset.none()
queryset = self.filter_queryset(queryset)
queryset = queryset[:self.suggestion_limit]
page = self.paginate_queryset(queryset)
@@ -45,6 +52,11 @@ class SuggestionMixin:
serializer = self.get_serializer(queryset, many=True)
return Response(serializer.data)
def get_throttles(self):
if self.action == 'match':
return [CustomUserRateThrottle()]
return super().get_throttles()
class RenderToJsonMixin:
@action(methods=[POST, PUT], detail=False, url_path='render-to-json')

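A self-contained sketch of the per-action throttling idea added above (a dedicated rate limit only for the suggestion "match" action); the class names and rate are illustrative, assuming a standard DRF viewset.

# Sketch: return a dedicated throttle only for the "match" action so all other
# actions keep the view's default throttles. Assumes DRF is installed.
from rest_framework.throttling import UserRateThrottle
from rest_framework.viewsets import ModelViewSet

class MatchRateThrottle(UserRateThrottle):
    rate = '60/m'  # at most 60 match requests per user per minute

class ExampleSuggestionViewSet(ModelViewSet):  # illustrative viewset
    def get_throttles(self):
        if getattr(self, 'action', None) == 'match':
            return [MatchRateThrottle()]
        return super().get_throttles()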
View File

@@ -5,6 +5,7 @@ from contextlib import nullcontext
from itertools import chain
from typing import Callable
from django.conf import settings
from django.db import models
from django.db.models.signals import m2m_changed
from rest_framework.request import Request
@@ -16,6 +17,7 @@ from common.drf.filters import (
IDNotFilterBackend, NotOrRelFilterBackend, LabelFilterBackend
)
from common.utils import get_logger, lazyproperty
from common.utils import is_uuid
from orgs.utils import tmp_to_org, tmp_to_root_org
from .action import RenderToJsonMixin
from .serializer import SerializerMixin
@@ -95,9 +97,33 @@ class QuerySetMixin:
request: Request
get_serializer_class: Callable
get_queryset: Callable
slug_field = 'name'
def get_queryset(self):
return super().get_queryset()
def get_object(self):
pk = self.kwargs.get(self.lookup_field)
if not pk or is_uuid(pk) or pk.isdigit():
return super().get_object()
return self.get_queryset().get(**{self.slug_field: pk})
def limit_queryset_if_no_page(self, queryset):
if self.request.query_params.get('format') in ['csv', 'xlsx']:
return queryset
action = getattr(self, 'action', None)
if action != 'list':
return queryset
# If the paginator already has a limit set, don't cap the queryset
if self.paginator and self.paginator.get_limit(self.request):
return queryset
# If the view opts out via page_no_limit, don't cap it either
if getattr(self, 'page_no_limit', False):
return queryset
if not settings.DEFAULT_PAGE_SIZE:
return queryset
return queryset[:settings.DEFAULT_PAGE_SIZE]
def filter_queryset(self, queryset):
queryset = super().filter_queryset(queryset)
@@ -106,6 +132,7 @@ class QuerySetMixin:
if self.action == 'metadata':
queryset = queryset.none()
queryset = self.setup_eager_loading(queryset)
queryset = self.limit_queryset_if_no_page(queryset)
return queryset
def setup_eager_loading(self, queryset, is_paginated=False):

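A standalone sketch of the unpaginated-list cap introduced above, with plain Python stand-ins for the request and paginator objects; the function name and DEFAULT_PAGE_SIZE value are assumptions, not the project's settings.

# Sketch: only the plain "list" action with no explicit limit and no export
# format gets truncated to the default page size.
DEFAULT_PAGE_SIZE = 100  # assumed setting value

def limit_if_no_page(items, action, query_params, paginator_limit, page_no_limit=False):
    if query_params.get('format') in ('csv', 'xlsx'):
        return items                      # exports need the full set
    if action != 'list':
        return items                      # detail/other actions untouched
    if paginator_limit:
        return items                      # client asked for an explicit limit
    if page_no_limit or not DEFAULT_PAGE_SIZE:
        return items                      # view opted out, or feature disabled
    return items[:DEFAULT_PAGE_SIZE]

# Example: an unpaginated list request is capped at DEFAULT_PAGE_SIZE rows.
rows = list(range(500))
assert len(limit_if_no_page(rows, 'list', {}, paginator_limit=None)) == 100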
View File

@@ -77,6 +77,7 @@ class Language(models.TextChoices):
es = 'es', 'Español'
ru = 'ru', 'Русский'
ko = 'ko', '한국어'
vi = 'vi', 'Tiếng Việt'
@classmethod
def get_code_mapper(cls):

View File

@@ -1,5 +1,6 @@
# -*- coding: utf-8 -*-
#
from rest_framework.filters import SearchFilter as SearchFilterBase
import base64
import json
import logging
@@ -35,6 +36,14 @@ __all__ = [
]
class SearchFilter(SearchFilterBase):
def get_search_terms(self, request):
params = request.query_params.get(self.search_param, '') or request.query_params.get('search', '')
params = params.replace('\x00', '') # strip null characters
params = params.replace(',', ' ')
return params.split()
class BaseFilterSet(drf_filters.FilterSet):
days = drf_filters.NumberFilter(method="filter_days")
days__lt = drf_filters.NumberFilter(method="filter_days")

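A tiny standalone sketch of the search-term parsing in the get_search_terms override above: NUL bytes are stripped and commas behave like whitespace separators; the helper name is illustrative.

def parse_search_terms(raw: str) -> list:
    raw = (raw or '').replace('\x00', '')  # strip null characters
    raw = raw.replace(',', ' ')            # treat commas as separators
    return raw.split()

# "web01,web02 prod" -> ['web01', 'web02', 'prod']
print(parse_search_terms('web01,web02 prod'))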
View File

@@ -1,9 +1,13 @@
import re
import uuid
import time
from django.conf import settings
from django.core.management.base import BaseCommand
from django.test import Client
from django.urls import URLPattern, URLResolver
from django.contrib.auth import get_user_model
from django.contrib.auth.models import AnonymousUser
from jumpserver.urls import api_v1
@@ -85,50 +89,262 @@ known_error_urls = [
'/api/v1/terminal/sessions/00000000-0000-0000-0000-000000000000/replay/download/',
]
# API whitelist - APIs that a normal user may access
user_accessible_urls = known_unauth_urls + [
# Add more APIs that normal users may access here
"/api/v1/settings/public/",
"/api/v1/users/profile/",
"/api/v1/users/change-password/",
"/api/v1/users/logout/",
"/api/v1/settings/chatai-prompts/",
"/api/v1/authentication/confirm/",
"/api/v1/users/connection-token/",
"/api/v1/authentication/temp-tokens/",
"/api/v1/notifications/backends/",
"/api/v1/authentication/passkeys/",
"/api/v1/orgs/orgs/current/",
"/api/v1/tickets/apply-asset-tickets/",
"/api/v1/ops/celery/task/00000000-0000-0000-0000-000000000000/task-execution/00000000-0000-0000-0000-000000000000/log/",
"/api/v1/assets/favorite-assets/",
"/api/v1/authentication/connection-token/",
"/api/v1/ops/jobs/",
"/api/v1/assets/categories/",
"/api/v1/tickets/tickets/",
"/api/v1/authentication/ssh-key/",
"/api/v1/terminal/my-sessions/",
"/api/v1/authentication/access-keys/",
"/api/v1/users/profile/permissions/",
"/api/v1/tickets/apply-login-asset-tickets/",
"/api/v1/resources/",
"/api/v1/ops/celery/task/00000000-0000-0000-0000-000000000000/task-execution/00000000-0000-0000-0000-000000000000/result/",
"/api/v1/notifications/site-messages/",
"/api/v1/notifications/site-messages/unread-total/",
"/api/v1/assets/assets/suggestions/",
"/api/v1/search/",
"/api/v1/notifications/user-msg-subscription/",
"/api/v1/ops/ansible/job-execution/00000000-0000-0000-0000-000000000000/log/",
"/api/v1/tickets/apply-login-tickets/",
"/api/v1/ops/variables/form-data/",
"/api/v1/ops/variables/help/",
"/api/v1/users/profile/password/",
"/api/v1/tickets/apply-command-tickets/",
"/api/v1/ops/job-executions/",
"/api/v1/audits/my-login-logs/",
"/api/v1/terminal/components/connect-methods/"
"/api/v1/ops/task-executions/",
"/api/v1/terminal/sessions/online-info/",
"/api/v1/ops/adhocs/",
"/api/v1/tickets/apply-nodes/suggestions/",
"/api/v1/tickets/apply-assets/suggestions/",
"/api/v1/settings/server-info/",
"/api/v1/ops/playbooks/",
"/api/v1/assets/categories/types/",
"/api/v1/assets/protocols/",
"/api/v1/common/countries/",
"/api/v1/audits/jobs/",
"/api/v1/terminal/components/connect-methods/",
"/api/v1/ops/task-executions/",
]
errors = {}
class Command(BaseCommand):
help = 'Check api if unauthorized'
"""
Check API authorization and user access permissions.
This command performs two types of checks:
1. Anonymous access check - finds APIs that can be accessed without authentication
2. User access check - finds APIs that can be accessed by a normal user
The functionality is split into two methods:
- check_anonymous_access(): Checks for APIs accessible without authentication
- check_user_access(): Checks for APIs accessible by a normal user
Usage examples:
# Check both anonymous and user access (default behavior)
python manage.py check_api
# Check only anonymous access
python manage.py check_api --skip-user-check
# Check only user access
python manage.py check_api --skip-anonymous-check
# Check user access and update whitelist
python manage.py check_api --update-whitelist
"""
help = 'Check API authorization and user access permissions'
password = uuid.uuid4().hex
unauth_urls = []
error_urls = []
unformat_urls = []
# APIs the user can access that are not in the whitelist
unexpected_access = []
def handle(self, *args, **options):
settings.LOG_LEVEL = 'ERROR'
urls = get_api_urls()
def add_arguments(self, parser):
parser.add_argument(
'--skip-anonymous-check',
action='store_true',
help='Skip anonymous access check (only check user access)',
)
parser.add_argument(
'--skip-user-check',
action='store_true',
help='Skip user access check (only check anonymous access)',
)
parser.add_argument(
'--update-whitelist',
action='store_true',
help='Update the user accessible URLs whitelist based on current scan results',
)
def create_test_user(self):
"""创建测试用户"""
User = get_user_model()
username = 'test_user_api_check'
email = 'test@example.com'
# Remove any existing test user
User.objects.filter(username=username).delete()
# Create a fresh test user
user = User.objects.create_user(
username=username,
email=email,
password=self.password,
is_active=True
)
return user
def check_user_api_access(self, urls):
"""检查普通用户可以访问的 API"""
user = self.create_test_user()
client = Client()
client.defaults['HTTP_HOST'] = 'localhost'
unauth_urls = []
# Log the user in
login_success = client.login(username=user.username, password=self.password)
if not login_success:
self.stdout.write(
self.style.ERROR('Failed to login test user')
)
return [], []
accessible_urls = []
error_urls = []
unformat_urls = []
self.stdout.write('Checking user API access...')
for url, ourl in urls:
if '(' in url or '<' in url:
continue
try:
response = client.get(url, follow=True)
time.sleep(0.1)
# A 200 or 201 status code means the user can access it
if response.status_code in [200, 201]:
accessible_urls.append((url, ourl, response.status_code))
elif response.status_code == 403:
# 403 means insufficient permission, which is expected
pass
else:
# Other status codes may indicate errors
error_urls.append((url, ourl, response.status_code))
except Exception as e:
error_urls.append((url, ourl, str(e)))
# Clean up the test user
user.delete()
return accessible_urls, error_urls
def check_anonymous_access(self, urls):
"""检查匿名访问权限"""
client = Client()
client.defaults['HTTP_HOST'] = 'localhost'
for url, ourl in urls:
if '(' in url or '<' in url:
unformat_urls.append([url, ourl])
self.unformat_urls.append([url, ourl])
continue
try:
response = client.get(url, follow=True)
if response.status_code != 401:
errors[url] = str(response.status_code) + ' ' + str(ourl)
unauth_urls.append(url)
self.unauth_urls.append(url)
except Exception as e:
errors[url] = str(e)
error_urls.append(url)
self.error_urls.append(url)
unauth_urls = set(unauth_urls) - set(known_unauth_urls)
print("\nUnauthorized urls:")
if not unauth_urls:
self.unauth_urls = set(self.unauth_urls) - set(known_unauth_urls)
self.error_urls = set(self.error_urls)
self.unformat_urls = set(self.unformat_urls)
def print_anonymous_access_result(self):
print("\n=== Anonymous Access Check ===")
print("Unauthorized urls:")
if not self.unauth_urls:
print(" Empty, very good!")
for url in unauth_urls:
for url in self.unauth_urls:
print('"{}", {}'.format(url, errors.get(url, '')))
print("\nError urls:")
if not error_urls:
if not self.error_urls:
print(" Empty, very good!")
for url in set(error_urls):
for url in set(self.error_urls):
print(url, ': ' + errors.get(url))
print("\nUnformat urls:")
if not unformat_urls:
if not self.unformat_urls:
print(" Empty, very good!")
for url in unformat_urls:
for url in self.unformat_urls:
print(url)
def check_user_access(self, urls, update_whitelist=False):
"""检查用户访问权限"""
print("\n=== User Access Check ===")
accessible_urls, user_error_urls = self.check_user_api_access(urls)
# Check for accessible APIs that are not in the whitelist
accessible_url_list = [url for url, _, _ in accessible_urls]
unexpected_access = set(accessible_url_list) - set(user_accessible_urls)
self.unexpected_access = unexpected_access
# If the update-whitelist option is enabled
if update_whitelist:
print("\n=== Updating Whitelist ===")
new_whitelist = sorted(set(user_accessible_urls + accessible_url_list))
print("Updated whitelist would include:")
for url in new_whitelist:
print(f' "{url}",')
print(f"\nTotal URLs in whitelist: {len(new_whitelist)}")
def print_user_access_result(self):
print("\n=== User Access Check ===")
print("User unexpected urls:")
if self.unexpected_access:
print(f" Error: Found {len(self.unexpected_access)} URLs accessible by user but not in whitelist:")
for url in self.unexpected_access:
print(f' "{url}"')
else:
print(" Empty, very good!")
def handle(self, *args, **options):
settings.LOG_LEVEL = 'ERROR'
urls = get_api_urls()
# Check anonymous access (runs by default)
if not options['skip_anonymous_check']:
self.check_anonymous_access(urls)
# Check user access (runs by default)
if not options['skip_user_check']:
self.check_user_access(urls, options['update_whitelist'])
print("\nCheck total urls: ", len(urls))
self.print_anonymous_access_result()
self.print_user_access_result()

View File

@@ -162,6 +162,7 @@ class FeiShu(RequestMixin):
except Exception as e:
logger.error(f'Get user detail error: {e} data={data}')
data.update(kwargs['other_info'] if 'other_info' in kwargs else {})
info = flatten_dict(data)
default_detail = self.default_user_detail(data, user_id)
detail = map_attributes(default_detail, info, self.attributes)

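A rough sketch of the flatten-then-map attribute resolution used above; the dotted-key behaviour of flatten_dict, the sample payload, and the attribute map are assumptions for illustration only, not the project's actual implementation.

def flatten(d, parent=''):
    # Hypothetical stand-in for flatten_dict: nested keys become dotted keys.
    out = {}
    for k, v in d.items():
        key = f'{parent}.{k}' if parent else k
        if isinstance(v, dict):
            out.update(flatten(v, key))
        else:
            out[key] = v
    return out

data = {'name': 'Alice', 'enterprise_email': 'a@example.com',
        'avatar': {'avatar_origin': 'https://example.com/a.png'}}
attributes = {'username': 'name', 'email': 'enterprise_email',
              'avatar': 'avatar.avatar_origin'}  # hypothetical mapping
info = flatten(data)
detail = {field: info.get(source) for field, source in attributes.items()}
print(detail)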
View File

@@ -207,7 +207,8 @@ class WeComTool(object):
def check_state(self, state, request=None):
return cache.get(state) == self.WECOM_STATE_VALUE or \
request.session[self.WECOM_STATE_SESSION_KEY] == state
request.session.get(self.WECOM_STATE_SESSION_KEY) == state or \
request.GET.get('state') == state  # When opened in the WeCom desktop client a new session is created, which would make the session check fail
def wrap_redirect_url(self, next_url):
params = {

View File

@@ -97,10 +97,7 @@ def send_mail_attachment_async(subject, message, recipient_list, attachment_list
for attachment in attachment_list:
email.attach_file(attachment)
os.remove(attachment)
try:
return email.send()
except Exception as e:
logger.error("Sending mail attachment error: {}".format(e))
return email.send()
@shared_task(

View File

@@ -61,8 +61,10 @@ def contains_time_period(time_periods, ctime=None):
"""
time_periods: [{"id": 1, "value": "00:00~07:30、10:00~13:00"}, {"id": 2, "value": "00:00~00:00"}]
"""
if not time_periods:
return None
if not time_periods or all(item['value'] == "" for item in time_periods):
# Handle the case [{"id":1,"value":""},{"id":2,"value":""},{"id":3,"value":""},...]
# Selecting nothing is treated as selecting everything
return True
if ctime is None:
ctime = local_now()

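A small standalone check mirroring the guard added above: when time_periods is empty, or every configured period has an empty value, it is treated as "no restriction" (select all). The helper name is illustrative.

def periods_mean_select_all(time_periods):
    return not time_periods or all(item['value'] == '' for item in time_periods)

print(periods_mean_select_all([]))                                               # True
print(periods_mean_select_all([{'id': 1, 'value': ''}, {'id': 2, 'value': ''}])) # True
print(periods_mean_select_all([{'id': 1, 'value': '00:00~07:30'}]))              # False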
View File

@@ -0,0 +1,47 @@
import logging
import os
from django.conf import settings
from django.template import Context
from django.template import Engine, TemplateSyntaxError
from django.template.loader import render_to_string
from django.utils._os import safe_join
logger = logging.getLogger(__name__)
def safe_render_to_string(template_name, context=None, request=None, using=None):
with open(template_name, encoding="utf-8") as f:
template_code = f.read()
safe_engine = Engine(
debug=False,
libraries={},  # disable custom template tag libraries
builtins=[],  # do not auto-load built-in tags
)
try:
template = safe_engine.from_string(template_code)
except TemplateSyntaxError as e:
logger.error(e)
return template_code
return template.render(Context(context or {}))
def _get_data_template_path(template_name: str):
# Stored under data/template/<original path>.html
# e.g. template_name users/_msg_x.html -> data/template/users/_msg_x.html
rel_path = template_name.replace('/', os.sep)
return safe_join(settings.DATA_DIR, 'template', rel_path)
def _get_edit_template_path(template_name: str):
return _get_data_template_path(template_name) + '.edit'
def custom_render_to_string(template_name, context=None, request=None, using=None):
# If a custom template exists, use it; otherwise fall back to the system template
custom_template = _get_data_template_path(template_name)
if os.path.exists(custom_template):
template = safe_render_to_string(custom_template, context=context, request=request, using=using)
else:
template = render_to_string(template_name, context=context, request=request, using=using)
return template

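A short usage sketch of the override lookup above: a file under data/template/<original path> wins over the bundled template. The hard-coded DATA_DIR and the template name are illustrative assumptions; the real values come from settings and the caller.

import os

DATA_DIR = '/opt/jumpserver/data'  # assumed; the real value is settings.DATA_DIR

def resolve_template(template_name: str) -> str:
    custom = os.path.join(DATA_DIR, 'template', *template_name.split('/'))
    return custom if os.path.exists(custom) else template_name

# Falls back to the bundled template when no override file exists.
print(resolve_template('users/_msg_reset_password_code.html'))  # hypothetical name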
View File

@@ -16,6 +16,7 @@ class BaseTranslateManager:
'es': 'Spanish',
'ru': 'Russian',
'ko': 'Korean',
'vi': 'Vietnamese',
}
def __init__(self, dir_path, oai_trans_instance):

apps/i18n/chen/vi.json · 85 additions · Normal file
View File

@@ -0,0 +1,85 @@
{
"ACLRejectError": "Lệnh này không được phép thực hiện",
"AffectedRows": "Trình duyệt cơ sở dữ liệu",
"AlreadyFirstPageError": "Đã là trang đầu tiên",
"AlreadyLastPageError": "Đã đến trang cuối cùng",
"Cancel": "Hủy bỏ",
"ChangeContextError": "Chuyển đổi ngữ cảnh thất bại",
"CommandReview": "Xác nhận lệnh",
"CommandReviewMessage": "Lệnh bạn nhập cần được xác nhận trước khi thực thi, có muốn gửi yêu cầu xác nhận không?",
"CommandReviewRejectBy": "Lệnh xác nhận bị %s từ chối",
"CommandReviewTimeoutError": "Thời gian xác nhận lệnh đã hết",
"CommandWarningDialogMessage": "Lệnh bạn thực hiện có nguy cơ, thông báo cảnh báo sẽ được gửi đến quản lý. Bạn có muốn tiếp tục không?< -SEP->Chạy (Ctrl + Enter)",
"Confirm": "Xác nhận",
"ConnectError": "Kết nối thất bại",
"ConnectSuccess": "Kết nối thành công",
"Connected": "Đã kết nối",
"Copy": "Sao chép",
"CopyFailed": "Sao chép thất bại",
"CopyNotAllowed": "Không được phép sao chép, hãy liên hệ với quản trị viên để mở quyền!",
"CopySucceeded": "Sao chép thành công",
"Current": "Hiện tại",
"DatabaseExplorer": "Người dùng",
"DatabaseProperties": "Thuộc tính nguồn dữ liệu",
"DownloadNotAllowed": "Không cho phép tải về, vui lòng liên hệ với quản lý để mở quyền!",
"DriverClass": "Điều khiển",
"DriverVersion": "Phiên bản lái",
"ErrorMessage": "Thông báo lỗi",
"ExecuteError": "Thực hiện thành công",
"ExecuteSuccess": "Thực hiện thành công",
"ExecutionCanceled": "Thực hiện đã bị hủy",
"ExportALL": "Xuất tất cả dữ liệu",
"ExportAll": "Xuất toàn bộ",
"ExportCurrent": "Xuất trang hiện tại",
"ExportData": "Xuất dữ liệu",
"FetchError": "Lấy dữ liệu thất bại",
"Format": "Định dạng",
"FormatHotKey": "Định dạng (Ctrl + L)",
"InitializeDatasource": "Khởi tạo nguồn dữ liệu",
"InitializeDatasourceFailed": "Khởi tạo nguồn dữ liệu thất bại",
"InitializingDatasourceMessage": "Đang khởi tạo nguồn dữ liệu, xin vui lòng chờ...",
"InsertStatement": "Câu lệnh chèn",
"JDBCURL": "JDBC URL",
"LogOutput": "Xuất nhật ký",
"Name": "Tên",
"NewQuery": "Tạo mới truy vấn",
"NoPermissionError": "Không có quyền thực hiện thao tác này",
"NumRow": "{num} dòng",
"Open": "Mở",
"OverMaxIdleTimeError": "Do phiên này quá thời gian rỗi vượt quá %d phút, đã bị đóng",
"OverMaxSessionTimeError": "Do vì cuộc trò chuyện này kéo dài hơn %d giờ, nó đã bị đóng. \nThuộc tính \nKết nối thành công \nPhiên đã kết thúc \nSao chép thất bại \nLàm mới \nSao chép thành công \nKết nối thất bại \nQuyền đã hết hạn, phiên sẽ hết hạn sau mười phút, vui lòng liên hệ với quản lý để gia hạn kịp thời \nKhông được phép dán, vui lòng liên hệ với quản lý để mở quyền! \nKhông có quyền thực hiện thao tác này \nJDBC URL \nChuyển đổi ngữ cảnh thất bại \nQuyền đã hết hạn \nChọn SQL",
"ParseError": "Phân tích thất bại",
"PasteNotAllowed": "Không cho phép dán, vui lòng liên hệ với quản trị viên để mở quyền!",
"PermissionAlreadyExpired": "Quyền đã hết hạn",
"PermissionExpiredDialogMessage": "- Quyền hạn đã hết hạn, phiên làm việc sẽ hết hạn sau mười phút, xin vui lòng liên hệ với quản lý để gia hạn.\n- Phiên làm việc đã kết thúc.\n- Không cho phép dán, xin vui lòng liên hệ với quản lý để mở quyền!\n- Thuộc tính.\n- Chuyển đổi ngữ cảnh thất bại.\n- Sao chép không thành công.\n- Kết nối thất bại.\n- Làm mới.\n- Chọn SQL.\n- Thực hiện thành công.\n- Kết nối thành công.\n- Sao chép thành công.\n- Quyền hạn đã hết hạn.\n- JDBC URL.\n- Không có quyền thực hiện thao tác này.",
"PermissionExpiredDialogTitle": "Số dòng bị ảnh hưởng",
"PermissionsExpiredOn": "Quyền liên kết với phiên này đã hết hạn vào %s",
"Properties": "Thuộc tính",
"Refresh": "Làm mới",
"Run": "Chạy",
"RunHotKey": "Quyền đã hết hạn",
"RunSelected": "Chạy đã chọn",
"Save": "Lưu",
"SaveSQL": "Lưu SQL",
"SaveSucceed": "Lưu thành công",
"Scope": "Phạm vi",
"SelectSQL": "Chọn SQL",
"SessionClosedBy": "Phiên đã bị %s đóng",
"SessionFinished": "Phiên làm việc đã kết thúc",
"SessionLockedError": "Phiên hiện tại đã bị khóa, không thể tiếp tục thực hiện lệnh",
"SessionLockedMessage": "Phiên này đã bị %s khóa, không thể tiếp tục thực hiện lệnh",
"SessionUnlockedMessage": "Phiên này đã được %s mở khóa, có thể tiếp tục thực hiện lệnh",
"ShowProperties": "Thuộc tính",
"StopHotKey": "Dừng (Ctrl + D)",
"Submit": "Gửi",
"Total": "Tổng cộng",
"Type": "Loại",
"UpdateStatement": "Câu lệnh cập nhật",
"User": "Thực hiện thất bại",
"UserCancelCommandReviewError": "Người dùng hủy lệnh duyệt lại",
"Version": "Phiên bản",
"ViewData": "Xem dữ liệu",
"WaitCommandReviewMessage": "Yêu cầu xem xét đã được gửi đi, xin vui lòng chờ kết quả xem xét.",
"Warning": "Cảnh báo",
"initializingDatasourceFailedMessage": "Kết nối thất bại, vui lòng kiểm tra cấu hình kết nối cơ sở dữ liệu có chính xác hay không."
}

Some files were not shown because too many files have changed in this diff.