Compare commits

...

186 Commits

Author SHA1 Message Date
jiangweidong
ecdc1a056a feat: Support automations for website, including ping, secret change, and account verification 2025-09-05 17:39:13 +08:00
jiangweidong
5001ca6960 feat: Support automations for website, including ping, secret change, and account verification 2025-09-05 17:37:49 +08:00
fit2bot
16461b0fa9 perf: support global search (#15961)
* perf: support global search

* perf: change serach

* perf: search model add asset permission

---------

Co-authored-by: mikebofs <mikebofs@gmail.com>
Co-authored-by: ibuler <ibuler@qq.com>
2025-09-05 16:40:18 +08:00
mikebofs
528b0ea1ba perf: change some api view default limit 2025-09-05 16:20:26 +08:00
ibuler
60f06adaa9 fix: wechat or phone decrypt err 2025-09-04 11:59:04 +08:00
Bai
7a6187b95f fix: temp token backend 2025-09-03 18:10:10 +08:00
Bai
aacaf3a174 perf: aks encrypt 2025-09-03 11:16:04 +08:00
Bai
3c9d2534fa perf: aks encrypt 2025-09-03 11:16:04 +08:00
wangruidong
4f79abe678 perf: Connect methods acl allow accept action 2025-09-03 11:00:56 +08:00
fit2bot
ae9956ff91 chore: change readme 2025-09-02 15:22:44 +08:00
Bai
429677e0ce perf: readme 2025-09-02 14:54:28 +08:00
ibuler
034ee65157 perf: decrypt secret logic 2025-09-02 10:38:10 +08:00
Eric
fdd7d9b6b1 perf: add vnc client method 2025-09-02 10:34:39 +08:00
wangruidong
db0e21f5d9 fix: Lazy import Azure and Google Cloud dependencies 2025-08-29 11:10:43 +08:00
wangruidong
468b84eb3d perf: Validate connection token id 2025-08-29 11:09:40 +08:00
ibuler
28d5475d0f perf: try to decrypt then origin value 2025-08-29 11:00:02 +08:00
ibuler
b9c60d856f perf: allow some api page no limits 2025-08-28 17:05:11 +08:00
feng
bd1d73c6dd perf: Report localtime 2025-08-28 15:39:54 +08:00
wangruidong
bf92c756d4 fix: Ensure command arguments are safely quoted in safe_run_cmd 2025-08-28 14:14:55 +08:00
feng
62ebe0d636 perf: Third login redirect url query string 2025-08-27 14:45:56 +08:00
github-actions[bot]
0b1fea8492 perf: Update Dockerfile with new base image tag 2025-08-27 11:05:19 +08:00
mikebofs
65b5f573f8 perf: change requirements 2025-08-27 11:05:19 +08:00
mikebofs
bb639e1fe7 perf: revert django-simple-history version 2025-08-27 10:43:21 +08:00
fit2bot
395b868dcf perf: swagger done (#15865)
* perf: swagger upgrade

* perf: upgrade to drf-spectacular

* perf: add some annotations

* perf: swagger done

---------

Co-authored-by: ibuler <ibuler@qq.com>
2025-08-27 10:27:01 +08:00
wangruidong
1350b774b3 perf: Improve chart rendering wait logic in export process 2025-08-26 16:20:22 +08:00
wrd
af7a00c1b1 fix: typo 2025-08-26 15:31:13 +08:00
wangruidong
965ec7007c perf: Enhance eager loading by including labels in queryset 2025-08-26 15:31:13 +08:00
fit2bot
1372fd7535 feat: asset permission support exclude some account
* perf: add perm exclude

* perf: exclude node action account

* perf: add i18n

* perf: pop exclude account

---------

Co-authored-by: mikebofs <mikebofs@gmail.com>
2025-08-26 14:57:57 +08:00
wangruidong
3b0ef4cca7 fix: Add nmap to Dockerfile dependencies 2025-08-25 16:29:10 +08:00
Aaron3S
6832abdaad feat: change some translate 2025-08-25 11:05:49 +08:00
feng
c6bf290dbb perf: Report translate 2025-08-22 18:57:14 +08:00
feng
23ab66c11a perf: Translate 2025-08-22 18:05:30 +08:00
feng
1debaa5547 perf: report perm 2025-08-22 17:53:52 +08:00
Bai
47413966c9 perf: captcha > CAPTCHA 2025-08-22 16:25:45 +08:00
Eric
703f39607c perf: default allow hosts 2025-08-22 14:12:45 +08:00
feng
b65ff0d84c perf: Translate 2025-08-21 18:52:38 +08:00
wangruidong
30d781dd12 fix: Export PDF wait for render done 2025-08-21 18:44:09 +08:00
wangruidong
9551cd4da9 fix: Export PDF with org id 2025-08-21 17:56:26 +08:00
mikebofs
87b456c941 perf: change default width 2025-08-21 16:19:56 +08:00
mikebofs
d4d5224c17 perf: support export dashboard 2025-08-21 16:19:56 +08:00
wangruidong
dabb30d90a perf: Change report name 2025-08-21 16:19:25 +08:00
feng
82192d38e1 perf: Translate 2025-08-21 15:32:04 +08:00
feng
571d2b4575 perf: Custom platform translate 2025-08-21 14:51:38 +08:00
Eric
ea64313c4e perf: fix conenct token platform fields 2025-08-21 14:03:15 +08:00
Bai
8764cdb733 feat: support protocols search 2025-08-21 11:49:18 +08:00
feng
980394efed perf: Transalte 2025-08-21 11:31:29 +08:00
wangruidong
2c94f10d64 fix: The approval setting org admin, and the approver is blank 2025-08-21 10:25:10 +08:00
wangruidong
e1c9f5180d perf: Export pdf using days parameter 2025-08-21 10:23:00 +08:00
wangruidong
3f1d7fa230 perf: Pdf file i18n 2025-08-21 10:23:00 +08:00
wangruidong
44bcd6e399 fix: Send email pdf deps 2025-08-21 10:23:00 +08:00
feng
5f87d98c31 perf: Translate 2025-08-20 18:17:46 +08:00
feng
540becdcbe perf: org admin view settings 2025-08-20 17:11:27 +08:00
feng
6929c4968e perf: Check api 2025-08-20 11:16:46 +08:00
Aaron3S
63b213d3a8 feat: add translate 2025-08-19 19:19:23 +08:00
feng
64fe7a55ec perf: Mongodb ping 2025-08-19 19:08:52 +08:00
feng
27829e09ef perf: Translate 2025-08-19 18:57:23 +08:00
jiangweidong
1bfc7daef6 perf: Avoid Oracle password modification SQL injection risks 2025-08-19 18:55:46 +08:00
Bai
9422aebc5e perf: email i18n 2025-08-19 18:49:25 +08:00
wangruidong
8c0cd20b48 fix: Disable passkey mfa in safe mode 2025-08-19 18:21:33 +08:00
Bai
0c612648a0 perf: email protocol rename 2025-08-19 17:04:32 +08:00
feng
36e01a316c perf: Regular command groups can be filled in with new lines 2025-08-19 15:51:39 +08:00
feng
e1b96e01eb perf: Translate 2025-08-19 15:05:13 +08:00
wangruidong
144f4b4466 fix: Virtual apps manifest i18n 2025-08-19 14:54:03 +08:00
wangruidong
8e007004c2 perf: Translate label for groups parameter 2025-08-19 14:51:52 +08:00
github-actions[bot]
c14f740209 perf: Update Dockerfile with new base image tag 2025-08-19 14:50:45 +08:00
Eric
13a85f062c perf: fix uv pip resolution 2025-08-19 14:50:45 +08:00
fit2bot
7f9d027bd3 perf: Send command translate (#15820)
Co-authored-by: wangruidong <940853815@qq.com>
Co-authored-by: Bryan <jiangjie.bai@fit2cloud.com>
2025-08-18 19:14:48 +08:00
wangruidong
c037ce1c29 perf: Send report email 2025-08-18 19:12:29 +08:00
wangruidong
ee7c6b4708 fix: Init db error 2025-08-18 19:11:59 +08:00
feng
d0e625e322 perf: Translate 2025-08-18 19:08:34 +08:00
feng
c65794a99d perf: KOKO translate 2025-08-18 18:39:42 +08:00
Eric
1e4bca6e24 perf: add lion i18n 2025-08-18 18:28:22 +08:00
feng
c1c5025fbb perf: Account automation report 2025-08-18 17:40:49 +08:00
Eric
96020fa6b4 perf: add lion i18n 2025-08-18 11:42:33 +08:00
wangruidong
5ad6f87a9e fix: Docker build error 2025-08-18 10:53:33 +08:00
feng
9b0c73c9f9 perf: translate 2025-08-15 18:57:46 +08:00
wangruidong
c029714ffd fix: Export pdf failed 2025-08-15 17:42:48 +08:00
wangruidong
c1e8a1b561 fix: Install export pdf deps 2025-08-15 17:42:48 +08:00
feng
21126de2c1 perf: get_cpu_model_count 2025-08-15 16:45:39 +08:00
feng
7d06819bbe perf: foot_js 2025-08-15 16:35:43 +08:00
Eric
92b20fe2ef perf: add lion i18n 2025-08-15 16:24:18 +08:00
feng
4326d35065 perf: User report 2025-08-14 18:55:15 +08:00
feng
4810eae725 perf: group_stats 2025-08-14 16:09:43 +08:00
fit2bot
24f7946b7b perf: change some field to encrypt field (#15842)
* perf: conn token add remote addr

* perf: change some field to encrypt field

---------

Co-authored-by: ibuler <ibuler@qq.com>
2025-08-14 15:05:18 +08:00
王晓阳
4b9c4a550e feat: support vastbase 2025-08-14 14:31:31 +08:00
feng
d3ec23ba85 perf: group_stats 2025-08-14 11:45:36 +08:00
feng
e3c33bca32 perf: User report 2025-08-14 11:12:58 +08:00
feng
0fb7e84678 perf: user asset account report 2025-08-13 18:51:08 +08:00
feng
ab30bfb2d2 perf: mysql pg playbook 2025-08-13 15:15:53 +08:00
feng
d9d034488f fix: report 2025-08-12 19:19:00 +08:00
feng
24bd7b7e1a fix rbac pam 2025-08-12 14:48:16 +08:00
wangruidong
7fb5fd3956 fix: set ansible_timeout for account connectivity tasks 2025-08-11 10:37:23 +08:00
feng
9c621f5ff5 perf: rbac pam 2025-08-08 13:52:38 +08:00
feng
ac8998b9ee perf: Account risk delete normal account 2025-08-06 17:02:53 +08:00
wangruidong
b258537890 fix: Fallback to browser language if user language is not set 2025-08-06 14:15:30 +08:00
fit2bot
b38d83c578 feat: report charts (#15630)
* perf: initial

* perf: basic finished

* perf: depend

* perf: Update Dockerfile with new base image tag

* perf: Add user report api

* perf: Update Dockerfile with new base image tag

* perf: Use user report api

* perf: Update Dockerfile with new base image tag

* perf: user login report

* perf: Update Dockerfile with new base image tag

* perf: user change password

* perf: change password dashboard

* perf: Update Dockerfile with new base image tag

* perf: Translate

* perf: asset api

* perf: asset activity

* perf: Asset report

* perf: add charts_map

* perf: account report

* perf: Translate

* perf: account automation

* perf: Account automation

* perf: title

* perf: Update Dockerfile with new base image tag

---------

Co-authored-by: ibuler <ibuler@qq.com>
Co-authored-by: feng <1304903146@qq.com>
Co-authored-by: github-actions[bot] <github-actions[bot]@users.noreply.github.com>
Co-authored-by: wangruidong <940853815@qq.com>
Co-authored-by: feng626 <57284900+feng626@users.noreply.github.com>
2025-08-06 14:05:38 +08:00
feng
257f290d18 perf: Translate 2025-08-06 11:33:52 +08:00
wangruidong
d185be2180 perf: Optimize redis connection number 2025-08-04 18:53:34 +08:00
ibuler
4e33b5b478 perf: some risk example file path 2025-08-01 10:35:15 +08:00
wangruidong
1406437d4e fix: Failed to switch languages 2025-08-01 10:24:17 +08:00
feng
e46aa95980 perf: check_asset_permission_will_expired filter is_active=True 2025-08-01 10:18:52 +08:00
Eric
c619a35a04 perf: update lion i18n tip 2025-08-01 10:18:12 +08:00
wangruidong
29f10bf10e perf: ES connect error detail 2025-07-31 17:15:55 +08:00
wangruidong
a822905ae7 fix: When the cas user doesn't exist, you will be prompted with an error when logging in. 2025-07-31 17:15:18 +08:00
zhaojisen
dc5a743f4f revert style 2025-07-30 14:27:52 +08:00
zhaojisen
1de8781704 Fixed: Fix the issue with the login page footer 2025-07-30 14:27:52 +08:00
wangruidong
f3d9f4c446 fix: Failed to switch languages 2025-07-29 16:40:30 +08:00
jiangweidong
6b5d5c15ae feat: Add an embedded form to ChatAI 2025-07-29 14:15:01 +08:00
feng
1074a0df19 perf: MFA coce reuse 2025-07-29 11:00:39 +08:00
Eric
04dca794dd fix: fix chrome_app password_manager dialog 2025-07-29 10:21:46 +08:00
ibuler
14e0396508 perf: change ip db path 2025-07-29 10:20:37 +08:00
wangruidong
835eb2e3d0 perf: Improve error handling for email sending in tasks 2025-07-28 10:30:42 +08:00
ibuler
be24f28d9b perf: in safe mode passkey cannot be as mfa 2025-07-25 10:50:46 +08:00
wangruidong
26cea550c4 fix: The applet list is not translated. 2025-07-25 10:49:47 +08:00
wangruidong
36ae076cb0 fix: Open redirect security vulnerability 2025-07-24 15:50:05 +08:00
feng
51c5294fb4 perf: Ticket filter org 2025-07-24 14:36:15 +08:00
feng
da083fffa3 perf: Translate email help text 2025-07-24 14:35:21 +08:00
feng
1df04d2a94 perf: Pam rbac 2025-07-23 10:21:38 +08:00
Eric
299e52cd11 perf: vnc_guide method only by xpack 2025-07-22 14:37:38 +08:00
feng
38b268b104 fix: Circular import 2025-07-22 14:36:22 +08:00
wangruidong
6095e9c9bd perf: Modify the layout to flex 2025-07-22 14:35:05 +08:00
ibuler
c4a348aac6 perf: remove client redirect api 2025-07-22 14:34:11 +08:00
feng
75575af56f perf: Callback client 2025-07-22 13:51:08 +08:00
feng
8f91cb1473 perf: Translate 2025-07-17 15:12:01 +08:00
feng
b72e8eba7c perf: Change the secret and retry in batches 2025-07-17 14:21:31 +08:00
feng
d1d6f3fe9c perf: string_punctuation remove > ^ 2025-07-17 14:02:19 +08:00
wangruidong
6095c9865f fix: Action tips translate 2025-07-17 11:48:28 +08:00
wangruidong
6c374cb41f fix: View replay generate multiple operation logs 2025-07-17 11:24:16 +08:00
Eric
df64145adc perf: lion i18n 2025-07-16 19:49:07 +08:00
ibuler
44d77ba03f perf: random password exclude some char 2025-07-16 19:35:04 +08:00
wangruidong
3af188492f fix: Gather account failed 2025-07-16 19:19:43 +08:00
feng
9e798cd0b6 perf: Translate and tools version 2025-07-16 17:43:35 +08:00
feng
4d22c0722b fix: Exclude special char failed 2025-07-16 16:10:17 +08:00
Eric
e6a1662780 perf: add lion i18n 2025-07-15 18:58:42 +08:00
wangruidong
cc4be36752 perf: Log IntegrityError details during user authentication 2025-07-15 18:58:16 +08:00
wangruidong
e1f5d3c737 fix: Delete user failed(DoesNotExist) when user create share session 2025-07-15 18:43:43 +08:00
wangruidong
c0adc1fe74 fix: Gather account error 2025-07-15 18:43:14 +08:00
feng
613715135b perf: Translate 2025-07-15 11:46:39 +08:00
Eric
fe1d5f9828 perf: add en i18n 2025-07-11 15:34:24 +08:00
Eric
1d375e15c5 perf: add i18n keys 2025-07-11 15:34:24 +08:00
Eric
ac21d260ea perf: add lion i18n 2025-07-11 15:34:24 +08:00
wangruidong
accde77307 fix: Add third party login check is block 2025-07-11 15:33:48 +08:00
ibuler
c7dcf1ba59 perf: playbook task db save if conn timeout 2025-07-11 11:00:20 +08:00
wangruidong
b564bbebb3 perf: Translate 2025-07-11 10:30:40 +08:00
Eric
9440c855f4 perf: add lion i18n 2025-07-10 12:50:11 +08:00
w940853815
f282b2079e Update comment 2025-07-10 11:39:37 +08:00
wangruidong
1790cd8345 fix: Add additional third-party authentication backends and adjust MFA check 2025-07-10 11:39:37 +08:00
ibuler
7da74dc6e8 fix: integrate with azure oidc 2025-07-10 11:33:41 +08:00
Ewall555
33b0068f49 feat: exclude SSO token permissions for change and delete actions 2025-07-10 11:29:18 +08:00
Ewall555
9a446c118b feat: support rbac SSO token 2025-07-10 11:29:18 +08:00
Eric
4bf337b2b4 perf: add VNC terminal type 2025-07-10 11:28:32 +08:00
wangruidong
2acbb80920 perf: Add account date_expired 2025-07-09 10:47:06 +08:00
gerry-f2c
ae859c5562 perf: dbeaver uses a fixed driver directory (#15689) 2025-07-08 18:02:24 +08:00
Eric
a9bc716af5 perf: add encrypt field for sqlserver 2008 2025-07-08 18:01:31 +08:00
feng
2d5401e76e perf: Translate 2025-07-08 16:01:52 +08:00
Gerry.tan
d933e296bc perf: ES command log supports fuzzy search 2025-07-08 11:25:44 +08:00
wangruidong
1e5a995917 fix: Ticket filter error 2025-07-08 10:42:40 +08:00
wangruidong
baaaf83ab9 perf: Translate 2025-07-08 10:35:04 +08:00
wangruidong
ab06ac1f1f perf: Update IP group validation to include address validation 2025-07-08 10:34:34 +08:00
jiangweidong
99c4622ccb fix: SSO access to web assets with encrypted password auto-filling 2025-07-08 10:19:32 +08:00
Eric
9bdfab966f perf: add replay_size on session 2025-07-08 10:18:54 +08:00
老广
1a1acb62de Update README.md 2025-07-08 10:16:43 +08:00
wanghe-fit2cloud
2a128ea01b docs: Add GitCode badges 2025-07-07 15:33:08 +08:00
王贺
5a720b41bf docs: Add GitCode badge 2025-07-07 13:42:15 +08:00
feng
726c5cf34d fix: View replay record operate log 2025-07-07 10:37:29 +08:00
wangruidong
06afc8a0e1 perf: Translate 2025-07-02 19:04:15 +08:00
ibuler
276fd928a7 perf: add pg client 2025-07-01 16:18:25 +08:00
dependabot[bot]
05c6272d7e chore(deps): bump requests from 2.31.0 to 2.32.4
Bumps [requests](https://github.com/psf/requests) from 2.31.0 to 2.32.4.
- [Release notes](https://github.com/psf/requests/releases)
- [Changelog](https://github.com/psf/requests/blob/main/HISTORY.md)
- [Commits](https://github.com/psf/requests/compare/v2.31.0...v2.32.4)

---
updated-dependencies:
- dependency-name: requests
  dependency-version: 2.32.4
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot] <support@github.com>
2025-07-01 15:59:44 +08:00
Bai
c3f877d116 fix: check count 2025-07-01 15:35:17 +08:00
wangruidong
60deef2abf fix: Org admin cannot use system tools. 2025-07-01 15:23:34 +08:00
wangruidong
058754dc1b perf: Translate time cost 2025-06-30 10:05:13 +08:00
feng
a238c5d34b perf: operate record 2025-06-27 19:03:24 +08:00
feng626
76c6ed0f95 Merge pull request #15649 from jumpserver/pr@dev@translate
perf: Translate
2025-06-27 14:01:16 +08:00
feng626
0d07f7421b Merge branch 'dev' into pr@dev@translate 2025-06-27 14:01:05 +08:00
Ewall555
b07d4e207c perf: Translate 2025-06-27 13:59:57 +08:00
feng
dc92963059 perf: Translate 2025-06-27 13:15:07 +08:00
feng
9abd708a0a fix: ES search session count 2025-06-27 10:32:28 +08:00
jiangweidong
c9270877eb fix: According to the CMPP2.0 protocol standard, modify the attribute alignment. 2025-06-26 18:41:30 +08:00
feng
b5518dd2ba perf: Pam perm tree 2025-06-26 18:14:57 +08:00
wangruidong
1d40f5ecbc fix: Handle ValidationError in account_obj property 2025-06-26 15:23:16 +08:00
ibuler
91fee6c034 perf: change some 18n 2025-06-26 14:54:06 +08:00
feng
1b65055c5e perf: Account backup backup_type limit 2025-06-23 18:27:27 +08:00
feng
e79ef516a5 perf: Change secret windoes translate 2025-06-23 15:59:55 +08:00
Ewall555
8843f247d6 fix: Use local Python interpreter variable in RDP automation scripts 2025-06-23 14:15:46 +08:00
ibuler
cb42df542d fix: bitwardne request data encode 2025-06-23 14:13:15 +08:00
Ewall555
46ddad1d59 perf: Update metismenu plugin to version 3.0.7 2025-06-23 14:11:23 +08:00
335 changed files with 17388 additions and 6646 deletions

.prettierrc (new file)
View File

@@ -0,0 +1,11 @@
{
"tabWidth": 4,
"useTabs": false,
"semi": true,
"singleQuote": true,
"trailingComma": "es5",
"bracketSpacing": true,
"arrowParens": "avoid",
"printWidth": 100,
"endOfLine": "lf"
}

View File

@@ -1,4 +1,4 @@
FROM jumpserver/core-base:20250509_094529 AS stage-build
FROM jumpserver/core-base:20250827_025554 AS stage-build
ARG VERSION
@@ -33,6 +33,7 @@ ARG TOOLS=" \
default-libmysqlclient-dev \
openssh-client \
sshpass \
nmap \
bubblewrap"
ARG APT_MIRROR=http://deb.debian.org

View File

@@ -13,7 +13,9 @@ ARG TOOLS=" \
nmap \
telnet \
vim \
wget"
postgresql-client-13 \
wget \
poppler-utils"
RUN set -ex \
&& apt-get update \
@@ -26,5 +28,5 @@ WORKDIR /opt/jumpserver
ARG PIP_MIRROR=https://pypi.org/simple
RUN set -ex \
&& uv pip install -i${PIP_MIRROR} --group xpack
&& uv pip install -i${PIP_MIRROR} --group xpack \
&& playwright install chromium --with-deps --only-shell
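
The Chromium install above backs the report PDF export added elsewhere in this range (see the "Export PDF" commits). A minimal sketch of that kind of export with Playwright's sync API — the URL, output path, and function name are hypothetical, not the project's actual export code:

from playwright.sync_api import sync_playwright

def export_dashboard_pdf(url: str, output_path: str) -> None:
    with sync_playwright() as p:
        browser = p.chromium.launch()
        page = browser.new_page()
        # Wait until network activity settles so charts have time to render.
        page.goto(url, wait_until="networkidle")
        page.pdf(path=output_path, format="A4", print_background=True)
        browser.close()

# export_dashboard_pdf("http://127.0.0.1:8080/report", "/tmp/report.pdf")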

View File

@@ -2,7 +2,7 @@
<a name="readme-top"></a>
<a href="https://jumpserver.com" target="_blank"><img src="https://download.jumpserver.org/images/jumpserver-logo.svg" alt="JumpServer" width="300" /></a>
## An open-source PAM tool (Bastion Host)
## An open-source PAM platform (Bastion Host)
[![][license-shield]][license-link]
[![][docs-shield]][docs-link]
@@ -19,7 +19,7 @@
## What is JumpServer?
JumpServer is an open-source Privileged Access Management (PAM) tool that provides DevOps and IT teams with on-demand and secure access to SSH, RDP, Kubernetes, Database and RemoteApp endpoints through a web browser.
JumpServer is an open-source Privileged Access Management (PAM) platform that provides DevOps and IT teams with on-demand and secure access to SSH, RDP, Kubernetes, Database and RemoteApp endpoints through a web browser.
<picture>
@@ -85,6 +85,8 @@ JumpServer consists of multiple key components, which collectively form the func
| [Nec](https://github.com/jumpserver/nec) | <img alt="Nec" src="https://img.shields.io/badge/release-private-red" /> | JumpServer EE VNC Proxy Connector |
| [Facelive](https://github.com/jumpserver/facelive) | <img alt="Facelive" src="https://img.shields.io/badge/release-private-red" /> | JumpServer EE Facial Recognition |
## Third-party projects
- [jumpserver-grafana-dashboard](https://github.com/acerrah/jumpserver-grafana-dashboard) JumpServer with grafana dashboard
## Contributing

View File

@@ -41,8 +41,8 @@ class AccountViewSet(OrgBulkModelViewSet):
'partial_update': ['accounts.change_account'],
'su_from_accounts': 'accounts.view_account',
'clear_secret': 'accounts.change_account',
'move_to_assets': 'accounts.create_account',
'copy_to_assets': 'accounts.create_account',
'move_to_assets': 'accounts.delete_account',
'copy_to_assets': 'accounts.add_account',
}
export_as_zip = True
@@ -190,6 +190,7 @@ class AccountHistoriesSecretAPI(ExtraFilterFieldsMixin, AccountRecordViewLogMixi
rbac_perms = {
'GET': 'accounts.view_accountsecret',
}
queryset = Account.history.model.objects.none()
@lazyproperty
def account(self) -> Account:

View File

@@ -20,7 +20,7 @@ __all__ = ['PamDashboardApi']
class PamDashboardApi(APIView):
http_method_names = ['get']
rbac_perms = {
'GET': 'accounts.view_account',
'GET': 'rbac.view_pam',
}
@staticmethod

View File

@@ -12,6 +12,8 @@ class VirtualAccountViewSet(OrgBulkModelViewSet):
filterset_fields = ('alias',)
def get_queryset(self):
if getattr(self, "swagger_fake_view", False):
return VirtualAccount.objects.none()
return VirtualAccount.get_or_init_queryset()
def get_object(self, ):

View File

@@ -41,6 +41,7 @@ class AutomationAssetsListApi(generics.ListAPIView):
class AutomationRemoveAssetApi(generics.UpdateAPIView):
model = BaseAutomation
queryset = BaseAutomation.objects.all()
serializer_class = serializers.UpdateAssetSerializer
http_method_names = ['patch']
@@ -59,6 +60,7 @@ class AutomationRemoveAssetApi(generics.UpdateAPIView):
class AutomationAddAssetApi(generics.UpdateAPIView):
model = BaseAutomation
queryset = BaseAutomation.objects.all()
serializer_class = serializers.UpdateAssetSerializer
http_method_names = ['patch']

View File

@@ -97,12 +97,13 @@ class ChangeSecretRecordViewSet(mixins.ListModelMixin, OrgGenericViewSet):
def execute(self, request, *args, **kwargs):
record_ids = request.data.get('record_ids')
records = self.get_queryset().filter(id__in=record_ids)
execution_count = records.values_list('execution_id', flat=True).distinct().count()
if execution_count != 1:
if not records.exists():
return Response(
{'detail': 'Only one execution is allowed to execute'},
{'detail': 'No valid records found'},
status=status.HTTP_400_BAD_REQUEST
)
record_ids = [str(_id) for _id in records.values_list('id', flat=True)]
task = execute_automation_record_task.delay(record_ids, self.tp)
return Response({'task': task.id}, status=status.HTTP_200_OK)
@@ -153,12 +154,10 @@ class ChangSecretAddAssetApi(AutomationAddAssetApi):
model = ChangeSecretAutomation
serializer_class = serializers.ChangeSecretUpdateAssetSerializer
class ChangSecretNodeAddRemoveApi(AutomationNodeAddRemoveApi):
model = ChangeSecretAutomation
serializer_class = serializers.ChangeSecretUpdateNodeSerializer
class ChangeSecretStatusViewSet(OrgBulkModelViewSet):
perm_model = ChangeSecretAutomation
filterset_class = ChangeSecretStatusFilterSet

View File

@@ -62,7 +62,8 @@ class ChangeSecretDashboardApi(APIView):
status_counts = defaultdict(lambda: defaultdict(int))
for date_finished, status in results:
date_str = str(date_finished.date())
dt_local = timezone.localtime(date_finished)
date_str = str(dt_local.date())
if status == ChangeSecretRecordStatusChoice.failed:
status_counts[date_str]['failed'] += 1
elif status == ChangeSecretRecordStatusChoice.success:
@@ -90,10 +91,10 @@ class ChangeSecretDashboardApi(APIView):
def get_change_secret_asset_queryset(self):
qs = self.change_secrets_queryset
node_ids = qs.filter(nodes__isnull=False).values_list('nodes', flat=True).distinct()
nodes = Node.objects.filter(id__in=node_ids)
node_ids = qs.values_list('nodes', flat=True).distinct()
nodes = Node.objects.filter(id__in=node_ids).only('id', 'key')
node_asset_ids = Node.get_nodes_all_assets(*nodes).values_list('id', flat=True)
direct_asset_ids = qs.filter(assets__isnull=False).values_list('assets', flat=True).distinct()
direct_asset_ids = qs.values_list('assets', flat=True).distinct()
asset_ids = set(list(direct_asset_ids) + list(node_asset_ids))
return Asset.objects.filter(id__in=asset_ids)
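
Why the localtime conversion above matters when bucketing by calendar date: a record finished late in the UTC day already falls on the next day in UTC+8. A standalone stdlib illustration, independent of the project code:

from datetime import datetime, timezone
from zoneinfo import ZoneInfo

finished = datetime(2025, 8, 27, 17, 30, tzinfo=timezone.utc)
print(finished.date())                                         # 2025-08-27
print(finished.astimezone(ZoneInfo("Asia/Shanghai")).date())   # 2025-08-28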

View File

@@ -45,10 +45,10 @@ class CheckAccountAutomationViewSet(OrgBulkModelViewSet):
class CheckAccountExecutionViewSet(AutomationExecutionViewSet):
rbac_perms = (
("list", "accounts.view_checkaccountexecution"),
("retrieve", "accounts.view_checkaccountsexecution"),
("retrieve", "accounts.view_checkaccountexecution"),
("create", "accounts.add_checkaccountexecution"),
("adhoc", "accounts.add_checkaccountexecution"),
("report", "accounts.view_checkaccountsexecution"),
("report", "accounts.view_checkaccountexecution"),
)
ordering = ("-date_created",)
tp = AutomationTypes.check_account
@@ -150,6 +150,9 @@ class CheckAccountEngineViewSet(JMSModelViewSet):
http_method_names = ['get', 'options']
def get_queryset(self):
if getattr(self, "swagger_fake_view", False):
return CheckAccountEngine.objects.none()
return CheckAccountEngine.get_default_engines()
def filter_queryset(self, queryset: list):

View File

@@ -63,12 +63,10 @@ class PushAccountRemoveAssetApi(AutomationRemoveAssetApi):
model = PushAccountAutomation
serializer_class = serializers.PushAccountUpdateAssetSerializer
class PushAccountAddAssetApi(AutomationAddAssetApi):
model = PushAccountAutomation
serializer_class = serializers.PushAccountUpdateAssetSerializer
class PushAccountNodeAddRemoveApi(AutomationNodeAddRemoveApi):
model = PushAccountAutomation
serializer_class = serializers.PushAccountUpdateNodeSerializer
serializer_class = serializers.PushAccountUpdateNodeSerializer

View File

@@ -105,6 +105,10 @@ class BaseChangeSecretPushManager(AccountBasePlaybookManager):
h['account']['mode'] = 'sysdba' if account.privileged else None
return h
def add_extra_params(self, host, **kwargs):
host['ssh_params'] = {}
return host
def host_callback(self, host, asset=None, account=None, automation=None, path_dir=None, **kwargs):
host = super().host_callback(
host, asset=asset, account=account, automation=automation,
@@ -113,8 +117,7 @@ class BaseChangeSecretPushManager(AccountBasePlaybookManager):
if host.get('error'):
return host
host['ssh_params'] = {}
host = self.add_extra_params(host, automation=automation)
accounts = self.get_accounts(account)
existing_ids = set(map(str, accounts.values_list('id', flat=True)))
missing_ids = set(map(str, self.account_ids)) - existing_ids

View File

@@ -53,4 +53,6 @@
ssl_certfile: "{{ jms_asset.secret_info.client_key | default('') }}"
connection_options:
- tlsAllowInvalidHostnames: "{{ jms_asset.spec_info.allow_invalid_cert}}"
when: check_conn_after_change
when: check_conn_after_change
register: result
failed_when: not result.is_available

View File

@@ -39,7 +39,8 @@
name: "{{ account.username }}"
password: "{{ account.secret }}"
host: "%"
priv: "{{ account.username + '.*:USAGE' if db_name == '' else db_name + '.*:ALL' }}"
priv: "{{ omit if db_name == '' else db_name + '.*:ALL' }}"
append_privs: "{{ db_name != '' | bool }}"
ignore_errors: true
when: db_info is succeeded

View File

@@ -56,3 +56,5 @@
ssl_key: "{{ ssl_key if check_ssl and ssl_key | length > 0 else omit }}"
ssl_mode: "{{ jms_asset.spec_info.pg_ssl_mode }}"
when: check_conn_after_change
register: result
failed_when: not result.is_available

View File

@@ -8,7 +8,7 @@ type:
params:
- name: groups
type: str
label: '用户组'
label: "{{ 'Params groups label' | trans }}"
default: 'Users,Remote Desktop Users'
help_text: "{{ 'Params groups help text' | trans }}"
@@ -24,3 +24,7 @@ i18n:
ja: 'グループを入力してください。複数のグループはコンマで区切ってください(既存のグループを入力してください)'
en: 'Please enter the group. Multiple groups are separated by commas (please enter the existing group)'
Params groups label:
zh: '用户组'
ja: 'グループ'
en: 'Groups'

View File

@@ -9,7 +9,7 @@ type:
params:
- name: groups
type: str
label: '用户组'
label: "{{ 'Params groups label' | trans }}"
default: 'Users,Remote Desktop Users'
help_text: "{{ 'Params groups help text' | trans }}"
@@ -25,3 +25,8 @@ i18n:
ja: 'グループを入力してください。複数のグループはコンマで区切ってください(既存のグループを入力してください)'
en: 'Please enter the group. Multiple groups are separated by commas (please enter the existing group)'
Params groups label:
zh: '用户组'
ja: 'グループ'
en: 'Groups'

View File

@@ -9,19 +9,24 @@ priority: 49
params:
- name: groups
type: str
label: '用户组'
label: "{{ 'Params groups label' | trans }}"
default: 'Users,Remote Desktop Users'
help_text: "{{ 'Params groups help text' | trans }}"
i18n:
Windows account change secret rdp verify:
zh: '使用 Ansible 模块 win_user 执行 Windows 账号改密 RDP 协议测试最后的可连接性'
ja: 'Ansibleモジュールwin_userWindowsアカウントの改密RDPプロトコルテストの最後の接続性を実行する'
en: 'Using the Ansible module win_user performs Windows account encryption RDP protocol testing for final connectivity'
zh: '使用 Ansible 模块 win_user 执行 Windows 账号改密(最后使用 Python 模块 pyfreerdp 验证账号的可连接性'
ja: 'Ansible モジュール win_user を使用して Windows アカウントのパスワードを変更します (最後に Python モジュール pyfreerdp を使用してアカウントの接続を確認します)'
en: 'Use the Ansible module win_user to change the Windows account password (finally use the Python module pyfreerdp to verify the account connectivity)'
Params groups help text:
zh: '请输入用户组,多个用户组使用逗号分隔(需填写已存在的用户组)'
ja: 'グループを入力してください。複数のグループはコンマで区切ってください(既存のグループを入力してください)'
en: 'Please enter the group. Multiple groups are separated by commas (please enter the existing group)'
Params groups label:
zh: '用户组'
ja: 'グループ'
en: 'Groups'

View File

@@ -5,6 +5,9 @@ from django.conf import settings
from django.utils.translation import gettext_lazy as _
from xlsxwriter import Workbook
from assets.automations.methods import platform_automation_methods as asset_methods
from assets.const import AutomationTypes as AssetAutomationTypes
from accounts.automations.methods import platform_automation_methods as account_methods
from accounts.const import (
AutomationTypes, SecretStrategy, ChangeSecretRecordStatusChoice
)
@@ -22,6 +25,22 @@ logger = get_logger(__name__)
class ChangeSecretManager(BaseChangeSecretPushManager):
ansible_account_prefer = ''
def get_method_id_meta_mapper(self):
return {
method["id"]: method for method in self.platform_automation_methods
}
@property
def platform_automation_methods(self):
return asset_methods + account_methods
def add_extra_params(self, host, **kwargs):
host = super().add_extra_params(host, **kwargs)
automation = kwargs.get('automation')
for extra_type in [AssetAutomationTypes.ping, AutomationTypes.verify_account]:
host[f"{extra_type}_params"] = self.get_params(automation, extra_type)
return host
@classmethod
def method_type(cls):
return AutomationTypes.change_secret

View File

@@ -0,0 +1,36 @@
- hosts: website
gather_facts: no
vars:
ansible_python_interpreter: "{{ local_python_interpreter }}"
tasks:
- name: Test privileged account
website_ping:
login_host: "{{ jms_asset.address }}"
login_user: "{{ jms_account.username }}"
login_password: "{{ jms_account.secret }}"
steps: "{{ ping_params.steps }}"
load_state: "{{ ping_params.load_state }}"
- name: "Change {{ account.username }} password"
website_user:
login_host: "{{ jms_asset.address }}"
login_user: "{{ jms_account.username }}"
login_password: "{{ jms_account.secret }}"
steps: "{{ params.steps }}"
load_state: "{{ params.load_state }}"
name: "{{ account.username }}"
password: "{{ account.secret }}"
ignore_errors: true
register: change_secret_result
- name: "Verify {{ account.username }} password"
website_ping:
login_host: "{{ jms_asset.address }}"
login_user: "{{ account.username }}"
login_password: "{{ account.secret }}"
steps: "{{ verify_account_params.steps }}"
load_state: "{{ verify_account_params.load_state }}"
when:
- check_conn_after_change or change_secret_result.failed | default(false)
delegate_to: localhost

View File

@@ -0,0 +1,51 @@
id: change_account_website
name: "{{ 'Website account change secret' | trans }}"
category: web
type:
- website
method: change_secret
priority: 50
params:
- name: load_state
type: choice
label: "{{ 'Load state' | trans }}"
choices:
- [ networkidle, "{{ 'Network idle' | trans }}" ]
- [ domcontentloaded, "{{ 'Dom content loaded' | trans }}" ]
- [ load, "{{ 'Load completed' | trans }}" ]
default: 'load'
- name: steps
type: list
default: [ ]
label: "{{ 'Steps' | trans }}"
help_text: "{{ 'Params step help text' | trans }}"
i18n:
Website account change secret:
zh: 使用 Playwright 模拟浏览器变更账号密码
ja: Playwright を使用してブラウザをシミュレートし、アカウントのパスワードを変更します
en: Use Playwright to simulate a browser for account password change.
Load state:
zh: 加载状态检测
en: Load state detection
ja: ロード状態の検出
Steps:
zh: 步骤
en: Steps
ja: 手順
Network idle:
zh: 网络空闲
en: Network idle
ja: ネットワークが空いた状態
Dom content loaded:
zh: 文档内容加载完成
en: Dom content loaded
ja: ドキュメントの内容がロードされた状態
Load completed:
zh: 全部加载完成
en: All load completed
ja: すべてのロードが完了した状態
Params step help text:
zh: 根据配置决定任务执行步骤
ja: 設定に基づいてタスクの実行ステップを決定する
en: Determine task execution steps based on configuration

View File

@@ -15,11 +15,13 @@ from common.decorators import bulk_create_decorator, bulk_update_decorator
from settings.models import LeakPasswords
# finish() is called manually
@bulk_create_decorator(AccountRisk)
def create_risk(data):
return AccountRisk(**data)
# finish() is called manually
@bulk_update_decorator(AccountRisk, update_fields=["details", "status"])
def update_risk(risk):
return risk
@@ -217,6 +219,9 @@ class CheckAccountManager(BaseManager):
"details": [{"datetime": now, 'type': 'init'}],
})
create_risk.finish()
update_risk.finish()
def pre_run(self):
super().pre_run()
self.assets = self.execution.get_all_assets()
@@ -235,6 +240,11 @@ class CheckAccountManager(BaseManager):
print("Check: {} => {}".format(account, msg))
if not error:
AccountRisk.objects.filter(
asset=account.asset,
username=account.username,
risk=handler.risk
).delete()
continue
self.add_risk(handler.risk, account)
self.commit_risks(_assets)
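
The create_risk/update_risk helpers above are builder functions wrapped by bulk_create_decorator/bulk_update_decorator and flushed with an explicit .finish() call. A rough sketch of how such a bulk-create decorator could work; the project's real implementation in common.decorators may differ:

def bulk_create_decorator(model, batch_size=100):
    """Buffer objects returned by the wrapped builder and bulk_create them in batches."""
    def decorator(build_func):
        pending = []

        def wrapper(*args, **kwargs):
            obj = build_func(*args, **kwargs)
            pending.append(obj)
            if len(pending) >= batch_size:
                model.objects.bulk_create(pending)
                pending.clear()
            return obj

        def finish():
            # Flush whatever is still buffered; called explicitly, as in the manager code above.
            if pending:
                model.objects.bulk_create(pending)
                pending.clear()

        wrapper.finish = finish
        return wrapper
    return decorator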

View File

@@ -30,6 +30,16 @@ common_risk_items = [
diff_items = risk_items + common_risk_items
@bulk_create_decorator(AccountRisk)
def _create_risk(data):
return AccountRisk(**data)
@bulk_update_decorator(AccountRisk, update_fields=["details"])
def _update_risk(account):
return account
def format_datetime(value):
if isinstance(value, timezone.datetime):
return value.strftime("%Y-%m-%d %H:%M:%S")
@@ -141,25 +151,17 @@ class AnalyseAccountRisk:
found = assets_risks.get(key)
if not found:
self._create_risk(dict(**d, details=[detail]))
_create_risk(dict(**d, details=[detail]))
continue
found.details.append(detail)
self._update_risk(found)
@bulk_create_decorator(AccountRisk)
def _create_risk(self, data):
return AccountRisk(**data)
@bulk_update_decorator(AccountRisk, update_fields=["details"])
def _update_risk(self, account):
return account
_update_risk(found)
def lost_accounts(self, asset, lost_users):
if not self.check_risk:
return
for user in lost_users:
self._create_risk(
_create_risk(
dict(
asset_id=str(asset.id),
username=user,
@@ -176,7 +178,7 @@ class AnalyseAccountRisk:
self._analyse_item_changed(ga, d)
if not sys_found:
basic = {"asset": asset, "username": d["username"], 'gathered_account': ga}
self._create_risk(
_create_risk(
dict(
**basic,
risk=RiskChoice.new_found,
@@ -388,6 +390,7 @@ class GatherAccountsManager(AccountBasePlaybookManager):
self.update_gathered_account(ori_account, d)
ori_found = username in ori_users
need_analyser_gather_account.append((asset, ga, d, ori_found))
# The order here must not be changed: risk has a foreign key to the gathered_account primary key id, so the gathered_account must be fully created before the risk is created
self.create_gathered_account.finish()
self.update_gathered_account.finish()
for analysis_data in need_analyser_gather_account:
@@ -403,6 +406,9 @@ class GatherAccountsManager(AccountBasePlaybookManager):
present=True
)
# Because of the bulk create / bulk update, sleep briefly here to wait for the data to sync
_update_risk.finish()
_create_risk.finish()
time.sleep(0.5)
def get_report_template(self):

View File

@@ -54,3 +54,5 @@
connection_options:
- tlsAllowInvalidHostnames: "{{ jms_asset.spec_info.allow_invalid_cert}}"
when: check_conn_after_change
register: result
failed_when: not result.is_available

View File

@@ -39,7 +39,8 @@
name: "{{ account.username }}"
password: "{{ account.secret }}"
host: "%"
priv: "{{ account.username + '.*:USAGE' if db_name == '' else db_name + '.*:ALL' }}"
priv: "{{ omit if db_name == '' else db_name + '.*:ALL' }}"
append_privs: "{{ db_name != '' | bool }}"
ignore_errors: true
when: db_info is succeeded

View File

@@ -8,7 +8,7 @@ type:
params:
- name: groups
type: str
label: '用户组'
label: "{{ 'Params groups label' | trans }}"
default: 'Users,Remote Desktop Users'
help_text: "{{ 'Params groups help text' | trans }}"
@@ -22,3 +22,8 @@ i18n:
zh: '请输入用户组,多个用户组使用逗号分隔(需填写已存在的用户组)'
ja: 'グループを入力してください。複数のグループはコンマで区切ってください(既存のグループを入力してください)'
en: 'Please enter the group. Multiple groups are separated by commas (please enter the existing group)'
Params groups label:
zh: '用户组'
ja: 'グループ'
en: 'Groups'

View File

@@ -9,7 +9,7 @@ type:
params:
- name: groups
type: str
label: '用户组'
label: "{{ 'Params groups label' | trans }}"
default: 'Users,Remote Desktop Users'
help_text: "{{ 'Params groups help text' | trans }}"
@@ -23,3 +23,8 @@ i18n:
zh: '请输入用户组,多个用户组使用逗号分隔(需填写已存在的用户组)'
ja: 'グループを入力してください。複数のグループはコンマで区切ってください(既存のグループを入力してください)'
en: 'Please enter the group. Multiple groups are separated by commas (please enter the existing group)'
Params groups label:
zh: '用户组'
ja: 'グループ'
en: 'Groups'

View File

@@ -9,7 +9,7 @@ priority: 49
params:
- name: groups
type: str
label: '用户组'
label: "{{ 'Params groups label' | trans }}"
default: 'Users,Remote Desktop Users'
help_text: "{{ 'Params groups help text' | trans }}"
@@ -23,3 +23,8 @@ i18n:
zh: '请输入用户组,多个用户组使用逗号分隔(需填写已存在的用户组)'
ja: 'グループを入力してください。複数のグループはコンマで区切ってください(既存のグループを入力してください)'
en: 'Please enter the group. Multiple groups are separated by commas (please enter the existing group)'
Params groups label:
zh: '用户组'
ja: 'グループ'
en: 'Groups'

View File

@@ -3,7 +3,7 @@
vars:
ansible_shell_type: sh
ansible_connection: local
ansible_python_interpreter: /opt/py3/bin/python
ansible_python_interpreter: "{{ local_python_interpreter }}"
tasks:
- name: Verify account (pyfreerdp)

View File

@@ -16,3 +16,5 @@
ssl_certfile: "{{ jms_asset.secret_info.client_key | default('') }}"
connection_options:
- tlsAllowInvalidHostnames: "{{ jms_asset.spec_info.allow_invalid_cert }}"
register: result
failed_when: not result.is_available

View File

@@ -0,0 +1,13 @@
- hosts: website
gather_facts: no
vars:
ansible_python_interpreter: "{{ local_python_interpreter }}"
tasks:
- name: Verify account
website_ping:
login_host: "{{ jms_asset.address }}"
login_user: "{{ account.username }}"
login_password: "{{ account.secret }}"
steps: "{{ params.steps }}"
load_state: "{{ params.load_state }}"

View File

@@ -0,0 +1,50 @@
id: verify_account_website
name: "{{ 'Website account verify' | trans }}"
category: web
type:
- website
method: verify_account
priority: 50
params:
- name: load_state
type: choice
label: "{{ 'Load state' | trans }}"
choices:
- [ networkidle, "{{ 'Network idle' | trans }}" ]
- [ domcontentloaded, "{{ 'Dom content loaded' | trans }}" ]
- [ load, "{{ 'Load completed' | trans }}" ]
default: 'load'
- name: steps
type: list
label: "{{ 'Steps' | trans }}"
help_text: "{{ 'Params step help text' | trans }}"
default: []
i18n:
Website account verify:
zh: 使用 Playwright 模拟浏览器验证账号
ja: Playwright を使用してブラウザをシミュレートし、アカウントの検証を行います
en: Use Playwright to simulate a browser for account verification.
Load state:
zh: 加载状态检测
en: Load state detection
ja: ロード状態の検出
Steps:
zh: 步骤
en: Steps
ja: 手順
Network idle:
zh: 网络空闲
en: Network idle
ja: ネットワークが空いた状態
Dom content loaded:
zh: 文档内容加载完成
en: Dom content loaded
ja: ドキュメントの内容がロードされた状態
Load completed:
zh: 全部加载完成
en: All load completed
ja: すべてのロードが完了した状態
Params step help text:
zh: 配置步骤,根据配置决定任务执行步骤
ja: パラメータを設定し、設定に基づいてタスクの実行手順を決定します
en: Configure steps, and determine the task execution steps based on the configuration.

View File

@@ -1,8 +1,5 @@
# -*- coding: utf-8 -*-
#
from azure.core.exceptions import ResourceNotFoundError, ClientAuthenticationError
from azure.identity import ClientSecretCredential
from azure.keyvault.secrets import SecretClient
from common.utils import get_logger
@@ -14,6 +11,9 @@ __all__ = ['AZUREVaultClient']
class AZUREVaultClient(object):
def __init__(self, vault_url, tenant_id, client_id, client_secret):
from azure.identity import ClientSecretCredential
from azure.keyvault.secrets import SecretClient
authentication_endpoint = 'https://login.microsoftonline.com/' \
if ('azure.net' in vault_url) else 'https://login.chinacloudapi.cn/'
@@ -23,6 +23,8 @@ class AZUREVaultClient(object):
self.client = SecretClient(vault_url=vault_url, credential=credentials)
def is_active(self):
from azure.core.exceptions import ResourceNotFoundError, ClientAuthenticationError
try:
self.client.set_secret('jumpserver', '666')
except (ResourceNotFoundError, ClientAuthenticationError) as e:
@@ -32,6 +34,8 @@ class AZUREVaultClient(object):
return True, ''
def get(self, name, version=None):
from azure.core.exceptions import ResourceNotFoundError, ClientAuthenticationError
try:
secret = self.client.get_secret(name, version)
return secret.value

View File

@@ -46,11 +46,16 @@ class Migration(migrations.Migration):
],
options={
'verbose_name': 'Account',
'permissions': [('view_accountsecret', 'Can view asset account secret'),
('view_historyaccount', 'Can view asset history account'),
('view_historyaccountsecret', 'Can view asset history account secret'),
('verify_account', 'Can verify account'), ('push_account', 'Can push account'),
('remove_account', 'Can remove account')],
'permissions': [
('view_accountsecret', 'Can view asset account secret'),
('view_historyaccount', 'Can view asset history account'),
('view_historyaccountsecret', 'Can view asset history account secret'),
('verify_account', 'Can verify account'),
('push_account', 'Can push account'),
('remove_account', 'Can remove account'),
('view_accountsession', 'Can view session'),
('view_accountactivity', 'Can view activity')
],
},
),
migrations.CreateModel(

View File

@@ -335,6 +335,7 @@ class Migration(migrations.Migration):
],
options={
"abstract": False,
"verbose_name": "Check engine",
},
),
migrations.CreateModel(

View File

@@ -116,6 +116,8 @@ class Account(AbsConnectivity, LabeledMixin, BaseAccount, JSONFilterMixin):
('verify_account', _('Can verify account')),
('push_account', _('Can push account')),
('remove_account', _('Can remove account')),
('view_accountsession', _('Can view session')),
('view_accountactivity', _('Can view activity')),
]
def __str__(self):
@@ -130,7 +132,7 @@ class Account(AbsConnectivity, LabeledMixin, BaseAccount, JSONFilterMixin):
return self.asset.platform
@lazyproperty
def alias(self):
def alias(self) -> str:
"""
Alias: for virtual accounts this is @INPUT, @MANUAL or @USER; otherwise it is the id
"""
@@ -138,13 +140,13 @@ class Account(AbsConnectivity, LabeledMixin, BaseAccount, JSONFilterMixin):
return self.username
return str(self.id)
def is_virtual(self):
def is_virtual(self) -> bool:
"""
Do not judge by username: this may be a constructed account object set to the username of a same-named account.
"""
return self.alias.startswith('@')
def is_ds_account(self):
def is_ds_account(self) -> bool:
if self.is_virtual():
return ''
if not self.asset.is_directory_service:
@@ -158,7 +160,7 @@ class Account(AbsConnectivity, LabeledMixin, BaseAccount, JSONFilterMixin):
return self.asset.ds
@lazyproperty
def ds_domain(self):
def ds_domain(self) -> str:
"""这个不能去掉perm_account 会动态设置这个值,以更改 full_username"""
if self.is_virtual():
return ''
@@ -170,17 +172,17 @@ class Account(AbsConnectivity, LabeledMixin, BaseAccount, JSONFilterMixin):
return '@' in self.username or '\\' in self.username
@property
def full_username(self):
def full_username(self) -> str:
if not self.username_has_domain() and self.ds_domain:
return '{}@{}'.format(self.username, self.ds_domain)
return self.username
@lazyproperty
def has_secret(self):
def has_secret(self) -> bool:
return bool(self.secret)
@lazyproperty
def versions(self):
def versions(self) -> int:
return self.history.count()
def get_su_from_accounts(self):

View File

@@ -33,7 +33,7 @@ class IntegrationApplication(JMSOrgBaseModel):
return qs.filter(*query)
@property
def accounts_amount(self):
def accounts_amount(self) -> int:
return self.get_accounts().count()
@property

View File

@@ -68,8 +68,10 @@ class AccountRisk(JMSOrgBaseModel):
related_name='risks', null=True
)
risk = models.CharField(max_length=128, verbose_name=_('Risk'), choices=RiskChoice.choices)
status = models.CharField(max_length=32, choices=ConfirmOrIgnore.choices, default=ConfirmOrIgnore.pending,
blank=True, verbose_name=_('Status'))
status = models.CharField(
max_length=32, choices=ConfirmOrIgnore.choices, default=ConfirmOrIgnore.pending,
blank=True, verbose_name=_('Status')
)
details = models.JSONField(default=list, verbose_name=_('Detail'))
class Meta:
@@ -119,6 +121,9 @@ class CheckAccountEngine(JMSBaseModel):
def __str__(self):
return self.name
class Meta:
verbose_name = _('Check engine')
@staticmethod
def get_default_engines():
data = [

View File

@@ -75,11 +75,11 @@ class BaseAccount(VaultModelMixin, JMSOrgBaseModel):
return bool(self.secret)
@property
def has_username(self):
def has_username(self) -> bool:
return bool(self.username)
@property
def spec_info(self):
def spec_info(self) -> dict:
data = {}
if self.secret_type != SecretType.SSH_KEY:
return data
@@ -87,13 +87,13 @@ class BaseAccount(VaultModelMixin, JMSOrgBaseModel):
return data
@property
def password(self):
def password(self) -> str:
if self.secret_type == SecretType.PASSWORD:
return self.secret
return None
@property
def private_key(self):
def private_key(self) -> str:
if self.secret_type == SecretType.SSH_KEY:
return self.secret
return None
@@ -110,7 +110,7 @@ class BaseAccount(VaultModelMixin, JMSOrgBaseModel):
return None
@property
def ssh_key_fingerprint(self):
def ssh_key_fingerprint(self) -> str:
if self.public_key:
public_key = self.public_key
elif self.private_key:

View File

@@ -56,7 +56,7 @@ class VaultModelMixin(models.Model):
__secret = None
@property
def secret(self):
def secret(self) -> str:
if self.__secret:
return self.__secret
from accounts.backends import vault_client

View File

@@ -18,11 +18,11 @@ class VirtualAccount(JMSOrgBaseModel):
verbose_name = _('Virtual account')
@property
def name(self):
def name(self) -> str:
return self.get_alias_display()
@property
def username(self):
def username(self) -> str:
usernames_map = {
AliasAccount.INPUT: _("Manual input"),
AliasAccount.USER: _("Same with user"),
@@ -32,7 +32,7 @@ class VirtualAccount(JMSOrgBaseModel):
return usernames_map.get(self.alias, '')
@property
def comment(self):
def comment(self) -> str:
comments_map = {
AliasAccount.INPUT: _('Non-asset account, Input username/password on connect'),
AliasAccount.USER: _('The account username name same with user on connect'),

View File

@@ -456,6 +456,8 @@ class AssetAccountBulkSerializer(
class AccountSecretSerializer(SecretReadableMixin, AccountSerializer):
spec_info = serializers.DictField(label=_('Spec info'), read_only=True)
class Meta(AccountSerializer.Meta):
fields = AccountSerializer.Meta.fields + ['spec_info']
extra_kwargs = {
@@ -470,6 +472,7 @@ class AccountSecretSerializer(SecretReadableMixin, AccountSerializer):
class AccountHistorySerializer(serializers.ModelSerializer):
secret_type = LabeledChoiceField(choices=SecretType.choices, label=_('Secret type'))
secret = serializers.CharField(label=_('Secret'), read_only=True)
id = serializers.IntegerField(label=_('ID'), source='history_id', read_only=True)
class Meta:

View File

@@ -70,6 +70,8 @@ class AuthValidateMixin(serializers.Serializer):
class BaseAccountSerializer(
AuthValidateMixin, ResourceLabelsMixin, BulkOrgResourceModelSerializer
):
spec_info = serializers.DictField(label=_('Spec info'), read_only=True)
class Meta:
model = BaseAccount
fields_mini = ["id", "name", "username"]

View File

@@ -1,8 +1,9 @@
# -*- coding: utf-8 -*-
#
from django.conf import settings
from django.utils.translation import gettext_lazy as _
from accounts.const import AutomationTypes
from accounts.const import AutomationTypes, AccountBackupType
from accounts.models import BackupAccountAutomation
from common.serializers.fields import EncryptedField
from common.utils import get_logger
@@ -41,6 +42,17 @@ class BackupAccountSerializer(BaseAutomationSerializer):
'types': {'label': _('Asset type')}
}
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
self.set_backup_type_choices()
def set_backup_type_choices(self):
field_backup_type = self.fields.get("backup_type")
if not field_backup_type:
return
if not settings.XPACK_LICENSE_IS_VALID:
field_backup_type._choices.pop(AccountBackupType.object_storage, None)
@property
def model_type(self):
return AutomationTypes.backup_account

View File

@@ -130,7 +130,7 @@ class ChangeSecretRecordSerializer(serializers.ModelSerializer):
read_only_fields = fields
@staticmethod
def get_is_success(obj):
def get_is_success(obj) -> bool:
return obj.status == ChangeSecretRecordStatusChoice.success
@@ -157,7 +157,7 @@ class ChangeSecretRecordBackUpSerializer(serializers.ModelSerializer):
read_only_fields = fields
@staticmethod
def get_asset(instance):
def get_asset(instance) -> str:
return str(instance.asset)
@staticmethod
@@ -165,7 +165,7 @@ class ChangeSecretRecordBackUpSerializer(serializers.ModelSerializer):
return str(instance.account)
@staticmethod
def get_is_success(obj):
def get_is_success(obj) -> str:
if obj.status == ChangeSecretRecordStatusChoice.success.value:
return _("Success")
return _("Failed")
@@ -196,9 +196,9 @@ class ChangeSecretAccountSerializer(serializers.ModelSerializer):
read_only_fields = fields
@staticmethod
def get_meta(obj):
def get_meta(obj) -> dict:
return account_secret_task_status.get(str(obj.id))
@staticmethod
def get_ttl(obj):
def get_ttl(obj) -> int:
return account_secret_task_status.get_ttl(str(obj.id))
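
The return-type hints added to these serializer methods are what drf-spectacular reads to type SerializerMethodField output in the generated schema. A minimal, hypothetical serializer showing the idea (assumes a configured Django/DRF project):

from rest_framework import serializers

class DemoRecordSerializer(serializers.Serializer):
    is_success = serializers.SerializerMethodField()

    @staticmethod
    def get_is_success(obj) -> bool:   # the annotation becomes "boolean" in the OpenAPI schema
        return getattr(obj, "status", "") == "success"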

View File

@@ -69,7 +69,7 @@ class AssetRiskSerializer(serializers.Serializer):
risk_summary = serializers.SerializerMethodField()
@staticmethod
def get_risk_summary(obj):
def get_risk_summary(obj) -> dict:
summary = {}
for risk in RiskChoice.choices:
summary[f"{risk[0]}_count"] = obj.get(f"{risk[0]}_count", 0)

View File

@@ -28,7 +28,7 @@ class DiscoverAccountAutomationSerializer(BaseAutomationSerializer):
+ read_only_fields)
extra_kwargs = {
'check_risk': {
'help_text': _('Whether to check the risk of the gathered accounts.'),
'help_text': _('Whether to check the risk of the discovered accounts.'),
},
**BaseAutomationSerializer.Meta.extra_kwargs
}

View File

@@ -1,4 +1,5 @@
import datetime
from collections import defaultdict
from celery import shared_task
from django.db.models import Q
@@ -72,24 +73,43 @@ def execute_automation_record_task(record_ids, tp):
task_name = gettext_noop('Execute automation record')
with tmp_to_root_org():
records = ChangeSecretRecord.objects.filter(id__in=record_ids)
records = ChangeSecretRecord.objects.filter(id__in=record_ids).order_by('-date_updated')
if not records:
logger.error('No automation record found: {}'.format(record_ids))
logger.error(f'No automation record found: {record_ids}')
return
record = records[0]
record_map = {f'{record.asset_id}-{record.account_id}': str(record.id) for record in records}
task_snapshot = {
'params': {},
'record_map': record_map,
'secret': record.new_secret,
'secret_type': record.execution.snapshot.get('secret_type'),
'assets': [str(instance.asset_id) for instance in records],
'accounts': [str(instance.account_id) for instance in records],
}
with tmp_to_org(record.execution.org_id):
quickstart_automation_by_snapshot(task_name, tp, task_snapshot)
seen_accounts = set()
unique_records = []
for rec in records:
acct = str(rec.account_id)
if acct not in seen_accounts:
seen_accounts.add(acct)
unique_records.append(rec)
exec_groups = defaultdict(list)
for rec in unique_records:
exec_groups[rec.execution_id].append(rec)
for __, group in exec_groups.items():
latest_rec = group[0]
snapshot = getattr(latest_rec.execution, 'snapshot', {}) or {}
record_map = {f"{r.asset_id}-{r.account_id}": str(r.id) for r in group}
assets = [str(r.asset_id) for r in group]
accounts = [str(r.account_id) for r in group]
task_snapshot = {
'params': {},
'record_map': record_map,
'secret': latest_rec.new_secret,
'secret_type': snapshot.get('secret_type'),
'assets': assets,
'accounts': accounts,
}
with tmp_to_org(latest_rec.execution.org_id):
quickstart_automation_by_snapshot(task_name, tp, task_snapshot)
@shared_task(
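
The effect of the refactor above on toy data: records are de-duplicated per account (newest first), then grouped by execution so each execution gets its own task snapshot. Standalone illustration with a hypothetical stand-in for ChangeSecretRecord:

from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Rec:   # hypothetical stand-in for ChangeSecretRecord
    id: str
    execution_id: str
    account_id: str

records = [Rec("r3", "e2", "a1"), Rec("r2", "e1", "a2"), Rec("r1", "e1", "a1")]  # newest first

seen, unique_records = set(), []
for rec in records:
    if rec.account_id not in seen:
        seen.add(rec.account_id)
        unique_records.append(rec)

groups = defaultdict(list)
for rec in unique_records:
    groups[rec.execution_id].append(rec)

print({eid: [r.id for r in recs] for eid, recs in groups.items()})   # {'e2': ['r3'], 'e1': ['r2']}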

View File

@@ -11,4 +11,4 @@ class ActionChoices(models.TextChoices):
notify_and_warn = 'notify_and_warn', _('Prompt and warn')
face_verify = 'face_verify', _('Face verify')
face_online = 'face_online', _('Face online')
change_secret = 'change_secret', _('Change secret')
change_secret = 'change_secret', _('Secret rotation')

View File

@@ -5,7 +5,7 @@ from django.utils.translation import gettext_lazy as _
from common.db.fields import JSONManyToManyField
from common.db.models import JMSBaseModel
from common.utils import contains_ip
from common.utils.time_period import contains_time_period
from common.utils.timezone import contains_time_period
from orgs.mixins.models import OrgModelMixin, OrgManager
from ..const import ActionChoices

View File

@@ -34,16 +34,16 @@ class CommandGroup(JMSOrgBaseModel):
@lazyproperty
def pattern(self):
content = self.content.replace('\r\n', '\n')
if self.type == 'command':
s = self.construct_command_regex(self.content)
s = self.construct_command_regex(content)
else:
s = r'{0}'.format(self.content)
s = r'{0}'.format(r'{}'.format('|'.join(content.split('\n'))))
return s
@classmethod
def construct_command_regex(cls, content):
regex = []
content = content.replace('\r\n', '\n')
for _cmd in content.split('\n'):
cmd = re.sub(r'\s+', ' ', _cmd)
cmd = re.escape(cmd)
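
What the regex-type change above does (see the "Regular command groups can be filled in with new lines" commit): newline-separated patterns are joined into a single alternation, so each line is matched independently. Standalone illustration, not the project's code:

import re

content = "rm\nshutdown .*"
pattern = "|".join(content.replace("\r\n", "\n").split("\n"))   # -> "rm|shutdown .*"
print(bool(re.search(pattern, "shutdown -h now")))              # True
print(bool(re.search(pattern, "ls -la")))                       # False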

View File

@@ -1,4 +1,4 @@
from orgs.mixins.serializers import BulkOrgResourceModelSerializer
from common.serializers.mixin import CommonBulkModelSerializer
from .base import BaseUserAssetAccountACLSerializer as BaseSerializer
from ..const import ActionChoices
from ..models import ConnectMethodACL
@@ -6,16 +6,15 @@ from ..models import ConnectMethodACL
__all__ = ["ConnectMethodACLSerializer"]
class ConnectMethodACLSerializer(BaseSerializer, BulkOrgResourceModelSerializer):
class ConnectMethodACLSerializer(BaseSerializer, CommonBulkModelSerializer):
class Meta(BaseSerializer.Meta):
model = ConnectMethodACL
fields = [
i for i in BaseSerializer.Meta.fields + ['connect_methods']
if i not in ['assets', 'accounts']
if i not in ['assets', 'accounts', 'org_id']
]
action_choices_exclude = BaseSerializer.Meta.action_choices_exclude + [
ActionChoices.review,
ActionChoices.accept,
ActionChoices.notice,
ActionChoices.face_verify,
ActionChoices.face_online,

View File

@@ -1,7 +1,7 @@
from django.utils.translation import gettext as _
from common.serializers import CommonBulkModelSerializer
from common.serializers import MethodSerializer
from orgs.mixins.serializers import BulkOrgResourceModelSerializer
from .base import BaseUserACLSerializer
from .rules import RuleSerializer
from ..const import ActionChoices
@@ -12,12 +12,12 @@ __all__ = ["LoginACLSerializer"]
common_help_text = _("With * indicating a match all. ")
class LoginACLSerializer(BaseUserACLSerializer, BulkOrgResourceModelSerializer):
class LoginACLSerializer(BaseUserACLSerializer, CommonBulkModelSerializer):
rules = MethodSerializer(label=_('Rule'))
class Meta(BaseUserACLSerializer.Meta):
model = LoginACL
fields = BaseUserACLSerializer.Meta.fields + ['rules', ]
fields = list((set(BaseUserACLSerializer.Meta.fields) | {'rules'}) - {'org_id'})
action_choices_exclude = [
ActionChoices.warning,
ActionChoices.notify_and_warn,

View File

@@ -1,5 +1,7 @@
# coding: utf-8
#
from urllib.parse import urlparse
from django.utils.translation import gettext_lazy as _
from rest_framework import serializers
@@ -8,7 +10,7 @@ from common.utils.ip import is_ip_address, is_ip_network, is_ip_segment
logger = get_logger(__file__)
__all__ = ['RuleSerializer', 'ip_group_child_validator', 'ip_group_help_text']
__all__ = ['RuleSerializer', 'ip_group_child_validator', 'ip_group_help_text', 'address_validator']
def ip_group_child_validator(ip_group_child):
@@ -21,6 +23,19 @@ def ip_group_child_validator(ip_group_child):
raise serializers.ValidationError(error)
def address_validator(value):
parsed = urlparse(value)
is_basic_url = parsed.scheme in ('http', 'https') and parsed.netloc
is_valid = value == '*' \
or is_ip_address(value) \
or is_ip_network(value) \
or is_ip_segment(value) \
or is_basic_url
if not is_valid:
error = _('address invalid: `{}`').format(value)
raise serializers.ValidationError(error)
ip_group_help_text = _(
'With * indicating a match all. '
'Such as: '
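
The URL branch of the new address_validator, shown standalone: a urlparse-based check that the value is a basic http(s) URL; '*' and the IP forms are handled by the separate is_ip_* helpers. Illustrative only:

from urllib.parse import urlparse

def is_basic_url(value: str) -> bool:
    parsed = urlparse(value)
    return parsed.scheme in ("http", "https") and bool(parsed.netloc)

print(is_basic_url("https://example.com/login"))   # True
print(is_basic_url("10.0.0.0/24"))                 # False (covered by the IP checks instead)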

View File

@@ -16,6 +16,7 @@ class CategoryViewSet(ListModelMixin, JMSGenericViewSet):
'types': TypeSerializer,
}
permission_classes = (IsValidUser,)
default_limit = None
def get_queryset(self):
return AllTypes.categories()

View File

@@ -14,6 +14,7 @@ class FavoriteAssetViewSet(BulkModelViewSet):
serializer_class = FavoriteAssetSerializer
permission_classes = (IsValidUser,)
filterset_fields = ['asset']
default_limit = None
def dispatch(self, request, *args, **kwargs):
with tmp_to_root_org():

View File

@@ -7,15 +7,18 @@ from rest_framework.decorators import action
from rest_framework.response import Response
from assets.const import AllTypes
from assets.models import Platform, Node, Asset, PlatformProtocol
from assets.models import Platform, Node, Asset, PlatformProtocol, PlatformAutomation
from assets.serializers import PlatformSerializer, PlatformProtocolSerializer, PlatformListSerializer
from common.api import JMSModelViewSet
from common.permissions import IsValidUser
from common.serializers import GroupedChoiceSerializer
from rbac.models import RoleBinding
__all__ = ['AssetPlatformViewSet', 'PlatformAutomationMethodsApi', 'PlatformProtocolViewSet']
class PlatformFilter(filters.FilterSet):
name__startswith = filters.CharFilter(field_name='name', lookup_expr='istartswith')
@@ -40,6 +43,7 @@ class AssetPlatformViewSet(JMSModelViewSet):
'ops_methods': 'assets.view_platform',
'filter_nodes_assets': 'assets.view_platform',
}
default_limit = None
def get_queryset(self):
# This does not go through the pagination logic, so prefetch is needed here
@@ -63,6 +67,13 @@ class AssetPlatformViewSet(JMSModelViewSet):
return super().get_object()
return self.get_queryset().get(name=pk)
def check_permissions(self, request):
if self.action == 'list' and RoleBinding.is_org_admin(request.user):
return True
else:
return super().check_permissions(request)
def check_object_permissions(self, request, obj):
if request.method.lower() in ['delete', 'put', 'patch'] and obj.internal:
self.permission_denied(
@@ -102,6 +113,7 @@ class PlatformProtocolViewSet(JMSModelViewSet):
class PlatformAutomationMethodsApi(generics.ListAPIView):
permission_classes = (IsValidUser,)
queryset = PlatformAutomation.objects.none()
@staticmethod
def automation_methods():

View File

@@ -1,8 +1,8 @@
from rest_framework.generics import ListAPIView
from assets import serializers
from assets.const import Protocol
from common.permissions import IsValidUser
from assets.models import Protocol
__all__ = ['ProtocolListApi']
@@ -13,3 +13,13 @@ class ProtocolListApi(ListAPIView):
def get_queryset(self):
return list(Protocol.protocols())
def filter_queryset(self, queryset):
search = self.request.query_params.get("search", "").lower().strip()
if not search:
return queryset
queryset = [
p for p in queryset
if search in p['label'].lower() or search in p['value'].lower()
]
return queryset

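Since Protocol.protocols() now returns label/value dicts (see the Protocol change later in this diff), the in-memory search above is a plain case-insensitive substring match. A small standalone illustration; the protocol entries below are only examples:

def filter_protocols(protocols, search):
    # Mirror of the filtering above: an empty search returns everything,
    # otherwise keep entries whose label or value contains the term.
    search = (search or "").lower().strip()
    if not search:
        return protocols
    return [
        p for p in protocols
        if search in p['label'].lower() or search in p['value'].lower()
    ]


demo = [{'label': 'SSH', 'value': 'ssh'}, {'label': 'RDP', 'value': 'rdp'}]
print(filter_protocols(demo, 'ss'))   # [{'label': 'SSH', 'value': 'ssh'}]
print(filter_protocols(demo, ''))     # both entries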
View File

@@ -161,6 +161,7 @@ class CategoryTreeApi(SerializeToTreeNodeMixin, generics.ListAPIView):
'GET': 'assets.view_asset',
'list': 'assets.view_asset',
}
queryset = Node.objects.none()
def get_assets(self):
key = self.request.query_params.get('key')

View File

@@ -123,9 +123,7 @@ class BaseManager:
self.execution.summary = self.summary
self.execution.result = self.result
self.execution.status = self.status
with safe_atomic_db_connection():
self.execution.save()
self.execution.save()
def print_summary(self):
content = "\nSummary: \n"
@@ -157,7 +155,7 @@ class BaseManager:
report = self.gen_report()
report = transform(report, cssutils_logging_level="CRITICAL")
subject = self.get_report_subject()
emails = [r.email for r in recipients if r.email]
emails = [user.email]
send_mail_async(subject, report, emails, html_message=report)
def gen_report(self):
@@ -167,9 +165,10 @@ class BaseManager:
return data
def post_run(self):
self.update_execution()
self.print_summary()
self.send_report_if_need()
with safe_atomic_db_connection():
self.update_execution()
self.print_summary()
self.send_report_if_need()
def run(self, *args, **kwargs):
self.pre_run()
@@ -202,14 +201,17 @@ class PlaybookPrepareMixin:
def __init__(self, *args, **kwargs):
super().__init__(*args, **kwargs)
# example: {'gather_fact_windows': {'id': 'gather_fact_windows', 'name': '', 'method': 'gather_fact', ...} }
self.method_id_meta_mapper = {
self.method_id_meta_mapper = self.get_method_id_meta_mapper()
# Group by execution method; change-secret, push and similar operations on different assets may use different execution methods
# Then group by execution method, and within each group by bulk_size, to generate separate playbooks
self.playbooks = []
def get_method_id_meta_mapper(self):
return {
method["id"]: method
for method in self.platform_automation_methods
if method["method"] == self.__class__.method_type()
}
# Group by execution method; change-secret, push and similar operations on different assets may use different execution methods
# Then group by execution method, and within each group by bulk_size, to generate separate playbooks
self.playbooks = []
@classmethod
def method_type(cls):
@@ -548,7 +550,8 @@ class BasePlaybookManager(PlaybookPrepareMixin, BaseManager):
try:
kwargs.update({"clean_workspace": False})
cb = runner.run(**kwargs)
self.on_runner_success(runner, cb)
with safe_atomic_db_connection():
self.on_runner_success(runner, cb)
except Exception as e:
self.on_runner_failed(runner, e, **info)
finally:

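Both post_run and the runner success callback are now wrapped in safe_atomic_db_connection, presumably so that database writes issued after a long-running playbook do not land on a connection the server has already closed. The helper's implementation is not part of this diff; the sketch below only shows one common way to write such a guard in Django and is not the project's code:

from contextlib import contextmanager

from django.db import close_old_connections


@contextmanager
def safe_db_connection_sketch():
    # Hypothetical stand-in for safe_atomic_db_connection: discard connections
    # that exceeded CONN_MAX_AGE or became unusable, before and after the block.
    # Assumes a configured Django environment.
    close_old_connections()
    try:
        yield
    finally:
        close_old_connections()


# Usage, for example inside a manager's post_run:
# with safe_db_connection_sketch():
#     execution.save()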
View File

@@ -11,15 +11,20 @@ class FormatAssetInfo:
@staticmethod
def get_cpu_model_count(cpus):
try:
models = [cpus[i + 1] + " " + cpus[i + 2] for i in range(0, len(cpus), 3)]
if len(cpus) % 3 == 0:
step = 3
models = [cpus[i + 2] for i in range(0, len(cpus), step)]
elif len(cpus) % 2 == 0:
step = 2
models = [cpus[i + 1] for i in range(0, len(cpus), step)]
else:
raise ValueError("CPU list format not recognized")
model_counts = Counter(models)
result = ', '.join([f"{model} x{count}" for model, count in model_counts.items()])
except Exception as e:
print(f"Error processing CPU model list: {e}")
result = ''
return result
@staticmethod

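The reworked get_cpu_model_count distinguishes two list layouts (length divisible by 3 versus by 2) before aggregating with Counter. A self-contained rehearsal of the same aggregation follows; the sample CPU list and its assumed layout are invented for illustration:

from collections import Counter


def cpu_model_count(cpus):
    # Same branching as above: stride 3 takes the third field of each triple,
    # stride 2 takes the second field of each pair.
    if len(cpus) % 3 == 0:
        models = [cpus[i + 2] for i in range(0, len(cpus), 3)]
    elif len(cpus) % 2 == 0:
        models = [cpus[i + 1] for i in range(0, len(cpus), 2)]
    else:
        raise ValueError("CPU list format not recognized")
    counts = Counter(models)
    return ', '.join(f"{model} x{count}" for model, count in counts.items())


# Hypothetical gathered-facts output: [index, vendor, model] repeated per core.
print(cpu_model_count(['0', 'GenuineIntel', 'Xeon E5-2680',
                       '1', 'GenuineIntel', 'Xeon E5-2680']))
# -> Xeon E5-2680 x2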
View File

@@ -3,7 +3,8 @@
vars:
ansible_shell_type: sh
ansible_connection: local
ansible_python_interpreter: /opt/py3/bin/python
ansible_python_interpreter: "{{ local_python_interpreter }}"
ansible_timeout: 30
tasks:
- name: Test asset connection (pyfreerdp)

View File

@@ -4,7 +4,7 @@
ansible_connection: local
ansible_shell_type: sh
ansible_become: false
ansible_timeout: 30
tasks:
- name: Test asset connection (paramiko)
ssh_ping:

View File

@@ -3,7 +3,7 @@
vars:
ansible_connection: local
ansible_shell_type: sh
ansible_timeout: 30
tasks:
- name: Test asset connection (telnet)
telnet_ping:

View File

@@ -2,6 +2,7 @@
gather_facts: no
vars:
ansible_python_interpreter: "{{ local_python_interpreter }}"
ansible_timeout: 30
tasks:
- name: Test MongoDB connection
@@ -16,3 +17,5 @@
ssl_certfile: "{{ jms_asset.secret_info.client_key | default('') }}"
connection_options:
- tlsAllowInvalidHostnames: "{{ jms_asset.spec_info.allow_invalid_cert}}"
register: result
failed_when: not result.is_available

View File

@@ -6,6 +6,7 @@
ca_cert: "{{ jms_asset.secret_info.ca_cert | default('') }}"
ssl_cert: "{{ jms_asset.secret_info.client_cert | default('') }}"
ssl_key: "{{ jms_asset.secret_info.client_key | default('') }}"
ansible_timeout: 30
tasks:
- name: Test MySQL connection

View File

@@ -2,6 +2,7 @@
gather_facts: no
vars:
ansible_python_interpreter: "{{ local_python_interpreter }}"
ansible_timeout: 30
tasks:
- name: Test Oracle connection

View File

@@ -6,7 +6,7 @@
ca_cert: "{{ jms_asset.secret_info.ca_cert | default('') }}"
ssl_cert: "{{ jms_asset.secret_info.client_cert | default('') }}"
ssl_key: "{{ jms_asset.secret_info.client_key | default('') }}"
ansible_timeout: 30
tasks:
- name: Test PostgreSQL connection
community.postgresql.postgresql_ping:

View File

@@ -2,6 +2,7 @@
gather_facts: no
vars:
ansible_python_interpreter: "{{ local_python_interpreter }}"
ansible_timeout: 30
tasks:
- name: Test SQLServer connection

View File

@@ -0,0 +1,13 @@
- hosts: website
gather_facts: no
vars:
ansible_python_interpreter: "{{ local_python_interpreter }}"
tasks:
- name: Test Website connection
website_ping:
login_host: "{{ jms_asset.address }}"
login_user: "{{ jms_account.username }}"
login_password: "{{ jms_account.secret }}"
steps: "{{ params.steps }}"
load_state: "{{ params.load_state }}"

View File

@@ -0,0 +1,50 @@
id: website_ping
name: "{{ 'Website ping' | trans }}"
method: ping
category:
- web
type:
- website
params:
- name: load_state
type: choice
label: "{{ 'Load state' | trans }}"
choices:
- [ networkidle, "{{ 'Network idle' | trans }}" ]
- [ domcontentloaded, "{{ 'Dom content loaded' | trans }}" ]
- [ load, "{{ 'Load completed' | trans }}" ]
default: 'load'
- name: steps
type: list
default: []
label: "{{ 'Steps' | trans }}"
help_text: "{{ 'Params step help text' | trans }}"
i18n:
Website ping:
zh: 使用 Playwright 模拟浏览器测试可连接性
en: Use Playwright to simulate a browser for connectivity testing
ja: Playwright を使用してブラウザをシミュレートし、接続性テストを実行する
Load state:
zh: 加载状态检测
en: Load state detection
ja: ロード状態の検出
Steps:
zh: 步骤
en: Steps
ja: 手順
Network idle:
zh: 网络空闲
en: Network idle
ja: ネットワークが空いた状態
Dom content loaded:
zh: 文档内容加载完成
en: Dom content loaded
ja: ドキュメントの内容がロードされた状態
Load completed:
zh: 全部加载完成
en: All load completed
ja: すべてのロードが完了した状態
Params step help text:
zh: 配置步骤,根据配置决定任务执行步骤
ja: パラメータを設定し、設定に基づいてタスクの実行手順を決定します
en: Configure steps, and determine the task execution steps based on the configuration.

View File

@@ -14,6 +14,10 @@ class Connectivity(TextChoices):
NTLM_ERR = 'ntlm_err', _('NTLM credentials rejected error')
CREATE_TEMPORARY_ERR = 'create_temp_err', _('Create temporary error')
@classmethod
def as_dict(cls):
return {choice.value: choice.label for choice in cls}
class AutomationTypes(TextChoices):
ping = 'ping', _('Ping')

View File

@@ -20,3 +20,7 @@ class Category(ChoicesMixin, models.TextChoices):
_category = getattr(cls, category.upper(), None)
choices = [(_category.value, _category.label)] if _category else cls.choices
return choices
@classmethod
def as_dict(cls):
return {choice.value: choice.label for choice in cls}

View File

@@ -194,6 +194,12 @@ class Protocol(ChoicesMixin, models.TextChoices):
'default': '>=2014',
'label': _('Version'),
'help_text': _('SQL Server version, Different versions have different connection drivers')
},
'encrypt': {
'type': 'bool',
'default': True,
'label': _('Encrypt'),
'help_text': _('Whether to use TLS encryption.')
}
}
},
@@ -343,7 +349,7 @@ class Protocol(ChoicesMixin, models.TextChoices):
for protocol, config in cls.settings().items():
if not xpack_enabled and config.get('xpack', False):
continue
protocols.append(protocol)
protocols.append({'label': protocol.label, 'value': protocol.value})
from assets.models.platform import PlatformProtocol
custom_protocols = (

View File

@@ -20,13 +20,17 @@ class WebTypes(BaseType):
def _get_automation_constrains(cls) -> dict:
constrains = {
'*': {
'ansible_enabled': False,
'ping_enabled': False,
'ansible_enabled': True,
'ansible_config': {
'ansible_connection': 'local',
},
'ping_enabled': True,
'gather_facts_enabled': False,
'verify_account_enabled': False,
'change_secret_enabled': False,
'verify_account_enabled': True,
'change_secret_enabled': True,
'push_account_enabled': False,
'gather_accounts_enabled': False,
'remove_account_enabled': False,
}
}
return constrains

View File

@@ -112,7 +112,7 @@ class Protocol(models.Model):
return protocols[0] if len(protocols) > 0 else {}
@property
def setting(self):
def setting(self) -> dict:
if self._setting is not None:
return self._setting
return self.asset_platform_protocol.get('setting', {})
@@ -122,7 +122,7 @@ class Protocol(models.Model):
self._setting = value
@property
def public(self):
def public(self) -> bool:
return self.asset_platform_protocol.get('public', True)
@@ -210,7 +210,7 @@ class Asset(NodesRelationMixin, LabeledMixin, AbsConnectivity, JSONFilterMixin,
return self.category == const.Category.DS and hasattr(self, 'ds')
@lazyproperty
def spec_info(self):
def spec_info(self) -> dict:
instance = getattr(self, self.category, None)
if not instance:
return {}
@@ -240,7 +240,7 @@ class Asset(NodesRelationMixin, LabeledMixin, AbsConnectivity, JSONFilterMixin,
return info
@lazyproperty
def auto_config(self):
def auto_config(self) -> dict:
platform = self.platform
auto_config = {
'su_enabled': platform.su_enabled,
@@ -343,11 +343,11 @@ class Asset(NodesRelationMixin, LabeledMixin, AbsConnectivity, JSONFilterMixin,
return names
@lazyproperty
def type(self):
def type(self) -> str:
return self.platform.type
@lazyproperty
def category(self):
def category(self) -> str:
return self.platform.category
def is_category(self, category):

View File

@@ -53,7 +53,7 @@ class BaseAutomation(PeriodTaskModelMixin, JMSOrgBaseModel):
return name
def get_all_assets(self):
nodes = self.nodes.all()
nodes = self.nodes.only("id", "key")
node_asset_ids = Node.get_nodes_all_assets(*nodes).values_list("id", flat=True)
direct_asset_ids = self.assets.all().values_list("id", flat=True)
asset_ids = set(list(direct_asset_ids) + list(node_asset_ids))

View File

@@ -573,7 +573,7 @@ class Node(JMSOrgBaseModel, SomeNodesMixin, FamilyMixin, NodeAssetsMixin):
return not self.__gt__(other)
@property
def name(self):
def name(self) -> str:
return self.value
def computed_full_value(self):

View File

@@ -25,7 +25,7 @@ class PlatformProtocol(models.Model):
return '{}/{}'.format(self.name, self.port)
@property
def secret_types(self):
def secret_types(self) -> list:
return Protocol.settings().get(self.name, {}).get('secret_types', ['password'])
@lazyproperty

View File

@@ -147,6 +147,7 @@ class AssetSerializer(BulkOrgResourceModelSerializer, ResourceLabelsMixin, Writa
protocols = AssetProtocolsSerializer(many=True, required=False, label=_('Protocols'), default=())
accounts = AssetAccountSerializer(many=True, required=False, allow_null=True, write_only=True, label=_('Accounts'))
nodes_display = NodeDisplaySerializer(read_only=False, required=False, label=_("Node path"))
auto_config = serializers.DictField(read_only=True, label=_('Auto info'))
platform = ObjectRelatedField(queryset=Platform.objects, required=True, label=_('Platform'),
attrs=('id', 'name', 'type'))
accounts_amount = serializers.IntegerField(read_only=True, label=_('Accounts amount'))
@@ -425,6 +426,18 @@ class DetailMixin(serializers.Serializer):
gathered_info = MethodSerializer(label=_('Gathered info'), read_only=True)
auto_config = serializers.DictField(read_only=True, label=_('Auto info'))
@staticmethod
def get_auto_config(obj) -> dict:
return obj.auto_config
@staticmethod
def get_gathered_info(obj) -> dict:
return obj.gathered_info
@staticmethod
def get_spec_info(obj) -> dict:
return obj.spec_info
def get_instance(self):
request = self.context.get('request')
if not self.instance and UUID_PATTERN.findall(request.path):

View File

@@ -1,10 +1,11 @@
from django.db.models import QuerySet
from django.utils.translation import gettext_lazy as _
from django.utils.translation import gettext_lazy as _, get_language
from assets.models import Custom, Platform, Asset
from common.const import UUID_PATTERN
from common.serializers import create_serializer_class
from common.serializers.common import DictSerializer, MethodSerializer
from terminal.models import Applet
from .common import AssetSerializer
__all__ = ['CustomSerializer']
@@ -47,8 +48,38 @@ class CustomSerializer(AssetSerializer):
if not platform:
return default_field
custom_fields = platform.custom_fields
if not custom_fields:
return default_field
name = platform.name.title() + 'CustomSerializer'
applet = Applet.objects.filter(
name=platform.created_by.replace('Applet:', '')
).first()
if not applet:
return create_serializer_class(name, custom_fields)()
i18n = applet.manifest.get('i18n', {})
lang = get_language()
lang_short = lang[:2]
def translate_text(key):
return (
i18n.get(key, {}).get(lang)
or i18n.get(key, {}).get(lang_short)
or key
)
for field in custom_fields:
label = field.get('label')
help_text = field.get('help_text')
if label:
field['label'] = translate_text(label)
if help_text:
field['help_text'] = translate_text(help_text)
return create_serializer_class(name, custom_fields)()

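translate_text above tries the full language code first (for example zh-hans), then its two-letter prefix, then falls back to the key itself. A standalone sketch of that lookup; the i18n mapping reuses the Load state entry shaped like the manifest shown earlier in this diff:

def translate_label(i18n, key, lang):
    # Fallback chain: exact language code, two-letter prefix, then the key itself.
    entry = i18n.get(key, {})
    return entry.get(lang) or entry.get(lang[:2]) or key


i18n = {'Load state': {'zh': '加载状态检测', 'ja': 'ロード状態の検出'}}
print(translate_label(i18n, 'Load state', 'zh-hans'))  # 加载状态检测 (via 'zh')
print(translate_label(i18n, 'Load state', 'fr'))       # Load state (key fallback)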
View File

@@ -19,11 +19,13 @@ __all__ = [
class BaseAutomationSerializer(PeriodTaskSerializerMixin, BulkOrgResourceModelSerializer):
assets = ObjectRelatedField(many=True, required=False, queryset=Asset.objects, label=_('Assets'))
nodes = ObjectRelatedField(many=True, required=False, queryset=Node.objects, label=_('Nodes'))
executed_amount = serializers.IntegerField(read_only=True, label=_('Executed amount'))
class Meta:
read_only_fields = [
'date_created', 'date_updated', 'created_by',
'periodic_display', 'executed_amount', 'type', 'last_execution_date'
'periodic_display', 'executed_amount', 'type',
'last_execution_date',
]
mini_fields = [
'id', 'name', 'type', 'is_periodic', 'interval',

View File

@@ -32,7 +32,7 @@ from rbac.permissions import RBACPermission
from terminal.models import default_storage
from users.models import User
from .backends import TYPE_ENGINE_MAPPING
from .const import ActivityChoices
from .const import ActivityChoices, ActionChoices
from .filters import UserSessionFilterSet, OperateLogFilterSet
from .models import (
FTPLog, UserLoginLog, OperateLog, PasswordChangeLog,
@@ -45,7 +45,7 @@ from .serializers import (
FileSerializer, UserSessionSerializer, JobsAuditSerializer,
ServiceAccessLogSerializer
)
from .utils import construct_userlogin_usernames
from .utils import construct_userlogin_usernames, record_operate_log_and_activity_log
logger = get_logger(__name__)
@@ -126,6 +126,11 @@ class FTPLogViewSet(OrgModelViewSet):
response['Content-Type'] = 'application/octet-stream'
filename = escape_uri_path(ftp_log.filename)
response["Content-Disposition"] = "attachment; filename*=UTF-8''{}".format(filename)
record_operate_log_and_activity_log(
[ftp_log.id], ActionChoices.download, '', self.model,
resource_display=f'{ftp_log.asset}: {ftp_log.filename}',
)
return response
@action(methods=[POST], detail=True, permission_classes=[IsServiceAccount, ], serializer_class=FileSerializer)
@@ -167,10 +172,7 @@ class UserLoginLogViewSet(UserLoginCommonMixin, OrgReadonlyModelViewSet):
def get_queryset(self):
queryset = super().get_queryset()
if current_org.is_root() or not settings.XPACK_ENABLED:
return queryset
users = self.get_org_member_usernames()
queryset = queryset.filter(username__in=users)
queryset = queryset.model.filter_queryset_by_org(queryset)
return queryset
@@ -292,12 +294,7 @@ class PasswordChangeLogViewSet(OrgReadonlyModelViewSet):
def get_queryset(self):
queryset = super().get_queryset()
if not current_org.is_root():
users = current_org.get_members()
queryset = queryset.filter(
user__in=[str(user) for user in users]
)
return queryset
return self.model.filter_queryset_by_org(queryset)
class UserSessionViewSet(CommonApiMixin, viewsets.ModelViewSet):

View File

@@ -35,6 +35,7 @@ class OperateLogStore(ES, metaclass=Singleton):
}
}
exact_fields = {}
fuzzy_fields = {}
match_fields = {
'id', 'user', 'action', 'resource_type',
'resource', 'remote_addr', 'org_id'
@@ -44,7 +45,7 @@ class OperateLogStore(ES, metaclass=Singleton):
}
if not config.get('INDEX'):
config['INDEX'] = 'jumpserver_operate_log'
super().__init__(config, properties, keyword_fields, exact_fields, match_fields)
super().__init__(config, properties, keyword_fields, exact_fields, fuzzy_fields, match_fields)
self.pre_use_check()
@staticmethod

View File

@@ -1,6 +1,6 @@
import os
import uuid
from datetime import timedelta
from datetime import timedelta, datetime
from importlib import import_module
from django.conf import settings
@@ -40,7 +40,7 @@ __all__ = [
class JobLog(JobExecution):
@property
def creator_name(self):
def creator_name(self) -> str:
return self.creator.name
class Meta:
@@ -73,6 +73,9 @@ class FTPLog(OrgModelMixin):
models.Index(fields=['date_start', 'org_id'], name='idx_date_start_org'),
]
def __str__(self):
return "{0.id} of {0.user} to {0.asset}".format(self)
@property
def filepath(self):
return os.path.join(self.upload_to, self.date_start.strftime('%Y-%m-%d'), str(self.id))
@@ -186,6 +189,15 @@ class PasswordChangeLog(models.Model):
class Meta:
verbose_name = _("Password change log")
@staticmethod
def filter_queryset_by_org(queryset):
if not current_org.is_root():
users = current_org.get_members()
queryset = queryset.filter(
user__in=[str(user) for user in users]
)
return queryset
class UserLoginLog(models.Model):
id = models.UUIDField(default=uuid.uuid4, primary_key=True)
@@ -220,7 +232,7 @@ class UserLoginLog(models.Model):
return '%s(%s)' % (self.username, self.city)
@property
def backend_display(self):
def backend_display(self) -> str:
return gettext(self.backend)
@classmethod
@@ -246,7 +258,7 @@ class UserLoginLog(models.Model):
return login_logs
@property
def reason_display(self):
def reason_display(self) -> str:
from authentication.errors import reason_choices, old_reason_choices
reason = reason_choices.get(self.reason)
@@ -255,6 +267,15 @@ class UserLoginLog(models.Model):
reason = old_reason_choices.get(self.reason, self.reason)
return reason
@staticmethod
def filter_queryset_by_org(queryset):
from audits.utils import construct_userlogin_usernames
if current_org.is_root() or not settings.XPACK_ENABLED:
return queryset
user_queryset = current_org.get_members()
users = construct_userlogin_usernames(user_queryset)
return queryset.filter(username__in=users)
class Meta:
ordering = ["-datetime", "username"]
verbose_name = _("User login log")
@@ -279,15 +300,15 @@ class UserSession(models.Model):
return '%s(%s)' % (self.user, self.ip)
@property
def backend_display(self):
def backend_display(self) -> str:
return gettext(self.backend)
@property
def is_active(self):
def is_active(self) -> bool:
return user_session_manager.check_active(self.key)
@property
def date_expired(self):
def date_expired(self) -> datetime:
session_store_cls = import_module(settings.SESSION_ENGINE).SessionStore
session_store = session_store_cls(session_key=self.key)
cache_key = session_store.cache_key

View File

@@ -119,11 +119,11 @@ class OperateLogSerializer(BulkOrgResourceModelSerializer):
fields = fields_small
@staticmethod
def get_resource_type(instance):
def get_resource_type(instance) -> str:
return _(instance.resource_type)
@staticmethod
def get_resource(instance):
def get_resource(instance) -> str:
return i18n_trans(instance.resource)
@@ -147,11 +147,11 @@ class ActivityUnionLogSerializer(serializers.Serializer):
r_type = serializers.CharField(read_only=True)
@staticmethod
def get_timestamp(obj):
def get_timestamp(obj) -> str:
return as_current_tz(obj['datetime']).strftime('%Y-%m-%d %H:%M:%S')
@staticmethod
def get_content(obj):
def get_content(obj) -> str:
if not obj['r_detail']:
action = obj['r_action'].replace('_', ' ').capitalize()
ctn = _('%s %s this resource') % (obj['r_user'], _(action).lower())
@@ -160,7 +160,7 @@ class ActivityUnionLogSerializer(serializers.Serializer):
return ctn
@staticmethod
def get_detail_url(obj):
def get_detail_url(obj) -> str:
detail_url = ''
detail_id, obj_type = obj['r_detail_id'], obj['r_type']
if not detail_id:
@@ -210,7 +210,7 @@ class UserSessionSerializer(serializers.ModelSerializer):
"backend_display": {"label": _("Auth backend display")},
}
def get_is_current_user_session(self, obj):
def get_is_current_user_session(self, obj) -> bool:
request = self.context.get('request')
if not request:
return False

View File

@@ -89,6 +89,8 @@ def create_activities(resource_ids, detail, detail_id, action, org_id):
for activity in activities:
create_activity(activity)
create_activity.finish()
@signals.after_task_publish.connect
def after_task_publish_for_activity_log(headers=None, body=None, **kwargs):

View File

@@ -180,7 +180,7 @@ def on_django_start_set_operate_log_monitor_models(sender, **kwargs):
'PlatformAutomation', 'PlatformProtocol', 'Protocol',
'HistoricalAccount', 'GatheredUser', 'ApprovalRule',
'BaseAutomation', 'CeleryTask', 'Command', 'JobLog',
'ConnectionToken', 'SessionJoinRecord',
'ConnectionToken', 'SessionJoinRecord', 'SessionSharing',
'HistoricalJob', 'Status', 'TicketStep', 'Ticket',
'UserAssetGrantedTreeNodeRelation', 'TicketAssignee',
'SuperTicket', 'SuperConnectionToken', 'AdminConnectionToken', 'PermNode',

View File

@@ -2,18 +2,19 @@
#
import datetime
import os
import subprocess
from celery import shared_task
from django.conf import settings
from django.core.files.storage import default_storage
from django.db import transaction
from django.utils import timezone
from django.utils._os import safe_join
from django.utils.translation import gettext_lazy as _
from common.const.crontab import CRONTAB_AT_AM_TWO
from common.storage.ftp_file import FTPFileStorageHandler
from common.utils import get_log_keep_day, get_logger
from common.utils.safe import safe_run_cmd
from ops.celery.decorator import register_as_period_task
from ops.models import CeleryTaskExecution
from orgs.utils import tmp_to_root_org
@@ -57,14 +58,12 @@ def clean_ftp_log_period():
now = timezone.now()
days = get_log_keep_day('FTP_LOG_KEEP_DAYS')
expired_day = now - datetime.timedelta(days=days)
file_store_dir = os.path.join(default_storage.base_location, FTPLog.upload_to)
file_store_dir = safe_join(default_storage.base_location, FTPLog.upload_to)
FTPLog.objects.filter(date_start__lt=expired_day).delete()
command = "find %s -mtime +%s -type f -exec rm -f {} \\;" % (
file_store_dir, days
)
subprocess.call(command, shell=True)
command = "find %s -type d -empty -delete;" % file_store_dir
subprocess.call(command, shell=True)
command = "find %s -mtime +%s -type f -exec rm -f {} \\;"
safe_run_cmd(command, (file_store_dir, days))
command = "find %s -type d -empty -delete;"
safe_run_cmd(command, (file_store_dir,))
logger.info("Clean FTP file done")
@@ -76,12 +75,11 @@ def clean_celery_tasks_period():
tasks.delete()
tasks = CeleryTaskExecution.objects.filter(date_start__isnull=True)
tasks.delete()
command = "find %s -mtime +%s -name '*.log' -type f -exec rm -f {} \\;" % (
settings.CELERY_LOG_DIR, expire_days
)
subprocess.call(command, shell=True)
command = "echo > {}".format(os.path.join(settings.LOG_DIR, 'celery.log'))
subprocess.call(command, shell=True)
command = "find %s -mtime +%s -name '*.log' -type f -exec rm -f {} \\;"
safe_run_cmd(command, (settings.CELERY_LOG_DIR, expire_days))
celery_log_path = safe_join(settings.LOG_DIR, 'celery.log')
command = "echo > %s"
safe_run_cmd(command, (celery_log_path,))
def batch_delete(queryset, batch_size=3000):
@@ -119,15 +117,15 @@ def clean_expired_session_period():
expired_sessions = Session.objects.filter(date_start__lt=expire_date)
timestamp = expire_date.timestamp()
expired_commands = Command.objects.filter(timestamp__lt=timestamp)
replay_dir = os.path.join(default_storage.base_location, 'replay')
replay_dir = safe_join(default_storage.base_location, 'replay')
batch_delete(expired_sessions)
logger.info("Clean session item done")
batch_delete(expired_commands)
logger.info("Clean session command done")
remove_files_by_days(replay_dir, days)
command = "find %s -type d -empty -delete;" % replay_dir
subprocess.call(command, shell=True)
command = "find %s -type d -empty -delete;"
safe_run_cmd(command, (replay_dir,))
logger.info("Clean session replay done")

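The cleanup tasks now route their find and echo invocations through safe_run_cmd, passing the path and day count as arguments instead of interpolating them into a shell string. safe_run_cmd itself (common.utils.safe) is not shown in this diff; the sketch below is only a hypothetical illustration of the "safely quoted" idea from the commit message, not the project's implementation:

import shlex
import subprocess


def safe_run_cmd_sketch(template, args=()):
    # Hypothetical: quote every argument before substituting it into the
    # template, then hand the result to a shell. Not common.utils.safe.
    quoted = tuple(shlex.quote(str(a)) for a in args)
    return subprocess.run(['sh', '-c', template % quoted], check=False)


# Example (commented out to avoid touching the filesystem):
# safe_run_cmd_sketch("find %s -mtime +%s -type f -exec rm -f {} \\;",
#                     ("/data/ftp_files", 30))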
View File

@@ -1,5 +1,6 @@
import copy
from datetime import datetime
from itertools import chain
from django.contrib.contenttypes.fields import GenericForeignKey, GenericRelation
from django.core.exceptions import ObjectDoesNotExist
@@ -7,7 +8,6 @@ from django.db import models
from django.db.models import F, Value, CharField
from django.db.models.functions import Concat
from django.utils import translation
from itertools import chain
from common.db.fields import RelatedManager
from common.utils import validate_ip, get_ip_city, get_logger
@@ -16,7 +16,6 @@ from .const import DEFAULT_CITY, ActivityChoices as LogChoice
from .handler import create_or_update_operate_log
from .models import ActivityLog
logger = get_logger(__name__)
@@ -151,7 +150,7 @@ def record_operate_log_and_activity_log(ids, action, detail, model, **kwargs):
org_id = current_org.id
with translation.override('en'):
resource_type = model._meta.verbose_name
resource_type = kwargs.pop('resource_type', None) or model._meta.verbose_name
create_or_update_operate_log(action, resource_type, force=True, **kwargs)
base_data = {'type': LogChoice.operate_log, 'detail': detail, 'org_id': org_id}
activities = [ActivityLog(resource_id=r_id, **base_data) for r_id in ids]

View File

@@ -37,6 +37,7 @@ class UserConfirmationViewSet(JMSGenericViewSet):
backend_classes = ConfirmType.get_prop_backends(confirm_type)
if not backend_classes:
return
for backend_cls in backend_classes:
backend = backend_cls(self.request.user, self.request)
if not backend.check():
@@ -69,6 +70,7 @@ class UserConfirmationViewSet(JMSGenericViewSet):
ok, msg = backend.authenticate(secret_key, mfa_type)
if ok:
request.session['CONFIRM_LEVEL'] = ConfirmType.values.index(confirm_type) + 1
request.session['CONFIRM_TYPE'] = confirm_type
request.session['CONFIRM_TIME'] = int(time.time())
return Response('ok')
return Response({'error': msg}, status=400)

View File

@@ -618,6 +618,8 @@ class SuperConnectionTokenViewSet(ConnectionTokenViewSet):
token_id = request.data.get('id') or ''
token = ConnectionToken.get_typed_connection_token(token_id)
if not token:
raise PermissionDenied('Token {} is not valid'.format(token_id))
token.is_valid()
serializer = self.get_serializer(instance=token)

View File

@@ -14,7 +14,6 @@ from rest_framework.response import Response
from authentication.errors import ACLError
from common.api import JMSGenericViewSet
from common.const.http import POST, GET
from common.permissions import OnlySuperUser
from common.serializers import EmptySerializer
from common.utils import reverse, safe_next_url
from common.utils.timezone import utc_now
@@ -38,8 +37,11 @@ class SSOViewSet(AuthMixin, JMSGenericViewSet):
'login_url': SSOTokenSerializer,
'login': EmptySerializer
}
rbac_perms = {
'login_url': 'authentication.add_ssotoken',
}
@action(methods=[POST], detail=False, permission_classes=[OnlySuperUser], url_path='login-url')
@action(methods=[POST], detail=False, url_path='login-url')
def login_url(self, request, *args, **kwargs):
if not settings.AUTH_SSO:
raise SSOAuthClosed()

View File

@@ -1,9 +1,9 @@
from django.contrib.auth.backends import ModelBackend
from django.contrib.auth import get_user_model
from django.contrib.auth.backends import ModelBackend
from django.views import View
from users.models import User
from common.utils import get_logger
from users.models import User
UserModel = get_user_model()
logger = get_logger(__file__)
@@ -61,4 +61,13 @@ class JMSBaseAuthBackend:
class JMSModelBackend(JMSBaseAuthBackend, ModelBackend):
pass
def user_can_authenticate(self, user):
return True
class BaseAuthCallbackClientView(View):
http_method_names = ['get']
def get(self, request):
from authentication.views.utils import redirect_to_guard_view
return redirect_to_guard_view(query_string='next=client')

View File

@@ -1,14 +1,51 @@
# -*- coding: utf-8 -*-
#
from django_cas_ng.backends import CASBackend as _CASBackend
from django.conf import settings
import threading
from django.conf import settings
from django.contrib.auth import get_user_model
from django_cas_ng.backends import CASBackend as _CASBackend
from common.utils import get_logger
from ..base import JMSBaseAuthBackend
__all__ = ['CASBackend']
__all__ = ['CASBackend', 'CASUserDoesNotExist']
logger = get_logger(__name__)
class CASUserDoesNotExist(Exception):
"""Exception raised when a CAS user does not exist."""
pass
class CASBackend(JMSBaseAuthBackend, _CASBackend):
@staticmethod
def is_enabled():
return settings.AUTH_CAS
def authenticate(self, request, ticket, service):
UserModel = get_user_model()
manager = UserModel._default_manager
original_get_by_natural_key = manager.get_by_natural_key
thread_local = threading.local()
thread_local.thread_id = threading.get_ident()
logger.debug(f"CASBackend.authenticate: thread_id={thread_local.thread_id}")
def get_by_natural_key(self, username):
logger.debug(f"CASBackend.get_by_natural_key: thread_id={threading.get_ident()}, username={username}")
if threading.get_ident() != thread_local.thread_id:
return original_get_by_natural_key(username)
try:
user = original_get_by_natural_key(username)
except UserModel.DoesNotExist:
raise CASUserDoesNotExist(username)
return user
try:
manager.get_by_natural_key = get_by_natural_key.__get__(manager, type(manager))
user = super().authenticate(request, ticket=ticket, service=service)
finally:
manager.get_by_natural_key = original_get_by_natural_key
return user

View File

@@ -1,23 +1,33 @@
from django.core.exceptions import PermissionDenied
from django.http import HttpResponseRedirect
from django.views.generic import View
from django.utils.translation import gettext_lazy as _
from django_cas_ng.views import LoginView
__all__ = ['LoginView']
from authentication.backends.base import BaseAuthCallbackClientView
from common.utils import FlashMessageUtil
from .backends import CASUserDoesNotExist
from authentication.views.utils import redirect_to_guard_view
__all__ = ['LoginView']
class CASLoginView(LoginView):
def get(self, request):
try:
return super().get(request)
resp = super().get(request)
return resp
except PermissionDenied:
return HttpResponseRedirect('/')
except CASUserDoesNotExist as e:
message_data = {
'title': _('User does not exist: {}').format(e),
'error': _(
'CAS login was successful, but no corresponding local user was found in the system, and automatic '
'user creation is disabled in the CAS authentication configuration. Login failed.'),
'interval': 10,
'redirect_url': '/',
}
return FlashMessageUtil.gen_and_redirect_to(message_data)
class CASCallbackClientView(View):
http_method_names = ['get', ]
def get(self, request):
return redirect_to_guard_view(query_string='next=client')
class CASCallbackClientView(BaseAuthCallbackClientView):
pass

View File

@@ -5,10 +5,10 @@ from django.urls import reverse
from django.utils.http import urlencode
from django.views import View
from authentication.backends.base import BaseAuthCallbackClientView
from authentication.mixins import authenticate
from authentication.utils import build_absolute_uri
from authentication.views.mixins import FlashMessageMixin
from authentication.views.utils import redirect_to_guard_view
from common.utils import get_logger
logger = get_logger(__file__)
@@ -67,11 +67,8 @@ class OAuth2AuthCallbackView(View, FlashMessageMixin):
return HttpResponseRedirect(redirect_url)
class OAuth2AuthCallbackClientView(View):
http_method_names = ['get', ]
def get(self, request):
return redirect_to_guard_view(query_string='next=client')
class OAuth2AuthCallbackClientView(BaseAuthCallbackClientView):
pass
class OAuth2EndSessionView(View):

View File

@@ -224,7 +224,6 @@ class OIDCAuthCodeBackend(OIDCBaseBackend):
user_auth_failed.send(
sender=self.__class__, request=request, username=user.username,
reason="User is invalid", backend=settings.AUTH_BACKEND_OIDC_CODE
)
return None

View File

@@ -10,16 +10,15 @@ import datetime as dt
from calendar import timegm
from urllib.parse import urlparse
from django.conf import settings
from django.core.exceptions import SuspiciousOperation
from django.utils.encoding import force_bytes, smart_bytes
from jwkest import JWKESTException
from jwkest.jwk import KEYS
from jwkest.jws import JWS
from django.conf import settings
from common.utils import get_logger
logger = get_logger(__file__)
@@ -99,7 +98,8 @@ def _validate_claims(id_token, nonce=None, validate_nonce=True):
raise SuspiciousOperation('Incorrect id_token: nbf')
# Verifies that the token was issued in the allowed timeframe.
if utc_timestamp > id_token['iat'] + settings.AUTH_OPENID_ID_TOKEN_MAX_AGE:
max_age = settings.AUTH_OPENID_ID_TOKEN_MAX_AGE
if utc_timestamp > id_token['iat'] + max_age:
logger.debug(log_prompt.format('Incorrect id_token: iat'))
raise SuspiciousOperation('Incorrect id_token: iat')

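Extracting max_age only renames settings.AUTH_OPENID_ID_TOKEN_MAX_AGE, but the freshness rule itself is easy to miss: the token is rejected once the current UTC timestamp exceeds iat plus that maximum age. A tiny illustration with made-up numbers:

import datetime as dt
from calendar import timegm

max_age = 600                                              # hypothetical setting value
now = timegm(dt.datetime.now(dt.timezone.utc).utctimetuple())
iat = now - 700                                            # token issued 700 seconds ago

# Mirrors the check above; True here means SuspiciousOperation('Incorrect id_token: iat')
print(now > iat + max_age)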
Some files were not shown because too many files have changed in this diff.