Mirror of https://github.com/jumpserver/jumpserver.git (synced 2025-07-13 06:34:42 +00:00)
merge: dev to master
Ready to release
Commit: 558188da90
.github/dependabot.yml (vendored, new file, 10 lines)
@@ -0,0 +1,10 @@
+version: 2
+updates:
+  - package-ecosystem: "pip"
+    directory: "/"
+    schedule:
+      interval: "weekly"
+      day: "monday"
+      time: "09:30"
+      timezone: "Asia/Shanghai"
+    target-branch: dev
.github/workflows/translate-readme.yml (vendored, 7 changes)
@@ -2,10 +2,14 @@ name: Translate README
 on:
   workflow_dispatch:
     inputs:
       source_readme:
         description: "Source README"
         required: false
         default: "./readmes/README.en.md"
       target_langs:
         description: "Target Languages"
         required: false
-        default: "zh-hans,zh-hant,ja,pt-br"
+        default: "zh-hans,zh-hant,ja,pt-br,es,ru"
       gen_dir_path:
         description: "Generate Dir Name"
         required: false
@@ -34,6 +38,7 @@ jobs:
       GITHUB_TOKEN: ${{ secrets.PRIVATE_TOKEN }}
       OPENAI_API_KEY: ${{ secrets.GPT_API_TOKEN }}
       GPT_MODE: ${{ github.event.inputs.gpt_mode }}
       SOURCE_README: ${{ github.event.inputs.source_readme }}
       TARGET_LANGUAGES: ${{ github.event.inputs.target_langs }}
       PUSH_BRANCH: ${{ github.event.inputs.push_branch }}
       GEN_DIR_PATH: ${{ github.event.inputs.gen_dir_path }}
Dockerfile
@@ -1,4 +1,4 @@
-FROM jumpserver/core-base:20250224_065619 AS stage-build
+FROM jumpserver/core-base:20250415_032719 AS stage-build

 ARG VERSION
README.md (33 changes)
@@ -1,6 +1,6 @@
 <div align="center">
 <a name="readme-top"></a>
-<a href="https://jumpserver.org/index-en.html"><img src="https://download.jumpserver.org/images/jumpserver-logo.svg" alt="JumpServer" width="300" /></a>
+<a href="https://jumpserver.com" target="_blank"><img src="https://download.jumpserver.org/images/jumpserver-logo.svg" alt="JumpServer" width="300" /></a>

 ## An open-source PAM tool (Bastion Host)

@@ -10,7 +10,7 @@
 [![][github-release-shield]][github-release-link]
 [![][github-stars-shield]][github-stars-link]

-[English](/README.md) · [中文(简体)](/readmes/README.zh-hans.md) · [中文(繁體)](/readmes/README.zh-hant.md) · [日本語](/readmes/README.ja.md) · [Português (Brasil)](/readmes/README.pt-br.md)
+[English](/README.md) · [中文(简体)](/readmes/README.zh-hans.md) · [中文(繁體)](/readmes/README.zh-hant.md) · [日本語](/readmes/README.ja.md) · [Português (Brasil)](/readmes/README.pt-br.md) · [Español](/readmes/README.es.md) · [Русский](/readmes/README.ru.md)

 </div>
 <br/>

@@ -19,7 +19,13 @@
 JumpServer is an open-source Privileged Access Management (PAM) tool that provides DevOps and IT teams with on-demand and secure access to SSH, RDP, Kubernetes, Database and RemoteApp endpoints through a web browser.

-
+<picture>
+  <source media="(prefers-color-scheme: light)" srcset="https://github.com/user-attachments/assets/dd612f3d-c958-4f84-b164-f31b75454d7f">
+  <source media="(prefers-color-scheme: dark)" srcset="https://github.com/user-attachments/assets/28676212-2bc4-4a9f-ae10-3be9320647e3">
+  <img src="https://github.com/user-attachments/assets/dd612f3d-c958-4f84-b164-f31b75454d7f" alt="Theme-based Image">
+</picture>

 ## Quickstart

@@ -36,18 +42,19 @@ Access JumpServer in your browser at `http://your-jumpserver-ip/`
 [](https://www.youtube.com/watch?v=UlGYRbKrpgY "JumpServer Quickstart")

 ## Screenshots

 <table style="border-collapse: collapse; border: 1px solid black;">
 <tr>
 <td style="padding: 5px;background-color:#fff;"><img src="https://github.com/jumpserver/jumpserver/assets/32935519/99fabe5b-0475-4a53-9116-4c370a1426c4" alt="JumpServer Console" /></td>
 <td style="padding: 5px;background-color:#fff;"><img src="https://github.com/jumpserver/jumpserver/assets/32935519/a424d731-1c70-4108-a7d8-5bbf387dda9a" alt="JumpServer Audits" /></td>
 <td style="padding: 5px;background-color:#fff;"><img src="https://github.com/user-attachments/assets/7c1f81af-37e8-4f07-8ac9-182895e1062e" alt="JumpServer PAM" /></td>
 </tr>
 <tr>
 <td style="padding: 5px;background-color:#fff;"><img src="https://github.com/jumpserver/jumpserver/assets/32935519/a424d731-1c70-4108-a7d8-5bbf387dda9a" alt="JumpServer Audits" /></td>
 <td style="padding: 5px;background-color:#fff;"><img src="https://github.com/jumpserver/jumpserver/assets/32935519/393d2c27-a2d0-4dea-882d-00ed509e00c9" alt="JumpServer Workbench" /></td>
 </tr>
 <tr>
 <td style="padding: 5px;background-color:#fff;"><img src="https://github.com/user-attachments/assets/eaa41f66-8cc8-4f01-a001-0d258501f1c9" alt="JumpServer RBAC" /></td>
 <td style="padding: 5px;background-color:#fff;"><img src="https://github.com/jumpserver/jumpserver/assets/32935519/3a2611cd-8902-49b8-b82b-2a6dac851f3e" alt="JumpServer Settings" /></td>
 </tr>
 <tr>
 <td style="padding: 5px;background-color:#fff;"><img src="https://github.com/jumpserver/jumpserver/assets/32935519/1e236093-31f7-4563-8eb1-e36d865f1568" alt="JumpServer SSH" /></td>
 <td style="padding: 5px;background-color:#fff;"><img src="https://github.com/jumpserver/jumpserver/assets/32935519/69373a82-f7ab-41e8-b763-bbad2ba52167" alt="JumpServer RDP" /></td>

@@ -69,9 +76,9 @@ JumpServer consists of multiple key components, which collectively form the func
 | [KoKo](https://github.com/jumpserver/koko) | <a href="https://github.com/jumpserver/koko/releases"><img alt="Koko release" src="https://img.shields.io/github/release/jumpserver/koko.svg" /></a> | JumpServer Character Protocol Connector |
 | [Lion](https://github.com/jumpserver/lion) | <a href="https://github.com/jumpserver/lion/releases"><img alt="Lion release" src="https://img.shields.io/github/release/jumpserver/lion.svg" /></a> | JumpServer Graphical Protocol Connector |
 | [Chen](https://github.com/jumpserver/chen) | <a href="https://github.com/jumpserver/chen/releases"><img alt="Chen release" src="https://img.shields.io/github/release/jumpserver/chen.svg" /></a> | JumpServer Web DB |
-| [Razor](https://github.com/jumpserver/razor) | <img alt="Chen" src="https://img.shields.io/badge/release-private-red" /> | JumpServer EE RDP Proxy Connector |
-| [Tinker](https://github.com/jumpserver/tinker) | <img alt="Tinker" src="https://img.shields.io/badge/release-private-red" /> | JumpServer EE Remote Application Connector (Windows) |
+| [Tinker](https://github.com/jumpserver/tinker) | <img alt="Tinker" src="https://img.shields.io/badge/release-private-red" /> | JumpServer Remote Application Connector (Windows) |
 | [Panda](https://github.com/jumpserver/Panda) | <img alt="Panda" src="https://img.shields.io/badge/release-private-red" /> | JumpServer EE Remote Application Connector (Linux) |
+| [Razor](https://github.com/jumpserver/razor) | <img alt="Chen" src="https://img.shields.io/badge/release-private-red" /> | JumpServer EE RDP Proxy Connector |
 | [Magnus](https://github.com/jumpserver/magnus) | <img alt="Magnus" src="https://img.shields.io/badge/release-private-red" /> | JumpServer EE Database Proxy Connector |
 | [Nec](https://github.com/jumpserver/nec) | <img alt="Nec" src="https://img.shields.io/badge/release-private-red" /> | JumpServer EE VNC Proxy Connector |
 | [Facelive](https://github.com/jumpserver/facelive) | <img alt="Facelive" src="https://img.shields.io/badge/release-private-red" /> | JumpServer EE Facial Recognition |

@@ -81,12 +88,6 @@ JumpServer consists of multiple key components, which collectively form the func
 Welcome to submit PR to contribute. Please refer to [CONTRIBUTING.md][contributing-link] for guidelines.

 ## Security

 JumpServer is a mission critical product. Please refer to the Basic Security Recommendations for installation and deployment. If you encounter any security-related issues, please contact us directly:

 - Email: support@fit2cloud.com

 ## License

 Copyright (c) 2014-2025 FIT2CLOUD, All rights reserved.

@@ -115,5 +116,3 @@ Unless required by applicable law or agreed to in writing, software distributed
 [docker-shield]: https://img.shields.io/docker/pulls/jumpserver/jms_all.svg
 [license-shield]: https://img.shields.io/github/license/jumpserver/jumpserver
 [discord-shield]: https://img.shields.io/discord/1194233267294052363?style=flat&logo=discord&logoColor=%23f5f5f5&labelColor=%235462eb&color=%235462eb

 <!-- Image link -->
SECURITY.md
@@ -5,8 +5,7 @@ JumpServer 是一款正在成长的安全产品, 请参考 [基本安全建议]
 如果你发现安全问题,请直接联系我们,我们携手让世界更好:

 - ibuler@fit2cloud.com
 - support@fit2cloud.com
 - 400-052-0755
 - support@lxware.hk

 # Security Policy

@@ -16,6 +15,5 @@ JumpServer is a security product, The installation and development should follow
 All security bugs should be reported to the contact as below:

 - ibuler@fit2cloud.com
 - support@fit2cloud.com
 - 400-052-0755
 - support@lxware.hk
@@ -46,6 +46,16 @@ class AccountViewSet(OrgBulkModelViewSet):
     }
     export_as_zip = True

+    def get_queryset(self):
+        queryset = super().get_queryset()
+        asset_id = self.request.query_params.get('asset') or self.request.query_params.get('asset_id')
+        if not asset_id:
+            return queryset
+
+        asset = get_object_or_404(Asset, pk=asset_id)
+        queryset = asset.all_accounts.all()
+        return queryset
+
     @action(methods=['get'], detail=False, url_path='su-from-accounts')
     def su_from_accounts(self, request, *args, **kwargs):
         account_id = request.query_params.get('account')

@@ -117,7 +127,7 @@ class AccountViewSet(OrgBulkModelViewSet):
                 self.model.objects.create(**account_data)
                 success_count += 1
             except Exception as e:
-                logger.debug(f'{ "Move" if move else "Copy" } to assets error: {e}')
+                logger.debug(f'{"Move" if move else "Copy"} to assets error: {e}')
                 creation_results[asset] = {'error': _('Account already exists'), 'state': 'error'}

         results = [{'asset': str(asset), **res} for asset, res in creation_results.items()]
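The new `get_queryset` branches on an `asset` (or `asset_id`) query parameter before falling back to the default queryset. A minimal sketch of that control flow, with plain dicts standing in for the Django queryset, models, and `get_object_or_404` (all names here are hypothetical stand-ins, not the real project API):

```python
# Hypothetical stand-ins for the Django ORM objects; this mirrors only the
# branching added to AccountViewSet.get_queryset.
def narrow_accounts(all_accounts, assets, params):
    """Return accounts scoped to one asset when 'asset'/'asset_id' is given."""
    asset_id = params.get('asset') or params.get('asset_id')
    if not asset_id:
        return all_accounts           # no asset filter requested: full queryset
    asset = assets[asset_id]          # get_object_or_404 analogue
    return asset['all_accounts']      # asset.all_accounts.all() analogue

assets = {'a1': {'all_accounts': ['root@a1', 'admin@a1']}}
print(narrow_accounts(['root@a1', 'admin@a1', 'root@a2'], assets, {'asset': 'a1'}))
# → ['root@a1', 'admin@a1']
```

Note that `all_accounts` (rather than `accounts`) also picks up accounts inherited from directory services, matching the `all_accounts` switch elsewhere in this commit.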
@@ -17,7 +17,7 @@ from orgs.mixins import generics
 __all__ = [
     'AutomationAssetsListApi', 'AutomationRemoveAssetApi',
     'AutomationAddAssetApi', 'AutomationNodeAddRemoveApi',
-    'AutomationExecutionViewSet', 'RecordListMixin'
+    'AutomationExecutionViewSet'
 ]

@@ -39,9 +39,10 @@ class AutomationAssetsListApi(generics.ListAPIView):
         return assets


-class AutomationRemoveAssetApi(generics.RetrieveUpdateAPIView):
+class AutomationRemoveAssetApi(generics.UpdateAPIView):
     model = BaseAutomation
     serializer_class = serializers.UpdateAssetSerializer
+    http_method_names = ['patch']

     def update(self, request, *args, **kwargs):
         instance = self.get_object()

@@ -56,9 +57,10 @@ class AutomationRemoveAssetApi(generics.RetrieveUpdateAPIView):
         return Response({'msg': 'ok'})


-class AutomationAddAssetApi(generics.RetrieveUpdateAPIView):
+class AutomationAddAssetApi(generics.UpdateAPIView):
     model = BaseAutomation
     serializer_class = serializers.UpdateAssetSerializer
+    http_method_names = ['patch']

     def update(self, request, *args, **kwargs):
         instance = self.get_object()

@@ -72,9 +74,10 @@ class AutomationAddAssetApi(generics.RetrieveUpdateAPIView):
         return Response({"error": serializer.errors})


-class AutomationNodeAddRemoveApi(generics.RetrieveUpdateAPIView):
+class AutomationNodeAddRemoveApi(generics.UpdateAPIView):
     model = BaseAutomation
     serializer_class = serializers.UpdateNodeSerializer
+    http_method_names = ['patch']

     def update(self, request, *args, **kwargs):
         action_params = ['add', 'remove']

@@ -124,12 +127,3 @@ class AutomationExecutionViewSet(
         execution = self.get_object()
         report = execution.manager.gen_report()
         return HttpResponse(report)
-
-
-class RecordListMixin:
-    def list(self, request, *args, **kwargs):
-        try:
-            response = super().list(request, *args, **kwargs)
-        except Exception as e:
-            response = Response({'detail': str(e)}, status=status.HTTP_400_BAD_REQUEST)
-        return response
@@ -16,7 +16,7 @@ from orgs.mixins.api import OrgBulkModelViewSet, OrgGenericViewSet
 from rbac.permissions import RBACPermission
 from .base import (
     AutomationAssetsListApi, AutomationRemoveAssetApi, AutomationAddAssetApi,
-    AutomationNodeAddRemoveApi, AutomationExecutionViewSet, RecordListMixin
+    AutomationNodeAddRemoveApi, AutomationExecutionViewSet
 )

 __all__ = [

@@ -35,7 +35,7 @@ class ChangeSecretAutomationViewSet(OrgBulkModelViewSet):
     serializer_class = serializers.ChangeSecretAutomationSerializer


-class ChangeSecretRecordViewSet(RecordListMixin, mixins.ListModelMixin, OrgGenericViewSet):
+class ChangeSecretRecordViewSet(mixins.ListModelMixin, OrgGenericViewSet):
     filterset_class = ChangeSecretRecordFilterSet
     permission_classes = [RBACPermission, IsValidLicense]
     search_fields = ('asset__address', 'account__username')
@@ -147,6 +147,7 @@ class CheckAccountEngineViewSet(JMSModelViewSet):
     serializer_class = serializers.CheckAccountEngineSerializer
     permission_classes = [RBACPermission, IsValidLicense]
     perm_model = CheckAccountEngine
     http_method_names = ['get', 'options']

     def get_queryset(self):
         return CheckAccountEngine.get_default_engines()
@@ -9,7 +9,7 @@ from accounts.models import PushAccountAutomation, PushSecretRecord
 from orgs.mixins.api import OrgBulkModelViewSet, OrgGenericViewSet
 from .base import (
     AutomationAssetsListApi, AutomationRemoveAssetApi, AutomationAddAssetApi,
-    AutomationNodeAddRemoveApi, AutomationExecutionViewSet, RecordListMixin
+    AutomationNodeAddRemoveApi, AutomationExecutionViewSet
 )

 __all__ = [

@@ -42,7 +42,7 @@ class PushAccountExecutionViewSet(AutomationExecutionViewSet):
         return queryset


-class PushAccountRecordViewSet(RecordListMixin, mixins.ListModelMixin, OrgGenericViewSet):
+class PushAccountRecordViewSet(mixins.ListModelMixin, OrgGenericViewSet):
     filterset_class = PushAccountRecordFilterSet
     search_fields = ('asset__address', 'account__username')
     ordering_fields = ('date_finished',)
@@ -69,7 +69,7 @@ class BaseChangeSecretPushManager(AccountBasePlaybookManager):
             return

         asset = privilege_account.asset
-        accounts = asset.accounts.all()
+        accounts = asset.all_accounts.all()
         accounts = accounts.filter(id__in=self.account_ids, secret_reset=True)

         if self.secret_type:

@@ -94,6 +94,7 @@ class BaseChangeSecretPushManager(AccountBasePlaybookManager):
         h['account'] = {
             'name': account.name,
             'username': account.username,
+            'full_username': account.full_username,
             'secret_type': secret_type,
             'secret': account.escape_jinja2_syntax(new_secret),
             'private_key_path': private_key_path,
@@ -41,6 +41,7 @@
     password: "{{ account.secret | password_hash('des') }}"
     update_password: always
   ignore_errors: true
   register: change_secret_result
   when: account.secret_type == "password"

 - name: "Get home directory for {{ account.username }}"

@@ -83,6 +84,7 @@
     user: "{{ account.username }}"
     key: "{{ account.secret }}"
     exclusive: "{{ ssh_params.exclusive }}"
   register: change_secret_result
   when: account.secret_type == "ssh_key"

 - name: Refresh connection

@@ -101,7 +103,9 @@
     become_password: "{{ account.become.ansible_password | default('') }}"
     become_private_key_path: "{{ account.become.ansible_ssh_private_key_file | default(None) }}"
     old_ssh_version: "{{ jms_asset.old_ssh_version | default(False) }}"
-  when: account.secret_type == "password" and check_conn_after_change
+  when:
+    - account.secret_type == "password"
+    - check_conn_after_change or change_secret_result.failed | default(false)
   delegate_to: localhost

 - name: "Verify {{ account.username }} SSH KEY (paramiko)"

@@ -112,5 +116,7 @@
     login_private_key_path: "{{ account.private_key_path }}"
     gateway_args: "{{ jms_asset.ansible_ssh_common_args | default(None) }}"
     old_ssh_version: "{{ jms_asset.old_ssh_version | default(False) }}"
-  when: account.secret_type == "ssh_key" and check_conn_after_change
+  when:
+    - account.secret_type == "ssh_key"
+    - check_conn_after_change or change_secret_result.failed | default(false)
   delegate_to: localhost
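The reworked `when:` clause runs the verify step not only when `check_conn_after_change` is set, but also whenever the registered `change_secret_result` reports a failure (which `ignore_errors` allows the play to reach). The boolean logic can be sketched as a plain predicate (function and argument names are illustrative, not the playbook's API):

```python
# Illustrative predicate mirroring the new multi-condition `when:` above;
# `change_result` stands in for the registered change_secret_result dict.
def should_verify(secret_type, check_conn_after_change, change_result):
    failed = change_result.get('failed', False)   # `failed | default(false)`
    return secret_type == "password" and (check_conn_after_change or failed)
```

The practical effect: a failed change task no longer passes silently when verification is disabled, because the failure itself now triggers the connection check.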
@@ -41,6 +41,7 @@
     password: "{{ account.secret | password_hash('sha512') }}"
     update_password: always
   ignore_errors: true
   register: change_secret_result
   when: account.secret_type == "password"

 - name: "Get home directory for {{ account.username }}"

@@ -83,6 +84,7 @@
     user: "{{ account.username }}"
     key: "{{ account.secret }}"
     exclusive: "{{ ssh_params.exclusive }}"
   register: change_secret_result
   when: account.secret_type == "ssh_key"

 - name: Refresh connection

@@ -101,7 +103,9 @@
     become_password: "{{ account.become.ansible_password | default('') }}"
     become_private_key_path: "{{ account.become.ansible_ssh_private_key_file | default(None) }}"
     old_ssh_version: "{{ jms_asset.old_ssh_version | default(False) }}"
-  when: account.secret_type == "password" and check_conn_after_change
+  when:
+    - account.secret_type == "password"
+    - check_conn_after_change or change_secret_result.failed | default(false)
   delegate_to: localhost

 - name: "Verify {{ account.username }} SSH KEY (paramiko)"

@@ -112,5 +116,7 @@
     login_private_key_path: "{{ account.private_key_path }}"
     gateway_args: "{{ jms_asset.ansible_ssh_common_args | default(None) }}"
     old_ssh_version: "{{ jms_asset.old_ssh_version | default(False) }}"
-  when: account.secret_type == "ssh_key" and check_conn_after_change
+  when:
+    - account.secret_type == "ssh_key"
+    - check_conn_after_change or change_secret_result.failed | default(false)
   delegate_to: localhost
@@ -0,0 +1,27 @@
+- hosts: demo
+  gather_facts: no
+  tasks:
+    - name: Test privileged account
+      ansible.windows.win_ping:
+
+    - name: Change password
+      community.windows.win_domain_user:
+        name: "{{ account.username }}"
+        password: "{{ account.secret }}"
+        update_password: always
+        password_never_expires: yes
+        state: present
+        groups: "{{ params.groups }}"
+        groups_action: add
+      ignore_errors: true
+      when: account.secret_type == "password"
+
+    - name: Refresh connection
+      ansible.builtin.meta: reset_connection
+
+    - name: Verify password
+      ansible.windows.win_ping:
+      vars:
+        ansible_user: "{{ account.full_username }}"
+        ansible_password: "{{ account.secret }}"
+      when: account.secret_type == "password" and check_conn_after_change
@@ -0,0 +1,27 @@
+id: change_secret_ad_windows
+name: "{{ 'Windows account change secret' | trans }}"
+version: 1
+method: change_secret
+category:
+  - ds
+type:
+  - windows_ad
+params:
+  - name: groups
+    type: str
+    label: '用户组'
+    default: 'Users,Remote Desktop Users'
+    help_text: "{{ 'Params groups help text' | trans }}"
+
+
+i18n:
+  Windows account change secret:
+    zh: '使用 Ansible 模块 win_domain_user 执行 Windows 账号改密'
+    ja: 'Ansible win_domain_user モジュールを使用して Windows アカウントのパスワード変更'
+    en: 'Using Ansible module win_domain_user to change Windows account secret'
+
+  Params groups help text:
+    zh: '请输入用户组,多个用户组使用逗号分隔(需填写已存在的用户组)'
+    ja: 'グループを入力してください。複数のグループはコンマで区切ってください(既存のグループを入力してください)'
+    en: 'Please enter the group. Multiple groups are separated by commas (please enter the existing group)'
@@ -13,6 +13,7 @@ def parse_date(date_str, default=None):
     formats = [
         '%Y/%m/%d %H:%M:%S',
         '%Y-%m-%dT%H:%M:%S',
         '%Y-%m-%d %H:%M:%S',
         '%d-%m-%Y %H:%M:%S',
         '%Y/%m/%d',
         '%d-%m-%Y',

@@ -26,7 +27,6 @@
     return default

-
 # TODO 后期会挪到 playbook 中
 class GatherAccountsFilter:
     def __init__(self, tp):
         self.tp = tp
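`parse_date` tries each candidate format in order and gives up with a default when none matches, which is how free-form fields like `Password expires: Never` degrade gracefully. A self-contained sketch of that pattern (assuming the same fallback-to-default behaviour; simplified, not the exact project code):

```python
from datetime import datetime

# Try each known timestamp layout in order; return `default` when the input
# is empty or matches none of them (e.g. the literal string "Never").
FORMATS = [
    '%Y/%m/%d %H:%M:%S',
    '%Y-%m-%dT%H:%M:%S',
    '%Y-%m-%d %H:%M:%S',
    '%d-%m-%Y %H:%M:%S',
    '%Y/%m/%d',
    '%d-%m-%Y',
]

def parse_date(date_str, default=None):
    if not date_str:
        return default
    for fmt in FORMATS:
        try:
            return datetime.strptime(date_str.strip(), fmt)
        except ValueError:
            continue
    return default
```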
@@ -208,14 +208,35 @@ class GatherAccountsFilter:
             key, value = parts
             user_info[key.strip()] = value.strip()
         detail = {'groups': user_info.get('Global Group memberships', ''), }
-        user = {
-            'username': user_info.get('User name', ''),
-            'date_password_change': parse_date(user_info.get('Password last set', '')),
-            'date_password_expired': parse_date(user_info.get('Password expires', '')),
-            'date_last_login': parse_date(user_info.get('Last logon', '')),
-        }
-        result[user['username']] = user
+        username = user_info.get('User name')
+        if not username:
+            continue
+        result[username] = {
+            'username': username,
+            'date_password_change': parse_date(user_info.get('Password last set')),
+            'date_password_expired': parse_date(user_info.get('Password expires')),
+            'date_last_login': parse_date(user_info.get('Last logon')),
+            'groups': detail,
+        }
         return result

+    @staticmethod
+    def windows_ad_filter(info):
+        result = {}
+        for user_info in info['user_details']:
+            detail = {'groups': user_info.get('GlobalGroupMemberships', ''), }
+            username = user_info.get('SamAccountName')
+            if not username:
+                continue
+            result[username] = {
+                'username': username,
+                'date_password_change': parse_date(user_info.get('PasswordLastSet')),
+                'date_password_expired': parse_date(user_info.get('PasswordExpires')),
+                'date_last_login': parse_date(user_info.get('LastLogonDate')),
+                'groups': detail,
+            }
+        return result
+
     @staticmethod
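Both the reworked `windows_filter` and the new `windows_ad_filter` key their results by username and skip records without one, so a malformed entry can no longer produce an empty-string key in the result dict. A minimal sketch of that pattern (field names follow the AD variant above; simplified for illustration):

```python
# Key gathered records by SamAccountName; silently drop records that have
# no username instead of inserting them under '' (the old failure mode).
def filter_users(user_details):
    result = {}
    for info in user_details:
        username = info.get('SamAccountName')
        if not username:
            continue
        result[username] = {
            'username': username,
            'groups': info.get('GlobalGroupMemberships', ''),
        }
    return result
```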
@@ -4,6 +4,7 @@
 - name: Run net user command to get all users
   win_shell: net user
   register: user_list_output
   failed_when: false

 - name: Parse all users from net user command
   set_fact:
@@ -2,10 +2,13 @@ id: gather_accounts_windows
 name: "{{ 'Windows account gather' | trans }}"
 version: 1
 method: gather_accounts
-category: host
+category:
+  - host

 type:
   - windows

 i18n:
   Windows account gather:
     zh: 使用命令 net user 收集 Windows 账号
@@ -0,0 +1,74 @@
+- hosts: demo
+  gather_facts: no
+  tasks:
+    - name: Import ActiveDirectory module
+      win_shell: Import-Module ActiveDirectory
+      args:
+        warn: false
+
+    - name: Get the SamAccountName list of all AD users
+      win_shell: |
+        Import-Module ActiveDirectory
+        Get-ADUser -Filter * | Select-Object -ExpandProperty SamAccountName
+      register: ad_user_list
+
+    - name: Set the all_users variable
+      set_fact:
+        all_users: "{{ ad_user_list.stdout_lines }}"
+
+    - name: Get detailed information for each user
+      win_shell: |
+        Import-Module ActiveDirectory
+
+        $user = Get-ADUser -Identity {{ item }} -Properties Name, SamAccountName, Enabled, LastLogonDate, PasswordLastSet, msDS-UserPasswordExpiryTimeComputed, MemberOf
+
+        $globalGroups = @()
+        if ($user.MemberOf) {
+          $globalGroups = $user.MemberOf | ForEach-Object {
+            try {
+              $group = Get-ADGroup $_ -ErrorAction Stop
+              if ($group.GroupScope -eq 'Global') { $group.Name }
+            } catch {
+            }
+          }
+        }
+
+        $passwordExpiry = $null
+        $expiryRaw = $user.'msDS-UserPasswordExpiryTimeComputed'
+        if ($expiryRaw) {
+          try {
+            $passwordExpiry = [datetime]::FromFileTime($expiryRaw)
+          } catch {
+            $passwordExpiry = $null
+          }
+        }
+
+        $output = [PSCustomObject]@{
+          Name = $user.Name
+          SamAccountName = $user.SamAccountName
+          Enabled = $user.Enabled
+          LastLogonDate = if ($user.LastLogonDate) { $user.LastLogonDate.ToString("yyyy-MM-dd HH:mm:ss") } else { $null }
+          PasswordLastSet = if ($user.PasswordLastSet) { $user.PasswordLastSet.ToString("yyyy-MM-dd HH:mm:ss") } else { $null }
+          PasswordExpires = if ($passwordExpiry) { $passwordExpiry.ToString("yyyy-MM-dd HH:mm:ss") } else { $null }
+          GlobalGroupMemberships = $globalGroups
+        }
+
+        $output | ConvertTo-Json -Depth 3
+      loop: "{{ all_users }}"
+      register: ad_user_details
+      ignore_errors: yes
+
+    - set_fact:
+        info:
+          user_details: >-
+            {{
+              ad_user_details.results
+              | selectattr('rc', 'equalto', 0)
+              | map(attribute='stdout')
+              | select('truthy')
+              | map('from_json')
+            }}
+
+    - debug:
+        var: info
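The playbook converts `msDS-UserPasswordExpiryTimeComputed` with `[datetime]::FromFileTime`. A Windows FILETIME counts 100-nanosecond ticks since 1601-01-01 UTC, so the same conversion can be reproduced in Python for reference (a sketch of the arithmetic, not project code):

```python
from datetime import datetime, timedelta

# FILETIME = 100-ns ticks since the Windows epoch (1601-01-01 UTC).
# Dividing by 10 turns ticks into microseconds for timedelta.
def from_file_time(filetime):
    return datetime(1601, 1, 1) + timedelta(microseconds=filetime // 10)
```

For example, the Unix epoch 1970-01-01 corresponds to FILETIME 116444736000000000 (11,644,473,600 seconds after the Windows epoch).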
@@ -0,0 +1,15 @@
+id: gather_accounts_windows_ad
+name: "{{ 'Windows account gather' | trans }}"
+version: 1
+method: gather_accounts
+category:
+  - ds
+
+type:
+  - windows_ad
+
+i18n:
+  Windows account gather:
+    zh: 使用命令 Get-ADUser 收集 Windows 账号
+    ja: コマンド Get-ADUser を使用して Windows アカウントを収集する
+    en: Using command Get-ADUser to gather accounts
@@ -1,6 +1,6 @@
+import time
 from collections import defaultdict

-import time
 from django.utils import timezone

 from accounts.const import AutomationTypes
@@ -222,6 +222,7 @@ class GatherAccountsManager(AccountBasePlaybookManager):
     def _collect_asset_account_info(self, asset, info):
         result = self._filter_success_result(asset.type, info)
         accounts = []

         for username, info in result.items():
             self.asset_usernames_mapper[str(asset.id)].add(username)

@@ -373,6 +374,7 @@ class GatherAccountsManager(AccountBasePlaybookManager):

         for asset, accounts_data in self.asset_account_info.items():
             ori_users = self.ori_asset_usernames[str(asset.id)]
+            need_analyser_gather_account = []
             with tmp_to_org(asset.org_id):
                 for d in accounts_data:
                     username = d["username"]

@@ -385,10 +387,11 @@ class GatherAccountsManager(AccountBasePlaybookManager):
                         ga = ori_account
                         self.update_gathered_account(ori_account, d)
                     ori_found = username in ori_users
-                    risk_analyser.analyse_risk(asset, ga, d, ori_found)
+                    need_analyser_gather_account.append((asset, ga, d, ori_found))
                 self.create_gathered_account.finish()
                 self.update_gathered_account.finish()
+                for analysis_data in need_analyser_gather_account:
+                    risk_analyser.analyse_risk(*analysis_data)
                 self.update_gather_accounts_status(asset)
                 if not self.is_sync_account:
                     continue
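The change above defers `analyse_risk` until the buffered bulk creates and updates have called `finish()`, so the analyser only ever sees persisted accounts. Stripped of the Django specifics, the reordering looks like this (illustrative callables, not the real manager API):

```python
# Instead of analysing each record while bulk writes are still buffered,
# collect the records and analyse them after the flush point.
def process(accounts, bulk_create, analyse):
    pending = []
    for acc in accounts:
        bulk_create(acc)       # may be buffered until finish()
        pending.append(acc)    # defer analysis
    # flush point: bulk operations complete here (the .finish() calls above)
    for acc in pending:
        analyse(acc)
```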
@@ -41,6 +41,7 @@
     password: "{{ account.secret | password_hash('des') }}"
     update_password: always
   ignore_errors: true
   register: change_secret_result
   when: account.secret_type == "password"

 - name: "Get home directory for {{ account.username }}"

@@ -83,6 +84,7 @@
     user: "{{ account.username }}"
     key: "{{ account.secret }}"
     exclusive: "{{ ssh_params.exclusive }}"
   register: change_secret_result
   when: account.secret_type == "ssh_key"

 - name: Refresh connection

@@ -101,7 +103,9 @@
     become_password: "{{ account.become.ansible_password | default('') }}"
     become_private_key_path: "{{ account.become.ansible_ssh_private_key_file | default(None) }}"
     old_ssh_version: "{{ jms_asset.old_ssh_version | default(False) }}"
-  when: account.secret_type == "password" and check_conn_after_change
+  when:
+    - account.secret_type == "password"
+    - check_conn_after_change or change_secret_result.failed | default(false)
   delegate_to: localhost

 - name: "Verify {{ account.username }} SSH KEY (paramiko)"

@@ -112,6 +116,8 @@
     login_private_key_path: "{{ account.private_key_path }}"
     gateway_args: "{{ jms_asset.ansible_ssh_common_args | default(None) }}"
     old_ssh_version: "{{ jms_asset.old_ssh_version | default(False) }}"
-  when: account.secret_type == "ssh_key" and check_conn_after_change
+  when:
+    - account.secret_type == "ssh_key"
+    - check_conn_after_change or change_secret_result.failed | default(false)
   delegate_to: localhost
@@ -41,6 +41,7 @@
     password: "{{ account.secret | password_hash('sha512') }}"
     update_password: always
   ignore_errors: true
   register: change_secret_result
   when: account.secret_type == "password"

 - name: "Get home directory for {{ account.username }}"

@@ -83,6 +84,7 @@
     user: "{{ account.username }}"
     key: "{{ account.secret }}"
     exclusive: "{{ ssh_params.exclusive }}"
   register: change_secret_result
   when: account.secret_type == "ssh_key"

 - name: Refresh connection

@@ -101,7 +103,9 @@
     become_password: "{{ account.become.ansible_password | default('') }}"
     become_private_key_path: "{{ account.become.ansible_ssh_private_key_file | default(None) }}"
     old_ssh_version: "{{ jms_asset.old_ssh_version | default(False) }}"
-  when: account.secret_type == "password" and check_conn_after_change
+  when:
+    - account.secret_type == "password"
+    - check_conn_after_change or change_secret_result.failed | default(false)
   delegate_to: localhost

 - name: "Verify {{ account.username }} SSH KEY (paramiko)"

@@ -112,6 +116,8 @@
     login_private_key_path: "{{ account.private_key_path }}"
     gateway_args: "{{ jms_asset.ansible_ssh_common_args | default(None) }}"
     old_ssh_version: "{{ jms_asset.old_ssh_version | default(False) }}"
-  when: account.secret_type == "ssh_key" and check_conn_after_change
+  when:
+    - account.secret_type == "ssh_key"
+    - check_conn_after_change or change_secret_result.failed | default(false)
   delegate_to: localhost
@@ -0,0 +1,27 @@
+- hosts: demo
+  gather_facts: no
+  tasks:
+    - name: Test privileged account
+      ansible.windows.win_ping:
+
+    - name: Push user password
+      community.windows.win_domain_user:
+        name: "{{ account.username }}"
+        password: "{{ account.secret }}"
+        update_password: always
+        password_never_expires: yes
+        state: present
+        groups: "{{ params.groups }}"
+        groups_action: add
+      ignore_errors: true
+      when: account.secret_type == "password"
+
+    - name: Refresh connection
+      ansible.builtin.meta: reset_connection
+
+    - name: Verify password
+      ansible.windows.win_ping:
+      vars:
+        ansible_user: "{{ account.full_username }}"
+        ansible_password: "{{ account.secret }}"
+      when: account.secret_type == "password" and check_conn_after_change
@@ -0,0 +1,25 @@
id: push_account_ad_windows
name: "{{ 'Windows account push' | trans }}"
version: 1
method: push_account
category:
  - ds
type:
  - windows_ad
params:
  - name: groups
    type: str
    label: '用户组'
    default: 'Users,Remote Desktop Users'
    help_text: "{{ 'Params groups help text' | trans }}"

i18n:
  Windows account push:
    zh: '使用 Ansible 模块 win_domain_user 执行 Windows 账号推送'
    ja: 'Ansible win_domain_user モジュールを使用して Windows アカウントをプッシュする'
    en: 'Using Ansible module win_domain_user to push account'

  Params groups help text:
    zh: '请输入用户组,多个用户组使用逗号分隔(需填写已存在的用户组)'
    ja: 'グループを入力してください。複数のグループはコンマで区切ってください(既存のグループを入力してください)'
    en: 'Please enter the group. Multiple groups are separated by commas (please enter the existing group)'
@@ -11,4 +11,5 @@
    login_host: "{{ jms_asset.address }}"
    login_port: "{{ jms_asset.port }}"
    name: "{{ jms_asset.spec_info.db_name }}"
-   script: "DROP USER {{ account.username }}"
+   script: "DROP LOGIN {{ account.username }}; select @@version"
@@ -0,0 +1,9 @@
- hosts: windows
  gather_facts: no
  tasks:
    - name: "Remove account"
      ansible.windows.win_domain_user:
        name: "{{ account.username }}"
        state: absent
@@ -0,0 +1,14 @@
id: remove_account_ad_windows
name: "{{ 'Windows account remove' | trans }}"
version: 1
method: remove_account
category:
  - ds
type:
  - windows_ad

i18n:
  Windows account remove:
    zh: 使用 Ansible 模块 win_domain_user 删除账号
    ja: Ansible モジュール win_domain_user を使用してアカウントを削除する
    en: Use the Ansible module win_domain_user to delete an account
@@ -10,6 +10,6 @@
    rdp_ping:
      login_host: "{{ jms_asset.address }}"
      login_port: "{{ jms_asset.port }}"
-     login_user: "{{ account.username }}"
+     login_user: "{{ account.full_username }}"
      login_password: "{{ account.secret }}"
      login_secret_type: "{{ account.secret_type }}"
@@ -2,8 +2,10 @@ id: verify_account_by_rdp
name: "{{ 'Windows rdp account verify' | trans }}"
category:
  - host
+ - ds
type:
  - windows
+ - windows_ad
method: verify_account
protocol: rdp
priority: 1
@@ -7,5 +7,5 @@
    - name: Verify account
      ansible.windows.win_ping:
      vars:
-       ansible_user: "{{ account.username }}"
+       ansible_user: "{{ account.full_username }}"
        ansible_password: "{{ account.secret }}"
@@ -2,9 +2,12 @@ id: verify_account_windows
name: "{{ 'Windows account verify' | trans }}"
version: 1
method: verify_account
- category: host
+ category:
+   - host
+   - ds
type:
  - windows
+ - windows_ad

i18n:
  Windows account verify:
@@ -42,7 +42,7 @@ class VerifyAccountManager(AccountBasePlaybookManager):
        if host.get('error'):
            return host

-       accounts = asset.accounts.all()
+       accounts = asset.all_accounts.all()
        accounts = self.get_accounts(account, accounts)
        inventory_hosts = []

@@ -64,6 +64,7 @@ class VerifyAccountManager(AccountBasePlaybookManager):
        h['account'] = {
            'name': account.name,
            'username': account.username,
+           'full_username': account.full_username,
            'secret_type': account.secret_type,
            'secret': account.escape_jinja2_syntax(secret),
            'private_key_path': private_key_path,
@@ -5,7 +5,6 @@ import uuid
import django_filters
from django.db.models import Q
from django.utils import timezone
- from django.utils.translation import gettext_lazy as _
from django_filters import rest_framework as drf_filters
from rest_framework import filters
from rest_framework.compat import coreapi

@@ -13,11 +12,26 @@ from rest_framework.compat import coreapi
from assets.models import Node
from assets.utils import get_node_from_request
from common.drf.filters import BaseFilterSet
+ from common.utils import get_logger
from common.utils.timezone import local_zero_hour, local_now
from .const.automation import ChangeSecretRecordStatusChoice
from .models import Account, GatheredAccount, ChangeSecretRecord, PushSecretRecord, IntegrationApplication, \
    AutomationExecution

+ logger = get_logger(__file__)
+
+
+ class UUIDFilterMixin:
+     @staticmethod
+     def filter_uuid(queryset, name, value):
+         try:
+             uuid.UUID(value)
+         except ValueError:
+             logger.warning(f"Invalid UUID: {value}")
+             return queryset.none()
+
+         return queryset.filter(**{name: value})


class NodeFilterBackend(filters.BaseFilterBackend):
    fields = ['node_id']
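The `UUIDFilterMixin` above swallows malformed UUIDs and returns an empty queryset instead of raising a 500. A standalone sketch of the same validation pattern (plain Python, no Django dependency; names are illustrative):

```python
import uuid

def filter_by_uuid(rows, field, value):
    """Return rows whose `field` equals value, or [] when value is not a valid UUID."""
    try:
        uuid.UUID(value)
    except ValueError:
        # Invalid input filters everything out rather than raising.
        return []
    return [r for r in rows if r.get(field) == value]

rows = [{"id": "550e8400-e29b-41d4-a716-446655440000"}]
print(filter_by_uuid(rows, "id", "not-a-uuid"))  # []
```

In the Django version, `queryset.none()` plays the role of the empty list: callers always get a queryset back, never an exception.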
@@ -43,14 +57,15 @@ class NodeFilterBackend(filters.BaseFilterBackend):
        return queryset


- class AccountFilterSet(BaseFilterSet):
+ class AccountFilterSet(UUIDFilterMixin, BaseFilterSet):
    ip = drf_filters.CharFilter(field_name="address", lookup_expr="exact")
    name = drf_filters.CharFilter(field_name="name", lookup_expr="exact")
    hostname = drf_filters.CharFilter(field_name="name", lookup_expr="exact")
    username = drf_filters.CharFilter(field_name="username", lookup_expr="exact")
    address = drf_filters.CharFilter(field_name="asset__address", lookup_expr="exact")
-   asset_id = drf_filters.CharFilter(field_name="asset", lookup_expr="exact")
    asset = drf_filters.CharFilter(field_name="asset", lookup_expr="exact")
-   assets = drf_filters.CharFilter(field_name="asset_id", lookup_expr="exact")
    asset_name = drf_filters.CharFilter(field_name="asset__name", lookup_expr="exact")
+   asset_id = drf_filters.CharFilter(field_name="asset", method="filter_uuid")
+   assets = drf_filters.CharFilter(field_name="asset_id", method="filter_uuid")
    has_secret = drf_filters.BooleanFilter(method="filter_has_secret")
    platform = drf_filters.CharFilter(
        field_name="asset__platform_id", lookup_expr="exact"

@@ -135,8 +150,9 @@ class AccountFilterSet(BaseFilterSet):
            kwargs.update({"date_change_secret__gt": date})

        if name == "latest_secret_change_failed":
-           queryset = queryset.filter(date_change_secret__gt=date).exclude(
-               change_secret_status=ChangeSecretRecordStatusChoice.success
+           queryset = (
+               queryset.filter(date_change_secret__gt=date)
+               .exclude(change_secret_status=ChangeSecretRecordStatusChoice.success)
            )

        if kwargs:

@@ -146,8 +162,8 @@ class AccountFilterSet(BaseFilterSet):
    class Meta:
        model = Account
        fields = [
-           "id", "asset", "source_id", "secret_type", "category",
-           "type", "privileged", "secret_reset", "connectivity", 'is_active'
+           "id", "source_id", "secret_type", "category", "type",
+           "privileged", "secret_reset", "connectivity", "is_active"
        ]
@@ -185,16 +201,6 @@ class SecretRecordMixin(drf_filters.FilterSet):
        return queryset.filter(date_finished__gte=dt)


- class UUIDExecutionFilterMixin:
-     @staticmethod
-     def filter_execution(queryset, name, value):
-         try:
-             uuid.UUID(value)
-         except ValueError:
-             raise ValueError(_('Enter a valid UUID.'))
-         return queryset.filter(**{name: value})


class DaysExecutionFilterMixin:
    days = drf_filters.NumberFilter(method="filter_days")
    field: str

@@ -209,10 +215,10 @@ class DaysExecutionFilterMixin:


class ChangeSecretRecordFilterSet(
-   SecretRecordMixin, UUIDExecutionFilterMixin,
+   SecretRecordMixin, UUIDFilterMixin,
    DaysExecutionFilterMixin, BaseFilterSet
):
-   execution_id = django_filters.CharFilter(method="filter_execution")
+   execution_id = django_filters.CharFilter(method="filter_uuid")
    days = drf_filters.NumberFilter(method="filter_days")

    field = 'date_finished'

@@ -230,8 +236,8 @@ class AutomationExecutionFilterSet(DaysExecutionFilterMixin, BaseFilterSet):
    fields = ["days", 'trigger', 'automation_id', 'automation__name']


- class PushAccountRecordFilterSet(SecretRecordMixin, UUIDExecutionFilterMixin, BaseFilterSet):
-     execution_id = django_filters.CharFilter(method="filter_execution")
+ class PushAccountRecordFilterSet(SecretRecordMixin, UUIDFilterMixin, BaseFilterSet):
+     execution_id = django_filters.CharFilter(method="filter_uuid")

    class Meta:
        model = PushSecretRecord
@@ -1,65 +1,15 @@
from rest_framework.response import Response
from rest_framework import status
+ from django.db.models import Model
from django.utils import translation
- from django.utils.translation import gettext_noop

from audits.const import ActionChoices
- from common.views.mixins import RecordViewLogMixin
- from common.utils import i18n_fmt
+ from audits.handler import create_or_update_operate_log


- class AccountRecordViewLogMixin(RecordViewLogMixin):
+ class AccountRecordViewLogMixin(object):
    get_object: callable
    get_queryset: callable

-   @staticmethod
-   def _filter_params(params):
-       new_params = {}
-       need_pop_params = ('format', 'order')
-       for key, value in params.items():
-           if key in need_pop_params:
-               continue
-           if isinstance(value, list):
-               value = list(filter(None, value))
-           if value:
-               new_params[key] = value
-       return new_params
-
-   def get_resource_display(self, request):
-       query_params = dict(request.query_params)
-       params = self._filter_params(query_params)
-
-       spm_filter = params.pop("spm", None)
-
-       if not params and not spm_filter:
-           display_message = gettext_noop("Export all")
-       elif spm_filter:
-           display_message = gettext_noop("Export only selected items")
-       else:
-           query = ",".join(
-               ["%s=%s" % (key, value) for key, value in params.items()]
-           )
-           display_message = i18n_fmt(gettext_noop("Export filtered: %s"), query)
-       return display_message
-
-   @property
-   def detail_msg(self):
-       return i18n_fmt(
-           gettext_noop('User %s view/export secret'), self.request.user
-       )
-
-   def list(self, request, *args, **kwargs):
-       list_func = getattr(super(), 'list')
-       if not callable(list_func):
-           return Response(status=status.HTTP_405_METHOD_NOT_ALLOWED)
-       response = list_func(request, *args, **kwargs)
-       with translation.override('en'):
-           resource_display = self.get_resource_display(request)
-           ids = [q.id for q in self.get_queryset()]
-           self.record_logs(
-               ids, ActionChoices.view, self.detail_msg, resource_display=resource_display
-           )
-       return response
+   model: Model

    def retrieve(self, request, *args, **kwargs):
        retrieve_func = getattr(super(), 'retrieve')

@@ -67,9 +17,9 @@ class AccountRecordViewLogMixin(RecordViewLogMixin):
            return Response(status=status.HTTP_405_METHOD_NOT_ALLOWED)
        response = retrieve_func(request, *args, **kwargs)
        with translation.override('en'):
-           resource = self.get_object()
-           self.record_logs(
-               [resource.id], ActionChoices.view, self.detail_msg, resource=resource
+           create_or_update_operate_log(
+               ActionChoices.view, self.model._meta.verbose_name,
+               force=True, resource=self.get_object(),
            )
        return response
@@ -131,9 +131,46 @@ class Account(AbsConnectivity, LabeledMixin, BaseAccount, JSONFilterMixin):

    @lazyproperty
    def alias(self):
        """
        Alias: virtual accounts (@INPUT, @MANUAL, @USER) keep their username; otherwise the id is used
        """
        if self.username.startswith('@'):
            return self.username
-       return self.name
+       return str(self.id)

    def is_virtual(self):
        """
        Do not judge by username: this may be a constructed account object
        whose username matches a real account of the same name
        """
        return self.alias.startswith('@')

    def is_ds_account(self):
        if self.is_virtual():
            return ''
        if not self.asset.is_directory_service:
            return False
        return True

    @lazyproperty
    def ds(self):
        if not self.is_ds_account():
            return None
        return self.asset.ds

    @lazyproperty
    def ds_domain(self):
        """Do not remove: perm_account sets this dynamically to change full_username"""
        if self.is_virtual():
            return ''
        if self.ds and self.ds.domain_name:
            return self.ds.domain_name
        return ''

    @property
    def full_username(self):
        if self.ds_domain:
            return '{}@{}'.format(self.username, self.ds_domain)
        return self.username

    @lazyproperty
    def has_secret(self):
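The `full_username` property above appends the directory-service domain to the bare username. A minimal standalone sketch of that derivation (plain Python; names are illustrative, not the model API):

```python
def full_username(username: str, ds_domain: str = "") -> str:
    """Domain accounts log in as user@domain; local accounts keep the bare username."""
    if ds_domain:
        return "{}@{}".format(username, ds_domain)
    return username

print(full_username("administrator", "corp.example.com"))  # administrator@corp.example.com
print(full_username("administrator"))                      # administrator
```

This is why the Windows AD playbooks switched `ansible_user` and `login_user` from `account.username` to `account.full_username`: domain logins need the `user@domain` form.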
@@ -92,8 +92,9 @@ class VirtualAccount(JMSOrgBaseModel):
        from .account import Account
        username = user.username

+       alias = AliasAccount.USER.value
        with tmp_to_org(asset.org):
-           same_account = cls.objects.filter(alias='@USER').first()
+           same_account = cls.objects.filter(alias=alias).first()

        secret = ''
        if same_account and same_account.secret_from_login:

@@ -101,4 +102,6 @@
        if not secret and not from_permed:
            secret = input_secret
-       return Account(name=AliasAccount.USER.label, username=username, secret=secret)
+       account = Account(name=AliasAccount.USER.label, username=username, secret=secret)
+       account.alias = alias
+       return account
@@ -233,6 +233,7 @@ class AccountSerializer(AccountCreateUpdateSerializerMixin, BaseAccountSerializer):
        required=False, queryset=Account.objects, allow_null=True, allow_empty=True,
        label=_('Su from'), attrs=('id', 'name', 'username')
    )
+   ds = ObjectRelatedField(read_only=True, label=_('Directory service'), attrs=('id', 'name', 'domain_name'))

    class Meta(BaseAccountSerializer.Meta):
        model = Account

@@ -241,7 +242,7 @@ class AccountSerializer(AccountCreateUpdateSerializerMixin, BaseAccountSerializer):
        'date_change_secret', 'change_secret_status'
    ]
    fields = BaseAccountSerializer.Meta.fields + [
-       'su_from', 'asset', 'version',
+       'su_from', 'asset', 'version', 'ds',
        'source', 'source_id', 'secret_reset',
    ] + AccountCreateUpdateSerializerMixin.Meta.fields + automation_fields
    read_only_fields = BaseAccountSerializer.Meta.read_only_fields + automation_fields

@@ -258,7 +259,7 @@ class AccountSerializer(AccountCreateUpdateSerializerMixin, BaseAccountSerializer):
        queryset = queryset.prefetch_related(
            'asset', 'asset__platform',
            'asset__platform__automation'
-       ).prefetch_related('labels', 'labels__label')
+       )
        return queryset
@@ -1,3 +1,4 @@
+ from django.templatetags.static import static
from django.utils.translation import gettext_lazy as _
from rest_framework import serializers

@@ -27,13 +28,14 @@ class IntegrationApplicationSerializer(BulkOrgResourceModelSerializer):
            'name': {'label': _('Name')},
            'accounts_amount': {'label': _('Accounts amount')},
            'is_active': {'default': True},
+           'logo': {'required': False},
        }

-   def __init__(self, *args, **kwargs):
-       super().__init__(*args, **kwargs)
-       request_method = self.context.get('request').method
-       if request_method == 'PUT':
-           self.fields['logo'].required = False
+   def to_representation(self, instance):
+       data = super().to_representation(instance)
+       if not data.get('logo'):
+           data['logo'] = static('img/logo.png')
+       return data


class IntegrationAccountSecretSerializer(serializers.Serializer):
@@ -129,7 +129,7 @@
        </tbody>
    </table>
    {% else %}
-   <p class="no-data">{% trans 'No new accounts found' %}</p>
+   <p class="no-data">{% trans 'No lost accounts found' %}</p>
    {% endif %}
</div>
</section>
@@ -8,6 +8,6 @@ class ActionChoices(models.TextChoices):
    review = 'review', _('Review')
    warning = 'warning', _('Warn')
    notice = 'notice', _('Notify')
-   notify_and_warn = 'notify_and_warn', _('Notify and warn')
+   notify_and_warn = 'notify_and_warn', _('Prompt and warn')
    face_verify = 'face_verify', _('Face Verify')
    face_online = 'face_online', _('Face Online')
@@ -18,7 +18,12 @@ class LoginACLSerializer(BaseUserACLSerializer, BulkOrgResourceModelSerializer):
    class Meta(BaseUserACLSerializer.Meta):
        model = LoginACL
        fields = BaseUserACLSerializer.Meta.fields + ['rules', ]
-       action_choices_exclude = [ActionChoices.face_online, ActionChoices.face_verify]
+       action_choices_exclude = [
+           ActionChoices.warning,
+           ActionChoices.notify_and_warn,
+           ActionChoices.face_online,
+           ActionChoices.face_verify
+       ]

    def get_rules_serializer(self):
        return RuleSerializer()
@@ -3,6 +3,7 @@ from .cloud import *
from .custom import *
from .database import *
from .device import *
+ from .ds import *
from .gpt import *
from .host import *
from .permission import *
@@ -11,6 +11,7 @@ from rest_framework.decorators import action
from rest_framework.response import Response
from rest_framework.status import HTTP_200_OK

+ from accounts.serializers import AccountSerializer
from accounts.tasks import push_accounts_to_assets_task, verify_accounts_connectivity_task
from assets import serializers
from assets.exceptions import NotSupportedTemporarilyError

@@ -96,10 +97,10 @@ class AssetFilterSet(BaseFilterSet):
        return queryset.filter(protocols__name__in=value).distinct()


- class AssetViewSet(SuggestionMixin, OrgBulkModelViewSet):
-     """
-     API endpoint that allows Asset to be viewed or edited.
+ class BaseAssetViewSet(OrgBulkModelViewSet):
+     """
+     API endpoint that allows Asset to be viewed or edited.
    """
    model = Asset
    filterset_class = AssetFilterSet
    search_fields = ("name", "address", "comment")

@@ -109,18 +110,19 @@ class BaseAssetViewSet(OrgBulkModelViewSet):
        ("platform", serializers.PlatformSerializer),
        ("suggestion", serializers.MiniAssetSerializer),
        ("gateways", serializers.GatewaySerializer),
+       ("accounts", AccountSerializer),
    )
    rbac_perms = (
        ("match", "assets.match_asset"),
        ("platform", "assets.view_platform"),
        ("gateways", "assets.view_gateway"),
+       ("accounts", "assets.view_account"),
        ("spec_info", "assets.view_asset"),
        ("gathered_info", "assets.view_asset"),
        ("sync_platform_protocols", "assets.change_asset"),
    )
    extra_filter_backends = [
-       IpInFilterBackend,
-       NodeFilterBackend, AttrRulesFilterBackend
+       IpInFilterBackend, NodeFilterBackend, AttrRulesFilterBackend
    ]

    def perform_destroy(self, instance):

@@ -141,6 +143,25 @@ class BaseAssetViewSet(OrgBulkModelViewSet):
            return retrieve_cls
        return cls

+   def paginate_queryset(self, queryset):
+       page = super().paginate_queryset(queryset)
+       if page:
+           page = Asset.compute_all_accounts_amount(page)
+       return page
+
+   def create(self, request, *args, **kwargs):
+       if request.path.find('/api/v1/assets/assets/') > -1:
+           error = _('Cannot create asset directly, you should create a host or other')
+           return Response({'error': error}, status=400)
+
+       if not settings.XPACK_LICENSE_IS_VALID and self.model.objects.order_by().count() >= 5000:
+           error = _('The number of assets exceeds the limit of 5000')
+           return Response({'error': error}, status=400)
+
+       return super().create(request, *args, **kwargs)
+
+
+ class AssetViewSet(SuggestionMixin, BaseAssetViewSet):
    @action(methods=["GET"], detail=True, url_path="platform")
    def platform(self, *args, **kwargs):
        asset = super().get_object()

@@ -189,17 +210,6 @@ class AssetViewSet(SuggestionMixin, BaseAssetViewSet):
        Protocol.objects.bulk_create(objs)
        return Response(status=status.HTTP_200_OK)

-   def create(self, request, *args, **kwargs):
-       if request.path.find('/api/v1/assets/assets/') > -1:
-           error = _('Cannot create asset directly, you should create a host or other')
-           return Response({'error': error}, status=400)
-
-       if not settings.XPACK_LICENSE_IS_VALID and self.model.objects.order_by().count() >= 5000:
-           error = _('The number of assets exceeds the limit of 5000')
-           return Response({'error': error}, status=400)
-
-       return super().create(request, *args, **kwargs)
-
    def filter_bulk_update_data(self):
        bulk_data = []
        skip_assets = []
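The relocated `create` override above rejects direct creation on the generic asset endpoint and enforces a 5000-asset cap for unlicensed installs. The gating logic can be sketched in isolation (plain Python; the helper name and signature are illustrative):

```python
def can_create_asset(path, licensed, asset_count, limit=5000):
    """Return (ok, error) mirroring the two guards in the viewset's create()."""
    if '/api/v1/assets/assets/' in path:
        return False, 'Cannot create asset directly, you should create a host or other'
    if not licensed and asset_count >= limit:
        return False, 'The number of assets exceeds the limit of 5000'
    return True, None

print(can_create_asset('/api/v1/assets/hosts/', licensed=False, asset_count=10))
# (True, None)
```

Moving the guards onto `BaseAssetViewSet` means every typed subclass (hosts, databases, directory services, ...) inherits them instead of only the generic endpoint.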
@@ -1,12 +1,12 @@
from assets.models import Cloud, Asset
from assets.serializers import CloudSerializer

- from .asset import AssetViewSet
+ from .asset import BaseAssetViewSet

__all__ = ['CloudViewSet']


- class CloudViewSet(AssetViewSet):
+ class CloudViewSet(BaseAssetViewSet):
    model = Cloud
    perm_model = Asset

@@ -1,12 +1,12 @@
from assets.models import Custom, Asset
from assets.serializers import CustomSerializer

- from .asset import AssetViewSet
+ from .asset import BaseAssetViewSet

__all__ = ['CustomViewSet']


- class CustomViewSet(AssetViewSet):
+ class CustomViewSet(BaseAssetViewSet):
    model = Custom
    perm_model = Asset

@@ -1,12 +1,12 @@
from assets.models import Database, Asset
from assets.serializers import DatabaseSerializer

- from .asset import AssetViewSet
+ from .asset import BaseAssetViewSet

__all__ = ['DatabaseViewSet']


- class DatabaseViewSet(AssetViewSet):
+ class DatabaseViewSet(BaseAssetViewSet):
    model = Database
    perm_model = Asset

@@ -1,11 +1,11 @@
- from assets.serializers import DeviceSerializer
from assets.models import Device, Asset
- from .asset import AssetViewSet
+ from assets.serializers import DeviceSerializer
+ from .asset import BaseAssetViewSet

__all__ = ['DeviceViewSet']


- class DeviceViewSet(AssetViewSet):
+ class DeviceViewSet(BaseAssetViewSet):
    model = Device
    perm_model = Asset
16 apps/assets/api/asset/ds.py Normal file
@@ -0,0 +1,16 @@
from assets.models import DirectoryService, Asset
from assets.serializers import DSSerializer

from .asset import BaseAssetViewSet

__all__ = ['DSViewSet']


class DSViewSet(BaseAssetViewSet):
    model = DirectoryService
    perm_model = Asset

    def get_serializer_classes(self):
        serializer_classes = super().get_serializer_classes()
        serializer_classes['default'] = DSSerializer
        return serializer_classes
@@ -1,12 +1,12 @@
from assets.models import GPT, Asset
from assets.serializers import GPTSerializer

- from .asset import AssetViewSet
+ from .asset import BaseAssetViewSet

__all__ = ['GPTViewSet']


- class GPTViewSet(AssetViewSet):
+ class GPTViewSet(BaseAssetViewSet):
    model = GPT
    perm_model = Asset

@@ -1,11 +1,11 @@
from assets.models import Host, Asset
from assets.serializers import HostSerializer
- from .asset import AssetViewSet
+ from .asset import BaseAssetViewSet

__all__ = ['HostViewSet']


- class HostViewSet(AssetViewSet):
+ class HostViewSet(BaseAssetViewSet):
    model = Host
    perm_model = Asset

@@ -1,12 +1,12 @@
from assets.models import Web, Asset
from assets.serializers import WebSerializer

- from .asset import AssetViewSet
+ from .asset import BaseAssetViewSet

__all__ = ['WebViewSet']


- class WebViewSet(AssetViewSet):
+ class WebViewSet(BaseAssetViewSet):
    model = Web
    perm_model = Asset
@@ -52,7 +52,7 @@ class AssetPlatformViewSet(JMSModelViewSet):
        queryset = (
            super().get_queryset()
            .annotate(assets_amount=Coalesce(Subquery(asset_count_subquery), Value(0)))
-           .prefetch_related('protocols', 'automation', 'labels', 'labels__label')
+           .prefetch_related('protocols', 'automation')
        )
        queryset = queryset.filter(type__in=AllTypes.get_types_values())
        return queryset
@@ -3,10 +3,10 @@ import json
import logging
import os
import shutil
+ import time
from collections import defaultdict
from socket import gethostname

- import time
import yaml
from django.conf import settings
from django.template.loader import render_to_string

@@ -334,7 +334,8 @@ class PlaybookPrepareMixin:
        return sub_playbook_path

    def check_automation_enabled(self, platform, assets):
-       if not platform.automation or not platform.automation.ansible_enabled:
+       automation = getattr(platform, 'automation', None)
+       if not (automation and getattr(automation, 'ansible_enabled', False)):
            print(_(" - Platform {} ansible disabled").format(platform.name))
            self.on_assets_not_ansible_enabled(assets)
            return False
@@ -2,9 +2,12 @@ id: gather_facts_windows
name: "{{ 'Gather facts windows' | trans }}"
version: 1
method: gather_facts
- category: host
+ category:
+   - host
+   - ds
type:
  - windows
+ - windows_ad
i18n:
  Gather facts windows:
    zh: '使用 Ansible 指令 gather_facts 从 Windows 获取设备信息'
@@ -3,8 +3,10 @@ name: "{{ 'Ping by pyfreerdp' | trans }}"
category:
  - device
  - host
+ - ds
type:
  - windows
+ - windows_ad
method: ping
protocol: rdp
priority: 1
@@ -3,6 +3,7 @@ name: "{{ 'Ping by paramiko' | trans }}"
category:
  - device
  - host
+ - ds
type:
  - all
method: ping

@@ -3,6 +3,7 @@ name: "{{ 'Ping by telnet' | trans }}"
category:
  - device
  - host
+ - ds
type:
  - all
method: ping
@@ -2,9 +2,12 @@ id: win_ping
name: "{{ 'Windows ping' | trans }}"
version: 1
method: ping
- category: host
+ category:
+   - host
+   - ds
type:
  - windows
+ - windows_ad
i18n:
  Windows ping:
    zh: 使用 Ansible 模块 内置模块 win_ping 来测试可连接性
@@ -112,8 +112,7 @@ class BaseType(TextChoices):

    @classmethod
    def get_choices(cls):
-       if not settings.XPACK_LICENSE_IS_VALID:
+       choices = cls.choices
+       if not settings.XPACK_LICENSE_IS_VALID and hasattr(cls, 'get_community_types'):
            choices = [(tp.value, tp.label) for tp in cls.get_community_types()]
-       else:
-           choices = cls.choices
        return choices
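`get_choices` above now falls back to the full choice list when a type class defines no `get_community_types`, instead of assuming every class has one. A standalone sketch of that gating (plain Python; illustrative names):

```python
def get_choices(all_choices, licensed, community_types=None):
    """Unlicensed installs see only community types, when the class defines them."""
    if not licensed and community_types is not None:
        return [c for c in all_choices if c[0] in community_types]
    return all_choices

choices = [("linux", "Linux"), ("windows", "Windows"), ("aix", "AIX")]
print(get_choices(choices, licensed=False, community_types={"linux", "windows"}))
# [('linux', 'Linux'), ('windows', 'Windows')]
```

The `hasattr` check matters for the new `DirectoryTypes` class, which defines `get_community_types` returning only the general type; classes without the hook simply expose everything.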
@@ -12,6 +12,7 @@ class Category(ChoicesMixin, models.TextChoices):
    DATABASE = 'database', _("Database")
    CLOUD = 'cloud', _("Cloud service")
    WEB = 'web', _("Web")
+   DS = 'ds', _("Directory service")
    CUSTOM = 'custom', _("Custom type")

    @classmethod
@@ -20,6 +20,7 @@ class DeviceTypes(BaseType):
        '*': {
            'charset_enabled': False,
            'domain_enabled': True,
+           'ds_enabled': False,
            'su_enabled': True,
            'su_methods': ['enable', 'super', 'super_level']
        }
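Constraint tables like the one above are keyed by a `'*'` wildcard holding defaults, with per-type entries overriding them. A small sketch of how such defaults and overrides would merge (plain Python; the merge helper is hypothetical, not code from the repo):

```python
def resolve_constrains(table, type_key):
    """Merge the '*' defaults with the per-type overrides, overrides winning."""
    merged = dict(table.get('*', {}))
    merged.update(table.get(type_key, {}))
    return merged

base = {
    '*': {'charset_enabled': True, 'su_enabled': True},
    'windows_ad': {'su_enabled': False},
}
print(resolve_constrains(base, 'windows_ad'))
# {'charset_enabled': True, 'su_enabled': False}
```

The same pattern appears in the new `DirectoryTypes` class below: `'*'` disables automation by default, and the `windows_ad` entry re-enables the capabilities that have Ansible method implementations.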
70 apps/assets/const/ds.py Normal file
@@ -0,0 +1,70 @@
from django.utils.translation import gettext_lazy as _

from .base import BaseType


class DirectoryTypes(BaseType):
    GENERAL = 'general', _('General')
    # LDAP = 'ldap', _('LDAP')
    # AD = 'ad', _('Active Directory')
    WINDOWS_AD = 'windows_ad', _('Windows Active Directory')

    # AZURE_AD = 'azure_ad', _('Azure Active Directory')

    @classmethod
    def _get_base_constrains(cls) -> dict:
        return {
            '*': {
                'charset_enabled': True,
                'domain_enabled': True,
                'ds_enabled': False,
                'su_enabled': True,
            },
            cls.WINDOWS_AD: {
                'su_enabled': False,
            }
        }

    @classmethod
    def _get_automation_constrains(cls) -> dict:
        constrains = {
            '*': {
                'ansible_enabled': False,
            },
            cls.WINDOWS_AD: {
                'ansible_enabled': True,
                'ping_enabled': True,
                'gather_facts_enabled': True,
                'verify_account_enabled': True,
                'change_secret_enabled': True,
                'push_account_enabled': True,
                'gather_accounts_enabled': True,
                'remove_account_enabled': True,
            }
        }
        return constrains

    @classmethod
    def _get_protocol_constrains(cls) -> dict:
        return {
            cls.GENERAL: {
                'choices': ['ssh']
            },
            cls.WINDOWS_AD: {
                'choices': ['rdp', 'ssh', 'vnc', 'winrm']
            },
        }

    @classmethod
    def internal_platforms(cls):
        return {
            cls.WINDOWS_AD: [
                {'name': 'Windows Active Directory'}
            ],
        }

    @classmethod
    def get_community_types(cls):
        return [
            cls.GENERAL,
        ]
|
@@ -20,6 +20,7 @@ class HostTypes(BaseType):
        'charset': 'utf-8',  # default
        'domain_enabled': True,
        'su_enabled': True,
+       'ds_enabled': True,
        'su_methods': ['sudo', 'su', 'only_sudo', 'only_su'],
    },
    cls.WINDOWS: {

@@ -56,7 +57,6 @@ class HostTypes(BaseType):
        'change_secret_enabled': True,
        'push_account_enabled': True,
        'remove_account_enabled': True,
-
    },
    cls.WINDOWS: {
        'ansible_config': {

@@ -69,7 +69,6 @@ class HostTypes(BaseType):
        'ping_enabled': False,
        'gather_facts_enabled': False,
        'gather_accounts_enabled': False,
        'verify_account_enabled': False,
        'change_secret_enabled': False,
        'push_account_enabled': False
    },

@@ -126,5 +125,5 @@ class HostTypes(BaseType):
    @classmethod
    def get_community_types(cls) -> list:
        return [
-           cls.LINUX, cls.UNIX, cls.WINDOWS, cls.OTHER_HOST
+           cls.LINUX, cls.WINDOWS, cls.UNIX, cls.OTHER_HOST
        ]
@@ -13,6 +13,7 @@ from .cloud import CloudTypes
from .custom import CustomTypes
from .database import DatabaseTypes
from .device import DeviceTypes
+ from .ds import DirectoryTypes
from .gpt import GPTTypes
from .host import HostTypes
from .web import WebTypes

@@ -22,7 +23,8 @@ class AllTypes(ChoicesMixin):
    choices: list
    includes = [
        HostTypes, DeviceTypes, DatabaseTypes,
-       CloudTypes, WebTypes, CustomTypes, GPTTypes
+       CloudTypes, WebTypes, CustomTypes,
+       DirectoryTypes, GPTTypes
    ]
    _category_constrains = {}
    _automation_methods = None

@@ -173,6 +175,7 @@ class AllTypes(ChoicesMixin):
        (Category.DATABASE, DatabaseTypes),
        (Category.WEB, WebTypes),
        (Category.CLOUD, CloudTypes),
+       (Category.DS, DirectoryTypes),
        (Category.CUSTOM, CustomTypes)
    ]
    return types
57
apps/assets/migrations/0016_directory_service.py
Normal file
@@ -0,0 +1,57 @@
# Generated by Django 4.1.13 on 2025-04-03 09:51

import django.db.models.deletion
from django.db import migrations, models


class Migration(migrations.Migration):
    dependencies = [
        ("assets", "0015_automationexecution_type"),
    ]

    operations = [
        migrations.CreateModel(
            name="DirectoryService",
            fields=[
                (
                    "asset_ptr",
                    models.OneToOneField(
                        auto_created=True,
                        on_delete=django.db.models.deletion.CASCADE,
                        parent_link=True,
                        primary_key=True,
                        serialize=False,
                        to="assets.asset",
                    ),
                ),
                (
                    "domain_name",
                    models.CharField(
                        blank=True,
                        default="",
                        max_length=128,
                        verbose_name="Domain name",
                    ),
                ),
            ],
            options={
                "verbose_name": "Directory service",
                "default_related_name": "ds"
            },
            bases=("assets.asset",),
        ),
        migrations.AddField(
            model_name="platform",
            name="ds_enabled",
            field=models.BooleanField(default=False, verbose_name="DS enabled"),
        ),
        migrations.AddField(
            model_name="asset",
            name="directory_services",
            field=models.ManyToManyField(
                related_name="assets",
                to="assets.directoryservice",
                verbose_name="Directory services",
            ),
        ),
    ]
165
apps/assets/migrations/0017_auto_20250407_1124.py
Normal file
@@ -0,0 +1,165 @@
# Generated by Django 4.1.13 on 2025-04-07 03:24

import json

from django.db import migrations

from assets.const import AllTypes


def add_ds_platforms(apps, schema_editor):
    data = """
    [
        {
            "created_by": "system",
            "updated_by": "system",
            "comment": "",
            "name": "WindowsActiveDirectory",
            "category": "ds",
            "type": "windows_ad",
            "meta": {},
            "internal": true,
            "domain_enabled": true,
            "su_enabled": false,
            "su_method": null,
            "custom_fields": [],
            "automation": {
                "ansible_enabled": true,
                "ansible_config": {
                    "ansible_shell_type": "cmd",
                    "ansible_connection": "ssh"
                },
                "ping_enabled": true,
                "ping_method": "ping_by_rdp",
                "ping_params": {},
                "gather_facts_enabled": true,
                "gather_facts_method": "gather_facts_windows",
                "gather_facts_params": {},
                "change_secret_enabled": true,
                "change_secret_method": "change_secret_ad_windows",
                "change_secret_params": {},
                "push_account_enabled": true,
                "push_account_method": "push_account_ad_windows",
                "push_account_params": {},
                "verify_account_enabled": true,
                "verify_account_method": "verify_account_by_rdp",
                "verify_account_params": {},
                "gather_accounts_enabled": true,
                "gather_accounts_method": "gather_accounts_windows_ad",
                "gather_accounts_params": {},
                "remove_account_enabled": true,
                "remove_account_method": "remove_account_ad_windows",
                "remove_account_params": {}
            },
            "protocols": [
                {
                    "name": "rdp",
                    "port": 3389,
                    "primary": true,
                    "required": false,
                    "default": false,
                    "public": true,
                    "setting": {
                        "console": false,
                        "security": "any"
                    }
                },
                {
                    "name": "ssh",
                    "port": 22,
                    "primary": false,
                    "required": false,
                    "default": false,
                    "public": true,
                    "setting": {
                        "sftp_enabled": true,
                        "sftp_home": "/tmp"
                    }
                },
                {
                    "name": "vnc",
                    "port": 5900,
                    "primary": false,
                    "required": false,
                    "default": false,
                    "public": true,
                    "setting": {}
                },
                {
                    "name": "winrm",
                    "port": 5985,
                    "primary": false,
                    "required": false,
                    "default": false,
                    "public": false,
                    "setting": {
                        "use_ssl": false
                    }
                }
            ]
        },
        {
            "created_by": "system",
            "updated_by": "system",
            "comment": "",
            "name": "General",
            "category": "ds",
            "type": "general",
            "meta": {},
            "internal": true,
            "domain_enabled": false,
            "su_enabled": false,
            "su_method": null,
            "custom_fields": [],
            "automation": {
                "ansible_enabled": false,
                "ansible_config": {}
            },
            "protocols": [
                {
                    "name": "ssh",
                    "port": 22,
                    "primary": true,
                    "required": false,
                    "default": false,
                    "public": true,
                    "setting": {
                        "sftp_enabled": true,
                        "sftp_home": "/tmp"
                    }
                }
            ]
        }
    ]
    """
    platform_model = apps.get_model('assets', 'Platform')
    automation_cls = apps.get_model('assets', 'PlatformAutomation')
    platform_datas = json.loads(data)

    for platform_data in platform_datas:
        AllTypes.create_or_update_by_platform_data(
            platform_data, platform_cls=platform_model,
            automation_cls=automation_cls
        )


class Migration(migrations.Migration):
    dependencies = [
        ("assets", "0016_directory_service"),
    ]

    operations = [
        migrations.RunPython(add_ds_platforms)
    ]
@@ -3,6 +3,7 @@ from .common import *
from .custom import *
from .database import *
from .device import *
+from .ds import *
from .gpt import *
from .host import *
from .web import *
@@ -6,7 +6,7 @@ import logging
from collections import defaultdict

from django.db import models
-from django.db.models import Q
+from django.db.models import Q, Count
from django.forms import model_to_dict
from django.utils.translation import gettext_lazy as _

@@ -175,6 +175,10 @@ class Asset(NodesRelationMixin, LabeledMixin, AbsConnectivity, JSONFilterMixin,
    nodes = models.ManyToManyField(
        'assets.Node', default=default_node, related_name='assets', verbose_name=_("Nodes")
    )
+    directory_services = models.ManyToManyField(
+        'assets.DirectoryService', related_name='assets',
+        verbose_name=_("Directory service")
+    )
    is_active = models.BooleanField(default=True, verbose_name=_('Active'))
    gathered_info = models.JSONField(verbose_name=_('Gathered info'), default=dict, blank=True)  # asset info, e.g. hardware info
    custom_info = models.JSONField(verbose_name=_('Custom info'), default=dict)
@@ -201,6 +205,10 @@ class Asset(NodesRelationMixin, LabeledMixin, AbsConnectivity, JSONFilterMixin,
            info[i.name] = v
        return info

+    @lazyproperty
+    def is_directory_service(self):
+        return self.category == const.Category.DS and hasattr(self, 'ds')
+
    @lazyproperty
    def spec_info(self):
        instance = getattr(self, self.category, None)
@@ -245,9 +253,28 @@ class Asset(NodesRelationMixin, LabeledMixin, AbsConnectivity, JSONFilterMixin,
        auto_config.update(model_to_dict(automation))
        return auto_config

+    @property
+    def all_accounts(self):
+        if not self.joined_dir_svcs:
+            queryset = self.accounts.all()
+        else:
+            queryset = self.accounts.model.objects.filter(asset__in=[self.id, *self.joined_dir_svcs])
+        return queryset
+
+    @property
+    def dc_accounts(self):
+        queryset = self.accounts.model.objects.filter(asset__in=[*self.joined_dir_svcs])
+        return queryset
+
+    @lazyproperty
+    def all_valid_accounts(self):
+        queryset = (self.all_accounts.filter(is_active=True)
+                    .prefetch_related('asset', 'asset__platform'))
+        return queryset
+
    @lazyproperty
    def accounts_amount(self):
-        return self.accounts.count()
+        return self.all_accounts.count()

    def get_target_ip(self):
        return self.address
@@ -259,6 +286,41 @@ class Asset(NodesRelationMixin, LabeledMixin, AbsConnectivity, JSONFilterMixin,
        protocol = self.protocols.all().filter(name=protocol).first()
        return protocol.port if protocol else 0

+    def is_dir_svc(self):
+        return self.category == const.Category.DS
+
+    @property
+    def joined_dir_svcs(self):
+        return self.directory_services.all()
+
+    @classmethod
+    def compute_all_accounts_amount(cls, assets):
+        from .ds import DirectoryService
+        asset_ids = [asset.id for asset in assets]
+        asset_id_dc_ids_mapper = defaultdict(list)
+        dc_ids = set()
+
+        asset_dc_relations = (
+            Asset.directory_services.through.objects
+            .filter(asset_id__in=asset_ids)
+            .values_list('asset_id', 'directoryservice_id')
+        )
+        for asset_id, ds_id in asset_dc_relations:
+            dc_ids.add(ds_id)
+            asset_id_dc_ids_mapper[asset_id].append(ds_id)
+
+        directory_services = (
+            DirectoryService.objects.filter(id__in=dc_ids)
+            .annotate(accounts_amount=Count('accounts'))
+        )
+        ds_accounts_amount_mapper = {ds.id: ds.accounts_amount for ds in directory_services}
+        for asset in assets:
+            asset_dc_ids = asset_id_dc_ids_mapper.get(asset.id, [])
+            for dc_id in asset_dc_ids:
+                ds_accounts = ds_accounts_amount_mapper.get(dc_id, 0)
+                asset.accounts_amount += ds_accounts
+        return assets
+
    @property
    def is_valid(self):
        warning = ''
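`compute_all_accounts_amount` above avoids per-asset queries by fetching the asset-to-directory-service relations once and annotating each service with its account count. The same aggregation, stripped of the Django ORM (the plain-dict data shapes are assumptions for illustration):

```python
from collections import defaultdict


def add_ds_accounts(asset_amounts, relations, ds_account_counts):
    """Fold directory-service account counts into per-asset totals.

    asset_amounts: {asset_id: count of the asset's own accounts}
    relations: iterable of (asset_id, ds_id) membership pairs
    ds_account_counts: {ds_id: count of accounts on that directory service}
    Returns a new mapping with the directory-service accounts added in.
    """
    mapper = defaultdict(list)
    for asset_id, ds_id in relations:
        mapper[asset_id].append(ds_id)

    totals = dict(asset_amounts)
    for asset_id, ds_ids in mapper.items():
        for ds_id in ds_ids:
            totals[asset_id] = totals.get(asset_id, 0) + ds_account_counts.get(ds_id, 0)
    return totals
```

As in the ORM version, each relation is visited once and the per-service counts are computed once, so the cost stays linear in the number of relations rather than one query per asset.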
14
apps/assets/models/asset/ds.py
Normal file
@@ -0,0 +1,14 @@
from django.db import models
from django.utils.translation import gettext_lazy as _

from .common import Asset

__all__ = ['DirectoryService']


class DirectoryService(Asset):
    domain_name = models.CharField(max_length=128, blank=True, default='', verbose_name=_("Domain name"))

    class Meta:
        default_related_name = "ds"
        verbose_name = _("Directory service")
@@ -102,6 +102,7 @@ class Platform(LabeledMixin, JMSBaseModel):
        max_length=8, verbose_name=_("Charset")
    )
    domain_enabled = models.BooleanField(default=True, verbose_name=_("Gateway enabled"))
+    ds_enabled = models.BooleanField(default=False, verbose_name=_("DS enabled"))
    # Account related
    su_enabled = models.BooleanField(default=False, verbose_name=_("Su enabled"))
    su_method = models.CharField(max_length=32, blank=True, null=True, verbose_name=_("Su method"))
@@ -115,6 +116,11 @@ class Platform(LabeledMixin, JMSBaseModel):
    def assets_amount(self):
        return self.assets.count()

+    def save(self, *args, **kwargs):
+        if not self.ds_enabled:
+            self.ds = None
+        super().save(*args, **kwargs)
+
    @classmethod
    def default(cls):
        linux, created = cls.objects.get_or_create(
@@ -4,6 +4,7 @@ from .common import *
from .custom import *
from .database import *
from .device import *
+from .ds import *
from .gpt import *
from .host import *
from .web import *
@@ -147,7 +147,8 @@ class AssetSerializer(BulkOrgResourceModelSerializer, ResourceLabelsMixin, Writa
    protocols = AssetProtocolsSerializer(many=True, required=False, label=_('Protocols'), default=())
    accounts = AssetAccountSerializer(many=True, required=False, allow_null=True, write_only=True, label=_('Accounts'))
    nodes_display = NodeDisplaySerializer(read_only=False, required=False, label=_("Node path"))
-    platform = ObjectRelatedField(queryset=Platform.objects, required=True, label=_('Platform'), attrs=('id', 'name', 'type'))
+    platform = ObjectRelatedField(queryset=Platform.objects, required=True, label=_('Platform'),
+                                  attrs=('id', 'name', 'type'))
    accounts_amount = serializers.IntegerField(read_only=True, label=_('Accounts amount'))
    _accounts = None
@@ -159,6 +160,7 @@ class AssetSerializer(BulkOrgResourceModelSerializer, ResourceLabelsMixin, Writa
        fields_m2m = [
            'nodes', 'labels', 'protocols',
            'nodes_display', 'accounts',
+            'directory_services',
        ]
        read_only_fields = [
            'accounts_amount', 'category', 'type', 'connectivity', 'auto_config',
@@ -172,6 +174,11 @@ class AssetSerializer(BulkOrgResourceModelSerializer, ResourceLabelsMixin, Writa
            'address': {'label': _('Address')},
            'nodes_display': {'label': _('Node path')},
            'nodes': {'allow_empty': True, 'label': _("Nodes")},
+            'directory_services': {
+                'required': False,
+                'allow_empty': True,
+                'default': list, 'label': _("Directory service")
+            },
        }

    def __init__(self, *args, **kwargs):
@@ -226,15 +233,11 @@ class AssetSerializer(BulkOrgResourceModelSerializer, ResourceLabelsMixin, Writa
    @classmethod
    def setup_eager_loading(cls, queryset):
        """ Perform necessary eager loading of data. """
-        queryset = queryset.prefetch_related('domain', 'nodes', 'protocols', ) \
+        queryset = queryset.prefetch_related('domain', 'nodes', 'protocols', 'directory_services') \
            .prefetch_related('platform', 'platform__automation') \
            .annotate(category=F("platform__category")) \
            .annotate(type=F("platform__type")) \
            .annotate(accounts_amount=Count('accounts'))
        if queryset.model is Asset:
            queryset = queryset.prefetch_related('labels__label', 'labels')
        else:
            queryset = queryset.prefetch_related('asset_ptr__labels__label', 'asset_ptr__labels')
        return queryset

    @staticmethod
22
apps/assets/serializers/asset/ds.py
Normal file
@@ -0,0 +1,22 @@
from django.utils.translation import gettext_lazy as _

from assets.models import DirectoryService
from .common import AssetSerializer

__all__ = ['DSSerializer']


class DSSerializer(AssetSerializer):
    class Meta(AssetSerializer.Meta):
        model = DirectoryService
        fields = AssetSerializer.Meta.fields + [
            'domain_name',
        ]
        extra_kwargs = {
            **AssetSerializer.Meta.extra_kwargs,
            'domain_name': {
                'help_text': _('The domain part used by the directory service (e.g., AD) and appended to '
                               'the username during login, such as example.com in user@example.com.'),
                'label': _('Domain name')
            }
        }
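The `domain_name` help text above describes appending the domain to the username at login, as in `user@example.com`. A tiny sketch of that formatting rule (the function and its already-qualified check are assumptions for illustration, not JumpServer code):

```python
def login_name(username: str, domain_name: str = "") -> str:
    """Build the login name a directory service expects,
    e.g. 'alice' + 'example.com' -> 'alice@example.com'.
    Returns the bare username when no domain is configured,
    or when the name already carries a domain part."""
    if not domain_name or "@" in username:
        return username
    return f"{username}@{domain_name}"
```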
@@ -20,4 +20,5 @@ class HostGatheredInfoSerializer(serializers.Serializer):

category_gathered_serializer_map = {
    'host': HostGatheredInfoSerializer,
+    'ds': HostGatheredInfoSerializer,
}
@@ -2,7 +2,7 @@ from django.utils.translation import gettext_lazy as _
from rest_framework import serializers

from assets.const import FillType
-from assets.models import Database, Web
+from assets.models import Database, Web, DirectoryService
from common.serializers.fields import LabeledChoiceField


@@ -28,7 +28,7 @@ class WebSpecSerializer(serializers.ModelSerializer):
        # when viewing Web asset details
        self.pop_fields_if_need(fields)
        return fields

    def is_retrieve(self):
        try:
            self.context.get('request').method and self.parent.instance.web
@@ -51,9 +51,14 @@ class WebSpecSerializer(serializers.ModelSerializer):
        return fields


+class DsSpecSerializer(serializers.ModelSerializer):
+    class Meta:
+        model = DirectoryService
+        fields = ['domain_name']
+
+
category_spec_serializer_map = {
    'database': DatabaseSpecSerializer,
    'web': WebSpecSerializer,
+    'ds': DsSpecSerializer,
}
@@ -4,12 +4,11 @@ from django.db.models import Count, Q
from django.utils.translation import gettext_lazy as _
from rest_framework import serializers

-from assets.models.gateway import Gateway
from common.serializers import ResourceLabelsMixin
from common.serializers.fields import ObjectRelatedField
from orgs.mixins.serializers import BulkOrgResourceModelSerializer
from .gateway import GatewayWithAccountSecretSerializer
-from ..models import Domain
+from ..models import Domain, Gateway

__all__ = ['DomainSerializer', 'DomainWithGatewaySerializer', 'DomainListSerializer']

@@ -55,11 +54,6 @@ class DomainSerializer(ResourceLabelsMixin, BulkOrgResourceModelSerializer):
        validated_data['assets'] = assets + gateways
        return super().update(instance, validated_data)

-    @classmethod
-    def setup_eager_loading(cls, queryset):
-        queryset = queryset.prefetch_related('labels', 'labels__label')
-        return queryset
-

class DomainListSerializer(DomainSerializer):
    class Meta(DomainSerializer.Meta):
@@ -195,7 +195,7 @@ class PlatformSerializer(ResourceLabelsMixin, CommonSerializerMixin, WritableNes
        fields_m2m = ['assets', 'assets_amount']
        fields = fields_small + fields_m2m + [
            "protocols", "domain_enabled", "su_enabled", "su_method",
-            "automation", "comment", "custom_fields", "labels"
+            "ds_enabled", "automation", "comment", "custom_fields", "labels"
        ] + read_only_fields
        extra_kwargs = {
            "su_enabled": {
@@ -220,6 +220,11 @@ class PlatformSerializer(ResourceLabelsMixin, CommonSerializerMixin, WritableNes
    def set_initial_value(self):
        if not hasattr(self, 'initial_data'):
            return

+        name = self.initial_data.get('name')
+        if ' ' in name:
+            self.initial_data['name'] = name.replace(' ', '-')
+
        if self.instance:
            return
        if not self.initial_data.get('automation'):
@@ -16,6 +16,7 @@ router.register(r'databases', api.DatabaseViewSet, 'database')
router.register(r'webs', api.WebViewSet, 'web')
router.register(r'clouds', api.CloudViewSet, 'cloud')
router.register(r'gpts', api.GPTViewSet, 'gpt')
+router.register(r'directories', api.DSViewSet, 'ds')
router.register(r'customs', api.CustomViewSet, 'custom')
router.register(r'platforms', api.AssetPlatformViewSet, 'platform')
router.register(r'nodes', api.NodeViewSet, 'node')
@@ -64,8 +64,8 @@ class JobLogAuditViewSet(OrgReadonlyModelViewSet):

class JobsAuditViewSet(OrgModelViewSet):
    model = Job
-    search_fields = ['creator__name']
-    filterset_fields = ['creator__name']
+    search_fields = ['creator__name', 'args', 'name']
+    filterset_fields = ['creator__name', 'args', 'name']
    serializer_class = JobsAuditSerializer
    ordering = ['-is_periodic', '-date_updated']
    http_method_names = ['get', 'options', 'patch']
@@ -257,10 +257,18 @@ class OperateLogViewSet(OrgReadonlyModelViewSet):
        return super().get_serializer_class()

    def get_queryset(self):
-        qs = OperateLog.objects.all()
-        if self.is_action_detail:
-            with tmp_to_root_org():
-                qs |= OperateLog.objects.filter(org_id=Organization.SYSTEM_ID)
+        current_org_id = str(current_org.id)
+
+        with tmp_to_root_org():
+            qs = OperateLog.objects.all()
+            if current_org_id != Organization.ROOT_ID:
+                filtered_org_ids = {current_org_id}
+                if current_org_id == Organization.DEFAULT_ID:
+                    filtered_org_ids.update(Organization.INTERNAL_IDS)
+                if self.is_action_detail:
+                    filtered_org_ids.add(Organization.SYSTEM_ID)
+                qs = OperateLog.objects.filter(org_id__in=filtered_org_ids)

        es_config = settings.OPERATE_LOG_ELASTICSEARCH_CONFIG
        if es_config:
            engine_mod = import_module(TYPE_ENGINE_MAPPING['es'])
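The rewritten `get_queryset` above builds a set of organization IDs to filter operate logs by. The selection rule in isolation (plain strings stand in for JumpServer's real org ID constants, and returning `None` for "no filter" is an assumption for illustration):

```python
def visible_org_ids(current_org_id, root_id, default_id, internal_ids,
                    system_id, is_action_detail):
    """Mirror of the org-filter logic in the diff: the root org sees
    everything (None means no filtering); otherwise start from the
    current org, widen the DEFAULT org to include the internal orgs,
    and add the SYSTEM org when a detail action is requested."""
    if current_org_id == root_id:
        return None  # root org: no filtering applied
    org_ids = {current_org_id}
    if current_org_id == default_id:
        org_ids.update(internal_ids)
    if is_action_detail:
        org_ids.add(system_id)
    return org_ids
```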
@@ -24,6 +24,7 @@ class ActionChoices(TextChoices):
    update = "update", _("Update")
    delete = "delete", _("Delete")
    create = "create", _("Create")
+    export = "export", _("Export")
    # Activities action
    download = "download", _("Download")
    connect = "connect", _("Connect")
@@ -9,7 +9,6 @@ import common.db.encoder


class Migration(migrations.Migration):
-
    initial = True

    dependencies = [
@@ -19,10 +18,14 @@ class Migration(migrations.Migration):
        migrations.CreateModel(
            name='ActivityLog',
            fields=[
-                ('org_id', models.CharField(blank=True, db_index=True, default='', max_length=36, verbose_name='Organization')),
+                ('org_id',
+                 models.CharField(blank=True, db_index=True, default='', max_length=36, verbose_name='Organization')),
                ('id', models.UUIDField(default=uuid.uuid4, primary_key=True, serialize=False)),
-                ('type', models.CharField(choices=[('O', 'Operate log'), ('S', 'Session log'), ('L', 'Login log'), ('T', 'Task')], default=None, max_length=2, null=True, verbose_name='Activity type')),
-                ('resource_id', models.CharField(blank=True, db_index=True, default='', max_length=36, verbose_name='Resource')),
+                ('type', models.CharField(
+                    choices=[('O', 'Operate log'), ('S', 'Session log'), ('L', 'Login log'), ('T', 'Task')],
+                    default=None, max_length=2, null=True, verbose_name='Activity type')),
+                ('resource_id',
+                 models.CharField(blank=True, db_index=True, default='', max_length=36, verbose_name='Resource')),
                ('datetime', models.DateTimeField(auto_now=True, db_index=True, verbose_name='Datetime')),
                ('detail', models.TextField(blank=True, default='', verbose_name='Detail')),
                ('detail_id', models.CharField(default=None, max_length=36, null=True, verbose_name='Detail ID')),
@@ -35,13 +38,17 @@ class Migration(migrations.Migration):
        migrations.CreateModel(
            name='FTPLog',
            fields=[
-                ('org_id', models.CharField(blank=True, db_index=True, default='', max_length=36, verbose_name='Organization')),
+                ('org_id',
+                 models.CharField(blank=True, db_index=True, default='', max_length=36, verbose_name='Organization')),
                ('id', models.UUIDField(default=uuid.uuid4, primary_key=True, serialize=False)),
                ('user', models.CharField(max_length=128, verbose_name='User')),
                ('remote_addr', models.CharField(blank=True, max_length=128, null=True, verbose_name='Remote addr')),
                ('asset', models.CharField(max_length=1024, verbose_name='Asset')),
                ('account', models.CharField(max_length=128, verbose_name='Account')),
-                ('operate', models.CharField(choices=[('mkdir', 'Mkdir'), ('rmdir', 'Rmdir'), ('delete', 'Delete'), ('upload', 'Upload'), ('rename', 'Rename'), ('symlink', 'Symlink'), ('download', 'Download'), ('rename_dir', 'Rename dir')], max_length=16, verbose_name='Operate')),
+                ('operate', models.CharField(
+                    choices=[('mkdir', 'Mkdir'), ('rmdir', 'Rmdir'), ('delete', 'Delete'), ('upload', 'Upload'),
+                             ('rename', 'Rename'), ('symlink', 'Symlink'), ('download', 'Download'),
+                             ('rename_dir', 'Rename dir')], max_length=16, verbose_name='Operate')),
                ('filename', models.CharField(max_length=1024, verbose_name='Filename')),
                ('is_success', models.BooleanField(default=True, verbose_name='Success')),
                ('date_start', models.DateTimeField(auto_now_add=True, db_index=True, verbose_name='Date start')),
@@ -55,13 +62,32 @@ class Migration(migrations.Migration):
        migrations.CreateModel(
            name='OperateLog',
            fields=[
-                ('org_id', models.CharField(blank=True, db_index=True, default='', max_length=36, verbose_name='Organization')),
+                ('org_id',
+                 models.CharField(blank=True, db_index=True, default='', max_length=36, verbose_name='Organization')),
                ('id', models.UUIDField(default=uuid.uuid4, primary_key=True, serialize=False)),
                ('user', models.CharField(max_length=128, verbose_name='User')),
-                ('action', models.CharField(choices=[('view', 'View'), ('update', 'Update'), ('delete', 'Delete'), ('create', 'Create'), ('download', 'Download'), ('connect', 'Connect'), ('login', 'Login'), ('change_password', 'Change password'), ('accept', 'Accept'), ('review', 'Review'), ('notice', 'Notifications'), ('reject', 'Reject'), ('approve', 'Approve'), ('close', 'Close'), ('finished', 'Finished')], max_length=16, verbose_name='Action')),
+                ('action', models.CharField(choices=[
+                    ("view", "View"),
+                    ("update", "Update"),
+                    ("delete", "Delete"),
+                    ("create", "Create"),
+                    ("export", "Export"),
+                    ("download", "Download"),
+                    ("connect", "Connect"),
+                    ("login", "Login"),
+                    ("change_password", "Change password"),
+                    ("accept", "Accept"),
+                    ("review", "Review"),
+                    ("notice", "Notifications"),
+                    ("reject", "Reject"),
+                    ("approve", "Approve"),
+                    ("close", "Close"),
+                    ("finished", "Finished"),
+                ], max_length=16, verbose_name='Action')),
                ('resource_type', models.CharField(max_length=64, verbose_name='Resource Type')),
                ('resource', models.CharField(max_length=128, verbose_name='Resource')),
-                ('resource_id', models.CharField(blank=True, db_index=True, default='', max_length=128, verbose_name='Resource')),
+                ('resource_id',
+                 models.CharField(blank=True, db_index=True, default='', max_length=128, verbose_name='Resource')),
                ('remote_addr', models.CharField(blank=True, max_length=128, null=True, verbose_name='Remote addr')),
                ('datetime', models.DateTimeField(auto_now=True, db_index=True, verbose_name='Datetime')),
                ('diff', models.JSONField(default=dict, encoder=common.db.encoder.ModelJSONFieldEncoder, null=True)),
@@ -89,14 +115,18 @@ class Migration(migrations.Migration):
            fields=[
                ('id', models.UUIDField(default=uuid.uuid4, primary_key=True, serialize=False)),
                ('username', models.CharField(max_length=128, verbose_name='Username')),
-                ('type', models.CharField(choices=[('W', 'Web'), ('T', 'Terminal'), ('U', 'Unknown')], max_length=2, verbose_name='Login type')),
+                ('type', models.CharField(choices=[('W', 'Web'), ('T', 'Terminal'), ('U', 'Unknown')], max_length=2,
+                                          verbose_name='Login type')),
                ('ip', models.GenericIPAddressField(verbose_name='Login IP')),
                ('city', models.CharField(blank=True, max_length=254, null=True, verbose_name='Login city')),
                ('user_agent', models.CharField(blank=True, max_length=254, null=True, verbose_name='User agent')),
-                ('mfa', models.SmallIntegerField(choices=[(0, 'Disabled'), (1, 'Enabled'), (2, '-')], default=2, verbose_name='MFA')),
+                ('mfa', models.SmallIntegerField(choices=[(0, 'Disabled'), (1, 'Enabled'), (2, '-')], default=2,
+                                                 verbose_name='MFA')),
                ('reason', models.CharField(blank=True, default='', max_length=128, verbose_name='Reason')),
-                ('status', models.BooleanField(choices=[(1, 'Success'), (0, 'Failed')], default=1, verbose_name='Status')),
-                ('datetime', models.DateTimeField(db_index=True, default=django.utils.timezone.now, verbose_name='Login Date')),
+                ('status',
+                 models.BooleanField(choices=[(1, 'Success'), (0, 'Failed')], default=1, verbose_name='Status')),
+                ('datetime',
+                 models.DateTimeField(db_index=True, default=django.utils.timezone.now, verbose_name='Login Date')),
                ('backend', models.CharField(default='', max_length=32, verbose_name='Auth backend')),
            ],
            options={
@@ -112,7 +142,8 @@ class Migration(migrations.Migration):
                ('key', models.CharField(max_length=128, verbose_name='Session key')),
                ('city', models.CharField(blank=True, max_length=254, null=True, verbose_name='Login city')),
                ('user_agent', models.CharField(blank=True, max_length=254, null=True, verbose_name='User agent')),
-                ('type', models.CharField(choices=[('W', 'Web'), ('T', 'Terminal'), ('U', 'Unknown')], max_length=2, verbose_name='Login type')),
+                ('type', models.CharField(choices=[('W', 'Web'), ('T', 'Terminal'), ('U', 'Unknown')], max_length=2,
+                                          verbose_name='Login type')),
                ('backend', models.CharField(default='', max_length=32, verbose_name='Auth backend')),
                ('date_created', models.DateTimeField(blank=True, null=True, verbose_name='Login date')),
            ],
@@ -35,14 +35,13 @@ class JobLogSerializer(JobExecutionSerializer):


class JobsAuditSerializer(JobSerializer):
-    material = serializers.ReadOnlyField(label=_("Command"))
    summary = serializers.ReadOnlyField(label=_("Summary"))
    crontab = serializers.ReadOnlyField(label=_("Execution cycle"))
    is_periodic_display = serializers.BooleanField(read_only=True, source='is_periodic')

    class Meta(JobSerializer.Meta):
        read_only_fields = [
-            "id", 'name', 'args', 'material', 'type', 'crontab', 'interval', 'date_last_run', 'summary', 'created_by',
+            "id", 'name', 'args', 'type', 'crontab', 'interval', 'date_last_run', 'summary', 'created_by',
            'is_periodic_display'
        ]
        fields = read_only_fields + ['is_periodic']
@@ -189,6 +188,9 @@ class ActivityUnionLogSerializer(serializers.Serializer):
class FileSerializer(serializers.Serializer):
    file = serializers.FileField(allow_empty_file=True)

+    class Meta:
+        ref_name = 'AuditFileSerializer'
+

class UserSessionSerializer(serializers.ModelSerializer):
    type = LabeledChoiceField(choices=LoginTypeChoices.choices, label=_("Type"))
@@ -6,12 +6,16 @@ from django.core.exceptions import ObjectDoesNotExist
from django.db import models
from django.db.models import F, Value, CharField
from django.db.models.functions import Concat
+from django.utils import translation
from itertools import chain

from common.db.fields import RelatedManager
from common.utils import validate_ip, get_ip_city, get_logger
from common.utils.timezone import as_current_tz
-from .const import DEFAULT_CITY
+from .const import DEFAULT_CITY, ActivityChoices as LogChoice
+from .handler import create_or_update_operate_log
+from .models import ActivityLog


logger = get_logger(__name__)
@@ -140,3 +144,15 @@ def construct_userlogin_usernames(user_queryset):
    ).values_list("usernames_combined_field", flat=True)
    usernames = list(chain(usernames_original, usernames_combined))
    return usernames
+
+
+def record_operate_log_and_activity_log(ids, action, detail, model, **kwargs):
+    from orgs.utils import current_org
+
+    org_id = current_org.id
+    with translation.override('en'):
+        resource_type = model._meta.verbose_name
+        create_or_update_operate_log(action, resource_type, force=True, **kwargs)
+    base_data = {'type': LogChoice.operate_log, 'detail': detail, 'org_id': org_id}
+    activities = [ActivityLog(resource_id=r_id, **base_data) for r_id in ids]
+    ActivityLog.objects.bulk_create(activities)
@@ -35,7 +35,8 @@ from ..models import ConnectionToken, AdminConnectionToken, date_expired_default
 from ..serializers import (
     ConnectionTokenSerializer, ConnectionTokenSecretSerializer,
     SuperConnectionTokenSerializer, ConnectTokenAppletOptionSerializer,
-    ConnectionTokenReusableSerializer, ConnectTokenVirtualAppOptionSerializer
+    ConnectionTokenReusableSerializer, ConnectTokenVirtualAppOptionSerializer,
+    AdminConnectionTokenSerializer,
 )
 
 __all__ = ['ConnectionTokenViewSet', 'SuperConnectionTokenViewSet', 'AdminConnectionTokenViewSet']
@@ -407,22 +408,22 @@ class ConnectionTokenViewSet(AuthFaceMixin, ExtraActionApiMixin, RootOrgViewMixi
     def validate_exchange_token(self, token):
         user = token.user
         asset = token.asset
-        account_name = token.account
-        _data = self._validate(user, asset, account_name, token.protocol, token.connect_method)
+        account_alias = token.account
+        _data = self._validate(user, asset, account_alias, token.protocol, token.connect_method)
         for k, v in _data.items():
             setattr(token, k, v)
         return token
 
-    def _validate(self, user, asset, account_name, protocol, connect_method):
+    def _validate(self, user, asset, account_alias, protocol, connect_method):
         data = dict()
         data['org_id'] = asset.org_id
         data['user'] = user
         data['value'] = random_string(16)
 
-        if account_name == AliasAccount.ANON and asset.category not in ['web', 'custom']:
+        if account_alias == AliasAccount.ANON and asset.category not in ['web', 'custom']:
             raise ValidationError(_('Anonymous account is not supported for this asset'))
 
-        account = self._validate_perm(user, asset, account_name, protocol)
+        account = self._validate_perm(user, asset, account_alias, protocol)
         if account.has_secret:
             data['input_secret'] = ''
 
@@ -436,18 +437,16 @@ class ConnectionTokenViewSet(AuthFaceMixin, ExtraActionApiMixin, RootOrgViewMixi
         if ticket or self.need_face_verify:
             data['is_active'] = False
         if self.face_monitor_token:
-            FaceMonitorContext.get_or_create_context(self.face_monitor_token,
-                                                     self.request.user.id)
+            FaceMonitorContext.get_or_create_context(self.face_monitor_token, self.request.user.id)
             data['face_monitor_token'] = self.face_monitor_token
         return data
 
     @staticmethod
-    def get_permed_account(user, asset, account_name, protocol):
-        from perms.utils.asset_perm import PermAssetDetailUtil
-        return PermAssetDetailUtil(user, asset).validate_permission(account_name, protocol)
+    def get_permed_account(user, asset, account_alias, protocol):
+        return ConnectionToken.get_user_permed_account(user, asset, account_alias, protocol)
 
-    def _validate_perm(self, user, asset, account_name, protocol):
-        account = self.get_permed_account(user, asset, account_name, protocol)
+    def _validate_perm(self, user, asset, account_alias, protocol):
+        account = self.get_permed_account(user, asset, account_alias, protocol)
         if not account or not account.actions:
             msg = _('Account not found')
             raise JMSException(code='perm_account_invalid', detail=msg)
@@ -617,7 +616,7 @@ class SuperConnectionTokenViewSet(ConnectionTokenViewSet):
             raise PermissionDenied('Not allow to view secret')
 
         token_id = request.data.get('id') or ''
-        token = get_object_or_404(ConnectionToken, pk=token_id)
+        token = ConnectionToken.get_typed_connection_token(token_id)
         token.is_valid()
         serializer = self.get_serializer(instance=token)
 
@@ -670,6 +669,9 @@ class SuperConnectionTokenViewSet(ConnectionTokenViewSet):
 
 
 class AdminConnectionTokenViewSet(ConnectionTokenViewSet):
+    serializer_classes = {
+        'default': AdminConnectionTokenSerializer,
+    }
 
     def check_permissions(self, request):
         user = request.user
@@ -677,11 +679,7 @@ class AdminConnectionTokenViewSet(ConnectionTokenViewSet):
             self.permission_denied(request)
 
     def get_queryset(self):
-        return AdminConnectionToken.objects.all()
+        return AdminConnectionToken.objects.all().filter(user=self.request.user)
 
     def get_permed_account(self, user, asset, account_name, protocol):
-        with tmp_to_org(asset.org):
-            account = asset.accounts.all().active().get(name=account_name)
-            account.actions = ActionChoices.all()
-            account.date_expired = timezone.now() + timezone.timedelta(days=365)
-            return account
+        return AdminConnectionToken.get_user_permed_account(user, asset, account_name, protocol)
@@ -136,7 +136,7 @@ class PrepareRequestMixin:
             "en": {
                 "name": "JumpServer",
                 "displayname": "JumpServer",
-                "url": "https://jumpserver.org/"
+                "url": "https://jumpserver.com/"
             }
         },
     }
30
apps/authentication/backends/test/test_drf.py
Normal file
@@ -0,0 +1,30 @@
+import os
+import requests
+from httpsig.requests_auth import HTTPSignatureAuth
+import datetime
+
+
+def test_drf_ak():
+    KEY_ID = os.environ.get('KEY_ID') or ''
+    SECRET = os.environ.get('KEY_SECRET') or ''
+
+    signature_headers = ['(request-target)', 'date']
+    now = datetime.datetime.now()
+    headers = {
+        'Host': 'localhost:8000',
+        'Accept': 'application/json',
+        'Date': now.strftime('%a, %d %b %Y %H:%M:%S GMT'),
+    }
+
+    # url = 'http://localhost:8080/api/v1/assets/assets/?limit=100'
+    url = 'http://localhost:8080/api/v1/users/users/?limit=100'
+
+    auth = HTTPSignatureAuth(key_id=KEY_ID, secret=SECRET,
+                             algorithm='hmac-sha256',
+                             headers=signature_headers)
+    req = requests.get(url, auth=auth, headers=headers)
+    print(req.content)
+
+
+if __name__ == '__main__':
+    test_drf_ak()
@@ -31,6 +31,7 @@ class ConfirmType(TextChoices):
 class MFAType(TextChoices):
     OTP = 'otp', _('OTP')
     SMS = 'sms', _('SMS')
+    Email = 'email', _('Email')
     Face = 'face', _('Face Recognition')
     Radius = 'otp_radius', _('Radius')
     Custom = 'mfa_custom', _('Custom')
@@ -48,6 +49,6 @@ class FaceMonitorActionChoices(TextChoices):
 
 
 class ConnectionTokenType(TextChoices):
-    ADMIN = 'admin', 'Admin'
-    SUPER = 'super', 'Super'
-    USER = 'user', 'User'
+    ADMIN = 'admin', 'Admin'  # Admin token, can access all resources
+    SUPER = 'super', 'Super'  # Super token, can be generated for different users, subject to user-token limits
+    USER = 'user', 'User'  # User token, can only access the specified resources
@@ -3,3 +3,4 @@ from .face import MFAFace
 from .otp import MFAOtp, otp_failed_msg
 from .radius import MFARadius
 from .sms import MFASms
+from .email import MFAEmail
73
apps/authentication/mfa/email.py
Normal file
@@ -0,0 +1,73 @@
+from django.conf import settings
+from django.template.loader import render_to_string
+from django.utils.translation import gettext_lazy as _
+
+from common.utils import random_string
+from common.utils.verify_code import SendAndVerifyCodeUtil
+from settings.utils import get_login_title
+from .base import BaseMFA
+from ..const import MFAType
+
+email_failed_msg = _("Email verify code invalid")
+
+
+class MFAEmail(BaseMFA):
+    name = MFAType.Email.value
+    display_name = MFAType.Email.name
+    placeholder = _('Email verification code')
+
+    def _check_code(self, code):
+        assert self.is_authenticated()
+        sender_util = SendAndVerifyCodeUtil(self.user.email, backend=self.name)
+        ok = False
+        msg = ''
+        try:
+            ok = sender_util.verify(code)
+        except Exception as e:
+            msg = str(e)
+        return ok, msg
+
+    def is_active(self):
+        if not self.is_authenticated():
+            return True
+        return self.user.email
+
+    @staticmethod
+    def challenge_required():
+        return True
+
+    def send_challenge(self):
+        code = random_string(settings.SMS_CODE_LENGTH, lower=False, upper=False)
+        subject = '%s: %s' % (get_login_title(), _('MFA code'))
+        context = {
+            'user': self.user, 'title': subject, 'code': code,
+        }
+        message = render_to_string('authentication/_msg_mfa_email_code.html', context)
+        content = {'subject': subject, 'message': message}
+        sender_util = SendAndVerifyCodeUtil(
+            self.user.email, code=code, backend=self.name, timeout=60, **content
+        )
+        sender_util.gen_and_send_async()
+
+    @staticmethod
+    def global_enabled():
+        return settings.SECURITY_MFA_BY_EMAIL
+
+    def disable(self):
+        return '/ui/#/profile/index'
+
+    def get_enable_url(self) -> str:
+        return ''
+
+    def can_disable(self) -> bool:
+        return False
+
+    def get_disable_url(self):
+        return ''
+
+    @staticmethod
+    def help_text_of_enable():
+        return ''
+
+    def help_text_of_disable(self):
+        return ''
@@ -5,6 +5,7 @@ from datetime import timedelta
 from django.conf import settings
 from django.core.cache import cache
 from django.db import models
+from django.shortcuts import get_object_or_404
 from django.utils import timezone
 from django.utils.translation import gettext_lazy as _
 from rest_framework.exceptions import PermissionDenied
@@ -15,7 +16,7 @@ from assets.const.host import GATEWAY_NAME
 from authentication.const import ConnectionTokenType
 from common.db.fields import EncryptTextField
 from common.exceptions import JMSException
-from common.utils import lazyproperty, pretty_string, bulk_get
+from common.utils import lazyproperty, pretty_string, bulk_get, is_uuid
 from common.utils.timezone import as_current_tz
 from orgs.mixins.models import JMSOrgBaseModel
 from orgs.utils import tmp_to_org
@@ -70,9 +71,15 @@ class ConnectionToken(JMSOrgBaseModel):
         ]
         verbose_name = _('Connection token')
 
-    def save(self, *args, **kwargs):
-        self.type = self._meta.model._type
-        return super().save(*args, **kwargs)
+    @classmethod
+    def get_typed_connection_token(cls, token_id):
+        token = get_object_or_404(cls, id=token_id)
+
+        if token.type == ConnectionTokenType.ADMIN.value:
+            token = AdminConnectionToken.objects.get(id=token_id)
+        else:
+            token = ConnectionToken.objects.get(id=token_id)
+        return token
 
     @property
     def is_expired(self):
@@ -87,6 +94,7 @@ class ConnectionToken(JMSOrgBaseModel):
         return int(seconds)
 
     def save(self, *args, **kwargs):
+        self.type = self._type
         self.asset_display = pretty_string(self.asset, max_length=128)
         self.user_display = pretty_string(self.user, max_length=128)
         return super().save(*args, **kwargs)
@@ -112,12 +120,35 @@ class ConnectionToken(JMSOrgBaseModel):
         self.date_expired = date_expired_default()
         self.save()
 
+    @classmethod
+    def get_user_permed_account(cls, user, asset, account_alias, protocol):
+        from perms.utils import PermAssetDetailUtil
+        permed_account = PermAssetDetailUtil(user, asset) \
+            .validate_permission(account_alias, protocol)
+        return permed_account
+
+    @classmethod
+    def get_asset_accounts_by_alias(cls, asset, alias):
+        """
+        Get an account on the asset
+        :param alias: account alias
+        :return: account object
+        """
+        if is_uuid(alias):
+            kwargs = {'id': alias}
+        else:
+            kwargs = {'name': alias}
+
+        with tmp_to_org(asset.org_id):
+            account = asset.all_valid_accounts.filter(**kwargs).first()
+        return account
+
+    def get_permed_account(self):
+        return self.get_user_permed_account(self.user, self.asset, self.account, self.protocol)
+
     @lazyproperty
     def permed_account(self):
-        from perms.utils import PermAssetDetailUtil
-        permed_account = PermAssetDetailUtil(self.user, self.asset) \
-            .validate_permission(self.account, self.protocol)
-        return permed_account
+        return self.get_permed_account()
 
     @lazyproperty
     def actions(self):
@@ -148,7 +179,8 @@ class ConnectionToken(JMSOrgBaseModel):
         if timezone.now() - self.date_created < timedelta(seconds=60):
             return True, None
 
-        if not self.permed_account or not self.permed_account.actions:
+        permed_account = self.get_permed_account()
+        if not permed_account or not permed_account.actions:
             msg = 'user `{}` not has asset `{}` permission for login `{}`'.format(
                 self.user, self.asset, self.account
             )
@@ -191,6 +223,8 @@ class ConnectionToken(JMSOrgBaseModel):
             'alternate shell:s': app,
             'remoteapplicationcmdline:s': cmdline_b64,
             'disableconnectionsharing:i': '1',
+            'bitmapcachepersistenable:i': '0',  # bitmap-cache settings, for session recording and audit
+            'bitmapcachesize:i': '1500',
         }
         return options
 
@@ -237,6 +271,21 @@ class ConnectionToken(JMSOrgBaseModel):
             cache.delete(lock_key)
         return True
 
+    def set_ad_domain_if_need(self, account):
+        if not self.protocol == 'rdp':
+            return
+        if account.ds_domain:
+            return
+
+        rdp = self.asset.platform.protocols.filter(name='rdp').first()
+        if not rdp or not rdp.setting:
+            return
+
+        ad_domain = rdp.setting.get('ad_domain')
+        if ad_domain:
+            # the serializer's account username uses full_username, hence this assignment
+            account.ds_domain = ad_domain
+
     @lazyproperty
     def account_object(self):
         if not self.asset:
@@ -248,9 +297,11 @@ class ConnectionToken(JMSOrgBaseModel):
                 input_secret=self.input_secret, from_permed=False
             )
         else:
-            account = self.asset.accounts.filter(name=self.account).first()
+            account = self.get_asset_accounts_by_alias(self.asset, self.account)
             if not account.secret and self.input_secret:
                 account.secret = self.input_secret
+            self.set_ad_domain_if_need(account)
 
         return account
 
     @lazyproperty
@@ -317,4 +368,17 @@ class AdminConnectionToken(ConnectionToken):
         return (timezone.now() + timezone.timedelta(days=365)).timestamp()
 
     def is_valid(self):
-        return True
+        return super().is_valid()
+
+    @classmethod
+    def get_user_permed_account(cls, user, asset, account_alias, protocol):
+        """
+        An admin token can access accounts on all assets
+        """
+        account = cls.get_asset_accounts_by_alias(asset, account_alias)
+        if not account:
+            return None
+
+        account.actions = ActionChoices.all()
+        account.date_expired = timezone.now() + timezone.timedelta(days=5)
+        return account
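The `get_asset_accounts_by_alias` change above dispatches on whether the alias is a UUID (then it filters by account id) or a plain string (then by name). A minimal standalone sketch of that dispatch, with `is_uuid` approximated via `uuid.UUID` parsing (the real helper lives in `common.utils`):

```python
import uuid


def is_uuid(value):
    # Hypothetical stand-in for common.utils.is_uuid: parseable -> UUID
    try:
        uuid.UUID(str(value))
        return True
    except ValueError:
        return False


def build_account_lookup(alias):
    # Mirrors get_asset_accounts_by_alias: UUID aliases filter by id,
    # everything else filters by account name
    if is_uuid(alias):
        return {'id': alias}
    return {'name': alias}


print(build_account_lookup('root'))  # {'name': 'root'}
print(build_account_lookup(str(uuid.uuid4())))
```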
@@ -40,6 +40,7 @@ class _ConnectionTokenAssetSerializer(serializers.ModelSerializer):
 
 class _SimpleAccountSerializer(serializers.ModelSerializer):
     secret_type = LabeledChoiceField(choices=SecretType.choices, required=False, label=_('Secret type'))
+    username = serializers.CharField(label=_('Username'), source='full_username', read_only=True)
 
     class Meta:
         model = Account
@@ -49,6 +50,7 @@ class _SimpleAccountSerializer(serializers.ModelSerializer):
 class _ConnectionTokenAccountSerializer(serializers.ModelSerializer):
     su_from = serializers.SerializerMethodField(label=_('Su from'))
     secret_type = LabeledChoiceField(choices=SecretType.choices, required=False, label=_('Secret type'))
+    username = serializers.CharField(label=_('Username'), source='full_username', read_only=True)
 
     class Meta:
         model = Account
@@ -4,11 +4,11 @@ from rest_framework import serializers
 from common.serializers import CommonModelSerializer
 from common.serializers.fields import EncryptedField
 from perms.serializers.permission import ActionChoicesField
-from ..models import ConnectionToken
+from ..models import ConnectionToken, AdminConnectionToken
 
 __all__ = [
     'ConnectionTokenSerializer', 'SuperConnectionTokenSerializer',
-    'ConnectionTokenReusableSerializer',
+    'ConnectionTokenReusableSerializer', 'AdminConnectionTokenSerializer',
 ]
 
 
@@ -74,3 +74,7 @@ class SuperConnectionTokenSerializer(ConnectionTokenSerializer):
 
     def get_user(self, attrs):
         return attrs.get('user')
+
+
+class AdminConnectionTokenSerializer(ConnectionTokenSerializer):
+    class Meta(ConnectionTokenSerializer.Meta):
+        model = AdminConnectionToken
@@ -0,0 +1,18 @@
+{% load i18n %}
+
+<div style="width: 100%; text-align: center">
+    <table style="margin: 0 auto; border: 1px solid #ccc; border-collapse: collapse; width: 60%">
+        <tr style="background-color: #1ab394; color: white">
+            <th style="height: 80px;">{{ title }}</th>
+        </tr>
+        <tr style="border: 1px solid #eee;">
+            <td style="height: 50px;">{% trans 'Hello' %} {{ user.name }},</td>
+        </tr>
+        <tr style="border: 1px solid #eee">
+            <td style="height: 50px;">{% trans 'MFA code' %}: <span style="font-weight: bold;">{{ code }}</span></td>
+        </tr>
+        <tr style="border: 1px solid #eee">
+            <td style="height: 30px;">{% trans 'The validity period of the verification code is one minute' %}</td>
+        </tr>
+    </table>
+</div>
30
apps/common/api/decorator.py
Normal file
@@ -0,0 +1,30 @@
+from common.utils import get_logger
+
+
+logger = get_logger(__file__)
+
+
+def deprecated_api(replacement=None, sunset_date=None):
+    """Deprecation decorator for class-based views"""
+    def decorator(cls):
+        original_dispatch = cls.dispatch
+
+        def new_dispatch(self, request, *args, **kwargs):
+            logger.warning(
+                f'The client {request.get_host()} calls the deprecated interface: {request.path}'
+            )
+            response = original_dispatch(self, request, *args, **kwargs)
+            response.headers["Deprecation"] = "true"
+            if replacement:
+                response.headers["Link"] = f'<{replacement}>; rel="deprecation"'
+            if sunset_date:
+                response.headers["Sunset"] = sunset_date
+            if hasattr(response, "data") and isinstance(response.data, dict):
+                response.data.update({
+                    'warning': f'This interface has been deprecated. Please use {replacement} instead.'
+                })
+            return response
+
+        cls.dispatch = new_dispatch
+        return cls
+    return decorator
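The decorator above wraps `dispatch` to stamp `Deprecation`, `Link`, and `Sunset` headers on every response. Outside Django, the same wrapping pattern can be exercised against a stub; `FakeResponse` and `OldThingView` below are hypothetical stand-ins, and the logging and response-body warning are omitted:

```python
class FakeResponse:
    # Minimal stand-in for a DRF Response: a headers dict plus a data payload
    def __init__(self, data):
        self.headers = {}
        self.data = data


def deprecated_api(replacement=None, sunset_date=None):
    # Same shape as the decorator in the diff, applied to a plain view class
    def decorator(cls):
        original_dispatch = cls.dispatch

        def new_dispatch(self, request, *args, **kwargs):
            response = original_dispatch(self, request, *args, **kwargs)
            response.headers["Deprecation"] = "true"
            if replacement:
                response.headers["Link"] = f'<{replacement}>; rel="deprecation"'
            if sunset_date:
                response.headers["Sunset"] = sunset_date
            return response

        cls.dispatch = new_dispatch
        return cls
    return decorator


@deprecated_api(replacement='/api/v2/things/', sunset_date='2026-01-01')
class OldThingView:
    def dispatch(self, request, *args, **kwargs):
        return FakeResponse({'things': []})


resp = OldThingView().dispatch(None)
print(resp.headers)
```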
@@ -1,6 +1,7 @@
 # -*- coding: utf-8 -*-
 #
 from collections import defaultdict
+from contextlib import nullcontext
 from itertools import chain
 from typing import Callable
 
@@ -15,6 +16,7 @@ from common.drf.filters import (
     IDNotFilterBackend, NotOrRelFilterBackend, LabelFilterBackend
 )
 from common.utils import get_logger, lazyproperty
+from orgs.utils import tmp_to_org, tmp_to_root_org
 from .action import RenderToJsonMixin
 from .serializer import SerializerMixin
 
@@ -95,7 +97,10 @@ class QuerySetMixin:
     get_queryset: Callable
 
     def get_queryset(self):
-        queryset = super().get_queryset()
+        return super().get_queryset()
+
+    def filter_queryset(self, queryset):
+        queryset = super().filter_queryset(queryset)
         if not hasattr(self, 'action'):
             return queryset
         if self.action == 'metadata':
@@ -103,26 +108,58 @@ class QuerySetMixin:
         queryset = self.setup_eager_loading(queryset)
         return queryset
 
-    # Todo: consider a custom pagination in the future
-    def setup_eager_loading(self, queryset):
-        if self.request.query_params.get('format') not in ['csv', 'xlsx']:
+    def setup_eager_loading(self, queryset, is_paginated=False):
+        is_export_request = self.request.query_params.get('format') in ['csv', 'xlsx']
+        no_request_page = self.request.query_params.get('limit') is None
+        # Usually skipped when not paginating, since it costs extra SQL queries;
+        # better to query once during pagination
+        if not is_export_request and not is_paginated and not no_request_page:
             return queryset
 
         serializer_class = self.get_serializer_class()
-        if not serializer_class or not hasattr(serializer_class, 'setup_eager_loading'):
+        if not serializer_class:
             return queryset
-        return serializer_class.setup_eager_loading(queryset)
+
+        if hasattr(serializer_class, 'setup_eager_loading'):
+            queryset = serializer_class.setup_eager_loading(queryset)
+
+        if hasattr(serializer_class, 'setup_eager_labels'):
+            queryset = serializer_class.setup_eager_labels(queryset)
+        return queryset
 
     def paginate_queryset(self, queryset):
         page = super().paginate_queryset(queryset)
+        model = getattr(queryset, 'model', None)
+        if not model or hasattr(queryset, 'custom'):
+            return page
+
         serializer_class = self.get_serializer_class()
-        if page and serializer_class and hasattr(serializer_class, 'setup_eager_loading'):
-            ids = [str(obj.id) for obj in page]
-            page = self.get_queryset().filter(id__in=ids)
-            page = serializer_class.setup_eager_loading(page)
+        if page and serializer_class:
+            # ids must be returned; they are used for ordering
+            queryset, ids = self._get_page_again(page, model)
+            page = self.setup_eager_loading(queryset, is_paginated=True)
+            page_mapper = {str(obj.id): obj for obj in page}
+            page = [page_mapper.get(_id) for _id in ids if _id in page_mapper]
         return page
 
+    def _get_page_again(self, page, model):
+        """
+        setup_eager_loading requires a queryset, so the page must be rebuilt as one
+        """
+        id_org_mapper = {str(obj.id): getattr(obj, 'org_id', None) for obj in page}
+        ids = list(id_org_mapper.keys())
+        org_ids = list(set(id_org_mapper.values()) - {None})
+
+        if not org_ids:
+            context = nullcontext()
+        elif len(org_ids) == 1:
+            context = tmp_to_org(org_ids[0])
+        else:
+            context = tmp_to_root_org()
+
+        with context:
+            page = model.objects.filter(id__in=ids)
+        return page, ids
+
 
 class ExtraFilterFieldsMixin:
     """
@@ -217,8 +254,8 @@ class OrderingFielderFieldsMixin:
 
 
 class CommonApiMixin(
-    SerializerMixin, ExtraFilterFieldsMixin, OrderingFielderFieldsMixin,
-    QuerySetMixin, RenderToJsonMixin, PaginatedResponseMixin
+    SerializerMixin, QuerySetMixin, ExtraFilterFieldsMixin,
+    OrderingFielderFieldsMixin, RenderToJsonMixin, PaginatedResponseMixin
 ):
     def is_swagger_request(self):
         return getattr(self, 'swagger_fake_view', False) or \
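The re-fetch in `paginate_queryset` loses the page's original ordering (a database `id__in` filter returns rows in arbitrary order), which is why `_get_page_again` must also return `ids`. The mapper-based reordering can be sketched in isolation, with plain dicts standing in for model instances:

```python
def reorder_page(refetched, ids):
    # After re-fetching with id__in, database order is arbitrary;
    # restore the original page order via an id -> object mapper.
    page_mapper = {str(obj['id']): obj for obj in refetched}
    return [page_mapper[_id] for _id in ids if _id in page_mapper]


ids = ['3', '1', '2']                            # order the paginator produced
refetched = [{'id': 1}, {'id': 2}, {'id': 3}]    # arbitrary database order
print(reorder_page(refetched, ids))  # [{'id': 3}, {'id': 1}, {'id': 2}]
```

Objects missing from the re-fetch (e.g. deleted between queries) are silently dropped by the `if _id in page_mapper` guard, matching the `.get()` usage in the diff.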
@@ -13,11 +13,13 @@ from rest_framework.utils import encoders, json
 
 from common.serializers import fields as common_fields
 from common.utils import get_logger
+from .mixins import LogMixin
 
 logger = get_logger(__file__)
 
 
-class BaseFileRenderer(BaseRenderer):
+class BaseFileRenderer(LogMixin, BaseRenderer):
     # Render-template flag; import, export, and update templates: ['import', 'update', 'export']
     template = 'export'
     serializer = None
@@ -256,6 +258,8 @@ class BaseFileRenderer(BaseRenderer):
             logger.debug(e, exc_info=True)
             value = 'Render error! ({})'.format(self.media_type).encode('utf-8')
             return value
+
+        self.record_logs(request, view, data)
         return value
 
     def compress_into_zip_file(self, value, request, response):
69
apps/common/drf/renders/mixins.py
Normal file
69
apps/common/drf/renders/mixins.py
Normal file
@ -0,0 +1,69 @@
|
||||
from django.utils.translation import gettext_noop
|
||||
|
||||
from audits.const import ActionChoices
|
||||
from audits.utils import record_operate_log_and_activity_log
|
||||
from common.utils import get_logger
|
||||
|
||||
|
||||
logger = get_logger(__file__)
|
||||
|
||||
|
||||
class LogMixin(object):
|
||||
@staticmethod
|
||||
def _clean_params(query_params):
|
||||
clean_params = {}
|
||||
ignore_params = ('format', 'order')
|
||||
for key, value in dict(query_params).items():
|
||||
if key in ignore_params:
|
||||
continue
|
||||
if isinstance(value, list):
|
||||
value = list(filter(None, value))
|
||||
if value:
|
||||
clean_params[key] = value
|
||||
return clean_params
|
||||
|
||||
@staticmethod
|
||||
def _get_model(view):
|
||||
model = getattr(view, 'model', None)
|
||||
if not model:
|
||||
serializer = view.get_serializer()
|
||||
if serializer:
|
||||
model = serializer.Meta.model
|
||||
return model
|
||||
|
||||
@staticmethod
|
||||
def _build_after(params, data):
|
||||
base = {
|
||||
gettext_noop('Resource count'): {'value': len(data)}
|
||||
}
|
||||
extra = {key: {'value': value} for key, value in params.items()}
|
||||
return {**extra, **base}
|
||||
|
||||
@staticmethod
|
||||
def get_resource_display(params):
|
||||
spm_filter = params.pop("spm", None)
|
||||
if not params and not spm_filter:
|
||||
display_message = gettext_noop("Export all")
|
||||
elif spm_filter:
|
||||
display_message = gettext_noop("Export only selected items")
|
||||
else:
|
||||
display_message = gettext_noop("Export filtered")
|
||||
return display_message
|
||||
|
||||
def record_logs(self, request, view, data):
|
||||
activity_ids, activity_detail = [], ''
|
||||
model = self._get_model(view)
|
||||
if not model:
|
||||
logger.warning('Model is not defined in view: %s' % view)
|
||||
return
|
||||
|
||||
params = self._clean_params(request.query_params)
|
||||
resource_display = self.get_resource_display(params)
|
||||
after = self._build_after(params, data)
|
||||
if hasattr(view, 'get_activity_detail_msg'):
|
||||
activity_detail = view.get_activity_detail_msg()
|
||||
activity_ids = [d['id'] for d in data if 'id' in d]
|
||||
record_operate_log_and_activity_log(
|
||||
activity_ids, ActionChoices.export, activity_detail,
|
||||
model, resource_display=resource_display, after=after
|
||||
)
|
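The `_clean_params` helper above drops ignored keys (`format`, `order`) and strips empty values out of list parameters before they are recorded in the operate log. Its filtering behaves like this standalone sketch:

```python
def clean_params(query_params):
    # Same filtering as LogMixin._clean_params: skip ignored keys,
    # drop empty strings from list values, and drop empty results entirely
    clean = {}
    ignore_params = ('format', 'order')
    for key, value in dict(query_params).items():
        if key in ignore_params:
            continue
        if isinstance(value, list):
            value = list(filter(None, value))
        if value:
            clean[key] = value
    return clean


raw = {'format': ['csv'], 'name': ['web', ''], 'order': ['-date'], 'ip': []}
print(clean_params(raw))  # {'name': ['web']}
```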
@@ -26,7 +26,7 @@ class ServicesUtil(object):
 
     def start_and_watch(self):
         logging.info(time.ctime())
-        logging.info(f'JumpServer version {__version__}, more see https://www.jumpserver.org')
+        logging.info(f'JumpServer version {__version__}, more see https://www.jumpserver.com')
         self.start()
         if self.run_daemon:
             self.show_status()
Some files were not shown because too many files have changed in this diff.