Mirror of https://github.com/jumpserver/jumpserver.git, synced 2025-12-15 08:32:48 +00:00

Compare commits — 25 commits:

- 6892523df7
- eb9f261459
- c4d99ed8e2
- 8d01c189f0
- a2701090de
- 4582aa0a09
- c7c379479f
- ce4479c23e
- eedc2f1b41
- 7ba24293d1
- f10114c9ed
- cf31cbfb07
- 0edad24d5d
- 1f1c1a9157
- 6c9d271ae1
- 6ff852e225
- baa75dc735
- 8a9f0436b8
- a9620a3cbe
- 769e7dc8a0
- 2a70449411
- 8df720f19e
- dabbb45f6e
- ce24c1c3fd
- 3c54c82ce9
**.github/ISSUE_TEMPLATE/----.md** (vendored) — 28 lines changed
```diff
@@ -1,35 +1,11 @@
---
name: Feature request
about: Propose ideas and suggestions for this project
title: "[Feature] Feature title"
title: "[Feature] "
labels: 类型:需求
assignees:
  - ibuler
  - baijiangjie
---

## Note
_Overly brief feature descriptions will not be considered. Please provide enough detail and information to support developing and implementing the feature._

## Feature name
[Enter the name or title of the feature here]

## Feature description
[Describe the feature in detail here, including what it does, its purpose, and the functionality required]

## User story (optional)
[If applicable, provide a user story to better explain the usage scenario and user expectations]

## Requirements
- [Requirement 1: describe a specific requirement of the feature, such as UI design or interaction logic]
- [Requirement 2: describe another specific requirement]
- [And so on, listing all related requirements]

## Example or prototype (optional)
[If available, provide an example or prototype to better illustrate how the feature should be implemented]

## Priority
[Describe the feature's priority, e.g. high, medium, or low, or use a number or other marker]

## Notes (optional)
[Add any other relevant information or notes here]
**Please describe your feature request or improvement suggestion.**
```
**.github/ISSUE_TEMPLATE/bug---.md** (vendored) — 45 lines changed
```diff
@@ -1,51 +1,22 @@
---
name: Bug report
about: Report a product defect to help us improve
title: "[Bug] Bug title"
title: "[Bug] "
labels: 类型:Bug
assignees:
  - baijiangjie
---

## Note
**JumpServer version (versions before v2.28 are no longer supported)** <br>
_Overly brief bug descriptions will not be considered. Please provide enough detail and information to support reproducing and fixing the bug._

## JumpServer version in use (required)
[Enter the JumpServer version number you are currently using]

## Edition in use (required)
- [ ] Community Edition
- [ ] Enterprise Edition
- [ ] Enterprise trial edition
**JumpServer version (versions before v2.28 are no longer supported)**


## Installation method (required)
- [ ] Online install (one-line command)
- [ ] Offline install (downloaded offline package)
- [ ] All-in-One
- [ ] 1Panel install
- [ ] Kubernetes install
- [ ] Source install
**Browser version**

## Bug description (detailed)
[Describe the bug in detail here, including its impact and the circumstances under which it occurs]

## Steps to reproduce
1. [Describe the first step to reproduce the bug]
2. [Describe the second step to reproduce the bug]
3. [And so on, listing every step needed to reproduce the bug]
**Bug description**

## Expected behavior
[Describe the expected system behavior or result when the bug occurs]

## Actual behavior
[Describe what actually happened and how the bug manifested]

## Environment
- Operating system: [e.g. Windows 10, macOS Big Sur]
- Browser/application version: [provide version information if applicable]
- Other relevant environment info: [provide any other relevant environment information here]

## Additional information (optional)
[Add any other relevant information here, such as screenshots or error messages]
**Bug reproduction steps (screenshots are even better)**
1.
2.
3.
```
**.github/ISSUE_TEMPLATE/question.md** (vendored) — 44 lines changed
```diff
@@ -1,50 +1,10 @@
---
name: Question
about: Ask questions about installing, deploying, using, or other aspects of this project
title: "[Question] Question title"
title: "[Question] "
labels: 类型:提问
assignees:
  - baijiangjie
---
## Note
**Please describe your question.** <br>
**JumpServer version (versions before v2.28 are no longer supported)** <br>
_Overly brief bug descriptions will not be considered. Please provide enough detail and information to support reproducing and fixing the bug._

## JumpServer version in use (required)
[Enter the JumpServer version number you are currently using]

## Edition in use (required)
- [ ] Community Edition
- [ ] Enterprise Edition
- [ ] Enterprise trial edition


## Installation method (required)
- [ ] Online install (one-line command)
- [ ] Offline install (downloaded offline package)
- [ ] All-in-One
- [ ] 1Panel install
- [ ] Kubernetes install
- [ ] Source install

## Question description (detailed)
[Describe the problem you encountered here]

## Background
- Operating system: [e.g. Windows 10, macOS Big Sur]
- Browser/application version: [provide version information if applicable]
- Other relevant environment info: [provide any other relevant environment information here]

## Specific question
[Describe your question in detail here, including any relevant details or error messages]

## Solutions already tried
[If you have already tried to solve the problem, list the solutions you have tried here]

## Expected result
[Describe the solution or result you expect]

## What you expect from us
[Describe the help or support you would like us to provide]

**Please describe your question.**
```
**.github/workflows/jms-build-test.yml** (vendored) — 45 lines changed
```diff
@@ -1,32 +1,26 @@
name: "Run Build Test"
on:
  push:
    paths:
      - 'Dockerfile'
      - 'Dockerfile-*'
      - 'pyproject.toml'
      - 'poetry.lock'
    branches:
      - pr@*
      - repr@*

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: docker/setup-qemu-action@v3
      - uses: docker/setup-buildx-action@v3

      - name: Check Dockerfile
        run: |
          test -f Dockerfile-ce || cp -f Dockerfile Dockerfile-ce
      - uses: docker/setup-qemu-action@v2

      - name: Build CE Image
        uses: docker/build-push-action@v5
      - uses: docker/setup-buildx-action@v2

      - uses: docker/build-push-action@v3
        with:
          context: .
          push: false
          file: Dockerfile-ce
          tags: jumpserver/core-ce:test
          platforms: linux/amd64
          file: Dockerfile-ce
          build-args: |
            APT_MIRROR=http://deb.debian.org
            PIP_MIRROR=https://pypi.org/simple
@@ -34,22 +28,9 @@ jobs:
          cache-from: type=gha
          cache-to: type=gha,mode=max

      - name: Prepare EE Image
        run: |
          sed -i 's@^FROM registry.fit2cloud.com@# FROM registry.fit2cloud.com@g' Dockerfile-ee
          sed -i 's@^COPY --from=build-xpack@# COPY --from=build-xpack@g' Dockerfile-ee

      - name: Build EE Image
        uses: docker/build-push-action@v5
      - uses: LouisBrunner/checks-action@v1.5.0
        if: always()
        with:
          context: .
          push: false
          file: Dockerfile-ee
          tags: jumpserver/core-ee:test
          platforms: linux/amd64
          build-args: |
            APT_MIRROR=http://deb.debian.org
            PIP_MIRROR=https://pypi.org/simple
            PIP_JMS_MIRROR=https://pypi.org/simple
          cache-from: type=gha
          cache-to: type=gha,mode=max
          token: ${{ secrets.GITHUB_TOKEN }}
          name: Check Build
          conclusion: ${{ job.status }}
```
**.gitignore** (vendored) — 1 line changed
```diff
@@ -43,4 +43,3 @@ releashe
data/*
test.py
.history/
.test/
```
```diff
@@ -87,7 +87,6 @@ ARG TOOLS=" \
    default-mysql-client \
    iputils-ping \
    locales \
    netcat-openbsd \
    nmap \
    openssh-client \
    patch \
@@ -112,17 +111,8 @@ RUN --mount=type=cache,target=/var/cache/apt,sharing=locked,id=core-apt \
    && sed -i "s@# export @export @g" ~/.bashrc \
    && sed -i "s@# alias @alias @g" ~/.bashrc

ARG RECEPTOR_VERSION=v1.4.5
RUN set -ex \
    && wget -O /opt/receptor.tar.gz https://github.com/ansible/receptor/releases/download/${RECEPTOR_VERSION}/receptor_${RECEPTOR_VERSION/v/}_linux_${TARGETARCH}.tar.gz \
    && tar -xf /opt/receptor.tar.gz -C /usr/local/bin/ \
    && chown root:root /usr/local/bin/receptor \
    && chmod 755 /usr/local/bin/receptor \
    && rm -f /opt/receptor.tar.gz

COPY --from=stage-2 /opt/py3 /opt/py3
COPY --from=stage-1 /opt/jumpserver/release/jumpserver /opt/jumpserver
COPY --from=stage-1 /opt/jumpserver/release/jumpserver/apps/libs/ansible/ansible.cfg /etc/ansible/

WORKDIR /opt/jumpserver
```
```diff
@@ -85,7 +85,7 @@ If you find a security problem, please contact us directly:

- 400-052-0755

### License & Copyright
Copyright (c) 2014-2024 FIT2CLOUD Tech, Inc., All rights reserved.
Copyright (c) 2014-2022 FIT2CLOUD Tech, Inc., All rights reserved.

Licensed under The GNU General Public License version 3 (GPLv3) (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at
```
```diff
@@ -20,6 +20,7 @@ class AccountBackupPlanViewSet(OrgBulkModelViewSet):
    model = AccountBackupAutomation
    filterset_fields = ('name',)
    search_fields = filterset_fields
    ordering = ('name',)
    serializer_class = serializers.AccountBackupSerializer
```
```diff
@@ -6,12 +6,9 @@ from rest_framework.response import Response

from accounts import serializers
from accounts.const import AutomationTypes
from accounts.filters import ChangeSecretRecordFilterSet
from accounts.models import ChangeSecretAutomation, ChangeSecretRecord
from accounts.tasks import execute_automation_record_task
from authentication.permissions import UserConfirmation, ConfirmType
from orgs.mixins.api import OrgBulkModelViewSet, OrgGenericViewSet
from rbac.permissions import RBACPermission
from .base import (
    AutomationAssetsListApi, AutomationRemoveAssetApi, AutomationAddAssetApi,
    AutomationNodeAddRemoveApi, AutomationExecutionViewSet
@@ -33,48 +30,29 @@ class ChangeSecretAutomationViewSet(OrgBulkModelViewSet):


class ChangeSecretRecordViewSet(mixins.ListModelMixin, OrgGenericViewSet):
    filterset_class = ChangeSecretRecordFilterSet
    serializer_class = serializers.ChangeSecretRecordSerializer
    filterset_fields = ('asset_id', 'execution_id')
    search_fields = ('asset__address',)
    tp = AutomationTypes.change_secret
    serializer_classes = {
        'default': serializers.ChangeSecretRecordSerializer,
        'secret': serializers.ChangeSecretRecordViewSecretSerializer,
    }
    rbac_perms = {
        'execute': 'accounts.add_changesecretexecution',
        'secret': 'accounts.view_changesecretrecord',
    }

    def get_permissions(self):
        if self.action == 'secret':
            self.permission_classes = [
                RBACPermission,
                UserConfirmation.require(ConfirmType.MFA)
            ]
        return super().get_permissions()

    def get_queryset(self):
        return ChangeSecretRecord.objects.all()

    @action(methods=['post'], detail=False, url_path='execute')
    def execute(self, request, *args, **kwargs):
        record_ids = request.data.get('record_ids')
        records = self.get_queryset().filter(id__in=record_ids)
        execution_count = records.values_list('execution_id', flat=True).distinct().count()
        if execution_count != 1:
        record_id = request.data.get('record_id')
        record = self.get_queryset().filter(pk=record_id)
        if not record:
            return Response(
                {'detail': 'Only one execution is allowed to execute'},
                status=status.HTTP_400_BAD_REQUEST
                {'detail': 'record not found'},
                status=status.HTTP_404_NOT_FOUND
            )
        task = execute_automation_record_task.delay(record_ids, self.tp)
        task = execute_automation_record_task.delay(record_id, self.tp)
        return Response({'task': task.id}, status=status.HTTP_200_OK)

    @action(methods=['get'], detail=True, url_path='secret')
    def secret(self, request, *args, **kwargs):
        instance = self.get_object()
        serializer = self.get_serializer(instance)
        return Response(serializer.data)


class ChangSecretExecutionViewSet(AutomationExecutionViewSet):
    rbac_perms = (
```
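The `execute` action above changed from validating a batch of `record_ids` against a single shared execution to looking up one `record_id`. The old single-execution check can be sketched as a standalone function; the names here (records as `(record_id, execution_id)` pairs) are illustrative stand-ins, not the actual JumpServer models:

```python
# Hypothetical sketch of the single-execution validation the older
# `execute` action performed before dispatching the Celery task.

def validate_single_execution(records):
    """Return the shared execution_id, or None if the records span
    zero or several distinct executions."""
    execution_ids = {execution_id for _, execution_id in records}
    if len(execution_ids) != 1:
        return None
    return execution_ids.pop()
```

In the view this corresponded to a `distinct().count()` over `execution_id` and a 400 response when the count was not exactly 1.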
```diff
@@ -6,7 +6,7 @@ from django.conf import settings
from rest_framework import serializers
from xlsxwriter import Workbook

from accounts.const import AccountBackupType
from accounts.const.automation import AccountBackupType
from accounts.models.automations.backup_account import AccountBackupAutomation
from accounts.notifications import AccountBackupExecutionTaskMsg, AccountBackupByObjStorageExecutionTaskMsg
from accounts.serializers import AccountSecretSerializer
```
```diff
@@ -18,8 +18,6 @@
    become_user: "{{ custom_become_user | default('') }}"
    become_password: "{{ custom_become_password | default('') }}"
    become_private_key_path: "{{ custom_become_private_key_path | default(None) }}"
    old_ssh_version: "{{ jms_asset.old_ssh_version | default(False) }}"
    gateway_args: "{{ jms_asset.ansible_ssh_common_args | default(None) }}"
  register: ping_info
  delegate_to: localhost

@@ -56,6 +54,4 @@
    become_user: "{{ account.become.ansible_user | default('') }}"
    become_password: "{{ account.become.ansible_password | default('') }}"
    become_private_key_path: "{{ account.become.ansible_ssh_private_key_file | default(None) }}"
    old_ssh_version: "{{ jms_asset.old_ssh_version | default(False) }}"
    gateway_args: "{{ jms_asset.ansible_ssh_common_args | default(None) }}"
  delegate_to: localhost
```
```diff
@@ -85,7 +85,6 @@
    become_user: "{{ account.become.ansible_user | default('') }}"
    become_password: "{{ account.become.ansible_password | default('') }}"
    become_private_key_path: "{{ account.become.ansible_ssh_private_key_file | default(None) }}"
    old_ssh_version: "{{ jms_asset.old_ssh_version | default(False) }}"
  when: account.secret_type == "password"
  delegate_to: localhost

@@ -96,6 +95,5 @@
    login_user: "{{ account.username }}"
    login_private_key_path: "{{ account.private_key_path }}"
    gateway_args: "{{ jms_asset.ansible_ssh_common_args | default('') }}"
    old_ssh_version: "{{ jms_asset.old_ssh_version | default(False) }}"
  when: account.secret_type == "ssh_key"
  delegate_to: localhost

@@ -85,7 +85,6 @@
    become_user: "{{ account.become.ansible_user | default('') }}"
    become_password: "{{ account.become.ansible_password | default('') }}"
    become_private_key_path: "{{ account.become.ansible_ssh_private_key_file | default(None) }}"
    old_ssh_version: "{{ jms_asset.old_ssh_version | default(False) }}"
  when: account.secret_type == "password"
  delegate_to: localhost

@@ -96,6 +95,5 @@
    login_user: "{{ account.username }}"
    login_private_key_path: "{{ account.private_key_path }}"
    gateway_args: "{{ jms_asset.ansible_ssh_common_args | default('') }}"
    old_ssh_version: "{{ jms_asset.old_ssh_version | default(False) }}"
  when: account.secret_type == "ssh_key"
  delegate_to: localhost
```
```diff
@@ -7,9 +7,9 @@ from django.utils import timezone
from django.utils.translation import gettext_lazy as _
from xlsxwriter import Workbook

from accounts.const import AutomationTypes, SecretType, SSHKeyStrategy, SecretStrategy, ChangeSecretRecordStatusChoice
from accounts.const import AutomationTypes, SecretType, SSHKeyStrategy, SecretStrategy
from accounts.models import ChangeSecretRecord
from accounts.notifications import ChangeSecretExecutionTaskMsg, ChangeSecretFailedMsg
from accounts.notifications import ChangeSecretExecutionTaskMsg
from accounts.serializers import ChangeSecretRecordBackUpSerializer
from assets.const import HostTypes
from common.utils import get_logger
@@ -27,7 +27,7 @@ class ChangeSecretManager(AccountBasePlaybookManager):

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.record_map = self.execution.snapshot.get('record_map', {})
        self.record_id = self.execution.snapshot.get('record_id')
        self.secret_type = self.execution.snapshot.get('secret_type')
        self.secret_strategy = self.execution.snapshot.get(
            'secret_strategy', SecretStrategy.custom
@@ -123,20 +123,14 @@ class ChangeSecretManager(AccountBasePlaybookManager):
            print(f'new_secret is None, account: {account}')
            continue

        asset_account_id = f'{asset.id}-{account.id}'
        if asset_account_id not in self.record_map:
        if self.record_id is None:
            recorder = ChangeSecretRecord(
                asset=asset, account=account, execution=self.execution,
                old_secret=account.secret, new_secret=new_secret,
            )
            records.append(recorder)
        else:
            record_id = self.record_map[asset_account_id]
            try:
                recorder = ChangeSecretRecord.objects.get(id=record_id)
            except ChangeSecretRecord.DoesNotExist:
                print(f"Record {record_id} not found")
                continue
            recorder = ChangeSecretRecord.objects.get(id=self.record_id)

        self.name_recorder_mapper[h['name']] = recorder

@@ -164,43 +158,25 @@ class ChangeSecretManager(AccountBasePlaybookManager):
        recorder = self.name_recorder_mapper.get(host)
        if not recorder:
            return
        recorder.status = ChangeSecretRecordStatusChoice.success.value
        recorder.status = 'success'
        recorder.date_finished = timezone.now()

        recorder.save()
        account = recorder.account
        if not account:
            print("Account not found, deleted ?")
            return
        account.secret = recorder.new_secret
        account.date_updated = timezone.now()

        max_retries = 3
        retry_count = 0

        while retry_count < max_retries:
            try:
                recorder.save()
                account.save(update_fields=['secret', 'version', 'date_updated'])
                break
            except Exception as e:
                retry_count += 1
                if retry_count == max_retries:
                    self.on_host_error(host, str(e), result)
                else:
                    print(f'retry {retry_count} times for {host} recorder save error: {e}')
                    time.sleep(1)
        account.save(update_fields=['secret', 'date_updated'])

    def on_host_error(self, host, error, result):
        recorder = self.name_recorder_mapper.get(host)
        if not recorder:
            return
        recorder.status = ChangeSecretRecordStatusChoice.failed.value
        recorder.status = 'failed'
        recorder.date_finished = timezone.now()
        recorder.error = error
        try:
            recorder.save()
        except Exception as e:
            print(f"\033[31m Save {host} recorder error: {e} \033[0m\n")
        recorder.save()

    def on_runner_failed(self, runner, e):
        logger.error("Account error: ", e)
@@ -216,7 +192,7 @@ class ChangeSecretManager(AccountBasePlaybookManager):
    def get_summary(recorders):
        total, succeed, failed = 0, 0, 0
        for recorder in recorders:
            if recorder.status == ChangeSecretRecordStatusChoice.success.value:
            if recorder.status == 'success':
                succeed += 1
            else:
                failed += 1
@@ -233,35 +209,18 @@ class ChangeSecretManager(AccountBasePlaybookManager):
        summary = self.get_summary(recorders)
        print(summary, end='')

        if self.record_map:
        if self.record_id:
            return

        failed_recorders = [
            r for r in recorders
            if r.status == ChangeSecretRecordStatusChoice.failed.value
        ]
        self.send_recorder_mail(recorders, summary)

    def send_recorder_mail(self, recorders, summary):
        recipients = self.execution.recipients
        if not recorders or not recipients:
            return

        recipients = User.objects.filter(id__in=list(recipients.keys()))
        if not recipients:
            return

        if failed_recorders:
            name = self.execution.snapshot.get('name')
            execution_id = str(self.execution.id)
            _ids = [r.id for r in failed_recorders]
            asset_account_errors = ChangeSecretRecord.objects.filter(
                id__in=_ids).values_list('asset__name', 'account__username', 'error')

            for user in recipients:
                ChangeSecretFailedMsg(name, execution_id, user, asset_account_errors).publish()

        if not recorders:
            return

        self.send_recorder_mail(recipients, recorders, summary)

    def send_recorder_mail(self, recipients, recorders, summary):
        name = self.execution.snapshot['name']
        path = os.path.join(os.path.dirname(settings.BASE_DIR), 'tmp')
        filename = os.path.join(path, f'{name}-{local_now_filename()}-{time.time()}.xlsx')
```
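The bounded-retry save loop in the `on_host_success` hunk above follows a common pattern: attempt the write, retry a fixed number of times on failure, and surface the last error only after all retries are exhausted. A minimal standalone sketch, with `save` standing in for the ORM calls (an assumption for illustration):

```python
import time

def save_with_retries(save, max_retries=3, delay=0):
    """Call save(); retry up to max_retries attempts in total.

    Returns the 1-based attempt number that succeeded; re-raises the
    last exception if every attempt fails.
    """
    for attempt in range(1, max_retries + 1):
        try:
            save()
            return attempt
        except Exception:
            if attempt == max_retries:
                raise  # give up after the final attempt
            time.sleep(delay)  # back off briefly before retrying
```

The manager code additionally routes the final failure through `on_host_error` instead of re-raising, so the host is recorded as failed rather than crashing the run.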
```diff
@@ -51,22 +51,14 @@ class GatherAccountsManager(AccountBasePlaybookManager):
        data = self.generate_data(asset, result)
        self.asset_account_info[asset] = data

    @staticmethod
    def get_nested_info(data, *keys):
        for key in keys:
            data = data.get(key, {})
            if not data:
                break
        return data

    def on_host_success(self, host, result):
        info = self.get_nested_info(result, 'debug', 'res', 'info')
        info = result.get('debug', {}).get('res', {}).get('info', {})
        asset = self.host_asset_mapper.get(host)
        if asset and info:
            result = self.filter_success_result(asset.type, info)
            self.collect_asset_account_info(asset, result)
        else:
            print(f'\033[31m Not found {host} info \033[0m\n')
            logger.error(f'Not found {host} info')

    def update_or_create_accounts(self):
        for asset, data in self.asset_account_info.items():
```
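The `get_nested_info` helper in the hunk above replaces a chain of `result.get('debug', {}).get('res', {}).get('info', {})` calls; it is self-contained, so here is the same logic lifted out of the class for illustration:

```python
def get_nested_info(data, *keys):
    """Walk nested dicts key by key, stopping early when a level is
    missing or empty; returns {} (or the last falsy value) in that case."""
    for key in keys:
        data = data.get(key, {})
        if not data:
            break
    return data
```

Compared with chained `.get()` calls, the loop short-circuits on the first empty level and keeps the key path in one place.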
```diff
@@ -85,7 +85,6 @@
    become_user: "{{ account.become.ansible_user | default('') }}"
    become_password: "{{ account.become.ansible_password | default('') }}"
    become_private_key_path: "{{ account.become.ansible_ssh_private_key_file | default(None) }}"
    old_ssh_version: "{{ jms_asset.old_ssh_version | default(False) }}"
  when: account.secret_type == "password"
  delegate_to: localhost

@@ -96,7 +95,6 @@
    login_user: "{{ account.username }}"
    login_private_key_path: "{{ account.private_key_path }}"
    gateway_args: "{{ jms_asset.ansible_ssh_common_args | default('') }}"
    old_ssh_version: "{{ jms_asset.old_ssh_version | default(False) }}"
  when: account.secret_type == "ssh_key"
  delegate_to: localhost

@@ -85,7 +85,6 @@
    become_user: "{{ account.become.ansible_user | default('') }}"
    become_password: "{{ account.become.ansible_password | default('') }}"
    become_private_key_path: "{{ account.become.ansible_ssh_private_key_file | default(None) }}"
    old_ssh_version: "{{ jms_asset.old_ssh_version | default(False) }}"
  when: account.secret_type == "password"
  delegate_to: localhost

@@ -96,7 +95,6 @@
    login_user: "{{ account.username }}"
    login_private_key_path: "{{ account.private_key_path }}"
    gateway_args: "{{ jms_asset.ansible_ssh_common_args | default('') }}"
    old_ssh_version: "{{ jms_asset.old_ssh_version | default(False) }}"
  when: account.secret_type == "ssh_key"
  delegate_to: localhost
```
```diff
@@ -60,11 +60,8 @@ class RemoveAccountManager(AccountBasePlaybookManager):
        if not tuple_asset_gather_account:
            return
        asset, gather_account = tuple_asset_gather_account
        try:
            Account.objects.filter(
                asset_id=asset.id,
                username=gather_account.username
            ).delete()
            gather_account.delete()
        except Exception as e:
            print(f'\033[31m Delete account {gather_account.username} failed: {e} \033[0m\n')
        Account.objects.filter(
            asset_id=asset.id,
            username=gather_account.username
        ).delete()
        gather_account.delete()
```
```diff
@@ -3,7 +3,6 @@
  vars:
    ansible_shell_type: sh
    ansible_connection: local
    ansible_python_interpreter: /opt/py3/bin/python

  tasks:
    - name: Verify account (pyfreerdp)

@@ -19,5 +19,3 @@
    become_user: "{{ account.become.ansible_user | default('') }}"
    become_password: "{{ account.become.ansible_password | default('') }}"
    become_private_key_path: "{{ account.become.ansible_ssh_private_key_file | default(None) }}"
    old_ssh_version: "{{ jms_asset.old_ssh_version | default(False) }}"
    gateway_args: "{{ jms_asset.ansible_ssh_common_args | default(None) }}"
```
```diff
@@ -76,14 +76,8 @@ class VerifyAccountManager(AccountBasePlaybookManager):

    def on_host_success(self, host, result):
        account = self.host_account_mapper.get(host)
        try:
            account.set_connectivity(Connectivity.OK)
        except Exception as e:
            print(f'\033[31m Update account {account.name} connectivity failed: {e} \033[0m\n')
        account.set_connectivity(Connectivity.OK)

    def on_host_error(self, host, error, result):
        account = self.host_account_mapper.get(host)
        try:
            account.set_connectivity(Connectivity.ERR)
        except Exception as e:
            print(f'\033[31m Update account {account.name} connectivity failed: {e} \033[0m\n')
        account.set_connectivity(Connectivity.ERR)
```
```diff
@@ -15,7 +15,6 @@ class AliasAccount(TextChoices):
    INPUT = '@INPUT', _('Manual input')
    USER = '@USER', _('Dynamic user')
    ANON = '@ANON', _('Anonymous account')
    SPEC = '@SPEC', _('Specified account')

    @classmethod
    def virtual_choices(cls):
```
```diff
@@ -16,7 +16,7 @@ DEFAULT_PASSWORD_RULES = {
__all__ = [
    'AutomationTypes', 'SecretStrategy', 'SSHKeyStrategy', 'Connectivity',
    'DEFAULT_PASSWORD_LENGTH', 'DEFAULT_PASSWORD_RULES', 'TriggerChoice',
    'PushAccountActionChoice', 'AccountBackupType', 'ChangeSecretRecordStatusChoice',
    'PushAccountActionChoice', 'AccountBackupType'
]

@@ -103,9 +103,3 @@ class AccountBackupType(models.TextChoices):
    email = 'email', _('Email')
    # Currently only the SFTP method is supported
    object_storage = 'object_storage', _('SFTP')


class ChangeSecretRecordStatusChoice(models.TextChoices):
    failed = 'failed', _('Failed')
    success = 'success', _('Success')
    pending = 'pending', _('Pending')
```
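`ChangeSecretRecordStatusChoice` above is a Django `models.TextChoices`; the manager code elsewhere in this compare replaces bare string literals like `'success'` with `ChangeSecretRecordStatusChoice.success.value`. A plain-stdlib approximation of the same pattern (not the real class, which requires Django) shows why the enum is preferable:

```python
from enum import Enum

# Stdlib stand-in for Django's models.TextChoices; illustrative only.
class ChangeSecretRecordStatus(str, Enum):
    FAILED = 'failed'
    SUCCESS = 'success'
    PENDING = 'pending'

# A typo in a bare literal ('sucess') fails silently at comparison time;
# a typo in an attribute name (ChangeSecretRecordStatus.SUCESS) raises
# immediately, and the valid states live in exactly one place.
record_status = 'success'
assert record_status == ChangeSecretRecordStatus.SUCCESS.value
```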
```diff
@@ -5,7 +5,7 @@ from django_filters import rest_framework as drf_filters

from assets.models import Node
from common.drf.filters import BaseFilterSet
from .models import Account, GatheredAccount, ChangeSecretRecord
from .models import Account, GatheredAccount


class AccountFilterSet(BaseFilterSet):
@@ -61,13 +61,3 @@ class GatheredAccountFilterSet(BaseFilterSet):
    class Meta:
        model = GatheredAccount
        fields = ['id', 'username']


class ChangeSecretRecordFilterSet(BaseFilterSet):
    asset_name = drf_filters.CharFilter(field_name='asset__name', lookup_expr='icontains')
    account_username = drf_filters.CharFilter(field_name='account__username', lookup_expr='icontains')
    execution_id = drf_filters.CharFilter(field_name='execution_id', lookup_expr='exact')

    class Meta:
        model = ChangeSecretRecord
        fields = ['id', 'status', 'asset_id', 'execution']
```
```diff
@@ -1,8 +1,7 @@
# Generated by Django 4.1.10 on 2023-08-01 09:12

import uuid

from django.db import migrations, models
import uuid


class Migration(migrations.Migration):
@@ -21,7 +20,7 @@ class Migration(migrations.Migration):
        ('date_updated', models.DateTimeField(auto_now=True, verbose_name='Date updated')),
        ('id', models.UUIDField(default=uuid.uuid4, primary_key=True, serialize=False)),
        ('org_id', models.CharField(blank=True, db_index=True, default='', max_length=36, verbose_name='Organization')),
        ('alias', models.CharField(choices=[('@INPUT', 'Manual input'), ('@USER', 'Dynamic user'), ('@ANON', 'Anonymous account'), ('@SPEC', 'Specified account')], max_length=128, verbose_name='Alias')),
        ('alias', models.CharField(choices=[('@INPUT', 'Manual input'), ('@USER', 'Dynamic user'), ('@ANON', 'Anonymous account')], max_length=128, verbose_name='Alias')),
        ('secret_from_login', models.BooleanField(default=None, null=True, verbose_name='Secret from login')),
    ],
    options={
```
```diff
@@ -8,7 +8,7 @@ from django.db import models
from django.db.models import F
from django.utils.translation import gettext_lazy as _

from accounts.const import AccountBackupType
from accounts.const.automation import AccountBackupType
from common.const.choices import Trigger
from common.db import fields
from common.db.encoder import ModelJSONFieldEncoder

@@ -2,7 +2,7 @@ from django.db import models
from django.utils.translation import gettext_lazy as _

from accounts.const import (
    AutomationTypes, ChangeSecretRecordStatusChoice
    AutomationTypes
)
from common.db import fields
from common.db.models import JMSBaseModel
@@ -40,10 +40,7 @@ class ChangeSecretRecord(JMSBaseModel):
    new_secret = fields.EncryptTextField(blank=True, null=True, verbose_name=_('New secret'))
    date_started = models.DateTimeField(blank=True, null=True, verbose_name=_('Date started'))
    date_finished = models.DateTimeField(blank=True, null=True, verbose_name=_('Date finished'))
    status = models.CharField(
        max_length=16, verbose_name=_('Status'),
        default=ChangeSecretRecordStatusChoice.pending.value
    )
    status = models.CharField(max_length=16, default='pending', verbose_name=_('Status'))
    error = models.TextField(blank=True, null=True, verbose_name=_('Error'))

    class Meta:
```
```diff
@@ -137,13 +137,16 @@ class BaseAccount(VaultModelMixin, JMSOrgBaseModel):
        else:
            return None

    def get_private_key_path(self, path):
    @property
    def private_key_path(self):
        if self.secret_type != SecretType.SSH_KEY \
                or not self.secret \
                or not self.private_key:
            return None
        project_dir = settings.PROJECT_DIR
        tmp_dir = os.path.join(project_dir, 'tmp')
        key_name = '.' + md5(self.private_key.encode('utf-8')).hexdigest()
        key_path = os.path.join(path, key_name)
        key_path = os.path.join(tmp_dir, key_name)
        if not os.path.exists(key_path):
            # https://github.com/ansible/ansible-runner/issues/544
            # ssh requires OpenSSH format keys to have a full ending newline.
@@ -155,12 +158,6 @@ class BaseAccount(VaultModelMixin, JMSOrgBaseModel):
        os.chmod(key_path, 0o400)
        return key_path

    @property
    def private_key_path(self):
        project_dir = settings.PROJECT_DIR
        tmp_dir = os.path.join(project_dir, 'tmp')
        return self.get_private_key_path(tmp_dir)

    def get_private_key(self):
        if not self.private_key:
            return None
```
```diff
@@ -1,7 +1,6 @@
from django.template.loader import render_to_string
from django.utils.translation import gettext_lazy as _

from accounts.models import ChangeSecretRecord
from common.tasks import send_mail_attachment_async, upload_backup_to_obj_storage
from notifications.notifications import UserMessage
from terminal.models.component.storage import ReplayStorage
@@ -99,35 +98,3 @@ class GatherAccountChangeMsg(UserMessage):
    def gen_test_msg(cls):
        user = User.objects.first()
        return cls(user, {})


class ChangeSecretFailedMsg(UserMessage):
    subject = _('Change secret or push account failed information')

    def __init__(self, name, execution_id, user, asset_account_errors: list):
        self.name = name
        self.execution_id = execution_id
        self.asset_account_errors = asset_account_errors
        super().__init__(user)

    def get_html_msg(self) -> dict:
        context = {
            'name': self.name,
            'recipient': self.user,
            'execution_id': self.execution_id,
            'asset_account_errors': self.asset_account_errors
        }
        message = render_to_string('accounts/change_secret_failed_info.html', context)

        return {
            'subject': str(self.subject),
            'message': message
        }

    @classmethod
    def gen_test_msg(cls):
        name = 'test'
        user = User.objects.first()
        record = ChangeSecretRecord.objects.first()
        execution_id = str(record.execution_id)
        return cls(name, execution_id, user, [])
```
@@ -431,11 +431,8 @@ class AssetAccountBulkSerializer(

class AccountSecretSerializer(SecretReadableMixin, AccountSerializer):
    class Meta(AccountSerializer.Meta):
        fields = AccountSerializer.Meta.fields + ['spec_info']
        extra_kwargs = {
            **AccountSerializer.Meta.extra_kwargs,
            'secret': {'write_only': False},
            'spec_info': {'label': _('Spec info')},
        }



@@ -67,14 +67,15 @@ class BaseAccountSerializer(AuthValidateMixin, ResourceLabelsMixin, BulkOrgResou
        fields_mini = ['id', 'name', 'username']
        fields_small = fields_mini + [
            'secret_type', 'secret', 'passphrase',
            'privileged', 'is_active',
            'privileged', 'is_active', 'spec_info',
        ]
        fields_other = ['created_by', 'date_created', 'date_updated', 'comment']
        fields = fields_small + fields_other + ['labels']
        read_only_fields = [
            'date_verified', 'created_by', 'date_created',
            'spec_info', 'date_verified', 'created_by', 'date_created',
        ]
        extra_kwargs = {
            'spec_info': {'label': _('Spec info')},
            'username': {'help_text': _(
                "Tip: If no username is required for authentication, fill in `null`, "
                "If AD account, like `username@domain`"

@@ -35,7 +35,6 @@ class AccountTemplateSerializer(BaseAccountSerializer):
            'su_from'
        ]
        extra_kwargs = {
            **BaseAccountSerializer.Meta.extra_kwargs,
            'secret_strategy': {'help_text': _('Secret generation strategy for account creation')},
            'auto_push': {'help_text': _('Whether to automatically push the account to the asset')},
            'platforms': {
@@ -65,9 +64,6 @@ class AccountTemplateSerializer(BaseAccountSerializer):

class AccountTemplateSecretSerializer(SecretReadableMixin, AccountTemplateSerializer):
    class Meta(AccountTemplateSerializer.Meta):
        fields = AccountTemplateSerializer.Meta.fields + ['spec_info']
        extra_kwargs = {
            **AccountTemplateSerializer.Meta.extra_kwargs,
            'secret': {'write_only': False},
            'spec_info': {'label': _('Spec info')},
        }

@@ -21,7 +21,6 @@ __all__ = [
class BaseAutomationSerializer(PeriodTaskSerializerMixin, BulkOrgResourceModelSerializer):
    assets = ObjectRelatedField(many=True, required=False, queryset=Asset.objects, label=_('Assets'))
    nodes = ObjectRelatedField(many=True, required=False, queryset=Node.objects, label=_('Nodes'))
    is_periodic = serializers.BooleanField(default=False, required=False, label=_("Periodic perform"))

    class Meta:
        read_only_fields = [

@@ -4,8 +4,7 @@ from django.utils.translation import gettext_lazy as _
from rest_framework import serializers

from accounts.const import (
    AutomationTypes, SecretType, SecretStrategy,
    SSHKeyStrategy, ChangeSecretRecordStatusChoice
    AutomationTypes, SecretType, SecretStrategy, SSHKeyStrategy
)
from accounts.models import (
    Account, ChangeSecretAutomation,
@@ -22,7 +21,6 @@ logger = get_logger(__file__)
__all__ = [
    'ChangeSecretAutomationSerializer',
    'ChangeSecretRecordSerializer',
    'ChangeSecretRecordViewSecretSerializer',
    'ChangeSecretRecordBackUpSerializer',
    'ChangeSecretUpdateAssetSerializer',
    'ChangeSecretUpdateNodeSerializer',
@@ -106,10 +104,7 @@ class ChangeSecretAutomationSerializer(AuthValidateMixin, BaseAutomationSerializ
class ChangeSecretRecordSerializer(serializers.ModelSerializer):
    is_success = serializers.SerializerMethodField(label=_('Is success'))
    asset = ObjectRelatedField(queryset=Asset.objects, label=_('Asset'))
    account = ObjectRelatedField(
        queryset=Account.objects, label=_('Account'),
        attrs=("id", "name", "username")
    )
    account = ObjectRelatedField(queryset=Account.objects, label=_('Account'))
    execution = ObjectRelatedField(
        queryset=AutomationExecution.objects, label=_('Automation task execution')
    )
@@ -124,16 +119,7 @@ class ChangeSecretRecordSerializer(serializers.ModelSerializer):

    @staticmethod
    def get_is_success(obj):
        return obj.status == ChangeSecretRecordStatusChoice.success.value


class ChangeSecretRecordViewSecretSerializer(serializers.ModelSerializer):
    class Meta:
        model = ChangeSecretRecord
        fields = [
            'id', 'old_secret', 'new_secret',
        ]
        read_only_fields = fields
        return obj.status == 'success'


class ChangeSecretRecordBackUpSerializer(serializers.ModelSerializer):
@@ -159,7 +145,7 @@ class ChangeSecretRecordBackUpSerializer(serializers.ModelSerializer):

    @staticmethod
    def get_is_success(obj):
        if obj.status == ChangeSecretRecordStatusChoice.success.value:
        if obj.status == 'success':
            return _("Success")
        return _("Failed")


@@ -36,14 +36,14 @@ def execute_account_automation_task(pid, trigger, tp):
        instance.execute(trigger)


def record_task_activity_callback(self, record_ids, *args, **kwargs):
def record_task_activity_callback(self, record_id, *args, **kwargs):
    from accounts.models import ChangeSecretRecord
    with tmp_to_root_org():
        records = ChangeSecretRecord.objects.filter(id__in=record_ids)
        if not records:
        record = get_object_or_none(ChangeSecretRecord, id=record_id)
        if not record:
            return
    resource_ids = [str(i.id) for i in records]
    org_id = records[0].execution.org_id
    resource_ids = [record.id]
    org_id = record.execution.org_id
    return resource_ids, org_id


@@ -51,26 +51,22 @@ def record_task_activity_callback(self, record_ids, *args, **kwargs):
    queue='ansible', verbose_name=_('Execute automation record'),
    activity_callback=record_task_activity_callback
)
def execute_automation_record_task(record_ids, tp):
def execute_automation_record_task(record_id, tp):
    from accounts.models import ChangeSecretRecord
    task_name = gettext_noop('Execute automation record')

    with tmp_to_root_org():
        records = ChangeSecretRecord.objects.filter(id__in=record_ids)

        if not records:
            logger.error('No automation record found: {}'.format(record_ids))
        instance = get_object_or_none(ChangeSecretRecord, pk=record_id)
        if not instance:
            logger.error("No automation record found: {}".format(record_id))
            return

        record = records[0]
        record_map = {f'{record.asset_id}-{record.account_id}': str(record.id) for record in records}
    task_name = gettext_noop('Execute automation record')
    task_snapshot = {
        'secret': instance.new_secret,
        'secret_type': instance.execution.snapshot.get('secret_type'),
        'accounts': [str(instance.account_id)],
        'assets': [str(instance.asset_id)],
        'params': {},
        'record_map': record_map,
        'secret': record.new_secret,
        'secret_type': record.execution.snapshot.get('secret_type'),
        'assets': [str(instance.asset_id) for instance in records],
        'accounts': [str(instance.account_id) for instance in records],
        'record_id': record_id,
    }
    with tmp_to_org(record.execution.org_id):
    with tmp_to_org(instance.execution.org_id):
        quickstart_automation_by_snapshot(task_name, tp, task_snapshot)

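The batched variant in the hunk above builds one task snapshot covering several change-secret records, with a `record_map` that lets the runner map each host result back to its record. A hedged sketch of that batching idea, with a `namedtuple` standing in for `ChangeSecretRecord` (the helper name is hypothetical):

```python
from collections import namedtuple

# Hypothetical stand-in for ChangeSecretRecord: only the fields the
# snapshot needs.
Record = namedtuple('Record', 'id asset_id account_id new_secret')

def build_task_snapshot(records):
    first = records[0]
    # Key each record by "<asset_id>-<account_id>" so a host result can
    # be traced back to the record it belongs to.
    record_map = {f'{r.asset_id}-{r.account_id}': str(r.id) for r in records}
    return {
        'secret': first.new_secret,
        'assets': [str(r.asset_id) for r in records],
        'accounts': [str(r.account_id) for r in records],
        'params': {},
        'record_map': record_map,
    }

snapshot = build_task_snapshot([
    Record('r1', 'asset1', 'acc1', 's3cret'),
    Record('r2', 'asset2', 'acc2', 's3cret'),
])
```

One snapshot then drives a single automation run for all records instead of one run per record.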
@@ -55,7 +55,7 @@ def clean_historical_accounts():
    history_model = Account.history.model
    history_id_mapper = defaultdict(list)

    ids = history_model.objects.values('id').annotate(count=Count('id')) \
    ids = history_model.objects.values('id').annotate(count=Count('id', distinct=True)) \
        .filter(count__gte=limit).values_list('id', flat=True)

    if not ids:

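The `annotate` hunk above selects the account ids whose history rows reach a retention limit. The same group-and-filter logic, expressed without Django for illustration (`find_over_limit_ids` is a hypothetical helper, not JumpServer code):

```python
from collections import Counter

def find_over_limit_ids(history_rows, limit):
    # Equivalent of values('id').annotate(count=Count('id'))
    # .filter(count__gte=limit): ids with at least `limit` history rows.
    counts = Counter(row['id'] for row in history_rows)
    return [i for i, c in counts.items() if c >= limit]

rows = [{'id': 'a'}, {'id': 'a'}, {'id': 'a'}, {'id': 'b'}]
ids = find_over_limit_ids(rows, limit=3)  # → ['a']
```

Only ids that appear `limit` or more times are returned; the cleanup then prunes their oldest entries.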
@@ -29,8 +29,7 @@ def template_sync_related_accounts(template_id, user_id=None):
    name = template.name
    username = template.username
    secret_type = template.secret_type
    print(
        f'\033[32m>>> 开始同步模板名称、用户名、密钥类型到相关联的账号 ({datetime.now().strftime("%Y-%m-%d %H:%M:%S")})')
    print(f'\033[32m>>> 开始同步模版名称、用户名、密钥类型到相关联的账号 ({datetime.now().strftime("%Y-%m-%d %H:%M:%S")})')
    with tmp_to_org(org_id):
        for account in accounts:
            account.name = name

@@ -1,10 +1,10 @@
{% load i18n %}

<h3>{% trans 'Gather account change information' %}</h3>

<table style="width: 100%; border-collapse: collapse; max-width: 100%; text-align: left; margin-top: 20px;">
    <caption></caption>
    <tr style="background-color: #f2f2f2;">
        <th style="border: 1px solid #ddd; padding: 10px;">{% trans 'Asset' %}</th>
        <th style="border: 1px solid #ddd; padding: 10px; font-weight: bold;">{% trans 'Asset' %}</th>
        <th style="border: 1px solid #ddd; padding: 10px;">{% trans 'Added account' %}</th>
        <th style="border: 1px solid #ddd; padding: 10px;">{% trans 'Deleted account' %}</th>
    </tr>

@@ -1,36 +0,0 @@
{% load i18n %}

<h3>{% trans 'Task name' %}: {{ name }}</h3>
<h3>{% trans 'Task execution id' %}: {{ execution_id }}</h3>
<p>{% trans 'Respectful' %} {{ recipient }}</p>
<p>{% trans 'Hello! The following is the failure of changing the password of your assets or pushing the account. Please check and handle it in time.' %}</p>
<table style="width: 100%; border-collapse: collapse; max-width: 100%; text-align: left; margin-top: 20px;">
    <caption></caption>
    <thead>
    <tr style="background-color: #f2f2f2;">
        <th style="border: 1px solid #ddd; padding: 10px;">{% trans 'Asset' %}</th>
        <th style="border: 1px solid #ddd; padding: 10px;">{% trans 'Account' %}</th>
        <th style="border: 1px solid #ddd; padding: 10px;">{% trans 'Error' %}</th>
    </tr>
    </thead>
    <tbody>
    {% for asset_name, account_username, error in asset_account_errors %}
        <tr>
            <td style="border: 1px solid #ddd; padding: 10px;">{{ asset_name }}</td>
            <td style="border: 1px solid #ddd; padding: 10px;">{{ account_username }}</td>
            <td style="border: 1px solid #ddd; padding: 10px;">
                <div style="
                    max-width: 90%;
                    white-space: nowrap;
                    overflow: hidden;
                    text-overflow: ellipsis;
                    display: block;"
                    title="{{ error }}"
                >
                    {{ error }}
                </div>
            </td>
        </tr>
    {% endfor %}
    </tbody>
</table>
@@ -32,7 +32,6 @@ __all__ = [

class AssetFilterSet(BaseFilterSet):
    platform = django_filters.CharFilter(method='filter_platform')
    exclude_platform = django_filters.CharFilter(field_name="platform__name", lookup_expr='exact', exclude=True)
    domain = django_filters.CharFilter(method='filter_domain')
    type = django_filters.CharFilter(field_name="platform__type", lookup_expr="exact")
    category = django_filters.CharFilter(field_name="platform__category", lookup_expr="exact")
@@ -93,6 +92,7 @@ class AssetViewSet(SuggestionMixin, OrgBulkModelViewSet):
    model = Asset
    filterset_class = AssetFilterSet
    search_fields = ("name", "address", "comment")
    ordering = ('name',)
    ordering_fields = ('name', 'address', 'connectivity', 'platform', 'date_updated', 'date_created')
    serializer_classes = (
        ("default", serializers.AssetSerializer),

@@ -19,6 +19,7 @@ class DomainViewSet(OrgBulkModelViewSet):
    model = Domain
    filterset_fields = ("name",)
    search_fields = filterset_fields
    ordering = ('name',)
    serializer_classes = {
        'default': serializers.DomainSerializer,
        'list': serializers.DomainListSerializer,
@@ -29,10 +30,6 @@ class DomainViewSet(OrgBulkModelViewSet):
            return serializers.DomainWithGatewaySerializer
        return super().get_serializer_class()

    def partial_update(self, request, *args, **kwargs):
        kwargs['partial'] = True
        return self.update(request, *args, **kwargs)


class GatewayViewSet(HostViewSet):
    perm_model = Gateway

@@ -22,7 +22,6 @@ from orgs.utils import current_org
from rbac.permissions import RBACPermission
from .. import serializers
from ..models import Node
from ..signal_handlers import update_nodes_assets_amount
from ..tasks import (
    update_node_assets_hardware_info_manual,
    test_node_assets_connectivity_manual,
@@ -95,7 +94,6 @@ class NodeAddChildrenApi(generics.UpdateAPIView):
        children = Node.objects.filter(id__in=node_ids)
        for node in children:
            node.parent = instance
        update_nodes_assets_amount.delay(ttl=5, node_ids=(instance.id,))
        return Response("OK")



@@ -21,7 +21,6 @@ class AssetPlatformViewSet(JMSModelViewSet):
    }
    filterset_fields = ['name', 'category', 'type']
    search_fields = ['name']
    ordering = ['-internal', 'name']
    rbac_perms = {
        'categories': 'assets.view_platform',
        'type_constraints': 'assets.view_platform',

@@ -12,8 +12,7 @@ from sshtunnel import SSHTunnelForwarder

from assets.automations.methods import platform_automation_methods
from common.utils import get_logger, lazyproperty, is_openssh_format_key, ssh_pubkey_gen
from ops.ansible import JMSInventory, DefaultCallback, SuperPlaybookRunner
from ops.ansible.interface import interface
from ops.ansible import JMSInventory, PlaybookRunner, DefaultCallback

logger = get_logger(__name__)

@@ -55,9 +54,7 @@ class SSHTunnelManager:
                not_valid.append(k)
            else:
                local_bind_port = server.local_bind_port

                host['ansible_host'] = jms_asset['address'] = host[
                    'login_host'] = interface.get_gateway_proxy_host()
                host['ansible_host'] = jms_asset['address'] = host['login_host'] = '127.0.0.1'
                host['ansible_port'] = jms_asset['port'] = host['login_port'] = local_bind_port
                servers.append(server)

@@ -272,7 +269,7 @@ class BasePlaybookManager:
            if not playbook_path:
                continue

            runer = SuperPlaybookRunner(
            runer = PlaybookRunner(
                inventory_path,
                playbook_path,
                self.runtime_dir,
@@ -300,16 +297,12 @@ class BasePlaybookManager:
            for host in hosts:
                result = cb.host_results.get(host)
                if state == 'ok':
                    self.on_host_success(host, result.get('ok', ''))
                    self.on_host_success(host, result)
                elif state == 'skipped':
                    pass
                else:
                    error = hosts.get(host)
                    self.on_host_error(
                        host, error,
                        result.get('failures', '')
                        or result.get('dark', '')
                    )
                    self.on_host_error(host, error, result)

    def on_runner_failed(self, runner, e):
        print("Runner failed: {} {}".format(e, self))
@@ -321,7 +314,7 @@ class BasePlaybookManager:
    def delete_runtime_dir(self):
        if settings.DEBUG_DEV:
            return
        shutil.rmtree(self.runtime_dir, ignore_errors=True)
        shutil.rmtree(self.runtime_dir)

    def run(self, *args, **kwargs):
        print(">>> 任务准备阶段\n")
@@ -340,7 +333,6 @@ class BasePlaybookManager:
            ssh_tunnel = SSHTunnelManager()
            ssh_tunnel.local_gateway_prepare(runner)
            try:
                kwargs.update({"clean_workspace": False})
                cb = runner.run(**kwargs)
                self.on_runner_success(runner, cb)
            except Exception as e:

@@ -3,7 +3,6 @@
  vars:
    ansible_shell_type: sh
    ansible_connection: local
    ansible_python_interpreter: /opt/py3/bin/python

  tasks:
    - name: Test asset connection (pyfreerdp)

@@ -19,6 +19,3 @@
        become_user: "{{ custom_become_user | default('') }}"
        become_password: "{{ custom_become_password | default('') }}"
        become_private_key_path: "{{ custom_become_private_key_path | default(None) }}"
        old_ssh_version: "{{ jms_asset.old_ssh_version | default(False) }}"
        gateway_args: "{{ jms_asset.ansible_ssh_common_args | default(None) }}"


@@ -1,11 +0,0 @@
- hosts: custom
  gather_facts: no
  vars:
    ansible_connection: local
    ansible_shell_type: sh

  tasks:
    - name: Test asset connection (telnet)
      telnet_ping:
        login_host: "{{ jms_asset.address }}"
        login_port: "{{ jms_asset.port }}"
@@ -1,16 +0,0 @@
id: ping_by_telnet
name: "{{ 'Ping by telnet' | trans }}"
category:
  - device
  - host
type:
  - all
method: ping
protocol: telnet
priority: 50

i18n:
  Ping by telnet:
    zh: '使用 Python 模块 telnet 测试主机可连接性'
    en: 'Ping by telnet module'
    ja: 'Pythonモジュールtelnetを使用したホスト接続性のテスト'
@@ -25,22 +25,14 @@ class PingManager(BasePlaybookManager):

    def on_host_success(self, host, result):
        asset, account = self.host_asset_and_account_mapper.get(host)
        try:
            asset.set_connectivity(Connectivity.OK)
            if not account:
                return
            account.set_connectivity(Connectivity.OK)
        except Exception as e:
            print(f'\033[31m Update account {account.name} or '
                  f'update asset {asset.name} connectivity failed: {e} \033[0m\n')
        asset.set_connectivity(Connectivity.OK)
        if not account:
            return
        account.set_connectivity(Connectivity.OK)

    def on_host_error(self, host, error, result):
        asset, account = self.host_asset_and_account_mapper.get(host)
        try:
            asset.set_connectivity(Connectivity.ERR)
            if not account:
                return
            account.set_connectivity(Connectivity.ERR)
        except Exception as e:
            print(f'\033[31m Update account {account.name} or '
                  f'update asset {asset.name} connectivity failed: {e} \033[0m\n')
        asset.set_connectivity(Connectivity.ERR)
        if not account:
            return
        account.set_connectivity(Connectivity.ERR)

@@ -92,26 +92,18 @@ class PingGatewayManager:
    @staticmethod
    def on_host_success(gateway, account):
        print('\033[32m {} -> {}\033[0m\n'.format(gateway, account))
        try:
            gateway.set_connectivity(Connectivity.OK)
            if not account:
                return
            account.set_connectivity(Connectivity.OK)
        except Exception as e:
            print(f'\033[31m Update account {account.name} or '
                  f'update asset {gateway.name} connectivity failed: {e} \033[0m\n')
        gateway.set_connectivity(Connectivity.OK)
        if not account:
            return
        account.set_connectivity(Connectivity.OK)

    @staticmethod
    def on_host_error(gateway, account, error):
        print('\033[31m {} -> {} 原因: {} \033[0m\n'.format(gateway, account, error))
        try:
            gateway.set_connectivity(Connectivity.ERR)
            if not account:
                return
            account.set_connectivity(Connectivity.ERR)
        except Exception as e:
            print(f'\033[31m Update account {account.name} or '
                  f'update asset {gateway.name} connectivity failed: {e} \033[0m\n')
        gateway.set_connectivity(Connectivity.ERR)
        if not account:
            return
        account.set_connectivity(Connectivity.ERR)

    @staticmethod
    def before_runner_start():

@@ -38,14 +38,6 @@ class Protocol(ChoicesMixin, models.TextChoices):
            cls.ssh: {
                'port': 22,
                'secret_types': ['password', 'ssh_key'],
                'setting': {
                    'old_ssh_version': {
                        'type': 'bool',
                        'default': False,
                        'label': _('Old SSH version'),
                        'help_text': _('Old SSH version like openssh 5.x or 6.x')
                    }
                }
            },
            cls.sftp: {
                'port': 22,
@@ -195,14 +187,6 @@ class Protocol(ChoicesMixin, models.TextChoices):
                'port': 27017,
                'required': True,
                'secret_types': ['password'],
                'setting': {
                    'auth_source': {
                        'type': 'str',
                        'default': 'admin',
                        'label': _('Auth source'),
                        'help_text': _('The database to authenticate against')
                    }
                }
            },
            cls.redis: {
                'port': 6379,
@@ -286,7 +270,7 @@ class Protocol(ChoicesMixin, models.TextChoices):
                        'label': _('API mode'),
                        'choices': [
                            ('gpt-3.5-turbo', 'GPT-3.5 Turbo'),
                            ('gpt-3.5-turbo-1106', 'GPT-3.5 Turbo 1106'),
                            ('gpt-3.5-turbo-16k', 'GPT-3.5 Turbo 16K'),
                        ]
                    }
                }
@@ -296,8 +280,7 @@ class Protocol(ChoicesMixin, models.TextChoices):
            choices = protocols[cls.chatgpt]['setting']['api_mode']['choices']
            choices.extend([
                ('gpt-4', 'GPT-4'),
                ('gpt-4-turbo', 'GPT-4 Turbo'),
                ('gpt-4o', 'GPT-4o'),
                ('gpt-4-32k', 'GPT-4 32K'),
            ])
        return protocols


@@ -1,7 +1,6 @@
# Generated by Django 3.2.12 on 2022-07-11 06:13

import time
import math
from django.utils import timezone
from itertools import groupby
from django.db import migrations
@@ -41,13 +40,9 @@ def migrate_asset_accounts(apps, schema_editor):
        if system_user:
            # 更新一次系统用户的认证属性
            account_values.update({attr: getattr(system_user, attr, '') for attr in all_attrs})
            account_values['created_by'] = str(system_user.id)
            account_values['privileged'] = system_user.type == 'admin' \
                                           or system_user.username in ['root', 'Administrator']
            if system_user.su_enabled and system_user.su_from:
                created_by = f'{str(system_user.id)}::{str(system_user.su_from.username)}'
            else:
                created_by = str(system_user.id)
            account_values['created_by'] = created_by

        auth_book_auth = {attr: getattr(auth_book, attr, '') for attr in all_attrs if getattr(auth_book, attr, '')}
        # 最终优先使用 auth_book 的认证属性
@@ -122,70 +117,6 @@ def migrate_asset_accounts(apps, schema_editor):
    print("\t - histories: {}".format(len(accounts_to_history)))


def update_asset_accounts_su_from(apps, schema_editor):
    # Update accounts su_from
    print("\n\tStart update asset accounts su_from field")
    account_model = apps.get_model('accounts', 'Account')
    platform_model = apps.get_model('assets', 'Platform')
    asset_model = apps.get_model('assets', 'Asset')
    platform_ids = list(platform_model.objects.filter(su_enabled=True).values_list('id', flat=True))

    count = 0
    step_size = 1000
    count_account = 0
    while True:
        start = time.time()
        asset_ids = asset_model.objects \
            .filter(platform_id__in=platform_ids) \
            .values_list('id', flat=True)[count:count + step_size]
        asset_ids = list(asset_ids)
        if not asset_ids:
            break
        count += len(asset_ids)

        accounts = list(account_model.objects.filter(asset_id__in=asset_ids))

        # {asset_id_account_username: account.id}}
        asset_accounts_mapper = {}
        for a in accounts:
            try:
                k = f'{a.asset_id}_{a.username}'
                asset_accounts_mapper[k] = str(a.id)
            except Exception as e:
                pass

        update_accounts = []
        for a in accounts:
            try:
                if not a.created_by:
                    continue
                created_by_list = a.created_by.split('::')
                if len(created_by_list) != 2:
                    continue
                su_from_username = created_by_list[1]
                if not su_from_username:
                    continue
                k = f'{a.asset_id}_{su_from_username}'
                su_from_id = asset_accounts_mapper.get(k)
                if not su_from_id:
                    continue
                a.su_from_id = su_from_id
                update_accounts.append(a)
            except Exception as e:
                pass

        count_account += len(update_accounts)

        log_msg = "\t - [{}]: Update accounts su_from: {}-{} {:.2f}s"
        try:
            account_model.objects.bulk_update(update_accounts, ['su_from_id'])
        except Exception as e:
            status = 'Failed'
        else:
            status = 'Success'
        print(log_msg.format(status, count_account - len(update_accounts), count_account, time.time() - start))


def migrate_db_accounts(apps, schema_editor):
    app_perm_model = apps.get_model('perms', 'ApplicationPermission')
    account_model = apps.get_model('accounts', 'Account')
@@ -265,6 +196,5 @@ class Migration(migrations.Migration):

    operations = [
        migrations.RunPython(migrate_asset_accounts),
        migrations.RunPython(update_asset_accounts_su_from),
        migrations.RunPython(migrate_db_accounts),
    ]

@@ -73,7 +73,3 @@ class Gateway(Host):
    def private_key_path(self):
        account = self.select_account
        return account.private_key_path if account else None

    def get_private_key_path(self, path):
        account = self.select_account
        return account.get_private_key_path(path) if account else None

@@ -1,7 +1,8 @@
from django.db import models
from django.utils.translation import gettext_lazy as _

from assets.const import AllTypes, Category, Protocol
from assets.const import AllTypes
from assets.const import Protocol
from common.db.fields import JsonDictTextField
from common.db.models import JMSBaseModel

@@ -118,15 +119,6 @@ class Platform(LabeledMixin, JMSBaseModel):
            )
        return linux.id

    def is_huawei(self):
        if self.category != Category.DEVICE:
            return False
        if 'huawei' in self.name.lower():
            return True
        if '华为' in self.name:
            return True
        return False

    def __str__(self):
        return self.name


@@ -22,36 +22,6 @@ class WebSpecSerializer(serializers.ModelSerializer):
            'submit_selector', 'script'
        ]

    def get_fields(self):
        fields = super().get_fields()
        if self.is_retrieve():
            # 查看 Web 资产详情时
            self.pop_fields_if_need(fields)
        return fields

    def is_retrieve(self):
        try:
            self.context.get('request').method and self.parent.instance.web
            return True
        except Exception:
            return False

    def pop_fields_if_need(self, fields):
        fields_script = ['script']
        fields_basic = ['username_selector', 'password_selector', 'submit_selector']
        autofill = self.parent.instance.web.autofill
        pop_fields_mapper = {
            FillType.no: fields_script + fields_basic,
            FillType.basic: fields_script,
            FillType.script: fields_basic,
        }
        fields_pop = pop_fields_mapper.get(autofill, [])
        for f in fields_pop:
            fields.pop(f, None)
        return fields




category_spec_serializer_map = {
    'database': DatabaseSpecSerializer,

@@ -1,13 +1,12 @@
# -*- coding: utf-8 -*-
#
from django.db.models import Count, Q
from django.db.models import Count
from django.utils.translation import gettext_lazy as _
from rest_framework import serializers

from common.serializers import ResourceLabelsMixin
from common.serializers.fields import ObjectRelatedField
from orgs.mixins.serializers import BulkOrgResourceModelSerializer
from assets.models.gateway import Gateway
from .gateway import GatewayWithAccountSecretSerializer
from ..models import Domain

@@ -16,7 +15,7 @@ __all__ = ['DomainSerializer', 'DomainWithGatewaySerializer', 'DomainListSeriali

class DomainSerializer(ResourceLabelsMixin, BulkOrgResourceModelSerializer):
    gateways = ObjectRelatedField(
        many=True, required=False, label=_('Gateway'), queryset=Gateway.objects
        many=True, required=False, label=_('Gateway'), read_only=True,
    )

    class Meta:
@@ -26,9 +25,6 @@ class DomainSerializer(ResourceLabelsMixin, BulkOrgResourceModelSerializer):
        fields_m2m = ['assets', 'gateways']
        read_only_fields = ['date_created']
        fields = fields_small + fields_m2m + read_only_fields
        extra_kwargs = {
            'assets': {'required': False},
        }

    def to_representation(self, instance):
        data = super().to_representation(instance)
@@ -39,17 +35,12 @@ class DomainSerializer(ResourceLabelsMixin, BulkOrgResourceModelSerializer):
        data['assets'] = [i for i in assets if str(i['id']) not in gateway_ids]
        return data

    def create(self, validated_data):
        assets = validated_data.pop('assets', [])
        gateways = validated_data.pop('gateways', [])
        validated_data['assets'] = assets + gateways
        return super().create(validated_data)

    def update(self, instance, validated_data):
        assets = validated_data.pop('assets', list(instance.assets.all()))
        gateways = validated_data.pop('gateways', list(instance.gateways.all()))
        validated_data['assets'] = assets + gateways
        return super().update(instance, validated_data)
        assets = validated_data.pop('assets', [])
        assets = assets + list(instance.gateways)
        validated_data['assets'] = assets
        instance = super().update(instance, validated_data)
        return instance

    @classmethod
    def setup_eager_loading(cls, queryset):
@@ -67,7 +58,7 @@ class DomainListSerializer(DomainSerializer):
    @classmethod
    def setup_eager_loading(cls, queryset):
        queryset = queryset.annotate(
            assets_amount=Count('assets', filter=~Q(assets__platform__name='Gateway'), distinct=True),
            assets_amount=Count('assets', distinct=True),
        )
        return queryset


@@ -67,5 +67,5 @@ def set_assets_size_to_setting(sender, **kwargs):

    if amount > 20000:
        settings.ASSET_SIZE = 'large'
    elif amount > 5000:
    elif amount > 2000:
        settings.ASSET_SIZE = 'medium'

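The hunk above swaps the medium-size cutoff between 5000 and 2000 while 'large' stays at 20000. A hypothetical helper showing the tiering, with the medium threshold as a parameter since the diff touches exactly that value (the 'small' fallback is an assumption, not shown in the hunk):

```python
def asset_size(amount: int, medium_threshold: int = 2000) -> str:
    # Mirrors the hunk's tiering: >20000 is 'large', above the medium
    # threshold is 'medium'; anything else falls back to 'small'
    # (assumed default, not visible in the diff).
    if amount > 20000:
        return 'large'
    if amount > medium_threshold:
        return 'medium'
    return 'small'
```

With the lower cutoff, a deployment of 3000 assets is already treated as 'medium'; with the 5000 cutoff it would not be.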
@@ -88,7 +88,8 @@ class KubernetesClient:
        try:
            data = getattr(self, func_name)(*args)
        except Exception as e:
            logger.error(f'K8S tree get {tp} error: {e}')
            logger.error(e)
            raise e

        if self.server:
            self.server.stop()

@@ -1,5 +1,6 @@
# -*- coding: utf-8 -*-
#

from importlib import import_module

from django.conf import settings
@@ -65,7 +66,7 @@ class FTPLogViewSet(OrgModelViewSet):
    date_range_filter_fields = [
        ('date_start', ('date_from', 'date_to'))
    ]
    filterset_fields = ['user', 'asset', 'account', 'filename', 'session']
    filterset_fields = ['user', 'asset', 'account', 'filename']
    search_fields = filterset_fields
    ordering = ['-date_start']
    http_method_names = ['post', 'get', 'head', 'options', 'patch']
@@ -268,7 +269,7 @@ class UserSessionViewSet(CommonApiMixin, viewsets.ModelViewSet):
|
||||
return user_ids
|
||||
|
||||
def get_queryset(self):
|
||||
keys = user_session_manager.get_keys()
|
||||
keys = UserSession.get_keys()
|
||||
queryset = UserSession.objects.filter(key__in=keys)
|
||||
if current_org.is_root():
|
||||
return queryset
|
||||
@@ -287,6 +288,6 @@ class UserSessionViewSet(CommonApiMixin, viewsets.ModelViewSet):
|
||||
|
||||
keys = queryset.values_list('key', flat=True)
|
||||
for key in keys:
|
||||
user_session_manager.remove(key)
|
||||
user_session_manager.decrement_or_remove(key)
|
||||
queryset.delete()
|
||||
return Response(status=status.HTTP_200_OK)
|
||||
|
@@ -12,10 +12,7 @@ from common.utils.timezone import as_current_tz
from jumpserver.utils import current_request
from orgs.models import Organization
from orgs.utils import get_current_org_id
from settings.models import Setting
from settings.serializers import SettingsSerializer
from users.models import Preference
from users.serializers import PreferenceSerializer
from .backends import get_operate_log_storage

logger = get_logger(__name__)

@@ -90,15 +87,19 @@ class OperatorLogHandler(metaclass=Singleton):
        return log_id, before, after

    @staticmethod
    def get_resource_display(resource):
        if isinstance(resource, Setting):
            serializer = SettingsSerializer()
            resource_display = serializer.get_field_label(resource.name)
        elif isinstance(resource, Preference):
            serializer = PreferenceSerializer()
            resource_display = serializer.get_field_label(resource.name)
        else:
            resource_display = str(resource)
    def get_resource_display_from_setting(resource):
        resource_display = None
        setting_serializer = SettingsSerializer()
        label = setting_serializer.get_field_label(resource)
        if label is not None:
            resource_display = label
        return resource_display

    def get_resource_display(self, resource):
        resource_display = str(resource)
        return_value = self.get_resource_display_from_setting(resource_display)
        if return_value is not None:
            resource_display = return_value
        return resource_display

    @staticmethod

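Both `get_resource_display` variants above implement the same pattern: default to `str(resource)` and upgrade to a friendlier label when one exists. A standalone sketch of that fallback, with a plain dict standing in for `SettingsSerializer.get_field_label` (the dict and function names are hypothetical):

```python
def get_display(resource, label_lookup):
    # Fall back to str(resource) unless a friendlier label exists;
    # label_lookup is a hypothetical stand-in for the serializer's
    # get_field_label call in the hunk above.
    display = str(resource)
    label = label_lookup.get(display)
    if label is not None:
        display = label
    return display
```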
@@ -288,9 +288,16 @@ class UserSession(models.Model):
        ttl = caches[settings.SESSION_CACHE_ALIAS].ttl(cache_key)
        return timezone.now() + timedelta(seconds=ttl)

    @staticmethod
    def get_keys():
        session_store_cls = import_module(settings.SESSION_ENGINE).SessionStore
        cache_key_prefix = session_store_cls.cache_key_prefix
        keys = caches[settings.SESSION_CACHE_ALIAS].iter_keys('*')
        return [k.replace(cache_key_prefix, '') for k in keys]

    @classmethod
    def clear_expired_sessions(cls):
        keys = user_session_manager.get_keys()
        keys = cls.get_keys()
        cls.objects.exclude(key__in=keys).delete()

    class Meta:

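`UserSession.get_keys` above strips the session store's cache-key prefix from each raw cache key with `str.replace`. The same idea, framework-free:

```python
def strip_prefix(keys, prefix):
    # Mirrors the list comprehension in get_keys above; note that
    # str.replace removes every occurrence of the prefix, which is
    # fine here because the prefix only appears at the front of a key.
    return [k.replace(prefix, '') for k in keys]
```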
@@ -43,7 +43,7 @@ class FTPLogSerializer(serializers.ModelSerializer):
    fields_small = fields_mini + [
        "user", "remote_addr", "asset", "account",
        "org_id", "operate", "filename", "date_start",
        "is_success", "has_file", "session"
        "is_success", "has_file",
    ]
    fields = fields_small

@@ -36,7 +36,6 @@ class AuthBackendLabelMapping(LazyObject):
        backend_label_mapping[settings.AUTH_BACKEND_AUTH_TOKEN] = _("Auth Token")
        backend_label_mapping[settings.AUTH_BACKEND_WECOM] = _("WeCom")
        backend_label_mapping[settings.AUTH_BACKEND_FEISHU] = _("FeiShu")
        backend_label_mapping[settings.AUTH_BACKEND_LARK] = 'Lark'
        backend_label_mapping[settings.AUTH_BACKEND_SLACK] = _("Slack")
        backend_label_mapping[settings.AUTH_BACKEND_DINGTALK] = _("DingTalk")
        backend_label_mapping[settings.AUTH_BACKEND_TEMP_TOKEN] = _("Temporary token")

@@ -178,7 +178,7 @@ def on_django_start_set_operate_log_monitor_models(sender, **kwargs):
        'PermedAsset', 'PermedAccount', 'MenuPermission',
        'Permission', 'TicketSession', 'ApplyLoginTicket',
        'ApplyCommandTicket', 'ApplyLoginAssetTicket',
        'FavoriteAsset',
        'FavoriteAsset', 'Asset'
    }
    for i, app in enumerate(apps.get_models(), 1):
        app_name = app._meta.app_label

@@ -7,17 +7,18 @@ import subprocess
from celery import shared_task
from django.conf import settings
from django.core.files.storage import default_storage
from django.db import transaction
from django.utils import timezone
from django.utils.translation import gettext_lazy as _

from common.const.crontab import CRONTAB_AT_AM_TWO
from common.storage.ftp_file import FTPFileStorageHandler
from common.utils import get_log_keep_day, get_logger
from ops.celery.decorator import register_as_period_task
from common.storage.ftp_file import FTPFileStorageHandler
from ops.celery.decorator import (
    register_as_period_task, after_app_shutdown_clean_periodic
)
from ops.models import CeleryTaskExecution
from terminal.backends import server_replay_storage
from terminal.models import Session, Command
from terminal.backends import server_replay_storage
from .models import UserLoginLog, OperateLog, FTPLog, ActivityLog, PasswordChangeLog

logger = get_logger(__name__)

@@ -56,9 +57,9 @@ def clean_ftp_log_period():
    now = timezone.now()
    days = get_log_keep_day('FTP_LOG_KEEP_DAYS')
    expired_day = now - datetime.timedelta(days=days)
    file_store_dir = os.path.join(default_storage.base_location, FTPLog.upload_to)
    file_store_dir = os.path.join(default_storage.base_location, 'ftp_file')
    FTPLog.objects.filter(date_start__lt=expired_day).delete()
    command = "find %s -mtime +%s -type f -exec rm -f {} \\;" % (
    command = "find %s -mtime +%s -exec rm -f {} \\;" % (
        file_store_dir, days
    )
    subprocess.call(command, shell=True)

@@ -83,15 +84,6 @@ def clean_celery_tasks_period():
    subprocess.call(command, shell=True)


def batch_delete(queryset, batch_size=3000):
    model = queryset.model
    count = queryset.count()
    with transaction.atomic():
        for i in range(0, count, batch_size):
            pks = queryset[i:i + batch_size].values_list('id', flat=True)
            model.objects.filter(id__in=list(pks)).delete()

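The `batch_delete` helper above deletes a queryset in fixed-size chunks so a single huge DELETE does not hold locks for too long. The same chunking logic without the ORM, with `delete_fn` as a hypothetical stand-in for the `model.objects.filter(...).delete()` call:

```python
def batch_delete(items, delete_fn, batch_size=3000):
    # Delete in fixed-size chunks, mirroring the helper in the hunk
    # above; delete_fn receives one chunk at a time.
    deleted = 0
    for i in range(0, len(items), batch_size):
        chunk = items[i:i + batch_size]
        delete_fn(chunk)
        deleted += len(chunk)
    return deleted
```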
def clean_expired_session_period():
    logger.info("Start clean expired session record, commands and replay")
    days = get_log_keep_day('TERMINAL_SESSION_KEEP_DURATION')

@@ -101,9 +93,9 @@ def clean_expired_session_period():
    expired_commands = Command.objects.filter(timestamp__lt=timestamp)
    replay_dir = os.path.join(default_storage.base_location, 'replay')

    batch_delete(expired_sessions)
    expired_sessions.delete()
    logger.info("Clean session item done")
    batch_delete(expired_commands)
    expired_commands.delete()
    logger.info("Clean session command done")
    command = "find %s -mtime +%s \\( -name '*.json' -o -name '*.tar' -o -name '*.gz' \\) -exec rm -f {} \\;" % (
        replay_dir, days

@@ -116,6 +108,7 @@

@shared_task(verbose_name=_('Clean audits session task log'))
@register_as_period_task(crontab=CRONTAB_AT_AM_TWO)
@after_app_shutdown_clean_periodic
def clean_audits_log_period():
    print("Start clean audit session task log")
    clean_login_log_period()

@@ -2,15 +2,13 @@
#

from .access_key import *
from .common import *
from .confirm import *
from .connection_token import *
from .feishu import *
from .lark import *
from .login_confirm import *
from .mfa import *
from .password import *
from .session import *
from .sso import *
from .temp_token import *
from .token import *
from .common import *

@@ -12,6 +12,7 @@ from common.permissions import IsValidUser, OnlySuperUser
from common.utils import get_logger
from users.models import User


logger = get_logger(__file__)


@@ -23,7 +24,6 @@ class QRUnBindBase(APIView):
        'wecom': {'user_field': 'wecom_id', 'not_bind_err': errors.WeComNotBound},
        'dingtalk': {'user_field': 'dingtalk_id', 'not_bind_err': errors.DingTalkNotBound},
        'feishu': {'user_field': 'feishu_id', 'not_bind_err': errors.FeiShuNotBound},
        'lark': {'user_field': 'lark_id', 'not_bind_err': errors.LarkNotBound},
        'slack': {'user_field': 'slack_id', 'not_bind_err': errors.SlackNotBound},
    }
    user = self.user

@@ -223,17 +223,12 @@ class ExtraActionApiMixin(RDPFileClientProtocolURLMixin):
    validate_exchange_token: callable

    @action(methods=['POST', 'GET'], detail=True, url_path='rdp-file')
    def get_rdp_file(self, request, *args, **kwargs):
    def get_rdp_file(self, *args, **kwargs):
        token = self.get_object()
        token.is_valid()
        filename, content = self.get_rdp_file_info(token)
        filename = '{}.rdp'.format(filename)
        response = HttpResponse(content, content_type='application/octet-stream')

        if is_true(request.query_params.get('reusable')):
            token.set_reusable(True)
            filename = '{}-{}'.format(filename, token.date_expired.strftime('%Y%m%d_%H%M%S'))

        filename += '.rdp'
        response['Content-Disposition'] = 'attachment; filename*=UTF-8\'\'%s' % filename
        return response

@@ -384,7 +379,6 @@ class ConnectionTokenViewSet(ExtraActionApiMixin, RootOrgViewMixin, JMSModelView

        if account.username != AliasAccount.INPUT:
            data['input_username'] = ''

        ticket = self._validate_acl(user, asset, account)
        if ticket:
            data['from_ticket'] = ticket

@@ -419,10 +413,7 @@ class ConnectionTokenViewSet(ExtraActionApiMixin, RootOrgViewMixin, JMSModelView

    def _validate_acl(self, user, asset, account):
        from acls.models import LoginAssetACL
        kwargs = {'user': user, 'asset': asset, 'account': account}
        if account.username == AliasAccount.INPUT:
            kwargs['account_username'] = self.input_username
        acls = LoginAssetACL.filter_queryset(**kwargs)
        acls = LoginAssetACL.filter_queryset(user=user, asset=asset, account=account)
        ip = get_request_ip_or_data(self.request)
        acl = LoginAssetACL.get_match_rule_acls(user, ip, acls)
        if not acl:

@@ -512,16 +503,20 @@ class SuperConnectionTokenViewSet(ConnectionTokenViewSet):
        token.is_valid()
        serializer = self.get_serializer(instance=token)

        expire_now = request.data.get('expire_now', True)
        expire_now = request.data.get('expire_now', None)
        asset_type = token.asset.type
        # Set the default value
        if asset_type in ['k8s', 'kubernetes']:
            expire_now = False
        if expire_now is None:
            # TODO: temporary special case, k8s tokens do not expire
            if asset_type in ['k8s', 'kubernetes']:
                expire_now = False
            else:
                expire_now = not settings.CONNECTION_TOKEN_REUSABLE

        if token.is_reusable and settings.CONNECTION_TOKEN_REUSABLE:
            logger.debug('Token is reusable, not expire now')
        elif is_false(expire_now):
        if is_false(expire_now):
            logger.debug('Api specified, now expire now')
        elif token.is_reusable and settings.CONNECTION_TOKEN_REUSABLE:
            logger.debug('Token is reusable, not expire now')
        else:
            token.expire()

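The `expire_now` handling above resolves a tri-state flag: an explicit value from the request wins, k8s assets default to keeping the token alive, and everything else falls back to the global reusable setting. The resolution order as a standalone sketch (function name assumed):

```python
def resolve_expire_now(requested, asset_type, token_reusable_setting):
    # Explicit API value wins; k8s defaults to not expiring (the TODO
    # special case above); otherwise invert the global reusable flag.
    if requested is not None:
        return requested
    if asset_type in ('k8s', 'kubernetes'):
        return False
    return not token_reusable_setting
```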
@@ -1,8 +0,0 @@
from common.utils import get_logger
from .feishu import FeiShuEventSubscriptionCallback

logger = get_logger(__name__)


class LarkEventSubscriptionCallback(FeiShuEventSubscriptionCallback):
    pass

@@ -9,7 +9,6 @@ from common.utils import get_logger
from .. import errors, mixins

__all__ = ['TicketStatusApi']

logger = get_logger(__name__)


@@ -9,7 +9,7 @@ from rest_framework.permissions import AllowAny
from rest_framework.response import Response
from rest_framework.serializers import ValidationError

from common.exceptions import JMSException, UnexpectError
from common.exceptions import UnexpectError
from common.utils import get_logger
from users.models.user import User
from .. import errors

@@ -50,10 +50,7 @@ class MFASendCodeApi(AuthMixin, CreateAPIView):
        mfa_type = serializer.validated_data['type']

        if not username:
            try:
                user = self.get_user_from_session()
            except errors.SessionEmptyError as e:
                raise ValidationError({'error': e})
            user = self.get_user_from_session()
        else:
            user = self.get_user_from_db(username)

@@ -64,8 +61,6 @@ class MFASendCodeApi(AuthMixin, CreateAPIView):

        try:
            mfa_backend.send_challenge()
        except JMSException:
            raise
        except Exception as e:
            raise UnexpectError(str(e))

@@ -1,68 +0,0 @@
import time
from threading import Thread

from django.conf import settings
from django.contrib.auth import logout
from django.contrib.auth.models import AnonymousUser
from rest_framework import generics
from rest_framework import status
from rest_framework.response import Response

from common.sessions.cache import user_session_manager
from common.utils import get_logger

__all__ = ['UserSessionApi']

logger = get_logger(__name__)


class UserSessionManager:

    def __init__(self, request):
        self.request = request
        self.session = request.session

    def connect(self):
        user_session_manager.add_or_increment(self.session.session_key)

    def disconnect(self):
        user_session_manager.decrement(self.session.session_key)
        if self.should_delete_session():
            thread = Thread(target=self.delay_delete_session)
            thread.start()

    def should_delete_session(self):
        return (self.session.modified or settings.SESSION_SAVE_EVERY_REQUEST) and \
            not self.session.is_empty() and \
            self.session.get_expire_at_browser_close() and \
            not user_session_manager.check_active(self.session.session_key)

    def delay_delete_session(self):
        timeout = 6
        check_interval = 0.5

        start_time = time.time()
        while time.time() - start_time < timeout:
            time.sleep(check_interval)
            if user_session_manager.check_active(self.session.session_key):
                return

        logout(self.request)


class UserSessionApi(generics.RetrieveDestroyAPIView):
    permission_classes = ()

    def retrieve(self, request, *args, **kwargs):
        if isinstance(request.user, AnonymousUser):
            return Response(status=status.HTTP_200_OK)

        UserSessionManager(request).connect()
        return Response(status=status.HTTP_200_OK)

    def destroy(self, request, *args, **kwargs):
        if isinstance(request.user, AnonymousUser):
            return Response(status=status.HTTP_200_OK)

        UserSessionManager(request).disconnect()
        return Response(status=status.HTTP_204_NO_CONTENT)
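`delay_delete_session` in the deleted file above is a grace-period poll: wait up to `timeout` seconds, checking every `check_interval`, and only log out if the session never becomes active again (e.g. a page reload reconnects within the window). The same pattern with the Django pieces injected as callables so it runs standalone:

```python
import time

def delay_delete(check_active, on_delete, timeout=6, check_interval=0.5):
    # Poll every check_interval seconds; bail out without deleting as
    # soon as the session is active again, otherwise delete once the
    # timeout elapses. check_active/on_delete stand in for the
    # user_session_manager.check_active and logout calls above.
    start = time.time()
    while time.time() - start < timeout:
        time.sleep(check_interval)
        if check_active():
            return False
    on_delete()
    return True
```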
@@ -5,13 +5,11 @@ from django.conf import settings
from django.contrib.auth import login
from django.http.response import HttpResponseRedirect
from rest_framework import serializers
from rest_framework import status
from rest_framework.decorators import action
from rest_framework.permissions import AllowAny
from rest_framework.request import Request
from rest_framework.response import Response

from authentication.errors import ACLError
from common.api import JMSGenericViewSet
from common.const.http import POST, GET
from common.permissions import OnlySuperUser

@@ -19,10 +17,7 @@ from common.serializers import EmptySerializer
from common.utils import reverse, safe_next_url
from common.utils.timezone import utc_now
from users.models import User
from users.utils import LoginBlockUtil, LoginIpBlockUtil
from ..errors import (
    SSOAuthClosed, AuthFailedError, LoginConfirmBaseError, SSOAuthKeyTTLError
)
from ..errors import SSOAuthClosed
from ..filters import AuthKeyQueryDeclaration
from ..mixins import AuthMixin
from ..models import SSOToken

@@ -68,58 +63,31 @@ class SSOViewSet(AuthMixin, JMSGenericViewSet):
        This endpoint violates the `Restful` convention:
        `GET` should be a safe method, but this endpoint is not
        """
        status_code = status.HTTP_400_BAD_REQUEST
        request.META['HTTP_X_JMS_LOGIN_TYPE'] = 'W'
        authkey = request.query_params.get(AUTH_KEY)
        next_url = request.query_params.get(NEXT_URL)
        if not next_url or not next_url.startswith('/'):
            next_url = reverse('index')

        try:
            if not authkey:
                raise serializers.ValidationError("authkey is required")
        if not authkey:
            raise serializers.ValidationError("authkey is required")

        try:
            authkey = UUID(authkey)
            token = SSOToken.objects.get(authkey=authkey, expired=False)
        except (ValueError, SSOToken.DoesNotExist, serializers.ValidationError) as e:
            error_msg = str(e)
            self.send_auth_signal(success=False, reason=error_msg)
            return Response({'error': error_msg}, status=status_code)

        error_msg = None
        user = token.user
        username = user.username
        ip = self.get_request_ip()

        try:
            if (utc_now().timestamp() - token.date_created.timestamp()) > settings.AUTH_SSO_AUTHKEY_TTL:
                raise SSOAuthKeyTTLError()

            self._check_is_block(username, True)
            self._check_only_allow_exists_user_auth(username)
            self._check_login_acl(user, ip)
            self.check_user_login_confirm_if_need(user)

            self.request.session['auth_backend'] = settings.AUTH_BACKEND_SSO
            login(self.request, user, settings.AUTH_BACKEND_SSO)
            self.send_auth_signal(success=True, user=user)
            self.mark_mfa_ok('otp', user)

            LoginIpBlockUtil(ip).clean_block_if_need()
            LoginBlockUtil(username, ip).clean_failed_count()
            self.clear_auth_mark()
        except (ACLError, LoginConfirmBaseError):  # no need to log these
            pass
        except (AuthFailedError, SSOAuthKeyTTLError) as e:
            error_msg = e.msg
        except Exception as e:
            error_msg = str(e)
        finally:
            # Expire the token first; it may only be used this once
            token.expired = True
            token.save()

        if error_msg:
            self.send_auth_signal(success=False, username=username, reason=error_msg)
            return Response({'error': error_msg}, status=status_code)
        else:
        except (ValueError, SSOToken.DoesNotExist):
            self.send_auth_signal(success=False, reason='authkey_invalid')
            return HttpResponseRedirect(next_url)

        # Check whether the authkey has expired
        if (utc_now().timestamp() - token.date_created.timestamp()) > settings.AUTH_SSO_AUTHKEY_TTL:
            self.send_auth_signal(success=False, reason='authkey_timeout')
            return HttpResponseRedirect(next_url)

        user = token.user
        login(self.request, user, settings.AUTH_BACKEND_SSO)
        self.send_auth_signal(success=True, user=user)
        return HttpResponseRedirect(next_url)

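Both SSO variants above apply the same TTL test: an authkey is rejected once the gap between now and its creation time exceeds `AUTH_SSO_AUTHKEY_TTL`. The timestamp comparison in isolation (function name assumed):

```python
from datetime import datetime, timedelta, timezone

def authkey_expired(date_created, ttl_seconds, now=None):
    # Same comparison as the SSO view above: compare epoch timestamps
    # rather than aware datetimes directly.
    now = now or datetime.now(timezone.utc)
    return (now.timestamp() - date_created.timestamp()) > ttl_seconds
```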
@@ -4,13 +4,10 @@ from django.contrib import auth
from django.http import HttpResponseRedirect
from django.urls import reverse
from django.utils.http import urlencode
from django.utils.translation import gettext_lazy as _

from authentication.utils import build_absolute_uri
from authentication.views.mixins import FlashMessageMixin
from authentication.mixins import authenticate
from common.utils import get_logger

from authentication.mixins import authenticate

logger = get_logger(__file__)

@@ -42,7 +39,7 @@ class OAuth2AuthRequestView(View):
        return HttpResponseRedirect(redirect_url)


class OAuth2AuthCallbackView(View, FlashMessageMixin):
class OAuth2AuthCallbackView(View):
    http_method_names = ['get', ]

    def get(self, request):

@@ -54,11 +51,6 @@ class OAuth2AuthCallbackView(View, FlashMessageMixin):
        if 'code' in callback_params:
            logger.debug(log_prompt.format('Process authenticate'))
            user = authenticate(code=callback_params['code'], request=request)

            if err_msg := getattr(request, 'error_message', ''):
                login_url = reverse('authentication:login') + '?admin=1'
                return self.get_failed_response(login_url, title=_('Authentication failed'), msg=err_msg)

            if user and user.is_valid:
                logger.debug(log_prompt.format('Login: {}'.format(user)))
                auth.login(self.request, user)

@@ -55,12 +55,6 @@ class FeiShuAuthentication(JMSModelBackend):
        pass


class LarkAuthentication(FeiShuAuthentication):
    @staticmethod
    def is_enabled():
        return settings.AUTH_LARK


class SlackAuthentication(JMSModelBackend):
    """
    Does nothing at all 😺

@@ -78,6 +72,5 @@ class AuthorizationTokenAuthentication(JMSModelBackend):
    """
    Does nothing at all 😺
    """

    def authenticate(self, request, **kwargs):
        pass

@@ -52,10 +52,6 @@ class AuthFailedError(Exception):
        return str(self.msg)


class SSOAuthKeyTTLError(Exception):
    msg = 'sso_authkey_timeout'


class BlockGlobalIpLoginError(AuthFailedError):
    error = 'block_global_ip_login'

@@ -33,11 +33,6 @@ class FeiShuNotBound(JMSException):
    default_detail = _('FeiShu is not bound')


class LarkNotBound(JMSException):
    default_code = 'lark_not_bound'
    default_detail = _('Lark is not bound')


class SlackNotBound(JMSException):
    default_code = 'slack_not_bound'
    default_detail = _('Slack is not bound')

@@ -17,6 +17,10 @@ class EncryptedField(forms.CharField):


class UserLoginForm(forms.Form):
    days_auto_login = int(settings.SESSION_COOKIE_AGE / 3600 / 24)
    disable_days_auto_login = settings.SESSION_EXPIRE_AT_BROWSER_CLOSE \
        or days_auto_login < 1

    username = forms.CharField(
        label=_('Username'), max_length=100,
        widget=forms.TextInput(attrs={

@@ -30,15 +34,15 @@ class UserLoginForm(forms.Form):
    )
    auto_login = forms.BooleanField(
        required=False, initial=False,
        widget=forms.CheckboxInput()
        widget=forms.CheckboxInput(
            attrs={'disabled': disable_days_auto_login}
        )
    )

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        auto_login_field = self.fields['auto_login']
        auto_login_field.label = _("Auto login")
        if settings.SESSION_EXPIRE_AT_BROWSER_CLOSE:
            auto_login_field.widget = forms.HiddenInput()
        auto_login_field.label = _("{} days auto login").format(self.days_auto_login or 1)

    def confirm_login_allowed(self, user):
        if not user.is_staff:

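`UserLoginForm` above derives the auto-login span from `SESSION_COOKIE_AGE` (seconds converted to whole days) and disables the checkbox when sessions end at browser close or the span is under one day. The same derivation, standalone (function name assumed):

```python
def auto_login_days(session_cookie_age, expire_at_browser_close=False):
    # seconds -> whole days, truncating; mirrors the class attributes
    # days_auto_login / disable_days_auto_login in the hunk above.
    days = int(session_cookie_age / 3600 / 24)
    disabled = expire_at_browser_close or days < 1
    return days, disabled
```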
@@ -363,6 +363,7 @@ class AuthACLMixin:
        if acl.is_action(acl.ActionChoices.notice):
            self.request.session['auth_notice_required'] = '1'
            self.request.session['auth_acl_id'] = str(acl.id)
            return

    def _check_third_party_login_acl(self):
        request = self.request

@@ -82,15 +82,12 @@ class ConnectionToken(JMSOrgBaseModel):
        self.save(update_fields=['date_expired'])

    def set_reusable(self, is_reusable):
        if not settings.CONNECTION_TOKEN_REUSABLE:
            return
        self.is_reusable = is_reusable
        if self.is_reusable:
            seconds = settings.CONNECTION_TOKEN_REUSABLE_EXPIRATION
        else:
            seconds = settings.CONNECTION_TOKEN_ONETIME_EXPIRATION

        self.date_expired = self.date_created + timedelta(seconds=seconds)
        self.date_expired = timezone.now() + timedelta(seconds=seconds)
        self.save(update_fields=['is_reusable', 'date_expired'])

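The two adjacent `date_expired` lines in the `set_reusable` hunk above differ only in the base time the lifetime is added to: the token's creation time in one variant, the current time in the other. The arithmetic itself is a single `timedelta` addition:

```python
from datetime import datetime, timedelta, timezone

def token_expiry(base, seconds):
    # date_expired = base + lifetime; the hunk above swaps `base`
    # between date_created and timezone.now().
    return base + timedelta(seconds=seconds)
```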
    def renewal(self):

@@ -204,14 +201,12 @@ class ConnectionToken(JMSOrgBaseModel):

        host, account, lock_key = bulk_get(host_account, ('host', 'account', 'lock_key'))
        gateway = host.domain.select_gateway() if host.domain else None
        platform = host.platform

        data = {
            'id': lock_key,
            'applet': applet,
            'host': host,
            'gateway': gateway,
            'platform': platform,
            'account': account,
            'remote_app_option': self.get_remote_app_option()
        }

@@ -161,7 +161,6 @@ class ConnectTokenAppletOptionSerializer(serializers.Serializer):
    host = _ConnectionTokenAssetSerializer(read_only=True)
    account = _ConnectionTokenAccountSerializer(read_only=True)
    gateway = _ConnectionTokenGatewaySerializer(read_only=True)
    platform = _ConnectionTokenPlatformSerializer(read_only=True)
    remote_app_option = serializers.JSONField(read_only=True)

@@ -1,3 +1,5 @@
from importlib import import_module

from django.conf import settings
from django.contrib.auth import user_logged_in
from django.core.cache import cache

@@ -6,7 +8,6 @@ from django_cas_ng.signals import cas_user_authenticated

from apps.jumpserver.settings.auth import AUTHENTICATION_BACKENDS_THIRD_PARTY
from audits.models import UserSession
from common.sessions.cache import user_session_manager
from .signals import post_auth_success, post_auth_failed, user_auth_failed, user_auth_success

@@ -31,7 +32,8 @@ def on_user_auth_login_success(sender, user, request, **kwargs):
        lock_key = 'single_machine_login_' + str(user.id)
        session_key = cache.get(lock_key)
        if session_key and session_key != request.session.session_key:
            user_session_manager.remove(session_key)
            session = import_module(settings.SESSION_ENGINE).SessionStore(session_key)
            session.delete()
            UserSession.objects.filter(key=session_key).delete()
        cache.set(lock_key, request.session.session_key, None)

@@ -95,7 +95,6 @@ function doRequestAuth() {
    }
    clearInterval(interval);
    clearInterval(checkInterval);
    cancelTicket();
    $(".copy-btn").attr('disabled', 'disabled');
    errorMsgRef.html(data.msg)
}

@@ -22,9 +22,6 @@ urlpatterns = [
    path('feishu/event/subscription/callback/', api.FeiShuEventSubscriptionCallback.as_view(),
         name='feishu-event-subscription-callback'),

    path('lark/event/subscription/callback/', api.LarkEventSubscriptionCallback.as_view(),
         name='lark-event-subscription-callback'),

    path('auth/', api.TokenCreateApi.as_view(), name='user-auth'),
    path('confirm-oauth/', api.ConfirmBindORUNBindOAuth.as_view(), name='confirm-oauth'),
    path('tokens/', api.TokenCreateApi.as_view(), name='auth-token'),

@@ -35,7 +32,6 @@ urlpatterns = [
    path('password/reset-code/', api.UserResetPasswordSendCodeApi.as_view(), name='reset-password-code'),
    path('password/verify/', api.UserPasswordVerifyApi.as_view(), name='user-password-verify'),
    path('login-confirm-ticket/status/', api.TicketStatusApi.as_view(), name='login-confirm-ticket-status'),
    path('user-session/', api.UserSessionApi.as_view(), name='user-session'),
]

urlpatterns += router.urls + passkey_urlpatterns

@@ -49,12 +49,6 @@ urlpatterns = [
    path('feishu/qr/bind/callback/', views.FeiShuQRBindCallbackView.as_view(), name='feishu-qr-bind-callback'),
    path('feishu/qr/login/callback/', views.FeiShuQRLoginCallbackView.as_view(), name='feishu-qr-login-callback'),

    path('lark/bind/start/', views.LarkEnableStartView.as_view(), name='lark-bind-start'),
    path('lark/qr/bind/', views.LarkQRBindView.as_view(), name='lark-qr-bind'),
    path('lark/qr/login/', views.LarkQRLoginView.as_view(), name='lark-qr-login'),
    path('lark/qr/bind/callback/', views.LarkQRBindCallbackView.as_view(), name='lark-qr-bind-callback'),
    path('lark/qr/login/callback/', views.LarkQRLoginCallbackView.as_view(), name='lark-qr-login-callback'),

    path('slack/bind/start/', views.SlackEnableStartView.as_view(), name='slack-bind-start'),
    path('slack/qr/bind/', views.SlackQRBindView.as_view(), name='slack-qr-bind'),
    path('slack/qr/login/', views.SlackQRLoginView.as_view(), name='slack-qr-login'),

@@ -1,9 +1,8 @@
# -*- coding: utf-8 -*-
#
from .dingtalk import *
from .feishu import *
from .lark import *
from .login import *
from .mfa import *
from .slack import *
from .wecom import *
from .dingtalk import *
from .feishu import *
from .slack import *

@@ -1,8 +1,8 @@
from functools import lru_cache

from django.conf import settings
from django.contrib.auth import logout as auth_logout
from django.db.utils import IntegrityError
from django.contrib.auth import logout as auth_logout
from django.utils.module_loading import import_string
from django.utils.translation import gettext_lazy as _
from django.views import View

@@ -12,8 +12,8 @@ from authentication import errors
from authentication.mixins import AuthMixin
from authentication.notifications import OAuthBindMessage
from common.utils import get_logger
from common.utils.common import get_request_ip
from common.utils.django import reverse, get_object_or_none
from common.utils.common import get_request_ip
from users.models import User
from users.signal_handlers import check_only_allow_exist_user_auth
from .mixins import FlashMessageMixin

@@ -83,15 +83,7 @@ class BaseLoginCallbackView(AuthMixin, FlashMessageMixin, IMClientMixin, View):
        if not self.verify_state():
            return self.get_verify_state_failed_response(redirect_url)

        try:
            user_id, other_info = self.client.get_user_id_by_code(code)
        except Exception:
            response = self.get_failed_response(
                login_url, title=self.msg_client_err,
                msg=self.msg_not_found_user_from_client_err
            )
            return response

        user_id, other_info = self.client.get_user_id_by_code(code)
        if not user_id:
            # This error does not occur in the normal flow; it indicates tampering
            err = self.msg_not_found_user_from_client_err

@@ -21,45 +21,24 @@ from .mixins import FlashMessageMixin

logger = get_logger(__file__)


class FeiShuEnableStartView(UserVerifyPasswordView):
    category = 'feishu'

    def get_success_url(self):
        referer = self.request.META.get('HTTP_REFERER')
        redirect_url = self.request.GET.get("redirect_url")

        success_url = reverse(f'authentication:{self.category}-qr-bind')

        success_url += '?' + urlencode({
            'redirect_url': redirect_url or referer
        })

        return success_url
FEISHU_STATE_SESSION_KEY = '_feishu_state'


class FeiShuQRMixin(UserConfirmRequiredExceptionMixin, PermissionsMixin, FlashMessageMixin, View):
    category = 'feishu'
    error = _('FeiShu Error')
    error_msg = _('FeiShu is already bound')
    state_session_key = f'_{category}_state'

    @property
    def url_object(self):
        return URL()

    def dispatch(self, request, *args, **kwargs):
        try:
            return super().dispatch(request, *args, **kwargs)
        except APIException as e:
            msg = str(e.detail)
            return self.get_failed_response(
                '/', self.error, msg
                '/',
                _('FeiShu Error'),
                msg
            )

    def verify_state(self):
        state = self.request.GET.get('state')
        session_state = self.request.session.get(self.state_session_key)
        session_state = self.request.session.get(FEISHU_STATE_SESSION_KEY)
        if state != session_state:
            return False
        return True
||||
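The `verify_state` pattern above guards the OAuth callback against CSRF: a random `state` is stored in the session before redirecting to the provider, and the callback rejects any request whose echoed `state` does not match. A minimal framework-independent sketch of the same idea (function and key names here are illustrative, not JumpServer's API):

```python
import secrets


def make_state(session: dict, key: str = "_oauth_state") -> str:
    # Generate an unguessable state token and remember it server-side.
    state = secrets.token_urlsafe(16)
    session[key] = state
    return state


def verify_state(session: dict, returned_state: str, key: str = "_oauth_state") -> bool:
    # The callback must echo back exactly the state we issued to this session.
    return bool(returned_state) and session.get(key) == returned_state
```

The comparison is against the caller's own session, so a state stolen from one session cannot be replayed in another.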
@@ -70,18 +49,19 @@ class FeiShuQRMixin(UserConfirmRequiredExceptionMixin, PermissionsMixin, FlashMe

    def get_qr_url(self, redirect_uri):
        state = random_string(16)
        self.request.session[self.state_session_key] = state
        self.request.session[FEISHU_STATE_SESSION_KEY] = state

        params = {
            'app_id': getattr(settings, f'{self.category}_APP_ID'.upper()),
            'app_id': settings.FEISHU_APP_ID,
            'state': state,
            'redirect_uri': redirect_uri,
        }
        url = self.url_object.authen + '?' + urlencode(params)
        url = URL().authen + '?' + urlencode(params)
        return url

    def get_already_bound_response(self, redirect_url):
        response = self.get_failed_response(redirect_url, self.error_msg, self.error_msg)
        msg = _('FeiShu is already bound')
        response = self.get_failed_response(redirect_url, msg, msg)
        return response


@@ -91,7 +71,7 @@ class FeiShuQRBindView(FeiShuQRMixin, View):
    def get(self, request: HttpRequest):
        redirect_url = request.GET.get('redirect_url')

        redirect_uri = reverse(f'authentication:{self.category}-qr-bind-callback', external=True)
        redirect_uri = reverse('authentication:feishu-qr-bind-callback', external=True)
        redirect_uri += '?' + urlencode({'redirect_url': redirect_url})

        url = self.get_qr_url(redirect_uri)
@@ -101,16 +81,25 @@ class FeiShuQRBindView(FeiShuQRMixin, View):
class FeiShuQRBindCallbackView(FeiShuQRMixin, BaseBindCallbackView):
    permission_classes = (IsAuthenticated,)

    client_type_path = 'common.sdk.im.feishu.FeiShu'
    client_auth_params = {'app_id': 'FEISHU_APP_ID', 'app_secret': 'FEISHU_APP_SECRET'}
    auth_type = 'feishu'
    auth_type_label = _('FeiShu')
    client_type_path = f'common.sdk.im.{auth_type}.FeiShu'

    @property
    def client_auth_params(self):
        return {
            'app_id': f'{self.auth_type}_APP_ID'.upper(),
            'app_secret': f'{self.auth_type}_APP_SECRET'.upper()
        }

class FeiShuEnableStartView(UserVerifyPasswordView):

    def get_success_url(self):
        referer = self.request.META.get('HTTP_REFERER')
        redirect_url = self.request.GET.get("redirect_url")

        success_url = reverse('authentication:feishu-qr-bind')

        success_url += '?' + urlencode({
            'redirect_url': redirect_url or referer
        })

        return success_url


class FeiShuQRLoginView(FeiShuQRMixin, View):
@@ -118,7 +107,7 @@ class FeiShuQRLoginView(FeiShuQRMixin, View):

    def get(self, request: HttpRequest):
        redirect_url = request.GET.get('redirect_url') or reverse('index')
        redirect_uri = reverse(f'authentication:{self.category}-qr-login-callback', external=True)
        redirect_uri = reverse('authentication:feishu-qr-login-callback', external=True)
        redirect_uri += '?' + urlencode({
            'redirect_url': redirect_url,
        })
@@ -130,19 +119,11 @@ class FeiShuQRLoginView(FeiShuQRMixin, View):
class FeiShuQRLoginCallbackView(FeiShuQRMixin, BaseLoginCallbackView):
    permission_classes = (AllowAny,)

    client_type_path = 'common.sdk.im.feishu.FeiShu'
    client_auth_params = {'app_id': 'FEISHU_APP_ID', 'app_secret': 'FEISHU_APP_SECRET'}
    user_type = 'feishu'
    auth_type = user_type
    client_type_path = f'common.sdk.im.{auth_type}.FeiShu'
    auth_backend = 'AUTH_BACKEND_FEISHU'

    msg_client_err = _('FeiShu Error')
    msg_user_not_bound_err = _('FeiShu is not bound')
    msg_not_found_user_from_client_err = _('Failed to get user from FeiShu')

    auth_backend = f'AUTH_BACKEND_{auth_type}'.upper()

    @property
    def client_auth_params(self):
        return {
            'app_id': f'{self.auth_type}_APP_ID'.upper(),
            'app_secret': f'{self.auth_type}_APP_SECRET'.upper()
        }

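The refactor in these hunks replaces hardcoded `FEISHU_*` settings names with keys derived from `auth_type`, so the Lark subclasses can reuse the same property by overriding a single attribute. The naming convention in isolation (a sketch, not the view code itself):

```python
def auth_settings_keys(auth_type: str) -> dict:
    # Derive settings-key names from the auth type, mirroring the
    # client_auth_params property shown in the diff: 'feishu' ->
    # {'app_id': 'FEISHU_APP_ID', 'app_secret': 'FEISHU_APP_SECRET'}.
    return {
        'app_id': f'{auth_type}_APP_ID'.upper(),
        'app_secret': f'{auth_type}_APP_SECRET'.upper(),
    }
```

With this convention, adding a provider means defining `LARK_APP_ID` / `LARK_APP_SECRET` in settings and setting `auth_type = 'lark'` in the subclass.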
@@ -1,51 +0,0 @@
from django.utils.translation import gettext_lazy as _

from common.sdk.im.lark import URL
from common.utils import get_logger
from .feishu import (
    FeiShuEnableStartView, FeiShuQRBindView, FeiShuQRBindCallbackView,
    FeiShuQRLoginView, FeiShuQRLoginCallbackView
)

logger = get_logger(__file__)


class LarkEnableStartView(FeiShuEnableStartView):
    category = 'lark'


class BaseLarkQRMixin:
    category = 'lark'
    error = _('Lark Error')
    error_msg = _('Lark is already bound')
    state_session_key = f'_{category}_state'

    @property
    def url_object(self):
        return URL()


class LarkQRBindView(BaseLarkQRMixin, FeiShuQRBindView):
    pass


class LarkQRBindCallbackView(BaseLarkQRMixin, FeiShuQRBindCallbackView):
    auth_type = 'lark'
    auth_type_label = auth_type.capitalize()
    client_type_path = f'common.sdk.im.{auth_type}.Lark'


class LarkQRLoginView(BaseLarkQRMixin, FeiShuQRLoginView):
    pass


class LarkQRLoginCallbackView(BaseLarkQRMixin, FeiShuQRLoginCallbackView):
    user_type = 'lark'
    auth_type = user_type
    client_type_path = f'common.sdk.im.{auth_type}.Lark'

    msg_client_err = _('Lark Error')
    msg_user_not_bound_err = _('Lark is not bound')
    msg_not_found_user_from_client_err = _('Failed to get user from Lark')

    auth_backend = f'AUTH_BACKEND_{auth_type}'.upper()
@@ -91,12 +91,6 @@ class UserLoginContextMixin:
            'url': reverse('authentication:feishu-qr-login'),
            'logo': static('img/login_feishu_logo.png')
        },
        {
            'name': 'Lark',
            'enabled': settings.AUTH_LARK,
            'url': reverse('authentication:lark-qr-login'),
            'logo': static('img/login_lark_logo.png')
        },
        {
            'name': _('Slack'),
            'enabled': settings.AUTH_SLACK,
@@ -119,10 +113,6 @@ class UserLoginContextMixin:
            'title': '中文(简体)',
            'code': 'zh-hans'
        },
        {
            'title': '中文(繁體)',
            'code': 'zh-hant'
        },
        {
            'title': 'English',
            'code': 'en'

@@ -6,7 +6,6 @@ from typing import Callable

from django.db import models
from django.db.models.signals import m2m_changed
from rest_framework.request import Request
from rest_framework.response import Response
from rest_framework.settings import api_settings

@@ -20,7 +19,7 @@ from .serializer import SerializerMixin

__all__ = [
    'CommonApiMixin', 'PaginatedResponseMixin', 'RelationMixin',
    'ExtraFilterFieldsMixin'
    'ExtraFilterFieldsMixin',
]

logger = get_logger(__name__)
@@ -90,7 +89,6 @@ class RelationMixin:

class QuerySetMixin:
    action: str
    request: Request
    get_serializer_class: Callable
    get_queryset: Callable

@@ -100,18 +98,8 @@ class QuerySetMixin:
            return queryset
        if self.action == 'metadata':
            queryset = queryset.none()
        queryset = self.setup_eager_loading(queryset)
        return queryset

    # Todo: consider a custom pagination class in the future
    def setup_eager_loading(self, queryset):
        if self.request.query_params.get('format') not in ['csv', 'xlsx']:
            return queryset
        serializer_class = self.get_serializer_class()
        if not serializer_class or not hasattr(serializer_class, 'setup_eager_loading'):
            return queryset
        return serializer_class.setup_eager_loading(queryset)

    def paginate_queryset(self, queryset):
        page = super().paginate_queryset(queryset)
        serializer_class = self.get_serializer_class()
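The removed `setup_eager_loading` hook opted into eager loading only for bulk export formats (`csv`, `xlsx`), where the serializer will touch related objects for every row; a normal paginated page skips the extra joins. The decision logic, stripped of Django specifics (a sketch with illustrative names):

```python
def maybe_eager_load(queryset, serializer_class, export_format=None):
    # Only pay the eager-loading cost for bulk exports; for a normal page
    # the extra prefetching is usually not worth it.
    if export_format not in ('csv', 'xlsx'):
        return queryset
    setup = getattr(serializer_class, 'setup_eager_loading', None)
    if setup is None:
        return queryset
    return setup(queryset)
```

The `getattr` fallback keeps serializers that never defined the hook working unchanged, the same degradation path as the original mixin.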
@@ -198,7 +186,10 @@ class OrderingFielderFieldsMixin:
            model = self.queryset.model
        else:
            queryset = self.get_queryset()
            model = None if isinstance(queryset, list) else queryset.model
            if isinstance(queryset, list):
                model = None
            else:
                model = queryset.model

        if not model:
            return []

@@ -27,8 +27,6 @@ class SerializerMixin:
            return None
        serializer_classes = dict(serializer_classes)
        view_action = self.request.query_params.get('action') or self.action or 'list'
        if self.request.query_params.get('format'):
            view_action = 'retrieve'
        serializer_class = serializer_classes.get(view_action)

        if serializer_class is None:

@@ -469,7 +469,7 @@ class JSONManyToManyDescriptor:
                rule_match = rule.get('match', 'exact')

                custom_filter_q = None
                spec_attr_filter = getattr(to_model, "get_{}_filter_attr_q".format(rule['name']), None)
                spec_attr_filter = getattr(to_model, "get_filter_{}_attr_q".format(rule['name']), None)
                if spec_attr_filter:
                    custom_filter_q = spec_attr_filter(rule_value, rule_match)
                elif custom_attr_filter:
@@ -478,61 +478,59 @@ class JSONManyToManyDescriptor:
                    custom_q &= custom_filter_q
                    continue

                match rule_match:
                    case 'in':
                        res &= value in rule_value or '*' in rule_value
                    case 'exact':
                        res &= value == rule_value or rule_value == '*'
                    case 'contains':
                        res &= rule_value in value
                    case 'startswith':
                        res &= str(value).startswith(str(rule_value))
                    case 'endswith':
                        res &= str(value).endswith(str(rule_value))
                    case 'regex':
                        try:
                            matched = bool(re.search(r'{}'.format(rule_value), value))
                        except Exception as e:
                            logging.error('Error regex match: %s', e)
                            matched = False
                        res &= matched
                    case 'not':
                        res &= value != rule_value
                    case 'gte' | 'lte' | 'gt' | 'lt':
                        operations = {
                            'gte': lambda x, y: x >= y,
                            'lte': lambda x, y: x <= y,
                            'gt': lambda x, y: x > y,
                            'lt': lambda x, y: x < y
                        }
                        res &= operations[rule_match](value, rule_value)
                    case 'ip_in':
                        if isinstance(rule_value, str):
                            rule_value = [rule_value]
                        res &= '*' in rule_value or contains_ip(value, rule_value)
                    case rule_match if rule_match.startswith('m2m'):
                        if isinstance(value, Manager):
                            value = value.values_list('id', flat=True)
                        elif isinstance(value, QuerySet):
                            value = value.values_list('id', flat=True)
                        elif isinstance(value, models.Model):
                            value = [value.id]
                        if isinstance(rule_value, (str, int)):
                            rule_value = [rule_value]
                        value = set(map(str, value))
                        rule_value = set(map(str, rule_value))
                if rule_match == 'in':
                    res &= value in rule_value or '*' in rule_value
                elif rule_match == 'exact':
                    res &= value == rule_value or rule_value == '*'
                elif rule_match == 'contains':
                    res &= (rule_value in value)
                elif rule_match == 'startswith':
                    res &= str(value).startswith(str(rule_value))
                elif rule_match == 'endswith':
                    res &= str(value).endswith(str(rule_value))
                elif rule_match == 'regex':
                    try:
                        matched = bool(re.search(r'{}'.format(rule_value), value))
                    except Exception as e:
                        logging.error('Error regex match: %s', e)
                        matched = False
                    res &= matched
                elif rule_match == 'not':
                    res &= value != rule_value
                elif rule['match'] == 'gte':
                    res &= value >= rule_value
                elif rule['match'] == 'lte':
                    res &= value <= rule_value
                elif rule['match'] == 'gt':
                    res &= value > rule_value
                elif rule['match'] == 'lt':
                    res &= value < rule_value
                elif rule['match'] == 'ip_in':
                    if isinstance(rule_value, str):
                        rule_value = [rule_value]
                    res &= '*' in rule_value or contains_ip(value, rule_value)
                elif rule['match'].startswith('m2m'):
                    if isinstance(value, Manager):
                        value = value.values_list('id', flat=True)
                    elif isinstance(value, QuerySet):
                        value = value.values_list('id', flat=True)
                    elif isinstance(value, models.Model):
                        value = [value.id]
                    if isinstance(rule_value, (str, int)):
                        rule_value = [rule_value]
                    value = set(map(str, value))
                    rule_value = set(map(str, rule_value))

                        if rule['match'] == 'm2m_all':
                            res &= rule_value.issubset(value)
                        else:
                            res &= bool(value & rule_value)
                    case __:
                        logging.error("unknown match: {}".format(rule['match']))
                        res &= False
                    if rule['match'] == 'm2m_all':
                        res &= rule_value.issubset(value)
                    else:
                        res &= bool(value & rule_value)
                else:
                    logging.error("unknown match: {}".format(rule['match']))
                    res &= False

                if not res:
                    return res

                if custom_q:
                    res &= to_model.objects.filter(custom_q).filter(id=obj.id).exists()
                return res

@@ -1,6 +1,6 @@
from contextlib import contextmanager

from django.db import connections, transaction
from django.db import connections
from django.utils.encoding import force_str

from common.utils import get_logger, signer, crypto
@@ -58,17 +58,6 @@ def safe_db_connection():
        close_old_connections()


@contextmanager
def open_db_connection(alias='default'):
    connection = transaction.get_connection(alias)
    try:
        connection.connect()
        with transaction.atomic():
            yield connection
    finally:
        connection.close()


class Encryptor:
    def __init__(self, value):
        self.value = force_str(value)

@@ -12,7 +12,6 @@ from functools import wraps
from django.db import transaction

from .utils import logger
from .db.utils import open_db_connection


def on_transaction_commit(func):
@@ -147,9 +146,7 @@ ignore_err_exceptions = (
def _run_func_with_org(key, org, func, *args, **kwargs):
    from orgs.utils import set_current_org
    try:
        with open_db_connection() as conn:
            # Ensure a fresh database connection is used during execution
            # to avoid "MySQL server has gone away" errors
        with transaction.atomic():
            set_current_org(org)
            func(*args, **kwargs)
    except Exception as e:
@@ -204,8 +201,6 @@ def merge_delay_run(ttl=5, key=None):

    def delay(func, *args, **kwargs):
        from orgs.utils import get_current_org
        # Each call to delay may specify its own ttl for that invocation
        current_ttl = kwargs.pop('ttl', ttl)
        suffix_key_func = key if key else default_suffix_key
        org = get_current_org()
        func_name = f'{func.__module__}_{func.__name__}'
@@ -222,7 +217,7 @@ def merge_delay_run(ttl=5, key=None):
        else:
            cache_kwargs[k] = cache_kwargs[k].union(v)
        _loop_debouncer_func_args_cache[cache_key] = cache_kwargs
        run_debouncer_func(cache_key, org, current_ttl, func, *args, **cache_kwargs)
        run_debouncer_func(cache_key, org, ttl, func, *args, **cache_kwargs)

    def apply(func, sync=False, *args, **kwargs):
        if sync:

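`merge_delay_run` above debounces repeated calls within a `ttl` window and merges their keyword arguments, unioning set-valued kwargs so one delayed execution covers every caller. The merging step in isolation, without the timer or Redis machinery (a sketch with hypothetical names):

```python
_pending = {}


def merge_call(cache_key, **kwargs):
    # Union set-valued kwargs of repeated calls under one cache key,
    # mirroring how merge_delay_run coalesces delayed invocations.
    merged = _pending.setdefault(cache_key, {})
    for k, v in kwargs.items():
        if k not in merged:
            merged[k] = set(v)
        else:
            merged[k] = merged[k].union(v)
    return merged
```

When the debounce window finally expires, the real decorator invokes the wrapped function once with the accumulated kwargs and clears the cache entry.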
@@ -1 +0,0 @@
CSV_FILE_ESCAPE_CHARS = ['=', '@', '0']
@@ -3,7 +3,6 @@
import base64
import json
import logging
from collections import defaultdict

from django.core.cache import cache
from django.core.exceptions import ImproperlyConfigured
@@ -13,7 +12,6 @@ from rest_framework import filters
from rest_framework.compat import coreapi, coreschema
from rest_framework.fields import DateTimeField
from rest_framework.serializers import ValidationError
from rest_framework.filters import OrderingFilter

from common import const
from common.db.fields import RelatedManager
@@ -25,7 +23,6 @@ __all__ = [
    'IDInFilterBackend', "CustomFilterBackend",
    "BaseFilterSet", 'IDNotFilterBackend',
    'NotOrRelFilterBackend', 'LabelFilterBackend',
    'RewriteOrderingFilter'
]


@@ -183,7 +180,7 @@ class LabelFilterBackend(filters.BaseFilterBackend):
    ]

    @staticmethod
    def parse_labels(labels_id):
    def parse_label_ids(labels_id):
        from labels.models import Label
        label_ids = [i.strip() for i in labels_id.split(',')]
        cleaned = []
@@ -204,8 +201,8 @@ class LabelFilterBackend(filters.BaseFilterBackend):
        q = Q()
        for kwarg in args:
            q |= Q(**kwarg)
        labels = Label.objects.filter(q)
        cleaned.extend(list(labels))
        ids = Label.objects.filter(q).values_list('id', flat=True)
        cleaned.extend(list(ids))
        return cleaned

    def filter_queryset(self, request, queryset, view):
@@ -224,23 +221,13 @@ class LabelFilterBackend(filters.BaseFilterBackend):
        app_label = model._meta.app_label
        model_name = model._meta.model_name

        full_resources = labeled_resource_cls.objects.filter(
        resources = labeled_resource_cls.objects.filter(
            res_type__app_label=app_label, res_type__model=model_name,
        )
        labels = self.parse_labels(labels_id)
        grouped = defaultdict(set)
        for label in labels:
            grouped[label.name].add(label.id)

        matched_ids = set()
        for name, label_ids in grouped.items():
            resources = model.filter_resources_by_labels(full_resources, label_ids, rel='any')
            res_ids = resources.values_list('res_id', flat=True)
            if not matched_ids:
                matched_ids = set(res_ids)
            else:
                matched_ids &= set(res_ids)
        queryset = queryset.filter(id__in=matched_ids)
        label_ids = self.parse_label_ids(labels_id)
        resources = model.filter_resources_by_labels(resources, label_ids)
        res_ids = resources.values_list('res_id', flat=True)
        queryset = queryset.filter(id__in=set(res_ids))
        return queryset


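The branch removed in the hunk above combined labels with OR-within-a-name and AND-across-names: any label id under one name may match (`rel='any'`), but every distinct name must contribute a match, implemented as a running set intersection over resource ids. The set logic in isolation (illustrative data, no ORM):

```python
from collections import defaultdict


def match_ids_by_labels(labels, resources):
    # labels: iterable of (name, label_id) pairs.
    # resources: mapping of label_id -> set of resource ids carrying it.
    # OR within one label name, AND across different names.
    grouped = defaultdict(set)
    for name, label_id in labels:
        grouped[name].add(label_id)

    matched = set()
    for i, label_ids in enumerate(grouped.values()):
        res_ids = set().union(*(resources.get(l, set()) for l in label_ids))
        matched = res_ids if i == 0 else matched & res_ids
    return matched
```

The replacement code drops this grouping and filters once over all label ids, a simpler OR-only semantics.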
@@ -337,17 +324,3 @@ class NotOrRelFilterBackend(filters.BaseFilterBackend):
        queryset.query.where.connector = 'OR'
        queryset._result_cache = None
        return queryset


class RewriteOrderingFilter(OrderingFilter):
    default_ordering_if_has = ('name', )

    def get_default_ordering(self, view):
        ordering = super().get_default_ordering(view)
        # view.ordering = [] means "do not order", which saves work
        # (for example: assets authorized to a user)
        if ordering is not None:
            return ordering
        ordering_fields = getattr(view, 'ordering_fields', self.ordering_fields)
        if ordering_fields:
            ordering = tuple([f for f in ordering_fields if f in self.default_ordering_if_has])
        return ordering

@@ -4,25 +4,13 @@
import chardet
import unicodecsv

from common.utils import lazyproperty
from .base import BaseFileParser
from ..const import CSV_FILE_ESCAPE_CHARS


class CSVFileParser(BaseFileParser):

    media_type = 'text/csv'

    @lazyproperty
    def match_escape_chars(self):
        chars = []
        for c in CSV_FILE_ESCAPE_CHARS:
            dq_char = '"{}'.format(c)
            sg_char = "'{}".format(c)
            chars.append(dq_char)
            chars.append(sg_char)
        return tuple(chars)


    @staticmethod
    def _universal_newlines(stream):
        """
@@ -30,14 +18,6 @@ class CSVFileParser(BaseFileParser):
        """
        for line in stream.splitlines():
            yield line

    def __parse_row(self, row):
        row_escape = []
        for d in row:
            if isinstance(d, str) and d.strip().startswith(self.match_escape_chars):
                d = d.lstrip("'").lstrip('"')
            row_escape.append(d)
        return row_escape

    def generate_rows(self, stream_data):
        detect_result = chardet.detect(stream_data)
@@ -45,5 +25,4 @@ class CSVFileParser(BaseFileParser):
        lines = self._universal_newlines(stream_data)
        csv_reader = unicodecsv.reader(lines, encoding=encoding)
        for row in csv_reader:
            row = self.__parse_row(row)
            yield row

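The removed parser code undid a CSV-injection guard: cells whose payload starts with a character a spreadsheet would execute (`=`, `@`, ...) are exported with a protective leading quote, and `__parse_row` stripped that quote back off on import. The round trip as a standalone function, using the escape list from the deleted `const` module above:

```python
CSV_FILE_ESCAPE_CHARS = ['=', '@', '0']

# A cell was escaped on export by prefixing a single or double quote.
MATCH_ESCAPE_CHARS = tuple('"{}'.format(c) for c in CSV_FILE_ESCAPE_CHARS) + \
                     tuple("'{}".format(c) for c in CSV_FILE_ESCAPE_CHARS)


def parse_row(row):
    # Strip the protective quote added on export so the original
    # payload (e.g. '=SUM(A1)') is restored on import.
    row_escape = []
    for d in row:
        if isinstance(d, str) and d.strip().startswith(MATCH_ESCAPE_CHARS):
            d = d.lstrip("'").lstrip('"')
        row_escape.append(d)
    return row_escape
```

Non-string cells pass through untouched, so numeric columns are unaffected.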
@@ -4,7 +4,6 @@ import re
from datetime import datetime

import pyzipper
from django.conf import settings
from django.utils.translation import gettext_lazy as _
from rest_framework import serializers
from rest_framework.renderers import BaseRenderer
@@ -17,7 +16,7 @@ logger = get_logger(__file__)


class BaseFileRenderer(BaseRenderer):
    # Template type for rendering; import, export or update: ['import', 'update', 'export']
    # Template type for rendering; import, export or update: ['import', 'update', 'export']
    template = 'export'
    serializer = None

@@ -78,7 +77,7 @@ class BaseFileRenderer(BaseRenderer):
            results = [results[0]] if results else results
        else:
            # Limit the number of exported rows
            results = results[:settings.MAX_LIMIT_PER_PAGE]
            results = results[:10000]
            # This converts some UUID fields to strings
            results = json.loads(json.dumps(results, cls=encoders.JSONEncoder))
        return results

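The `json.dumps`/`json.loads` round trip above is a cheap way to coerce non-JSON-native values nested anywhere in the rows (UUIDs, datetimes) into plain strings before writing the export file. A sketch of the same trick with only the stdlib (DRF's `encoders.JSONEncoder` used in the diff handles more types than `default=str`):

```python
import json
import uuid


def jsonify_values(results):
    # Round-trip through JSON; values json cannot serialize natively
    # fall back to str(), turning UUIDs etc. into plain strings.
    return json.loads(json.dumps(results, default=str))


rows = [{"id": uuid.UUID("12345678-1234-5678-1234-567812345678"), "name": "web"}]
```

This is simpler than walking the structure by hand, at the cost of one serialize/deserialize pass over the data.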