name: '🚀 Package Release'
run-name: 'Release ${{ inputs.working-directory }} ${{ inputs.release-version }}'
on:
  workflow_call:
    inputs:
      working-directory:
        required: true
        type: string
        description: "From which folder this pipeline executes"
  workflow_dispatch:
    inputs:
      working-directory:
        required: true
        type: string
        description: "From which folder this pipeline executes"
        default: 'libs/langchain'
      release-version:
        required: true
        type: string
        default: '0.1.0'
        description: "New version of the package being released"
      dangerous-nonmaster-release:
        required: false
        type: boolean
        default: false
        description: "Release from a non-master branch (danger!) - only use for hotfixes"
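
# A manual dispatch supplies the inputs above, e.g. via the GitHub CLI
# (hypothetical values; adjust the workflow file name to match this file):
#   gh workflow run _release.yml -f working-directory=libs/core -f release-version=0.3.0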

env:
  PYTHON_VERSION: "3.11"
  UV_FROZEN: "true"
  UV_NO_SYNC: "true"
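# UV_FROZEN is uv's --frozen flag (use uv.lock exactly as committed, never update it);
# UV_NO_SYNC is uv's --no-sync flag (skip the automatic environment sync on `uv run`).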

jobs:
  # Build the distribution package and extract version info.
  # Runs in an isolated environment with minimal permissions, for security.
  build:
    if: github.ref == 'refs/heads/master' || inputs.dangerous-nonmaster-release
    environment: Scheduled testing
    runs-on: ubuntu-latest

    outputs:
      pkg-name: ${{ steps.check-version.outputs.pkg-name }}
      version: ${{ steps.check-version.outputs.version }}

    steps:
      - uses: actions/checkout@v4

      - name: Set up Python + uv
        uses: "./.github/actions/uv_setup"
        with:
          python-version: ${{ env.PYTHON_VERSION }}

      # We want to keep this build stage *separate* from the release stage,
      # so that there's no sharing of permissions between them.
      # The release stage has trusted publishing and GitHub repo contents write access,
      # and we want to keep the scope of that access limited just to the release job.
      # Otherwise, a malicious `build` step (e.g. via a compromised dependency)
      # could get access to our GitHub or PyPI credentials.
      #
      # Per the trusted publishing GitHub Action:
      # > It is strongly advised to separate jobs for building [...]
      # > from the publish job.
      # https://github.com/pypa/gh-action-pypi-publish#non-goals
      - name: Build project for distribution
        run: uv build
        working-directory: ${{ inputs.working-directory }}
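
      # `uv build` writes both an sdist and a wheel into dist/ by default;
      # the wheel is what the later jobs install and smoke-test.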

      - name: Upload build
        uses: actions/upload-artifact@v4
        with:
          name: dist
          path: ${{ inputs.working-directory }}/dist/

      - name: Check version
        id: check-version
        shell: python
        working-directory: ${{ inputs.working-directory }}
        run: |
          import os
          import tomllib

          with open("pyproject.toml", "rb") as f:
              data = tomllib.load(f)
          pkg_name = data["project"]["name"]
          version = data["project"]["version"]

          with open(os.environ["GITHUB_OUTPUT"], "a") as f:
              f.write(f"pkg-name={pkg_name}\n")
              f.write(f"version={version}\n")
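
      # The check-version step assumes PEP 621 metadata in pyproject.toml,
      # e.g. (illustrative values):
      #   [project]
      #   name = "langchain-core"
      #   version = "0.3.0"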

  release-notes:
    needs:
      - build
    runs-on: ubuntu-latest
    outputs:
      release-body: ${{ steps.generate-release-body.outputs.release-body }}
    steps:
      - uses: actions/checkout@v4
        with:
          repository: langchain-ai/langchain
          path: langchain
          sparse-checkout: | # this only grabs files for the relevant dir
            ${{ inputs.working-directory }}
          ref: ${{ github.ref }} # this scopes to just the ref'd branch
          fetch-depth: 0 # this fetches the entire commit history

      - name: Check tags
        id: check-tags
        shell: bash
        working-directory: langchain/${{ inputs.working-directory }}
        env:
          PKG_NAME: ${{ needs.build.outputs.pkg-name }}
          VERSION: ${{ needs.build.outputs.version }}
        run: |
          # Handle regular versions and pre-release versions differently
          if [[ "$VERSION" == *"-"* ]]; then
            # This is a pre-release version (contains a hyphen).
            # Extract the base version without the pre-release suffix.
            BASE_VERSION=${VERSION%%-*}
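            # e.g. a hypothetical VERSION=1.0.0-rc1 yields BASE_VERSION=1.0.0,
            # since ${VERSION%%-*} strips everything from the first hyphen on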
            # Look for the latest release of the same base version
            REGEX="^$PKG_NAME==$BASE_VERSION\$"
            PREV_TAG=$(git tag --sort=-creatordate | (grep -P "$REGEX" || true) | head -1)

            # If no exact base version match, look for the latest release of any kind
            if [ -z "$PREV_TAG" ]; then
              REGEX="^$PKG_NAME==\\d+\\.\\d+\\.\\d+\$"
              PREV_TAG=$(git tag --sort=-creatordate | (grep -P "$REGEX" || true) | head -1)
            fi
          else
            # Regular version handling: guess that the previous tag is the same
            # minor version with the patch number decremented by one.
            PREV_TAG="$PKG_NAME==${VERSION%.*}.$(( ${VERSION##*.} - 1 ))"; [[ "${VERSION##*.}" -eq 0 ]] && PREV_TAG=""
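
            # Worked example: VERSION=0.2.5 gives PREV_TAG="$PKG_NAME==0.2.4";
            # a .0 patch (e.g. 0.2.0) clears PREV_TAG so the backup case below runs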
            # Backup case: if releasing e.g. 0.3.0, look up the last release.
            # Note that if the last release (chronologically) was e.g. 0.1.47,
            # it will get that instead of the last 0.2 release.
            if [ -z "$PREV_TAG" ]; then
              REGEX="^$PKG_NAME==\\d+\\.\\d+\\.\\d+\$"
              echo "$REGEX"
              PREV_TAG=$(git tag --sort=-creatordate | (grep -P "$REGEX" || true) | head -1)
            fi
          fi

          # If PREV_TAG is empty, let it be empty
          if [ -z "$PREV_TAG" ]; then
            echo "No previous tag found - first release"
          else
            # Confirm PREV_TAG actually exists in the git repo
            GIT_TAG_RESULT=$(git tag -l "$PREV_TAG")
            if [ -z "$GIT_TAG_RESULT" ]; then
              echo "Previous tag $PREV_TAG not found in git repo"
              exit 1
            fi
          fi

          TAG="${PKG_NAME}==${VERSION}"
          if [ "$TAG" == "$PREV_TAG" ]; then
            echo "No new version to release"
            exit 1
          fi
          echo "tag=$TAG" >> "$GITHUB_OUTPUT"
          echo "prev-tag=$PREV_TAG" >> "$GITHUB_OUTPUT"

      - name: Generate release body
        id: generate-release-body
        working-directory: langchain
        env:
          WORKING_DIR: ${{ inputs.working-directory }}
          PKG_NAME: ${{ needs.build.outputs.pkg-name }}
          TAG: ${{ steps.check-tags.outputs.tag }}
          PREV_TAG: ${{ steps.check-tags.outputs.prev-tag }}
        run: |
          PREAMBLE="Changes since $PREV_TAG"
          # If PREV_TAG is empty, then we are releasing the first version
          if [ -z "$PREV_TAG" ]; then
            PREAMBLE="Initial release"
            PREV_TAG=$(git rev-list --max-parents=0 HEAD)
          fi
          {
            echo 'release-body<<EOF'
            echo "$PREAMBLE"
            echo
            git log --format="%s" "$PREV_TAG"..HEAD -- "$WORKING_DIR"
            echo EOF
          } >> "$GITHUB_OUTPUT"
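      # The `release-body<<EOF` / `EOF` pair above is GitHub Actions' delimiter
      # syntax for writing a multiline value to $GITHUB_OUTPUT.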

  test-pypi-publish:
    needs:
      - build
      - release-notes
    uses: ./.github/workflows/_test_release.yml
    permissions: write-all
    with:
      working-directory: ${{ inputs.working-directory }}
      dangerous-nonmaster-release: ${{ inputs.dangerous-nonmaster-release }}
    secrets: inherit

  pre-release-checks:
    needs:
      - build
      - release-notes
      - test-pypi-publish
    runs-on: ubuntu-latest
    timeout-minutes: 20
    steps:
      - uses: actions/checkout@v4

      # We explicitly *don't* set up caching here. This ensures our tests are
      # maximally sensitive to catching breakage.
      #
      # For example, here's a way that caching can cause a falsely-passing test:
      # - Make the langchain package manifest no longer list a dependency package
      #   as a requirement. This means it won't be installed by `pip install`,
      #   and attempting to use it would cause a crash.
      # - That dependency used to be required, so it may have been cached.
      #   When restoring the venv packages from cache, that dependency gets included.
      # - Tests pass, because the dependency is present even though it wasn't specified.
      # - The package is published, and it breaks on the missing dependency when
      #   used in the real world.

      - name: Set up Python + uv
        uses: "./.github/actions/uv_setup"
        id: setup-python
        with:
          python-version: ${{ env.PYTHON_VERSION }}

      - uses: actions/download-artifact@v5
        with:
          name: dist
          path: ${{ inputs.working-directory }}/dist/

      - name: Import dist package
        shell: bash
        working-directory: ${{ inputs.working-directory }}
        env:
          PKG_NAME: ${{ needs.build.outputs.pkg-name }}
          VERSION: ${{ needs.build.outputs.version }}
        # Install the wheel built earlier directly from the downloaded artifact,
        # rather than from (test) PyPI, so this check doesn't depend on index
        # availability; then verify that the package is importable.
        run: |
          uv venv
          VIRTUAL_ENV=.venv uv pip install dist/*.whl

          # Replace all dashes in the package name with underscores,
          # since that's how Python imports packages with dashes in the name.
          # Also remove the _official suffix.
          IMPORT_NAME="$(echo "$PKG_NAME" | sed s/-/_/g | sed s/_official//g)"

          uv run python -c "import $IMPORT_NAME; print(dir($IMPORT_NAME))"
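
          # e.g. PKG_NAME=langchain-text-splitters -> IMPORT_NAME=langchain_text_splitters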

      - name: Install test dependencies
        run: uv sync --group test
        working-directory: ${{ inputs.working-directory }}

      # Overwrite the local version of the package with the built version
      - name: Install published package (again)
        working-directory: ${{ inputs.working-directory }}
        shell: bash
        env:
          PKG_NAME: ${{ needs.build.outputs.pkg-name }}
          VERSION: ${{ needs.build.outputs.version }}
        run: |
          VIRTUAL_ENV=.venv uv pip install dist/*.whl

      - name: Run unit tests
        run: make tests
        working-directory: ${{ inputs.working-directory }}

      - name: Check for prerelease versions
        working-directory: ${{ inputs.working-directory }}
        run: |
          uv run python "$GITHUB_WORKSPACE/.github/scripts/check_prerelease_dependencies.py" pyproject.toml

      - name: Get minimum versions
        working-directory: ${{ inputs.working-directory }}
        id: min-version
        run: |
          VIRTUAL_ENV=.venv uv pip install packaging requests
          python_version="$(uv run python --version | awk '{print $2}')"
          min_versions="$(uv run python "$GITHUB_WORKSPACE/.github/scripts/get_min_versions.py" pyproject.toml release "$python_version")"
          echo "min-versions=$min_versions" >> "$GITHUB_OUTPUT"
          echo "min-versions=$min_versions"
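          # min-versions is expected to be a space-separated list of pins, e.g.
          # (hypothetical): "langchain-core==0.3.0 pydantic==2.7.4"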

      - name: Run unit tests with minimum dependency versions
        if: ${{ steps.min-version.outputs.min-versions != '' }}
        env:
          MIN_VERSIONS: ${{ steps.min-version.outputs.min-versions }}
        run: |
          VIRTUAL_ENV=.venv uv pip install --force-reinstall $MIN_VERSIONS --editable .
          make tests
        working-directory: ${{ inputs.working-directory }}

      - name: Install integration test dependencies
        run: uv sync --group test --group test_integration
        working-directory: ${{ inputs.working-directory }}

      - name: Run integration tests
        if: ${{ startsWith(inputs.working-directory, 'libs/partners/') }}
        env:
          AI21_API_KEY: ${{ secrets.AI21_API_KEY }}
          GOOGLE_API_KEY: ${{ secrets.GOOGLE_API_KEY }}
          ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
          MISTRAL_API_KEY: ${{ secrets.MISTRAL_API_KEY }}
          TOGETHER_API_KEY: ${{ secrets.TOGETHER_API_KEY }}
          OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
          AZURE_OPENAI_API_VERSION: ${{ secrets.AZURE_OPENAI_API_VERSION }}
          AZURE_OPENAI_API_BASE: ${{ secrets.AZURE_OPENAI_API_BASE }}
          AZURE_OPENAI_API_KEY: ${{ secrets.AZURE_OPENAI_API_KEY }}
          AZURE_OPENAI_CHAT_DEPLOYMENT_NAME: ${{ secrets.AZURE_OPENAI_CHAT_DEPLOYMENT_NAME }}
          AZURE_OPENAI_LEGACY_CHAT_DEPLOYMENT_NAME: ${{ secrets.AZURE_OPENAI_LEGACY_CHAT_DEPLOYMENT_NAME }}
          AZURE_OPENAI_LLM_DEPLOYMENT_NAME: ${{ secrets.AZURE_OPENAI_LLM_DEPLOYMENT_NAME }}
          AZURE_OPENAI_EMBEDDINGS_DEPLOYMENT_NAME: ${{ secrets.AZURE_OPENAI_EMBEDDINGS_DEPLOYMENT_NAME }}
          NVIDIA_API_KEY: ${{ secrets.NVIDIA_API_KEY }}
          GOOGLE_SEARCH_API_KEY: ${{ secrets.GOOGLE_SEARCH_API_KEY }}
          GOOGLE_CSE_ID: ${{ secrets.GOOGLE_CSE_ID }}
          GROQ_API_KEY: ${{ secrets.GROQ_API_KEY }}
          HUGGINGFACEHUB_API_TOKEN: ${{ secrets.HUGGINGFACEHUB_API_TOKEN }}
          EXA_API_KEY: ${{ secrets.EXA_API_KEY }}
          NOMIC_API_KEY: ${{ secrets.NOMIC_API_KEY }}
          WATSONX_APIKEY: ${{ secrets.WATSONX_APIKEY }}
          WATSONX_PROJECT_ID: ${{ secrets.WATSONX_PROJECT_ID }}
          ASTRA_DB_API_ENDPOINT: ${{ secrets.ASTRA_DB_API_ENDPOINT }}
          ASTRA_DB_APPLICATION_TOKEN: ${{ secrets.ASTRA_DB_APPLICATION_TOKEN }}
          ASTRA_DB_KEYSPACE: ${{ secrets.ASTRA_DB_KEYSPACE }}
          ES_URL: ${{ secrets.ES_URL }}
          ES_CLOUD_ID: ${{ secrets.ES_CLOUD_ID }}
          ES_API_KEY: ${{ secrets.ES_API_KEY }}
          MONGODB_ATLAS_URI: ${{ secrets.MONGODB_ATLAS_URI }}
          UPSTAGE_API_KEY: ${{ secrets.UPSTAGE_API_KEY }}
          FIREWORKS_API_KEY: ${{ secrets.FIREWORKS_API_KEY }}
          XAI_API_KEY: ${{ secrets.XAI_API_KEY }}
          DEEPSEEK_API_KEY: ${{ secrets.DEEPSEEK_API_KEY }}
          PPLX_API_KEY: ${{ secrets.PPLX_API_KEY }}
        run: make integration_tests
        working-directory: ${{ inputs.working-directory }}

  # Test select published packages against the new core
  test-prior-published-packages-against-new-core:
    needs:
      - build
      - release-notes
      - test-pypi-publish
      - pre-release-checks
    runs-on: ubuntu-latest
    strategy:
      matrix:
        partner: [openai, anthropic]
      fail-fast: false  # Continue testing other partners if one fails
    env:
      ANTHROPIC_API_KEY: ${{ secrets.ANTHROPIC_API_KEY }}
      ANTHROPIC_FILES_API_IMAGE_ID: ${{ secrets.ANTHROPIC_FILES_API_IMAGE_ID }}
      ANTHROPIC_FILES_API_PDF_ID: ${{ secrets.ANTHROPIC_FILES_API_PDF_ID }}
      OPENAI_API_KEY: ${{ secrets.OPENAI_API_KEY }}
      AZURE_OPENAI_API_VERSION: ${{ secrets.AZURE_OPENAI_API_VERSION }}
      AZURE_OPENAI_API_BASE: ${{ secrets.AZURE_OPENAI_API_BASE }}
      AZURE_OPENAI_API_KEY: ${{ secrets.AZURE_OPENAI_API_KEY }}
      AZURE_OPENAI_CHAT_DEPLOYMENT_NAME: ${{ secrets.AZURE_OPENAI_CHAT_DEPLOYMENT_NAME }}
      AZURE_OPENAI_LEGACY_CHAT_DEPLOYMENT_NAME: ${{ secrets.AZURE_OPENAI_LEGACY_CHAT_DEPLOYMENT_NAME }}
      AZURE_OPENAI_LLM_DEPLOYMENT_NAME: ${{ secrets.AZURE_OPENAI_LLM_DEPLOYMENT_NAME }}
      AZURE_OPENAI_EMBEDDINGS_DEPLOYMENT_NAME: ${{ secrets.AZURE_OPENAI_EMBEDDINGS_DEPLOYMENT_NAME }}
    steps:
      - uses: actions/checkout@v4

      # We implement this conditional as GitHub Actions does not have good support
      # for conditionally needing steps. https://github.com/actions/runner/issues/491
      - name: Check if libs/core
        run: |
          if [ "${{ startsWith(inputs.working-directory, 'libs/core') }}" != "true" ]; then
            echo "Not in libs/core. Exiting successfully."
            exit 0
          fi
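
      # Note: `exit 0` only ends this step; for non-core releases the remaining
      # steps are skipped via their own `if:` guards below.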

      - name: Set up Python + uv
        if: startsWith(inputs.working-directory, 'libs/core')
        uses: "./.github/actions/uv_setup"
        with:
          python-version: ${{ env.PYTHON_VERSION }}

      - uses: actions/download-artifact@v5
        if: startsWith(inputs.working-directory, 'libs/core')
        with:
          name: dist
          path: ${{ inputs.working-directory }}/dist/

      - name: Test against ${{ matrix.partner }}
        if: startsWith(inputs.working-directory, 'libs/core')
        run: |
          # Identify the latest tag, excluding pre-releases
          LATEST_PACKAGE_TAG="$(
            git ls-remote --tags origin "langchain-${{ matrix.partner }}*" \
              | awk '{print $2}' \
              | sed 's|refs/tags/||' \
              | grep -Ev '==[^=]*(\.?dev[0-9]*|\.?rc[0-9]*)$' \
              | sort -Vr \
              | head -n 1
          )"
          echo "Latest package tag: $LATEST_PACKAGE_TAG"
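
          # e.g. given hypothetical tags langchain-openai==0.3.8, ==0.3.9rc1, and
          # ==0.3.9.dev0, the rc/dev tags are filtered out and the version sort
          # selects langchain-openai==0.3.8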

          # Shallow-fetch just that single tag
          git fetch --depth=1 origin tag "$LATEST_PACKAGE_TAG"

          # Checkout the latest package files
          rm -rf "$GITHUB_WORKSPACE/libs/partners/${{ matrix.partner }}"/*
          rm -rf "$GITHUB_WORKSPACE/libs/standard-tests"/*
          cd "$GITHUB_WORKSPACE/libs/"
          git checkout "$LATEST_PACKAGE_TAG" -- standard-tests/
          git checkout "$LATEST_PACKAGE_TAG" -- "partners/${{ matrix.partner }}/"
          cd "partners/${{ matrix.partner }}"

          # Print as a sanity check
          echo "Version number from pyproject.toml: "
          grep "version = " pyproject.toml

          # Run tests
          uv sync --group test --group test_integration
          uv pip install ../../core/dist/*.whl
          make integration_tests

  publish:
    needs:
      - build
      - release-notes
      - test-pypi-publish
      - pre-release-checks
      - test-prior-published-packages-against-new-core
    runs-on: ubuntu-latest
    permissions:
      # This permission is used for trusted publishing:
      # https://blog.pypi.org/posts/2023-04-20-introducing-trusted-publishers/
      #
      # Trusted publishing has to also be configured on PyPI for each package:
      # https://docs.pypi.org/trusted-publishers/adding-a-publisher/
      id-token: write

    defaults:
      run:
        working-directory: ${{ inputs.working-directory }}

    steps:
      - uses: actions/checkout@v4

      - name: Set up Python + uv
        uses: "./.github/actions/uv_setup"
        with:
          python-version: ${{ env.PYTHON_VERSION }}

      - uses: actions/download-artifact@v5
        with:
          name: dist
          path: ${{ inputs.working-directory }}/dist/

      - name: Publish package distributions to PyPI
        uses: pypa/gh-action-pypi-publish@release/v1
        with:
          packages-dir: ${{ inputs.working-directory }}/dist/
          verbose: true
          print-hash: true
          # Temporary workaround: attestations are on by default as of
          # gh-action-pypi-publish v1.11.0
          attestations: false

  mark-release:
    needs:
      - build
      - release-notes
      - test-pypi-publish
      - pre-release-checks
      - publish
    runs-on: ubuntu-latest
    permissions:
      # This permission is needed by `ncipollo/release-action` to
      # create the GitHub release.
      contents: write

    defaults:
      run:
        working-directory: ${{ inputs.working-directory }}

    steps:
      - uses: actions/checkout@v4

      - name: Set up Python + uv
        uses: "./.github/actions/uv_setup"
        with:
          python-version: ${{ env.PYTHON_VERSION }}

      - uses: actions/download-artifact@v5
        with:
          name: dist
          path: ${{ inputs.working-directory }}/dist/

      - name: Create Tag
        uses: ncipollo/release-action@v1
        with:
          artifacts: "dist/*"
          token: ${{ secrets.GITHUB_TOKEN }}
          generateReleaseNotes: false
          tag: ${{ needs.build.outputs.pkg-name }}==${{ needs.build.outputs.version }}
          body: ${{ needs.release-notes.outputs.release-body }}
          commit: ${{ github.sha }}
          makeLatest: ${{ needs.build.outputs.pkg-name == 'langchain-core' }}